OS X Security Defaults WTF



  • @DescentJS said:

    @Master Chief said:

    @blakeyrat said:

    @Master Chief said:
    Lots of things, mostly small nitpicky stuff that Microsoft just refuses to address.  The way Windows stores its passwords, for instance, is downright awful (essentially the same since Win2K).
    Well, ok, but that's not "The Unix Design," that's "some particular Linux distros have improved this aspect".
     

    Well yes, but the universal point is that they did address it, both in open source and in enterprise Linux. Not sure about OS X. Microsoft has not.

    Probably because there are some apps somewhere that would fail if they changed it. Just like with almost all the other insecure things in Windows, it's a backward-compatibility issue.

    For anybody curious about this topic, to save you some Googling there's a Wikipedia page on the protocol Windows uses to authenticate computers not on a domain: NTLM. (Domain-joined computers use Kerberos with caching to allow it to work offline for a time.)

    All modern Windows versions use NTLMv2 by default. It was broken in February of this year by an exploit that apparently went unnoticed for 17 years, but it was quickly patched afterwards... despite being "the same since Windows 2000", it doesn't seem particularly bad. (No surprise there, things that work in Windows generally don't get tinkered with.)

    I'd be interested in hearing from anybody in the know what modern Linux distros are using, and why it's superior to NTLMv2.



  • @Master Chief said:

    The way Windows stores its passwords, for instance, is downright awful (essentially the same since Win2K).
     

    I'll play the devil's advocate here... I'll assume you're talking about LM/NTLM hashes. Their security as used by SAM is a moot point, because:

    1) There are no known vulnerabilities in modern Windows versions (as in XP, Vista, 7) that allow access to those hashes without administrator credentials. So...

    2) If someone gets those hashes, it means your system is either stolen or already r00ted and you're fucked anyway; anything that isn't encrypted is accessible to the attacker.

    3) DPAPI and friends use different hashing methods, so even if your password hash is brute forced (= a collision is found; recovering the original is much harder, if not impossible, outside dictionary attacks), your private keys, including the FS encryption key, are still secure.

    And for the most fun part: in most *NIX systems, until recent times, password hashes were by default stored in a world-readable file and hashed with the stock crypt(), which isn't much more secure than NTLM!

    Now, what people legitimately complain about with LM/NTLM hashes is their use in SMB/CIFS, but even there they're mostly unused in modern versions, and back when they were still used, the Unix alternative was NFSv3, which lacks anything resembling actual security.



  • @blakeyrat said:

    I'd be interested in hearing from anybody in the know what modern Linux distros are using, and why it's superior to NTLMv2.
     

    Note that we are talking about 2 different things here:

    Password hashes:

    Windows uses 2 hashes to store your password in its local security database: LM (inherited from IBM, based on DES) and NTLM (based on MD4). Both are generally very weak.

    Unixes traditionally used a readable-by-anyone /etc/passwd file with crypt() hashes (DES), which in most modern distributions is replaced by /etc/shadow, which is not readable by normal users and can use many different hashes (mostly some salted form of MD5/SHA1/SHA256).
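    To make the /etc/shadow point concrete, here's a sketch of what a modern salted entry looks like. The username, salt, and hash below are made up for illustration; only the field layout is real:

```shell
# Hypothetical /etc/shadow line -- field 2 is $id$salt$hash, where the id
# selects the algorithm ($1 = MD5, $5 = SHA-256, $6 = SHA-512).
entry='alice:$6$Qz3kXv9w$FakeHashForIllustrationOnly:14975:0:99999:7:::'
# Pull out the hash field, then the algorithm id:
hash=$(printf '%s' "$entry" | cut -d: -f2)
printf '%s\n' "$hash" | cut -d'$' -f2    # prints 6, i.e. salted SHA-512
```

    The corresponding /etc/passwd entry just carries an "x" in the password field, which is the whole improvement over the old world-readable scheme.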

    Network authentication:

    Windows systems use SMB/CIFS, which used LM/NTLMv1 authentication back in the NT4/2k days and had lots of security issues. Modern versions use NTLMv2, which is generally considered secure (note the 17-year vulnerability was in the implementation, not the protocol design, afaik).

    Unixes mostly use NFS, which until version 4 didn't actually have any real authentication method (relying on your firewall and the inability of local untrusted users to bind to TCP/UDP ports <1024). Modern systems use NFSv4, which uses Kerberos for authentication.



  • @bdew said:

    2) If someone gets those hashes, it means your system is either stolen or already r00ted and you're fucked anyway; anything that isn't encrypted is accessible to the attacker.

    But what if you also use your Windows password as your TDWTF password? Then what you really care about is still protected by the hash.



  • @Spectre said:

    @bdew said:
    2) If someone gets those hashes, it means your system is either stolen or already r00ted and you're fucked anyway; anything that isn't encrypted is accessible to the attacker.

    But what if you also use your Windows password as your TDWTF password? Then what you really care about is still protected by the hash.

     

    Unless TDWTF uses the same hash function, no.

    When brute forcing hashes you are generally finding collisions; getting the exact password is much harder (you'd need to find ALL collisions, and the birthday paradox doesn't apply) and mostly impossible to verify. So if your Windows password is "sex", the attacker will get "ASD12ma9s!@3", which hashes to the same value using NTLM but will be useless on TDWTF because it uses MD5 or whatever. That is also the reason why your encryption keys and other stuff protected by DPAPI are still safe under those conditions.

    Now, if the attacker finds your password with a dictionary attack, it might actually be the exact same password, but then TRWTF is you.
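    The "different hash function" point is easy to demonstrate: the same password produces unrelated digests under different algorithms, so a value that matches under one is useless under another. MD5 and SHA-1 here stand in for whatever NTLM and the forum actually use:

```shell
# Same password, two hash functions, two unrelated digests.
pw='sex'
md5=$(printf '%s' "$pw" | md5sum | cut -d' ' -f1)
sha1=$(printf '%s' "$pw" | sha1sum | cut -d' ' -f1)
echo "MD5:  $md5"
echo "SHA1: $sha1"
# A string that collides with "sex" under one of these almost certainly
# does not collide with it under the other.
```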



  • @bdew said:

    Password hashes:

    Windows uses 2 hashes to store your password in its local security database: LM (inherited from IBM, based on DES) and NTLM (based on MD4). Both are generally very weak.

    Unixes traditionally used a readable-by-anyone /etc/passwd file with crypt() hashes (DES), which in most modern distributions is replaced by /etc/shadow, which is not readable by normal users and can use many different hashes (mostly some salted form of MD5/SHA1/SHA256).

    Now, wait, according to that Wiki article (unless I misread it), NTLMv2 is used for local security.

    @wikipedia said:

    NTLM is still used in the following situations:

    * No Active Directory domain exists (commonly referred to as "workgroup" or "peer-to-peer").

    It then says:

    @wikipedia said:

    In Windows Vista and above, neither LM or NTLM are used by default[citation needed]. NTLM is still supported for inbound authentication, but for outbound authentication a newer version of NTLM, called NTLMv2, is sent by default instead.

    Which implies to me that while Windows 2000 and up supported NTLMv2, it wasn't forced on users until Vista. Note that NTLMv2 existed in Windows 2000 and XP, but apparently wasn't used by default? I don't know; the wording is confusing, and being Wikipedia it's probably all wrong anyway.

    In any case, I'm not seeing any "OMG Microsoft sux!!!" around this issue.

    Edit: and of course all of this is moot in the Dirty Thug at the Bus Station scenario.



  • I'll agree that there should be an option during the configuration (say, when you're setting your password) to turn auto-login on or off, though, since most users are on desktop machines and most users maintain a low security profile, defaulting to off makes sense.

    You'll also note that in Windows Vista and earlier (sorry, I'm not yet running 7), Auto-Login and Require Password on Wakeup are not one and the same option and must be changed separately. In fact, this is also a good usability feature; a setting should control one and only one thing. Yes, it would be nice if there was a pretty little link to the related options, but that relationship isn't clear in Vista Home's User management either.

    I'll absolutely agree that the firewall defaulting to off is a bad thing.

    As for blakeyrat's #4, I'll simply point out that you're being foolish. Don't use the Dock for launching apps unless you run maybe one or two VERY often - use the keyboard shortcuts. Command+Q to quit apps; I set Option+Space to enable Spotlight and launch all my apps from Spotlight. Oh, and it's Command+Tab to cycle between apps. Sadly, in a potential usability fail, you have to enable the Option+Space keyboard shortcut in Spotlight's Preferences panel on 10.6.



  • @Master Chief said:

    @blakeyrat said:

    @Master Chief said:
    Lots of things, mostly small nitpicky stuff that Microsoft just refuses to address.  The way Windows stores its passwords, for instance, is downright awful (essentially the same since Win2K).

    Well, ok, but that's not "The Unix Design," that's "some particular Linux distros have improved this aspect".

     

    Well yes, but the universal point is that they did address it, both in open source and in enterprise Linux. Not sure about OS X. Microsoft has not.

    1. Modularity: shadow came into effect without major application breakage, and admins can also select which hash algorithm to use on passwords; if the strong one they want isn't available, they have the option of upgrading only that part of the login/authentication system.
    2. Modularity: a UNIX system can be more readily stripped down; what isn't present can't be used against it.
    3. Well-defined directories and paths for everything from the start, meaning you generally don't get people shoving binaries into /etc/ or configuration files into /bin/.
    4. Modularity: the tool chain is designed to be used in parts, meaning building quick security tools is simple.
    5. It's a developer/admin-friendly system, which means those types of people are more likely to do useful things there (e.g. Qubes).
    6. Modularity: certain parts of the security chain actually support plug-ins and can thus help lock the system down in ways the devs of that system didn't anticipate (e.g. PAM).
    7. Online documentation: even if it isn't always complete, it's a hell of a lot easier to get at.
    8. Everything is a file, usually text. No need to open a heavy, often restricted GUI to look at certain settings; thanks to #4 this is much more useful than it would be alone.
    All of these things can be attributed to most implementations of UNIX, and are in some way attached to a core UNIX philosophy. Other things that aren't actually "UNIX philosophy" but are present in all UNIX systems I've dealt with (Linux, FreeBSD, OpenBSD):
    1. Centralized logging (PCI/DSS requirement FYI) is part of most syslogds
    2. Package management: everything is upgraded from a central, vendor-supported location
    3. Authentication with keys is easy
    4. The primary remote administration interface (ssh) can be run in batch mode, meaning an admin can quickly whip up a script that connects to a large number of servers and runs commands on them, and they'd still be able to see the full output if they wanted/needed to.
    5. Most scheduling daemons mail root the output of any jobs; every distro I've dealt with uses this to have security checks regularly emailed to admins
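    The ssh batch-mode point above can be sketched like this. The host names are hypothetical, and BatchMode makes ssh fail instead of prompting for a password, so the loop never blocks on input:

```shell
# Run one command on a list of servers, keeping the full output visible.
hosts='web1 web2 db1'                       # hypothetical host names
for h in $hosts; do
    ssh -o BatchMode=yes -o ConnectTimeout=5 "$h" 'uptime' \
        || echo "$h: unreachable"
done
```

    The same loop shape works for pushing a fix to every server during an incident, which is the scripting advantage being claimed here.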


  • @davem said:

    As for blakeyrat's #4, I'll simply point out that you're being foolish. Don't use the Dock for launching apps unless you run maybe one or two VERY often

    Isn't that what the taskbar is FOR?



  • Many of the items you list only apply to managed networks, making them kind of irrelevant to the question in this thread. So I'm glossing over points...

    Also a lot of the points, NT-in-general has, although NT-in-the-form-of-desktop-Windows does not. (For example, point 2.)

    @Lingerance said:

    Well-defined directories and paths for everything from the start, meaning you generally don't get people shoving binaries into /etc/, or configuration files into /bin/

    NT has that.

    @Lingerance said:

    It's a developer/admin friendly system, which means those types of people are more likely to do useful things there (eg: Qubes).

    It's definitely geek-friendly, when you define geek as "the type of person who likes open source shit." (That is, the Slashdot definition.) But that's kind of a circular definition.

    Despite that, Windows has far better tools for actual software developers, that is, people building software for other people. And probably for admins as well, but I can't really speak to that.

    In any case, I'd phrase it differently: they're more likely to *start* projects there. They won't bother actually making the product production-quality unless Microsoft announces that they're working on the same feature. At which time, the project will gain dozens of developers and be production-quality in short order.

    @Lingerance said:

    Centralized logging (PCI/DSS requirement FYI) is part of most syslogd's

    NT has that.

    @Lingerance said:

    Package management: everything is upgraded from a central, vendor-supported location

    NT can have that, if you're on a network that implements it.

    @Lingerance said:

    Authentication with keys is easy

    I don't know what this means... physical or virtual keys? NT does fine with thumbprint scanners, RFID tags/cards, and SecurID keyfobs.



  • @blakeyrat said:

    NT has [Well-defined directories and paths for everything from the start].
    Not really, thanks to backwards compatibility. Either way, it isn't respected by the vast majority of applications, making this a stronger point on *NIX.

    @blakeyrat said:

    It's definitely geek-friendly, when you define geek as "the type of person who likes open source shit." (That is, the Slashdot definition.) But that's kind of a circular definition.
    I didn't say "geek friendly", I said developer and admin friendly: there is a wide variety of tools (for admins) and languages (for devs) that do their task well, and it's easy to get something that's needed done. The main point of this is the ability to make security tools with very little effort, whereas a fair amount of documentation diving has to be done on Windows to find the proper hooks. Existing CLI tools can be wrapped by new ones (or even GUI ones); although this sounds incredibly hackish, it means that an admin/dev can make the sec tool and move on. In the Windows environment there's a much thicker line between a dev and an admin, and creating tools to help auditing gets to be more tedious than just doing everything manually with point and click.

    Example: I want to see what users tried to access a specific server, and how many times they did that in a single day. On Windows that'd pretty much be a full-blown app (read the Event Log, count each login attempt on each day, and print it out) with API calls that I'd most likely have to look up, versus being able to just read the log and run it through grep+sort+uniq+maybe sed, or run it through awk. Once I have the command that does that, I have the option of making it into a reusable script, and I could further limit it to just listing who tried logging in and compare that against a file stating who is actually supposed to be able to access the server. Maybe I want something to happen when this happens; maybe I want this emailed to me in a report. Each increment of the script being more useful takes a small amount of time to just write and test, versus doing it with more and more API calls (which again I'd have to look up).
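    The pipeline described above might look like this on a made-up auth log; the log format and entries are invented for the example:

```shell
# Fake auth log: one "DATE USER RESULT" line per login attempt.
log=$(mktemp)
cat > "$log" <<'EOF'
2010-03-01 alice ok
2010-03-01 bob fail
2010-03-01 alice ok
2010-03-02 bob ok
EOF
# Who tried to access the server, and how many times, per day:
# uniq -c prefixes each unique "date user" pair with its count.
cut -d' ' -f1,2 "$log" | sort | uniq -c
rm -f "$log"
```

    Each refinement (filtering against an allowed-users file, mailing the report) is one more stage bolted onto the same pipe, which is the point being made.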



    Another point along these lines is handling of a break-in. Once you've discovered the break-in, you'd take actions to prevent it; on Windows you'd most likely end up doing this through the GUI, and if the change requires actions that can't be pushed through AD then you get to either: 1) figure out how to script it and push that script to all the servers, or 2) do the exact same actions on all servers. I've had training for working with Windows servers; a very small amount of that covered how to do stuff in the CLI (this was from official MS Press books), the rest was through the GUI, versus my Linux+ course which did virtually everything through the CLI. Which training is going to produce admins that waste time doing something that could've been automated? If the admin has to spend time doing something they shouldn't have to, it means they're spending less time doing something more useful. This could have security considerations.

    The only things an admin should really be doing that are extremely repetitive are manual spot audits on servers, testing (the effects of their actions), and reading logs.

    @blakeyrat said:

    Despite that, Windows has far better tools for actual software developers, that is, people building software for other people. And probably for admins as well, but I can't really speak to that.
    Yet those very same tools are not good for things that are very small and often get done with a single file. An IDE is only good for the languages it supports; otherwise it's just a fancy text editor (which may or may not be able to do useful things when working with a cmd.exe script).

    @blakeyrat said:

    @Lingerance said:
    Package management: everything is upgraded from a central, vendor-supported location
    NT can have that, if you're on a network that implements it.
    Through WSUS and pushing MSIs through a GPO, yes, but those are two different systems and they don't share benefits with each other. My understanding of the GPO method is weak; however, last I checked it doesn't really hook into anything that makes sure everything is up to date (e.g. a compliance scanning tool would check to make sure you're not running anything that's ancient).

    @blakeyrat said:

    @Lingerance said:
    Authentication with keys is easy
    I don't know what this means... physical or virtual keys? NT does fine with thumbprint scanners, RFID tags/cards, and SecurID keyfobs.
    Virtual keys. I was referring to SSH here.

    @blakeyrat said:

    NT has [centralized logging].
    Not until WS2008/Vista. This is one of the things Windows didn't really do until late in the game, and as such it gets a bad reputation in the server environment. Kudos to them for finally allowing it, and kudos for making WS2003 get it if you have a WS2008 or Vista machine on the network, but why take so long? Another reason *NIX systems give a better impression of being secure: Microsoft has historically been very weak on the security side, or was at least perceived that way.



  • @Lingerance said:

    @blakeyrat said:
    It's definitely geek-friendly, when you define geek as "the type of person who likes open source shit." (That is, the Slashdot definition.) But that's kind of a circular definition.
    I didn't say "geek friendly" I said developer and admin friendly.

    Yah, I was contesting your assertion, not repeating it.

    @Lingerance said:

    Being that there are a wide variety of tools (for admins) and languages (for devs) that do their task well and it's easy to get something that's needed done.

    ... as opposed to Windows?

    @Lingerance said:

    Example: I want to see what users tried to access a specific server, and how many times they did that in a single day. On Windows that'd pretty much be a full-blown app (read the Event Log, count each login attempt on each day, and print it out) with API calls that I'd most likely have to look up, versus being able to just read the log and run it through grep+sort+uniq+maybe sed, or run it through awk. Once I have the command that does that, I have the option of making it into a reusable script, and I could further limit it to just listing who tried logging in and compare that against a file stating who is actually supposed to be able to access the server. Maybe I want something to happen when this happens; maybe I want this emailed to me in a report. Each increment of the script being more useful takes a small amount of time to just write and test, versus doing it with more and more API calls (which again I'd have to look up).

    I would maintain there's no difference, difficulty-wise, between doing this on a Unix-alike, and doing it in Windows using PowerShell or a .net app.

    Obviously, if you don't know Windows, you'd have to look up the API calls. But, hey guess what? If you don't know Unix, you'd have to look up the API calls there, too. So you're basically arguing that Unix is superior because you're more familiar with it... not a very compelling argument to me.

    @Lingerance said:

    I've had training for working with Windows servers; a very small amount of that covered how to do stuff in the CLI (this was from official MS Press books), the rest was through the GUI,

    Until recently, Windows didn't have great CLI tools for network administration. That said, JScript/VBScript wasn't awful, and anything you could do in the GUI you could do there, also.

    (BTW, Microsoft Press is a publisher; just because a Microsoft Press book tells you to use the GUI doesn't mean a CLI interface doesn't exist. Similarly, Microsoft publishes Halo games, that doesn't mean they have a warehouse somewhere full of power armor and starships. Just FYI.)

    @Lingerance said:

    versus my Linux+ course which did virtually everything through the CLI. Which training is going to produce admins that waste time doing something that could've been automated?

    You're not considering the possibility that your Windows training was crap. Which, if it didn't cover JScript, VBScript, or PowerShell, it likely was.

    @Lingerance said:

    Yet those very same tools are not good for things that are very small and often get done with a single file.

    ... uh, if you say so. Want to back-up this sentence with some kind of evidence, or an example?

    @Lingerance said:

    An IDE is only good for the languages it supports; otherwise it's just a fancy text editor (which may or may not be able to do useful things when working with a cmd.exe script).

    Why the holy shit would you be working with a cmd.exe script? Is it 1987 in your timezone? Yes, yes, your training in Windows administration definitely was shit.

    @Lingerance said:

    @blakeyrat said:
    I don't know what this means... physical or virtual keys? NT does fine with thumbprint scanners, RFID tags/cards, and SecurID keyfobs.
    Virtual keys. I was referring to SSH here.

    Last time I set it up it was anything but easy, it took fucking ages. Not to say it's easy in Windows, but it's certainly not easy in Unix unless something has changed very recently.

    Once again, "person who's been working with Unix for 15 years can do it quickly" != "easy".



  • @blakeyrat said:

    @Lingerance said:
    An IDE is only good for languages it supports, otherwise it's just a fancy text-editor (which may or may not be able to do useful things when working with a cmd.exe script).

    Why the holy shit would you be working with a cmd.exe script? Is it 1987 in your timezone? Yes, yes, your training in Windows administration definitely was shit.

    I think Unix users see cmd as the default shell and go "cmd is to Windows as bash is to Unix, therefore cmd scripts are to Windows as shell scripts are to Unix!".



  • @MiffTheFox said:

    @blakeyrat said:
    @Lingerance said:
    An IDE is only good for languages it supports, otherwise it's just a fancy text-editor (which may or may not be able to do useful things when working with a cmd.exe script).

    Why the holy shit would you be working with a cmd.exe script? Is it 1987 in your timezone? Yes, yes, your training in Windows administration definitely was shit.

    I think Unix users see cmd as the default shell and go "cmd is to Windows as bash is to Unix, therefore cmd scripts are to Windows as shell scripts are to Unix!".

    I think you're right, but how could you think that after (supposedly) getting instruction in admin-ing Windows? That's the part that threw me.



  • @blakeyrat said:

    I would maintain there's no difference, difficulty-wise, between doing this on a Unix-alike, and doing it in Windows using PowerShell or a .net app.

    Obviously, if you don't know Windows, you'd have to look up the API calls. But, hey guess what? If you don't know Unix, you'd have to look up the API calls there, too. So you're basically arguing that Unix is superior because you're more familiar with it... not a very compelling argument to me.

    My point was you can wrap a program around the CLI program that can do what you want, and there's an abundance of those.

    @blakeyrat said:
    Until recently, Windows didn't have great CLI tools for network administration. That said, JScript/VBScript wasn't *awful*, and anything you could do in the GUI you could do there, also.

    Yeah, PowerShell came out while I was at school, so it wasn't covered.

    @blakeyrat said:
    You're not considering the possibility that your Windows training was crap. Which, if it didn't cover JScript, VBScript, or PowerShell, it likely was.

    The first two are programming languages, which don't really fit in an admin course; the third came out while I was in school, so there wasn't curriculum to cover it, and by the time I'd learned of it I was already disillusioned by the shitty admin GUIs and didn't really put much effort into learning it (I was actually neutral towards Windows before school). Also, I am aware much of my training was crap (we had two identical courses with similar names; this wasn't uni, so we didn't pick/choose individual classes, and everyone shared the same schedule).

    @blakeyrat said:
    (BTW, Microsoft Press is a publisher; just because a Microsoft Press book tells you to use the GUI doesn't mean a CLI interface doesn't exist. Similarly, Microsoft publishes Halo games, that doesn't mean they have a warehouse somewhere full of power armor and starships. Just FYI.)

    Seeing as those books are part of the official training towards an MCSE, not covering the CLI is rather stupid of them, isn't it? I was aware of the CLI beforehand and made a good effort to learn it as much as I could, but generally didn't like how I couldn't filter the results with a built-in tool.

    @blakeyrat said:
    Last time I set it up it was anything but easy, it took fucking ages. Not to say it's easy in Windows, but it's certainly not easy in Unix unless something has changed very recently.

    Really? It's two commands: ssh-keygen and ssh-copy-id. I learned about them when I double-tab completed on "ssh".

    @blakeyrat said:
    Why the holy shit would you be working with a cmd.exe script? Is it 1987 in your timezone? Yes, yes, your training in Windows administration definitely was shit.

    Why wouldn't I be? cmd.exe is the shell. If I do something in the shell I can repeat it later by shoving the command in a script, then the script grows into something increasingly useful over time. It's probably a good thing they replaced it with PowerShell if it's really as shitty as you claim, but at the time I wasn't given alternatives. So I'm really curious: what should I have learned?
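    For reference, the two-command key setup mentioned above looks roughly like this. The key path is a throwaway demo location, and "user@server" is a placeholder; ssh-copy-id also assumes the remote side still accepts a password login once:

```shell
# 1) Generate a key pair (no passphrase here for brevity; use one in practice):
dir=$(mktemp -d)
ssh-keygen -t rsa -N '' -f "$dir/id_rsa" -q
ls "$dir"     # id_rsa (private) and id_rsa.pub (public)
# 2) Install the public key on the server ("user@server" is hypothetical):
#    ssh-copy-id -i "$dir/id_rsa.pub" user@server
rm -rf "$dir"
```

    After step 2, ssh to that host authenticates with the key instead of a password, which is the "easy" being claimed.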

    Anecdote time. This isn't an argument, this is just me ranting about what I remember from trying PowerShell; details may be wrong. I actually did try PowerShell. I'm told it's really awesome; unfortunately, when I tried it, it really sucked, mostly because some Windows fan was all like "Yeah, it's like a UNIX shell in Windows, but done much better" (the bolded part is the part I took literally and shouldn't have). So of course I'm all like, maybe MS actually did something right! I download it, and it's all like "You need .NET 2.0, you have 1.1". I'm like OK, I go get .NET 2.0 and try to install it, and it's all like "I'm installed already." The PowerShell installer is all like "You need .NET 2.0". I'm all like fuckit, I'll try this at school. So at school I get .NET 2.0 and PowerShell, and .NET 2.0 is all like "You need this other thing." The other thing is weird gibberish like most Windows DLLs, so I'm hoping the .NET 2.0 page will be nice and actually link to that thing. Nope. Said thing required some other thing to install, but actually linked to it, and that was the last requirement. After getting everything up and running I type "ls"; it outputs whatever the default output is. I want to change it, so "ls -1"; it does the exact same thing. So does "dir". Curious what the options are, I try "ls -h", "ls -help", "ls --help", "ls /?", "ls -?", "ls /h", "ls /help", all of which do nothing to change the output. So "info ls", "help ls", "man ls", none of which actually give the options for ls (apparently they fixed this). Eventually I discover a list of built-in commands and play with a few of them. Eventually class starts; I ask the instructor if he's played with PowerShell yet, he hasn't, I get distracted by the class and forget about PowerShell. So this brings up a few points: why can't Windows Update or Add/Remove Programs grab extras like PowerShell and Tweak UI themselves? Why do the installers not try to download the _free_ requirements, and why don't the download pages link to their requirements? Why didn't "help <command>" actually give info on the command, like every other CLI? Either way, if I get a Windows computer I'll try PowerShell again.


  • @Lingerance said:

    My point was you can wrap a program around the CLI program that can do what you want, and there's an abundance of those.

    Same as on Windows. Anything you can do in the GUI, you can do in a CLI program via CMD.exe. You can also do it through JScript/VBScript, or a C API.

    @Lingerance said:

    The first two are programming languages, which don't really fit in an admin course,

    VBScript/JScript (I do the slashy because they're really the same language with different syntax) are how you admin Windows. They're not programming languages but scripting languages, like... hmm... bash maybe!?

    Christ, who did you take this admin course from? Fire them. Out of a cannon. Into the sun.

    @Lingerance said:

    Seeing as those books are part of the official training towards an MCSE, not covering the CLI is rather stupid of them, isn't it? I was aware of the CLI beforehand and made a good effort to learn it as much as I could, but generally didn't like how I couldn't filter the results with a built-in tool.

    If you think CMD.exe is "the" CLI, then you're so wrong on base principles that it's really hard to state. CMD.exe exists (mostly) for backwards-compatibility only.

    @Lingerance said:

    Really? It's two commands: ssh-keygen and ssh-copy-id. I learned about them when I double-tab completed on "ssh".

    "Yeah! It's really easy if you ignore 57 of the steps! And that 99% of the population don't even know how to use a terminal! SOOO EASY!" I hate Linux users. (Ironically, I have no particular hatred of Linux... just its users.)

    @Lingerance said:

    Why wouldn't I be? cmd.exe is the shell.

    It's a shell. Explorer is also a shell. Hell, Word's fucking "Save As..." dialog is a shell. (It doesn't rely on Explorer.) It's not "the" shell.

    @Lingerance said:

    If I do something in the shell I can repeat it later by shoving the command in a script, then the script grows into something increasingly useful over time. It's probably a good thing they replaced it with PowerShell if it's really as shitty as you claim, but at the time I wasn't given alternatives. So I'm really curious: what should I have learned?

    JScript/VBScript. If that wasn't obvious from context. At least if you're interested in admin-ing Windows. At this point in time, you could use PowerShell or probably even .net for the purpose, as well.



  • First I want to say that if, on my Mac, a program tries to access the internet or open listening ports, a dialog box pops up asking me if I want to allow that.

    I agree that the markers to see if an app is running are hard to see. As I understand it, what Apple really wanted to do is have an OS where it makes no difference if an app is already running or not. They've gone much further with this in their iPhone OS. But the paradigm is a bit flawed.

    As to Unix vs. Windows: what I don't like about Unix is that if you don't know what the command you're looking for is called, you're out of luck.

    What I do like, is that the tools are more mature in the sense that what worked 15 or 20 years ago, usually still works today. If I have a bash script from 1994, it will probably still work and do something useful; a .BAT file from that era is not likely to be of any use. Simple things like checking if a certain program is running, and otherwise restarting it, are simple to do, and work the same as they did 20 years ago. Same goes for parsing some logging or whatever. Because nearly everything is a text file or a pipe, it's easy to string the old tools together in new ways.
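    The "check if a certain program is running, and otherwise restart it" case above can be sketched in a couple of lines of shell (the daemon name and path here are hypothetical):

```shell
# Restart a process if it isn't running; "mydaemon" is a made-up example name.
if ! pgrep -x mydaemon > /dev/null; then
    /usr/local/bin/mydaemon &    # hypothetical path to the binary
fi
```

    Dropped into cron, a snippet like this works today much as it did 20 years ago.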

    I've basically left the Windows platform in 2005, worked on Linux and HP-UX for a couple of years, and I've been using a Mac as a "pretty Unix" for the last couple of years. Our IT department pre-configured the machine, so I don't know what its setup defaults are. If they do turn the Firewall off by default, I would find that pretty shocking in this day and age.

    As for the differences between 10.4 and 10.6, I think the biggest for me are Time Machine, the multi-touch tech and having virtual desktops integrated into the core OS (Spaces).



  • @RogerWilco said:

    I agree that the markers to see if an app is running are hard to see. As I understand it, what Apple really wanted to do is have an OS where it makes no difference if an app is running already or not.

    That's fine as long as you don't have any operations that succeed or fail based on whether the app is running. Like, for example, dragging an icon out of the taskbar. You either make which apps are running not matter, or you don't, and right now Apple's in some crazy no-man's-land right in the middle.

    @RogerWilco said:

    As for the differences between 10.4 and 10.6, I think the biggest for me are Time Machine, the multi-touch tech and having virtual desktops integrated into the core OS (Spaces).

    Time Machine is just a fancy word for Shadow Copy, a feature Windows has had since Windows 2000. Except Microsoft's implementation is better.



  • @blakeyrat said:

    Now, wait, according to that Wiki article (unless I misread it), NTLMv2 is used for local security.
     

    Not really; it talks about NTLM(v2) authentication being used over the network when the system is not part of an (AD) domain.

    The terminology is somewhat confusing: there's a hash function, NTLM (AKA the "NT hash"), based on MD4 and considered weak; it's used for local password storage. Then there's an old authentication protocol with the same name that uses it insecurely, and a newer one (NTLMv2) which combines it with a challenge/response mechanism to keep it secure.

    And then there is Active Directory, which replaces all that mess with a different and secure mess (Kerberos plus LDAP) for authentication. On a funny side note, the Samba project has been trying to get it right for the last 10 years and still doesn't have anything fully working.

    Anyway, I'm not really seeing any serious WTFery from Microsoft in all this; all this legacy stuff is kept to provide backwards compatibility with 20-year-old stuff (LM stands for LAN Manager, a Microsoft/3Com network OS from the late '80s) and is turned off by default on modern systems.

    PS: Come to think of it, I was wrong comparing NFSv4+Kerberos on unixes to CIFS+NTLMv2: Kerberos requires the client system to have a trust relationship with the server/domain to work and doesn't actually need to verify credentials; that's how AD works.

    If you need to provide access to NFS from untrusted computers you are still stuck with the old (non)security model of NFSv3... so that is definitely TRWTF in this discussion: Windows turns out to be more secure than *nix! (For this specific use case.)



  • @bdew said:

    If you need to provide access to NFS from untrusted computers use (S)FTP
    FTPTFY.



  • @Xyro said:

    @bdew said:

    If you need to provide access to NFS from untrusted computers use (S)FTP
    FTPTFY.

     

    Because you totally can't do that from Windows, and because (S)FTP is totally a network file system with locking, random-access writes and stuff.



  • You are now both on my favorite people list.

     

    It took me a moment to get that one. I shall now blatantly rip it off whenever the opportunity arises.



  • @RogerWilco said:

    having virual desktops integrated in to the core OS

    For what it's worth, core OS support for multiple desktops has existed in Windows NT since NT 3.1 (see CreateDesktop/OpenDesktop and related APIs). Support for multiple interactive user sessions has existed since the NT 4 (?) Terminal Server and is integrated into XP.

     



  • @Durnus said:

    @WhiskeyJack said:

    Incidentally, there's a setting you can use to turn off the "3-dimensional icons sitting on a table" effect, which might make the little fuzzy purple dot a little easier to see.

     But yeah, I liked the plain black triangles better too.

     

    Here you go: http://www.silvermac.com/2007/leopard-dock-with-black-triangle/

    I have a mac (originally bought it for iDevice programming, don't use it much) and I turned off the table effect early on. Not using the black triangles, but I couldn't find the article that shows how to turn off the table so I decided just to post this one.

     EDIT: Although just for reference, simply turning off the 3d table effect does a *LOT* to help dock readability. It puts a white dot (IIRC) above running applications, and nothing above nonrunning applications. (And the background is black, so it shows up really nicely.)

    To turn the 3d glassy dock back into a nice, high-visibility 2d black dock with white dots, open a Terminal window and type

    $ defaults write com.apple.dock no-glass -bool YES
    $ killall Dock

    Dock will respawn immediately in 2d style.

    I did this first thing after getting 10.5 and completely forgot about it, so my first reaction to blakeyrat's dock rant was, WTF?
    I had to google to remind myself what the heck I did to make the egregious shiny go away.



  • @Master Chief said:

     @blakeyrat said:

    I've never been able to figure out what makes Unix so inherently secure anyway over Windows NT-based OSes. (Single-user OSes, like Windows 98 and Mac Classic-- duh, but NT?) Unix doesn't seem to do anything that NT doesn't, except it has a less precise permissions model. The only real difference is that NT is hugely popular for home users, and Unix-like OSes aren't.

    Lots of things, mostly small nitpicky stuff that Microsoft just refuses to address.  The way Windows stores its passwords, for instance, is downright awful (essentially the same since Win2K).  CentOS uses salted MD5 hashes, which are exceedingly hard to reverse.

    Vista/2008 and later don't store NTLM hashes by default, if that's your beef.
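    For context, the "salted MD5" scheme mentioned above is the md5-crypt format used for /etc/shadow entries. A quick way to see what one looks like (the salt "ab" and the password "secret" are arbitrary examples):

```shell
# Produce an md5-crypt ("$1$salt$hash") string like the ones CentOS stores
# in /etc/shadow. The salt and password here are arbitrary illustrations.
openssl passwd -1 -salt ab secret
```

    The leading "$1$" identifies the algorithm and "ab" is the salt, so two users with the same password still get different hashes.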

  • :belt_onion:

    @Lingerance said:

    Modularity, a UNIX system can be more readily stripped down, what isn't present can't be used against it.

    I read ahead until the end of this thread before responding, so I know blakeyrat already proved you know shit about Windows. But he forgot to mention that you should really look into Windows Server 2008's Server Core. From MSDN

    @MSDN said:


    Server Core is a minimal server installation option for computers running on the Windows Server 2008 R2 operating system. Server Core provides a low-maintenance environment capable of providing core server roles.

    (snip)

    To accomplish its core, critical roles, the Server Core installation option only installs the binaries required by its supported roles. For example, the Explorer shell is not installed with Server Core. Instead, the Server Core user interface is the command prompt.

    When configured, Server Core can be managed locally and remotely using Windows PowerShell, by using a terminal server connection from a command line, as well as remotely, by using the Microsoft Management Console (MMC) or command line tools that support remote usage.

    Server Core supports the following server roles:

    • Active Directory (AD)
    • Active Directory Lightweight Directory Services (AD LDS)
    • DHCP Server
    • DNS Server
    • File Services
    • Hyper-V
    • Print Services
    • Streaming Media Services
    • Web Server (IIS)
    • Active Directory Certificate Services

    Why would a stripped-down Unix file server be any more secure than a Windows Server Core with the "File Services" role?

    And stop using cmd.exe, for crying out loud. Everything is done with PowerShell nowadays. Maybe I should start comparing Windows 7 with the Classic Mac. Classic Mac really sucks!



  • @blakeyrat said:

    @RogerWilco said:
    I agree that the markers to see if an app is running are hard to see. As I understand it, what Apple really wanted to do is have an OS where it makes no difference if an app is running already or not.

    That's fine as long as you don't have any operations that succeed or fail based on whether the app is running. Like, for example, dragging an icon out of the taskbar. You either make which apps are running not matter, or you don't, and right now Apple's in some crazy no-man's-land right in the middle.

    As I said, I think it's currently flawed. I think it will never really work properly, as it requires infinite resources.

    @blakeyrat said:

    @RogerWilco said:
    As for the differences between 10.4 and 10.6, I think the biggest for me are Time Machine, the multi-touch tech and having virtual desktops integrated into the core OS (Spaces).

    Time Machine is just a fancy word for Shadow Copy, a feature Windows has had since Windows 2000. Except Microsoft's implementation is better.

     

    Never heard of it. I only have access to Windows XP, where do I find it in there? I would certainly give it a try as I really like it in OSX. I would really like to make automatic back-ups of my documents and images on my windows laptop onto my NAS as well.

    @alegr said:

    @RogerWilco said:

    having virual desktops integrated in to the core OS

    For what it's worth, core OS support for multiple desktops has existed in Windows NT since NT 3.1 (see CreateDesktop/OpenDesktop and related APIs). Support for multiple interactive user sessions has existed since the NT 4 (?) Terminal Server and is integrated into XP.


     Again, where do I find this in Windows XP?

     It seems there have been all these options in Windows that somehow were never advertised in such a way that I could find them, and I have thus been missing them all these years. As I still use my old Windows laptop as well (Mac is for work, old laptop for private stuff), maybe you can enlighten me? I've googled these terms based on your messages but it's not clear to me.

     



  • @RogerWilco said:

    Never heard of it. I only have access to Windows XP, where do I find it in there? I would certainly give it a try as I really like it in OSX. I would really like to make automatic back-ups of my documents and images on my windows laptop onto my NAS as well.

    I can't help you with XP, I haven't touched it since Vista came out and I don't have any XP machines around here to reference. It probably has to be the Professional version, though... I can tell you that much. (And it's possible it only works in XP if you're in an AD domain, like in Windows 2000... I don't think I ever tried it on a non-AD XP box.)

    XP has a normal backup tool for your documents and images, right? The one in Vista and Windows 7 is quite nice, but I never used the XP version of same so I can't speak for it. (If all else fails, you can shove a robocopy line into a batch file, and make a scheduled task to run it.)



  • @bjolling said:

    -snip-

        Why would a stripped-down Unix file server be any more secure than a Windows Server Core with the "File Services" role?

        And stop using cmd.exe, for crying out loud. Everything is done with PowerShell nowadays. Maybe I should start comparing Windows 7 with the Classic Mac. Classic Mac really sucks!

    We're talking about Unix here, not classic MacOS. I agree that was bad, but so was MS-DOS and a lot of other stuff out there.

    I've been able to do on Unix (Linux, HP-UX, IRIX, BSD, Darwin) what Windows Server Core and PowerShell seem to offer since 1994. I think it could be done before that, but 1994 was the first time I used Unix. One of the big advantages of the Unix flavours as a server OS, is that the things you learned 15 years ago still work and are valid. And the tools are usually flexible enough that you can still meet any new challenge with them that rears its ugly head.

    Windows is a desktop OS that's slowly trying to become a server OS, Unix is going the other direction. I think neither have fully bridged that gap yet. OSX and the latest Windows Server version might be getting close though.

    How many of the tools you used on Windows NT 3.1 are you still using today? cmd.exe is probably the only one. Maybe minesweeper?



  • @blakeyrat said:

    @RogerWilco said:
    Never heard of it. I only have access to Windows XP, where do I find it in there? I would certainly give it a try as I really like it in OSX. I would really like to make automatic back-ups of my documents and images on my windows laptop onto my NAS as well.

    I can't help you with XP, I haven't touched it since Vista came out and I don't have any XP machines around here to reference. It probably has to be the Professional version, though... I can tell you that much. (And it's possible it only works in XP if you're in an AD domain, like in Windows 2000... I don't think I ever tried it on a non-AD XP box.)

    But then I am using it at home and only have Windows XP Home. To use it on OSX, I just need to select what I want to sync in Time Machine, and enable Time Machine support on my QNAP TS-210 NAS.

    @blakeyrat said:

    XP has a normal backup tool for your documents and images, right? The one in Vista and Windows 7 is quite nice, but I never used the XP version of same so I can't speak for it. (If all else fails, you can shove a robocopy line into a batch file, and make a scheduled task to run it.)

    I think the backup tool needs to be run manually and only does full backups? I'm not sure, I'll check. Robocopy is another thing I had never heard about. I googled it, and it doesn't work for me, as it can't handle open files.

    I used rsync on my MacBook before I spent 29 euros to upgrade my OSX 10.4; rsync has the same limitations as robocopy, I think. I don't think I can get Professional and an Active Directory server for that? That's why I liked Time Machine: it's a solution that works for home users.



  • @RogerWilco said:

    I've been able to do on Unix (Linux, HP-UX, IRIX, BSD, Darwin) what Windows Server Core and PowerShell seem to offer since 1994.

    Well, ok, but someone making a purchasing decision today only cares about what each system can do today. (And today + 5 years.) What happened in 1994 is completely irrelevant to their decision. (And also to any "which OS is better" debates.)

    @RogerWilco said:

    One of the big advantages of the Unix flavours as a server OS, is that the things you learned 15 years ago still work and are valid.

    Yeah, but one of the big disadvantages is that nothing at all changes if it challenges any of the 15-year experience. Something like PowerShell could never have been made in the Unix culture, because they've all already memorized their shitty-ass CLI a decade ago.

    @RogerWilco said:

    And the tools are usually flexible enough that you can still meet any new challenge with them that rears its ugly head.

    If the challenges involve sorting long text files, yes. (Which, given, for a server is pretty appropriate.)

    @RogerWilco said:

    Windows is a desktop OS that's slowly trying to become a server OS,

    You know all modern Windows are based on NT, right? NT which was a server OS originally?

    @RogerWilco said:

    Unix is going the other direction.

    You're trying to draw a clever parallel here, but it's based on a false premise. Both Windows and Unix started as server OSes. Oh wait, that's false too, since early NT and Unix were also both used for workstations as well. So server/workstation.

    @RogerWilco said:

    How many of the tools you used on Windows NT 3.1 are you still using today? cmd.exe is probably the only one. Maybe minesweeper?

    Who cares? Irrelevant.



  • @RogerWilco said:

    @blakeyrat said:

    XP has a normal backup tool for your documents and images, right? The one in Vista and Windows 7 is quite nice, but I never used the XP version of same so I can't speak for it. (If all else fails, you can shove a robocopy line into a batch file, and make a scheduled task to run it.)

    I think the backup tool needs to be run manually and only does full backups?

    The Backup utility in Vista/Windows 7 runs as a service and only backs up files when you're not using your computer. Again: I can't speak for the one in XP. Edit: Oh, and you can tell it to include/exclude folders if you want, but the defaults are probably fine.

    @RogerWilco said:

    Robocopy is another thing I had never heard about. I googled it, and it doesn't work for me, as it can't handle open files.

    True, it's not perfect, but it's better than nothing. You can use something like SyncBack as well, but I think it also has issues with open files.

    @RogerWilco said:

    I don't think I can get Professional and an ActiveDirectory server for that?

    Well, XP Home is shite, so you should be running Professional anyway. That aside:

    1) Once more I repeat: I can't speak for XP. Maybe it needs an AD, maybe it freakin' doesn't. I don't know. Don't take anything I say about XP on faith.
    2) Windows Home Server is an excellent product; you could consider that. (Although it's probably not worth it if you only have a NAS and a couple of machines.) It sets up an AD, and it's "a solution that works for home users".



  • @blakeyrat said:

    @RogerWilco said:
    How many of the tools you used on Windows NT 3.1 are you still using today? cmd.exe is probably the only one. Maybe minesweeper?
    Who cares? Irrelevant.

    I don't think that's irrelevant. I like the fact that what I learned on Linux/Solaris/HP-UX/SCO in the 90s still holds today. I don't like that every time Windows moves on and becomes better, I have to unlearn old things because they are obsolete or even wrong. Yes, I'm an old fart. Byte me.

    (However, I also like that Linux has in fact moved on and become more usable for a dumb user like me, like Win7/VS 2010 is easier, prettier and more useful than Win 3.1/VB4).



  • @b-redeker said:

    I don't like that every time Windows moves on and becomes better, I have to unlearn old things because they are obsolete or even wrong.

    Look, regardless of how you phrase it or attempt to justify it to me, I'll always believe that stagnation is a bad thing. Microsoft (and Apple's) ability to shake things up is really what I like most about those companies... IMO, the new Office 2007 interface would have been brilliant even if it sucked simply because it got people, once again, thinking about how software should work. Ditto (more or less) with the iPhone's touchscreen interface.

    Anyway, odds are the wrong things you knew were wrong back when you first learned them. (For example, Windows 95/98 programs that wrote data into Program Files.)



  • @blakeyrat said:

    The Backup utility in Vista/Windows 7 runs as a service and only backs up files when you're not using your computer.
    Actually, it can run while you're using the computer without any problem - it uses shadow copy to create a snapshot of a disk, then copies that snapshot to the backup (so it has no problems with open files, even if you're writing to them while the backup is being made).@blakeyrat said:
    1) Once more I repeat: I can't speak for XP. Maybe it needs an AD, maybe it freakin' doesn't. I don't know. Don't take anything I say about XP on faith.
    IIRC (it's been a while since I actively used XP), shadow copy in XP is much more limited than in Vista and up.@blakeyrat said:
    2) Windows Home Server is an excellent product; you could consider that. (Although it's probably not worth it if you only have a NAS and a couple of machines.) It sets up an AD, and it's "a solution that works for home users".
    It doesn't AFAIK - it instead syncs usernames and accounts over computers differently (the accounts are still local).



  • @ender said:

    @blakeyrat said:
    The Backup utility in Vista/Windows 7 runs as a service and only backs up files when you're not using your computer.
    Actually, it can run while you're using the computer without any problem - it uses shadow copy to create a snapshot of a disk, then copies that snapshot to the backup (so it has no problems with open files, even if you're writing to them while the backup is being made).

    Well, the technicalities of how it works are irrelevant to me. The important thing is that it's not going to suddenly start thrashing my disk around while I'm trying to play Global Alliance. And it successfully meets that standard.

    @ender said:

    @blakeyrat said:
    2) Windows Home Server is an excellent product; you could consider that. (Although it's probably not worth it if you only have a NAS and a couple of machines.) It sets up an AD, and it's "a solution that works for home users".
    It doesn't AFAIK - it instead syncs usernames and accounts over computers differently (the accounts are still local).

    Hm... I'm nearly 100% sure it sets up an AD. But maybe you're right, I haven't used it in awhile.



  • @blakeyrat said:

    XP has a normal backup tool for your documents and images, right? The one in Vista and Windows 7 is quite nice, but I never used the XP version of same so I can't speak for it. (If all else fails, you can shove a robocopy line into a batch file, and make a scheduled task to run it.)

    Heh, that's what I do.  I have two 1 TB external drives and every night, Windows fires up a 67 line batch file that copies everything I could ever need to them, alternating every other day so I always have two days worth of backed up data.



  • @Master Chief said:

    @blakeyrat said:

    XP has a normal backup tool for your documents and images, right? The one in Vista and Windows 7 is quite nice, but I never used the XP version of same so I can't speak for it. (If all else fails, you can shove a robocopy line into a batch file, and make a scheduled task to run it.)

    Heh, that's what I do.  I have two 1 TB external drives and every night, Windows fires up a 67 line batch file that copies everything I could ever need to them, alternating every other day so I always have two days worth of backed up data.

    What do the other 65 lines do?



  • @blakeyrat said:

    @Master Chief said:

    @blakeyrat said:

    XP has a normal backup tool for your documents and images, right? The one in Vista and Windows 7 is quite nice, but I never used the XP version of same so I can't speak for it. (If all else fails, you can shove a robocopy line into a batch file, and make a scheduled task to run it.)

    Heh, that's what I do.  I have two 1 TB external drives and every night, Windows fires up a 67 line batch file that copies everything I could ever need to them, alternating every other day so I always have two days worth of backed up data.

    What do the other 65 lines do?

    He never said that all the lines had text on them, could just be blank lines.


  • :belt_onion:

    @RogerWilco said:

    @bjolling said:

    -snip-

        Why would a stripped-down Unix file server be any more secure than a Windows Server Core with the "File Services" role?

        And stop using cmd.exe, for crying out loud. Everything is done with PowerShell nowadays. Maybe I should start comparing Windows 7 with the Classic Mac. Classic Mac really sucks!

    We're talking about Unix here, not classic MacOS. I agree that was bad, but so was MS-DOS and a lot of other stuff out there.

    I've been able to do on Unix (Linux, HP-UX, IRIX, BSD, Darwin) what Windows Server Core and PowerShell seem to offer since 1994. I think it could be done before that, but 1994 was the first time I used Unix. One of the big advantages of the Unix flavours as a server OS, is that the things you learned 15 years ago still work and are valid. And the tools are usually flexible enough that you can still meet any new challenge with them that rears its ugly head.

    Windows is a desktop OS that's slowly trying to become a server OS, Unix is going the other direction. I think neither have fully bridged that gap yet. OSX and the latest Windows Server version might be getting close though.

    Why shouldn't I be allowed to compare Classic Mac to Windows 7? Most Microsoft bashers do exactly the same thing. From this very thread:

    • "Time Machine has no equivalent on Windows XP". WTF? Windows XP is 10 years old. Time Machine didn't exist on the Mac either 10 years ago
    • "cmd.exe sucks compared to *nix CLI". Except that cmd.exe is 15 (?) years old and was made obsolete by PowerShell years ago.
    • "*nix is more modular than Windows Server". Windows Server Core was introduced with Windows Server 2008 and allows the same modularity

    If you really want to compare OSes and functionality, please respect the timelines a bit.

    @RogerWilco said:

    How many of the tools you used on Windows NT 3.1 are you still using today? cmd.exe is probably the only one. Maybe minesweeper?
    Plenty of *nix tools have disappeared as well since the Windows NT 3.1 days. Ten years ago I bought some "newbie guide to installing Red Hat". That book is worthless now, because many tools that are mentioned are not included in the distro anymore. Usually they are replaced with better tools because the old ones had issues. But when Microsoft does the same thing, they are obviously evil.

    Addendum:

    If you really need to work on Windows, why not replace your Windows XP with Windows 7? I've done the same on a 5 year old machine and the result is quite good. Most drivers are available nowadays. The upgrade will give your old PC a second life since Windows 7 is faster and more responsive than XP. It looks nicer and contains a lot of features that I really miss when I'm working on XP (At home I'm on Windows 7 Ultimate).

    If you want to upgrade (instead of wipe/reinstall), it does present a bit of work, because you need to upgrade to Windows Vista first.



  • @bjolling said:

    @RogerWilco said:

    @bjolling said:

    -snip-

        Why would a stripped-down Unix file server be any more secure than a Windows Server Core with the "File Services" role?

        And stop using cmd.exe, for crying out loud. Everything is done with PowerShell nowadays. Maybe I should start comparing Windows 7 with the Classic Mac. Classic Mac really sucks!

    We're talking about Unix here, not classic MacOS. I agree that was bad, but so was MS-DOS and a lot of other stuff out there.

    I've been able to do on Unix (Linux, HP-UX, IRIX, BSD, Darwin) what Windows Server Core and PowerShell seem to offer since 1994. I think it could be done before that, but 1994 was the first time I used Unix. One of the big advantages of the Unix flavours as a server OS, is that the things you learned 15 years ago still work and are valid. And the tools are usually flexible enough that you can still meet any new challenge with them that rears its ugly head.

    Windows is a desktop OS that's slowly trying to become a server OS, Unix is going the other direction. I think neither have fully bridged that gap yet. OSX and the latest Windows Server version might be getting close though.

    Why shouldn't I be allowed to compare Classic Mac to Windows 7? Most Microsoft bashers do exactly the same thing. From this very thread:

    • "Time Machine has no equivalent on Windows XP". WTF? Windows XP is 10 years old. Time Machine didn't exist on the Mac either 10 years ago
    • "cmd.exe sucks compared to *nix CLI". Except that cmd.exe is 15 (?) years old and was made obsolete by PowerShell years ago.
    • "*nix is more modular than Windows Server". Windows Server Core was introduced with Windows Server 2008 and allows the same modularity

    It would be fun to do it the other way around.

    "Unix was modular in 1994!" "Well, ok, but Mac OS and Windows could fucking print WYSIWYG in 1994. Eat that!"



  • @blakeyrat said:

    @bjolling said:

    @RogerWilco said:

    @bjolling said:

    -snip-

        Why would a stripped-down Unix file server be any more secure than a Windows Server Core with the "File Services" role?

        And stop using cmd.exe, for crying out loud. Everything is done with PowerShell nowadays. Maybe I should start comparing Windows 7 with the Classic Mac. Classic Mac really sucks!

    We're talking about Unix here, not classic MacOS. I agree that was bad, but so was MS-DOS and a lot of other stuff out there.

    I've been able to do on Unix (Linux, HP-UX, IRIX, BSD, Darwin) what Windows Server Core and PowerShell seem to offer since 1994. I think it could be done before that, but 1994 was the first time I used Unix. One of the big advantages of the Unix flavours as a server OS, is that the things you learned 15 years ago still work and are valid. And the tools are usually flexible enough that you can still meet any new challenge with them that rears its ugly head.

    Windows is a desktop OS that's slowly trying to become a server OS, Unix is going the other direction. I think neither have fully bridged that gap yet. OSX and the latest Windows Server version might be getting close though.

    Why shouldn't I be allowed to compare Classic Mac to Windows 7? Most Microsoft bashers do exactly the same thing. From this very thread:

    • "Time Machine has no equivalent on Windows XP". WTF? Windows XP is 10 years old. Time Machine didn't exist on the Mac either 10 years ago
    • "cmd.exe sucks compared to *nix CLI". Except that cmd.exe is 15 (?) years old and was made obsolete by PowerShell years ago.
    • "*nix is more modular than Windows Server". Windows Server Core was introduced with Windows Server 2008 and allows the same modularity
    It would be fun to do it the other way around.

    "Unix was modular in 1994!" "Well, ok, but Mac OS and Windows could fucking print WYSIWYG in 1994. Eat that!"

    Unix could do that too (as long as what you see is raw text).



  • @blakeyrat said:

    "Well, ok, but Mac OS and Windows could fucking print WYSIWYG in 1994. Eat that!"

    echo WYSIWYG | lpr


  • @bjolling said:

    @Lingerance said:
    Modularity, a UNIX system can be more readily stripped down, what isn't present can't be used against it.

    I read ahead until the end of this thread before responding, so I know blakeyrat already proved you know shit about Windows. But he forgot to mention that you should really look into Windows Server 2008's Server Core. From MSDN

    Why would a stripped-down Unix file server be any more secure than a Windows Server Core with the "File Services" role?

    Part of my overall point was UNIX had those long ago; those are things that have become part of its image. It's good Windows Server is finally improving, but that doesn't change the fact that Windows has historically had a very shitty security model, and backwards compatibility will fuck anything new they throw on for a while. On top of that, Windows Server is rarely the latest and greatest, and I have no guarantee that I'd even be using Windows Server 2008 machines should I end up somewhere having to deal with WS. (I've worked with large companies; they've had Windows Servers as old as NT, and the bulk of them were 9 years old.) @bjolling said:
    And stop using cmd.exe for crying out loud. Everything is done by PowerShell nowadays. Maybe we should start comparing Windows 7 with the Classic Mac. Classic Mac really sucks!
    So a 4-year-old technology would be on every single Windows Server box I would expect to be using? "Ancient" servers exist (read: older than 4 years), and they occasionally need to be dealt with. Contrast the *NIX shell: ksh/bash are mature, robust and available on pretty much any *NIX server you'd deal with. The original question was "what's so secure about *NIX?"; the things I listed are all things that have been with *NIX for years, and are embedded into the _reputation_ of that family of OSen. What reputation does Windows Server have? The most positive thing I can think of is it works great with Windows desktops in terms of functionality, which is unrelated to security.

    Also, I didn't speak of Windows until the original reply to my post.
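    The ubiquity point can be made concrete: a script restricted to plain POSIX sh constructs runs unchanged under the ksh or bash found on essentially any *NIX server, however ancient. A minimal, hedged sketch (the file names checked are only examples):

```shell
# Deliberately portable POSIX sh: plain [ ] tests, $(( )) arithmetic,
# $( ) command substitution -- no arrays, no [[ ]], no bash/ksh-only
# extensions -- so it behaves the same under ksh88 and a current bash.
count=0
for f in /dev/null /etc/passwd; do
    if [ -r "$f" ]; then
        count=$((count + 1))
    fi
done
echo "readable: $count"
```

    Everything here is in the POSIX shell spec, which is exactly the kind of stability the *NIX reputation rests on.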


  • @Lingerance said:

    It's good Windows Server is finally improving, but that doesn't change the fact that Windows has historically had a very shitty security model, and backward compatibility will fuck anything new they throw on for a while.

    Like what? Let's see some money where your mouth is.

    @Lingerance said:

    On top of that, Windows Server is rarely the latest and greatest, and I have no guarantee that I'd even be using Windows Server 2008 machines should I end up somewhere having to deal with WS. (I've worked with large companies; they've had Windows Servers as old as NT, and the bulk of them were 9 years old.)

    So... Windows is bad because you work for companies too cheap to upgrade? Brilliant logic, there.

    @Lingerance said:

    So a 4-year-old technology would be on every single Windows Server box I would expect to be using?

    That's completely irrelevant to the argument here. See, this is 2010... so when we compare Windows Server to Linux, we're comparing the 2010 version of Windows Server to 2010 versions of Linux. Do you understand? Because I've already been over this once.

    Yes, if you live in 2001, Windows Server is probably inferior to Linux. (Arguably, since Linuxes then were insecure as all get-out, and Windows Server 2000 was still much easier to admin.) But we don't live in 2001, we live in 2010... if you have arguments that only apply to 2001, feel free to drop them into your time portal so we could have debated them then. Or... whatever.

    @Lingerance said:

    "Ancient" servers exist (read: older than 4 years), and they occasionally need to be dealt with.

    Yes. Relevance?

    Shocking fact: Microsoft doesn't possess a time machine.

    @Lingerance said:

    OSen

    Seriously?

    @Lingerance said:

    What reputation does Windows Server have? The most positive thing I can think of is it works great with Windows desktops in terms of functionality, which is unrelated to security.

    Among Slashdot readers it has an awful reputation, but then again, so does every Microsoft product. Among sane people, any Windows since about Windows 2000 has been on par with, or close to, Linux servers.



  • @blakeyrat said:

    @Lingerance said:
    It's good Windows Server is finally improving, but that doesn't change the fact that Windows has historically had a very shitty security model, and backward compatibility will fuck anything new they throw on for a while.

    Like what? Let's see some money where your mouth is.

    @Lingerance said:

    On top of that, Windows Server is rarely the latest and greatest, and I have no guarantee that I'd even be using Windows Server 2008 machines should I end up somewhere having to deal with WS. (I've worked with large companies; they've had Windows Servers as old as NT, and the bulk of them were 9 years old.)

    So... Windows is bad because you work for companies too cheap to upgrade? Brilliant logic, there.

    @Lingerance said:

    So a 4-year-old technology would be on every single Windows Server box I would expect to be using?

    That's completely irrelevant to the argument here. See, this is 2010... so when we compare Windows Server to Linux, we're comparing the 2010 version of Windows Server to 2010 versions of Linux. Do you understand? Because I've already been over this once.

    Yes, if you live in 2001, Windows Server is probably inferior to Linux. (Arguably, since Linuxes then were insecure as all get-out, and Windows Server 2000 was still much easier to admin.) But we don't live in 2001, we live in 2010... if you have arguments that only apply to 2001, feel free to drop them into your time portal so we could have debated them then. Or... whatever.

    @Lingerance said:

    "Ancient" servers exist (read: older than 4 years), and they occasionally need to be dealt with.

    Yes. Relevance?

    Shocking fact: Microsoft doesn't possess a time machine.

    @Lingerance said:

    OSen

    Seriously?

    @Lingerance said:

    What reputation does Windows Server have? The most positive thing I can think of is it works great with Windows desktops in terms of functionality, which is unrelated to security.

    Among Slashdot readers it has an awful reputation, but then again, so does every Microsoft product. Among sane people, any Windows since about Windows 2000 has been on par with, or close to, Linux servers.

    Maybe an OnSen?



  • @blakeyrat said:

    @Lingerance said:
    It's good Windows Server is finally improving, but that doesn't change the fact that Windows has historically had a very shitty security model, and backwards compatability will fuck anything new they throw on for a while.
    Like what? Let's see some money where your mouth is.
    You seriously forgot all of the 90s? When did Windows first get a built-in firewall? XP, and it wasn't active by default until SP2. FreeBSD had one in 1993; the same project provided firewalls for multiple UNIX systems, though I'm still trying to find when they were shipped with it. Linux got ipchains in 1998. What was the first release that actually put security as a major priority? Vista. First UNIX variant to focus on security? OpenBSD in 1994. What was the first release that has central logging? Vista/2008. OpenBSD's syslogd is the same one as 4.3BSD's, which was released in 1986. What was the first release that actually allowed for a limited install? WS 2008.



    Honestly, after a fair amount of time reading various projects' HISTORY and CHANGELOG files I'm just going to concede. Of course Windows is secure; it has always had an awesome security track record.
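    For context on the syslogd point: the "central logging" that BSD syslogd has offered since the 4.3BSD era is a one-line configuration. A minimal sketch, assuming a reachable log server named `loghost` (the hostname is illustrative):

```
# /etc/syslog.conf -- forward all facilities and priorities to a
# central log host; the @ prefix means "send over the network"
# (classically syslog over UDP port 514).
*.*    @loghost
```

    The receiving machine's syslogd then just has to accept remote messages (for example, via the -r flag on the classic Linux sysklogd).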



  • @Lingerance said:

    @blakeyrat said:
    @Lingerance said:
    It's good Windows Server is finally improving, but that doesn't change the fact that Windows has historically had a very shitty security model, and backwards compatability will fuck anything new they throw on for a while.
    Like what? Let's see some money where your mouth is.
    You seriously forgot all of the 90s? When did Windows first get a built-in firewall? XP, and it wasn't active by default until SP2. FreeBSD had one in 1993; the same project provided firewalls for multiple UNIX systems, though I'm still trying to find when they were shipped with it. Linux got ipchains in 1998. What was the first release that actually put security as a major priority? Vista. First UNIX variant to focus on security? OpenBSD in 1994. What was the first release that has central logging? Vista/2008. OpenBSD's syslogd is the same one as 4.3BSD's, which was released in 1986. What was the first release that actually allowed for a limited install? WS 2008.



    Honestly, after a fair amount of time reading various projects' HISTORY and CHANGELOG files I'm just going to concede. Of course Windows is secure; it has always had an awesome security track record.

    Oh, I'm sorry, we're talking about desktops now? You're going to get whiplash, changing subjects too quickly.

    Edit: now that I read past the first sentence, I see you're still missing the point entirely. As long as Windows 2008 *has* all of those features, it's on par with other server OSes, yes? It doesn't matter whether it got the feature in 1994 or got it in 2008; the point is that the feature exists *now*. There's no such thing as "track record." If you can prove Windows 2008 has security holes, then point them fucking out and shut me up. Otherwise, you shut up.

    I have no clue why the hell you were reading ALL CAPS FILES.



  • @Lingerance said:

    What was the first release that has central logging? Vista/2008.
    Oh hey, look, it's NT 3.1 with a non-existent Event Viewer!



  • @ender said:

    @Lingerance said:
    What was the first release that has central logging? Vista/2008.
    Oh hey, look, it's NT 3.1 with a non-existent Event Viewer!

    Unpossible! Lingerance says that Event Viewer didn't exist until Vista! Your witch powers have bent space and time... surely a Slashdot reader like Lingerance could never be wrong about a Microsoft product!

