I hate cmd.exe



  • @morbiuswilters said:

    Well, for starters, you have to call to external programs to do most everything for you.[...]
    Actually, most of the stuff you wrote is true only for the basic Bourne shell (/bin/sh). When you run bash, you can do most of these things internally.



  • @morbiuswilters said:

    There's no telling what version of that program will be available or what
    arguments it will like or choke on. 

    Surely there's a standard for that?

    <fanboyism blatant>We should all use Tcl, anyway.</fanboyism>



  • @aogail said:

    I fail to see how .NET gives me a scripting interface that is always available on all NT systems. All NT systems includes NT 4. Also, .NET is not installed by default on all but the most recent NT systems.

     

    It IS present on most up-to-date systems, or it is a quick, free download away.

    If you don't trust .NET then you have VBscript.

    Really, this is getting old. Time for you to run along and play in some Mac forums, isn't it?



  • @ender said:

    Actually, most of the stuff you wrote is true only for the basic Bourne shell (/bin/sh). When you run bash, you can do most of these things internally.

    Bash has more built-in commands than Bourne, but you still mostly have to call out to external programs for a lot of things.  For any non-trivial script, this can be much slower than using a language like perl that has many more built-in functions.  Also, if you code specifically for bash, you are guaranteed even less compatibility than the DOS shell.  With all the available shells for UNIX, any portable code has to be written to the lowest common denominator. 



  • Perhaps this blog post by Raymond Chen, Windows developer/blogger, will be of interest:

    This is what happens when a language develops not by design but by evolution. It becomes filled with all sorts of strange quirks in order to accommodate new behavior while remaining compatible with old behavior. Nobody actually likes the batch language; they just are used to it.

     

    And an interesting reader comment:

    The problems with "echo" are not exclusive to Windows. autoconf, for instance, has to do several tests to check for various quirks of the "echo" command on ancient Unix versions.

    And the Unix "echo" also has a similar problem to "ECHO ON": if you try "echo -n", for instance, it will "eat" the -n argument (unless you are using some of the quirky Unix versions).
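    For anyone bitten by the echo quirk mentioned above, the usual portable workaround is printf, which has none of echo's flag ambiguity. A minimal sketch (nothing here beyond standard printf behavior):

```shell
#!/bin/sh
# echo -n is the classic portability trap: some implementations
# suppress the trailing newline, others print "-n" literally.
# printf behaves consistently everywhere, so prefer it:
printf '%s' "no trailing newline here"
printf '\n'
printf '%s\n' "-n"   # even a literal "-n" survives as data
```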

     

     



  • @Spectre said:

    Surely there's a standard for that?

    Not sure what you mean by that.  Different UNIX variants have different versions of the same programs.  There is usually a base set of options and binaries that are available on all platforms, but you end up coding to the lowest common denominator.  Also, there's the extensive amount of work that goes into discovering all the idiosyncrasies of the different platforms.  It's similar to coding C for UNIX, you end up with different libcs, different compilers and a bunch of ifdefs to work around the incompatibilities.  My point is that coding for UNIX has many of its own pitfalls.  At the end of the day you have to use your brain, RTFM and test thoroughly.  That's why the OP is pretty silly because even if the behavior of set is weird it's very well-documented and has a work-around available.  Having to learn the ins-and-outs of a programming language or platform isn't a WTF, it's part of being a good developer and it's why being a good developer takes a lot of work.



  • @CodeSimian said:

    Perhaps this blog post by Raymond Chen, Windows developer/blogger, will be of interest:

    This is what happens when a language develops not by design but by evolution. It becomes filled with all sorts of strange quirks in order to accommodate new behavior while remaining compatible with old behavior. Nobody actually likes the batch language; they just are used to it.

     

    And an interesting reader comment:

    The problems with "echo" are not exclusive to Windows. autoconf, for instance, has to do several tests to check for various quirks of the "echo" command on ancient Unix versions.

    And the Unix "echo" also has a similar problem to "ECHO ON": if you try "echo -n", for instance, it will "eat" the -n argument (unless you are using some of the quirky Unix versions).

     

    Thank you for the link, I think it sums up my point well: batch isn't great but every tool has rough edges.  If you've ever done a lot of work with autoconf, you know what a PITA it is.  Not only is it full of these kinds of tests, but frequently even autoconf has incompatibilities between different versions.  I've run up against the problem of a new version of autoconf barfing on 5-year-old configure scripts.  I ended up having to build an older version of autoconf and put it in a separate location and write a wrapper script to detect the bug I was experiencing and switch between the old and new version.  It sucked but I also appreciate the difficulty of maintaining perfect compatibility between different versions of a program.  Most old configure scripts worked with the new version of autoconf but this particular one used a feature that had changed in some minor way.  Sometimes you just gotta suck it up and work around it -- that's why they pay us the big bucks. 



  • @aogail said:

    Hooray for MS apologists.

    Do you really love MS so much that you overlook even this ridiculous behavior? No one in their right mind would expect a shell to not properly support variable changes inside if blocks. By the way, adding new and incompatible syntax to work around a bug doesn't count as fixing the bug.

     

    Oh, yay. Yet another script kiddie who thinks they know something, but actually know nothing.

    When you turn 13, let us know. We'll welcome you to your teenage years.

    Bye now. Run home to Mommy. She knows school is out, and has your milk and cookies ready. 



  • @DaveK said:

    Whoops!   What went wrong?  Should we blame MS for this too?
     

    Of course! Obviously, those evil people as M$ hacked into all copies of the bash shell source codez on the planet (simultaneously, so no one would know) and messed with the sources to make bash act like DOS command.com and Windoze Command.exe, so they could say that ALL shellz behave the same! What are you, stoopid or su'thin? 

    Oops! Had to edit this cuz I fergot to say you're a M$ fanatic supporter fanboi and I hate you! Luser!

    There. 



  • @morbiuswilters said:

    @Spectre said:

    Surely there's a standard for that?

    Not sure what you mean by that.

    I mean that POSIX/SUSv3/whatever thingy you can download off [url=http://www.opengroup.org/onlinepubs/009695399/]The Open Group[/url]. I realize that not everyone adheres to it, but at least it's a standard.



  • @Spectre said:

    I mean that POSIX/SUSv3/whatever thingy you can download off The Open Group. I realize that not everyone adheres to it, but at least it's a standard.

    There are plenty of things that fall outside the realm of the POSIX standard.  For example, Windows NT is technically POSIX-compliant, but nobody would argue that it's that similar to UNIX.  When dealing with very basic tasks, you can usually count on POSIX but for any complex program, it's hit-or-miss.  Once again, you have to RTFM, use your brain when the manual doesn't cover what you are dealing with and test extensively in every environment you are deploying to.  Having worked with mixed-environment UNIX setups (BSD, OS X, Linux and Solaris) I can attest to the difficulty of writing software that runs the same on each platform.  Even amongst different Linux distros it can be quite a pain.  That's why I always push for standardizing on a platform and working to remove as many of the system variables as possible.



  • @MasterPlanSoftware said:

    @aogail said:

    Are you saying that there is no job for which cmd scripts are the best tool?

    Hmmm... I think what you used for mobiuswilters earlier is deserved here as well. *Golf hand clap*.

    Seriously, WTF would you do anything requiring any sort of if more complex than IF EXIST or IF NOT EXIST in a friggin' BATCH file?

    I have a couple of batch files I use regularly. I work on internal software, and with every change a history file is also made. When the production executable is updated, the history file is also updated so that the users can see what changes were made if they want. We keep attributes on the executables set to read only, so users don't accidentally delete them. I use a batch file something like:

    [code] attrib -r netdrive:\appname.exe
    xcopy appname.exe netdrive:\ /y
    attrib +r netdrive:\appname.exe
    attrib -r netdrive:\appname.history
    xcopy appname.history netdrive:\ /y
    attrib +r netdrive:\appname.history[/code]

    Anything more complicated than that, I start an IDE and actually write testable, maintainable code for; it's quick, easy, documentable, and suitable for version control. 

     

    I know of no case where another option wouldn't be a better choice. A batch file might be quicker/easier, but I wouldn't say it is the 'better tool'.

    @MasterPlanSoftware said:

    @aogail said:

    The point is that I shouldn't have had to do that.

    Agreed. Had you actually known what you were doing, you would have gotten it right the first time.

    @aogail said:

    What's it to you if I'm the latest victim of this lame behavior?

    Because YOU came HERE to complain about something you clearly know nothing about. If you had a clue of the context of something as vast as OS backwards compatibility it would almost be worth arguing with you.

    But as it stands, you know as well as we do you are just here to troll. 

     

    Wait, I've got it! aogail is Lysis' little sister! Lysis felt picked on, so called on sis to come protect him! Now it all makes sense! 



  • @KenW said:

    Wait, I've got it! aogail is Lysis' little sister! Lysis felt picked on, so called on sis to come protect him! Now it all makes sense! 
     

    Thanks a lot. Soda, meet screen.

    Someone get me a napkin...



  • @MasterPlanSoftware said:

    @aogail said:

    That doesn't change the fact that cmd.exe is still the only CLI that's available on all NT systems
     

    That is blatantly not true. From the .NET frameworks download page:

    In this context, CLI means "Command Line Interface", not "Common Language Infrastructure" (which is actually the CLR, the Common Language Runtime).

     



  • @alegr said:

    @MasterPlanSoftware said:

    @aogail said:

    That doesn't change the fact that cmd.exe is still the only CLI that's available on all NT systems
     

    That is blatantly not true. From the .NET frameworks download page:

    In this context, CLI means "Command Line Interface", not "Common Language Infrastructure" (which is actually the CLR, the Common Language Runtime).

     

    I realize, but I am skipping that fact, and going on the basis of 'the right tool for the job'.



  • @morbiuswilters said:

    even single tasking was pretty sophisticated for the consumer market at the time

    As opposed to what, exactly?



  • @Random832 said:

    @morbiuswilters said:
    even single tasking was pretty sophisticated for the consumer market at the time

    As opposed to what, exactly?

    Nothing?  Computers were fairly new to the consumer market when DOS was big and there were plenty of single-use machines that didn't even support the idea of loadable software, like dedicated word processors.  Thank you for trying to pick apart a single sentence fragment rather than contribute anything useful.



  • @morbiuswilters said:

    rather than contribute anything useful.
    It should be "contributing." 



  • @bstorer said:

    It should be "contributing."

    Believe it or not, I did that intentionally to see if random would comment on it.  Thank you for tripping my carefully laid trap.  :-P 



  • @morbiuswilters said:

    Thank you for tripping my carefully laid trap
     

    Next time, please use a landmine. Just barely violent enough for habitual thread resurrectors.



  • @morbiuswilters said:

    @bstorer said:

    It should be "contributing."

    Believe it or not, I did that intentionally to see if random would comment on it.  Thank you for tripping my carefully laid trap.  :-P

    "Tripping?"  I stomped all over it!



  • @Pidgeot said:

    Of course, since you wouldn't be writing a script that relies on !-expansion without knowing this requires /V, it would be trivial to make your script launch a new cmd.exe process with the appropriate parameters, if /V is not activated:

    set SOMESPECIALVARNAME=test
    if not "!SOMESPECIALVARNAME!" == "test" (
      cmd /v /c %0 %*
      goto :EOF
    )
    rem Insert your script here

    You might want to add a check to ensure that it doesn't go into an infinite loop if run on a system that doesn't support it, and possibly define your own label, but this should run unmodified on XP and later (don't know about Win2000).

    Well, seeing as how the thread has been resurrected anyway:

    D:\Profile>help setlocal
    

    ... snip ...

    SETLOCAL batch command now accepts optional arguments:
    ENABLEEXTENSIONS / DISABLEEXTENSIONS
    enable or disable command processor extensions. See
    CMD /? for details.
    ENABLEDELAYEDEXPANSION / DISABLEDELAYEDEXPANSION
    enable or disable delayed environment variable
    expansion. See SET /? for details.
    These modifications last until the matching ENDLOCAL command,
    regardless of their setting prior to the SETLOCAL command.

    ... snip ...

    D:\Profile>

    So you can just do:

    setlocal enabledelayedexpansion 2>nul

    and delayed expansion will work.

    ... I've worked with cmd for way too long.



  • @aogail said:


    Hooray for MS apologists.

    Do you really love MS so much that you overlook even this ridiculous behavior? No one in their right mind would expect a shell to not properly support variable changes inside if blocks. By the way, adding new and incompatible syntax to work around a bug doesn't count as fixing the bug.

     

    [  ] You know about Powershell 

    [X] Anti-MS-Troll 



  • @jo-82 said:

    [  ] You know about Powershell 

    [X] Anti-MS-Troll 

    Hmm.. those look like checkboxes.  Should probably be radio buttons so only one can be selected at a time. 



  • @morbiuswilters said:

    Hmm.. those look like checkboxes. Should probably be radio buttons so only one can be selected at a time.
    Well one could simultaneously know of Monad and be an Anti-MS-Troll.



  • @Lingerance said:

    @morbiuswilters said:
    Hmm.. those look like checkboxes. Should probably be radio buttons so only one can be selected at a time.
    Well one could simultaneously know of Monad and be an Anti-MS-Troll.

    Yes, but that destroys the humor inherent in the juxtaposition. 



  • Point one: 

    @morbiuswilters said:

    Well, for starters, you have to call to external programs to do most everything for you.

    Point two: 

    @morbiuswilters said:

    There's no telling what version of that program will be available or what arguments it will like or choke on.  Also, there are differing formats for arguments to these programs like grep, sed and awk.

    I'm pretty certain you can't blame bash for both of those at the same time! 

    @morbiuswilters said:

      Additionally, handling variables is a pain.  Want to compare strings?  Oh, that's a different syntax than comparing numbers, unless you're using double-brackets!  Better make sure you encase all strings in double-quotes and prepend with a dummy character just in case one of the strings is null!  Oh, you want to atomically lock a resource?  You're gonna need to write an external program in C to handle that so you can use O_EXCL.

    Well, "yeh but meh", to coin a phrase.  Every language has its axiomatic assumptions.  What you really want to attack it for is when it does something inconsistent, like the way quoting is different when you run an executable script as compared to when you source it, surely?
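    For concreteness, here's a small sketch of the comparison quirks being complained about (standard test/[[ behavior, nothing exotic):

```shell
#!/bin/bash
a="10"; b="9"

# String comparison is lexicographic: "10" sorts before "9"
# because '1' < '9', which surprises people expecting numbers.
[ "$a" \< "$b" ] && echo "as strings, 10 < 9"

# Numeric comparison needs a different set of operators entirely:
[ "$a" -gt "$b" ] && echo "as numbers, 10 > 9"

# bash's [[ ]] keyword skips word-splitting, so the
# quote-everything-and-prepend-a-dummy-char dance isn't needed:
unset c
[[ $c == "" ]] && echo "unset var compares safely"
```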




  • @aogail said:

    @Lingerance said:
    @DaveK said:
    @Same example in bash shell said:

    /tmp $ export FOO=baz
    /tmp $ if [ "true" == "true" ] ; then export FOO=bar ; echo "${FOO}" should be bar; fi
    bar should be bar
    /tmp $
    /tmp $ bash --version
    GNU bash, version 3.2.33(1)-release (i686-pc-linux-gnu)
    Copyright (C) 2007 Free Software Foundation, Inc.

    ROFLMAO

    DaveK, if you are going to call me a "moron" and say I need to "RTFM", you might want to do the same yourself. You obviously don't know that cmd.exe's "set" is quite different from POSIX "set".

    You are showing poor reasoning skills there - assuming something that I did not imply. 

    @aogail said:

    But wait, you used "export" on the first line and "set" in the if block. Did you actually think that would deceive anyone?

    Truly?  Honestly?

    No, I didn't.

    Y'know, I pondered for a while about claiming this as a deliberate troll.  After all, it has all the characteristics of a well-crafted troll: a false claim disguised in tricky language, phrased in provocative words to distract from the point by provoking an emotional response, yet containing a blatant explanation of the catch behind the disguise.  But no, that's not what it was.  I just screwed up.  I tried too hard to make an example exactly analogous to the testcase, when a far simpler demonstration would have proved my point.

    My point: 

    You are perfectly used to and familiar with a situation where the shell does textual substitution before it executes the command.

    My (simplified and corrected) example: 

    ~ $ foo=bar
    ~ $ echo $foo
    bar
    ~ $ foo=baa echo $foo
    bar

     

    There.  No 'set's or 'export's to confuse the issue now.

    @aogail said:

    You're even worse of an MS fanboy than MasterPlanSoftware and KenW.
     

    This is what philosophy calls "a fallacy of false dichotomy".  You see, hidden in that sentence is the implicit assumption that both you AND microsoft can't both be assholes at the same time - which is obviously false!  No, just because I criticise you, does not mean I support microsoft.  But thanks for backing up my point about your poor reasoning skills.



  • @DaveK said:

    Point one: 

    @morbiuswilters said:

    Well, for starters, you have to call to external programs to do most everything for you.

    Point two: 

    @morbiuswilters said:

    There's no telling what version of that program will be available or what arguments it will like or choke on.  Also, there are differing formats for arguments to these programs like grep, sed and awk.

    I'm pretty certain you can't blame bash for both of those at the same time!

    It seems like different aspects of the same problem: bash lacks much built-in functionality that languages like perl have.  If we're going to attack cmd.exe for lacking features, surely the same applies to bash as well, right?

     

    @DaveK said:

    @morbiuswilters said:

      Additionally, handling variables is a pain.  Want to compare strings?  Oh, that's a different syntax than comparing numbers, unless you're using double-brackets!  Better make sure you encase all strings in double-quotes and prepend with a dummy character just in case one of the strings is null!  Oh, you want to atomically lock a resource?  You're gonna need to write an external program in C to handle that so you can use O_EXCL.

    Well, "yeh but meh", to coin a phrase.  Every language has its axiomatic assumptions.  What you really want to attack it for is when it does something inconsistent, like the way quoting is different when you run an executable script as compared to when you source it, surely?


    Yes, but the assumptions in bash are quirky and complicated.  There are decades of cruft built into it.  I'm not sure what you mean about the quoting being different, though.  Executables aren't quoted themselves, but their arguments are.  Anyway, there are plenty of quirks and limitations in bash that generally make another language like perl preferable, just like VBS is preferable to batch scripts.
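    For context on the "call out to external programs" point: bash parameter expansion does cover the simple sed/basename-style cases without spawning a process, though anything fancier still forks out. A hedged sketch:

```shell
#!/bin/bash
# Parameter expansion handles many of the simple jobs people
# reach for sed/basename for, without forking a process:
path="/var/log/app.log"

echo "${path##*/}"        # strip longest */ prefix -> app.log (like basename)
echo "${path%.log}.txt"   # strip suffix, append another -> /var/log/app.txt

s="hello world"
echo "${s/world/bash}"    # substitution (a bash extension) -> hello bash
echo "${#s}"              # string length -> 11
```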


  • @DaveK said:

    My (simplified and corrected) example: 

    ~ $ foo=bar
    ~ $ echo $foo
    bar
    ~ $ foo=baa echo $foo
    bar

     

    There.  No 'set's or 'export's to confuse the issue now.

    To be honest, I have no idea what the line "foo=baa echo $foo" is supposed to do. It does output what you say it does, but fails to set foo. However, if one is writing multiple commands on one line like you are, one puts a semicolon in, which does output the updated variable and correctly sets it as well.



  • @Lingerance said:

    @DaveK said:

    My (simplified and corrected) example: 

    ~ $ foo=bar
    ~ $ echo $foo
    bar
    ~ $ foo=baa echo $foo
    bar

     

    There.  No 'set's or 'export's to confuse the issue now.

    To be honest, I have no idea what the line "foo=baa echo $foo" is supposed to do. It does output what you say it does, but fails to set foo. However, if one is writing multiple commands on one line like you are, one puts a semicolon in, which does output the updated variable and correctly sets it as well.

    This bash script is made of fail and AIDS. 



  • @Lingerance said:

    To be honest, I have no idea what the line "foo=baa echo $foo" is supposed to do. It does output what you say it does, but fails to set foo. However, if one is writing multiple commands on one line like you are, one puts a semicolon in, which does output the updated variable and correctly sets it as well.
    var=value program syntax exports var to the program, but doesn't affect the environment of the calling process. You'd see the difference if the program itself used var in some way, but it certainly won't affect $var used in the same statement.
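    That explanation is easy to verify: the prefix assignment goes into the child's environment only, and $foo on the same line is expanded by the parent before anything runs. A minimal sketch:

```shell
#!/bin/sh
foo=bar

# The parent shell expands $foo before launching the command, so
# this prints "bar" even though the child's environment has foo=baa:
foo=baa sh -c 'echo "parent expanded: $0"' "$foo"

# Let the child expand it itself and the override becomes visible:
foo=baa sh -c 'echo "child sees: $foo"'

# The calling shell's own variable was never modified:
echo "parent still has: $foo"
```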



  • @Lingerance said:

    @DaveK said:

    My (simplified and corrected) example: 

    ~ $ foo=bar
    ~ $ echo $foo
    bar
    ~ $ foo=baa echo $foo
    bar

     

    There.  No 'set's or 'export's to confuse the issue now.

    To be honest I have no idea

    If only you had stopped there, while you were behind. 

    @morbiuswilters said:

    This bash script is made of fail and AIDS. 

    Clearly, your shell-scripting knowledge is made of fail and not RTFMing.



  • @DaveK said:

    If only you had stopped there, while you were behind.
    Both pieces of bash script you posted you claimed did one thing, but you misused a feature so the output would spread FUD.
    @DaveK said:
    Clearly, your shell-scripting knowledge is made of fail and not RTFMing.
    Who specifically are you flaming?



  • @morbiuswilters said:

    @Spacecoyote said:

    Exactly what, pray tell, makes bash a pain in the ass?

    Well, for starters, you have to call to external programs to do most everything for you.

    To be fair, that is pretty much a central design point of UNIX.

    There's no telling what version of that program will be available or what arguments it will like or choke on.  Also, there are differing formats for arguments to these programs like grep, sed and awk.  Additionally, handling variables is a pain.  Want to compare strings?  Oh, that's a different syntax than comparing numbers, unless you're using double-brackets!  Better make sure you encase all strings in double-quotes and prepend with a dummy character just in case one of the strings is null!  Oh, you want to atomically lock a resource?  You're gonna need to write an external program in C to handle that so you can use O_EXCL.

    Yes, different argument forms for certain commands such as find & dd are a pain, especially if you're new to *nix, but syntax issues are just One Of Those Things and aren't exactly a huge issue. There are plenty of modern languages that have multiple comparison operators, for example. Shell scripts aren't supposed to be a replacement for full-blown applications, after all. Bourne (Again) shell scripting is usually more than adequate for small jobs.
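    On the O_EXCL point specifically, you don't strictly need a C helper: the common shell-level approximation is set -C (noclobber), which makes the > redirection fail when the target already exists (shells typically implement this with open(..., O_CREAT|O_EXCL)). A sketch, assuming a writable /tmp and a hypothetical lock path:

```shell
#!/bin/sh
# Noclobber makes ">" fail if the target file exists, giving an
# atomic test-and-create without any external C program.
lock="/tmp/demo.lock.$$"   # hypothetical lock file for this sketch

if ( set -C; : > "$lock" ) 2>/dev/null; then
    echo "lock acquired"
    rm -f "$lock"          # release the lock when done
else
    echo "someone else holds the lock"
fi
```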


  • @Lingerance said:

    @DaveK said:
    Clearly, your shell-scripting knowledge is made of fail and not RTFMing.
    Who specifically are you flaming?

    He's just a jackass troll, ignore him.  He doesn't know how to write a bash script and after fucking it up twice he goes on the offensive.



  • @Vanders said:

    @morbiuswilters said:

    Well, for starters, you have to call to external programs to do most everything for you.

    To be fair, that is pretty much a central design point of UNIX.

    Yes, I am aware of that.  It's also a disadvantage in many situations.  The original point is that bash is more expressive than batch scripting but honestly both are pretty weak for any serious task and you will be better off using a higher-level language like perl or vbs which are pretty much standard for the respective platforms.  I'm not saying bash is useless, just that comparing it to cmd.exe is kind of pointless because both are limited in their own ways and it's not like there aren't more powerful languages available for free.

     

    @Vanders said:

    There's no telling what version of that program will be available or what arguments it will like or choke on.  Also, there are differing formats for arguments to these programs like grep, sed and awk.  Additionally, handling variables is a pain.  Want to compare strings?  Oh, that's a different syntax than comparing numbers, unless you're using double-brackets!  Better make sure you encase all strings in double-quotes and prepend with a dummy character just in case one of the strings is null!  Oh, you want to atomically lock a resource?  You're gonna need to write an external program in C to handle that so you can use O_EXCL.

    Yes, different argument forms for certain commands such as find & dd are a pain, especially if you're new to *nix, but syntax issues are just One Of Those Things and aren't exactly a huge issue. There are plenty of modern languages that have multiple comparison operators, for example.

    I'm not new to UNIX and I'm not talking about the difference in argument syntax between different programs, I'm talking about the different versions of the same program.  If you haven't run into this problem, I would posit that it is you who are new to UNIX.  The fact is, most applications have different arguments depending on whether you are using the BSD versions, GNU versions or some commercial variant.  My point is that scripting on UNIX isn't nearly as easy as everyone seems to indicate.  UNIX has a more expressive shell environment than Windows; Windows usually has far superior GUI tools; Windows has a more standardized scripting language than you will find in UNIX.  I'm trying to bring balance to the discussion because I get tired of people comparing cmd.exe and bash and concluding that "Windows sucks".

     



  • @morbiuswilters said:

    The original point is that bash is more expressive than batch scripting but honestly both are pretty weak for any serious task and you will be better off using a higher-level language like perl or vbs which are pretty much standard for the respective platforms.

    I'm not disagreeing with you on the last point, but having written large scripts in both I know which I prefer. Trying to do anything that requires much more than a list of commands to be executed in sequence using cmd scripts is tortuous, where at least Bourne has a proper series of flow control statements and comparison operators.  It also doesn't rely on heavily abusing the 'for' statement for basic tasks, which I think we should both be able to agree has to be a bonus.

    @morbiuswilters said:

    @Vanders said:
    Yes, different argument forms for certain commands such as find & dd are a pain, especially if you're new to *nix, but syntax issues are just One Of Those Things and aren't exactly a huge issue.

    I'm not new to UNIX and I'm not talking about the difference in argument syntax between different programs, I'm talking about the different versions of the same program.  If you haven't run into this problem, I would posit that it is you who are new to UNIX.  The fact is, most applications have different arguments depending on whether you are using the BSD versions, GNU versions or some commercial variant.

    "You" in the second person, not "You" personally.

    You (1st) can posit if you must, but the last time I actually ran into this problem in the real world was when I had to log into a Solaris 7 machine, and I can't remember how long ago that was. When was the last time you saw any moderately complex Bourne script that had to be portable across that many UNIX systems anyway? Most scripts tend never to leave the system they were written on, making it largely a moot point.

    I'm trying to bring balance to the discussion because I get tired of people comparing cmd.exe and bash and concluding that "Windows sucks".
     

    It is a valid comparison if the comparison you are making is "Which has the better default command line language?". If the comparison is "Who has the best scripting language?" then you're asking the wrong question in a world of Perl, Python, Ruby and the like, most of which run on both systems anyway.

    We can go back and forth over this all day, but I think we should just agree on who the real scripting villain is here: m4



  • @Vanders said:

    I'm not disagreeing with you on the last point, but having written large scripts in both I know which I prefer. Trying to do anything that requires much more than a list of commands to be executed in sequence using cmd scripts is tortuous, where at least Bourne has a proper series of flow control statements and comparison operators.  It also doesn't rely on heavily abusing the 'for' statement for basic tasks, which I think we should both be able to agree has to be a bonus.

    No offense, but you seriously need to go back and re-read the entire thread instead of chiming in at the end.  I know bash is superior to cmd.exe.  The point is it doesn't really matter that much since there are so many rich alternatives to batch scripting for Windows.  I'm trying to shut up the UNIX freaks who keep pointing out cmd.exe's weakness as some kind of win for UNIX.  I brought up the limitations of bash to point out that it's not perfect either and that most non-trivial scripts should be written in a higher language, although bash will go further than cmd.exe will.  We had this same flaming argument in a thread 2 weeks ago and I'm through with it for now.

     

    @Vanders said:

    You (1st) can posit if you must, but the last time I actually ran into this problem in the real world was when I had to log into a Solaris 7 machine, and I can't remember how long ago that was. When was the last time you saw any moderately complex Bourne script that had to be portable across that many UNIX systems anyway? Most scripts tend never to leave the system they were written on, making it largely a moot point.

    Plenty of times.  However, I generally have more problems with different makes, ccs and libcs.  Not everyone is writing scripts that only run on one server or one platform.  This is one place where Microsoft clearly wins because their higher-level scripting languages are ubiquitous across their OSes (or are incredibly easy to install) whereas the same isn't necessarily true in the UNIX world.



  • @morbiuswilters said:

    This is one place where Microsoft clearly wins
     

    Oh no. You said it. Quick! Hide! Here come the trolls!



  • @morbiuswilters said:

    No offense, but you seriously need to go back and re-read the entire thread instead of chiming in at the end.

    No offence taken, because I did and I'm not. You're replying to things I never said, which is probably why you seem to think I'm disagreeing with you when in fact, we agree on most points.

    I know bash is superior to cmd.exe.  The point is it doesn't really matter that much since there are so many rich alternatives to batch scripting for Windows. I'm trying to shut up the UNIX freaks who keep pointing out cmd.exe's weakness as some kind of win for UNIX.  I brought up the limitations of bash to point out that it's not perfect either and that most non-trivial scripts should be written in a higher language, although bash will go further than cmd.exe will.

    Sure, but my point was that the limitations you highlighted were weak, and the argument that Windows has alternatives is just as valid on UNIX anyway, so you're right back at the question of "Who has the better shell scripting?" and not "Who has the better available scripting?" I personally don't really care that Windows cmd scripting is crap: my days of writing thousand-line scripts spread across multiple files are long behind me, thankfully.

    @morbiuswilters said:

    @Vanders said:
    When was the last time you saw any moderately complex Bourne script that had to be portable across that many UNIX systems anyway? Most scripts tend never to leave the system they were written on, making it largely a moot point.

    Plenty of times.  However, I generally have more problems with different makes, ccs and libcs.  Not everyone is writing scripts that only run on one server or one platform.

    I'm not denying they exist, just that anyone who is trying to write any serious scripts that have to be multi-platform are doing it wrong and should be using a higher-level language. Which was sort of my point last time, but I didn't make it clear enough.

    This is one place where Microsoft clearly wins because their higher-level scripting languages are ubiquitous across their OSes (or are incredibly easy to install) whereas the same isn't necessarily true in the UNIX world.
     

    I'd say Perl fits the bill as a ubiquitous language on UNIX, even if I am allergic to it. I'm sure some spirited individuals might make a case for Python, too. Heck, if push comes to shove, what UNIX doesn't have a PostScript interpreter?

    O.K, maybe not that last one.
     

