Microsoft still hasn't figured it out.


  • Discourse touched me in a no-no place

    @Salamander said:

    Doesn't print anything; it's executing the command like:

    $temp = sprintf("%s -eq %s", $bar, $baz)
    if($foo == $temp) { echo 1 }
    So… it does tokenizing, == does string equality (in this case), and variables are substituted with the string representation of their values when $blah occurs in a double-quoted string? But without any reinterpretation?

    Sounds utterly standard for a whole class of programming languages really.
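
    A minimal sketch of those semantics (hedged: the original failing command isn't shown above, so assume $foo, $bar and $baz all hold plain strings):

    $bar = 'x'
    $baz = 'y'
    $foo = 'x -eq y'
    # "$bar -eq $baz" interpolates to the string 'x -eq y'; it is NOT re-parsed
    # as a comparison, so this is a single string-equality test:
    if ($foo -eq "$bar -eq $baz") { echo 1 }    # prints 1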

    What it won't do is be very nice when invoking external programs that don't provide any interface mechanism other than their command line arguments. The problem there isn't PS itself but rather that Windows (like MS-DOS before it) delegates parsing of command line arguments to the called program, and those do it inconsistently. When things are that bad, there's just no way to be consistent about it. It's grown gradually better, but that's almost entirely because the MSVC runtime has won out as the supplier of the actual initial entry point, at least giving sort-of consistency. (I believe that PS prefers working via COM APIs because those are at least consistently typed even if rather more complex, but this might be mis-recollecting.)



  • @dkf said:

    What it won't do is be very nice when invoking external programs that don't provide any interface mechanism other than their command line arguments. […] (I believe that PS prefers working via COM APIs because those are at least consistently typed even if rather more complex, but this might be mis-recollecting.)

    The concept of PS was to be a fully object-oriented shell / scripting language. The reality is that most programs don't actually provide a good way to accept objects on startup, hence the preference for the COM APIs &c.
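
    A small illustration of the typed-interface point (Scripting.FileSystemObject is just a convenient stock COM object; any would do) - the properties come back as typed values rather than as text to re-parse:

    $fso = New-Object -ComObject Scripting.FileSystemObject
    $drive = $fso.GetDrive('C:')
    $drive.FreeSpace     # a number you can compute with directly
    $drive.FileSystem    # e.g. 'NTFS'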



  • @dkf said:

    (I believe that PS prefers working via COM APIs because those are at least consistently typed even if rather more complex, but this might be mis-recollecting.)

    PS uses .NET. The commands you run in PS that can handle objects are written either in PS itself or in any language that can be compiled into a .NET DLL. Yes, this does mean that native programs can't accept objects; that includes programs using the MSVCRT, so the C runtime has nothing to do with it.
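
    To make that concrete, here's a minimal sketch of an object-emitting command written in PS itself (Get-DriveReport is a made-up name for illustration); everything downstream of it receives real objects, not text:

    function Get-DriveReport {
        # One typed object per filesystem drive
        Get-PSDrive -PSProvider FileSystem | ForEach-Object {
            [pscustomobject]@{
                Name   = $_.Name
                UsedGB = [math]::Round($_.Used / 1GB, 1)
                FreeGB = [math]::Round($_.Free / 1GB, 1)
            }
        }
    }
    Get-DriveReport | Where-Object { $_.FreeGB -gt 10 }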



  • The first time I ever saw PS described, it struck me as Bourne shell with a bad case of second-system effect. That initial impression was confirmed the first time it rewarded a syntax error by scrolling away half of the window in front of me in favour of an over-detailed and yet completely unhelpful error message in red. I currently have it filed under "less annoying than cmd" but so far I haven't actually managed to enjoy using it; like most Windows programming facilities it's more of a why-the-fuck-is-this-so-overcomplicated-never-mind-just-hold-your-nose-and-get-on-with-it sort of thing.

    I've observed before that the problems I have with Unix usually involve trying to work out how to make it do things I want it to do, while the problems I have with Windows usually involve trying to work out how to stop it from doing things I don't want it to do. I've found that very much the case with PS.

    Bash and sed and awk and dc and all the other little languages from Planet Unix are all flawed. Not one of them does everything well. And yet their collective effect manages to create a design space I find intensely pleasing to work with. Go figure. Microsoft still hasn't.


  • Discourse touched me in a no-no place

    @flabdablet said:

    like most Windows programming facilities it's more of a why-the-fuck-is-this-so-overcomplicated-never-mind-just-hold-your-nose-and-get-on-with-it sort of thing
    Huh. It's not just me then.

    My personal least favourite on that theme is the profusion of complexity in the Windows system library APIs, most of which are calls that seem to take masses of arguments, the majority of them NULL or 0. Figuring out what the silly things are doing exactly always seems a trial…



  • To kinda beat a dead horse a little more...

    @flabdablet said:

    The first time I ever saw PS described, it struck me as Bourne shell with a bad case of second-system effect.

    This is a classic impression everyone gets when they use PS for the first time. The PS team says they went with it instead of improving cmd because cmd is too legacy and making changes to it is difficult. I personally have stopped using cmd completely, so as far as I'm concerned only one exists, no matter whether it's called "cmd.exe" or "powershell.exe".

    @flabdablet said:

    That initial impression was confirmed the first time it rewarded a syntax error by scrolling away half of the window in front of me in favour of an over-detailed and yet completely unhelpful error message in red.

    This doesn't have anything to do with the second-system effect? Although yes, PS's error messages are annoyingly unhelpful. There is a separate application called the PowerShell ISE (Integrated Scripting Environment) that's like an IDE with debugging, breakpoints, etc. if you care for that.

    @flabdablet said:

    like most Windows programming facilities it's more of a why-the-fuck-is-this-so-overcomplicated-never-mind-just-hold-your-nose-and-get-on-with-it sort of thing.

    This is another classic impression that I see in Linux programmers, for example, when they see the CreateProcess Windows API function with its 10 parameters. I much prefer the way Get-Process gives me an object that contains all the details of the process I want in an easily query-able manner (see the sketch below), rather than relying on two or three commands: pidof, ps with assorted switches, and some grep or awk to extract the relevant columns.
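
    For instance, a rough object-pipeline equivalent of the ps-plus-grep-plus-awk dance (the 100MB cut-off is arbitrary):

    Get-Process |
        Where-Object { $_.WorkingSet64 -gt 100MB } |
        Sort-Object WorkingSet64 -Descending |
        Select-Object Id, ProcessName, WorkingSet64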

    @flabdablet said:

    I've observed before that the problems I have with Unix usually involve trying to work out how to make it do things I want it to do, while the problems I have with Windows usually involve trying to work out how to stop it from doing things I don't want it to do. I've found that very much the case with PS.

    Oh, PS has its flaws, don't get me wrong. It has the most annoying behavior of auto-unboxing a singleton array into the element itself. For example, Get-Job returns you an array of Job objects, but if there's only one job it'll return you an array with a single object, which PS will then "helpfully" unbox for you. The result is that $jobs = Get-Job is no longer an array, and if you were going to loop over it you'll be in for a surprise. This behavior has to be disabled by declaring $jobs to be of type Job[]: [Job[]] $jobs = Get-Job
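
    A quick demonstration of the pitfall and the usual workarounds (Start-Job here just manufactures a single job so the unboxing shows up; assume it's the only job in the session):

    Start-Job { Start-Sleep -Seconds 1 } | Out-Null
    $jobs = Get-Job
    $jobs.GetType().Name    # a single Job object (e.g. PSRemotingJob), not Object[]

    $jobs = @(Get-Job)      # workaround 1: the array subexpression operator
    [System.Management.Automation.Job[]]$jobs = Get-Job    # workaround 2: typed declaration
    $jobs.Count             # now reliably an array count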

    But the fact that I get objects instead of a bunch of text which can be formatted in any number of different ways depending on the command I ran is still a plus-point I can't live without. The same way that you hold your nose on PS's (to you) complexity, I feel disgusted by the Linux shell's (to me) rag-tag collection of inconsistent, one-off utilities.



  • @flabdablet said:

    @Ragnax said:
    The problem is: what do you do when code has dependencies spread over multiple DLLs that have version inter-dependencies? It's not possible to cover the arbitrary case where one DLL relies on an exactly paired version of another DLL by simply retaining open references, because dependencies might be late bound and not have been opened yet.

    Anybody who doesn't believe that this is also an issue for Linux hasn't tried updating a running Linux box by rsyncing from another pre-updated box's root filesystem. Abuse any system badly enough and it will break.

    Hence my comment that it's the rest of the world that "still hasn't figured it out" and not Microsoft.



  • @Arnavion said:

    The PS team says they went with it instead of improving cmd because cmd is too legacy and making changes to it are difficult.

    That was absolutely the right call; cmd is a total fucking nightmare and should have been replaced at least a decade ago.

    The Windows APIs have evolved organically over time and now exhibit the same kind of mad inconsistency at the API level that you so dislike when it occurs one level up in Unix's utility collection; building their clean-sheet shell in a way that lets it pass typed stuff back and forth to the most useful of those APIs (which most naturally involves building a type system into the shell itself) was also the right call.

    There is no crystalline beauty to be found in systems programming, only occasional ground-breaking elegance. PowerShell is a workmanlike adaptation of Bourne shell to the conditions encountered in Windows system programming. Compared to cmd, it's wonderful. But that's not saying much.



    So I opened Visual Studio today. Apparently there's an update, which they helpfully told me existed by putting a tiny "1" next to the tiny flag on the title bar.

    So I click the link to download the updater. It brings me to a page listing the various versions of Visual Studio. I finally find the one I'm using and download the 1MB updater.

    It starts up and immediately tells me to reboot. Probably because I had Visual Studio open. How stupid of me. I close VS and try again.

    Nope, still need to reboot. Apparently telling me which process I need to kill is too much effort.

    I reboot and get the updater started. It shows the options I had when I installed the original copy of VS from dreamspark, filled out exactly as I had them. At the bottom, it says "this will require 7GB of disk space."

    Seven. Gigabytes. For an update. Well, that explains why they couldn't put it on Microsoft Update. Anyway, I'm installing now, and I hope it means "seven gigabytes to unpack the update and swap it out for the old stuff" as opposed to "this update is seven gigabytes larger than the previous version".

    Either way, this is going to be downloading for quite a while.



  • @Ben L. said:

    Seven. Gigabytes. For an update. Well, that explains why they couldn't put it on Microsoft Update. […]

    That's... odd. Last I checked, Visual Studio can be updated through Microsoft Update on Windows XP, or Windows Update on Vista or newer if you have "Get updates for other Microsoft products" checked in the update options.



  • @powerlord said:

    That's... odd. Last I checked, Visual Studio can be updated through Microsoft Update on Windows XP, or Windows Update on Vista or newer if you have "Get updates for other Microsoft products" checked in the update options.

    Okay, the update finished. Here's what it says: [screenshot: the installer demanding yet another restart]

    That's right, TWO reboots required for a software update. In 2013.


  • Considered Harmful

    @Ben L. said:

    That's right, TWO reboots required for a software update. In 2013.

    What I love is when program X asks for a reboot, and as soon as Windows starts, program Y immediately demands a reboot! And sometimes program Z after that. Usually Adobe products.



  • @Ben L. said:

    Nope, still need to reboot. Apparently telling me which process I need to kill is too much effort.
    You're told to reboot before install if there are pending rename operations waiting (since those could replace the files that the installer will place with older versions - safer to reboot first, get those files installed/deleted, and then do the new install properly).
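
    (If you're curious, the queue the installer is reacting to is visible from PS - the registry value only exists while renames are actually pending:

    Get-ItemProperty 'HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager' |
        Select-Object -ExpandProperty PendingFileRenameOperations -ErrorAction SilentlyContinue

    Each pair of entries is a source path and its destination, with an empty destination meaning "delete".)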



  • @ender said:

    @Ben L. said:
    Nope, still need to reboot. Apparently telling me which process I need to kill is too much effort.
    You're told to reboot before install if there are pending rename operations waiting (since those could replace the files that the installer will place with older versions - safer to reboot first, get those files installed/deleted, and then do the new install properly).

    Right, because having anything resembling a reasonable API that an installer could use to model the filesystem state after all pending renames have been applied is clearly way too hard, and the Blakeyrat Doctrine forbids querying the Registry directly.



  • @flabdablet said:


    Right, because having anything resembling a reasonable API that an installer could use to model the filesystem state after all pending renames have been applied is clearly way too hard, and the Blakeyrat Doctrine forbids querying the Registry directly.

    I'm probably missing something big here, but - what the fuck is a pending rename? If a program wants to rename a file, it renames the fucking file.



  • Not if another program has a lock on it first.



  • @Salamander said:

    Not if another program has a lock on it first.


    Then the rename fails. There's no queue.



    Now consider a situation where there is a process running that does NOT have a file open (therefore no lock), but will attempt to load it at some random point in time. However, the file gets renamed.... Oops!

    Hence a "pending" rename. It's actually a "script"/"program" set to run during either the shutdown or startup phase ("Please wait while Windows configures your system") so that there are no other programs running and the process can be serialized.

    Yes, it is possible to design an operating system that does not require this, but it is NOT easy by any means. One OS used for communication/military systems had this requirement, and the constraints that leaked into (prohibited) normal operations were still quite annoying.



  • @Ben L. said:

    Then the rename fails. There's no queue.

    So then, how do you go about the case where you want a file to be renamed and/or replaced but don't particularly care that it gets done right now, just that it gets done at some point in the future?
    Perhaps with, I dunno, a queue of pending operations?



  • @Ben L. said:


    I'm probably missing something big here, but - what the fuck is a pending rename? If a program wants to rename a file, it renames the fucking file.

    Instead of complaining or making assumptions based on your limited exposure to multiple operating systems, why don't you RTFM. There are a few interesting sections in that PDF and at least one big WTF early on. Educate yourself.



  • @TheCPUWizard said:

    Now consider a situation where there is a process running that does NOT have a file open (therefore no lock), but will attempt to load it at some random point in time. However, the file gets renamed.... Oops! […]


    Here's a series of events. Tell me where a file could be renamed by an automatic process (between the two bold lines) and break something:

    1. I open Visual Studio
    2. Visual Studio informs me of an update
    3. I click on the update notification, opening Google Chrome
    4. With Google Chrome, I download the updater
    5. I run the updater
    6. The updater demands that I reboot before updating
    7. I close Visual Studio and retry the updater
    8. The updater still demands that I reboot
    9. I reboot my computer
    10. I open the updater
    11. The updater downloads and installs Visual Studio 2013 RC
    12. The updater demands I reboot again
    13. I reboot my computer again
    14. I open Visual Studio


  • @Ben L. said:

    I'm probably missing something big here, but - what the fuck is a pending rename? If a program wants to rename a file, it renames the fucking file.

    Yeah, you're missing something big there (hint: MOVEFILE_DELAY_UNTIL_REBOOT)
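
    For the curious, a sketch of how such a rename gets queued, P/Invoking the real API from PS (the file names are made up, and the call wants admin rights since it writes the machine-wide pending-renames list):

    $sig = '[DllImport("kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
            public static extern bool MoveFileEx(string src, string dst, uint flags);'
    Add-Type -Namespace Win32 -Name Native -MemberDefinition $sig
    $MOVEFILE_DELAY_UNTIL_REBOOT = 0x4
    # Queues the rename in PendingFileRenameOperations; nothing moves until reboot:
    [Win32.Native]::MoveFileEx('C:\temp\old.dll', 'C:\temp\new.dll', $MOVEFILE_DELAY_UNTIL_REBOOT)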



  • @Ben L. said:

    Here's a series of events. Tell me where a file could be renamed by an automatic process and break something: […]

    There could be files that the updater wants to archive that won't actually be where it expects to find them until right after step 9. It should be perfectly feasible to check for that but I've never seen anybody bother.



  • Interleaved into @Ben L.'s own list of steps:

    • I reboot my computer

      [I open a program that is using ANY PART OF .NET 4.x]

    • I open the updater
    • The updater downloads and installs Visual Studio 2013 RC

      [The installation is going to overwrite a part of .NET 4.x - it cannot, because another program is (or may be) using it.]

    • The updater demands I reboot again [because the previous step could not be reliably done]
    • I reboot my computer again


  • @TheCPUWizard said:

    The installation is going to overwrite a part of .NET 4.x - it cannot, because another program is (or may be) using it. […]

    So... The updater uses the dlls it modifies? Because I haven't installed anything other than Visual Studio remotely related to .NET.



  • @Ben L. said:

    So... The updater uses the dlls it modifies? Because I haven't installed anything other than Visual Studio remotely related to .NET.

    No, the update (installing VS2013RC) is CHANGING .NET - it has no idea if you have loaded any application which may possibly use .NET.

    If it changed the "live" versions, then all hell could break lose. So it Pends all of the changes to the .NET Framework until the reboot. Now since VS2013RC *requires* the NEW versions, it can not run until after the reboot....

    What is so hard to understand?



  • @Ben L. said:

    So... The updater uses the dlls it modifies?

    Quite plausibly, if the updater is itself a .NET application - which, in 2013, it probably is.



  • @TheCPUWizard said:

    If it changed the "live" versions, then all hell could break loose.

    And this (for the benefit of Ben L and other young'uns, don't think I'm lecturing at you) is the exact case that Unix used to deal with quite smoothly in the earlier, gentler age before everybody got a hard-on for late binding.

    Unix filesystems have always separated files and their metadata from the directory entries that link to them. File metadata lives in a structure called an inode. An inode contains a reference count; when that reference count is zero, the inode and all associated file blocks get returned to the free pool for re-use. Opening a file creates a file handle and bumps the reference count. So does creating a directory entry. So even if all directory entries for a given file are removed, the file will stay accessible to any processes with open handles on it.

    You can see this behaviour easily by starting playback of a movie, then using the file browser to delete the movie while it's still playing. As far as you can tell by looking at the filesystem, it's completely gone. You can create a new file with the same pathname and there's no conflict with the movie player, which will happily play the deleted movie all the way to the end. Monitor the filesystem's block allocation with df, and you'll see the free blocks jump up as soon as the movie player closes the movie.

    Back in the dawn of time, when executables did all the linking and mmapping of their own components and reading of their own config files that they were ever going to do right when they first started up, this behaviour was enough to let a Unix sysadmin replace anything on the system pretty much at will and not break any running process. Now that everything is sprawling and enormous and late binding has become the Right Thing, that doesn't work any more: replace an underlying library with a new version, and an application can easily end up trying to work with an incompatible mixture of new and old (because opened before replacement) library files.

    The FAT filesystems that Windows inherited from DOS have never had anything like inodes. They don't support multiple directory entries for a single file, so files never had reference counts; removing a file's directory entry deleted the file and returned its blocks to the free pool right then and there. When a process has a file open, that file's directory entry is locked; in-use files just can't be replaced, period. But sometimes they absolutely have to be, which is why MOVEFILE_DELAY_UNTIL_REBOOT is a thing. NTFS does have inodes in the form of MFT entries, but since Windows stuff is now at least as sprawling as Unix stuff, relying on reference counts is no longer sufficient to make updates bulletproof.

    As in so many things, there's a Linux alternative for MOVEFILE_DELAY_UNTIL_REBOOT that's rather more general-purpose: if you want to make wholesale changes to a running Linux box, and it's acceptable for those changes not to take effect until the next reboot, you can do it with full transactional semantics. Take an LVM snapshot of the root volume, mount the snapshot read/write, make all the changes you want to the mounted snapshot (optionally in a chroot jail, if you want all the pathnames to look the same for a package manager's benefit), then unmount it. At this point, you can undo all the changes by simply deleting the snapshot, or commit them all using lvconvert --merge (which will defer the actual commit until after the base volume is next activated; for the root volume, this will happen on reboot).

    If you're going to do that kind of thing, you do need to make sure the root volume is on LVM, and you need to make sure its volume group has enough free space to create a snapshot big enough to deal with all your changes. On a big modern disk that's usually very easy to arrange.

    If you're going to run successive interdependent sets of wholesale changes, and you want all of them to be deferred until reboot, you can do that too - as long as you have some kind of convention for naming the root snapshot. Once you've done lvconvert --merge on a snapshot made from an in-use volume, it doesn't actually go away until the merge has taken effect. It does get hidden, so you do need to know its name, but once you have that you can remount it and make as many additional changes as you like.

    In particular, you can make a running Linux box into a clone of another using this method and rsync.



  • @flabdablet said:

    @TheCPUWizard said:
    If it changed the "live" versions, then all hell could break loose.

    And this (for the benefit of Ben L and other young'uns, don't think I'm lecturing at you) is the exact case that Unix used to deal with quite smoothly in the earlier, gentler age before everybody got a hard-on for late binding.

    A good explanation (and I remember fondly dealing with the very first versions of Unix over 30 years ago), and it does cover a large part of the issue. But it is still not quite the whole story.

    Another condition is that you have a set of files, where the first file (a simple text file) is a list of other file names. At a program's startup, it reads the first file into memory. Then over time, it will process (in some fashion) all of the files it has identified by this list.

    Obviously (and regardless of filesystem) if any of the files that were in the list are deleted (remember they are not open yet), then it will be impossible to open them.




  • Discourse touched me in a no-no place

    @TheCPUWizard said:

    Another condition is that you have a set of files, where the first file (a simple text file) is a list of other file names. At a program's startup, it reads the first file into memory. Then over time, it will process (in some fashion) all of the files it has identified by this list.

    Obviously (and regardless of filesystem) if any of the files that were in the list are deleted (remember they are not open yet), then it will be impossible to open them.

    Cool story bro.

    Seriously though, what's that got to do with the subtleties of file identity semantics? (Well, apart from showing that some people write crappy programs and use the wrong data structures, but that's not news.)



  • @dkf said:


    Seriously though, what's that got to do with the subtleties of file identity semantics? (Well, apart from showing that some people write crappy programs and use the wrong data structures, but that's not news.)

    NOTHING... but it has everything to do with why it is damn hard (impossible, from a practical point of view) to have an operating system where it is possible to update anything you choose, while other programs are running, without there being some risk. The alternative is to require a cycle where you know you have 100% control of the machine.

    So it becomes a series of trade-offs.... If something is updated on the fly, there is the possibility that some piece of software, written by someone else, and maybe even a piece of crap, will have a problem - and regardless of cause, the company providing the update will be the one to get the blame.

