Messy installation



  • Trying to install a commercial OS on a drive that is also used by Linux, I get this strange drive letter assignation logic:

     

    [screenshot: assigning Windows drive letters]

     

    I understand why C and D are FAT partitions. I also understand why E and F are reserved for my DVD drives, considering the other partitions are 'unknown type'. What I don't understand is why C: is on the second drive, D: on the first, G H I J on the first and K L M N on the second drive... Anyway, it looks like a funny mess to handle; my OS is now installed on drive D: (first FAT partition, first drive)



  • You should see what it does when there's already another installation of windows around. It gets really stupid then.

    The underlying problem is that the drive letter metaphor is stupid and inappropriate. They've combined the overcomplicated design of VMS (volume labels) with the limitations of DOS (no control over the volume labels, limited to C-Z) and come up with a truly insane system.
     



  • I once had a problem where, if I installed XP RTM on a hard drive attached to an add-on IDE controller and did the partitioning from the XP setup disc, it would assign the first hard disk partition to the letter A.  And since these machines had a floppy drive, XP summarily assigned the floppy to the letter C.  Way to go, Windows.

    For the longest time, I used GDISK from Norton Ghost to partition new drives before installing XP, just to avoid this bug.  They seem to have fixed it since in some Service Pack.



  • I just like how it cannot cope with partition sizes > 99 GB :P It's a pet hate of mine, people who design their own user interface when superior existing code exists. Closely related is when the existing code isn't even superior to anything and has no provision to cope with edge cases. For example, Apple Mail in Tiger ships with a mail rule that's too big for the screen:

    The enclosing container should be automatically resizing, the window should be automatically resizing, and the window manager should have an intelligent constraints system that automatically sizes windows to fit the given space unless a specific size is required. One of my huge beefs with Mac OS X -- failure to innovate the basics. Such wasted opportunity. Heck, I run a PII 333 MHz PC and XUL -- which is hardly an efficient system -- performs pretty well. Even GTK+ isn't too bad lately although for larger apps it's still a painful mess (and that's the one where UI code is written in C/C++ and not JavaScript...).

    Windows Setup looks like a completely hand-coded UI. I remember those days. They were a pain.

    But C and A being swapped? That would be priceless to see.



  • I had 8 MB of unpartitioned space for a long time, too.  Anyone know what causes it?  I think I had to use Partition Magic to add it to my main partition.



  • @Cap'n Steve said:

    I had 8 MB of unpartitioned space for a long time, too. Anyone know what causes it? I think I had to use Partition Magic to add it to my main partition.

    I'm not quite sure, but I think it's reserved for logical disks. 



  • This partition manager often gets really stupid. I often had this problem with WinXP RTM (no service pack preinstalled):

    When I completely reformat my hard drive and reinstall Windows, creating a new partition (using the installer's manager), the installer assigns the letter C: to it, but after Windows is installed the system HDD (primary master) becomes D: and the CD-ROM (secondary master) becomes C:. I repeated the whole reformatting/reinstallation process and everything returned to normal.



  • @Cap'n Steve said:

    I had 8 MB of unpartitioned space for a long time, too. Anyone know what causes it? I think I had to use Partition Magic to add it to my main partition.

    Rounding error. I forget the details. It should be precisely one cylinder of space. 



  • Is it just me, or is that a photograph of the monitor? And, if so, why wasn't it placed against a wooden table backdrop?



  • It's hardly a WTF to take a photograph of the monitor when A) in an environment that does not allow saving of screenshots and B) in a circumstance that cannot be duplicated by doing the installation in an emulator.

    About as WTFy (i.e. not at all) as taking a picture when there's a Windows error on an ATM or on airport flight listings. 



  • @asuffield said:

    ...(no control over the volume labels, limited to C-Z)...

    Using the disk manager you can assign letters to drives as you desire, or not assign a letter at all and simply mount the volume under another drive.

    (To the op - FAT32? Why?)



  • Presumably for Linux r/w support to work reliably. You can apparently get a module that allows writing to NTFS, but I haven't investigated it; last I heard it was experimental.



  • Writing to NTFS in Linux is as stable as it is in Windows.



  • @benryves said:

    @asuffield said:

    ...(no control over the volume labels, limited to C-Z)...

    Using the disk manager you can assign letters to drives as you desire, or not assign a letter at all and simply mount the volume under another drive.

    You have become so used to the Windows idiocy that you have failed to even comprehend what they have done. In VMS and the other systems from which the concept of volume labels was inherited, you could name a volume just like you would any file. You were not limited to picking one of 23 capital letters. In Windows, you cannot choose the name, you're stuck with it because DOS only had enough memory for a static array.



  • @asuffield said:

    In Windows, you cannot choose the name, you're stuck with it because DOS only had enough memory for a static array.

    Aaaaaand cue your friendly neighborhood Microsoft apologist.

    You're not stuck with it because of the limitations of DOS.

    You're stuck with it because such a massive proportion of the independent developer community thought the limitations of DOS were forever.

    If you want to have names just like files, you simply share the root of the drive. You name it whatever you want. Programs that understand UNC paths will happily work with it; you just say "\\my longass volume name\blah\blah\blah\filename" and everything works great.

    Trouble is, we have these crapplications from who knows when that simply insist on knowing the drive letter and local path for these files. So no matter how smart the rest of your apps are, all your volumes must also have a single drive letter so they don't break crapplications.

    But that requires you to think about someone other than yourself. Instead of saying "I don't have any crap that needs that, why do I need to do this?" you would need to say "gee, Microsoft has billions of customers, maybe they're not all just like me".
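The UNC-path workaround described above can be illustrated with a short sketch. Python's `pathlib` understands Windows UNC paths, so you can see how a long share name takes over the role the drive letter normally plays; the machine and share names below are made up for illustration:

```python
from pathlib import PureWindowsPath

# Conventional drive-letter path vs. the UNC form for the same file on a
# volume whose root is shared under a long name. "mymachine" and the
# share name are hypothetical.
local = PureWindowsPath(r"D:\blah\blah\filename")
unc = PureWindowsPath(r"\\mymachine\my longass volume name\blah\blah\filename")

print(local.drive)             # the one-letter "drive": D:
print(unc.drive)               # the whole \\machine\share prefix acts as the "drive"
print(unc.name == local.name)  # same filename either way: True
```

Crapplications that split a path into "drive letter + local path" choke on the UNC form, which is exactly the compatibility problem being argued about here.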

     



  • Also, Windows does support Linux-style mounting (that is, my secondary drive has an access path of C:\whatever); it's available when you format the drive, IIRC.



  • @aythun said:

    Writing to NTFS in Linux is as stable as it is in Windows.

    I see what you did there...



  • @CDarklock said:

    If you want to have names just like files, you simply share the root of the drive. You name it whatever you want. Programs that understand UNC paths will happily work with it; you just say "\\my longass volume name\blah\blah\blah\filename" and everything works great.

    At about a third of the speed, because you're now shoving every local disk access through the CIFS server and client, and you've also created a whole new flavour of security issues. That is truly a Microsoft solution.

    Trouble is, we have these crapplications from who knows when that simply insist on knowing the drive letter and local path for these files. So no matter how smart the rest of your apps are, all your volumes must also have a single drive letter so they don't break crapplications.

    But that requires you to think about someone other than yourself. Instead of saying "I don't have any crap that needs that, why do I need to do this?" you would need to say "gee, Microsoft has billions of customers, maybe they're not all just like me".

    At least four times in the past ten years, Microsoft has created an entirely incompatible new interface for applications. The latest one is .NET. Any application written against it does not depend on any of the old APIs and none of the old applications can ever use the new one. You cannot excuse the perpetuation of this idiocy by saying "existing applications need it" when there are no existing applications using the incompatible new interfaces.



  • @Random832 said:

    It's hardly a WTF to take a photograph of the monitor when A) in an environment that does not allow saving of screenshots and B) in a circumstance that cannot be duplicated by doing the installation in an emulator.

    About as WTFy (i.e. not at all) as taking a picture when there's a windows error on an ATM machine or airport flight listings. 

    I was, of course, kidding.



  • @benryves said:



    (To the op - FAT32? Why?)

     

    Mainly because I partitioned on Linux first, creating 2 FAT partitions (no choice of a mkntfs, if I remember well). Then the Windows installer created, at my request, an NTFS partition on drive D: (where I installed Windows). But then it also installed the NT boot loader on the FAT32 C: (without giving me a chance to format it). I'm stuck with a C: which is a FAT32 partition originally meant to store my few games and now the NT boot loader. Note that because the boot loader is there, it refused to format it to NTFS just after install :/

     And to the question, why install on D:? Because I wanted to install on the first Windows partition on the first hard drive, which got assigned the letter D:.
     Now, because all I do in Windows is gaming, there's no need for anything else...

     



  • The real WTF is the "word" "assignation."

    Drive letters are assigned to active partitions first, then primary partitions, then logical drives within extended partitions.

    The top of your screen shot is cut off, so I can't be sure what is going on.  I'm guessing that the disk at the top is an IDE disk, but the bottom one is on the boot controller.
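The ordering rule above (active partitions first, then primaries, then logical drives) can be sketched in a few lines of Python. This models only that one priority rule, with made-up partition names; as later posts note, the real algorithm weighs many more factors:

```python
from string import ascii_uppercase

def assign_letters(partitions):
    """Assign C:, D:, ... using the simplified priority rule:
    active partitions first, then primary, then logical drives.
    Ties keep their original (on-disk) order because sort is stable."""
    rank = {"active": 0, "primary": 1, "logical": 2}
    ordered = sorted(partitions, key=lambda p: rank[p["kind"]])
    letters = ascii_uppercase[2:]  # start at C; A and B are reserved for floppies
    return {p["name"]: letters[i] + ":" for i, p in enumerate(ordered)}

# Hypothetical two-disk layout: the active partition on disk 1 wins C:
# even though disk 2's primary partition was listed first.
parts = [
    {"name": "disk2-part1", "kind": "primary"},
    {"name": "disk1-part1", "kind": "active"},
    {"name": "disk1-part2", "kind": "logical"},
]
print(assign_letters(parts))
# {'disk1-part1': 'C:', 'disk2-part1': 'D:', 'disk1-part2': 'E:'}
```

This also shows why the OP's layout looks scrambled: the letter order follows partition priority, not physical disk order.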
     



  • @operagost said:

    Drive letters are assigned to active partitions first, then primary partitions, then logical drives within extended partitions.

    While true as far as it goes, other factors involved in how Windows decides this include: the order in which device drivers were loaded when the NT kernel booted, SCSI ID numbers (if applicable), the existence and configuration of any other installations of Windows on any visible partition (with truly strange behaviour if there is more than one), the order in which partitions appear in the partition table, the order in which partitions appear on the disk (and no, those two are not the same thing), the partition type flag, the current contents of the partition (notably what type of filesystem, if any, is in it), and the information about the disk configuration supplied by the BIOS (which is frequently buggy).

    That is not an exhaustive list, those are just the ones that I happen to know about. I do not believe that this 'feature' of Windows has been particularly well tested; I think that Microsoft only does significant testing on simple hardware configurations, and the behaviour when presented with other non-trivial setups is buggy at best. 



  • @asuffield said:

    At least four times in the past ten years, Microsoft has created an entirely incompatible new interface for applications. The latest one is .NET. Any application written against it does not depend on any of the old APIs and none of the old applications can ever use the new one. You cannot excuse the perpetuation of this idiocy by saying "existing applications need it" when there are no existing applications using the incompatible new interfaces.

    No.  The only situation in which this was partially true is the .NET 1.1 -> .NET 2.0 transition; however, you can even make a .NET 1.1 assembly run in .NET 2.0.  .NET 3.0 is nothing more than a slight superset of .NET 2.0.  .NET 1.1 was completely compatible with .NET 1.0.  Don't confuse versions of Visual Studio with compatibility.



  • @ShadowWolf said:

    @asuffield said:

    At least four times in the past ten years, Microsoft has created an entirely incompatible new interface for applications. The latest one is .NET. Any application written against it does not depend on any of the old APIs and none of the old applications can ever use the new one. You cannot excuse the perpetuation of this idiocy by saying "existing applications need it" when there are no existing applications using the incompatible new interfaces.

    No.  The only situation in which this was partially true is the .NET 1.1 -> .NET 2.0 transition; however, you can even make a .NET 1.1 assembly run in .NET 2.0.  .NET 3.0 is nothing more than a slight superset of .NET 2.0.  .NET 1.1 was completely compatible with .NET 1.0.  Don't confuse versions of Visual Studio with compatibility.

    You have missed my point quite impressively. Windows NT -> .NET was a completely incompatible transition. C applications written against the NT API cannot run against .NET. There were no pre-existing .NET applications, so there is no excuse for perpetuating insanity in the name of compatibility. It was a completely new system for which software had to be largely rewritten. Another well-known transition with this behaviour was win16 -> win32, which even accompanied an OS rewrite.



  • @aythun said:

    Writing to NTFS in Linux is as stable as it is in Windows.


    Does anyone know how the NTFS for Linux development went?  It kind of worries me that NTFS was out for like 10 years and Linux had experimental read support and no write support, and then in the last couple of years they suddenly got to "just as good as Windows".



  • @Cap'n Steve said:

    @aythun said:
    Writing to NTFS in Linux is as stable as it is in Windows.


    Does anyone know how the NTFS for Linux development went? It kind of worries me that NTFS was out for like 10 years and Linux had experimental read support and no write support, and then in the last couple of years they suddenly got to "just as good as Windows".

    Short periods of intense activity interspersed with long periods of being completely ignored. It's not hugely difficult to implement NTFS when somebody actually bothers to do it. 



  • Speaking of drive letters, my Windows installs in the last few years ended up on U:, R:, L: and currently T:. I have no idea where the drive letters came from, because I didn't have that many partitions (and just 3 optical drives), and I only got a card reader before the last reinstall. Anyway, Windows NT lets you use just about every Unicode character as a drive letter, but most programs won't be able to see those drives. Most ASCII characters will work, though, and even though the drive "letters" won't appear in file dialogs, you can access them by typing them in manually.
    Also, if you know how, you can change all drive letters, including the boot and system drives, with some Registry editing (I actually don't know why Windows won't let you change the boot drive letter through the Disk Management UI - it's not like there are any files there that are needed after boot).

    As for the OP's C: drive, doing a Start -> Run -> convert C: /FS:NTFS should do the trick.
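For reference, the Registry editing mentioned above happens under HKLM\SYSTEM\MountedDevices, where each assigned letter is stored as a value named like \DosDevices\C: (volumes without a letter appear as \??\Volume{GUID} entries). A small sketch of how those value names map back to letters; the specific names and GUID below are made up, and actually reading the hive would need winreg on Windows:

```python
def drive_letters(value_names):
    """Extract assigned drive letters from MountedDevices value names.

    Letters are stored as values named '\\DosDevices\\X:'; volumes
    without a letter show up as '\\??\\Volume{GUID}' entries instead.
    """
    prefix = "\\DosDevices\\"
    return sorted(
        name[len(prefix):] for name in value_names
        if name.startswith(prefix)
    )

# Value names as they might appear in the hive (GUID is made up).
names = [
    r"\DosDevices\C:",
    r"\DosDevices\T:",
    r"\??\Volume{12345678-0000-0000-0000-000000000000}",
]
print(drive_letters(names))  # ['C:', 'T:']
```

Swapping two letters amounts to renaming the corresponding \DosDevices values, which is why the system drive can be changed there even though Disk Management refuses.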



  • @asuffield said:

    You have missed my point quite impressively. Windows NT -> .NET was a completely incompatible transition. C applications written against the NT API cannot run against .NET. There were no pre-existing .NET applications, so there is no excuse for perpetuating insanity in the name of compatibility. It was a completely new system for which software had to be largely rewritten. Another well-known transition with this behaviour was win16 -> win32, which even accompanied an OS rewrite.

    I'm afraid your point makes no sense, so missing it is fairly easy. "C applications written against the NT API" (there's no such thing as the "NT API", BTW - it's the Win32 API) should NOT run against .NET, as they can run perfectly well against the already existing Win32 API. That's like saying "Windows applications don't run on the Mac, therefore Microsoft didn't maintain compatibility." Comparing apples with oranges never makes sense; neither does comparing Lear Jets and bicycles.

    .NET applications should be compared for compatibility with other versions of .NET, not with totally different platforms.

    And Win16->Win32 wasn't that bad a transition, was it? If it was for you, you either need a better language choice or better skills. I made the transition with both C and Delphi; the Delphi transition was almost non-existent; the change in strings from ShortString [length 255 chars or less] to AnsiString [limited by available memory up to 2GB] was about the only thing that was significant - the change in pointer size was almost negligible unless you were intentionally doing things low-level.
     



  • @operagost said:

    The real WTF is the "word" "assignation."

    I guess you missed the fact that the OP is from Belgium, and therefore probably not a native English speaker. 

    And for the OP: "Assignation" typically is used to mean a romantic liaison. The word you're looking for is "assignment".
     



  • @asuffield said:

    @ShadowWolf said:
    @asuffield said:

    At least four times in the past ten years, Microsoft has created an entirely incompatible new interface for applications. The latest one is .NET. Any application written against it does not depend on any of the old APIs and none of the old applications can ever use the new one. You cannot excuse the perpetuation of this idiocy by saying "existing applications need it" when there are no existing applications using the incompatible new interfaces.

    No.  The only situation in which this was partially true is the .NET 1.1 -> .NET 2.0 transition; however, you can even make a .NET 1.1 assembly run in .NET 2.0.  .NET 3.0 is nothing more than a slight superset of .NET 2.0.  .NET 1.1 was completely compatible with .NET 1.0.  Don't confuse versions of Visual Studio with compatibility.

    You have missed my point quite impressively. Windows NT -> .NET was a completely incompatible transition. C applications written against the NT API cannot run against .NET. There were no pre-existing .NET applications, so there is no excuse for perpetuating insanity in the name of compatibility. It was a completely new system for which software had to be largely rewritten. Another well-known transition with this behaviour was win16 -> win32, which even accompanied an OS rewrite.

    I'm not sure where you're going with this.  .NET Framework is nothing more than an architecture - at some point in time it has to call the same kernel level APIs.  You can't create a file without, at some point in time, the kernel/fs drivers becoming involved.  I'm not sure what you believe the .NET framework did to Windows, but it still runs on the Windows Kernel.  Windows NT -> .NET is not a transition.  It's a decision. And there were a lot of NT related fixes.  There are other things that could not be fixed - such as oddities involved with .NET COM Interop & p/invoke.

    From what I've read and seen, a majority of apps still aren't even written in .NET or use the .NET framework for any reason. 

    Win16 -> Win32 did not do that either.  Win16 legacy stuff can be found all over in Win32 APIs and behaviors because people relied on the bugs.  If you don't believe me, do a bit of research on your own.  A starting place is the blog "Old New Thing" by Raymond Chen.  He kinda chats about some of these "crapplications" as someone else said.  OS Rewrite != removal of legacy code.

    Or In short:  Yes I missed your point because you haven't made a lucid point yet.

    You want a case in point of how wrong you are?  Vista broke a lot of backwards crapplications (I like that term now).  When was the last time you heard people complaining about driver authors being lazy and not getting on the ball with their stuff?  Or people complaining about application authors for not supporting their backware?  Nope.  They complain about Vista breaking old applications.

    They complain about UAC prompts because people can't figure out the rocket science of application development.  Rather than fix their poorly designed application that's been wrong for 7 years, they decide to require administrative rights.  [Not to say Vista doesn't have bugs!  Anyone with so much as a laptop running Vista can concur that the SP is badly needed.]  Most of the people doing the so-called "Upgrade" to XP are the people who blame Microsoft because their idiotically designed application violates Microsoft's rules on app design.

    Why?  Because they took initiative to break some of the security vulnerabilities that people clamored for.  You pine for the security fixes and then you curse them for actually doing them!



  • @ShadowWolf said:

    Win16 -> Win32 did not do that either.  Win16 legacy stuff can be found all over in Win32 APIs and behaviors because people relied on the bugs.

    At the risk of sounding stupid, I think the point that A was trying to make was that Microsoft should have taken the initiative, when devising the Win32 API (for NT 3), to overhaul all the considerable flaws in the design of the OS. To be fair, they did make some improvements, such as increasing the maximum path length from a measly 64 to a slightly more healthy 255.

    @ShadowWolf said:

    They complain about Vista breaking old applications.  They complain about UAC prompts because people can't figure out the rocket science of application development. 

    Well, it's a bit late for Microsoft now. Once you've bent over backwards like a tetanus victim for years and years until your spine breaks to pretend that people's shitty software works when it doesn't, people will come to expect each new Windows version to support old software perfectly.

    I don't know why Microsoft didn't grow some balls a long time ago and make it perfectly clear why certain software no longer works. It's not their problem; why should they care about it? It's not like offending all the shoddy software vendors would cost them sales or their monopoly, since they'd find all new ways to lock people in. Maybe they'd even gain some respect for a devotion to purity and integrity of the OS, although that would also imply that the rest of the system be something vaguely close to being well-designed.



  • @Daniel Beardsmore said:

    @ShadowWolf said:

    Win16 -> Win32 did not do that either.  Win16 legacy stuff can be found all over in Win32 APIs and behaviors because people relied on the bugs.

    At the risk of sounding stupid, I think the point that A was trying to make was that Microsoft should have taken the initiative, when devising the Win32 API (for NT 3), to overhaul all the considerable flaws in the design of the OS. To be fair, they did make some improvements, such as increasing the maximum path length from a measly 64 to a slightly more healthy 255.

    @ShadowWolf said:

    They complain about Vista breaking old applications.  They complain about UAC prompts because people can't figure out the rocket science of application development. 

    Well, it's a bit late for Microsoft now. Once you've bent over backwards like a tetanus victim for years and years until your spine breaks to pretend that people's shitty software works when it doesn't, people will come to expect each new Windows version to support old software perfectly.

    I don't know why Microsoft didn't grow some balls a long time ago and make it perfectly clear why certain software no longer works. It's not their problem; why should they care about it? It's not like offending all the shoddy software vendors would cost them sales or their monopoly, since they'd find all new ways to lock people in. Maybe they'd even gain some respect for a devotion to purity and integrity of the OS, although that would also imply that the rest of the system be something vaguely close to being well-designed.

    What Microsoft should've done is certainly something I'm not going to argue with.  Even today, I gripe and complain (probably weekly, maybe less often, but it feels like at least once a week) about some stupid measure they've taken that helps bad development practices.

    The thing about some of those was that they had no choice.  Having been in a situation where you're trying to migrate people from Tech A to Tech B, it's hard to convince them to do so without some form of monstrosity.  You end up with some kind of Tech A.5 that has the disadvantages of both and the advantages of neither, basically.  But the point is that developers don't make the decisions - management does.  And often times, managers like to think "OH!  We can still use 80% of the code we used to have".

    This is outside and beyond the argument and constant issue of misuse of Windows APIs though.

    They started to break things in XP SP2 and no one really complained.  Most people were touting the "It's security" moniker, so they figured it would still work.  It didn't for one reason or another (IE7 is a PITA like Vista, but comparatively speaking no one seems to care).



  • @KenW said:

    @asuffield said:

    You have missed my point quite impressively. Windows NT -> .NET was a completely incompatible transition. C applications written against the NT API cannot run against .NET. There were no pre-existing .NET applications, so there is no excuse for perpetuating insanity in the name of compatibility. It was a completely new system for which software had to be largely rewritten. Another well-known transition with this behaviour was win16 -> win32, which even accompanied an OS rewrite.

    I'm afraid your point makes no sense, so missing it is fairly easy. "C applications written against the NT API" should NOT run against .NET, as they can run perfectly well against the already existing Win32 API. That's like saying "Windows applications don't run on the Mac, therefore Microsoft didn't maintain compatibility."

    You have managed to understand all the facts I was stating while completely missing the point. My working hypothesis is that you are stupid. It is indeed exactly like saying that Windows applications don't run on the Mac, therefore Microsoft (actually, Apple) didn't maintain compatibility.

    The point is that you cannot then turn around and say that Apple had to do something in order to maintain compatibility with Windows, because they weren't even trying to maintain compatibility. The claim I was responding to was that Microsoft had to preserve a misfeature for compatibility reasons. This claim doesn't hold because on numerous occasions, Microsoft have changed APIs so fundamentally that no effort at compatibility was possible or intended, and any one of those instances would have been an adequate opportunity to eliminate the misfeature.

    (there's no such thing as the "NT API", BTW - it's the Win32 API)

    Programs written against the NT API do not necessarily run on every win32 platform. All versions of Windows 4.x (95/98/ME/etc) implemented the win32 API, but are not binary-compatible with applications that run on NT, while all applications written against win32 will run on the NT platforms (modulo 'security' misfeatures of newer versions). The NT API includes a lot more stuff. You have perhaps been misled by the Microsoft marketing position that they're the same thing - they aren't. NT applications don't run on win95.


    And Win16->Win32 wasn't that bad a transition, was it?

    That depends entirely on which features you were using. Some parts changed more than others. There wasn't a great deal of difference in how you would display a window and render some text into it, but there were major changes in networking, interfaces to system components (explorer, control panel), etc. 



  • @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.



  • @asuffield said:

    @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.

    No - that's not my point at all.  My point is the specific instances you've mentioned so far have not been opportunities.  You said:

    At least four times in the past ten years, Microsoft has created an entirely incompatible new interface for applications.

    This is false.  I have shown why, and even took the opportunity to prove the opposite (that you must maintain backward compatibility) with a real-world example to link the two scenarios together.



  • @ShadowWolf said:

    @asuffield said:
    @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.

    No - that's not my point at all.  My point is the specific instances you've mentioned so far have not been opportunities.  You said:

    At least four times in the past ten years, Microsoft has created an entirely incompatible new interface for applications.

    This is false.  I have shown why, and even took the opportunity to prove the opposite (that you must maintain backward compatibility) with a real-world example to link the two scenarios together.

    Read for comprehension: _HIS_ point was that there are changes that they should have made, that they did not make, and that the only excuse for not having made them is backwards compatibility. Therefore in cases where backwards compatibility was not maintained, they have no excuse for not having changed these things.



  • @Daniel Beardsmore said:

    Well, it's a bit late for Microsoft now. Once you've bent over backwards like a tetanus victim for years and years until your spine breaks to pretend that people's shitty software works when it doesn't, people will come to expect each new Windows version to support old software perfectly.

    I don't know why Microsoft didn't grow some balls a long time ago and make it perfectly clear why certain software no longer works. It's not their problem; why should they care about it? It's not like offending all the shoddy software vendors would cost them sales or their monopoly, since they'd find all new ways to lock people in. Maybe they'd even gain some respect for a devotion to purity and integrity of the OS, although that would also imply that the rest of the system be something vaguely close to being well-designed.

    Raymond Chen (http://blogs.msdn.com/oldnewthing) has several dozen posts on this very subject.

    In a nutshell, suppose that a Fortune 500 corporation has several internal applications that they depend on to keep their business running. Also suppose that they outsourced the development of these applications to a consulting company who wrote software that depended on things being done by the OS in a specific (perhaps undocumented) way, and the corporation didn't receive the source.  Now MS releases a new version of Windows that fixes or changes the undocumented feature that the consultants had relied on, and the applications no longer work. Who gets the blame (and bad publicity)? Even though MS just did what you think they should have and fixed the problem, it's not gonna be "Those stupid consultants broke our software." Why not? Because the software didn't change. So it has to be Microsoft's fault, doesn't it? So backward compatibility isn't optional.

    You keep thinking of your own software, maybe. You know, the single application that runs on the machines at the company you work for (maybe 50 users or so). If you release a new version, you have control over the environment those users operate in, and can handle updating things to support your software. MS doesn't have that luxury, though. They write an operating system, not a desktop application. That OS has to run tons of software written by thousands (if not millions) of programmers, and MS has to make sure that new releases of Windows continue to run the same software as the previous version of Windows did (even if that software did things wrong), or MS gets the blame.

    It's not a question of "Microsoft not growing some balls". It's a question of you not understanding the difference in scale between your (and my) little application and user base and the millions of different users, computer systems, and software running on Windows. It also has nothing to do with "offending shoddy vendors"; it's about the end user running the software, and about who gets the blame for compatibility issues even when it isn't their fault.

    And no, I'm not a major MS fanboy (although I do run Windows). I use CodeGear's Delphi as my primary development environment for Windows programming. I also have a Linux box at home. I just think that people who write things without any comprehension of the issues that they're writing about should learn from their mistake.

    You should spend some time at Raymond's blog. It really is educational. 



  • @asuffield said:

    You have managed to understand all the facts I was stating while completely missing the point. My working hypothesis is that you are stupid. It is indeed exactly like saying that Windows applications don't run on the Mac, therefore Microsoft (actually, Apple) didn't maintain compatibility.

    No, we've proven in the other thread about literacy that you're an absolute moron and jackass, and you've further proven it here by degrading yourself by launching a personal attack because I showed your idiocy yet again here.

    @asuffield said:

    The point is that you cannot then turn around and say that Apple had to do something in order to maintain compatibility with Windows, because they weren't even trying to maintain compatibility. The claim I was responding to was that Microsoft had to preserve a misfeature for compatibility reasons. This claim doesn't hold because on numerous occasions, Microsoft have changed APIs so fundamentally that no effort at compatibility was possible or intended, and any one of those instances would have been an adequate opportunity to eliminate the misfeature.

    And again, you've proven that you have no reading comprehension (and apparently no development knowledge, either). The Win32 API didn't change as a result of .NET, as you keep trying to say it did. The .NET Framework is a layer on top of the Win32 API (and indeed contains calls to the API throughout), and not a replacement for the API. Perhaps if you can grasp this fundamental fact, you'll quit being such an ass (at least publicly here in these fora).

    @asuffield said:

    Programs written against the NT API do not necessarily run on every win32 platform. All versions of Windows 4.x (95/98/ME/etc) implemented the win32 API, but are not binary-compatible with applications that run on NT, while all applications written against win32 will run on the NT platforms (modulo 'security' misfeatures of newer versions). The NT API includes a lot more stuff. You have perhaps been misled by the Microsoft marketing position that they're the same thing - they aren't. NT applications don't run on win95.
     

    WTF are you talking about? I write applications all the time that run perfectly fine on Win95/98/ME/NT4/XP/2003/Vista. If you keep your head out of your ass and write the code to handle differences in the API properly, it's easy to keep supporting different versions of Windows. NT applications run fine on Win95, as long as you don't try and blindly use NT features on Win95; you can still use NT features on NT from the same app that doesn't use them on Win95.
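    The pattern described above -- one binary that only uses NT features where they exist -- is runtime feature detection; on Win32 that means probing with LoadLibrary/GetProcAddress instead of linking the symbol unconditionally. A rough, platform-neutral sketch of the same idea in Python, with `os.sendfile` standing in for an API that may be absent on some platforms:

    ```python
    import os

    def copy_with_best_available(src_fd, dst_fd, count):
        # Probe for the optional fast path at runtime instead of assuming
        # it exists; the Win32 equivalent is LoadLibrary/GetProcAddress.
        sendfile = getattr(os, "sendfile", None)  # absent on some platforms
        if sendfile is not None:
            return sendfile(dst_fd, src_fd, 0, count)
        # Fallback that works everywhere, like a Win95-safe code path.
        data = os.read(src_fd, count)
        os.write(dst_fd, data)
        return len(data)
    ```

    Either branch produces the same result, so the app degrades gracefully instead of failing to load on a platform that lacks the newer call.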

    Let me guess... Software development isn't your real profession, is it? You probably "code" in HTML, don't you? You certainly don't do any kind of real professional development beyond toy stuff. 

    @asuffield said:

    That depends entirely on which features you were using. Some parts changed more than others. There wasn't a great deal of difference in how you would display a window and render some text into it, but there were major changes in networking, interfaces to system components (explorer, control panel), etc. 

    And, yet again, these differences were negligible if you had a clue about what you were doing. But it appears that you couldn't buy a clue if someone gave you a million dollars to get one. 

    Come back and respond to a post that you at least have some limited knowledge about; maybe somebody will ask about what sort of pail they should buy to scoop sand at the beach. Google could probably give you an answer to that question that you could comprehend. Hell, Google Maps would even give you directions - you could seem semi-intelligent! That would be a switch!



  • @asuffield said:

    @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.

    And you're wrong yet again. Read my response to the other post about why this wasn't possible, and perhaps spend some time at Raymond's site (URL in the other post). Hey! That would be a great idea - you could probably learn a little about real development there as well! 

    ShadowWolf, give up. asuffield lacks the basic intelligence, logic, and comprehension to understand how wrong his information is; he appears to just enjoy spreading BS without any real information or facts. 



  • @ShadowWolf said:

    @asuffield said:
    @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.

    No - that's not my point at all.  My point is the specific instances you've mentioned so far have not been opportunities.

    The other guy's a troll, but you appear to be paying some degree of attention, so I'll try one last time:

    When Microsoft created .NET, they did have the opportunity to create new APIs that did not include the limitations of the old ones, so that new applications could have supported more flexible filename structure; there were no pre-existing .NET applications, so there could be no compatibility issues. For another instance, when they created win32, they also had that opportunity. That's two specific cases when they could have eliminated this issue (not to mention a whole bunch of others). The design of the software they used would even have easily permitted it, in both cases - they just didn't do it. Frankly, I doubt they even thought of it.

    When they created Vista, there was no such API change, so they did indeed not have an opportunity at that particular point in time - that isn't relevant.



  • @asuffield said:

    @ShadowWolf said:
    @asuffield said:
    @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.

    No - that's not my point at all.  My point is the specific instances you've mentioned so far have not been opportunities.

    The other guy's a troll, but you appear to be paying some degree of attention, so I'll try one last time:

    When Microsoft created .NET, they did have the opportunity to create new APIs that did not include the limitations of the old ones, so that new applications could have supported more flexible filename structure; there were no pre-existing .NET applications, so there could be no compatibility issues. For another instance, when they created win32, they also had that opportunity. That's two specific cases when they could have eliminated this issue (not to mention a whole bunch of others). The design of the software they used would even have easily permitted it, in both cases - they just didn't do it. Frankly, I doubt they even thought of it.

    When they created Vista, there was no such API change, so they did indeed not have an opportunity at that particular point in time - that isn't relevant.

    Well... to be completely fair, it's the filesystem. What happens when you save a file with a new, more flexible, shinier filename structure from a .NET application and then try to open it in a non-.NET application? What happens when you use drag and drop? The file drop list format is just a list of filenames. While backwards compatibility _itself_ might not be as much of a problem, there's still interoperability.



  • @asuffield said:

    @ShadowWolf said:
    @asuffield said:
    @ShadowWolf said:

    Vista broke a lot of backwards crapplications (I like that term now).

    While true, that is irrelevant. My point was of the form "some opportunities for doing this existed at various points in time"; your response is of the form "at least one point in time existed when they did not have an opportunity". That doesn't even disagree with what I wrote.

    No - that's not my point at all.  My point is the specific instances you've mentioned so far have not been opportunities.

    The other guy's a troll, but you appear to be paying some degree of attention, so I'll try one last time:

    When Microsoft created .NET, they did have the opportunity to create new APIs that did not include the limitations of the old ones, so that new applications could have supported more flexible filename structure; there were no pre-existing .NET applications, so there could be no compatibility issues. For another instance, when they created win32, they also had that opportunity. That's two specific cases when they could have eliminated this issue (not to mention a whole bunch of others). The design of the software they used would even have easily permitted it, in both cases - they just didn't do it. Frankly, I doubt they even thought of it.

    When they created Vista, there was no such API change, so they did indeed not have an opportunity at that particular point in time - that isn't relevant.

    Microsoft's implementation of .NET relies on Win32, so while it can hide the visibility of the API's oddities, it wasn't an opportunity to update or remove those nuances from the Win32 side without breaking existing applications (note: I understand that not all .NET functionality relies on Win32, but to my understanding most of it does).  Win32 required a bridge to get 16-bit developers to move to 32-bit.  Because of that, certain things had to be done to appease the majority of developers.  See my above post for my reasons for that.

    Vista contained substantial security and functionality changes.  There is no illustrative difference between the move from 16-bit to 32-bit and the move from insecure code to security-focused code.  The only real difference is a matter of degree and functionality; the same opportunities to break code exist.  In Win32 they didn't break it.  In Vista they did.  See my example for the net result.



  • @asuffield said:

    The other guy's a troll, but you appear to be paying some degree of attention, so I'll try one last time:

    God! Don't you ever quit trying to beat the dead horse that you've ridden all the way to Totally Wrong Town???

    @asuffield said:

     

    When Microsoft created .NET, they did have the opportunity to create new APIs that did not include the limitations of the old ones, so that new applications could have supported more flexible filename structure; there were no pre-existing .NET applications, so there could be no compatibility issues. For another instance, when they created win32, they also had that opportunity. That's two specific cases when they could have eliminated this issue (not to mention a whole bunch of others). The design of the software they used would even have easily permitted it, in both cases - they just didn't do it. Frankly, I doubt they even thought of it.

    When they created Vista, there was no such API change, so they did indeed not have an opportunity at that particular point in time - that isn't relevant.

    And yet again, you're wrong. As ShadowWolf pointed out very eloquently to you, .NET runs ON TOP OF the existing Win32 API. .NET IS NOT A NEW PLATFORM!!! 

    I'm hoping that, by typing in all caps and bold letters, it might sink in. Sorta like yelling to help a deaf person understand what you're saying, I know; you're obviously not intelligent enough to understand what has been explained to you multiple times by multiple people. If it was just me that you weren't understanding, I'd think I was not explaining it well enough for your level of comprehension (like you do when speaking to a small child, for example, I've tried to bring things down to a level you could grasp). But apparently even the child level is too much for you.

    You can't make such drastic changes to a layer that relies on the underlying platform; if you do, you make the new layer unusable. See ShadowWolf's explanation of new file names, for instance. Surely even you can grasp why that would be a problem... It's the same reason that, when MS changed the API with Win32 to support long filenames, they implemented a backward-compatibility feature so that applications that didn't support long filenames still worked; you could use a 16-bit Windows app or a DOS app to open a file created with a long filename on Win95. Since .NET isn't a new platform, but simply a framework on top of the existing platform, the comparable strategy wouldn't be available.
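    For illustration only, here is a toy sketch of the visible shape of those backward-compatible short names (the familiar `LONGFI~1.TXT` aliases). This is not Windows' actual algorithm, which also handles collisions, illegal characters, and OEM code pages:

    ```python
    def short_alias(long_name, seq=1):
        """Toy sketch of the visible shape of Windows 8.3 aliases,
        NOT the real generation algorithm."""
        stem, _, ext = long_name.rpartition(".")
        if not stem:                  # no dot: the whole name is the stem
            stem, ext = long_name, ""
        # Uppercase and drop spaces/dots, as the real aliases appear to do.
        stem = "".join(c for c in stem.upper() if c not in " .")
        ext = "".join(c for c in ext.upper() if c != " ")[:3]
        base = f"{stem[:6]}~{seq}"    # truncate and add the ~N suffix
        return f"{base}.{ext}" if ext else base

    # short_alias("Annual Report 2007.doc") -> "ANNUAL~1.DOC"
    ```

    An old 8.3-only app sees `ANNUAL~1.DOC` and can open the file; a long-filename-aware app sees the real name. That dual view is exactly what a .NET-only filename scheme would have lacked.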

    MS does have an opportunity to get rid of some of the cruft from the old compatibility stuff with Win64, if they choose to take it. Whether they will fully is doubtful, as the Win32 subsystem on Win64 still has to retain much of that backwards-compatibility. They could, however, drop it in the Win64 layer only, as apps to take advantage of the Win64 features need to be rewritten anyway.

    I really hope that ShadowWolf and I have finally dumbed things down enough for you to follow; after reading the ongoing idiocy you're posting to the literacy thread, however, I have my doubts. 



  • @KenW said:

    In a nutshell, suppose that a Fortune 500 corporation has several internal applications that it depends on to keep its business running. Also suppose that it outsourced the development of these applications to a consulting company that wrote software depending on the OS doing things in a specific (perhaps undocumented) way, and the corporation didn't receive the source.  Now MS releases a new version of Windows that fixes or changes the undocumented feature the consultants had relied on, and the applications no longer work. Who gets the blame (and bad publicity)? MS just did what you think they should have and fixed the problem, but the headline isn't going to be "Those stupid consultants broke our software." Why not? Because the software didn't change. So it has to be Microsoft's fault, doesn't it?...

    Because obviously it's Microsoft's job to coddle and protect companies too stupid to get a proper contract or a proper contractor for their software work, and too stupid to understand that perhaps all the money that went into executive salaries instead of their lowest-bidder contract work might have resulted in bad code. That's not what an OS vendor is there for. Horrifying as Raymond's examples of some of the things Microsoft did find are, it's up to developers to develop software.

    I will give companies credit for one thing and one thing only -- Microsoft can't program a working OS to save their lives. Their documentation stinks, the API is incomprehensible and full of functions that don't work, and it's too hard to tell whose fault anything is. If Windows was a top-notch OS then it would be easier to contemplate that maybe a program was doing something bad. But the reputation of Windows does, sadly, overshadow this greatly with the fear that it wasn't your code that broke, but the OS. But then again, why on earth would you ever need to keep changing the API so dramatically so often other than that you made such a mess of it to begin with? I watched a video recently of the introduction to the Amiga 1000 in 1985 and was in complete awe -- a pre-emptively multitasked, full-colour consumer OS one year after the Macintosh 128.

    Apple took a very long time to fix their OS, which they wrecked from the start by deciding that since 128 KB was too little to support GUI multitasking (fair enough), they would prevent the OS from ever having multitasking of any kind, let alone pre-emptive. Their change from Mac OS 9 to X, though, while far more drastic than anything Microsoft were ever willing to go through, helped far more than it hindered. Painful, but it brought the Mac into the modern age and got people loving it again. The kinds of people who switched to Mac after X came out have really surprised me.

    I've certainly been on the receiving end of programs that broke due to external changes where it was not possible to get the software rewritten. One was sending effectively invalid FTP commands (two spaces between STOR and the filename) that one version of Windows server tolerated, but after an upgrade at our Web host to Windows 2003, the server no longer coped with the mistake. I think that one was a real mess. I think Windows 2003 (presumably IIS) was storing the spaces in the filenames on disc -- not legal in Windows -- but stripping them back out when reading from the disc, so the files could no longer be opened (the old visible-but-inaccessible flaw seen on various systems). A bunch of inconsistencies in Windows compounded by a design flaw in our program.
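    The quirk is easy to picture: a lenient server collapses the extra whitespace, while a strict one treats everything after the first space as the filename, so the name arrives with a leading space. A hypothetical sketch (`parse_stor` is illustrative, not a real API):

    ```python
    def parse_stor(line, lenient=True):
        """Sketch of the double-space STOR quirk described above."""
        verb, _, rest = line.partition(" ")
        if verb.upper() != "STOR":
            raise ValueError("not a STOR command")
        # A lenient server strips the surrounding whitespace; a strict one
        # keeps everything after the first space, leading space and all.
        return rest.strip() if lenient else rest
    ```

    With `"STOR  report.txt"`, the lenient parse yields `"report.txt"` and the strict parse yields `" report.txt"` -- a different file as far as the filesystem is concerned.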

    Yes, .NET is a new framework, but Apple have faced and dealt with similar problems. Starting with Mac OS 8.1, the Mac shipped with a new file system, HFS+, that supported 255-character names. Mac OS 9 could not read these long names, which created another potential visible-but-inaccessible flaw: a file with a name longer than 31 characters would be shown with a truncated name. Ask the Finder to do anything to that file, and it would refuse, because a file with that truncated name does not exist!

    However, provision was made for the OS to cope with both name lengths and, starting with Carbon perhaps, maybe earlier, Mac OS 9 apps can read and write full 255-character names. The OS then generates a special filename for normal applications to see -- sadly, a much more hideous mess than Microsoft's "*~1.*" approach. But it did exist, and it did work.

    There already are nasty inconsistencies in Windows right now. I can go Start > Run > firefox and Firefox loads. I can run cmd, type firefox, and simply get an error. The path-free binary lookup doesn't exist in cmd. Nor does cmd support UNC paths. Some other applications also fail to support UNC paths. It's not as if we haven't, thus far, had features that aren't universally supported.
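    The Start > Run behaviour comes from the shell consulting the App Paths registry key before falling back to PATH, while cmd only searches PATH. A rough sketch of that lookup order, with a hypothetical dictionary standing in for the registry key (the Firefox path is made up for illustration):

    ```python
    import os

    # Hypothetical stand-in for HKLM\...\CurrentVersion\App Paths; on a
    # real system this would be read from the registry.
    APP_PATHS = {"firefox.exe": r"C:\Program Files\Mozilla Firefox\firefox.exe"}

    def resolve_like_run_dialog(name, path_dirs):
        """Why Start > Run finds 'firefox' but cmd does not: the shell
        checks App Paths first, then PATH; cmd only searches PATH."""
        exe = name if name.lower().endswith(".exe") else name + ".exe"
        if exe.lower() in APP_PATHS:          # shell-only step
            return APP_PATHS[exe.lower()]
        for d in path_dirs:                   # the part cmd also does
            candidate = os.path.join(d, exe)
            if os.path.exists(candidate):
                return candidate
        return None
    ```

    With an empty PATH, `resolve_like_run_dialog("firefox", [])` still resolves via the App Paths step, which is precisely the step cmd skips.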

    That said, it's deeper than that. The Windows file space is quite a complex beast. Windows does not restrict itself simply to drive letters. You have this complex space with things like Network Neighbourhood, Desktop, and My Documents all as custom objects outside the drive-letter space, with the drives themselves at Desktop > My Computer > C: etc. So there's no reason why you can't add a new shell library to support all the custom drive designations in such a way that all applications can see them. It's a bit tricky, as they don't have regular paths (which is the real WTF -- a virtual space with no way to refer back to it!) and, although a path can be anything (e.g. UNC), programs may want to perform internal path validation. If you stick to UNC naming, it might work with all applications that understand it. Maybe fall back to some (spare) drive letters for apps that really, really don't get it. But altering .NET to have different paths wouldn't be simple.



  • @Cap'n Steve said:

    @aythun said:
    Writing to NTFS in Linux is as stable as it is in Windows.


    Does anyone know how the NTFS for Linux development went?  It kind of worries me that NTFS was out for like 10 years and Linux had experimental read support and no write support, and then in the last couple of years they suddenly got to "just as good as Windows".

    Last I checked, what they did was take the binary NTFS driver from Windows, and write a Linux wrapper around it.



  • @Daniel Beardsmore said:

    Nor does cmd support UNC paths.
    I've read somewhere that the rationale for cmd not supporting UNC paths is that it could cause problems when you tried running an old (DOS) application from a UNC path, since the current directory wouldn't have a drive letter (you can run a program from a UNC path directly from cmd.exe; it just won't let you cd to it). Of course, this doesn't seem to bother file managers, which run such programs just fine.

    @Carnildo said:

    Last I checked, what they did was take the binary NTFS driver from Windows, and write a Linux wrapper around it.
    That happened quite a long time ago now. The current fad is an NTFS driver in user space.



  • There already are nasty inconsistencies in Windows right now. I can go Start > Run > firefox and Firefox loads. I can run cmd, type firefox, and simply get an error. The path-free binary lookup doesn't exist in cmd.


    Because, from a user perspective, application management in Windows has never been properly thought out.

    Network Neighbourhood, Desktop, My Documents all as custom objects outside of the drive letter space, with drives themselves being at Desktop > My Computer > C: etc.


    Isn't it the same for X? The Finder has its column on the right with icons that are little more than links to folders on the hard drive. Same thing for Explorer's Desktop, MyDocs, Trash, etc. Just a friendly GUI layer on top of the real situation. "Desktop" really isn't physically the mother of all.

    Hell, you can change where My Documents points using TweakUI (not a very wise thing to do, though, because applications that use My Documents -- and many do -- will start putting their files there instead of in the real My Documents folder under your user folder in Documents and Settings).



  • @KenW said:

    And yet again, you're wrong. As ShadowWolf pointed out very eloquently to you, .NET runs ON TOP OF the existing Win32 API. .NET IS NOT A NEW PLATFORM!!! 

    This is also the reason why Wine only supports Windows applications that use Unix-style pathnames... oh, wait.



  • I dunno - I never had much issue with MSDN's API documentation as a whole.  Speaking personally, though, I've yet to see a well-documented, fully explained API set, even from Apple.  Aside from that, the following line demonstrates a fundamental misunderstanding of the reason things are the way they are:

    But then again, why on earth would you ever need to keep changing the API so dramatically so often other than that you made such a mess of it to begin with?

    The issue is not that the API changed.  The issue is that people did things the API was never designed to do -- things that didn't error out -- and relied on them for application behavior.  It's as if you relied on a string overflow with strcpy for your application to work: not because you were malicious, but because you thought strcpy was just supposed to work that way.  If they plug the hole, at whom are you going to complain?  And it's not a docs issue, because half the time it's obvious they never referenced the docs anyway.  They just googled around until they found the API and then filled it out from the sample instead of doing their own work.
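    The strcpy analogy translates to any language: the application depends on an accident of the implementation rather than on documented behaviour, so it "works" right up until the implementation changes. A deliberately broken sketch of the same pattern, using CPython's small-integer cache in place of the strcpy hole:

    ```python
    def looks_equal(a, b):
        # Broken on purpose: relies on CPython's small-integer cache (an
        # implementation accident), not the documented '==' operator.
        return a is b

    # "Works" for small integers, which CPython happens to cache:
    assert looks_equal(7, 7)
    # ...and silently breaks outside the cached range:
    assert not looks_equal(int("1000"), int("1000"))
    ```

    If a future interpreter dropped the cache, the first call would stop "working" too -- and the author would blame the interpreter, not the code, which is exactly the dynamic described above.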



  • @iwpg said:

    @KenW said:

    And yet again, you're wrong. As ShadowWolf pointed out very eloquently to you, .NET runs ON TOP OF the existing Win32 API. .NET IS NOT A NEW PLATFORM!!! 

    This is also the reason why Wine only supports Windows applications that use Unix-style pathnames... oh, wait.

    Then of course there is Mono, which manages to do a similar job with many .NET apps with no Win32 API in sight.
     

