Why desktop operating systems have completely failed at their job for the last 20 years



  • @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    What happens if the software's not in a package manager because the developer, for example, wants to sell it for a profit? What's the Linux solution to that? Is it "fuck off, asshole?" I think it is.

    You can install packages directly without having to use the central repository.

    Granted, if they're selling a Debian style package, they're probably going to tell you to toss the .deb into the package manager's cache directory and then install it so it does all the dependency resolution stuff.
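    For what it's worth, modern apt can resolve dependencies for a standalone .deb directly, so the cache-directory trick isn't needed any more. A sketch (the package file name is a placeholder):

    ```shell
    # apt 1.1+ accepts a local .deb path and pulls its dependencies
    # from the configured repositories:
    sudo apt install ./vendor-app_1.0_amd64.deb

    # On older systems: install with dpkg, then let apt fix up
    # the missing dependencies.
    sudo dpkg -i vendor-app_1.0_amd64.deb
    sudo apt-get -f install
    ```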


  • Considered Harmful

    @lucas1 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

Duh. If the prerequisites are not available, of course it won't work. You won't get a .NET 2.0 program to work on 1.1 either. Or anything of the sort on any system.

    Whoosh.

    Except .NET 2.0 is the same on every Windows system where it is installed. Every Linux distro tends to have slightly different versions of the same lib, which makes it a nightmare if you have a proprietary program. That's why nobody can be arsed to support anything other than Red Hat and Ubuntu.

    If that's what you need (i.e. official support, not what Steam users need), why not use RHEL/CentOS or Ubuntu? Using Windows "because you need $SOFTWARE" is a good reason but using RHEL for the same reason is not?

    Surprise, surprise! Not using the package manager fucks things up. Even Steam has learned this and is distributing proper RPMs now. dnf localinstall and you're good.
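    Spelled out, that step looks like this (the package file name is a placeholder):

    ```shell
    # dnf resolves the local RPM's dependencies from the enabled repositories:
    sudo dnf install ./steam-1.0-1.x86_64.rpm

    # equivalent older spelling:
    sudo dnf localinstall ./steam-1.0-1.x86_64.rpm
    ```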

    So you have to make a package for about 5 different distros because they are all incompatible. If someone hasn't made a package for your system that works with dnf / apt-get or whatever, you will have to resort to using alien or something similar, or to using a tarball, which is a headache in itself.

    ... or making a package for your system. So? The point is, using the package manager saves users a huge headache, and for developers writing a SPEC file or a bunch of DEB configs is still far easier than rolling your own functionality for everything that's not included in your baseline system or alternatively including those third-party libraries and making a release (and expecting users to find it, download and install it) whenever one of those components is updated.
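    For a binary-only vendor application, the SPEC file being described can be quite small. A minimal sketch; every name, path, and dependency here is hypothetical:

    ```
    # myapp.spec -- minimal sketch for a prebuilt binary
    Name:           myapp
    Version:        1.0
    Release:        1%{?dist}
    Summary:        Example proprietary application
    License:        Proprietary
    Requires:       libcurl, openssl

    %description
    Example payload packaged for the system package manager.

    %install
    mkdir -p %{buildroot}/opt/myapp
    install -m 0755 myapp %{buildroot}/opt/myapp/myapp

    %files
    /opt/myapp/myapp
    ```

    The Requires: lines are the whole point of the exercise: the package manager installs those for the user instead of the vendor bundling them.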

    Funnily enough I never have these problems with OSX. Because it does application installation sensibly.

    As in "package everything with the app"?

    ... or third parties develop their packages according to the guidelines. Works fine for us, we have hundreds of custom CentOS packages.

    So you are supporting one distro that does change very often. Which is the whole fucking point.

    Wat? CentOS/RHEL have just about the longest release/support cycle of all of them.

    I can take a 32-bit exe on Windows and install it today and it will probably work. If you try doing that on Linux you are in dependency hell.

    Yeah, Linux started moving away from 32-bit-only software over 20 years ago. Anything that is not 64-bit clean in 2017 is a rotting piece of shit that hasn't been maintained for more than a decade. I should know, I have the joy of administrating some of these rotting pieces of shit for conservative™ clients. Still no dependency hell there, they just live on VMs with a Linux from the last millennium.



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Wat? CentOS/RHEL have just about the longest release/support cycle of all of them.

    That has its own issues.

    For example, I recently ran across a binary server application that requires GLIBC 2.18 or newer. For the record, GLIBC 2.18 is from August 2013.

    Red Hat 7.3, released November 2016, ships with GLIBC 2.17.
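    That kind of mismatch is easy to check from the shell. The first command is glibc-specific; the commented-out one assumes binutils is installed, and ./server stands in for the binary from the post:

    ```shell
    # What the running system provides (prints e.g. "glibc 2.17"):
    getconf GNU_LIBC_VERSION

    # What a vendor binary demands -- the highest GLIBC_* symbol version
    # it references (uncomment to run against a real binary):
    # objdump -T ./server | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n1
    ```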



  • @lucas1 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @Gurth said in Why desktop operating systems have completely failed at their job for the last 20 years:

    That might just be the real big difference with Windows, and if you ask me, it’s not as big a deal as the Windows crowd tends to make it out to be.

    It isn't until you need it.

    Then, if you really need that app and can’t find a decent substitute, you install it in a VM running an older version of the OS.


  • Considered Harmful

    @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    No. Libraries are there to be used, and a proper package manager makes that a complete non-issue. So what if I type "aptitude install blender" and it tells me it's going to pull in 438 packages? I say "y" and have a coffee (on Lao internet that is; in Europe I could maybe read half of your post while it's working).

    There's no way any application that requires 438 packages is a good application. It's not possible. You're probably installing Discourse there.

    I pulled that number outta my ass, it doesn't matter if it's 15 or 500. I'd be curious how many NodeJS packages are required by NodeBB. With Perl stuff you can easily reach a few hundred packages already, and Node seems to have packages for the most trivial stuff.

    If I want to remove it later, the package manager will notice nothing needs those 438 packages any more and offer to remove them all.

    What happens if the software's not in a package manager because the developer, for example, wants to sell it for a profit? What's the Linux solution to that? Is it "fuck off, asshole?" I think it is.

    Software can not be "in a package manager". The package manager is the software that manages packages, which it usually (but not necessarily) pulls out of repositories. If a commercial program is not in a repository that's because the vendor has not set up one. Probably due to the mistaken belief that repositories had anything to do with selling software for a profit.

    Of course it would suck to do that on Windows, which is why every crappy little app brings with it half a gigabyte of libraries.

    Again: any app with half a gigabyte of libraries is going to suck regardless of the OS or how those libraries are installed/distributed.

    AMD's latest Windows driver has half a gigabyte, Nvidia's is >400 MB. So there's a library dependency of, say, a password cracker using CUDA.

    What is "the OS" to you?

    The shit that ends up on your computer after Ubuntu or Windows is done installing. Any other definition is moronic and stupid.

    That's a pretty big "absolute bare minimum" then.

    If you add the typical GNU userland and packages you find in a Linux distribution, you actually get far more functionality out of the box than on OSX.

    Really.

    I don't believe GNU includes anything which can, for example, embed a web browser in an app's window. Which isn't to say Linux can't do that, in fact Linux has several ways of doing that, none of which you can guarantee will be on any particular Linux at any particular time.

    Why would you need to? Make your package depend on it to have it automatically installed before yours, and you're done.
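    In the Debian world, "make your package depend on it" is literally one line in debian/control. A fragment, with hypothetical package names (libwebkit2gtk would cover the embedded-browser case mentioned above):

    ```
    Package: myapp
    Architecture: amd64
    Depends: libwebkit2gtk-4.0-37, libc6 (>= 2.17)
    Description: Example app whose web view comes in via declared dependencies
    ```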

    On Unixes, SIGHUP is conventionally used for that, and it's implemented by a whole bunch of server software that needs to be reconfigurable at runtime.
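    The convention can be demonstrated even in plain shell; everything below (the config file, the variable name) is made up for the sketch:

    ```shell
    #!/bin/sh
    CONFIG="$(mktemp)"
    echo "greeting=hello" > "$CONFIG"

    load_config() {
        . "$CONFIG"            # re-read settings; a real daemon re-parses here
    }

    trap 'load_config' HUP     # SIGHUP = "reload your configuration"
    load_config
    echo "before: $greeting"

    echo "greeting=bonjour" > "$CONFIG"   # config edited while "running"
    kill -HUP $$                          # what an admin does: kill -HUP <pid>
    echo "after: $greeting"               # the handler has re-read the file
    rm -f "$CONFIG"
    ```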

    "Typically used for" != "if the OS created this feature it would work 100% of the time so users wouldn't think it was broken out-of-the-box".

    Who thinks what was broken, and why?

    Also, only server software? That's, what, maybe 10% of software people use day-to-day? At best.

    Why would you want Minesweeper to include a mechanism to reload some kind of configuration that nobody ever changes outside the program anyway, and if anyone did that it wouldn't hurt at all to simply restart the program?

    Or the crutches people invented to get around the lack of preemptive multitasking.

    And yet practically speaking, multitasking in Mac Classic worked better than in Windows, at least until Windows 2000 came out. Sure Windows 95 had "proper" virtual memory and "proper" preemptive multitasking, but that didn't stop it from spending 45 seconds grinding your hard drive to dust every time you switched windows.

    If you ran it with too little memory, yes. Which, granted, everyone did, because they didn't want off-putting, realistic requirements on the box. Try running a renderer on MacOS Classic, or a CD writing program or something else that tries to hog the CPU or has tight timing requirements. Either the program becomes slow and/or unstable, or the machine gets completely unresponsive.

    Works fine for us, we have hundreds of custom CentOS packages.

    Who is "us" and "we", exactly?

    My workplace.


  • Considered Harmful

    @powerlord said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Wat? CentOS/RHEL have just about the longest release/support cycle of all of them.

    That has its own issues.

    For example, I recently ran across a binary server application that requires GLIBC 2.18 or newer. For the record, GLIBC 2.18 is from August 2013.

    Red Hat 7.3, released November 2016, ships with GLIBC 2.17.

    Oh, definitely! Tell me about the joys of still having to write for Perl 5.10.1 that's been EOL'd for like a decade because that's what the production servers have :angry:
    I just don't know where @lucas1 gets the idea that CentOS changes frequently.



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Software can not be "in a package manager".

    Posters to DailyWTF can not not be "pedantic dickweeds".

    Did you know what I meant when you read that? Oh, you did? So shut the fuck up.

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    If a commercial program is not in a repository that's because the vendor has not set up one. Probably due to the mistaken belief that repositories had anything to do with selling software for a profit.

    So the reason there's hardly any commercial software on Linux is because of a mistaken belief. Do you really believe that?

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    So there's a library dependency of, say, a password cracker using CUDA.

    Right, that software so common to using a desktop experience. I mean, I can't even remember the last time I went to my mom's house and saw her, once again!, running her password cracker using CUDA on her laptop. Candy Crush hasn't been touched in months.

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    That's a pretty big "absolute bare minimum" then.

    I think it would save the Linux community a hell of a lot of market confusion if they'd just admit to themselves that using the word "Linux" to refer to Ubuntu is counter-productive.

    Of course they'd never stop doing that for two reasons:

    1. It's easier to win debates like this one if the word "Linux" can magically switch from describing an entire OS to describing just a kernel to describing an entire ecosystem of experimental features and back again.
    2. User hostility is baked-in at literally every level. It's not enough that the software is nightmarish for human beings to actually use, you also need to call it "GIMP" for that extra dose of repulsiveness.

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Why would you want Minesweeper to include a mechanism to reload some kind of configuration that nobody ever changes outside the program anyway, and if anyone did that it wouldn't hurt at all to simply restart the program?

    We're talking about a theoretical OS feature that can sync application settings between two computers, remember? We're also talking about desktop OSes, remember? Scroll up and take a refresher.

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    If you ran it with too little memory, yes. Which, granted, everyone did,

    So practically speaking that fancy-pants Windows 95 performed worse than Mac Classic machines of the same era.

    Man, I could have saved all your confusion if I'd just put "practically speaking" in the original text you're replying to.

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    My workplace.

    This is a great debating technique. Think your argument is weak? Invent a group you (suddenly, without warning) represent, then start dropping those "we"s all over! You're weeing all over your post!

    Now you just have to sidle into work on Monday and say, "hey guys, with your help we really slammed that Blakeyrat guy on a forum over the weekend! Thanks for electing me your DailyWTF representative, apparently!"


  • Considered Harmful

    @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Software can not be "in a package manager".

    Posters to DailyWTF can not not be "pedantic dickweeds".

    Did you know what I meant when you read that? Oh, you did? So shut the fuck up.

    You seem to be confused about how that works because your question makes no sense. Why would a "package not being in a package manager" have anything to do with who is or isn't selling it for a profit?

    If a commercial program is not in a repository that's because the vendor has not set up one. Probably due to the mistaken belief that repositories had anything to do with selling software for a profit.

    So the reason there's hardly any commercial software on Linux is because of a mistaken belief. Do you really believe that?

    :moving_goal_post: You'd have to explain why MSI is an inherently better format to sell your software in than RPM if you want to claim using a package manager was somehow bad for business.

    So there's a library dependency of, say, a password cracker using CUDA.

    Right, that software so common to using a desktop experience. I mean, I can't even remember the last time I went to my mom's house and saw her, once again!, running her password cracker using CUDA on her laptop. Candy Crush hasn't been touched in months.

    Ah, that's the VGA console Candy Crush that doesn't need a graphics card driver, isn't it?

    That's a pretty big "absolute bare minimum" then.

    I think it would save the Linux community a hell of a lot of market confusion if they'd just admit to themselves that using the word "Linux" to refer to Ubuntu is counter-productive.

    Of course they'd never stop doing that for two reasons:

    1. It's easier to win debates like this one if the word "Linux" can magically switch from describing an entire OS to describing just a kernel to describing an entire ecosystem of experimental features and back again.

    Huh? I'm totally fine with your definition. I just don't understand how you can claim that everything that comes on an Ubuntu DVD is the "absolute bare minimum". In fact for most people that's everything they will ever install and work with.

    Why would you want Minesweeper to include a mechanism to reload some kind of configuration that nobody ever changes outside the program anyway, and if anyone did that it wouldn't hurt at all to simply restart the program?

    We're talking about a theoretical OS feature that can sync application settings between two computers, remember? We're also talking about desktop OSes, remember? Scroll up and take a refresher.

    You said as far as you knew the OS had no way to signal such a mechanism. I answered that in fact it does and it has been used for decades. That as a programmer you can always choose to implement it or not has no bearing on this.

    If you ran it with too little memory, yes. Which, granted, everyone did,

    So practically speaking that fancy-pants Windows 95 performed worse than Mac Classic machines of the same era.

    Unless you tried to do exotic stuff like burning a CD, yes.
    TBH, in that era I'd have taken a Mac over a PC, too, if that had been the choice. But that was more due to the superior hardware. "Plug'n'Play" shit and everything that preceded PCI was a nightmare, on top of the piled-up crap from "real mode" to BIOSes and all that, while the Mac had PREP, ADB and I-forget-what-the-bus-was.

    My workplace.

    This is a great debating technique. Think your argument is weak? Invent a group you (suddenly, without warning) represent, then start dropping those "we"s all over! You're weeing all over your post!

    I'm not the one getting all pissy here :) But yes, I'm sorry, I should have clarified that "we", I didn't know it would cause an allergic reaction.

    Now you just have to sidle into work on Monday and say, "hey guys, with your help we really slammed that Blakeyrat guy on a forum over the weekend! Thanks for electing me your DailyWTF representative, apparently!"

    I was replying to @lucas1 who inexplicably likes to blame package managers for his failed attempts at hacking stuff to work with libraries it wasn't linked to. Scroll up and take a refresher.



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    You'd have to explain why MSI is an inherently better format to sell your software in than RPM if you want to claim using a package manager was somehow bad for business.

    Demonstrably it is, since the MSI format is used on an OS that has a non-zero number of third-party software sales.

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Huh? I'm totally fine with your definition. I just don't understand how you can claim that everything that comes on an Ubuntu DVD is the "absolute bare minimum".

    You obviously didn't read what my definition was. (Either that, or when you install Ubuntu it literally copies every byte on the DVD to the computer, which I suppose is possible?)

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    You said as far as you knew the OS had no way to signal such a mechanism. I answered that in fact it does and it has been used for decades. That as a programmer you can always choose to implement it or not has no bearing on this.

    Right?

    What does this have to do with the actual argument I was making? (Which is, to remind your ass, "there'd be no point to implementing the settings-sync feature since it would simply appear to be broken to the end-user until 99% of apps supported it". Also keep in mind we're talking about DESKTOP OSes, not people doing password cracking or whatever. You can always scroll up if you get confused.)

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    I'm not the one getting all pissy here

    I'm not pissy. I'm just pointing out your cheap rhetorical technique to remove its power. (Not that it wasn't obvious anyway, but my experience is most people don't think much about what they read. For example, there was a guy around here who read "desktop operating systems" and replied with a bunch of shit about password cracking using CUDA.)

    For the record, if I ever start saying "we" that's because I represent 7 billion people all of whom are super-model PhD's and if you don't obey us without question then you're a moron. If you're going to suddenly be a "we", at least go all-out.


  • Considered Harmful

    @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    You'd have to explain why MSI is an inherently better format to sell your software in than RPM if you want to claim using a package manager was somehow bad for business.

    Demonstrably it is, since the MSI format is used on an OS that has a non-zero number of third-party software sales.

    Yes, that's about the level of technical argument I thought you'd get :rolleyes:
    I wonder why it is that I can go buy, say, Maya for Linux and install it using yum.

    Huh? I'm totally fine with your definition. I just don't understand how you can claim that everything that comes on an Ubuntu DVD is the "absolute bare minimum".

    You obviously didn't read what my definition was. (Either that, or when you install Ubuntu it literally copies every byte on the DVD to the computer, which I suppose is possible?)

    It's been a while since I installed Ubuntu but yes, I think it has an "everything" option. Not that it made much sense as apt will always pull in dependencies when you need them, but if you don't understand the concept you can have it that way.

    You said as far as you knew the OS had no way to signal such a mechanism. I answered that in fact it does and it has been used for decades. That as a programmer you can always choose to implement it or not has no bearing on this.

    Right?

    What does this have to do with the actual argument I was making? (Which is, to remind your ass, "there'd be no point to implementing the settings-sync feature since it would simply appear to be broken to the end-user until 99% of apps supported it".

    That was your (badly copied and incomprehensible the way you originally put it) response to me pointing to SIGHUP. Why don't you scroll up for a change instead of talking about it all the time?

    Also keep in mind we're talking about DESKTOP OSes, not people doing password cracking or whatever. You can always scroll up if you get confused.)

    You serious about playing Candy Crush without a gfx driver?

    I'm not pissy. I'm just pointing out your cheap rhetorical technique to remove its power.

    Huh? I don't even see how talking about "we" instead of alleging it was me who did all the packaging could be used as a "rhetorical technique". What "power" do you think it had in the first place?


  • Impossible Mission - B

    @anonymous234 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @svieira said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Midori

    That article was interesting as fuck, but ultimately it's all about a new language-compiler-runtime model that provides safety, not security, basically a more efficient .NET. It probably does make implementing security easier, there's a section about how to implement capabilities in an object-oriented way, but I don't think it's the biggest obstacle right now.

    We can already implement permissions checking in both plain old machine code and .NET, we just don't do it enough.

    Historically, a huge percentage of security exploits occur as a direct result of violating safety. Not all, of course, but it's a significant enough proportion that it's simply not reasonable to treat the two as entirely distinct concepts.


  • Notification Spam Recipient

    This post has been obliviated!


    Filed under: FFS I don't like this phone with a DPI of like 600!


  • Notification Spam Recipient

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Ah, that's the VGA console Candy Crush that doesn't need a graphics card driver, isn't it?

    Works fine with Microsoft Basic Display Driver "installed". Though if you don't have a separate sound card (for some reason it seems most don't anymore), no sound.

    Honestly that's not such a bad thing either...


  • Notification Spam Recipient

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    You serious about playing Candy Crush without a gfx driver?

    Yup. It's definitely doable. Wouldn't recommend actually doing it though, because fuck Candy Crush.


  • Considered Harmful

    @Tsaukpaetra said in Why desktop operating systems have completely failed at their job for the last 20 years:

    You serious about playing Candy Crush without a gfx driver?

    Yup. It's definitely doable. Wouldn't recommend actually doing it though, because fuck Candy Crush.

    Also, @blakeyrat's mom.
    Then again, that's only 2.x MB of driver—it must be much better than AMD's.



  • @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    What happens if the software's not in a package manager because the developer, for example, wants to sell it for a profit? What's the Linux solution to that? Is it "fuck off, asshole?" I think it is.

    Well, the solution in the most popular Linux distribution is to have a payment mechanism inside the package manager (Google Play Store / Amazon Appstore).



  • @Grunnen said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    What happens if the software's not in a package manager because the developer, for example, wants to sell it for a profit? What's the Linux solution to that? Is it "fuck off, asshole?" I think it is.

    Well, the solution in the most popular Linux distribution is to have a payment mechanism inside the package manager (Google Play Store / Amazon Appstore).

    Or just fork a commercial distro that has commercial products included in the repository, like RHEL or SUSE.

    Or create a distro where the OS is just something bundled with your product, like Ubuntu bundled with Untangle Firewall.



  • @blakeyrat said in Why desktop operating systems have completely failed at their job for the last 20 years:

    What happens if the software's not in a package manager because the developer, for example, wants to sell it for a profit? What's the Linux solution to that? Is it "fuck off, asshole?" I think it is.

    In Ubuntu, I'm pretty sure you can just distribute a .deb file. The user can double click and select "install", which will download any dependencies from the official repositories, or you can include them in your file if you want.

    No, while repositories are definitely far from perfect, they are not the main problem.

    The REAL problem is, as I've said before, that there is no such thing as the "GNU/Linux OS". Ubuntu is one arbitrary set of open source software configured one way, RHEL is a different one, and OpenSuse is a different one.

    So you simply can't make a "Linux program", unless you only want to use text I/O and don't depend on external services. You make an Ubuntu program, or an OpenSUSE program.

    There are a number of "standards" (OpenDesktop, LSB) that attempt to specify how to do certain things so that programmers can rely on them, but it's an impossible goal, because if all distros actually followed those they'd simply end up being the same distro.


  • Discourse touched me in a no-no place

    @anonymous234 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    So you simply can't make a "Linux program", unless you only want to use text I/O and don't depend on external services. You make an Ubuntu program, or an OpenSUSE program.

    Or you depend on a higher-level platform on top of those variations that hides all that stuff. Which is what most people actually do. (As a bonus, it also usually makes the program easier to port to other platforms too.)


  • Notification Spam Recipient

    @anonymous234 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    The REAL problem is, as I've said before, that there is no such thing as the "GNU/Linux OS". Ubuntu is one arbitrary set of open source software configured one way, RHEL is a different one, and OpenSuse is a different one.

    Yeah, I was actually discussing this exact thing with an associate today.



  • @Unperverted-Vixen ...does anyone know why it was canceled? :(


  • FoxDev

    @sh_code The best I can find is 'it was ahead of its time'. Nevertheless, some of the tech has been released as:

    • Entity Framework
    • Hierarchical data (SQL2008 and later)
    • Sync Framework


  • @RaceProUK said in Why desktop operating systems have completely failed at their job for the last 20 years:

    The best I can find is 'it was ahead of its time'.

    Yes, and Microsoft wasn't able to create a production-ready implementation after announcing it.



  • @TimeBandit And most people want to search the way they do with Google, not with an SQL statement or a query builder?



  • @Grunnen said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @TimeBandit And most people want to search like with Google and not like using an SQL statement or a query builder?

    And considering how Bing search gives you accurate results, maybe we're better off without WinFS 🍹



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    The point is, using the package manager saves users a huge headache, and for developers writing a SPEC file or a bunch of DEB configs is still far easier than rolling your own functionality for everything that's not included in your baseline system or alternatively including those third-party libraries and making a release (and expecting users to find it, download and install it) whenever one of those components is updated.

    Having a proper build system FFS. And distribute the binary properly. It really isn't that hard. I've been doing it since Uni.



  • @anonymous234 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    In Ubuntu, I'm pretty sure you can just distribute a .deb file. The user can double click and select "install", which will download any dependencies from the official repositories, or you can include them in your file if you want.

    It is basically still using apt-get.

    Ubuntu has PPAs to get around this; on Fedora you have to add an extra repo.
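    The Ubuntu side of that is a two-step dance (the PPA and package names are placeholders):

    ```shell
    # Enable a third-party PPA, refresh the index, install from it as usual:
    sudo add-apt-repository ppa:somevendor/stable
    sudo apt update
    sudo apt install someapp
    ```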



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Still no dependency hell there, they just live on VMs with a Linux from the last millennium.

    Whenever anyone says "just run a VM" for something that needs to see the outside world: that is a huge security problem.


  • Considered Harmful

    @lucas1 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    The point is, using the package manager saves users a huge headache, and for developers writing a SPEC file or a bunch of DEB configs is still far easier than rolling your own functionality for everything that's not included in your baseline system or alternatively including those third-party libraries and making a release (and expecting users to find it, download and install it) whenever one of those components is updated.

    Having a proper build system FFS.

    One that blindly downloads all your dependencies, rebuilds and re-releases? Cool story, bro.

    And distribute the binary properly. It really isn't that hard. I've been doing it since Uni.

    Have you also found a way of getting Blakey's mom to pass by candycrush.com every other day to check for updates, download them (entering her credit card for good measure—we're talking real, commercial software here after all!) and install them? You should share that proper distribution channel with Adobe & Co., they still think yet another useless memory hog of an "updater service" was the way to go.

    @lucas1 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Still no dependency hell there, they just live on VMs with a Linux from the last millennium.

    Whenever anyone says "just run a VM" about something that needs to see the outside world: that is a huge security problem.

    Unmaintained undead turds from the last millennium tend to have security problems. News at 11.



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    One that blindly downloads all your dependencies, rebuilds and re-releases? Cool story, bro.

    Jesus Christ, you are stupid. Even if you are creating an RPM (I've done it) you still need to get hold of the sources from somewhere.

    If you are distributing a binary you should include all the things that are required for it to work, or make sure the necessary prerequisites are there if you can't, for whatever reason, redistribute them.

    This isn't hard and hard drive space and network bandwidth are cheap these days.
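    For context, the SPEC file being argued about is genuinely small; a minimal sketch (package name, version and URLs are all hypothetical):

```
# myapp.spec, a minimal hypothetical example
Name:           myapp
Version:        1.2.3
Release:        1%{?dist}
Summary:        Example application
License:        Proprietary
URL:            https://example.com/myapp
Source0:        https://example.com/downloads/myapp-%{version}.tar.gz
Requires:       libsomething >= 2.5

%description
Example packaged binary; runtime dependencies are resolved by the package manager.

%prep
%setup -q

%install
mkdir -p %{buildroot}%{_bindir}
install -m 0755 myapp %{buildroot}%{_bindir}/myapp

%files
%{_bindir}/myapp
```

    The Source0 line is where "getting hold of the sources" comes in; for a binary-only package it can just as well point at a prebuilt tarball.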



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Unmaintained undead turds from the last millennium tend to have security problems.

    I am talking about the OS. The actual application might be fine.



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    One that blindly downloads all your dependencies, rebuilds and re-releases? Cool story, bro.

    It's possible to download specific commits or run checksums. Not every build system just pulls in the latest unverified code and links it.
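    As a sketch of the checksum approach (the function and values here are illustrative, not any particular build system's API):

```python
import hashlib

def verify_checksum(data: bytes, expected_sha256: str) -> bool:
    """Compare the SHA-256 of downloaded source data against a pinned digest."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

# Pin the digest once, at the time you vet the dependency...
payload = b"vetted tarball contents"
pinned = hashlib.sha256(payload).hexdigest()

# ...then every later build refuses anything that doesn't match.
print(verify_checksum(payload, pinned))           # True
print(verify_checksum(b"tampered bits", pinned))  # False
```

    Pinning a git commit hash achieves the same thing, since the commit ID is itself a content hash.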



  • @LB_ It is almost like he doesn't know what he is on about.


  • Considered Harmful

    @LB_ said in Why desktop operating systems have completely failed at their job for the last 20 years:

    One that blindly downloads all your dependencies, rebuilds and re-releases? Cool story, bro.

    It's possible to download specific commits or run checksums. Not every build system just pulls in the latest unverified code and links it.

    Of course it is. Point is, that is work. It's stuff you have to do, and if you decide to take a long holiday you have to pay someone to do it or your software won't get any security fixes. And that's not even talking about getting your clients to actually install your updates.



  • @TimeBandit said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Yes, and Microsoft wasn't able to create a production-ready implementation after announcing it.

    It's also possible that it was never actually intended to be a feature in the first place, and they only announced it to give their competitors cold sweats.


  • Considered Harmful

    @lucas1 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    One that blindly downloads all your dependencies, rebuilds and re-releases? Cool story, bro.

    Jesus Christ, you are stupid. Even if you are creating an RPM (I've done it) you still need to get hold of the sources from somewhere.

    Not necessarily, not even for the program you're packaging. What you most certainly don't need are the sources of your dependencies.

    If you are distributing a binary you should include all the things that are required for it to work, or make sure the necessary prerequisites are there if you can't, for whatever reason, redistribute them.

    Sure, that's what a package manager does for you.



  • @blakeyrat there were apparently* betas of it available, though I never saw any :( it seemed like a really good idea in a lot of ways, though.

    * as in, this was a thing according to Wikipedia.

  • Considered Harmful

    @lucas1 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Unmaintained undead turds from the last millennium tend to have security problems.

    I am talking about the OS. The actual application might be fine.

    The application might theoretically be fine but the probability is very low. Especially if you think of "the application" like you do, as "the program and all its dependencies".
    In reality, both of them are heaps of smelly, rotting bits.



  • @Grunnen that's an implementation detail: once you've got an SQL-like FS, the Google-like searching/filtering is a free subset of SQL-like searching/filtering, at most with some syntax conversion. "Use plain old search strings OR simple to most complex SQL queries!"

    done


  • 🚽 Regular

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Have you also found a way of getting Blakey's mom to pass by candycrush.com every other day to check for updates


    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    You should share that proper distribution channel with Adobe & Co., they still think yet another useless memory hog of an "updater service" is the way to go.


    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Of course it is. Point is, that is work. It's stuff you have to do, and if you decide to take a long holiday you have to pay someone to do it or your software won't get any security fixes. And that's not even talking about getting your clients to actually install your updates.


  • Considered Harmful

    @Zecc said in Why desktop operating systems have completely failed at their job for the last 20 years:

    Have you also found a way of getting Blakey's mom to pass by candycrush.com every other day to check for updates

    Sure, that's how it works. With apticron or the like that is.

    If you write your own update procedure for every program, you can use this tool, which makes a crontab look like the epitome of intuitive interface design (hey, the "Last Result" probably grew organically out of the enum { FALSE, TRUE, FILE_NOT_FOUND }), to run it. There must be a reason why hardly anyone does it, though? Maybe that the method requires additional custom services that run with SYSTEM privileges and generally seems to be effing hard to get right?

    You should share that proper distribution channel with Adobe & Co., they still think yet another useless memory hog of an "updater service" is the way to go.

    See above. If you've got Google's know-how it seems to be possible to do in a stable way. How that's supposedly preferable to simply dropping a trivial text file in /etc/yum.repos.d or the like is still beyond me.
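    That trivial text file really is all there is to it; a hypothetical vendor repo would look something like:

```
# /etc/yum.repos.d/example.repo (hypothetical vendor repository)
[example]
name=Example Vendor Packages
baseurl=https://packages.example.com/el8/$basearch/
enabled=1
gpgcheck=1
gpgkey=https://packages.example.com/RPM-GPG-KEY-example
```

    After that, the vendor's updates ride along with every regular dnf update, signed with the vendor's own key.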

    Of course it is. Point is, that is work. It's stuff you have to do, and if you decide to take a long holiday you have to pay someone to do it or your software won't get any security fixes. And that's not even talking about getting your clients to actually install your updates.

    We've been there. There are just two possibilities: either you blindly download all updated dependencies, rebuild and re-release, or you do some manual vetting, testing and QA. The former can be left to cron and is fucking stupid. The latter is work.


  • 🚽 Regular

    @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    There are just two possibilities: either you blindly download all updated dependencies, rebuild and re-release, or you do some manual vetting, testing and QA. The former can be left to cron and is fucking stupid. The latter is work.

    Thank you for reifying this point. It wasn't clear to me this was what you were saying.



  • @LaoC I'm not entirely sure what the argument is here.

    Of course OSs should have a way to automatically update software, if they don't it just means the developers will have to roll their own and it will be a mess (ahem).

    This doesn't mean that it has to come from a centralized repository (commercial developers would never let someone else handle distribution of their stuff). Packages could specify which URL and cryptographic key they want to use for updates. In fact the same protocol could be used to make something like Java Web Start that allowed people to launch programs with a single click (the major advantage of containers, remember?).

    And there should definitely be a bunch of "core" libraries that would be considered a part of the OS, maintained by the OS, and developers should be able to trust them not to break compatibility.

    So the question here is whether the other libraries that those programs depend on should be updated automatically by whatever means (and potentially break some apps), or be frozen on whatever version the app developer has tested with.

    I think both options have trade-offs, and it should be left up to the developers. Commercial developers making $5000 software will want full control over everything, while indie developers are fine with a solution that works 99% of the time even if it breaks occasionally. So they could just specify "require libsomething >=2.5.1" (or a URL, or a cryptographic signature, etc.) and be done with it.
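    That "require libsomething >=2.5.1" idea is exactly what existing package metadata already expresses; a hypothetical debian/control excerpt:

```
Package: myapp
Version: 1.0-1
Architecture: amd64
Maintainer: Example Dev <dev@example.com>
Depends: libsomething (>= 2.5.1), libc6 (>= 2.31)
Description: Hypothetical app pinning a minimum library version
```

    The package manager then guarantees those constraints at install time, instead of every app re-checking them itself at startup.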

    Just provide the means to implement whatever people want. No need to be software police if you're not even sure what policy to enforce.



  • @anonymous234 These complex solutions are messy. MS-DOS did the only correct thing, that is, unzip into a folder.

    Then by default I would want no application to have permission to write anywhere but its own folder (like chrooting to the executable path).

    Base all security on sandboxing this stuff.


  • Considered Harmful

    @anonymous234 said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @LaoC I'm not entirely sure what the argument is here.

    Of course OSs should have a way to automatically update software, if they don't it just means the developers will have to roll their own and it will be a mess (ahem).

    This doesn't mean that it has to come from a centralized repository (commercial developers would never let someone else handle distribution of their stuff). Packages could specify which URL and cryptographic key they want to use for updates.

    I agree completely—that's exactly how yum, dnf, apt etc. work. Not everyone seems to agree here, though.

    In fact the same protocol could be used to make something like Java Web Start that allowed people to launch programs with a single click (the major advantage of containers, remember?).

    Uhm ... I don't quite get the connection between containers, JWS and single clicks.

    And there should definitely be a bunch of "core" libraries that would be considered a part of the OS, maintained by the OS, and developers should be able to trust them not to break compatibility.

    So the question here is whether the other libraries that those programs depend on should be updated automatically by whatever means (and potentially break some apps), or be frozen on whatever version the app developer has tested with.

    I think both options have trade-offs, and it should be left up to the developers. Commercial developers making $5000 software will want full control over everything, while indie developers are fine with a solution that works 99% of the time even if it breaks occasionally. So they could just specify "require libsomething >=2.5.1" (or an URL, or a cryptographic signature, etc.) and be done with it.

    If you sell $5000 software you probably have the manpower to ensure speedy and tested security fixes. You can do that, no problem. However, you can still do what many vendors do and avoid bundling all dependencies by supporting only something like RHEL where the OS vendor is known to go to great lengths to make sure security fixes get backported to old versions so compatibility is maintained. As a user you can still use their packages on Fedora (or even unwrap them and make a DEB or something) and things will usually work, it just means you won't get support.

    Just provide the means to implement whatever people want.

    ACK. I just don't think Windows Task Scheduler is the right tool for that ;)


  • Considered Harmful

    @wharrgarbl said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @anonymous234 These complex solutions are messy. MS-DOS did the only correct thing, that is unzip in a folder, copy that driver to C:\WHRGRBL\WHERE\EVA\MPFGL88.SYS, slap its name somewhere into CONFIG.SYS, check the handbook for an arcane incantation to insert into AUTOEXEC.BAT and upon second try add the folder to your %PATH% so you don't have to remember it every time you want to run the program. And then hope you'll never have to remove that shit ever again. Upon hardware upgrade, live with the crashes from unused drivers or do the usual: reformat, reinstall

    FTFY



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @wharrgarbl said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @anonymous234 These complex solutions are messy. MS-DOS did the only correct thing, that is unzip in a folder, copy that driver to C:\WHRGRBL\WHERE\EVA\MPFGL88.SYS, slap its name somewhere into CONFIG.SYS, check the handbook for an arcane incantation to insert into AUTOEXEC.BAT and upon second try add the folder to your %PATH% so you don't have to remember it every time you want to run the program. And then hope you'll never have to remove that shit ever again. Upon hardware upgrade, live with the crashes from unused drivers or do the usual: reformat, reinstall

    FTFY

    Also, with limited conventional memory, you had to choose between different sets of hardware to use on each boot. If you were sure you didn't need the CD-ROM this time, you didn't load its driver. Same went for the sound card driver, the mouse driver, the Chinese font TSR and so on. (Talking about this, I remember the ETen system had an option to load just the most commonly used set of 200+ characters or the full 3000+ character set, purely to save some memory.)

    I remember the days when I had to reboot in different phases to get some job done in secondary school because of this.



  • @cheong said in Why desktop operating systems have completely failed at their job for the last 20 years:

    I remember the days when I have to reboot in different phase to get some job done in secondary school games to run because of this.

    FTFM



  • @TimeBandit said in Why desktop operating systems have completely failed at their job for the last 20 years:

    @cheong said in Why desktop operating systems have completely failed at their job for the last 20 years:

    I remember the days when I have to reboot in different phase to get some job done in secondary school games to run because of this.

    FTFM

    If I were talking about playing games, I'd need those drivers loaded at the same time, not rebooting to use different hardware at different phases.



  • @LaoC said in Why desktop operating systems have completely failed at their job for the last 20 years:

    I don't quite get the connection between containers, JWS and single clicks

    Well, think about how HTML works.

    You click on a single link, which has the URL to download the program (page) you want. The program is downloaded automatically with a predefined protocol (HTTP), run inside a container/sandbox, and shown.

    You could do the same thing with Linux/Android/Windows programs once the container part is solved. Maybe even show them inside a browser tab like HTML ones. In practice you'd also need to create a protocol to dynamically download only the parts of the program you need, or it would be slow as hell.

    So once you have that made, you can actually use it as the primary distribution method. And at this point you pretty much have an exact copy of Java Web Start.