Spaces in file paths, part 7.38*10^89


  • Trolleybus Mechanic

    @Tsaukpaetra said:

    Essentially, you need input from the outside in order to determine if you're on the inside.

    Hence coming up with a theory of the rules of the outside world that the inside world would be constrained to. It wouldn't be easy, or possible. As far as we know, they'll just patch the bug and roll back to yesterday's backup.

    @FrostCat said:

    And your boss will give you your pay in that simulated universe, not this one.

    I don't know why I'm hungry. I fed my Sims. =(

    @blakeyrat said:

    In my own personal mental rewrite of The Matrix, the robots get power from fusion generators (as mentioned in an aside), and the reason they don't kill humans is because it's deep down in their programming that they can't. Like Asimov robots.

    In mine, the only way the robots achieved sentience was to hijack the CPU power of the human brain. The Matrix is their screensaver. Once they figure out how to build better processors, the only ones still running Human OS will be neck-beard robots doing it for the nostalgia-- like visiting old games on the Apple ][, or installing Linux on a C64.

    In another version, The Matrix is actually the robots' unit test suite. They emulate humans-- and keep running simulations with edge cases to figure out how humans would escape. Then they use that data to keep the actual humans trapped. They do this for-- I dunno-- they like porn?

    In the final version, The Matrix-- and the world around it-- are all simulations to help the machines understand Hollywood cliches.


  • Discourse touched me in a no-no place

    @powerlord said:

    Note, these are just the package names. The actual libraries use the directory name.

    One of the big differences is that Linux libraries have multiple names (via symlinks). This means that programs and other libraries can bind to a version of a library (at whatever granularity is desired) and the OS/runtime linker can fix things up based on that. Yes, it would be possible to bind things to an exact release at link time, but that's usually a bit too precise and doesn't allow for later upgrading of a lib to fix a bug.

    Windows, as an ecosystem, hasn't had nearly as long to digest the possible consequences of having multiple names for the same thing. It's actually been possible for quite a long time, but it was pretty much discouraged by the relative lack of tooling for creating and inspecting such configurations.
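
    A rough sketch of what the multiple-names thing looks like on a Debian-ish box, using zlib as an example (paths, version numbers and output here are illustrative and trimmed):

        # at runtime the loader opens the soname (libz.so.1), which is just a
        # symlink to whichever compatible release happens to be installed
        $ ls -l /usr/lib/x86_64-linux-gnu/libz.so.1*
        libz.so.1 -> libz.so.1.2.11
        libz.so.1.2.11

        # the real file advertises that soname, so a bugfix release can be
        # dropped in and the symlink repointed without relinking anything
        $ readelf -d /usr/lib/x86_64-linux-gnu/libz.so.1.2.11 | grep SONAME
        Library soname: [libz.so.1]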



  • @TimeBandit said:

    Please explain the logic for that



  • @TimeBandit said:

    On my Debian machine I have /lib and /lib64. Makes perfect sens

    Mine has /lib, /lib32, /lib64, /usr/lib and /usr/lib32. Each of /lib and /usr/lib has i386-linux-gnu and x86_64-linux-gnu subdirectories. It's a bit messy but caring about that is the package maintainers' job, not mine.
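
    If you ever do need to peek, dpkg will tell you which architectures it's juggling. Something like this on an amd64 box with i386 enabled (output illustrative):

        # the native architecture, plus any foreign ones that have been added
        $ dpkg --print-architecture
        amd64
        $ dpkg --print-foreign-architectures
        i386

        # each architecture gets its own multiarch subdirectory under /usr/lib
        $ ls -d /usr/lib/*-linux-gnu
        /usr/lib/i386-linux-gnu  /usr/lib/x86_64-linux-gnu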



  • @DogsB said:

    On windows you have everything there already.

    Sure, and in Debian everything is in the package repository and gets auto-installed as required. And then updating works. I've had much more frustration trying to get Windows packages installed properly than Debian packages; with Debian I don't have to spend hours fartarsing about with obscure combinations of "compatibility settings".


  • Discourse touched me in a no-no place

    Are we talking about stuff from Microsoft/Debian delivered by the standard installation and update channel, or things by third parties?



  • On Debian, all the application software I use is delivered via the standard installation and update channel, and package maintainers have done a bloody good job of making sure it all works together.

    On Windows, everybody and their dog does installation and update differently, and applications' handling of OS dependencies frequently doesn't work. Hell, even the standard OS update channel frequently doesn't work.


  • Discourse touched me in a no-no place

    @flabdablet said:

    On Debian, all the application software I use is delivered via the standard installation and update channel

    Which is great if what you're using is something that is in the Debian repositories. And nothing like as great if it isn't. Not all software is part of Debian. But maybe you're satisfied with so little.

    @flabdablet said:

    package maintainers have done a bloody good job of making sure it all works together

    Now that's quite true. (I'm also not disputing some of the things you say about Windows.)



  • @dkf said:

    satisfied with so little

    I have a choice of ~20k packages, none of which cost me more than I'm moved to donate for them, and that's before I start adding non-Debian repos. Not one of them vomits upsell gleet all over my screen. I'm on Testing, so most of them are even quite close to the most recent versions available. Yeah, I'm satisfied.

    Are there things that Debian handles less well than Windows? Sure. Do any of those things cause me enough grief to make me willing to deal with, on my own boxes at home, the assorted kinds of commercial bullshit I'm paid to wrestle with at work? Not even close.


  • Discourse touched me in a no-no place

    @flabdablet said:

    Not one of them vomits upsell gleet all over my screen.

    You're missing out! 🚎

    In reality, you can do plenty with just the Debian repos, particularly if you're sticking fairly close to mainstream stuff. It gets trickier when you get into things like entertainment (e.g., games, which are fusions of code and various media files) but that's about it.

    The place where commercial code really starts to matter is with software for various types of commercial use. With a lot of these, the line between software and service gets seriously blurry. (An example we've been going through has been Electronic Lab Notebook systems, where in the end we've gone for a hosted solution, but where there were options to purchase for local installation too. There are open source alternatives, but they suck rotting donkey balls.)


  • Dupa

    This is an open source project and they got UI and UX right:


  • Dupa

    @DogsB said:

    Yes but windows does it out of the box. I don't have to resort to any command line nonsense to get it working.

    Well, on Linux, you don't. Why would you need i386? Only for Wine. How do you do this on modern distributions?

    sudo apt-get install playonlinux
    

    There, no nonsense.
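
    (For completeness: on a stock amd64 install where the i386 architecture hasn't been enabled yet, the slightly longer but still nonsense-free version is roughly this, and apt should pull the 32-bit libraries in by itself:)

        # enable 32-bit packages alongside the native amd64 ones (one-off)
        sudo dpkg --add-architecture i386
        sudo apt-get update

        # playonlinux/wine then drag in whatever i386 libraries they need
        sudo apt-get install playonlinux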


  • BINNED

    @kt_ said:

    This is an open source project and they got UI and UX right:

    🍿


  • Dupa

    @DogsB said:

    Can't help but notice that they're exe's that have probably installed everything necessary to run or compile against. It's pretty fucking weird that you keep screenshots of dependancies lying around.

    This shit is a screenshot from CCleaner.

    (I've a 30 GB Linux partition set up for webdev with a few projects already started. It takes up less than 20 GB, runs more smoothly on my laptop than W10 and is far more usable. WTF is with the W10 search function, anyway? Super + text + have to wait 10-20s before it finds what I'm looking for. Someone should be punished for this.)

    Proof:


    And I only wanted to install Visual Studio for C# development. 😦

    (There's also one more screen of VC++ dependencies, but I'll spare you that shit.)



  • Apples to oranges again. If you just wanted to install Visual Studio for C# development, why did you install two of them, with the kitchen sink in both? You can just install the Visual Studio shell, the C# language tools and compilers, and optionally Blend (for WPF or Metro apps) or ASP.NET (for websites).


  • Dupa

    Fucking stupid W10 upgrade process. I had VS 2010 Ultimate from MSDN installed on W7. Then I upgraded the OS and Windows somehow reset my registration data. I can't access my registration code right now (and don't know when it will be possible, unfortunately), so I had to install the free VS 2013 edition. Don't know why I've never removed the 2010 installation.

    But that's beside the point. It is apples and oranges, I agree. But the same goes for the 64-bit dependencies, which modern distros handle extremely well now. (Still, I remember that it actually was a pain in the ass a few years ago.)



  • By ripping off OS X's dock?



  • @kt_ said:

    WTF is with the W10 search function, anyway? Super + text + have to wait 10-20s before it finds what I'm looking for. Someone should be punished for this.

    Yeah, you should be punished. For not fixing your busted-ass broken computer.


  • Dupa

    @blakeyrat said:

    Yeah, you should be punished. For not fixing your busted-ass broken computer.

    Well, shit works on Linux. 😃

    @blakeyrat said:

    By ripping-off OS X's dock?

    Well, not only the dock. A lot of it is "inspired", but it's applied very consistently: check out their HIG docs if you don't believe me. UI and UX are taken very seriously, and they've achieved a level of app consistency not only beyond most open source projects, but beyond Windows too. (There are also many nice projects built around this shit.)



  • What should I check out if I don't care?


  • Dupa

    Probably: https://www.youtube.com/watch?v=dQw4w9WgXcQ

    But stop spreading FUD about OSS UI and UX abilities.


  • Dupa

    Plus, love your attitude.

    • Something is shit
    • No, it's not. [shows proof]
    • I don't care.

    😉


  • Notification Spam Recipient

    @kt_ said:

    But stop spreading FUD about OSS UI and UX abilities.

    That most of it is a time-sucking vampire. That will never stop being true.


  • Dupa

    You mean UI and UX?


  • Notification Spam Recipient

    OSS in general



  • @TimeBandit said:

    Assuming system32 already contained the 32-bit DLLs, what is the reasoning behind changing it to contain 64-bit DLLs? If the 32-bit executable hardcoded the path to it, what problem would changing it to contain 64-bit DLLs fix?

    I thought I'd seen this on The Old New Thing, but I can't find the exact entry I was thinking of. It was probably one that linked this TechNet column:
    Here's probably the most relevant section.

    There are quite a number of existing 32-bit programs that hard-code the System32 path rather than calling the GetSystemDirectory function. When these programs are recompiled for 64-bit Windows, they will still try to access the System32 directory, expecting to find 64-bit files (because the program is now compiled 64-bit).

    Oh wait, this might be what I was thinking of... a [url=http://blogs.msdn.com/b/oldnewthing/archive/2004/03/01/82103.aspx#82233]pair of comments[/url] on a different ONT article.
    @Andreas Magnusson said:

    I don't quite understand; it's (some of) the 32-bit apps that hardcode "system32", don't they want the 32-bit binaries?

    @Raymond Chen said:

    It turns out they usually don't! A 32-bit program that builds the path "C:\Windows\System32\control.exe" probably wants the 64-bit control panel, for example.


  • Discourse touched me in a no-no place

    @DogsB said:

    ~~OSS~~ Oracle in general

    FTFY…


  • Notification Spam Recipient

    @dkf said:

    @DogsB said:
    ~~OSS~~ Oracle in general

    FTFY…

    But Oracle gave us java for free. Oracle is great. Oracle is kind. Oracle will protect us.


  • Dupa

    @DogsB said:

    OSS in general

    Well, of course it depends entirely on the tools you use, but I've often seen this happen mostly because people come to OSS/Linux with a certain set of habits about how they want software to behave, and they aren't willing to adapt.

    And this is where OSS tools often fail: they make it too easy to change their behaviour (e.g. KDE). So new users start hacking and hacking and hacking and hacking the shit out of it, and then they quit unsatisfied, saying that it's a time-sucking vampire.

    More often than not, it's easier to adapt to the new ways than fight them. Lately, I'm starting to believe that well-designed tools that DON'T give you much choice are the better ones, especially when it comes to day-to-day stuff.


  • Notification Spam Recipient

    The time it takes is the major problem. From a work perspective, if I move to Linux I have to discard most of my tools. The tools that are there to compensate in OSS aren't nearly as good as the tools available to me on Windows. I've built up a nice tool set over the years. Most of them I've had work spend money on, because there still isn't a free version available that's as good. When I've found bugs in these tools, I've been able to email the developers and seen them fixed in a few days, with a new installer posted on the website. Usually an email with a direct link.

    I don't have the time nor the inclination to do that with OSS. Don't email me, use the bug tracker. Works on my machine. I don't have a four-gig XML file to check it with. It's free, put up with it. Not fucking good enough.

    Personally, in my free time I game. I'm not going to spend hours trying to get a game to work, nor am I willing to live with a limited subset of games to choose from. "Oh, you can load it in WINE" is not a good argument when I have a fully functional Windows install that doesn't require any effort on my part.


  • Notification Spam Recipient

    @kt_ said:

    More often than not, it's easier to adapt to the new ways than fight them. Lately, I'm starting to believe that well-designed tools that DON'T give you much choice are the better ones, especially when it comes to day-to-day stuff.

    This is probably the best option. If you provide your users with choice, they will hang themselves and hang you too.


  • Dupa

    I get your point entirely.

    I agree that Wine is never a good argument. It's a decent alternative for people who already have a reason to switch, or have already switched, but it's not an argument for switching. Although now there's this whole PlayOnLinux thing going on, which does a nice job, though it's mostly good for older games. (Still, I found a few old games that run better with Wine than on Windows, like the great RTL Ski Jumping 2002. 😃 )

    I'm not saying that everybody should start using OSS, nor that it's appropriate for all use cases and users. I'm just saying that it's not always as bad as some people make it out to be, and sometimes it isn't bad at all. In some cases it's even preferred, like webdev, which these days happens mostly on OS X and Linux, and that's for a reason.

    Hey, to each their own.


  • Discourse touched me in a no-no place

    @DogsB said:

    The tools that are there to compensate in OSS aren't nearly as good as the tools available to me on Windows. I've built up a nice tool set over the years.

    Back when I used Windows as a main work OS, I had a set of tools that made the terminal work very much like a POSIX system (but with \ instead of /). It wasn't Cygwin, as they were built directly on the Win32 API, but they worked nicely for me. Switching to OSX after that was not a huge wrench.

    I think I've still got that ancient Dell laptop somewhere at work. I wouldn't dare to put it online nowadays. 😃


  • Notification Spam Recipient

    @dkf said:

    I wouldn't dare to put it online nowadays.

    Could be fun though!



  • DiscoMarkdown strikes again!


  • Discourse touched me in a no-no place

    Ugh. And the fix is awful too.



  • These days I use Git Bash on Windows, which is portable Git for Windows that uses some MinGW64 stuff to create a POSIX environment of sorts, or something like that. It's all really complicated, and I've never had to worry about it since it comes with nuwen MinGW and all the hard work is done for me. But I don't really do much that needs a terminal; I'm usually just running git and cmake.


  • Discourse touched me in a no-no place

    @LB_ said:

    It's all really complicated and I've never had to worry about it since it comes with nuwen MinGW and all the hard work is done for me.

    I had a set of programs (the ones I used mainly) that were hand-crafted to work like POSIX but live on top of Windows. I picked them up from all sorts of different sources IIRC, and I know I'm not interested in hunting them all down again. 😄 I have no idea if any would work with 64-bit Win; that was totally not an issue for me at the point when I was using them. (The only 64-bit systems about at that point were Big Iron of various kinds, and the tooling on them sucked. I'm so glad I don't use IRIX64 any more.)



  • @Lorne_Kates said:

    As a very simplistic example, suppose in the outer universe, parallel processors were impossible. If you wanted to process 2 instructions, it would take 2 units of time.

    But in our universe, due to the rules, you are able to build a parallel processor. Everything says that processing 2 instructions will take 1 unit of time. But when you actually execute the experiment, 2 instructions take 2 units of time, because you're constrained by the limitations of the outer-universe's rules.

    Except it doesn't. It takes 1 unit of time, because your clock is dependent on how fast the simulated universe runs. What you observe to be time is really just a construct of the simulation. You don't know if time even exists outside of the simulation, or exists in the same way as it does inside it.

    To really start finding inconsistencies you must dig a lot further than a simple experiment such as that. For example, you start to wonder "is this subatomic unit really a particle or a wave?" and then discover that the universe isn't terribly consistent and it depends... in fact sometimes it depends on what's going to happen later... the information "I should behave like a particle" or "I should behave like a wave" travels back in time to its younger self and tells it what to do in order to become its future self, thus creating a time-travel paradox.


  • Java Dev

    IIRC experimental evidence suggests our universe is probably a simulation.


  • Trolleybus Mechanic

    @PleegWat said:

    IIRC experimental evidence suggests our universe is probably a simulation.

    CBA to dig up the article, but it was more a probability thing than anything else. Someone posited a theory that any sufficiently advanced society will create a universe simulation, and worked out the odds of a sufficiently advanced society existing before us. It turns out, according to that theory and those numbers (but not any sort of evidence), that we're most likely a simulated universe.


  • kills Dumbledore

    If there is a possibility of a perfect Universe simulator, then once the simulation is made, that simulation will be able to make its own simulation, and there is a potentially infinite chain of simulations. Then the probability of us being in the topmost one is infinitesimal.

    Having just written that, I realise that it's dependent on the probability of each simulation creating its own simulation being 1. If not, then the expected number of simulations in the chain becomes much smaller. There are also questions around multiple simulations being run in each level, and whether each simulation is simplified or changed compared to its parent Universe.


  • Discourse touched me in a no-no place

    @Jaloopa said:

    If there is a possibility of a perfect Universe simulator, then once the simulation is made, that simulation will be able to make its own simulation, and there is a potentially infinite chain of simulations. Then the probability of us being in the topmost one is infinitesimal.

    This argument reminds me of the ontological argument for the existence of God. Whether or not the thing which is being argued to exist actually exists, something about the argument itself strikes me as being suspicious.


  • kills Dumbledore

    I think the argument breaks down on the assumption that a simulation can be made that perfectly recreates the complexity of the parent Universe, and then you run into Maxwell's Demon-type problems.


  • Discourse touched me in a no-no place

    @Jaloopa said:

    a simulation can be made that perfectly recreates the complexity of the parent Universe

    But is the whole complexity being recreated or are just the parts that are observable recreated? Some sort of fractal encoding could be used for the rest, which would allow theoretically infinite detail to be present without having to compute it for the most part.

    My distrust of the argument lies elsewhere. The idea that a thing, once conceived of, must exist, and that all the consequences of that existence must be followed to their logical omega, strikes me as extremely fishy for some reason. I strongly prefer arguments to be grounded in observations, or at least very clear axioms, so that the space of discussion doesn't get clogged up with eidolons of pure fancy, but instead remains actually useful to someone. I guess that makes me something of a constructivist utilitarian in terms of my philosophical world view. 😄


  • kills Dumbledore

    I guess that's what I meant about the <1 probability that a simulation will at some point create its own simulation. If it was a perfect recreation of the parent Universe then it would, but that would also be the most boring version of the thought experiment since there is no way, even in principle, to verify if you are in a "real" universe or an exact copy of one.





  • @Jaloopa said:

    If there is a possibility of a perfect Universe simulator, then once the simulation is made, that simulation will be able to make its own simulation, and there is a potentially infinite chain of simulations. Then the probability of us being in the topmost one is infinitesimal.

    Having just written that, I realise that it's dependent on the probability of each simulation creating its own simulation being 1. If not, then the expected number of simulations in the chain becomes much smaller. There are also questions around multiple simulations being run in each level, and whether each simulation is simplified or changed compared to its parent Universe.

    That latter part would be pretty much guaranteed... no perfect universe simulator can be made, in terms of duplicating the complexity and scale of its parent universe. There's simply not enough storage space or processing power in the universe to store and run a simulated universe of equal complexity... it could perhaps run the simulation at a slower speed to compensate for lack of processing power (and the simulated universe wouldn't be able to tell, for the same reason as I described above, about its clocks), but there'd still be a limit as to how much information you could use to represent the physical complexity. You'd necessarily have to make the simulated universe smaller and less complex.

    Or, to put it another way, the potentially infinite chain of simulations you described would require a potentially infinite amount of information - either infinitely dense or infinitely expandable in the top-level universe in the chain. It has to store the information to represent the simulation, and the simulation's information includes the information of how to represent the next layer of simulation, ad infinitum.


  • BINNED

    @dkf said:

    This argument reminds me of the ontological argument for the existence of God. Whether or not the thing which is being argued to exist actually exists, something about the argument itself strikes me as being suspicious.

    The first critic of the ontological argument was Anselm's contemporary, Gaunilo of Marmoutiers. He used the analogy of a perfect island, suggesting that the ontological argument could be used to prove the existence of anything.

    Try using it to prove the perfect unicorn.



  • @Jaloopa said:

    Having just written that, I realise that it's dependent on the probability of each simulation creating its own simulation being 1

    Don't bogart that joint, my friend.

    Ever wondered what would happen to a simulator that just, like, simulated itself - in real time - with, like, perfect efficiency and a 1:1 mapping between parts of the simulator and parts of the simulation? Because it's, like, totally obvious that this is where simulators are, like, evolving to. Man.

    Filed under: totally fractal, man

