So I decided to try to update part of my toolchain...



  • I'm coming to a point in one of my projects where, to implement Large Outstanding Feature A, I need to implement Supporting Feature B, and so forth. Following that cascading chain to its last few links, I ended up needing to do some work with the scene editor I have, and before I get too deep into creating the scene I need, now would be a good time to update it to the latest version: the version I use was pulled from a repo and built a few years ago, and was buggy but functional.

    Now, this project is using OGRE v1.9, and the scene editor I use is Ogitor, as it is the most feature-rich scene editor that wasn't already a dead project by 2009 or built on long-deprecated tech that only supports up to, say, Ogre v1.7 (2010 vintage!). I checked their repo and found out that they had, among other things, recently updated to support Ogre v1.11! Excited, I downloaded a ZIP and started to gather the dependencies.

    Not long after that, having to break out CMake brought back many painful memories that had ostensibly been purged. See, it is exceedingly rare that I am forced to use CMake and have a pleasant experience with it. Usually, I end up being given obscure error messages involving dependencies that I have to track down. This time was no exception, and I ended up hitting SO to learn that I had to slap LIBRARY/ARCHIVE DESTINATION $PATH on every install command to get it to quit complaining. But I was able to fix that.
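    For posterity, the fix amounted to something like this on each offending install command (target name and paths illustrative, not Ogitor's actual ones):

        install(TARGETS SomeOgitorLib
                RUNTIME DESTINATION bin
                LIBRARY DESTINATION lib
                ARCHIVE DESTINATION lib)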

    Once all those errors went away, I was faced with the issue that the project maintainers had upgraded from Qt4 to Qt5. Little did I know what fun that would be! See, the Qt folks kindly deprecated qt5_use_modules, and the CMake scripts were full of calls to it! However, I was eventually able to fix that as well, and with great satisfaction, I clicked the Generate button, eagerly awaiting a collection of Visual Studio projects I could then open and build. Given that so much game development is on Windows for Windows, one wonders why they didn't just give me a functional VS solution in the ZIP. I mean, the maintainers of many of my other dependencies are nice enough to do that, and it would have saved hours of my life at this point.
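    (For anyone hitting the same wall: the qt5_use_modules fix is mechanical, replacing each deprecated call with target_link_libraries against Qt5's imported targets, roughly like this; target and module names are illustrative.)

        # Before (deprecated, later removed):
        qt5_use_modules(Ogitor Widgets OpenGL)
        # After:
        target_link_libraries(Ogitor Qt5::Widgets Qt5::OpenGL)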

    So I start building the solution and clean up the predictable errors caused by incorrect paths due to a combination of errors on my part and CMake screwups. The solution builds after an hour or so of such cleansing. Then I go to run it, but I get an exception, which was thrown in Ogre, and since I downloaded the precompiled binaries for VS2017 (which the Ogre guys kindly provided) I have no debug symbols so I can't get steppin'. Guess I have to build Ogre from source.

    Now I actually want to update to latest Ogre at some point, and I have never run into major problems doing so (upgrading from 1.7 to 1.8 and then a couple versions of 1.9), so I download that codebase and am presented with CMake again to build Ogre. The good news is that this time, I click Configure and Generate and it works out of the box. I get another VS solution, compile Ogre, and then recompile Ogitor with the Ogre binaries that have debug symbols. Great, let's see what that error is.

    After some digging, I'm able to determine that Ogitor is trying to load a bad plugin (the D3D9 Renderer, which was a hardcoded default for Windows, in spite of D3D9 being... ancient), and that plugin reference is controlled by configuration settings from Qt, and Qt is pulling them from entries in the Windows Registry that Qt itself inserted. Lovely. So I change the value using regedit and now I'm finally able to launch the program. First thing I try is to create a new scene, and Ogitor throws another exception at that thought, which appears to be a UI issue. Not wanting to debug something as fundamental as creating a new scene, I decided to stop and then do some more research, which took me to the Ogre forums:

    wellfuck.png

    Well fuck. Looks like I'm staying on my 1.9 build, then!



  • I use CMake quite extensively, and sadly a lot of projects don't use it properly. The fact that you had to modify someone else's CMake script says enough to me. For a few libraries I actually ditch their provided CMake script and write my own, because it's less hassle.

    Fun fact: Freetype has disabled building shared libraries with MSVC, but commenting out the line that generates the error message lets it work 100% correctly. Also, Freetype unconditionally uses dllexport/dllimport under MSVC, even though technically you're only allowed to build static libraries of it with MSVC. So no matter what you do, you have to patch something in Freetype to build it under MSVC.

    My experience with other projects gives me the feeling that open source devs don't like testing under MSVC. How many projects can you think of that only support building under Cygwin or MSYS?



  • @lb_ said in So I decided to try to update part of my toolchain...:

    Also, Freetype unconditionally uses dllexport/dllimport under MSVC, even though technically you're only allowed to build static libraries of it with MSVC. So no matter what you do, you have to patch something in Freetype to build it under MSVC.

    Ah, you just reminded me that I also had to put in an #ifdef for those in one of the projects, as they were all dllexport when they should have been a mix (i.e. dllexport or dllimport depending on whether you're inside or outside the DLL in question).
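    The shape of it is the usual export-macro dance; a minimal sketch, with hypothetical macro names:

        // MYLIB_BUILDING_DLL is defined only while compiling the DLL itself:
        #if defined(_MSC_VER) && defined(MYLIB_BUILDING_DLL)
          #define MYLIB_API __declspec(dllexport)  // inside the DLL
        #elif defined(_MSC_VER)
          #define MYLIB_API __declspec(dllimport)  // consuming the DLL
        #else
          #define MYLIB_API                        // non-MSVC: no decoration
        #endif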

    My experience with other projects gives me the feeling that open source devs don't like testing under MSVC. How many projects can you think of that only support building under Cygwin or MSYS?

    I would hope that game developers would be a bit more Windows-friendly, such as how Ogre provides VS binaries if you don't need debug symbols. I seem to remember being provided VS solutions a few years ago. I can accept using CMake, provided it works the first time, out of the box (hopefully not too much to ask!).

    In fairness, given how Unity and Unreal have become so cheap in the past few years, I can see how what's left might retreat deeper into the OSS world, serving a handful of specialized use cases.



  • Looks like there was a release a few months later!

    Time to give it another shot. I downloaded the latest 1.x Ogre (1.11.5), ran that through CMake, and it builds on the first try. No surprises.

    I try feeding that Ogre build to Ogitor, and it complains about files it can't see and says it expects 1.10. Okay, so let's grab 1.10.12. I run CMake against that and it builds on the first try. I plug that build into Ogitor's CMake and FEWER ERRORS! Progress!

    Next up is pointing it to a Qt install. I still have one from the last time I tried to build this. FEWER ERRORS!
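    (The standard knob for pointing CMake at a Qt install is CMAKE_PREFIX_PATH; the path here is hypothetical:)

        cmake .. -DCMAKE_PREFIX_PATH=C:/Qt/5.12.0/msvc2017_64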

    Okay, now we have some griping about Boost. I've tried it with 1.69 and 1.54, and I get the same goddamned error with either:

    cmakehell.png

    Since neither old nor new seems to sate this script's desires, I need to either track down the magical Boost release that has the delicate file structure this script expects, or try to remember what I tried last time to get it to shut up. I think I had to set several more variables and point to include and library paths deeper in the folder structure, something like the hints below.
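    If memory serves, it was FindBoost's hint variables, along these lines (paths hypothetical):

        cmake .. -DBOOST_ROOT=C:/deps/boost_1_69_0 ^
                 -DBOOST_INCLUDEDIR=C:/deps/boost_1_69_0 ^
                 -DBOOST_LIBRARYDIR=C:/deps/boost_1_69_0/lib64-msvc-14.1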

    Then there are the usual errors involving qt5_use_modules and "install Library TARGETS given no DESTINATION!", which we figured out how to solve last time.

    Since this is C++, and since we're moving from VS2010 to 2017ish, I will need to recompile not only my project (currently on Ogre 1.9) against Ogre 1.11, but first all my other dependencies, including Ogre plugins like CEGUI, SkyX, HydraX, and ParticleUniverse. This presents a good opportunity to upgrade all of those at the same time; worst case, I still have the source to all of them, so I could port them to the latest Ogre as a last resort.

    Speaking of porting to the latest Ogre, if I can't get this Ogitor release working, I think I'll just take the working Ogitor build against Ogre 1.9 that I made several years ago, source and all, and carry that forward. For all we know, it might be the last compiling, functional build on the planet!

    @Unperverted-Vixen, I HOPE YOU'RE HAPPY.


  • Notification Spam Recipient

    @Groaner said in So I decided to try to update part of my toolchain...:

    @Unperverted-Vixen, I HOPE YOU'RE HAPPY.

    Unperverted-Vixen I hope you're happy too!


  • Considered Harmful

    @levicki I would call making good and damn sure you're aware of Spectre mitigation 'getting their shit right'.
    Oh yeah, and: no repro. My entire computer engineering class just installed VS17 with no real tutorial; no such errors. If C-average Indians can figure it out you can too.


  • And then the murders began.


  • Grade A Premium Asshole

    @Groaner said in So I decided to try to update part of my toolchain...:

    Given that so much game development is on Windows for Windows, one wonders why they didn't just give me a functional VS solution in the ZIP.

    Probably due to some manner of licensing issue.



  • @levicki said in So I decided to try to update part of my toolchain...:

    When you switch to VS 2017, don't forget to install the Universal CRT. I didn't install it, and any project needing <stdio.h> could not find it. What is worse, there was <cstdio>, which was including the regular (but missing) <stdio.h>. It seems that starting with VS 2015, standard C headers were removed from the VC\include path and are now part of the Windows SDK.

    Oh, and if you happen to install the Windows Driver Kit and wonder why you can't link to the C runtime anymore, that is because WDK helpfully enables Spectre mitigation libraries for all C/C++ projects, and those are not installed by default either. If you don't want to use them, tough luck; you will have to edit some obscure MSBuild props files.

    TL;DR: if the most widely used development tool can't get their shit right, what do you expect from hobby projects?

    Good to know. In the past, I've needed to install the DirectX SDK to get some things to compile, but I'm sure a few things will need to change in this process.

    The only references to <stdio.h> and the like are in a precompiled header for the solution, so changing that is easy enough.



  • SUCCESS.

    cmakehell2.png

    Now that I have a VS solution, it's time to deal with a pile of compilation errors...



  • @pie_flavor said in So I decided to try to update part of my toolchain...:

    My entire computer engineering class just installed VS17 with no real tutorial; no such errors.

    Wat? When I was a CS student, we were all expected to ssh into a Solaris cluster and do all our work with gcc. As much as people here like to hate on VC6 and the like, in those days, it would have been the state-of-the-art (with VS .NET right around the corner). However, many of the professors had hate-boners for anything Microsoft and liked to bash on its bugs, as if it was unacceptable that there'd be any bugs in a system which needed to support millions of hardware combinations with equally-buggy drivers.

    Notwithstanding that a few days before each project was due there'd be a ton of people logged into the servers and they'd be slow as balls, if I had access to a debugger, I think my grades would have been far higher and I would have gotten the projects done in half the time.


  • And then the murders began.

    @Groaner said in So I decided to try to update part of my toolchain...:

    Wat? When I was a CS student, we were all expected to ssh into a Solaris cluster and do all our work with gcc. As much as people here like to hate on VC6 and the like, in those days, it would have been the state-of-the-art (with VS .NET right around the corner).

    I don't hate on VC6 as a product of its era. I hate on my company still actively developing projects in VC6 in 2019, rather than upgrading the toolchain to something that's not 8-9 versions behind, and that TFS knows how to handle.

    However, many of the professors had hate-boners for anything Microsoft and liked to bash on its bugs, as if it was unacceptable that there'd be any bugs in a system which needed to support millions of hardware combinations with equally-buggy drivers.

    In my day, it wasn't that the professors were anti-Microsoft; they needed to set up tooling like their autograder, and the Solaris (or maybe it was some other *nix) environment was what they had. You were welcome to build using VC++ and use a debugger, but in the end it had to be able to run in the submission environment too.



  • @Unperverted-Vixen said in So I decided to try to update part of my toolchain...:

    I don't hate on VC6 as a product of its era. I hate on my company still actively developing projects in VC6 in 2019, rather than upgrading the toolchain to something that's not 8-9 versions behind, and that TFS knows how to handle.

    Targeting desktop, I'm guessing? I can't imagine doing anything Web with something of that vintage. We use 2012 or 2015 at work and that's about as early as I'll go.


  • And then the murders began.

    @Groaner I think that product may have started off as a desktop application, but in its current incarnation it's an application server that sits between the websites and the database, housing all the business logic.

    (The web front ends are of varying vintages; while there's still a good number of them in VB6/Classic ASP, those have replacements built in C#/ASP.NET MVC that are slowly being rolled out.)


  • Notification Spam Recipient

    @Groaner said in So I decided to try to update part of my toolchain...:

    We use 2012 or 2015 at work and that's about as early as I'll go.

    *Hides his home VM still running 2010...*



  • @Tsaukpaetra said in So I decided to try to update part of my toolchain...:

    @Groaner said in So I decided to try to update part of my toolchain...:

    We use 2012 or 2015 at work and that's about as early as I'll go.

    *Hides his home VM still running 2010...*

    Hey now, this particular project was heretofore running under 2010!

    I can forgive C++ apps for lagging behind due to ABI compatibility, but I read that 2015 and 2017 use the same ABI, so that's becoming less of an excuse.



  • Ogre changed their folder structure between 1.9 and 1.10, it seems. The #if used to be something like OGRE_PLATFORM == APPLE, but it was exactly what I needed. So I repurposed it.

    creativemacros.png
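    In other words, something along these lines (header name is a stand-in; the version macros are Ogre's own):

        #if OGRE_VERSION_MAJOR == 1 && OGRE_VERSION_MINOR >= 10
        #include <OGRE/SomeHeader.h>   // new 1.10+ folder layout
        #else
        #include <SomeHeader.h>        // old 1.9 layout
        #endif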



  • @Unperverted-Vixen said in So I decided to try to update part of my toolchain...:

    they needed to set up tooling like their autograder, and the Solaris (or maybe it was some other *nix) environment was what they had. You were welcome to build using VC++ and use a debugger, but in the end it had to be able to run in the submission environment too.

    Seconded on the autograder. Also:

    • auto style-check
    • auto testing
    • licensing issues for public school workstations
    • licensing issues when students may want to work at home
    • environment setup issues for students' own machines
    • students tend to be poor, have junkyard computers, so IDE can't be VS
    • students get some hands-on experience with ssh

    ...Did student work submission environment discussion have a dedicated thread somewhere?



  • Good news: It compiles and runs.

    goodnewsbadnews.png

    Bad news: It seems to hang on this screen when I load a scene, and dragging in the components doesn't add them to the scene.

    Guess this means it's time for Plan B, i.e. porting the 2013 build of Ogitor I have as well as the rest of my project to VS2017, Ogre 1.11, and more recent builds of other dependencies. This is a shame, as I'd really like to use Ogre's volume terrain component and don't really have an editor to manage volumetric terrains, not that I know for certain that Ogitor ever supported that tech.


  • Notification Spam Recipient

    @Groaner said in So I decided to try to update part of my toolchain...:

    It seems to hang

    Can you trace what it's waiting on? Maybe something is no longer flagging as expected?



  • @Tsaukpaetra said in So I decided to try to update part of my toolchain...:

    @Groaner said in So I decided to try to update part of my toolchain...:

    It seems to hang

    Can you trace what it's waiting on? Maybe something is no longer flagging as expected?

    Stepping through, it looks like it goes into a handler to load the file, which it does "successfully" and then returns to a sort of main event handler/loop. Which is a special Magic Number Hell™:

    magicnumberhell.png

    I've tried creating a new scene as well and I can't drag anything onto it, which is a bad sign. It could be that there's something simple that's preventing this from working, or it could be a number of things.


  • Banned

    @Groaner said in So I decided to try to update part of my toolchain...:

    @pie_flavor said in So I decided to try to update part of my toolchain...:

    My entire computer engineering class just installed VS17 with no real tutorial; no such errors.

    Wat? When I was a CS student, we were all expected to ssh into a Solaris cluster and do all our work with gcc. As much as people here like to hate on VC6 and the like, in those days, it would have been the state-of-the-art (with VS .NET right around the corner). However, many of the professors had hate-boners for anything Microsoft and liked to bash on its bugs, as if it was unacceptable that there'd be any bugs in a system which needed to support millions of hardware combinations with equally-buggy drivers.

    Well, bugs in a compiler are unacceptable. And from what I heard, VC6 was full of those.

    if I had access to a debugger, I think my grades would have been far higher and I would have gotten the projects done in half the time.

    If your programs weren't perfect as they were, it's unlikely you had enough skills to make good use of command-line GDB.


  • Notification Spam Recipient

    @Groaner said in So I decided to try to update part of my toolchain...:

    Magic Number Hell™:

    Fuck. That code reminds me of our 3D paintbrush tool (not programmed by myself). Really ought to refactor the whole thing, but ugh...


  • Discourse touched me in a no-no place

    @Groaner said in So I decided to try to update part of my toolchain...:

    Notwithstanding that a few days before each project was due there'd be a ton of people logged into the servers and they'd be slow as balls

    The trick was to find which server was least heavily loaded and use that. While simultaneously making sure that nobody else had the same idea…


  • Discourse touched me in a no-no place

    @Unperverted-Vixen said in So I decided to try to update part of my toolchain...:

    I hate on my company still actively developing projects in VC6 in 2019, rather than upgrading the toolchain to something that's not 8-9 versions behind, and that TFS knows how to handle.

    The problem is that VC6 can produce output which doesn't need the system runtime to be shipped with it, making installation/deployment much simpler. Later versions, despite actually producing better code, were (and are) only able to produce code that has this extra dependency.


  • Banned

    @dkf said in So I decided to try to update part of my toolchain...:

    @Unperverted-Vixen said in So I decided to try to update part of my toolchain...:

    I hate on my company still actively developing projects in VC6 in 2019, rather than upgrading the toolchain to something that's not 8-9 versions behind, and that TFS knows how to handle.

    The problem is that VC6 can produce output which doesn't need the system runtime to be shipped with it, making installation/deployment much simpler.

    Correction: it does need a runtime - it's just included in every copy of Windows XP and up (maybe a couple versions down too, not sure).


  • Banned

    @levicki said in So I decided to try to update part of my toolchain...:

    Newer is not always better.

    I'd rather say it's part of the endless war between security and compatibility. As long as the computers get faster, the cryptographic algorithms must be regularly updated too, or they lose their entire purpose. Allowing legacy security protocols is functionally equivalent to turning off security entirely. But disallowing them means cutting off all the old platforms.

    As for your specific problem - have you tried skipping the process of installing runtime and simply putting all the required DLLs (there's 3 of them) in your program's directory?


  • Banned

    @levicki said in So I decided to try to update part of my toolchain...:

    Remind me again what was wrong with static linking and why we invented DLLs in the first place?

    Because the computers of the time had single-digit megabytes of RAM. We're long past that point, and there's no reason for anything to be a DLL anymore, save for system libraries and other inter-process communication stuff where binary compatibility is essential. And global event hooks, but only because the OS doesn't allow not using them.


  • Banned

    @levicki said in So I decided to try to update part of my toolchain...:

    @Gąska said in So I decided to try to update part of my toolchain...:

    there's no reason for anything to be a DLL anymore

    Actually there are:

    1. Code reuse (between applications)

    How does it matter if the compiled build artifact of your library gets shared between two programs or if each program gets its own copy of the compiled build artifact? You get the same effect either way.

    2. Easier security patching (updating one DLL patches all programs that use it)

    For this to work, the developer of the library must be absolutely sure that the new version has perfect binary and behavioral compatibility with the old version. And developers are a lazy bunch, and they rarely care about such details. I've had Linux programs break completely on several occasions because a library in an upstream package changed something that the downstream program relied on. I never had this problem on Windows, exactly because everyone includes their dependencies as DLLs in the program's directory.

    3. Less RAM usage (same code can be mapped in multiple processes with copy-on-write)

    A single DLL usually weighs less than a single display frame nowadays. It's not 1990 anymore, you don't have to care about those kilobytes.

    4. Less disk I/O (always a good thing)

    See above.

    5. Smaller updates (update just the DLL if other stuff hasn't changed)

    Have you never heard of binary patches?

    6. Faster loading (chances are DLL will be in RAM when your program is started)

    See 2x above.

    7. Plugins would be impossible without DLLs

    Right, right. So that's four uses of DLLs - system libraries, IPC, global hooks, and plugins.



  • In theory, yes, in practice, not so much. I have to put DLLs right in the application directory because of how other "developers" operate.



  • @Groaner said in So I decided to try to update part of my toolchain...:

    Stepping through, it looks like it goes into a handler to load the file, which it does "successfully" and then returns to a sort of main event handler/loop. Which is a special Magic Number Hell™:

    That code is generated by Qt's "moc" code generator, and Qt-aware tooling is supposed to hide it from you completely.

    The reason it's there is that Qt has specified its own extension to C++ for defining event handling. In addition to methods, you can also declare signals and slots on classes, which you wire to each other to exchange asynchronous messages. That giant switch is the guts of that mechanism.
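    A minimal sketch of what those declarations look like (class and member names made up):

        class SceneDoc : public QObject {
            Q_OBJECT                  // moc scans this and generates the dispatch code
        signals:
            void sceneLoaded();       // declared only; moc writes the emitting body
        public slots:
            void onSceneLoaded();     // ordinary member function, invokable via a signal
        };
        // Wired together at runtime:
        // connect(&doc, SIGNAL(sceneLoaded()), &view, SLOT(onSceneLoaded()));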

    Not that the Qt way of doing things isn't a WTF in itself, but it's nice to know who to blame.


  • Banned

    @levicki said in So I decided to try to update part of my toolchain...:

    @Gąska said in So I decided to try to update part of my toolchain...:

    How does it matter if the compiled build artifact of your library gets shared between two programs or if each program gets its own copy of the compiled build artifact? You get the same effect either way.

    It matters because you are wasting disk space, RAM, and bandwidth for updates?

    You're wasting way, way more by having Skype in the background.

    You want to see wasted memory? Look up HTTP headers. Remember, your computer makes several of them every second.

    Someone is paying for that and you are pissing all over it just because you can't bother to do better?

    It all comes down to cost vs. benefit. The cost of holding two copies of a DLL is minuscule, while the problems that sharing causes are quite big - especially when you count the man-hours required to solve them.

    @Gąska said in So I decided to try to update part of my toolchain...:

    For this to work, the developer of the library must be absolutely sure that the new version has perfect binary and behavioral compatibility with the old version. And developers are a lazy bunch, and they rarely care about such details. I've had Linux programs break completely on several occasions because a library in an upstream package changed something that the downstream program relied on. I never had this problem on Windows, exactly because everyone includes their dependencies as DLLs in the program's directory.

    Which is the case 99% of the time. You can install the VC 2013 Update 5 runtime and not break anything which used the VC 2013 RTM runtime.

    This is a 1% case, not a 99% case. First, it's one of the most important DLLs in the entire world, so extra care is taken with it to make sure a botched update doesn't paralyze half of the planet. Second, the VC2013 runtime has different DLL names from the VC2012 runtime specifically because they want to avoid all the problems related to having an updated DLL load with old programs. They didn't always do this: MSVCRT.DLL used to be the same filename across different versions of the runtime, and it caused countless problems. So starting with VS2002, they started numbering their DLLs to force programs built for different versions to use different copies of the DLL. They could've kept everything stable and allowed all programs to share the same runtime. But they decided not to.

    The fact that you had this problem in Linux is because the Linux community doesn't give a shit about backward compatibility and thus breaks things when they deem the tradeoff worthy. The reason why they don't give a shit is that nobody is paying them for it; otherwise they would keep backward compatibility just like Microsoft has to.

    Note that Microsoft is a special case here; in the grand majority of big-bucks software companies, no one pays the developers to keep backward compatibility either.

    @Gąska said in So I decided to try to update part of my toolchain...:

    A single DLL usually weighs less than a single display frame nowadays. It's not 1990 anymore, you don't have to care about those kilobytes.

    Wrong:

    6f2caa54-b720-4058-ba4b-bc03db540cfe-image.png

    I said "usually". Of course a graphics driver, with all its millions of application-specific hacks and whatnot, is going to weigh a lot more.

    And don't think I haven't noticed you've sorted by size descending. Here's a screenshot of what I see in the middle of my NVidia DLLs:

    362173de-025c-4a0f-b8fe-23f2ce95ad58-obraz.png

    All smaller than the screenshot itself (when saved as BMP).

    @Gąska said in So I decided to try to update part of my toolchain...:

    Have you never heard of binary patches?

    Of course I have, but that doesn't stop the Office 365 CTR installer from downloading a 2.5 GB full streamed install file on every update; it patches nothing and overwrites everything.

    Fun fact: Office is made mostly of DLLs. Looks like they do nothing to help with your issue, after all.

    @Gąska said in So I decided to try to update part of my toolchain...:

    Right, right. Four uses of DLLs are there - system libraries, IPC, global hooks, and plugins.

    What about engines

    Be more specific.

    COM servers, services

    That's IPC.

    language resources, binary blobs?

    They can go into the EXE as well.



  • @levicki said in So I decided to try to update part of my toolchain...:

    Remind me again what was wrong with static linking and why we invented DLLs in the first place?

    Because if there's a buffer overflow in dependent DLL XYZ, updating the DLL patches that vulnerability in all applications that use it, rather than waiting for each respective developer (some of whom may be dead) to update to the latest version and push out a build.

    Edit: :hanzo:



  • index.png


  • Considered Harmful

    @brie

    rustup update
    cargo update
    

  • Banned

    @levicki said in So I decided to try to update part of my toolchain...:

    @Gąska said in So I decided to try to update part of my toolchain...:

    You want to see wasted memory? Look up HTTP headers. Remember, your computer makes several of them every second.

    Those are at least sent compressed most of the time.

    What was your excuse before 2015, back when they weren't?

    @Gąska said in So I decided to try to update part of my toolchain...:

    It all comes down to cost vs. benefit. The cost of holding two copies of a DLL is minuscule, while the problems that sharing causes are quite big - especially when you count the man-hours required to solve them.

    Show that huge problem of sharing them please. I am waiting.

    How long have you spent poking about with that VC runtime to learn everything you've put in that amazing rant of yours? Now, multiply that by your hourly salary. Now, multiply that by the number of developers all around the world who had to go down the same path as you. Now, add all the hours spent by IT support people who had to fix stuff after new library versions broke old stuff (this was very common back before developers resigned themselves to putting all their dependencies in the program's directory). Also, add the potential cost of however many more hours of development it would take to do updates right, so that no library ever breaks.

    The cost you are mentioning is minuscule to you because as a developer you are not paying for it. I am paying for it as a computer owner.

    But how much are you paying? And how much more would you be willing to pay for software that does DLL updates correctly?

    I am sick of programmers who have no clue about the hardware and platform they are writing code for, and who put no effort into doing stuff the optimal way but instead do the most wasteful thing they can because "it is easier" and "hey, it works, right?", and we end up with "death by a thousand papercuts" syndrome where, no matter how much money you splash on expensive high-performance hardware, some Indian monkey somewhere is going to find a way to bring it down to its knees.

    Aren't you exaggerating a bit? We're talking about single megabytes here. I hate infinite resource hogging as much as you, but DLLs are among the last places you should look at to be outraged. Implementing everything as embedded HTML documents is far worse.

    @Gąska said in So I decided to try to update part of my toolchain...:

    They could've kept everything stable and allowed all programs to share the same runtime. But they decided not to.

    And it was the right decision, because new versions were bringing new features from which old programs would not benefit without a recompile.

    But they still have all the same old features. There's no fundamental reason why they couldn't make it work. It's just not economical.

    @Gąska said in So I decided to try to update part of my toolchain...:

    Of course a graphics driver, with all its millions of application-specific hacks and whatnot, is going to weigh a lot more.

    That is not a graphics driver, it is CUDA libraries, the largest being cuDNN (neural net training stuff)

    So it's even more of a special case than I initially thought.

    but the same goes for drivers and for, say, the Intel Performance Primitives library or any other larger piece of software or SDK. If some DLLs are small, then they are most likely .NET libraries, not native code. Native code size has increased considerably by going to 64 bits, then by adding the VEX prefix, then by 3-operand syntax, and also by compiling libraries optimized for several CPU microarchitectures and adding runtime dispatch. We are not talking about one or a few DLLs here; it's hundreds if not thousands of them. Imagine if each program had its own copy.

    The world of native code has progressed quite a bit since fifteen years ago, from what I see. Back when C# still had less than 5% market share, almost all the DLLs I saw were in the 500 KB range.

    @Gąska said in So I decided to try to update part of my toolchain...:

    Fun fact: Office is made mostly of DLLs. Looks like they do nothing to help with your issue, after all.

    The stupid O365 updating method has nothing to do with DLLs, and one does not cause the other.

    Well, it was your counterargument to my point that binary patches are a thing. What can I say? Try to think your replies through before posting next time? 🤷


  • Banned

    @pie_flavor said in So I decided to try to update part of my toolchain...:

    @brie

    rustup update
    cargo update
    

    Have fun fixing compilation errors after all upstream libraries have switched to a new promises API for the 5th time this year.


  • Considered Harmful

    @Gąska cargo update should only screw you over if you depend on "*" like a retard. And who uses anything other than futures?
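    (Concretely: a normal caret/semver requirement in Cargo.toml keeps cargo update within compatible releases, while a wildcard lets it jump across breaking ones; version numbers illustrative.)

        [dependencies]
        futures = "0.1"   # cargo update stays within 0.1.x
        # futures = "*"   # any release at all, breaking changes included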



  • @Gąska said in So I decided to try to update part of my toolchain...:

    Have fun fixing compilation errors after all upstream libraries have switched to a new promises API for the 5th time this year.

    Sounds like a...

    BROKEN PROMISE.



  • @pie_flavor said in So I decided to try to update part of my toolchain...:

    And who uses anything other than futures?

    Some of us prefer puts and calls.


  • Banned

    @pie_flavor said in So I decided to try to update part of my toolchain...:

    And who uses anything other than futures?

    futures is the 5th one.


  • Considered Harmful

    @Groaner what? I just mean in terms of promises. Noobs use std::future::Future and everyone else uses futures::Future.


  • Considered Harmful

    @Gąska Okay, and? cargo update should perform zero breaking changes unless you were lazy with your build configuration.


  • Banned

    @pie_flavor and it does that by not updating dependencies. Which runs sort of contrary to the point of, you know, updating dependencies.


  • Considered Harmful



  • @pie_flavor said in So I decided to try to update part of my toolchain...:

    @Groaner what? I just mean in terms of promises. Noobs use std::future::Future and everyone else uses futures::Future.

    :whoosh:


  • Considered Harmful

    @Groaner no, no, an option is an asset that may or may not exist, and a future is like that except in the future, yeesh.


  • Banned

    @Groaner he's a college student. Don't expect him to understand economics.


  • Banned

    @pie_flavor said in So I decided to try to update part of my toolchain...:

    @Gąska

    See that number in the front? This is what you want upgraded. Not the other ones, they don't matter. The one in the front matters, as it determines what cool features are available to you.



  • @brie said in So I decided to try to update part of my toolchain...:

    index.png

    We're getting there. On the bright side, the 2013-2014 builds I have of a few dependencies are pretty much the latest releases, sadly enough. I picked up RakNet just before Facebook acquired it and gave it away for free, and it seems like there's been almost zero activity since.

