@Zerosquare said in Hacking News:
An additional factor in this mess is that apparently, some Linux distros aren't built from Git repos; they use tarballs provided by developers instead, and it's not unusual for those tarballs' contents to differ from what's in the Git repos, because they include such things as already auto-generated files and test stuff. The rationale appears to be "it's easier for distro maintainers not to have to run the whole build process from scratch".
It is the release tarballs that are the official output of the project, not the Git repository. And while most projects these days do use a version control repository and have it somewhere public, that is a relatively recent thing. It used to be that most projects only had a server with the release tarballs, and a mailing list for sending patches that the maintainer somehow applied.
In xz's case, the code in Git was clean; the backdoor was only present in the tarballs.
My opinion is:
- needing such workarounds is a good sign that your build process is too convoluted (Pikachu is unavailable for comment)
That's because portability is pain. There were a lot of platforms, they all had slightly different C compilers with slightly different options, and the way to link code differed a lot too. To be able to write programs and compile them on all those platforms, autoconf, automake, and libtool were born. They are, and always have been, an ungodly mess, but they were the only way to make code work on all sorts of obscure Unices.
Many of those systems are now dead, but projects keep using the tools because they are already set up and working, and for the occasional build on some odd ancient system.
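For a flavor of what those tools automate: a generated configure script is essentially thousands of little feature probes. Here's a minimal hand-written sketch of one such probe; the `conftest.c` name and `HAVE_*` macro follow autoconf's conventions, but this is an illustration, not the output of any real project:

```sh
# Probe whether the target system provides <sys/sendfile.h>, the way
# a generated configure script would: try to compile a tiny test
# program and record the result as a preprocessor macro.
cat > conftest.c <<'EOF'
#include <sys/sendfile.h>
int main(void) { return 0; }
EOF
if ${CC:-cc} -c conftest.c -o conftest.o 2>/dev/null; then
  echo '#define HAVE_SYS_SENDFILE_H 1' >> config.h
else
  echo '/* #undef HAVE_SYS_SENDFILE_H */' >> config.h
fi
rm -f conftest.c conftest.o
```

Multiply that by every header, function, compiler flag, and linker quirk a project cares about, and you see why nobody wanted to maintain it by hand per platform.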
- even without foul play, distributing software that isn't guaranteed to match what's in source control is a bad practice, and can create hard-to-find bugs
The files generated by autoconf, automake, and libtool solve a chicken-and-egg problem: the target system may not yet have the tools needed to generate the configure script and the makefiles, but to install those tools, you'd have to configure their build systems first. That's why autoconf, automake, and libtool compile everything down to portable shell scripts and makefile templates that get included in the distribution tarballs.
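Concretely, the split looks something like this (hypothetical project name; `autoreconf` and automake's `dist` target are the standard mechanism):

```sh
# Maintainer's machine: needs autoconf, automake, libtool installed.
autoreconf --install   # generate configure, Makefile.in, aux scripts
./configure            # produce a Makefile, so that...
make dist              # ...we can roll foo-1.0.tar.gz, *including*
                       # the generated configure script

# End user's machine: needs only a shell, make, and a C compiler,
# because configure already ships in the tarball.
tar xf foo-1.0.tar.gz && cd foo-1.0
./configure && make && sudo make install
```

That gap between what's in Git and what `make dist` adds on top is exactly where the xz backdoor hid.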
@Carnage said in Hacking News:
Oh yes, any project whose build requires anything beyond the compiler and build toolchain installed and then running a single command is by my definition broken. If I have to spend time fiddling with knobs and digging through years-old forum threads to find out how to make the fucking thing build, then it's broken.
But that's the point of the convoluted build system: it does not require anything beyond a compiler, and then you run, well, three commands (`configure`, `make`, `sudo make install`). You don't fiddle with any knobs; it adjusts the grand dozens of knobs itself.
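And when you do need a knob, the conventions are uniform across every autotools project. `--prefix`, `CC`, and `CFLAGS` are standard autoconf options; the values here are just examples:

```sh
# The knobs still exist; configure just defaults them sanely.
./configure --prefix="$HOME/.local" CC=clang CFLAGS='-O2 -g'
make -j"$(nproc)"
make install    # no sudo needed with a user-writable prefix
```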
@Carnage said in Hacking News:
If it's a project that is also deployed in some cloud or some such shit, the deploy should also be a single command or a single button press.
Not reading miles of readmes to do the standard thing.
Sure, and I am the Chinese God of Fun. You can totally have a single button press when it's your project deploying to your specific cloud, where the button is on the CD server and it took someone two weeks to set up the deployment pipeline (because debugging those things is a huge time sink). But if you are distributing it, well, any non-trivial software will have a lot of parameters for whoever is installing it to set, and those will be properly described in the miles of readmes if you are lucky.
@Carnage said in Hacking News:
Oh and building or deploying should not have side effects. If I build one project, another project should not stop compiling.
Building should not, and can mostly be prevented from having them.
Deploying… if you deploy each application into a separate container, yes. Installing libraries onto a shared system… there is no real way to prevent the possibility.