Go, Go, Gadget Depends!



  • Sorry, but I have to whine about this some more. I knew go had problems with its dependency management architecture, but I had no idea the situation was this FUCKING ATROCIOUS.

    A good breakdown:

    http://jbeckwith.com/2015/05/29/dependency-management-go/

    Here's the gist:

    If you follow the setup guide for golang, you’ll find yourself with a single directory where you’re supposed to keep all of your code. Inside of there, you create a /src directory, and a new directory for each project you’re going to work on.

    When you install a dependency using `go get`, it will essentially drop the source code from that repository into `$GOPATH/src`. In your source code, you just tell the compiler where it needs to go to grab the latest sources:

    import "github.com/JustinBeckwith/go-yelp/yelp"
    ...

    > So this is really bad. The go-yelp library I’m importing from github is pulled down at compile time (if not already available from a go get command), and built into my project. That is pointing to the master branch of my github repository. Who’s to say I won’t change my API tomorrow, breaking everyone who has imported the library in this way?
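    To make the quoted complaint concrete, here's a tiny runnable sketch (the GOPATH value is made up) of how an import path maps straight onto the filesystem, with no version number anywhere in sight:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// srcDir shows where `go get` drops a package's source: the import
// path is mapped verbatim onto the directory tree under $GOPATH/src.
func srcDir(gopath, importPath string) string {
	return filepath.Join(gopath, "src", filepath.FromSlash(importPath))
}

func main() {
	fmt.Println(srcDir("/home/me/go", "github.com/JustinBeckwith/go-yelp/yelp"))
}
```

    The directory is the only identity a dependency has; whatever happens to be checked out there is what you build against.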
    
    So in order to do anything more involved than `hello_world.go`, you need to:
    
    - Change the magical `GOPATH` environment variable so that it points to your current project instead of some common "workspace", as intended
    - Copy all the dependencies into your own source tree, instead of relying on go's "magical" `get` functionality
    
    Basically, manually work around go's pathetically broken dependency management system.
    
    The OP then researches like 20 different utilities to help you do this. I'm not being hyperbolic; there really are that many of them.
    
    <img src="/uploads/default/original/3X/b/c/bc4b9c6d4a1a66090279195605c8b330f9f91d12.png" width="518" height="500">
    
    And of course, without some kind of package repository, good luck finding them all and separating the wheat from the chaff.
    
    The OP eventually picks [godep](https://github.com/tools/godep) as the best, most popular package management polyfill.
    
    Ok fine, I thought, golang is a mess, but at least someone stepped in and made a decent fix. Only to learn godep actually expects me to include the source code of every 3rd party package I intend to use into my own repository.
    
    <img src="/uploads/default/_emoji/wtf.png?v=0">
    
    WTF are these people smoking?! Have they never seen how a package manager is supposed to work?
    
    And apparently, even the long-awaited ["vendor" experimental feature](http://engineeredweb.com/blog/2015/go-1.5-vendor-handling/) arriving in the next version of go will require you to forgo Go's official methodology and use hack tools to manage your dependencies. I can see how this might lead to some sane system in the future, but right now, I'm stuck with fucking godep and its ilk.
    
    There you go, folks. State of the art dependency management in 2015 - copy 3rd party sources into your own repository.
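    For the record, godep records its version pins in a Godeps/Godeps.json file next to the vendored sources. Roughly this shape, with a made-up project name and revision hash:

```json
{
  "ImportPath": "github.com/you/yourapp",
  "GoVersion": "go1.4",
  "Deps": [
    {
      "ImportPath": "github.com/JustinBeckwith/go-yelp/yelp",
      "Rev": "0123456789abcdef0123456789abcdef01234567"
    }
  ]
}
```

    So the exact commit is at least recorded, but the sources themselves still live in your repo.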


  • @cartman82 said:

    Who’s to say I won't change my API tomorrow, breaking everyone who has imported the library in this way?

    Well, look at it the other way - in Go, it's pretty much enforced by the compiler that you don't fuck up your API every other release!


    Filed under: granted, it's other people's compiler, but still



  • GIGO (Gigo Installer for Go)

    Oh god, another "clever" recursive acronym. You just know this one's gonna be a winner.


  • Discourse touched me in a no-no place

    @cartman82 said:

    WTF are these people smoking?!

    They're using Go. This should say it all.



  • This one was promising.

    But...

    $ glide get https://github.com/Sirupsen/logrus.git
    Oops! exit status 128
    

    Ok.... and? What now!?

    Sigh.



  • The bullshit open source-y people go through in order to avoid a "myapplication.csproj" file or equivalent.



  • @cartman82 said:

    Only to learn godep actually expects me to include the source code of every 3rd party package I intend to use into my own repository.

    Isn't Go always statically-linked? How else could this work?



  • @blakeyrat said:

    Isn't Go always statically-linked? How else could this work?

    You're supposed to have sources and packages you want to use available for compilation. That's not the issue.

    The issue is all this crap needs to go into my own git repo, according to their spec.



  • @cartman82 said:

    The issue is all this crap needs to go into my own git repo, according to their spec.

    But isn't that the end-goal? So you can just yank down your project and build it forever? I'm confused as to what you're trying to accomplish I guess.

    In the C# world, putting the dependencies into your own source tree where you can check them in is kind of the preferred normal way of using Nuget. Of course, the C# world has dynamic linking so we can do that with a couple .dll files.



  • @cartman82 said:

    This one was promising.

    But...

    $ glide get https://github.com/Sirupsen/logrus.git
    Oops! exit status 128

    Ok.... and? What now!?

    Sigh.

    The trick is, you need to do this instead:

    $ glide get github.com/Sirupsen/logrus
    

    But why should they bother catching this (I bet) extremely common mistake? In fact, they should just write out a cryptic generic error code and die. Oh, and dump a bunch of garbage on the HDD.

    Crappy OSS coders, right?

    Well, I'm pretty sure go runtime itself is the one doing the garbaging.



  • @blakeyrat said:

    But isn't that the end-goal? So you can just yank down your project and build it forever? I'm confused as to what you're trying to accomplish I guess.

    In the C# world, putting the dependencies into your own source tree where you can check them in is kind of the preferred normal way of using Nuget. Of course, the C# world has dynamic linking so we can do that with a couple .dll files.

    Actually, every nuget project I've seen in the wild keeps just the repositories.config in source control. Nuget then downloads the actual packages at compile time and caches them locally.

    That's how every other package manager operates too (npm, pip...).

    You're an old dog living in a brave new world, Blakey. :-)



  • If you could specify a certain version in the repository to use, all you'd need to trust would be that the server doesn't go down. And you could keep a copy saved somewhere in case it does.



  • @anonymous234 said:

    If you could specify a certain version in the repository to use, all you'd need to trust would be that the server doesn't go down.

    That's still a FUCKLOAD of trust.

    "All you need to do is trust the business sense of people who decided to build their product around Git."



  • @cartman82 said:

    Actually, every nuget project I've seen in the wild keeps just the repositories.config in the source control. Nuget then downloads the actual packages at compile time and caches them locally.

    Yeah well we need to be HIPAA compliant and aren't utter hacks, so obviously we need 100% reproducible builds. There's no way to do that without keeping the copy of the library in source control.

    It's not about new vs. old, it's about quality vs. shit.

    I guess if you're in the open source world where you just generate utter bullshit and nobody gives a shit because there's no quality or even forethought anywhere ever, then sure: do whatever. Put your dependencies on a USB drive and attach it to the collar of a rabid dog. Why not.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    There's no way to do that without keeping the copy of the library in source control.

    Binary code in source control? :doing_it_wrong:

    (Which isn't to say that it isn't critical to be able to identify precisely what went into the build of course. But that's not the same thing.)



  • @blakeyrat said:

    Yeah well we need to be HIPAA compliant and aren't utter hacks, so obviously we need 100% reproducible builds. There's no way to do that without keeping the copy of the library in source control.

    Or you could just host your own internal NuGet server. You know, so that you can distribute packages in a sensible manner. Also, so that your repositories aren't gigabytes in size, bogging you down every time you want to do anything.
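    A minimal sketch of what that looks like in a NuGet.config (the internal feed URL is made up; the layout follows NuGet's packageSources convention):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- hypothetical internal feed; replace with your own server -->
    <add key="internal" value="https://nuget.internal.example/api/v2" />
    <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
  </packageSources>
</configuration>
```

    The repo then only carries the package list, and the feed guarantees the bits stay available.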


  • Garbage Person

    Yes and no. A file based repository included with the source works, but it's fucking hellishly slow.

    For quick and dirty slap dash stuff, just pointing right at nuget.org is okay.

    The more correct way to do local storage of packages to guarantee reproducibility is to run a local nuget server like ProGet.



  • @Weng said:

    The more correct way to do local storage of packages to guarantee reproducibility is to run a local nuget server like ProGet.

    ..... that's what I said.

    EDIT: What I said in the other post :-)


  • Garbage Person

    The biggest problem with storing the packages in source control is that when told to get a package from a folder, nuget opens and inspects every fucking package in the folder to make sure it gets the correct package and version. This means it thrashes the fuck outta the file system and CPU to unzip everything for every single dependency.

    We used to do it like that. It sucked. Hour long build times.

    A package server keeps that info in a database and serves up exactly the right package.



  • @dkf said:

    Binary code in source control?

    If that's literally the best reason you can give for why what we're doing is a bad idea (the name of the category of software we use to track changes), then it must be the World's Greatest Idea.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    it must be the World's Greatest Idea.

    Package servers are a thing. They specialise in looking after builds of packages, and do so using tools that are right for that task (so it's blobs and checksums and origins and so on). This leaves your source repository to focus on looking after source, i.e., things where your organisation is the first party and where tracking things is really important.

    @Weng said:

    A package server keeps that info in a database and serves up exactly the right package.

    What he says.



  • I need to say this: the location of your project should never be considered when downloading or building dependencies. The dependency manager should have a dedicated cache folder, like Maven and Biicode.



  • @cartman82 said:

    If you follow the setup guide for golang, you’ll find yourself with a single directory where you’re supposed to keep all of your code. Inside of there, you create a /src directory, and a new directory for each project you’re going to work on.
    When you install a dependency using go get, it will essentially drop the source code from that repository into `$GOPATH/src`

    Can you imagine if MS Word wanted to put all its files in an arbitrary folder?



  • One of the things Java got right. And if you complain about "dependency hell" in Java it's because you're stupid or :doing_it_wrong:

    BTW, doesn't iOS work like this? I've seen some projects with a bunch of folders not related to the project.


  • Discourse touched me in a no-no place

    @Eldelshell said:

    One of the things ~~Java~~ Maven got right.

    FTFY. (Before Maven, Java sucked a lot.)



  • Yeah, but also having JAR files that you can distribute helps a lot. Maven wouldn't exist if it wasn't for JAR files... something which lots of languages started doing later.


  • Discourse touched me in a no-no place

    @Eldelshell said:

    Yeah, but also having JAR files that you can distribute helps a lot.

    I can remember Java before they came along. In the very early versions, you just gave the JVM a directory (or list of directories) full of .class files, though ZIPping them up was added shortly after in order to make applets suck not quite so much. JARs evolved out of that.

    INB4 applets suck anyway. They sucked far more in the first versions.



  • Aren't JAR files just zip files with a different extension and a standardized location for files? I wouldn't really call that evolution... more like standardization.


  • :belt_onion:

    pom.xml?


  • :belt_onion:

    ?

  • :belt_onion:

    Yup. They're literally .zips with your classes, resources, and a META-INF folder.

    Also, +1!
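    The "literally .zips" claim is easy to demonstrate. A sketch in Go (entry names and contents are made up) that builds a minimal jar-shaped archive in memory and lists it back with a plain zip reader:

```go
package main

import (
	"archive/zip"
	"bytes"
	"fmt"
	"log"
)

// buildAndList writes a minimal "JAR" into memory (it is just a zip
// with a conventional layout: META-INF/MANIFEST.MF plus .class files),
// then lists the entries back with an ordinary zip reader.
func buildAndList() []string {
	var buf bytes.Buffer
	w := zip.NewWriter(&buf)
	for _, name := range []string{"META-INF/MANIFEST.MF", "com/example/Main.class"} {
		f, err := w.Create(name)
		if err != nil {
			log.Fatal(err)
		}
		f.Write([]byte("placeholder contents"))
	}
	w.Close()

	r, err := zip.NewReader(bytes.NewReader(buf.Bytes()), int64(buf.Len()))
	if err != nil {
		log.Fatal(err)
	}
	var names []string
	for _, f := range r.File {
		names = append(names, f.Name)
	}
	return names
}

func main() {
	for _, n := range buildAndList() {
		fmt.Println(n)
	}
}
```

    It prints the two entry names back; nothing Java-specific is involved at any point.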



  • That class="emoji" makes the Maven logo unreadable XD


  • Discourse touched me in a no-no place

    @LB_ said:

    Aren't JAR files just zip files with a different extension and a standardized location for files? I wouldn't really call that evolution... more like standardization.

    The big difference is the metadata standardization; a lot of the more complex uses depend on that.



  • @LB_ said:

    Aren't JAR files just zip files with a different extension and a standardized location for files?

    They're the first instance of that particularly useful pattern that I remember seeing. Open Document turned up not much later.



  • Can you imagine if $PATH (%PATH% for you DOS users) only allowed one element?

    No?

    Well, $GOPATH is parsed the same way that $PATH is, so you're free to make as many roots for your code as you want.


  • Banned

    @ben_lubar said:

    Can you imagine if $PATH (%PATH% for you DOS users) only allowed one element?

    It does. It's just that parsers for $PATH split at : (or ; for DOS's %path%).

    @ben_lubar said:

    Well, $GOPATH is parsed the same way that $PATH is, so you're free to make as many roots for your code as you want.

    Dependency hell ahoy!



  • The PATH thing is a :wtf: in and of itself. Why would you ever want to create the exact same thing again when you know it's bad?


  • Discourse touched me in a no-no place

    @LB_ said:

    The PATH thing is a :wtf: in and of itself.

    You have a particular reason for claiming that? Or is it just that you think that all executables should be either all in the same directory, or always specified by full pathname?



  • Neither. Are you suggesting that a decision made decades ago before we knew how computers would be used should be trusted? There are a multitude of well-known problems with it that you should already know. Maybe you don't think they're that big of a deal, but I do.


  • Discourse touched me in a no-no place

    @LB_ said:

    There are a multitude of well-known problems with it that you should already know.

    I was asking because I was hoping that you'd describe them and not just assert “Ooooooh, they're bad!!! EEEeeeevilll!!!!!” A link is OK.


  • Banned

    @dkf said:

    You have a particular reason for claiming that?

    It's stringly typed.

    @dkf said:

    Or is it just that you think that all executables should be either all in the same directory, or always specified by full pathname?

    The list of executable directories should be an actual list.


  • Discourse touched me in a no-no place

    @Gaska said:

    It's stringly typed.

    Ah, the “the rest of my computer should use my favourite language's native type system” brain-worm.


  • Banned

    As opposed to the "you can store everything as a string just fine" brain worm.



  • @dkf said:

    Before Maven, Java sucked a lot.

    I thought Ivy was pretty good. (And it was around before Maven, wasn't it?)



  • Would you rather have it stored in JSON? Or have environment variables that aren't strings of bytes?

    In the former case, you're just taking a string and making it harder to parse. In the latter case, how do you propose the system call works for retrieving the environment variable?


  • Banned

    The situation is unfixable now - it's thirty years too late. But having non-string env vars would be awesome - similarly, non-string stdin/stdout/arguments.



    But... But they told me everything should be a text file...

    More seriously, I don't see an issue with PATH being a string. It's not like it's some sort of heavily structured data; it's just a basic list of basic strings anyway, and all you need is a proper escaping/delimiting scheme.

    Until you find a way to object-orient the file system itself, I think we're going to be fine...


  • Banned

    @Maciejasjmj said:

    More seriously, I don't see an issue with PATH being a string.

    Every time I mess with $PATH, I get a chill down my spine similar to the one I get when I do `rm -rf`.

    @Maciejasjmj said:

    all you need is a proper escaping/delimiting scheme

    90% of all critical security issues come from incorrectly escaping characters. Okay, very few of them have anything to do with PATH in particular, but still - as a principle, storing the environment as arbitrary strings is a bad idea. Not to mention that storing all the places to find system executables in a single variable is kind of a PITA.

    @Maciejasjmj said:

    Until you find a way to object-orient the file system itself, I think we're going to be fine...

    What do you mean by object-orienting a filesystem?



  • @dkf said:

    FTFY. (Before Maven, Java sucked a lot.)

    After Maven, Java still sucked a lot.



  • As anybody from the Mac Classic world can tell you, paths aren't text. They're paths. They're their own data type.

    If you have:

    Macintosh HD:files:foobar.text

    And you rename "Macintosh HD" to, say, "Freddy"...

    Freddy:files:foobar.text doesn't magically become a different file, or end up in a different location. It's the same file in the same location.

