This one is too good not to share...



  • I just spent the last hour with a co-worker going through the build process, trying to figure out how all the moving bits and pieces work. Perhaps first, a bit of backstory:



    Back in February, I found myself employed at a small software shop. (By small, I mean there were 5 people: the boss, a secretary, two developers, and a support guru.) I was replacing one employee who had given his notice, so I had three days to learn everything I could from him. Not long after I started, they hired two more developers (one of whom has since quit), and the only other remaining developer gave his notice. The result was that I went from being the new guy to the main developer in just a few months.



    The main product had a rather elaborate build process. I hadn't personally looked into it much; the word from on high was "It works, so leave it alone". Fine. That is, until we discovered just how convoluted it was.



    Allow me to outline it. The build process is as follows:

    1. Make sure your code compiles/runs as normal on your local machine.


    2. Run a batch file, which calls Visual Studio to build the solution. It then copies the results of the build (from several projects) into a directory.


    3. Start the Update Manager application. In the Update Manager:



        a. Copy any SQL scripts that need to be run for the update.



        b. Copy and paste into the window the newly generated .exe files for the Any CPU build (incorrectly named something like 'Files for x86').



        c. Copy and paste into the window the newly generated .exe files for the x86 build (incorrectly named something like 'Files for x64').



        d. Click Create New Release.


    4. Clicking Create New Release does the following:



        a. Creates a .zip file containing the files that were in the two lists, as well as the SQL script.



        b. Copies this .zip file to a location on a public share.



        c. Calls a .bat file on the public share that does the following:



          i. Calls Visual Studio to make a new build of a project that includes the updated .zip file (which contains the newly created .exe files, as well as the SQL script).



          ii. Copies the just-built .exe from this solution to a public directory.


    5. Double-clicking a new release .exe runs the SQL script that was built into it (using some scary assembly black magic), unzips the .zip file contained within it (again, assembly black magic), and copies the .exe files it contains over the old .exe files it is updating.
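
    For the curious, here's a rough sketch of what a self-updating .exe like that has to be doing internally. This is a reconstruction, not our actual code: the resource names, connection string, and target path are all made up, and it assumes the embedded SQL script contains no GO batch separators and that the .zip is flat (no subdirectories).

        // Hypothetical reconstruction of the self-updating release .exe (step 5).
        // Resource names, connection string, and target path are assumptions.
        using System.Data.SqlClient;
        using System.IO;
        using System.IO.Compression; // .NET 4.5+; reference System.IO.Compression.FileSystem
        using System.Reflection;

        class SelfUpdater
        {
            static void Main()
            {
                Assembly asm = Assembly.GetExecutingAssembly();

                // 1. Run the SQL script embedded in this .exe (assumes no GO separators).
                using (Stream sqlStream = asm.GetManifestResourceStream("Updater.update.sql"))
                using (var reader = new StreamReader(sqlStream))
                using (var conn = new SqlConnection("Server=.;Database=Product;Integrated Security=true"))
                {
                    conn.Open();
                    new SqlCommand(reader.ReadToEnd(), conn).ExecuteNonQuery();
                }

                // 2. Unzip the .zip embedded in this .exe over the old .exe files.
                using (Stream zipStream = asm.GetManifestResourceStream("Updater.release.zip"))
                using (var archive = new ZipArchive(zipStream))
                {
                    foreach (ZipArchiveEntry entry in archive.Entries)
                        entry.ExtractToFile(Path.Combine(@"C:\Product", entry.FullName), true);
                }
            }
        }

    Once you see it spelled out, it's just embedded resources plus a zip library; "black magic" only in the sense that nobody documented it.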



      Not only is the process incredibly long and unclear, it's also quite frightening. The amount of hackery in it scares me, especially the self-extracting .exe generated by the second batch file.

      The whole reason we started looking into it (and began to unravel a universe of knotted yarn) was that we wanted to simplify things and do just a single x86 build. I couldn't figure out why they were running two separate builds when it made absolutely no sense to do so. After we had unwound the whole thing, the boss came in and told us it wasn't really an x86 vs. Any CPU thing at all; the real reason was SQL Server 2005 vs. 2008, which in my mind makes even less sense. Whether we do two separate builds (one x86 and the other x64) should have no real bearing on which database it ends up on. Apparently the previous developer named things incorrectly all over the place, and the boss tore him a new one for it. The end result, though, is that we are now stuck with a completely convoluted, incorrectly named, overly complicated build process.



      So much for being able to kick off a build with a single click.





      [mod - un-bricked brick-text - PJH]


  • Whoa, "assembly black magic"? You mean actual ASSEMBLY LANGUAGE code, or just reflection against .NET assemblies?



  • @Joel B said:

    The build process is as follows: [snipped: quoted in full in the post above]
    Dude!  Last shop I worked at would hire you in an instant if you could streamline a build to that point.

    We're talking 38-page instruction sheets to the operators (Developers May Not Touch The Prod Console!) and four- or five-page rollout schedules to coordinate which site will run the script on which day.



  • I don't know if such a thing exists, but it would be awesome for you to have something that "lets you easily automate your builds and deployments. With an intuitive, web-based UI, you can set up as many deployment plans as you need from Testing to Production."

    Maybe the dude who created this site could help :-)



  • @da Doctah said:

    (Developers May Not Touch The Prod Console!)

    Relaxen und watchen das Blinkenlights.



  • @da Doctah said:

    The Prod Console

    Your cattle prod has a console attached?

    Can't you tase someone using a fancy GUI nowadays, or are you using the older command-driven higher-shockability model?



  • @Watson said:

    Relaxen und watchen das Blinkenlights

    vat are you sinking about?



  • @da Doctah said:

    (Developers May Not Touch The Prod Console!)

    Insane! How are you going to prod the console if you can't touch it?



  • @dhromed said:

    @Watson said:

    Relaxen und watchen das Blinkenlights

    vat are you sinking about?

    Vateffer ze lights tell me to sink about



  • Well, at least your executables are not stored in source control. On our program, Configuration Management does not build any executables, so developers have to build all the solutions and check them in before CM can push them out (and by "push them out", I mean manually copy and paste the files to where they are supposed to go). At least my manager realizes the huge risk associated with this; unfortunately, CM is under a separate group outside her control.


  • ♿ (Parody)

    @Anketam said:

    Well, at least your executables are not stored in source control. On our program, Configuration Management does not build any executables, so developers have to build all the solutions and check them in before CM can push them out (and by "push them out", I mean manually copy and paste the files to where they are supposed to go). At least my manager realizes the huge risk associated with this; unfortunately, CM is under a separate group outside her control.

    What's wrong with this? OK, "manually copy and paste" is sub-optimal. But what's wrong with having a record of exactly what you released? Would you rather these unaccountable boobs had the responsibility of properly configuring and building your software for release?



  • I've considered adding a cattle prod to a console before. Activate in response to the "su" command. You know, to re-train people who think it's a good idea to log in with root privs and leave the window open all day long.



  • @boomzilla said:

    @Anketam said:
    Well, at least your executables are not stored in source control. […]
    What's wrong with this? OK, "manually copy and paste" is sub-optimal. But what's wrong with having a record of exactly what you released? Would you rather these unaccountable boobs had the responsibility of properly configuring and building your software for release?
    What is wrong is that you can easily have an executable out of sync with what has been checked into source control, with no way of knowing. Derived files like executables should be generated from source control when they are ready to be pushed out or put into a build/patch file. The other danger is that a developer can easily put in malicious (or really any unauthorized) code, check in the executable, and not check in the malicious source. If someone reviews the code as part of a peer review or code review, it will look OK.


  • ♿ (Parody)

    @Anketam said:

    What is wrong is that you can easily have an executable out of sync with what has been checked into source control, with no way of knowing. Derived files like executables should be generated from source control when they are ready to be pushed out or put into a build/patch file. The other danger is that a developer can easily put in malicious (or really any unauthorized) code, check in the executable, and not check in the malicious source. If someone reviews the code as part of a peer review or code review, it will look OK.

    I don't see how any of that is actually enabled by checking in the binaries. You'd at least have some record of what happened and who did it. And you could always check it against the respective source code. What's to stop someone from doing some malicious changes after the code is checked out and before the build is created?



  • @boomzilla said:

    What's to stop someone from doing some malicious changes after the code is checked out and before the build is created?

    You aren't paying attention:

    If you have built files from a dev checked in to use for deployment, you can do this:

    Check out code, make the malicious change, compile, change the code back, check in, profit.

    Notice that the checked-in code will never show the malicious change, and you will deploy it anyway.

    Instead, the build should come from a build machine that gets the latest code and builds from that, so you know it contains only code that is in source control, nothing from outside.



  • @boomzilla said:

    @Anketam said:
    What is wrong is that you can easily have an executable out of sync with what has been checked into source control, with no way of knowing. […]
    I don't see how any of that is actually enabled by checking in the binaries. You'd at least have some record of what happened and who did it. And you could always check it against the respective source code. What's to stop someone from doing some malicious changes after the code is checked out and before the build is created?
    Not necessarily: the bad code could be hidden in any dll or exe, and you would be hard pressed to figure out which dll or exe was the culprit. As for your second point, I assume you meant after the code is checked in and before the build is created; that is why you also automate the build process (and don't do something as messed up as what the OP inherited).

    In general, though, I am not that concerned about malicious developers; if they are determined enough, they will find a way. My primary concern is that built files can easily get out of sync with source code. In my case I normally have multiple issues I am trying to fix in a single project, and if I have checked-out files containing unfinished/temporary/misc. code for one change, I would have to undo or revert those files before I could check in the built files for the other change, unless of course I want unfinished code from the other change creeping into my executables. <rant>On my program there have been several times where developers put temporary code in to test a change, then got an emergency fix request (on the same project), and in their haste the temporary code from the unrelated issue ended up in the emergency fix and broke some unrelated part of the website; the test team did not catch it because they were only testing the emergency fix, not doing regression testing.</rant>


  • ♿ (Parody)

    @KattMan said:

    You aren't paying attention: […] Instead, the build should come from a build machine that gets the latest code and builds from that, so you know it contains only code that is in source control, nothing from outside.

    No, I was paying attention; I guess you didn't understand what I wrote. Enforcing an automated build process is no different from preventing bad behavior by developers: it still comes down to trusting people. How do you know the guy running the build script doesn't interrupt it and do something malicious or stupid?


  • ♿ (Parody)

    @Anketam said:

    Not necessarily: the bad code could be hidden in any dll or exe, and you would be hard pressed to figure out which dll or exe was the culprit. As for your second point, I assume you meant after the code is checked in and before the build is created; that is why you also automate the build process (and don't do something as messed up as what the OP inherited).

    How is this difficult? You can make a known-good build and compare it to the checked-in and deployed versions (a sketch of that comparison follows after this post). For the record, my view is that any part of the build process that isn't automated is just asking for failure.

    @Anketam said:

    My primary concern is that built files can easily get out of sync with source code. In my case I normally have multiple issues I am trying to fix in a single project, and if I have checked-out files containing unfinished/temporary/misc. code for one change, I would have to undo or revert those files before I could check in the built files for the other change, unless of course I want unfinished code from the other change creeping into my executables.

    Yes, poor configuration management is always a problem, and good SCM/VCS discipline is required. But that's orthogonal to keeping a record of what you deploy.
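
    To illustrate that comparison (a sketch, not any particular toolchain; paths are placeholders): hash the checked-in binary and the deployed one and flag a mismatch. One caveat: two separate compilations of identical source are generally not byte-identical in .NET (timestamps and MVIDs differ), which is why you compare the deployed file against the checked-in binary rather than against a fresh rebuild.

        // Hypothetical audit: does the deployed binary match the one in
        // source control? Paths below are placeholders.
        using System;
        using System.IO;
        using System.Security.Cryptography;

        class BinaryAudit
        {
            static string HashFile(string path)
            {
                using (var sha = SHA256.Create())
                using (var stream = File.OpenRead(path))
                    return BitConverter.ToString(sha.ComputeHash(stream));
            }

            static void Main()
            {
                string checkedIn = @"C:\repo\bin\Product.exe";    // what was checked in
                string deployed = @"\\server\share\Product.exe";  // what actually shipped

                Console.WriteLine(HashFile(checkedIn) == HashFile(deployed)
                    ? "Deployed binary matches source control."
                    : "MISMATCH: deployed binary differs from source control!");
            }
        }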



  • @boomzilla said:

    orthogonal

    Don't use big words in front of him.



  • @ekolis said:

    Whoa, "assembly black magic"? You mean actual ASSEMBLY LANGUAGE code, or just reflection against .NET assemblies?

    I'm pretty sure he means GetManifestResourceStream, and in my opinion it is far from black magic. If you have any resource (an XML file, a text file, an image) that should be used (parsed, extracted, displayed, or anything else) from within a VS .NET project, add the file to the project, set its Build Action to "Embedded Resource", and you can access it by its "file name" without having to put the file anywhere in the file system.
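
    A minimal sketch, with a made-up default namespace and file name (the manifest name is typically <DefaultNamespace>.<FileName>); the updater sketch earlier in the thread uses the same call:

        // Reads a file that was added to the project with
        // Build Action = "Embedded Resource". The project's default
        // namespace is assumed to be "MyApp"; both names are made up.
        using System;
        using System.IO;
        using System.Reflection;

        class EmbeddedResourceDemo
        {
            static void Main()
            {
                Assembly asm = Assembly.GetExecutingAssembly();

                using (Stream stream = asm.GetManifestResourceStream("MyApp.readme.txt"))
                using (var reader = new StreamReader(stream))
                {
                    Console.WriteLine(reader.ReadToEnd());
                }
            }
        }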



  • @mihi said:

    I'm pretty sure he means GetManifestResourceStream

    You may want to link to the English version instead (guessing at the address for an English version), as this is an English-language forum.



  • Oops right, I meant this one. Sorry folks.



  • My bad, I meant .NET assembly reflection black magic.

    But yes... I was quite surprised to learn how the process worked. The previous place I worked at had automated builds: any time someone checked something in, it would do a build. That made it pretty easy to tell who broke the build.

    Doing builds on a dev machine is a bit sketchy, in my opinion. It can quickly turn into a case of "Well, it worked on my machine, so what's the problem?". There's no QA here beyond doing a build and releasing it... which is itself a WTF.



  • Showing my .NET ignorance, but why build through the IDE?

    Seems like starting over with something like NAnt for builds, a CI server for automating the builds and storing deployables, and a database versioning tool like Liquibase would be time well spent. However, with a single(?!) developer handling everything, it seems unlikely you'd get approval for all that unless a few releases break catastrophically.
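
    To that point: you don't need the IDE at all. MSBuild can build the same .sln from the command line, and a build machine can script that call. A rough sketch; the MSBuild path is the standard .NET 4.0 location, but the solution name and platform value are made up:

        // Hypothetical: build a solution headlessly by invoking MSBuild,
        // instead of calling the Visual Studio IDE from a batch file.
        // Solution name and platform are placeholders.
        using System.Diagnostics;

        class HeadlessBuild
        {
            static void Main()
            {
                var psi = new ProcessStartInfo
                {
                    FileName = @"C:\Windows\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe",
                    Arguments = "Product.sln /t:Rebuild /p:Configuration=Release \"/p:Platform=Any CPU\"",
                    UseShellExecute = false,
                };

                using (Process build = Process.Start(psi))
                {
                    build.WaitForExit();
                    // A non-zero exit code means the build failed.
                    System.Console.WriteLine("MSBuild exited with code " + build.ExitCode);
                }
            }
        }

    A CI server is essentially this plus "get latest from source control first" plus "do it on every check-in", which also answers the build-machine point raised above.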



  • @Joel B said:

      b. Copy and paste into the window the newly generated .exe files for the Any CPU build (incorrectly named something like 'Files for x86').



      c. Copy and paste into the window the newly generated .exe files for the x86 build (incorrectly named something like 'Files for x64').

    x64 processors can run x86 code, but not the other way around. TRWTF is the OP not understanding the difference between x86 and x86_64.



  • @Ben L. said:

    x64 processors can run x86 code, but not the other way around. TRWTF is the OP not understanding the difference between x86 and x86_64.
    AnyCPU means any CPU, which includes ARM.

