    communist_goatboy

    @communist_goatboy

    Reputation: 2
    Posts: 20
    Profile views: 36
    Followers: 0
    Following: 0

    Best posts made by communist_goatboy

    This user hasn't posted anything yet.

    Latest posts made by communist_goatboy

    • RE: Custom flooring

      I am struggling to figure out how taking the floor of the quotient of a rational number's numerator and denominator is in any way useful. It smells like rationals are just being used as containers for two integral types (they are integral types, aren't they?).

      If rationals are really needed, why not just wrap up std::ratio if using C++11, or boost::rational if stuck with C++03?
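
      For concreteness, a minimal sketch of that kind of wrapping, assuming the only requirement really is floor(p/q). Note that std::ratio is compile-time only, so the runtime case leans on boost::rational; floor_of below is a hypothetical helper, not part of Boost:

      #include <boost/rational.hpp>
      #include <iostream>

      // Hypothetical helper: floor of p/q for a boost::rational. Plain integer
      // division truncates toward zero, so the negative case needs a correction.
      long floor_of(const boost::rational<long>& r)
      {
          long q = r.numerator() / r.denominator();
          long rem = r.numerator() % r.denominator();
          return (rem != 0 && ((rem < 0) != (r.denominator() < 0))) ? q - 1 : q;
      }

      int main()
      {
          std::cout << floor_of(boost::rational<long>(7, 2)) << "\n";   // prints 3
          std::cout << floor_of(boost::rational<long>(-7, 2)) << "\n";  // prints -4
      }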

      posted in Side Bar WTF
    • RE: Astronomically bad

      @blakeyrat said:

      @dkf said:
      What's more, they have a point: most programmers cannot write numeric-heavy software, even on pain of… well, a lot of pain. Making a complex algorithm be numerically stable so you actually get meaningful answers out is non-trivial.

      Right, I get that 100%. But by using C, they're taking that amount of pain and cranking it up to 11.

      Eeehh, C isn't that bad. If you want a language with surprise typing, no memory management for file streams, and a bastardized array interface (with even more surprise typing!), one that is sold as a vectorized language but doesn't support vectorization beyond a single dimension and truncates mismatching dimensions, then look to IDL. Jesus fuck, someone kill that language.

      @blakeyrat said:

      @dkf said:
      There's a number of stories about careers being ruined when it was found out that someone got things horribly wrong

      Well they can do what they like, but from my perspective I'd be a hell of a lot more worried that I accidentally wrote code that exposed one of the nasty warts in C and destroyed my career that way. At least in modern, memory-managed languages your algorithm is (by and large) just your algorithm and not "your algorithm and the 50,000 lines of boilerplate and bullshit required to manage your own memory and make up for C's lack of any useful data types for scientific work in a big congealed lump of crap".

      Accuracy, speed, and scalability are the three rungs of the hierarchy of scientific computing. Maintainability is a far-away and fleeting thought for most code-writers in science. I'm definitely not arguing this should be the case, but without those three fundamental characteristics, no one is going to care what you wrote. I am firmly in the camp that maintainable code is much easier to make accurate, fast, and scalable, but that camp is a very tiny fraction of the people who write scientific code.

      For the sake of speed, managing your own memory is really not that big of an obstacle. Yes, something like FORTRAN* or C++ can be of great help here, but sometimes doing it yourself is necessary to squeeze the last drop out of the silicon. For example, there is a team at Argonne writing a cosmological simulation designed to model a trillion particles using 100k+ CPU cores. They wrote their own memory pool allocator simply because new/delete were too slow. They were getting such heavy CPU utilization that they managed to discover a defect in the fab process of the CPU sockets when they melted under water cooling. Even in less intensive applications, using a managed language has performance penalties that are too large to cope with.

      One thing I would like to see is the use of managed languages to sketch out programs before porting them to the heavy-lifting languages, so that people will focus on algorithms rather than implementation details.

      *FORTRAN90/95/2003/2008 only
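
      To illustrate the pool-allocator idea (a minimal sketch only; ParticlePool and Particle are invented names, not the Argonne code): grab one contiguous slab up front and hand out fixed-size slots from a free list, so per-particle new/delete never touches the heap.

      #include <cstddef>
      #include <vector>

      struct Particle { double x, y, z, vx, vy, vz; };

      class ParticlePool {
      public:
          explicit ParticlePool(std::size_t capacity) : storage_(capacity) {
              free_.reserve(capacity);
              for (std::size_t i = capacity; i-- > 0; )
                  free_.push_back(&storage_[i]);   // seed the free list
          }
          Particle* acquire() {                    // O(1), no heap traffic
              if (free_.empty()) return nullptr;
              Particle* p = free_.back();
              free_.pop_back();
              return p;
          }
          void release(Particle* p) { free_.push_back(p); }
      private:
          std::vector<Particle> storage_;          // one contiguous slab
          std::vector<Particle*> free_;            // LIFO free list of open slots
      };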

      posted in Side Bar WTF
    • RE: Astronomically bad

       @thatmushroom said:

      Here's the thing about academic-y types coding: the code is a means to an end, rather than the product itself.

      Granted, much of the software written in astronomy is one-off code that needs to perform some specific task and will likely never be used again. In that case, you can use Brainfuck for all I care. The code example I presented comes from a very widely-used simulation package that has 1800+ citations. A means to an end, it is not.

       

      posted in Side Bar WTF
    • RE: Astronomically bad

      @dkf said:

      @blakeyrat said:
      But *why*?
      It could be worse. It could've involved IDL. (Never used it? It's like Fortran decided to rape a scripting language. And I use that word deliberately. I hate IDL!)
       

      Haha, that is the best description of IDL I have heard. I am glad that I have finally found someone else who absolutely loathes that heinous language.

       

      posted in Side Bar WTF
    • RE: Astronomically bad

      @blakeyrat said:

      @communist_goatboy said:
      Most numerical software in astronomy is written in C or FORTRAN 77.

      But why? I have to admit I have no knowledge of FORTRAN, but I can't think of a worse language than C for this use.

      Is it just the same as the video games industry where they used C because they had a good reason to 15 years ago, and now that they no longer do they can't switch because they're all dinosaurs?

      A lot of it has to do with legacy code and with people who were doing this stuff in the 70s/80s, when FORTRAN 77 and C were the only real options for scientific computing. Why people today start new projects in these languages boils down to familiarity with the old tools and a complete lack of awareness of the right tool for the job. FORTRAN 77 should die in a fire. FORTRAN 95 or 2003 are actually pretty good languages for scientific computing: they have nice array syntax and several parallel computing platforms support them (e.g., OpenMP, MPI). The newer parallel platforms like TBB, OpenCL, and CUDA are oriented around C++, which I think has the best balance of abstraction and number-crunching power.

      The complexity of modern simulations really requires using multiple languages (compiled and interpreted!) layered and connected properly. But we both know that is never going to happen. Although it might keep TDWTF open for a few more years...
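
      As a rough illustration of what the C++ route buys you (a sketch only; the function and variable names are invented, and it assumes compilation with OpenMP enabled, e.g. -fopenmp), one pragma is enough to spread a plain number-crunching loop across cores:

      #include <cmath>
      #include <vector>

      // Rescale an array of values in parallel; each iteration is independent,
      // so OpenMP can split the loop across threads.
      void scale_energies(std::vector<double>& energy, double factor)
      {
          #pragma omp parallel for
          for (long i = 0; i < static_cast<long>(energy.size()); ++i)
              energy[i] = factor * std::sqrt(energy[i]);
      }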

      posted in Side Bar WTF
    • RE: Astronomically bad

      @blakeyrat said:

      They write software like that in C? Dear God.

      Most numerical software in astronomy is written in C or FORTRAN 77. I prefer bad C over good FORTRAN 77, personally. There are a few brave souls who have ventured into using C++. Remember how I mentioned this isn't the worst that I have seen? Yeah, C++ is a little too technical for some folks. My favorite WTF was using member templates to pass boolean parameters to a method in lieu of formal parameters. It's C++, we have to use templates!
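
      In case that sounds too strange to picture, here is a hedged reconstruction of the pattern (names invented, not the actual package): a compile-time template parameter standing in for what should simply be a bool argument.

      // The WTF flavor: callers write integrator.step<true>(dt) instead of
      // passing a boolean at the call site.
      struct Integrator {
          template <bool UseDoublePrecision>
          void step(double dt) { /* ... advance one timestep ... */ }
      };

      // The boring alternative it replaces:
      struct SaneIntegrator {
          void step(double dt, bool use_double_precision);
      };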

      posted in Side Bar WTF
    • Astronomically bad

      Modern theoretical astrophysics relies heavily on simulations to model astrophysical processes that take place on enormous timescales. The people who write the simulations are held in high esteem by the community at large. The code they contain... well, see for yourself. (Anonymized to protect the criminally stupid.)

      file_stuff.c

      #include "globals.h"

      void set_labels(void) {
          enum field_names i;

          for (i = 0; i < NUM_FIELD_NAMES; i++)
              switch (i) {
              case FIELD_1:
                  strncpy(labels[FIELD_1], "AAA ", 4);
                  break;
              case FIELD_2:
                  strncpy(labels[FIELD_2], "BBB ", 4);
                  break;
              case FIELD_3:
                  strncpy(labels[FIELD_3], "CC  ", 4);
                  break;
              case FIELD_4:
                  strncpy(labels[FIELD_4], "DDDD", 4);
                  break;
              case FIELD_5:
                  strncpy(labels[FIELD_5], "E   ", 4);
                  break;
              case FIELD_6:
                  strncpy(labels[FIELD_6], "FFF ", 4);
                  break;
              case FIELD_7:
                  strncpy(labels[FIELD_7], "GGGG", 4);
                  break;
              case FIELD_8:
                  strncpy(labels[FIELD_8], "HHH ", 4);
                  break;
              case FIELD_9:
                  strncpy(labels[FIELD_9], "III", 4);
                  break;
              case FIELD_10:
                  strncpy(labels[FIELD_10], "JJJJ", 4);
                  break;
              case FIELD_11:
                  strncpy(labels[FIELD_11], "KKKK", 4);
                  break;
              }
      }

      globals.h

      typedef enum field_names {FIELD_1, /*snip*/, FIELD_11} field_names;

      #define NUM_FIELD_NAMES 11

      extern char labels[NUM_FIELD_NAMES][4];

      globals.c

      #include "globals.h"

      char labels[NUM_FIELD_NAMES][4]; /*!< This table holds four-byte character tags used for file output */

       

      Given this setup, you might assume that labels is used throughout the program. It is used exactly once -- where set_labels() is called.

       I can't help but think that the person who wrote this (a most esteemed theoretical astrophysicist) had a thought process that went something like: "I need to populate an array of string literals. I know! I'll use an enum to make it easily accessible! But how do I put them together? I know! I'll use my trusty for-loop!"

      And this isn't the worst coding offense I have seen in astronomy software. At least they are using Doxygen comments.
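
      (For contrast, a sketch of the obvious alternative, reusing the names from the anonymized snippet above: one lookup table and a plain copy loop, no enum-driven switch. The table contents are just the placeholder tags from the example.)

      #include <string.h>
      #include "globals.h"

      static const char label_text[NUM_FIELD_NAMES][5] = {
          "AAA ", "BBB ", "CC  ", "DDDD", "E   ",
          "FFF ", "GGGG", "HHH ", "III",  "JJJJ", "KKKK"
      };

      void set_labels(void)
      {
          /* Copy four bytes per tag, exactly as the strncpy calls did. */
          for (int i = 0; i < NUM_FIELD_NAMES; i++)
              memcpy(labels[i], label_text[i], 4);
      }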

      posted in Side Bar WTF
    • Education costs are on the rise

      Just received an e-mail from my university's cashier's office. I'm not sure, but I think I can come up with the GDP of Timor-Leste by the beginning of time. Well, depending on what epoch they are using (it's PeopleSoft).

      The Minimum Payment listed on your most recent billing statement was not received by the due date. Please make a payment immediately to bring your account current. Failure to do so may prevent enrollment in future terms and suspension of other University privileges.
      Adjusted Amount Due: $ 0.00
      Minimum Payment Amount: $ 9210000000.00
      Payment Due Date: 00/00/0000

      posted in Side Bar WTF
    • RE: Metric vs Imperial, 30 years later

      TRWTF is that there is still any confusion between force (weight) and mass. The kilogram is a measure of mass and the pound is a measure of force. The SI unit of force is the newton (N) and the imperial unit of mass is the slug. Personally, I think we should use the slug more often. Perhaps the aversion to its usage lies in its homonymic counterpart: the slimy mollusk.
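
      (A one-line worked example of the relationship those units encode, using standard values rather than anything from the thread:)

      \[
      F = m\,a \;\Rightarrow\; 1\ \text{lbf} = 1\ \text{slug} \cdot 1\ \text{ft/s}^2,
      \qquad m = \frac{W}{g} = \frac{1\ \text{lbf}}{32.174\ \text{ft/s}^2} \approx 0.031\ \text{slug} \approx 0.454\ \text{kg}.
      \]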

      </soapbox>

      posted in Side Bar WTF
    • RE: Who turned out the light?

      @DaveK said:

      @rohypnol said:

      Now, I'm convinced there is a God and all that she said was true, because I simply can't find any scientific answer to the question "HOW CAN THIS PERSON STILL BE ALIVE?"

      She survives through the divine grace of the Flying Spaghetti Monster, as do we all, of course. RAmen!

      Sauce be upon you.

      posted in Side Bar WTF