Things that Dennis Ritchie got wrong.



  • DMR wrote C in the early seventies on a PDP-11 (its predecessor B ran on the PDP-7). And it worked on his machine. But nobody uses PDP-11s any more, and a lot of seventies-era assumptions about how machines operate are invalid now.

    Looking back at C now, what are the things about it that have most grievously damaged subsequent languages? A few examples I'd consider:

    • Casting syntax—I think this is probably my least favourite C misfeature. If you have a C program which crashes, there's a 75% chance that a cast is involved somewhere, so you're guaranteed to want to know where all the casts are in your codebase. Good luck with that. I don't think it's possible to arrive at a syntax which is more search-hostile than (type)value. C++ eventually responded with [i]something[/i]_cast<type>(value) syntax which at least gives you keywords you can search for, but it was too late, and the C cast syntax lives on to this day in languages like Java, C#, Objective-C; even PHP has some horrendous mockery of it. It's still as hard to find casts as it ever was. Urgh.
    • switch statement madness: just going to drop the Duff's Device bomb here and move on. Having a keyword to indicate that you want to fall through from one case block to the next is not inherently a bad idea; having that as the default behaviour is crazypants. Still alive and well in at least PHP, JavaScript and C++. break;
    • Optional braces—with if and while statements, the parentheses around the conditional should be optional rather than the braces around the execution blocks, so we'd all be dealing with code that looked like this:
      
          if a > b {
              do_some_stuff();
          }
              do_more_stuff(); // how do I indentation?
      
      and a whole class of errors might never have arisen:
      
          if(a > b)
              do_some_stuff();
              do_more_stuff(); // haha, maintenance guy, sucks to be you :P
      
    I'm sure there are others. The point is not to rag on C [i]per se[/i], but to point out the things that other languages do badly today because C did them in ways that made sense 40 years ago.
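
    For anyone who hasn't had the bomb dropped on them yet: Duff's Device interleaves a switch with a do/while, jumping into the middle of the loop body. It only compiles because cases fall through by default. A minimal sketch (the function name is mine):

```c
#include <stddef.h>

/* Duff's Device: an unrolled copy loop whose switch jumps into the
 * middle of the do/while body. Legal only because C case labels fall
 * through by default. Assumes count > 0. */
void duff_copy(char *to, const char *from, size_t count)
{
    size_t n = (count + 7) / 8;        /* number of do/while passes */
    switch (count % 8) {
    case 0: do { *to++ = *from++;
    case 7:      *to++ = *from++;
    case 6:      *to++ = *from++;
    case 5:      *to++ = *from++;
    case 4:      *to++ = *from++;
    case 3:      *to++ = *from++;
    case 2:      *to++ = *from++;
    case 1:      *to++ = *from++;
               } while (--n > 0);
    }
}
```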

    Colossal nerd bonus: some source code from paleolithic C compilers written by DMR circa 1972-3.



  • And all of your complaints are about syntax. Would you prefer everything looking like LISP instead?

    BTW, C# has all those things you dislike and is our favorite language.
    http://what.thedailywtf.com/t/closed-poll-which-language-is-the-least-bad/3557/1



  • @aliceif said:

    And all of your complaints are about syntax.

    Feel free to complain about semantics if you like. (Lack of) bounds checking, although most languages got wise to that? (Lack of) GC? There's a reason we have a lot of C-like languages, and hardly any Smalltalk-like languages...

    @aliceif said:

    Would you prefer everything looking like LISP instead?

    Lisp is OK if you have a decent editor. It didn't really come up in my analysis due to it being older than C and not really influenced by it.

    @aliceif said:

    BTW, C# has all those things you dislike and is our favorite language.

    C# is OK. (I prefer C++ personally, but hey, I spent 15 years as a C++ developer.) I thought they 'fixed' the switch/case/break business though?

    I'm interested to see what other people think are their (un)favourite parts of C's legacy. I just outlined a few examples to set the ball rolling...



  • @tar said:

    Casting syntax

    Hmm, I was never bitten too hard by this. But then again, most of my statically typed programming was in C#, so casting wasn't an important feature for me.

    @tar said:

    switch statement madness

    My head hurts from the Duff's Device stuff. Yes, switch is crap.

    Also, why do all the conditions have to be compile-time constants?

    Suck.

    @tar said:

    Optional braces

    Personally I have moved to Egyptian style braces, but a lot of people who use newline codeblocks like to leave out braces for one-line blocks. And I can understand them - it sucks having to use up 3 lines for 1 line of code (which is why Egyptian style is superior IMO).

    One caveat here. IMO this should be ok too:

    if (condition) action;
    

    I'm not big into C, but one thing that annoys me is having to split up my code into header and source files. I remember reading it was some kind of optimization for '70s-era computers. But why the fuck are the C* guys still using it today!?


  • FoxDev

    @tar said:

    C# is OK. (I prefer C++ personally, but hey, I spent 15 years as a C++ developer.) I thought they 'fixed' the switch/case/break business though?

    They did: every case block has to end with either a break, return, throw, or goto, which means falling through has to be an explicit operation, in this case goto case [condition].

    Of course, that means you have a language with goto... and yes, you can use it to write spaghetti code 😒



  • @RaceProUK said:

    every case block has to end with either a break, return, throw, or goto

    This is valid C# and compiles:

    using System;
    class Program
    {
        public static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
                
            // TODO: Implement Functionality Here
            int[] testArray = { 2, 3, 4 };
                
            foreach (var i in testArray) {
                switch (i) {
                    case 2:
                    case 3:
                        Console.WriteLine("smaller than 4!");
                        break;
                    case 4:
                        Console.WriteLine("Lucky!");
                        break;
                }
            }
                            
                
            Console.Write("Press any key to continue . . . ");
            Console.ReadKey(true);
        }
    }
    

    Guess what the output is:
    [spoiler]Hello World!
    smaller than 4!
    smaller than 4!
    Lucky!
    Press any key to continue . . .
    [/spoiler]

    Falling through does exist.



  • I absolutely hate C, but all your complaints can be solved with a good IDE or a postprocessor.


  • FoxDev

    Two labels, one case block 😜


  • ♿ (Parody)

    @aliceif said:

    Falling through does exist.

    In that case (hah!) it's really putting multiple values on a single block of code.

    @cartman82 said:

    One caveat here. IMO this should be ok too:

    if (condition) action;

    I am inconsistent. Sometimes I do that or I split it across lines like normal or I put braces there on the single line:

    if (condition){ action; }
    

    Loops always get braces, however. That shit's just askin' for trouble.


  • Java Dev

    I'm in the newline braces camp, but I add in the following pattern:

    if( condition )
        { action; }
    

    Something I find can get annoying sometimes is break/continue in nested loops. Was it Perl that fixed that by letting you label your loop and specify the label on the break/continue?
    I know Perl is the one that has redo/next/last instead of continue/break, which also seems saner to me at first glance.
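
    C itself never grew labelled break (Java and Perl did); the idiomatic escape from a nested loop in C is a forward goto. A sketch, where the function and names are illustrative rather than from anyone's post:

```c
#include <stddef.h>

/* C has no labelled break, so the usual way out of a nested loop is a
 * forward goto (roughly what 'last OUTER;' does in Perl). All names
 * here are illustrative. */
int find_in_grid(const int *grid, size_t rows, size_t cols, int wanted,
                 size_t *out_r, size_t *out_c)
{
    for (size_t r = 0; r < rows; r++) {
        for (size_t c = 0; c < cols; c++) {
            if (grid[r * cols + c] == wanted) {
                *out_r = r;
                *out_c = c;
                goto found;   /* bail out of both loops at once */
            }
        }
    }
    return 0;   /* not found */
found:
    return 1;
}
```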


  • FoxDev

    @aliceif said:

    Falling through does exist.

    and is evil!



  • @tar said:

    // haha, maintenance guy, sucks to be you :P

    Well, that's not the language's problem, but a problem with the idiot who writes badly indented code.

    Personally, I find the

    if (a == b) doStuff();
    if (a == c) doOtherStuff();
    

    syntax clean enough.



  • As a programmer of Macs with 16-bit ints, not defining the actual size of his "int" and "long" types was a constant pain in the fucking ass.



  • No it isn't. Sometimes it's downright necessary to help combine code.


  • FoxDev

    then be explicit about it.

    C# got that one right.

    switch (a) {
        case 0:
        case 1:
           dosomething1();
           goto case 2;
        case 2:
           dosomething2();
           break;
        case 3:
           dosomething3();
           break;
        case 4:
           dosomething4();
           goto default;
         default:
            dodefault();
            break;
    }
    

  • FoxDev

    @accalia said:

    and is evil!

    Only if used incorrectly; I find case fall-through quite a useful feature. Exploited it quite nicely in a timesheet quick-entry dashboard tile thing I've been working on recently 😜

    @accalia said:

    C# got that one right.

        case 0:
        case 1:
           dosomething1();
           goto case 2;
        case 2:
           dosomething2();
           break;
        case 3:
           dosomething3();
           break;
        case 4:
           dosomething4();
           goto default;
         default:
            dodefault();
            break;
    }
    Bit like that, just without the Discoformatting :laughing:

  • FoxDev

    @RaceProUK said:

    Only if used incorrectly;

    case fallthrough is evil because it hides bugs.

    it is difficult to know whether you wrote the case fall-through on purpose or it's a bug, which makes the program hard to reason about. but make it a syntactic rule that you have to be explicit and i can reason about the program better: because you had to explicitly enter a different case, i know you did it on purpose rather than just forgetting a break;

    EDIT: and no, comments aren't enough. we all know comments are not updated often enough to be reliable.



  • Your example still has a form of fall through. ;)


  • FoxDev

    yes, it does. Explicit fallthrough, not implicit fallthrough.

    or are you nitpicking about having multiple case statements? if the latter.... really? you're going to that level?



  • No, I'm just trying to point out that you're arguing against this:

    switch (tango)
    {
        case 1:
            DoSomething1();
        case 2:
            DoSomething2();
            break;
    }
    

    The thing is: NO ONE is arguing in support of that form of fall-through.

    The fall-through I like to use is multiple case labels for a single case block. Anything else that needs to be shared gets a method.


  • Java Dev

    Implicit fallthrough (except with double labels) is what causes bugs.



  • One of the biggest things Ritchie got wrong was to kick the bucket the same month Steve Jobs did, so that the media did not give him the recognition he deserved.



  • @cartman82 said:

    I'm not big into C, but one thing that annoys me is having to split up my code into header and source files. I remember reading it was some kind of optimization for '70s-era computers. But why the fuck are the C* guys still using it today!?

    The header says “this is what my code provides” and the source says “this is how it does it - no need to look inside - details may change without notice”. Seems good to me.
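
    That split looks something like this in practice. A hypothetical 'counter' module, squashed into one listing for the post:

```c
/* counter.h (hypothetical): the interface callers get to see */
#ifndef COUNTER_H
#define COUNTER_H
int counter_next(void);   /* declaration only; no implementation details */
#endif

/* counter.c: the implementation; free to change without notice */
static int count = 0;     /* 'static' keeps this private to the file */

int counter_next(void)
{
    return ++count;
}
```

    Callers #include the header and relink; as long as the declarations don't change, they never need recompiling against the new internals.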



  • OK, apologies up front for anyone's source code examples which get discohorsed...

    @cartman82 said:

    One caveat here. IMO this should be ok too:

    if (condition) action;

    I guess that works in C# where you can put a breakpoint on parts of a line. Never found a C/C++ debugger that'd let you do that, so I have a strong experiential bias towards each expression/statement on its own line.

    @cartman82 said:

    I'm not big into c, but one thing that annoys me is having to split up my code into header and source files.

    That's a good point, although Java/C# fixed this from the get-go. C++ [i]still[/i] hasn't managed to get a module system through the standards committee yet. Let's say that the '70s-era C preprocessor gave macros a bad name and is probably the reason that basically no language designed between 1980 and 2004 offers any macro facilities whatsoever. (Looking at D and Rust, it looks like macros are finally starting to make a comeback. Yay!)

    @anonymous234 said:

    I absolutely hate [anything at all], but all your complaints can be solved with a good IDE or a postprocessor.

    True, now, but can you imagine the world of rainbows and unicorns we'd inhabit if these problems had never existed in the first place?

    @Maciejasjmj said:

    Well, that's not the language's problem, but one of the idiot who writes badly indented code.

    Also a valid point, but I've probably spent over a decade of my life fixing and maintaining code written by idiots, and removing one idiocy vector from the language would've made my life marginally easier.

    @blakeyrat said:

    defining the actual size of his "int" and "long" types was a constant pain in the fucking ass.

    I did consider this on the original list as it is incredibly irritating, but I wasn't sure if that many languages (other than C++) had mindlessly imported this behaviour—Java perhaps, but C# at least gives you sane integer types doesn't it?

    @abarker said:

    switch (tango)
    {
        case 1:
            DoSomething1();
        case 2:
            DoSomething2();
            break;
    }

    What would have made me happy is if this was expressed like this:

    switch(tango) {
        case 1:
            DoSomething1();
            continue;
        case 2:
            DoSomething2();
        default:
            // case 3-n handled here...
    }
    

    So that you have fallthrough if you explicitly ask for it, but otherwise the default is to jump out of the switch at the end of the case block.

    @abarker said:

    Anything else that needs to be shared gets a method.

    Actually good advice in most cases, not just switch statements.

    @Groaner said:

    the same month Steve Jobs did

    True dat : (
    DMR was smarter, handsomer(?), more influential and beardlier than I could ever hope to be, and even his mistakes were a better class of mistakes than I would have made in the same position.



  • @tar said:

    but I wasn't sure if that many languages (other than C++) had mindlessly imported this behaviour

    Oh maybe I misunderstood the thrust of the topic.

    @tar said:

    but C# at least gives you sane integer types doesn't it?

    Yeah, and C99 kind of fixed it by adding stdint.h. That didn't help my pain as a youngster trying to learn software development by porting code examples in books and newsgroups and all of those pre-Internet sources.



  • @sjw said:

    The header says “this is what my code provides” and the source says “this is how it does it - no need to look inside - details may change without notice”. Seems good to me,

    In other words, half-assed interfaces.



  • @blakeyrat said:

    Yeah, and C99 kind of fixed it by adding stdint.h. That didn't help my pain as a youngster trying to learn software development by porting code examples in books and newsgroups and all of those pre-Internet sources.

    <stdint.h> FTW, I wish that C99 had gained more traction, as it really does address quite a few of C89's flaws. Even to this day, my company prefers to get some intern to type out a long series of typedefs rather than just use stdint. It's kind of annoying. ("Yeah, but maybe we'll need to change the size of company::Int64 sometime in the future...")
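
    A taste of what <stdint.h> buys you; the company_Int64 typedef below is my stand-in for the intern's hand-rolled list, not anyone's real code:

```c
#include <stdint.h>

/* C99 <stdint.h> provides the fixed-width types companies used to
 * hand-roll. company_Int64 stands in for the intern's typedef list. */
typedef int64_t company_Int64;

uint8_t       tiny   = 255;  /* exactly 8 bits, unsigned, wraps past 255 */
int32_t       pixels = -1;   /* exactly 32 bits on every platform */
int_least16_t small  = 0;    /* "at least 16 bits", for portability */
int_fast32_t  ctr    = 0;    /* "fastest type with >= 32 bits" */
```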



  • @tar said:

    Yeah, but maybe we'll need to change the size of company::Int64 sometime in the future...

    When INGSOC introduce a new decimal system?


  • Discourse touched me in a no-no place

    @tar said:

    I did consider this on the original list as it is incredibly irritating, but I wasn't sure if that many languages (other than C++) had mindlessly imported this behaviour—Java perhaps, but C# at least gives you sane integer types doesn't it?

    FWIW, Java precisely defines the width of short, int and long to be 16 bits, 32 bits and 64 bits. It also prescribes that they're always signed.



  • @aliceif said:

    When INGSOC introduce a new decimal system?

    When [i]seveighten[/i] finally becomes a thing:

    0, 1, 2, 3, 4, 5, 6, 7, ⅞, 8, 9
    

    @dkf said:

    Java precisely defines the width of short, int and long to be 16 bits, 32 bits and 64 bits. It also prescribes that they're always signed.

    Can i haz unsigned? No? Wat. Do not want...



  • I think switches should have default fallthrough, given that it's basically thin syntax over normal CPU behavior. Switches with default break already exist: they're chains of else-if.

    Perhaps support for a "goto next label" keyword and a compiler option to enforce it would be good for groups of devs wanting that kind of thing, though.
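
    Compilers did eventually grow more or less exactly this: GCC and Clang can warn on implicit fall-through (-Wimplicit-fallthrough) and accept an explicit marker, which C23 standardises as [[fallthrough]]. A sketch using the GNU attribute spelling (function and values are mine):

```c
/* Explicit fall-through, GNU-style: with -Wimplicit-fallthrough the
 * compiler flags any case that falls through without this marker.
 * (C23 spells it [[fallthrough]];) */
int classify(int i)
{
    int score = 0;
    switch (i) {
    case 2:
        score += 1;
        __attribute__((fallthrough));   /* deliberate: 2 also scores as 3 */
    case 3:
        score += 10;
        break;
    case 4:
        score += 100;
        break;
    }
    return score;
}
```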


  • Fake News

    That reads like a long "Get off my lawn!" to me.



  • @JBert said:

    That reads like a long "Get off my lawn!" to me.

    I am and have been a dirty old man at heart for most of my adult life. To be fair, I think a true "Get off my lawn!" would never have conceded the possibility of a new keyword with compiler support.

    In my day, we fell through cases uphill both ways in the snow!



  • The "implementation-defined" behaviour of "int" and "unsigned int" is useful.

    On sane compilers they are the native bitness of the target CPU, and it's useful to have a type of that nature - drops straight into a register etc.

    I agree with the whole "just use stdint" though. There's been no reason to do anything else for at least a decade.

    The really daft ones are company-specific definitions of the form int1, int2, int4 etc.
    Those do annoy me unnecessarily.



  • switch a {
    case 0, 1:
        dosomething1()
        fallthrough
    case 2:
        dosomething2()
    
    case 3:
        dosomething3()
    
    case 4:
        dosomething4()
        fallthrough
    default:
        dodefault()
    }
    

    Much cleaner IMHO.



  • @tar said:

    Casting syntax—I think this is probably my least favourite C misfeature. If you have a C program which crashes, there's a 75% chance that a cast is involved somewhere, so you're guaranteed to want to know where all the casts are in your codebase.

    Yeah, I would love actual credible statistics on that. I can hardly believe it. I can believe NULL checks (or the lack of them) being the real major cause, however...

    @tar said:

    switch statement madness

    If you are afraid of the default fall-through behavior, perhaps you should be using normal if statements?

    @tar said:

    Optional braces

    There is nothing wrong with optional braces. There are plenty of instances where I have

    if (some error condition)
        goto error_cleanup;
    

    If I ever do

    if (some important condition)
        some_important_function();
    

    then that would be stupid.

    This is entirely a code style/policy issue.



  • @delfinom said:

    :angry:

    ... well that sure showed me.

    With respect to the topic, is there anything about C that you would've changed that might've made the lives of today's C++/C#/Java/JavaScript/PHP/&c. programmers easier?

    Or is C perfect in every respect and any talk about improving it is just naïve blab?



  • @cartman82 said:

    One caveat here. IMO this should be ok too:

    if (condition) action;

    IMO that should be the only form ever used for unbraced actions; if the condition and action don't fit naturally and easily on a single line, braces please.



  • @lightsoff said:

    The really daft ones are company-specific definitions of the form int1, int2, int4 etc.

    Only valid excuse for those, and it's a very thin one, is that it's legacy code that's existed since before stdint and not had budget for refactoring.



  • @tar said:

    Optional braces—with if and while statements, the parantheses around the conditional should be optional rather than the braces around the execution blocks,

    Perl gets this right. The braces are not optional, and using postfix syntax the parens are optional. They are properly optional in Perl 6.



  • @tar said:

    ... well that sure showed me.

    With respect to the topic, is there anything about C that you would've changed that might've made the lives of today's C++/C#/Java/JavaScript/PHP/&c. programmers easier?

    Or is C perfect in every respect and any talk about improving it is just naïve blab?

    There's always the way octal literals are written, which is fucking dangerous. A leading 0 in front of a number means it's octal, which is insane.

    #define CONSTANT1 0x124   /* hex: 292 */
    #define CONSTANT2 0124    /* octal: 84, not 124! */
    #define CONSTANT3 124     /* decimal: 124 */



  • I don't mind C, I end up inside our C codebase far more often than our C++ or .NET stuff. Really the only thing I'd change is the requirement that all variable declarations come before all code. Surely it would be simple for a compiler to re-arrange your code to do that for you at compile-time so I don't have to have temp variables clogging up the start of a function.



  • @mott555 said:

    I don't mind C, I end up inside our C codebase far more often than our C++ or .NET stuff. Really the only thing I'd change is the requirement that all variable declarations come before all code. Surely it would be simple for a compiler to re-arrange your code to do that for you at compile-time so I don't have to have temp variables clogging up the start of a function.

    That's an MSVC requirement, because they only implemented ANSI C (C89) for the longest time. The rest of us using GCC don't have that problem :P C99/GNU99 ftw.

    Edit: Supposedly VS 2013 and beyond should support C99 variable declarations now if you switch it to C99 mode.



  • @delfinom said:

    Edit: Supposedly VS 2013 and beyond should support C99 variable declarations now if you switch it to C99 mode.

    In my industry it'll be another 15 - 20 years before we're allowed to move on to C99. 20 years from now our customers will probably still be running VS2005 😦.



  • The name C isn't very Google-able.

    #troll



  • @mott555 said:

    Surely it would be simple for a compiler to re-arrange your code to do that for you at compile-time so I don't have to have temp variables clogging up the start of a function.

    You did know that C variables are scoped by block, not function? There's nothing but an arbitrary style guide stopping you from writing code like

    if (a > b) {
        int temp = a;
        a = b;
        b = temp;
    }
    

    and being able to be confident that temp hasn't stomped anything else. Variable scope also extends from point of declaration to the end of the innermost block containing the declaration, so again there is nothing but a style guide stopping you from putting code that doesn't use a given variable before its declaration.


  • ♿ (Parody)

    @flabdablet said:

    There's nothing but an arbitrary style guide stopping you from writing code like

    Or even just:

    {
        int temp = a;
        a = b;
        b = temp;
    }
    


  • @flabdablet said:

    You did know that C variables are scoped by block, not function?

    I'm aware. That reduces, but does not eliminate, my complaint.


  • Java Dev

    There is the fact that GCC isn't very good at generating DWARF debug info for those constructs, meaning you can't see the values of those variables in gdb.

    Same happens when gcc inlines a function, but that's fixable with an unoptimized compile.



  • @blakeyrat said:

    The name C isn't very Google-able.

    You think that's bad? Try Googling "Go". Or the "Source" engine.

