Increment WTF



  • @flabdablet said:

    Shocking as it may seem, people are still building stuff with 8051 and 6502 and PIC cores, though often embedded in some kind of SoC.
    Yep. The project I'm working on now has an ARM, but the one before that was an SoC with an embedded 8051.

    By an interesting coincidence, my first quasi-professional job, an internship while I was in university, was writing a simulation model of a then-new version of the 8051.



  • @HardwareGeek said:

    The project I'm working on now has an ARM, but the one before that was an SoC...

Too bad it wasn't a FOOT or you could have used that SoC to keep warm.



  • @morbiuswilters said:

    @HardwareGeek said:
    The project I'm working on now has an ARM, but the one before that was an SoC...

Too bad it wasn't a FOOT or you could have used that SoC to keep warm.

    I can't believe you were paid $1.17 for posting such weak content. I think people should submit their resume to replace you as the Poster in Residence at TDWTF.


    Please send cover letter, resume and price per post expectations at jobs@thedailywtf.com.



  • @Ronald said:

    @morbiuswilters said:
    @HardwareGeek said:
    The project I'm working on now has an ARM, but the one before that was an SoC...

Too bad it wasn't a FOOT or you could have used that SoC to keep warm.

    I can't believe you were paid $1.17 for posting such weak content. I think people should submit their resume to replace you as the Poster in Residence at TDWTF.


    Please send cover letter, resume and price per post expectations at jobs@thedailywtf.com.

No no no! Okay, they can cut my pay to $0.99/post. Honestly, $1.17 was just being greedy.



  • @Ronald said:

    That's a fact, my friend who has TWO PhD from a slightly more prestigious university than the ETH in Zürich told me so.
    Which university would that be? Because there aren't that many.

    @Ronald said:

    Also he's not a whiny name-dropping pussy so he's got more credibility than someone with a proven history of not having his own opinions who's expecting that quoting unknown and uninteresting people that he admires will bring any kind of weight to his posts.
    Well, let me tell you a secret. There are these tools on the internet, called "Google" and "Wikipedia", with which you can read up on the subject, which in fact I did. He helped me on the way, but if you could get your head out of your own arse for a second and look these things up, you'd see that C++ is pretty much the only thing when it comes to HPC.

    Not that I'm happy with that; I too hate C++.

    @blakeyrat said:

    @Severity One said:
    Said friend has a PhD from the ETH in Zürich, so I trust his word quite a bit more than someone with a proven history of making a fool of himself on this site.

    Since when do PhD's know how to write software?

    Well, the PhD is relevant to the subject matter, 3D imaging, if I remember correctly.



  • @flabdablet said:

    C is not really a cock up: it's a portable assembly language designed to drag about as much performance from a CPU as can be achieved without resorting to its architecture-specific assembly language*, and as such it's best used by people with a good understanding of what the underlying machine is actually doing.
     

    What the underlying machine is doing and what the compiler is doing are two different things. And when you get undefined behavior from a()+b(), how is it not a cock-up?


  • Discourse touched me in a no-no place

    @The_Assimilator said:

    And every new C++ "standard" adds another layer of warts that are a different size, colour, and texture to the old warts.
    auto. No, not that auto.



  • @PJH said:

    @The_Assimilator said:
    And every new C++ "standard" adds another layer of warts that are a different size, colour, and texture to the old warts.
    auto. No, not that auto.

    I guess they figured "Well, auto is basically useless in C, right? And it's easier than trying to come up with a reserved keyword that might conflict with some label somebody has used in their code."



  • @Severity One said:

    @Ronald said:

    That's a fact, my friend who has TWO PhD from a
    slightly more prestigious university than the ETH in Zürich told me
    so.
    Which university would that be? Because there aren't that
    many.

    @Ronald said:

    Also he's not a whiny
    name-dropping pussy so he's got more credibility than someone with a
    proven history of not having his own opinions who's expecting that
    quoting unknown and uninteresting people that he admires will bring any
    kind of weight to his posts.
    Well, let me tell you a secret.
    There are these tools on the internet, called "Google" and "Wikipedia",
    with which you can read up on the subject, which in fact I did. He
    helped me on the way, but if you could get your head out of your own
    arse for a second and look these things up, you'd see that C++ is pretty
    much the only thing when it comes to HPC.

    Not that I'm happy with that; I too hate C++.

    @blakeyrat said:

    @Severity One said:
    Said friend has a PhD from the ETH in Zürich, so I trust his word quite a bit more than someone with a proven history of making a fool of himself on this site.

    Since when do PhD's know how to write software?

    Well, the PhD is relevant to the subject matter, 3D imaging, if I remember correctly.

    Do you know what you achieved with this reply? Nothing. You did not bring up any relevant fact and you did not manage to insult people in a significant manner. So I'm recycling your useless post into a fun and exciting game. Look carefully at the text above to find a secret message hidden in the crap.



  • @morbiuswilters said:

    I guess they figured "Well, auto is basically useless in C, right? And it's easier than trying to come up with a reserved keyword that might conflict with some label somebody has used in their code."

    I'm pretty sure that's exactly why they picked the keyword auto for that.



  • @Ronald said:

    So I'm recycling your useless post into a fun and exciting game. Look carefully at the text above to find a secret message hidden in the crap.
     

    I'll need to use my special decoder ring for that.



  • ♿ (Parody)

    @The_Assimilator said:

    Don't use basic language features because said features were designed in the same way that I design my bowel movements after a good curry?

Yeah, that's the problem with a language that lets you write both very low-level and very high-level code.



  • @Faxmachinen said:

    when you get undefined behavior from a()+b(), how is it not a cock-up?

Including the concept of sequence points in a language is a legitimate design decision, not a cock-up. It opens up optimization opportunities not available to languages like Java and C#, whose precedence and associativity rules apply to subexpression execution order as well as to overall result evaluation.

    You don't get undefined behavior from a() + b() unless both those function calls involve order-dependent side effects on some shared state. Given that you're supposed to understand about sequence points before writing code in this language, if you get bitten by this it's your own code that's cocked up. To my way of thinking, it's in the same family of cock-ups that parallel programming newbies make before they learn about mutexes.

If you care about the order in which subexpressions are evaluated, and you've checked that this isn't just because you're Doing It Wrong, you can make them separate statements. This clues in your readers, as well as the compiler, that order-dependent effects should be taken into account to make sense of the code.

    The C equivalent for the snippet originally posted is

    printf("\n\t%d %d %d %d %d\n\t", lolNum++, lolNum++, lolNum++, lolNum++, lolNum++);
    It's now clear that all the lolNum++ expressions are indeed just function parameters and have no sequence points separating their evaluation. This is definitely Doing It Wrong and needs to be replaced by something like
    printf("\n\t%d %d %d %d %d\n\t", lolNum, lolNum+1, lolNum+2, lolNum+3, lolNum+4);
    lolNum += 5;
    which makes the intent perfectly clear and doesn't rely on subexpression ordering rules for correctness.
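    A runnable version of that safe rewrite, as a sketch (the starting value of lolNum is assumed for illustration):

    ```c
    #include <assert.h>
    #include <stdio.h>

    int main(void) {
        int lolNum = 7; /* starting value is an assumption for illustration */

        /* Every argument only reads lolNum, so the result no longer
           depends on the (unspecified) argument evaluation order. */
        printf("\n\t%d %d %d %d %d\n\t",
               lolNum, lolNum + 1, lolNum + 2, lolNum + 3, lolNum + 4);

        /* The single increment now happens at a well-defined point. */
        lolNum += 5;
        assert(lolNum == 12);
        return 0;
    }
    ```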



  • @flabdablet said:

    You don't get undefined behavior from a() + b() unless both those function calls involve order-dependent side effects on some shared state.
     

    You're saying that in a "predictable" language, a() has to guarantee to finish before b() is called, and this absolutely precludes an optimization such as putting both function calls on their own core/thread?

    Just making sure.



  • @dhromed said:

    this absolutely precludes an optimization such as putting both function calls on their own core/thread?

I'm not aware of a C compiler that emits opportunistically multithreaded code, and most expression evaluation that benefits from re-ordering doesn't involve function calls; the parallelism comes from the fact that the CPU is heavily pipelined and probably superscalar, and therefore capable of running multiple instructions at once inside the same core even for a single thread.

    The kind of optimizations I had in mind involve rearranging the order of instructions and organizing register allocation and re-use in ways that minimize pipeline stalls. One way to do that is to evaluate subexpressions in parallel by interleaving the instructions that implement them, so it helps if the language semantics give the compiler that freedom.

Modern x86 processors do a fair bit of that stuff anyway behind the scenes, but compilers can still help by emitting code that minimizes the amount of instruction re-ordering the hardware has to do. It's more important on simpler architectures like ARM or things like DSPs and GPUs.

    I guess you might occasionally find two functions being called in parallel via instruction interleaving, provided they were both simple enough to be inlined and neither had much in the way of control flow.



  •  @morbiuswilters said:

    @OldCrow said:
    You use C++ when you have enough RAM to waste for the implicitly added object pointers but still may need to count instruction cycles.

    I think the only time you'd choose C++ over C is when you have extreme brain damage.

So I take it you haven't heard of the Arduino Mega, or as I like to call it, The Glorified ATMega2560 Demo Board. 256KB flash, 8KB RAM (if memory serves), 20MHz of 8bit calculations, 4 USARTs, 4 16bit timers, 4 8bit PWM outputs, and a load of other goodies.

Now, typically in production use this chip will be attached to a host of external ICs as well. And all of these require their own handling code, so that the flash is usually more than half full. Are you really sure that you'd like to try and tackle this one with plain C? Without the luxuries of, say, classes and their neat encapsulation mechanism? It's not impossible, I know. But do you really want to try and emulate the object-oriented approach with structs and pointers?



  • @OldCrow said:

    Without the luxuries of, say, classes and their neat encapsulation mechanism?

    Ha!



  • @morbiuswilters said:

    @OldCrow said:
    Without the luxuries of, say, classes and their neat encapsulation mechanism?

    Ha!

     

    Show me another language that does that and still compiles to native AVR assembly. And actually fits into the space given.



  • @OldCrow said:

    @morbiuswilters said:

    @OldCrow said:
    Without the luxuries of, say, classes and their neat encapsulation mechanism?

    Ha!

     

    Show me another language that does that and still compiles to native AVR assembly. And actually fits into the space given.

    Ada.



  • @OldCrow said:

    @morbiuswilters said:

    @OldCrow said:
    Without the luxuries of, say, classes and their neat encapsulation mechanism?

    Ha!

     

    Show me another language that does that and still compiles to native AVR assembly. And actually fits into the space given.

    I was laughing at your claim that C++ classes were a luxury and provided "neat encapsulation".



  • Not to defend C++, I work with it and it provides its share of frustrations (and yes, I'm a video game programmer for handheld devices, though I'm not the one choosing the tools or libraries or anything), but this particular complaint doesn't make sense to me. I mean, I've probably been drinking the kool-aid too long, but it seems like many of the undefined border cases would be dumb things to do in any language.

    Take i = i++. i++ is already incrementing the value, so why would you then assign it to itself? The operator is problematic, but you wouldn't do i = i = i + 1 in another language, even if the behavior was what you expected. You wouldn't look at code littered with random assignments and think "This is good code."

Similarly, the complaints about the order of evaluation being unspecified in c = a() + b(). I might have been working on C++ for too long, but it seems to me that you wouldn't want a() to alter the behavior of b() in any language. That is to say, a() + b() should give the same result as b() + a(). If you expect a() to impact b(), then you should let other people looking at the code know by calling a(), storing the result, calling b(), storing that result, and finally adding both results. That way people looking at the code know you are counting on side-effects separate from just the return value of the function.

    I suppose a counter argument is that a better language would make writing shitty code more difficult, and wouldn't then break your legs when you do write shitty code.

As for C or C++ making writing compilers easier, the hell? I thought C++ compilers were a pain because of the context-dependent syntax, which is nigh-unparseable. Quick, what does "A a();" mean? And macros. Fucking macros. Everywhere.



  • @Kian said:

    Similarly, the complaints about the order of evaluation being unspecified in c = a() + b(). I might have been working on c++ for too long, but it seems to me that you wouldn't want a() to alter the behavior of b() in any language.

    In Microsoft SSIS or SSRS, expressions often rely on the Iif(condition, ifTrue, ifFalse) function, and to do anything useful you often have nested calls. Without order of evaluation lots of things would be impossible.



  • @Ronald said:

    @Kian said:
    Similarly, the complaints about the order of evaluation being unspecified in c = a() + b(). I might have been working on c++ for too long, but it seems to me that you wouldn't want a() to alter the behavior of b() in any language.

    In Microsoft SSIS or SSRS, expressions often rely on the Iif(condition, ifTrue, ifFalse) function, and to do anything useful you often have nested calls. Without order of evaluation lots of things would be impossible.

So basically, someone saw the ?: operator in C and thought they could replicate it by writing a function. So now the "equivalent" function evaluates everything (making expressions like x==0?0:1000000/x cause crashes) and returns a generic object instead of some useful type.
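    Ben L.'s point, sketched in C (the iif function here is a hypothetical stand-in for the SSRS Iif, just to show the difference):

    ```c
    #include <assert.h>

    /* A hypothetical C stand-in for an Iif() function: like any ordinary
       function call, BOTH value arguments are fully evaluated before the
       call happens. */
    static int iif(int cond, int ifTrue, int ifFalse) {
        return cond ? ifTrue : ifFalse;
    }

    int main(void) {
        int x = 0;

        /* Safe: ?: evaluates only the chosen branch, so 1000000 / x is
           never executed when x == 0. */
        int safe = (x == 0) ? 0 : 1000000 / x;
        assert(safe == 0);

        /* iif(x == 0, 0, 1000000 / x) with x == 0 would divide by zero
           before iif() even ran, because arguments are evaluated first.
           With a nonzero x it behaves as expected: */
        x = 4;
        assert(iif(x == 0, 0, 1000000 / x) == 250000);
        return 0;
    }
    ```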



  • @Ronald said:

    In Microsoft SSIS or SSRS, expressions often rely on the Iif(condition, ifTrue, ifFalse) function, and to do anything useful you often have nested calls. Without order of evaluation lots of things would be impossible.

C has the (condition? ifTrue: ifFalse) construct for this exact job: there is a sequence point after condition, and only one of ifTrue or ifFalse will be evaluated accordingly. Personally I don't consider the inability to emulate this already confusing operator with an even uglier user-defined function to be a language wart.

    There is also a sequence point after any expression to the left of an && or || logical operator, allowing e.g. tests like

    if (i < limit && lookup[i])
    to be used safely for array bounds checking.
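    Wrapping that bounds-check idiom in a runnable sketch (the function name and table contents are made up for illustration):

    ```c
    #include <assert.h>

    /* Returns lookup[i] if i is in bounds, 0 otherwise. The sequence
       point after && guarantees the bounds test completes before
       lookup[i] is evaluated, so no out-of-bounds read can occur. */
    static int checked_get(const int *lookup, int limit, int i) {
        if (0 <= i && i < limit && lookup[i])
            return lookup[i];
        return 0;
    }

    int main(void) {
        int table[3] = { 10, 0, 30 };
        assert(checked_get(table, 3, 0) == 10);
        assert(checked_get(table, 3, 1) == 0);  /* in bounds, value is 0 */
        assert(checked_get(table, 3, 99) == 0); /* out of bounds: never dereferenced */
        return 0;
    }
    ```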



  • @Ben L. said:

    the ?: operator in C

    Just call it the fucking ternary operator. And it exists in numerous languages, not just C.

    @Ben L. said:

    So now the "equivalent" function evaluates everything...

    Who cares?

    @Ben L. said:

    (making expressions like x==0?0:1000000/x cause crashes)

    Then don't write that. Is this really that hard to understand? What the fuck is wrong with you people?



  • @flabdablet said:

    C has the (condition? ifTrue: ifFalse) construct...

    Ternary operator! Ternary operator! Call it "the ternary operator" because it's the only ternary operator most people are familiar with!


    (Do you people never talk about this out loud? Or do you say, like, "the question-mark-colon conditional operator" or something equally unwieldy?)



  • @OldCrow said:

    Are you really sure that you'd like to try and tackle this one with plain C? Without the luxuries of, say, classes and their neat encapsulation mechanism? It's not impossible, I know.

    8K of RAM and you want me to waste it on useless vtables? Thanks, I'll pass.

    @OldCrow said:

    But do you really want to try and emulate the object-oriented approach

    Let me just stop you there: No.



  • @morbiuswilters said:

    @flabdablet said:
    C has the (condition? ifTrue: ifFalse) construct...

    Ternary operator! Ternary operator! Call it "the ternary operator" because it's the only ternary operator most people are familiar with!


    (Do you people never talk about this out loud? Or do you say, like, "the question-mark-colon conditional operator" or something equally unwieldy?)

    I prefer "the one-liner if". Or "the lambda expression wrecker".



  • @flabdablet said:

    8K of RAM and you want me to waste it on useless vtables?

    My guess is there's some embedded version of C++ that strips out bloaty features and doesn't support run-time polymorphism.



  • @Ronald said:

    I prefer "the one-liner if".

    foo =
      bar ?
        baz :
        qux;
    


    Did I just blow your mind or what?



  • @morbiuswilters said:

    My guess is there's some embedded version of C++ that strips out bloaty features

    That would be C.



  • @morbiuswilters said:

    @Ronald said:
    I prefer "the one-liner if".

    foo =
      bar ?
        baz :
        qux;
    


    Did I just blow your mind or what?

    What happens if I type

    goto baz;



  • @Ronald said:

    What happens if I type
    goto baz;

    People point at you and laugh.


  • Discourse touched me in a no-no place

    @OldCrow said:

     @morbiuswilters said:

    @OldCrow said:
    You use C++ when you have enough RAM to waste for the implicitly added object pointers but still may need to count instruction cycles.

    I think the only time you'd choose C++ over C is when you have extreme brain damage.

So I take it you haven't heard of the Arduino Mega, or as I like to call it, The Glorified ATMega2560 Demo Board. 256KB flash, 8KB RAM (if memory serves), 20MHz of 8bit calculations, 4 USARTs, 4 16bit timers, 4 8bit PWM outputs, and a load of other goodies.

Now, typically in production use this chip will be attached to a host of external ICs as well. And all of these require their own handling code, so that the flash is usually more than half full. Are you really sure that you'd like to try and tackle this one with plain C? Without the luxuries of, say, classes and their neat encapsulation mechanism? It's not impossible, I know. But do you really want to try and emulate the object-oriented approach with structs and pointers?

    (1) Your example isn't a case of choosing C++ over C - that choice has already been made

    (2) There's nothing stopping anyone writing OOP code in C, except maybe ignorance - "classes" in C++ are largely syntactic sugar.
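    PJH's point (2) can be sketched in a few lines of C; the names (counter_t, counter_init, and so on) are invented for illustration:

    ```c
    #include <assert.h>

    /* Encapsulation emulated in plain C: a struct plus functions that
       take an explicit "this" pointer, much like what the C++ compiler
       generates for non-virtual member functions. */
    typedef struct {
        int value; /* "private" by convention: callers use the functions below */
    } counter_t;

    static void counter_init(counter_t *self)      { self->value = 0; }
    static void counter_inc(counter_t *self)       { self->value++; }
    static int  counter_get(const counter_t *self) { return self->value; }

    int main(void) {
        counter_t c;
        counter_init(&c);
        counter_inc(&c);
        counter_inc(&c);
        assert(counter_get(&c) == 2);
        return 0;
    }
    ```

    What C won't give you for free is the naming hygiene OldCrow mentions: the counter_ prefix has to stand in for the class scope.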



  • @morbiuswilters said:

    @flabdablet said:
    C has the (condition? ifTrue: ifFalse) construct...

    Ternary operator! Ternary operator! Call it "the ternary operator" because it's the only ternary operator most people are familiar with!


    (Do you people never talk about this out loud? Or do you say, like, "the question-mark-colon conditional operator" or something equally unwieldy?)

    It's spelt "(condition? ifTrue: ifFalse)" but it's pronounced "ternary operator".


  • Discourse touched me in a no-no place

    @Kian said:

    Similarly, the complaints about the order of evaluation being unspecified in c = a() + b(). I might have been working on c++ for too long, but it seems to me that you wouldn't want a() to alter the behavior of b() in any language. That is to say, a() + b() should give the same result as b() + a(). If you expect a() to impact b(), then you should let other people looking at the code know by calling a(), storing the result, calling b(), storing that result, and finally adding both results.
You only cover half the problem there - if a() and b() return different types, even if you did store the results it's entirely possible (though wrong-headed) for resultA + resultB to return a different value from resultB + resultA. e.g. think of the rather naïve and simplistic:

    A operator+(const B& rhs) const {...}
    B operator+(const A& rhs) const {...}



  • @flabdablet said:

    Including the concept of sequence points in a language is a legitimate design decision, not a cock-up.

The cock-up is that there is no mechanism for preventing or pointing out mistakes. A "pure" function qualifier (and flagging ++ and -- as impure) would have made all the difference (and also been annoying as hell, but that is beside the point). Or you could just outright disallow f()+f().

    @flabdablet said:

    You don't get undefined behavior from a() + b() unless both those function calls involve order-dependent side effects on some shared state.

    So the statement is defined unless some deeply nested function could at some point have a side effect? That's an even bigger WTF.

    @flabdablet said:

    To my way of thinking, it's in the same family of cock-ups that parallel programming newbies make before they learn about mutexes.

If you're doing parallel programming, you're asking for it. And apparently, the same goes for writing simple statements in C.


  • Discourse touched me in a no-no place

    @Ronald said:

    What happens if I type
    goto baz;
    Velociraptors.



  • @Faxmachinen said:

If you're doing parallel programming, you're asking for it. And apparently, the same goes for writing simple statements in C.

Parallel programming and out-of-order execution are both extremely useful features. Tricky, yes, and not everyone (or every language) should make use of them. But I don't consider it WTFy that a 40-year-old language that's only a step above assembly would re-order execution to speed things up.



  • @Faxmachinen said:

If you're doing parallel programming, you're asking for it. And apparently, the same goes for writing simple statements in C.

Writing simple statements in C causes no problems. Writing expressions containing subexpressions that cause side effects sometimes requires paying attention to sequence points. This is really not anywhere near as big a deal as you seem to be making it. I have written a shitload of C code, I don't recall ever having been bitten by this issue, and I'm sure most C coders would tell you the same thing.

    @Faxmachinen said:

    The cock-up is that there is no mechanism for preventing or pointing out mistakes.

    stephen@jellyshot:~/src/wtf$ cat seq.c
    #include <stdio.h>
    
    int main (int argc, char *argv[]) {
    	printf("%d %d\n", argc++, argc++);
    	return 0;
    }
    stephen@jellyshot:~/src/wtf$ gcc -Wall seq.c 
    seq.c: In function ‘main’:
    seq.c:4:32: warning: operation on ‘argc’ may be undefined [-Wsequence-point]
    stephen@jellyshot:~/src/wtf$ 


  • @flabdablet said:

    I have written a shitload of C code, I don't recall ever having being bitten by this issue, and I'm sure most C coders would tell you the same thing.

    Agreed. It's one of those things I never even think of because I just wouldn't write code like that in the first place. (And as you point out, the compiler will warn you anyway.)



  • @PJH said:

    (1) Your example isn't a case of choosing C++ over C - that choice has already been made

    (2) There's nothing stopping anyone writing OOP code in C, except maybe ignorance - "classes" in C++ are largely syntactic sugar.

     

(1) It uses gcc. And I happen to have an AVRISPmkII (the cheap version of the official programmer for the MCU) on the bookshelf. So for me it just so happens to be a choice.

    (2) I know. But it is sugar that I like. Because it decreases the amount of boilerplate code that I have to write. It also decreases the chances of a naming clash.


     

