Programming Language Learning Curves



  • @ben_lubar said:

    goroutine

    Fries, gravy and cheese curds, served by a four-armed giant from Outworld?


  • Discourse touched me in a no-no place

    :facepalm:

    QFT for great justice or something.

    I don't know if I could bear to use a language with such twee names.



  • Or log it in a log that gets filled with tens of entries per request, which are basically nothing more than strings, and gets purged way too often.

    Like in a certain Oracle database. It's fun when your read bubbles up as:

    ORA-12345 Shit went wrong in line 1842

    and line 1842 is the final when others then raise of a 500-LOC sproc.



  • @Captain said:

    recursion

    ...is not easy at all and lots of people have problems with it. The concept itself might be understandable, but translating your intuitive "do this, then do that, then something else" workflow to a fully functional program is a major paradigm shift.



  • "do this, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again, by doing this again..."



  • How hard can it be? You consume the first element of the list, then iterate. 😄


  • Discourse touched me in a no-no place

    @Mikael_Svahnberg said:

    How hard can it be? You consume the first element of the list, then iterate. 😄

    It's usually easier to teach people about recursion using trees, since there it's a much clearer approach than the non-recursive ones.
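
    Counting nodes, say: the recursive version reads straight off the structure (a minimal untested sketch with a made-up node type):

    #include <stddef.h>

    struct node { int value; struct node *left, *right; };

    /* Count the nodes in a binary tree. */
    int count(const struct node *n) {
        if (n == NULL) return 0;                      /* base case: empty tree */
        return 1 + count(n->left) + count(n->right);  /* this node plus both subtrees */
    }

    A non-recursive version of the same walk needs an explicit stack to get back to the right subtrees, which is exactly the part students trip over.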



  • No, it's really easy. I'm not exaggerating when I say 18-year-old college students are expected to understand it and use it. And it's not just math and physics majors. Anybody taking math for lit majors would learn about it.

    And honestly, I submit that what you call intuitive (putting data into boxes and moving the boxes around) is not. I remember having a lot of trouble trying to understand why x = 1; x = 2; doesn't mean 1 = 2, when I was learning C in the third grade.

    I can make the same argument against the intuitiveness of it that you made about recursion. It's really easy conceptually, but when you read source code, which part of it is actually doing work? All I see is boxes being moved around, but no structure. Indeed, I have to simulate the entire computation to understand it.



  • @Captain said:

    And honestly, I submit that what you call intuitive (putting data into boxes and moving the boxes around) is not. I remember having a lot of trouble trying to understand why x = 1; x = 2; doesn't mean 1 = 2, when I was learning C in the third grade.

    As I said, maybe it would be easier if you started with it, rather than trying to shift your thinking from "putting data into boxes" to recursive programming.

    Let's just say, on second semester we had both C and Prolog. We used C to learn how to write a simple 2D game with Allegro. We used Prolog to find the middle element of a sequence.

    Seriously. It took us the whole semester to explain the most basic, simplest concepts, and every single program felt more like solving a riddle than actually doing something useful.



  • I did start with imperative programming. I learned recursion and induction when I was 18-19. I learned imperative programming when I was 9.

    Also, note that you used lots and lots of libraries in your C game programming course. Could you write an OpenGL device driver after that course? A compiler? Perhaps the difference in complexity wasn't as high as you remember. I'm about 90% sure you learned enough Prolog to write a simple compiler after that course.



  • @Captain said:

    Could you write an OpenGL device driver after that course?

    How does that even relate to the complexity of different programming paradigms? Also, when writing a device driver, the problem lies with knowing the device specification, not with the language itself.

    @Captain said:

    A compiler?

    A better example, but

    @Captain said:

    I'm about 90% sure you learned enough Prolog to write a simple compiler after that course.

    Not an effin' chance. If I tried really hard, I might remember, um... how to merge two sorted lists into one. I did a course on compiler writing two years later, with tools better suited for it (lex / LLgen / yacc), and while I got through, it still wasn't straightforward for me (and nobody else for that matter), and we still did mostly simple things (basic aⁿbⁿ languages, parsing heavily simplified C or Pascal).

    Look, you get it, that's fine. Congratulations, seriously. But most people don't, and it's generally not as easy as you think. Nor is it suitable for each and every task - there are things which a more declarative language handles better (like compilers), and tasks which are better modeled in an imperative language (like a workflow).


  • Discourse touched me in a no-no place

    @Maciejasjmj said:

    tasks which are better modeled in an imperative language (like a workflow)

    That depends on the type of workflow. Business workflows tend to be good with imperative languages. Scientific workflows tend to be better with declarative languages. The difference tends to be in how errors are handled; business processes require very detailed handling of that, whereas scientific processing requires detailed tracking of what happened to very large amounts of data, with less focus on the individual parts. (Yes, this is my specialist area. Why do you ask?)



  • @dkf said:

    That depends on the type of workflow. Business workflows tend to be good with imperative languages.

    Yeah, that's what I meant. My science knowledge ends with making a potato battery.



  • @Captain said:

    I remember having a lot of trouble trying to understand why x = 1; x = 2; doesn't mean 1 = 2, when I was learning C in the third grade

    I read a study a long time ago that gave a pre-test to a bunch of programming students and followed their progress. The one question on the test that was most predictive of success was:

    Given the following program:

        X = 1
        Y = 2
        X = Y

    What is the value of X after execution?

    A. 1
    B. 2
    C. The program will halt with an error
    D. False


    Those that answered C or D rarely learned to program well. It was the difference between looking at the program as a whole, where every statement should be simultaneously satisfied, and thinking of a program as a series of steps. Apparently you were one of the lucky ones able to climb over the hill between the two mental models.
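
    (Reading it as a series of steps gives B; a quick trace, comments mine:)

    X = 1    /* X: 1        */
    Y = 2    /* X: 1, Y: 2  */
    X = Y    /* X: 2, Y: 2  (the value is copied; Y is untouched) */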


  • I survived the hour long Uno hand

    I had no trouble with algebra because I'd already started learning to program. Everyone else would whine week after week "but there's LETTERS" and I was sitting there fuming about "They're just fucking variables, deal with it."



  • @Jaime said:

    I read a study a long time ago that gave a pre-test to a bunch of programming students and followed their progress. The one question on the test that was most predictive of success was:

    Given the following program:

        X = 1
        Y = 2
        X = Y

    What is the value of X after execution?

    A. 1
    B. 2
    C. The program will halt with an error
    D. False

    One I used to spend a lot of time on with my PL/1 students:

    a = 1;
    b = 2;
    c = b = a;

    What are the values of a, b and c after this code segment is executed?

    Correct answer is:
    a = 1
    b = 2
    c = 0 (actually '0'b, but if c is defined as an integer data type, it's converted to just plain zero)

    Next step was to point out that even though the brackets in c = (b = a) are completely optional for the compiler, specifying them for the next programmer to come along is a mitzvah.
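
    (For contrast: in C the same line is pure chained assignment, no comparison anywhere. A quick untested sketch:)

    #include <stdio.h>

    int main(void) {
        int a = 1, b = 2, c = 0;
        c = b = a;  /* parsed as c = (b = a); both b and c end up 1 */
        printf("a=%d b=%d c=%d\n", a, b, c);  /* prints a=1 b=1 c=1 */
        return 0;
    }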



  • @da_Doctah said:

    a = 1
    b = 2
    c = 0 (actually '0'b, but if c is defined as an integer data type, it's converted to just plain zero)

    :wtf:
    Don't tell me it uses = for both assignment and comparison.



  • @da_Doctah said:

    Next step was to point out that even though the brackets in c = (b = a) are completely optional for the compiler, specifying them for the next programmer to come along is a mitzvah.

    That's more of a language trivia question than a fundamental programming concept. C doesn't use the same symbol for assignment and equality, but still makes the mistake of making assignment return a value. This leads to the common "if (x = y)" bug that should be invalid syntax.
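
    The canonical form, for reference (do_something is a hypothetical stand-in):

    int x = 0, y = 5;

    if (x = y)     /* assigns y to x, then tests the result: compiles, and is true here */
        do_something();

    if (x == y)    /* the comparison that was almost certainly meant */
        do_something();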


  • Fake News

    @aliceif said:

    Don't tell me it uses = for both assignment and comparison.

    Ob: The Evil Ideas thread is ↗


  • ♿ (Parody)

    @lolwhat said:

    Ob: The Evil Ideas thread is

    Only if you make the C mistake as @Jaime pointed out.



  • @Jaime said:

    C doesn't use the same symbol for assignment and equality, but still makes the mistake of making assignment return a value. This leads to the common "if (x = y)" bug that should be invalid syntax.

    I have mixed feelings about this, but in general I disagree on a few fronts:

    • Assignment returning a value leads to the ability to do something like while ((c = getchar()) != EOF); while this kind of thing can be easily abused (I tend to really dislike mixing side effects and larger expressions), I do tend to prefer things following that idiom to while (true) { c = getchar(); if (c == EOF) break; ... }. (The one big thing that I like about the latter is you could move the declaration of c into the loop.) Both are spelled out in the sketch after this list.
    • A Java-like approach, with a true bool type and requiring that conditions be bool, almost solves that problem but better (according to me :-))
    • Compilers warn about if (a=b) under common settings
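
    Both idioms spelled out (a minimal sketch, untested):

    #include <stdio.h>
    #include <stdbool.h>

    void copy_a(void) {   /* idiom A: assignment inside the condition */
        int c;
        while ((c = getchar()) != EOF)
            putchar(c);
    }

    void copy_b(void) {   /* idiom B: no assignment-as-expression, and the
                             declaration of c moves inside the loop */
        while (true) {
            int c = getchar();
            if (c == EOF)
                break;
            putchar(c);
        }
    }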

  • BINNED

    @EvanED said:

    Compilers warn

    Unfortunately, most programmers seem to ignore warnings.


  • I survived the hour long Uno hand

    @EvanED said:

    you could move the declaration of c into the loop

    In some languages you can do that in the while anyway: http://perlmaven.com/open-and-read-from-files

    while (my $row = <$fh>) {
      chomp $row;
      print "$row\n";
    }


  • @Onyx said:

    Unfortunately, most programmers seem to ignore warnings.

    Which is why -Werror exists.
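
    (As in gcc -Wall -Werror foo.c, which promotes every warning to a hard error.)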


  • ♿ (Parody)

    @Onyx said:

    Unfortunately, most programmers seem to ignore warnings.

    But...but...there's soooo many of them!

    @tar said:

    Which is why -Werror exists.

    Now nothing will ever work. 😢



  • @boomzilla said:

    @tar said:
    Which is why -Werror exists.

    Now nothing will ever work. 😢

    ...which is why the -Wno-<particular-warning> flags exist!
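
    (For the if (a=b) warning specifically, gcc and clang call it -Wparentheses, so the one you'd pass is -Wno-parentheses.)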


  • BINNED

    @Yamikuronue said:

    http://perlmaven.com/open-and-read-from-files

    while (my $row = <$fh>) {
      chomp $row;
      print "$row\n";
    }

    🚒



  • @Yamikuronue said:

    while (my $row = <$fh>) {
      chomp $row;
      print "$row\n";
    }

    You can do that in C too (or at least C++).... you just can't then compare it to something other than the implicit 0/non-zero comparison. So while(char c = getchar()) is syntactically correct (and something like it is occasionally good enough), but it doesn't do the right thing in this case. I suspect the same would be true in Perl, though I remain willfully ignorant of most of that language.



  • @EvanED said:

    Compilers warn about if (a=b) under common settings

    C syntax has leaked into scripting languages, where there is no compiler.

    @EvanED said:

    Assignment returning a value leads to the ability to do something like while ((c = getchar()) != EOF)

    That's a bad API smell; no need to design a language around it. Look at how .Net reads from databases:

    using(var dr = someCommand.ExecuteReader())
    {
      while(dr.Read())
      {
        // do stuff
      }
    }
    

    Assignment being an expression has led to a lot more problems than it has solved.


  • I survived the hour long Uno hand

    In your .net example, where does the data that was read go?



  • @Jaime said:

    C syntax has leaked into scripting languages, where there is no compiler.

    If only interpreters could also provide warnings. (Oh wait, they can!)

    Also, that's not really C's mistake, that's the scripting languages' mistake if they took a reasonable idea and applied it in contexts where it's not reasonable.



  • MessageBox.Show (dr["ColumnName"].ToString());
    

    Or, if it's a strongly typed dataset:

    MessageBox.Show(dr.ColumnName);
    

  • I survived the hour long Uno hand

    Ah, so it's a syntax that only works for databases and not arbitrary flat files. PHP has that.


  • ♿ (Parody)

    I'd rather have something like

    goto read_it;
    while ( c != EOF){
        // do stuff
    
        read_it:
        c = getc(file);
    }
    
    


  • You could imagine an API that would provide similar syntax for my case though: int c; while (getchar(&c)) { ... }

    C++'s getline actually works like this.
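
    (std::getline returns the stream it read from, and a stream converts to false after a failed read, so while (std::getline(file, line)) { ... } puts the read in the condition without any assignment-expression trickery.)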



  • You can fetch file data into a DataSet.

    Reading a file line-by-line:

    using (TextReader tr = File.OpenText(filename))
    {
      while(tr.Peek() != -1)
      {
        MessageBox.Show(tr.ReadLine());
      }
    }
    

  • I survived the hour long Uno hand

    But now you're doing your read in the body of the loop instead of in the condition. So that's not the same either.



  • I prefer while (true) { c = getchar(); if (c==EOF) break; ... } to the goto solution, though not enough to call the latter actually bad.


  • ♿ (Parody)

    I find that while(true) hides what's going on more than a simple goto like that. I'd rather violate DRY and have an extra read prior to the while loop than while(true).
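
    The primed-read version, for reference (same shape as the goto one, minus the goto):

    int c = getc(file);     /* the extra read before the loop */
    while (c != EOF) {
        /* do stuff */
        c = getc(file);     /* repeated at the bottom: the DRY violation */
    }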



  • @boomzilla said:

    I'd rather violate DRY and have an extra read prior to the while loop than while(true).

    Sigh. Nobody uses do...while these days.


    Filed under: REPEAT UNTIL KEYPRESSED



  • @Yamikuronue said:

    But now you're doing your read in the body of the loop instead of in the condition. So that's not the same either.

    The file API has a faint smell. It could be fixed by wrapping it in something like this:

    using System.IO;

    public class BetterTextReader
    {
      private TextReader reader;
      private string line;
      
      public BetterTextReader(TextReader tr)
      {
        reader = tr;
      }
      
      public bool ReadLine()
      {
        if (reader.Peek() == -1) return false;
        line = reader.ReadLine();
        return true;
      }
      
      public string CurrentLine
      {
        get
        {
          return line;
        }
      }
    }
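    // Hypothetical usage (untested sketch):
    //   var btr = new BetterTextReader(File.OpenText(filename));
    //   while (btr.ReadLine())
    //       Console.WriteLine(btr.CurrentLine);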
    
    

  • ♿ (Parody)

    @Maciejasjmj said:

    Sigh. Nobody uses do...while these days.

    I...have, but it wouldn't work for the case of reading a file.
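
    (A sketch of why not, with process() as a hypothetical per-character step: the body runs once more after the final read, so EOF itself would get processed.)

    int c;
    do {
        c = getchar();
        process(c);     /* oops: also runs with c == EOF on the last pass */
    } while (c != EOF);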



  • @Jaime said:

    Assignment being an expression has led to a lot more problems than it has solved.

    Idiots misusing syntax features is what has actually led to more problems than it has solved.
    Filed under: a = b = c = d = 0.f;



  • @tar said:

    Idiots misusing syntax features is what has actually led to more problems than it has solved

    Creating a syntax that is ripe for abuse is the root cause. A good language makes it harder to write bad code. C is a loaded foot-gun.



  • @boomzilla said:

    ```C
    goto read_it;
    while ( c != EOF){
        // do stuff
        //...
        read_it:
        c = getc(file);
    }
    ```

    I'm sure we can do better than this:

    ```C
    switch(false) {
        while(true) {
            default:
                //do stuff
            case (c == EOF):
                c = getc(file);
                break;
        }
    }
    ```


  • @Jaime said:

    Creating a syntax that is ripe for abuse is the root cause. A good language makes it harder to write bad code. C is a loaded foot-gun.

    Oh great, it's the "Wah I Can't Write C Brigade". Again.
    Filed under: other people can actually write C


  • ♿ (Parody)

    I think we have a problem agreeing on what a particular word means...

    @tar said:

    better

    :angry:


  • ♿ (Parody)

    @tar said:

    Filed under: other people can actually write C

    Especially Canadians, eh? That would be pretty embarrassing if they couldn't.



  • @boomzilla said:

    Especially Canadians, eh? That would be pretty embarrassing if they couldn't.

    C was good enough for the Win32 API, and the Mac Classic. What more do you want?

