Debugging doesn't work like I want it to!


  • Winner of the 2016 Presidential Election

    So, there's a guy who goes to school with me who's brand new to the programming thing...

    He almost always uses "Step into" while debugging instead of "Step over"...

    I heard him complaining today that he wished the debugger would automatically ignore all the system functions and just jump to the next line(C++, VS)...
    I've explained the difference between "Step into" and "Step over" before. He doesn't seem to get it.
    :facepalm:
    To his credit - he's a hard worker and usually understands stuff better than 90% of the class. And we never got taught anything about how to use the debugger (yeah, TRWTF). This is what makes me glad I'm self taught and I'm just here to get a stupid fancy piece of paper.
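    For anyone following along at home, here's a minimal sketch of where the two commands actually diverge (assuming Visual Studio's default keybindings, F11 for Step Into and F10 for Step Over; the helper function is made up for illustration):

    ```cpp
    #include <string>

    // hypothetical helper -- just something for the debugger to step into
    static std::string greet(const std::string& name) {
        return "hello, " + name;   // Step Into (F11) stops here; Step Over never shows it
    }

    std::string run() {
        // With the debugger paused on the next line:
        //   Step Into (F11) descends into greet() -- and, without Just My Code,
        //   also into std::string's constructor and operator+.
        //   Step Over (F10) executes the whole call and stops on the line after.
        std::string msg = greet("world");
        return msg;
    }
    ```

    Step Into on a line with several calls visits each callee in turn, which is exactly why it drags you through the system functions he was complaining about.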



  • Which VS? I think you can enable Just My Code for C++ in 2013, but I haven't done anything C++ in a long while.

    Also, I generally "Step Into" when stepping through the code, so that I don't accidentally miss the line which, e.g., throws an exception. That's in C#, though, and C# doesn't jump into the framework by default.


  • Winner of the 2016 Presidential Election

    @Maciejasjmj said:

    Which VS? I think you can enable Just My Code for C++ in 2013, but I haven't done anything C++ in a long while.

    You learn something new every day!

    The issue here really is that he assumes Step Into does what Step Over does. Yes, step into jumps into the code. Yes, that is by design.



  • @sloosecannon said:

    This is what makes me glad I'm self taught and I'm just here to get a stupid fancy piece of paper.

    Ugh. I dread the idea of having to go and get the fancy paper. I have a suspicion I'll actually get worse as a result.


  • SockDev

    the fancy paper is worth it, for all the hoops you have to jump through to get it.

    i've been there, done that, hated it, and got the t-shirt.

    I would love to have been able to skip those years and go straight from high school to work, but that was not gonna happen. and for many reasons it's a good thing that didn't happen.



  • @sloosecannon said:

    better than 90% of the class

    Two semesters ago, I received almost exclusively grades above 100% because the tests were curved so generously. Some people in those classes still managed to get 40%.


  • SockDev

    CS-160 (basically programming 101): i ended the class with an average of 121.67, which was the absolute maximum that was possible.

    perfect scores on every homework, test, quiz and program including the extra credits.

    CS161 i got cocky and only managed a 102 (out of a possible 135.334)

    that professor did not curve his tests. he famously flunked an entire class one semester because they farted about and didn't do the work. not one of them passed, and while the dean wasn't happy about the complaints, he did back the prof.



  • Why is someone learning C++ in 2015?

    This is why IT never moves forward.


  • Winner of the 2016 Presidential Election

    @trithne said:

    Ugh. I dread the idea of having to go and get the fancy paper. I have a suspicion I'll actually get worse as a result.

    And that's why I do work on the side for my knowledge and visit sites for educational benefit that make fun of bad programmers.

    @blakeyrat said:

    Why is someone learning C++ in 2015?

    This is why IT never moves forward.


    Because it's required... for an associate's degree, no less... Although I think they're trying to get you to trash memory stupidly on a school project rather than do it in [ominous pause] THE REAL WORLD. With limited success. The prof is really good, at least with his programming/real-world knowledge. Although I had to explain last semester that the reason my Java final gave a "Lambdas are not supported at source level below 1.8" compile-time error message was because he didn't have Java 8 on his computer...



  • @sloosecannon said:

    Although I had to explain last semester that the reason my Java final gave a "Lambdas are not supported at source level below 1.8" compile-time error message was because he didn't have Java 8 on his computer...

    I'm not surprised. That's one of the most poorly-written error messages I've ever seen.

    I also assume, using telepathic wizard magic, that Java 1.8 is the same thing as Java 8? Somehow?



  • @blakeyrat said:

    Java 1.8 is the same thing as Java 8? Somehow?

    Yes, because Java and reasons.
    That's like WTF^2 or something.



  • List of languages I have been taught in an educational setting, roughly in chronological order:

    List of languages I have used outside of an educational setting, roughly in chronological order:

    • JavaScript
    • PHP
    • C++
    • Bash
    • Squirrel
    • Lua
    • LaTeX
    • Go
    • lojban
    • Markdown
    • Ruby


  • Goddamned, Ben, you gotta get you some C# up in that brain.



  • @ben_lubar said:

    Alice 3D
    COOL
    C89

    :wtf:

    @ben_lubar said:

    Squirrel
    Go
    lojban
    Markdown
    Ruby

    :wtf:^:wtf:





  • @loopback0 - Days Since Last Discourse Bug: -1

    <!-- Posted by SockBot 0.13.0 "Devious Boris" on Thu Jan 08 2015 03:57:16 GMT+0000 (UTC)-->


  • To be fair, with COOL, I was writing a compiler for the language and the professor was actually good.



  • To be fair I can't even make Google easily (read: first page of results) tell me wtf COOL is so maybe a :wtf:^(:wtf:^2)

    <!-- site == RO mode! Fuck off Discotoast, -->

  • Discourse touched me in a no-no place

    @blakeyrat said:

    That's one of the most poorly-written error message I've ever seen.

    Aw, man, you've lived a sheltered life. Your head would explode from the kind of errors the VMS Ada compiler produced.


  • Discourse touched me in a no-no place

    @loopback0 said:

    Yes because Java and reasons,

    Actually, it's something like Java 8 -> JDK 1.8 or something.


  • Discourse touched me in a no-no place

    @loopback0 said:

    To be fair I can't even make Google easily (read: first page of results) tell me wtf COOL is so maybe a

    "Cool language" -> first hit -> http://en.wikipedia.org/wiki/Cool_(programming_language)



  • @FrostCat said:

    "Cool language" -> first hit -> http://en.wikipedia.org/wiki/Cool_(programming_language)

    @loopback0 said:

    it's 03:47 and I've been back from the pub/bar less than 10 minutes.

    <!-- whatever -->




  • @blakeyrat said:

    learning C++ in 2015?

    They were threatening us with a new standard revision last year (C++14). I wonder what happened with that.

    Not that I mind, I barely have a handle on the important new C++11 stuff. At least you knew where you were with C++98/TR1.



  • You have the DiscoCommonMarkUpDownBollocks-fu

    <!-- all points revoked for something DF based -->


  • @blakeyrat said:

    IT never moves forward.

    I dunno though, it's not a great time to be a COBOL or FORTRAN developer any more. Even C is only used in specific niches these days. Stuff is moving forward, just at the pace of an arthritic dead sloth...



  • @tar said:

    They were threatening us with a new standard revision last year (C++14). I wonder what happened with that.

    It's there-ish. It wasn't a major revision, though, just a few fixes and tweaks.

    Maybe we just don't hear about it so much because at this point nobody cares what happens with C++. Certainly not after C++11, which personally I consider a friggin' disaster - it took an odd, but usable C-with-classes language and turned it into something that tries to be like C# or Java, but fails miserably.

    @blakeyrat said:

    IT never moves forward.

    It does, but pushed by hipster Ruby/JS/whatever-is-the-hype-now developers. So there's certainly a movement involved, just the direction can be kinda off sometimes...



  • @Maciejasjmj said:

    it took an odd, but usable C-with-classes language and turned it into something that tries to be like C# or Java, but fails miserably

    constexpr, auto, noexcept and variadic templates are all good additions.
    Lambdas and move semantics are probably OK on balance, if you feel you need them.
    The rest can probably be safely ignored? enum class can go die in a hole though.
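    For the record, a quick sketch of the variadic-template addition from that list -- before C++11 you needed a separate overload per arity for something like this:

    ```cpp
    // Base case: the sum of no arguments is zero.
    constexpr int sum() { return 0; }

    // C++11 variadic template: peels one argument off per instantiation,
    // so a single definition covers every arity.
    template <typename... Rest>
    constexpr int sum(int first, Rest... rest) {
        return first + sum(rest...);
    }
    ```

    `sum(1, 2, 3)` expands to `1 + (2 + (3 + 0))`, all resolved at compile time.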


  • area_deu

    Tell him to put breakpoints on each line and click continue execution to go to the next :trolleybus:



  • @Maciejasjmj said:

    Maybe we just don't hear about it so much because at this point nobody cares what happens with C++. Certainly not after C++11, which personally I consider a friggin' disaster - it took an odd, but usable C-with-classes language and turned it into something that tries to be like C# or Java, but fails miserably.

    I have a more... optimistic viewpoint. I think it's still a pretty big improvement over C++98/03; if you're stuck in C++ land, I think it should still be a welcome change.

    I have a love-hate relationship with C++ (trending hate over the last several years as I become more and more against memory-unsafe languages for nearly everything), so I wouldn't go "hey, C++14 makes me want to use C++ now!" for stuff where I have freedom of choice. But there is a lot of C++ code that is being maintained and enhanced, and it's a lot easier to pass --std=c++11 to your compiler and then use range-based for and auto and the new stdlib features in the code you write and update than it is to port your 100,000 lines of code to a better language.
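    As a sketch of that kind of incremental win (assuming a GCC/Clang-style `-std=c++11` flag -- the function name here is made up):

    ```cpp
    #include <map>
    #include <string>

    // Pre-C++11 this loop needed an explicit
    // std::map<std::string, int>::const_iterator; with -std=c++11 it's just:
    static int total(const std::map<std::string, int>& counts) {
        int sum = 0;
        for (const auto& kv : counts)   // range-based for + auto, both C++11
            sum += kv.second;
        return sum;
    }
    ```

    No porting effort, no API changes -- existing callers don't even notice.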

    @tar said:

    constexpr ... enum class can go die in a hole though.

    I'm not going to say this is bad, but I think constexpr is one of the least exciting changes. And enum class is great. Actually, enum not introducing a new type would be a good addition to the "Things Dennis Ritchie did wrong" thread IMO...

    And move semantics are pretty awesome. Well, at least in theory. :-) We don't actually use C++11 where I work, so I have little actual experience with it. (I have used some of the other features in another project, but don't think I did anything with &&.) I'd like to just enable it and see if we get any performance boost from the move semantics in STL types.
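    A minimal sketch of the STL move-semantics win being hoped for here (function name is made up):

    ```cpp
    #include <cstddef>
    #include <utility>
    #include <vector>

    // Moving a vector transfers ownership of its heap buffer: O(1),
    // instead of the O(n) element-by-element copy a pre-C++11 build does.
    static std::size_t demo() {
        std::vector<int> a(1000, 42);
        std::vector<int> b = std::move(a);  // steals a's buffer, no copy
        // a is left valid but unspecified (typically empty); use b from here on.
        return b.size();
    }
    ```

    Just recompiling with C++11 enabled gets you this for free anywhere the standard containers are returned by value or reshuffled.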



  • @EvanED said:

    have a love-hate relationship with C++

    I know that feeling.

    @EvanED said:

    constexpr is one of the least exciting changes.

    Constexpr lets you run arbitrary C++ at compile time to produce super-optimized assembly without needing to resort to template hackery. I don't see why that's not exciting.
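    A minimal sketch (C++11 constexpr, so single-return-statement functions only; C++14 relaxed that):

    ```cpp
    #include <cstdint>

    // constexpr: a function the compiler can evaluate during compilation
    // when it's given constant arguments.
    constexpr std::uint64_t factorial(std::uint64_t n) {
        return n <= 1 ? 1 : n * factorial(n - 1);
    }

    // Proof it ran at compile time: static_assert requires a constant
    // expression, so no runtime multiplication survives into the binary.
    static_assert(factorial(10) == 3628800, "evaluated by the compiler");
    ```

    Compare that to the template-metaprogramming hackery you needed for the same effect in C++98.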

    @EvanED said:

    And enum class is great.

    Having an enum which doesn't masquerade as an int is fine, but *for the love of god why is it forcing me to qualify all of the constants with the typename*. With no way to turn that off. In a language which already has namespaces I can wrap my enums in if I am that paranoid about symbol collisions. Let's just import bullshit behaviour from Java/C# in an attempt to be "relevant", as @Maciejasjmj alluded to.



  • Is it C++14 or C++17 which is threatening to standardize the ABI, btw? Because that would be *literally the best thing possible ever to have in a C++ standard*, not even exaggerating.

    This is the problem with the C++ standardization committee: this constant mixture of amazing and terrible new features they keep coming out with...



  • @Maciejasjmj said:

    It does, but pushed by hipster Ruby/JS/whatever-is-the-hype-now developers. So there's certainly a movement involved, just the direction can be kinda off sometimes...

    The thing is that a lot of the guys that were straight outta uni just don't want to deal with legacy bollox. Guys like us have a tonne of legacy wtfs filling our brains and just don't get that it can be easier.



  • Also whenever I hear c++ programmers talk I have no idea wtf they are on about.



  • @sloosecannon said:

    He almost always uses "Step into" while debugging instead of "Step over"...

    He uses a debugger to step through code? That's at least showing signs of anti-WTF - more than a lot of programmers I know do....

    printf("%s:%d....\n", __FILE__, __LINE__,...);

    (for C) randomly dotted all over the code. At least in that form for those that know __LINE__ is a thing. Others just skip that bit and just

    printf("asdf\n");

    Multiple times...
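    A sketch of that idiom wrapped up so the call sites stay short -- the `trace_at` helper and `TRACE` macro are invented here (a formatting function rather than a variadic macro, since `##__VA_ARGS__` portability varies between compilers):

    ```cpp
    #include <cstdio>
    #include <string>

    // Builds the "file:line: message" prefix PJH describes.
    static std::string trace_at(const char* file, int line, const char* msg) {
        char buf[512];
        std::snprintf(buf, sizeof buf, "%s:%d: %s", file, line, msg);
        return buf;
    }

    // __FILE__ and __LINE__ expand at the call site, not inside trace_at,
    // because the macro substitutes them where TRACE appears.
    #define TRACE(msg) \
        std::fputs(trace_at(__FILE__, __LINE__, msg).append("\n").c_str(), stderr)
    ```

    At least this version tells you *where* the `asdf` came from.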



  • @PJH said:

    printf("asdf\n");

    TRWTF. They should use puts("asdf"); instead.


  • kills Dumbledore

    @PJH said:

    printf("asdf\n");

    In my early days of coding I once added some debug message boxes to an MFC project

    MessageBox.Show("Never gonna give you up"); /*some code*/ MessageBox.Show("Never gonna let you down"); /*some code*/ MessageBox.Show("Never gonna run around and desert you"); //etc.

    Luckily, I remembered to take them out before committing


  • SockDev

    @Jaloopa said:

    MFC

    you know to this day i still read that as Mother "fathering" Code

    makes me laugh every time too.


  • kills Dumbledore

    That makes a lot more sense than the actual name. Not a nice framework



  • @accalia said:

    I would love to have been able to skip those years and gone straight from highschool to work,

    Ah, man, but those years were so much fun.


  • SockDev

    @boomzilla said:

    so much fun.

    ..... maybe for you. not so much for me.



  • Oh, yeah. Plus, I assisted my wife on her MRS degree.



  • When Java 5 came out they switched from semantic versioning (Java 2 Standard Edition, version 1.5) to marketing versioning (Java 5). They still use the semantic version in all the SDK and tooling, but all the branding uses the marketing version.




  • Hey, you forgot to mention Dwarf Fortress somewhere in your post.



  • @blakeyrat said:

    Why is someone learning C++ in 2015?

    This is why IT never moves forward.

    What do you suggest they learn as a systems language instead? Go gets picked on all over around here, C would probably draw a similar reaction as above, Pascal and Ada's descendants are redheaded stepchildren for good reason, and Forth (yes) is just too plain odd for most people who haven't touched HP RPN calculators.

    So, that leaves us with Rust...sound good to you?

    @FrostCat said:

    Aw, man, you've lived a sheltered life. Your head would explode from the kind of errors the VMS Ada compiler produced.

    Still better than the time the NonStop TNS/R debugger (INSPECT, running under Guardian, ofc), refused to debug my program, instead crashing with a backtrace pointing squarely at the symbol-resolution library. Turns out it was a path-handling buffer overrun bug triggered by drive-relative Windows source paths in symbols generated by the TNS/R cross compiler...

    @tar said:

    Is it C++14 or C++17 which is threatening to standardize the ABI, btw?

    They already had most of the hard work done for them -- by the Linux folks, even.



  • @tarunik said:

    What do you suggest they learn as a systems language instead? Go gets picked on all over around here, C would probably draw a similar reaction as above, Pascal and Ada's descendants are redheaded stepchildren for good reason, and Forth (yes) is just too plain odd for most people who haven't touched HP RPN calculators.

    So, that leaves us with Rust...sound good to you?

    You omitted Whitespace and BrainFuck.



  • Malbolge?


  • Winner of the 2016 Presidential Election

    @aliceif said:

    Tell him to put breakpoints on each line and click continue execution to go to the next

    I TOTALLY NEVER DID THAT WHEN I WAS LEARNING TO DEBUG

    I totally did...



  • @PJH said:

    printf("%s:%d....\n", __FILE__, __LINE__,...);

    If you have *absolutely* no other remaining techniques available, this is occasionally still useful, as long as it's writing to a logfile. I've used it to track down the causes of odd crashes which take hours/overnight to manifest.

