Too much, too young, too fast



  •  It's been two terms, a.k.a. "a year," since I started as a computer science student.

    I'm not sure how to fully translate the specific degree title; I guess "practical computer science" would be a good fit. Instead of taking three math classes per term and learning Java as a first language, we spent most of our first term dealing with C, the Win32 API, and a multitude of web things, everything from XML to JavaScript lessons (which felt like washing your eyes in acid). Now we're taking an OOP class that has us pick up both Java and C++.

    This has actually driven most of my fellow students a little insane. Well, the ones that stayed, anyway: out of 50 at the start of term one, we are now down to about 20-30. And their confusion with too many languages has brought forth many crazy things:

    - Emulating if(ERRORCODE) behavior with C++ exceptions. For example, we had someone who wrote about 15 case statements into every try-catch he used, because, hey, error handling. Please note that his "handling" consisted of simply printing the type of error and not doing anything about it.

    - Using pointers and references to try to get around encapsulation. Just about every prof we ever had gave us a rant about encapsulation and get()s and set()s. About half of the people tried to undermine the requirement that their data be private by passing references to it just about everywhere. Turns out that does not work so well.
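    The first antipattern reduces to something like the sketch below. All the names here (`risky`, `callSite`, the error codes) are hypothetical, invented purely to illustrate what "15 case statements in every try-catch" looks like:

    ```cpp
    #include <iostream>
    #include <stdexcept>

    // Hypothetical C-style error codes, living alongside exceptions.
    enum ErrorCode { ERR_FILE_NOT_FOUND = 1, ERR_OUT_OF_RANGE = 2 };

    int risky(int x) {
        if (x < 0) throw std::out_of_range("negative input");
        return x * 2;
    }

    int callSite(int x) {
        try {
            return risky(x);
        } catch (const std::out_of_range&) {
            // The exception is immediately flattened back into an error
            // code, switched on, printed... and then ignored.
            int code = ERR_OUT_OF_RANGE;
            switch (code) {
                case ERR_FILE_NOT_FOUND: std::cout << "file not found\n"; break;
                case ERR_OUT_OF_RANGE:   std::cout << "out of range\n";   break;
            }
            return -1; // error swallowed; the caller is none the wiser
        }
    }
    ```

    The whole point of exceptions (typed errors propagating up to whoever can actually handle them) is thrown away; the catch block just re-derives a code, prints, and moves on.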
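    And the second one, sketched: the field is nominally private, but a getter that hands out a non-const reference gives every caller write access anyway (the class and member names are made up for illustration):

    ```cpp
    // The data is "private", exactly as the professors demanded...
    class Account {
        int balance_ = 0;
    public:
        // ...but returning a non-const reference leaks it right back out.
        int& balance() { return balance_; }
    };
    ```

    Any caller can now write `account.balance() = -500;` and the encapsulation is gone in all but name.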

     But the best thing happened after we had taken a test on Java + UML. We had spent about 60 minutes writing Java code, imagining everything from a small bank to an algorithm for computing Fibonacci numbers. Then, moments after pens-down, somebody raised his hand. "I accidentally used printf() instead of the Java thing. Is that bad?" "Well, that depends. Did you copy values onto it?" "Yes." "Well, yeah, that's bad."

    Turns out he even imported both "conio.h" and "stdio.h" within his pseudo-code. He passed nevertheless because his UML was spotless, but it was really close.



  • I believe that by the time you graduate, your class will have approximately five survivors.



    [quote user="Renan "C#" Sousa"]I believe that by the time you graduate, your class will have approximately five survivors.[/quote] 

    This term, two people graduated. Out of 50 who get to try every two terms.

     We actually had four students who migrated from Russia, and who replied to the question of which programming language they speak with "Photoshop." And no, they didn't mean the SDK. They meant that they were really good at, well, using Photoshop. They lasted about 2 1/2 months, and they once advised me to read JavaScript backwards because "that's the trick!"



  • @fire2k said:

    [quote user="Renan "C#" Sousa"]I believe that by the time you graduate, your class will have approximately five survivors.

     

    This term, two people graduated. Out of 50 who get to try every two terms.

     We actually had four students who migrated from Russia, and who replied to the question of which programming language they speak with "Photoshop." And no, they didn't mean the SDK. They meant that they were really good at, well, using Photoshop. They lasted about 2 1/2 months, and they once advised me to read JavaScript backwards because "that's the trick!"

    [/quote] 

    Good.  People like that shouldn't be writing code, and if a class manages to weed them out early on, that makes it that much better for the rest of us who might otherwise end up having to use something they wrote!



  • @Mason Wheeler said:

    @fire2k said:

    [quote user="Renan "C#" Sousa"]I believe that by the time you graduate, your class will have approximately five survivors.

     

    This term, two people graduated. Out of 50 who get to try every two terms.

     We actually had four students who migrated from Russia, and who replied to the question of which programming language they speak with "Photoshop." And no, they didn't mean the SDK. They meant that they were really good at, well, using Photoshop. They lasted about 2 1/2 months, and they once advised me to read JavaScript backwards because "that's the trick!"

     

    Good.  People like that shouldn't be writing code, and if a class manages to weed them out early on, that makes it that much better for the rest of us who might otherwise end up having to use something they wrote!

    [/quote]

    About that, I've read this in Jeff Atwood's blog:

    And I found the article mentioned there a pretty amusing read as well.



  •  You act as if learning all these different languages is a bad thing. I think in pursuit of my Software Engineering degree I learned at least 10 languages, including 3 assembly languages! It seemed like a good thing to get exposed to a variety of them. The core curriculum tended to stick to Java, though.



  • [quote user="Renan "C#" Sousa"]

    About that, I've read this in Jeff Atwood's blog:

    And I found the article mentioned there a pretty amusing read as well.

    [/quote] 

    As much as I love having new posts of Jeff in my feed reader, I personally think, from experience, that he's wrong. There were some people who had never programmed at all before school, or even in the first term of classes. (Yeah, I know it's a WTF for people to go study computer science when they have never even tried anything related to it.)

    Basically, there are a lot of factors involved. We had people who at first couldn't even follow Jeff's example, and pointers/references were hard even for the people who had worked a programming job before. But everybody who was willing to work hard enough eventually got to the point where they could get by and pick up speed. I don't really think there's this joelonsoftware genetic thing where either you "get pointers" or you don't. It depends on how hard you try and work, and whether you actually care.

    I've seen and met a lot of people, working or students, who don't write code because they care about it, or want to, but because it pays the bills. They don't get pointers because they can get by without them just fine. They won't produce good code, because that would require attention and additional work and care.

    What our headmaster told us on the first day was that about 50% of your grade is determined by the people you choose to sit and learn with. He was actually right. For example, we had a group of guys who responded to every new subject with "piece of cake" or "I already did something similar xx years ago." Because they reinforced each other's belief that they "got this," they never learned. They never even started. All they did was tell funny stories about their past lives.

    Another group was made up of people who didn't live nearby AND had a tendency to be slackers. Now, they could have handled one, but not two. Because many of them didn't show up, there was an excuse to slack off, and because nobody did any work, there was an excuse to stay at home.

     

     



  •  It would have been nice if my university's department was more like yours... mine was a typical Java one, with classes in OSes, network programming, databases, etc. I spent very little of my time studying because this stuff was just so basic / I had learned it ahead of time on my own. Plus, Win32 and C can be way more fun than Java in some areas.



  • I took a programming class in high school (1999) that taught JavaScript, Visual C++, and then Java. My final project was to build a small web browser with its own HTML rendering (plain flow layout; no tables, no CSS, etc.). Previous years they taught C and Pascal. College felt like a couple of steps back for a while.



  •  @fire2k said:

    About half of the people tried to undermine the requirement that their data be private by passing references to it just about everywhere. Turns out that does not work so well.

     

     Our legacy software looks like that. They wanted settings passed around, so they put in a global get function that returns a pointer to a structure, part of which is the one setting they actually wanted.
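    A minimal reconstruction of that legacy pattern (the `Settings` fields and the function name here are invented for illustration): one global struct, one getter returning a pointer to all of it, and every caller free to poke any field just to reach the one it wanted.

    ```cpp
    // One global blob holding every setting in the program.
    struct Settings {
        int  timeoutMs;
        int  retries;
        bool verbose;
    };

    Settings g_settings = {5000, 3, false};

    // The "global get function": hands out the whole structure by pointer,
    // so any code anywhere can read *and* mutate every setting.
    Settings* getSettings() {
        return &g_settings;
    }
    ```

    The caller that only needed `timeoutMs` now has write access to `retries`, `verbose`, and everything else, which is exactly how such settings end up changing from halfway across the codebase.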



  • Damn, college programming courses... Fortran, Pascal, C, Z80 assembler... then again, I was a math & physics major. Personally, I maintain that universities are going about teaching most of their CS majors the wrong way: beat the shit out of them with low-level languages, compiler design, and fundamentals. Teaching high-level languages like Java and .NET is sort of like trying to teach calculus to someone with no math background - they can do it eventually, but if you skip the basics they will only be able to regurgitate whatever professy says and continually write bad code for the next 10 to 15 years...



  • @fire2k said:

    computer science

    learning Java
    dealing with C
    the win32-API
    XML
    Javascript
    OOP
    C++.
    Emulating if(ERRORCODE) behaviour within C++-Exceptions
    case-statements
    try-catch
    pointer and references
    encapsulation
    get()s and set()s
    UML
    printf()

     

    I see where the problem is.

    You signed up for a computer science class, but instead you got a set of basic programming courses.

    Maybe you did and just didn't mention it, but did they touch upon, just to name a few: algorithms, symbols, tokens, operators, floating point arithmetic, binary numbers, recursion, binary trees, program correctness, assembly, CPU structure, sorting, compiler creation, tons and tons of math, the halting problem, information theory, encoding, encryption, data compression, data mining - or will they do so in the future?



  • @fire2k said:

     It's been two terms, a.k.a. "a year," since I started as a computer science student.

    I'm not sure how to fully translate the specific degree title; I guess "practical computer science" would be a good fit. Instead of taking three math classes per term...

     Sort of on track with dhromed's comment... Is this a CS course, or is it some information systems thing?

     Also:

    @fire2k said:

    Turns out he even imported both "conio.h" and "stdio.h" within his pseudo-code. He passed nevertheless because his UML was spotless, but it was really close.

     

    ROFLMFAO LOL HAHAHAHA wow, what a total fag.  Only a complete fucking moron wouldn't know every library to include.  After all, memorization is what computer science is all about.  Not gay shit like algorithms and those shitty faggoty math classes you mentioned above.



  • @dhromed said:

    You signed up for a computer science class, but instead you got a set of basic programming courses.

    Maybe you did and just didn't mention it, but did they touch upon, just to name a few: algorithms, symbols, tokens, operators, floating point arithmetic, binary numbers, recursion, binary trees, program correctness, assembly, CPU structure, sorting, compiler creation, tons and tons of math, the halting problem, information theory, encoding, encryption, data compression, data mining - or will they do so in the future?

     

    Agreed. I would expect that a Computer Science curriculum would include languages like Scheme, Haskell, Smalltalk, and the like rather than Java and Javascript. I would expect a discussion of SGML rather than XML.

     

    B



  • @fire2k said:

    We actually had four students who migrated from Russia, and who replied to the question of which programming language they speak with "Photoshop." And no, they didn't mean the SDK. They meant that they were really good at, well, using Photoshop. They lasted about 2 1/2 months, and they once advised me to read JavaScript backwards because "that's the trick!"

     

    I think they were from Poland then.



  • @dhromed said:

    I see where the problem is.

    You signed up for a computer science class, but instead you got a set of basic programming courses.

    Maybe you did and just didn't mention it, but did they touch upon, just to name a few: algorithms, symbols, tokens, operators, floating point arithmetic, binary numbers, recursion, binary trees, program correctness, assembly, CPU structure, sorting, compiler creation, tons and tons of math, the halting problem, information theory, encoding, encryption, data compression, data mining - or will they do so in the future?

     

    Actually, I didn't mention it, but we do have most of our hours in math; I just didn't think it was relevant. Also, yes, we don't have as much math or language theory as a university or someone studying pure computer science.

    -  algorithms

    We have a class on algorithms and data structures, covering things like sorting, searching, games, and trees.

    -  symbols

    Most of that compiler-building stuff is only taught when you decide to do embedded systems in the 6th term; there are three other subjects to choose from: advanced networks, AI, or application software development.

    - assembly, CPU structure

    Yeah, we have a class on general hardware structure, and we will have one focused primarily on CPUs and x86 assembler.

    - information theory

    Yeah, we had that in the first term. I passed it quite well, but many people treated it as irrelevant and got kicked in the exam.

     

    While we do all that stuff, it's just that the focus is on programming.



  • @havokk said:

    Agreed. I would expect that a Computer Science curriculum would include languages like Scheme, Haskell, Smalltalk, and the like rather than Java and Javascript. I would expect a discussion of SGML rather than XML.

     

    We did a little SGML, and we had a small test on the DOM.

    The prof teaching web technologies just didn't think it was that relevant. One of the phrases he uses most is that his students "have to encounter what I teach in the real world." I also heard next year's students will learn TYPO3/TypoScript instead of JavaScript with the DOM.



  • @sys said:

    @fire2k said:

     It's been two terms, a.k.a. "a year," since I started as a computer science student.

    I'm not sure how to fully translate the specific degree title; I guess "practical computer science" would be a good fit. Instead of taking three math classes per term...

     Sort of on track with dhromed's comment... Is this a CS course, or is it some information systems thing?

     

    All irony aside, yes, it is a CS degree. Actually, there was a court ruling a few years back that enabled anyone graduating from it to continue afterwards with a CS diploma (nowadays a master's) at any given university (if you have the right grades, of course).



  •  "I accidentally used printf() instead of the Java thing. Is that bad?" "Well, that depends. Did you copy values onto it?" "Yes." "Well, yeah, that's bad."

    How do you copy values onto printf()?



  • @fire2k said:

    [quote user="Renan "C#" Sousa"]

    About that, I've read this in Jeff Atwood's blog:

    And I found the article mentioned there a pretty amusing read as well.

     

    As much as I love having new posts of Jeff in my feed reader, I personally think, from experience, that he's wrong. There were some people who had never programmed at all before school, or even in the first term of classes. (Yeah, I know it's a WTF for people to go study computer science when they have never even tried anything related to it.)

    Basically, there are a lot of factors involved. We had people who at first couldn't even follow Jeff's example, and pointers/references were hard even for the people who had worked a programming job before. But everybody who was willing to work hard enough eventually got to the point where they could get by and pick up speed. I don't really think there's this joelonsoftware genetic thing where either you "get pointers" or you don't. It depends on how hard you try and work, and whether you actually care.

    I've seen and met a lot of people, working or students, who don't write code because they care about it, or want to, but because it pays the bills. They don't get pointers because they can get by without them just fine. They won't produce good code, because that would require attention and additional work and care.

    What our headmaster told us on the first day was that about 50% of your grade is determined by the people you choose to sit and learn with. He was actually right. For example, we had a group of guys who responded to every new subject with "piece of cake" or "I already did something similar xx years ago." Because they reinforced each other's belief that they "got this," they never learned. They never even started. All they did was tell funny stories about their past lives.

    Another group was made up of people who didn't live nearby AND had a tendency to be slackers. Now, they could have handled one, but not two. Because many of them didn't show up, there was an excuse to slack off, and because nobody did any work, there was an excuse to stay at home.

    [/quote]

    Can you prove any of these statements?

    Do you know all 50 students personally, enough to know (a) which ones are motivated vs. which ones are lazy, and (b) which ones failed brutally vs. which ones passed or at least came close?

    And did you actually compile statistics on failure rates for the "hard-working" vs. "lazy" groups and find the difference to be statistically significant?

    Sounds to me like your rationale is essentially: They fail because they're lazy, and I know they're lazy because they failed.  If it's not outright circular reasoning then it's a representativeness heuristic (making judgments about their work ethic based on relatively superficial characteristics) combined with a generous helping of confirmation bias.  Either way, it's not very convincing.

    I shudder to think that there was some student who couldn't grok references who you say had worked at a programming job before.  Whatever he was doing there, obviously wasn't programming.  Scripting, maybe, or banging on the keyboard like a monkey until something came out that worked.



  • @Aaron said:

    Can you prove any of these statements?

    Judging from the quotes that Atwood included from the original paper, it doesn't sound like the original research is much better than the speculation you point out above:

    It has taken us some time to dare to believe in our own results. It now seems to us, although we are aware that [b]at this point we do not have sufficient data[/b], and so [b]it must remain a speculation[/b], that what distinguishes the three groups in the first test is their different attitudes to meaninglessness.

    WTF?



  • @Aaron said:

    Can you prove any of these statements?

    Do you know all 50 students personally, enough to know (a) which ones are motivated vs. which ones are lazy, and (b) which ones failed brutally vs. which ones passed or at least came close?

    And did you actually compile statistics on failure rates for the "hard-working" vs. "lazy" groups and find the difference to be statistically significant?

     

    Well, one of the differences is attending classes. Of course I don't know how hard some people worked at home, or whether they worked at all, but the students that still remain (and you can see who's still on board, because we have quite a few classes where not showing up gets you kicked out) are mostly the students who attended classes and stayed afterwards to review and code. Also, it has been pretty consistent that people failed or passed as groups.

    @Aaron said:


    Sounds to me like your rationale is essentially: They fail because they're lazy, and I know they're lazy because they failed.  If it's not outright circular reasoning then it's a representativeness heuristic (making judgments about their work ethic based on relatively superficial characteristics) combined with a generous helping of confirmation bias.  Either way, it's not very convincing.

     

    Actually, I didn't say that all people who failed were lazy - failing an exam isn't uncommon, and some people just blank during exams even though they knew the answers beforehand. Exams are retakeable, and some of them get harder or easier depending on the year in which you take them.

    Last year's software ergonomics exam simply featured naming GUI elements and visual phenomena, while the one I took had C# coding, memorizing and naming oddities, and cultural differences with common GUI elements (like how you're not supposed to show a picture of a mushroom inside a Japanese program). Which was actually positive, because I learned new things.

    I'm saying that the people who bailed out/dropped out did so as whole groups, which is easily provable, and that group dynamics played a large role. We had a group of people who showed up because "my parents know my schedule and make sure I attend classes" and vanished about 5 minutes before class was over. They went straight to McDonald's and then home. They had a ton of full coupon booklets, because of the money you save with those. They always made sure that one guy had a car, planning that in advance.



  • The most interesting thing about this group dynamic is that it even exists... especially during lower-level classes.  CS people usually aren't known for being especially outgoing (Info Systems retards usually are, though).

     

    What the fuck does McDonald's have to do with anything?



  • @fire2k said:

    -  symbols

    Most of that compiler-building stuff is only taught when you decide to do embedded systems in the 6th term; there are three other subjects to choose from: advanced networks, AI, or application software development.

     

    Holy meme-fuck! He used "embedded systems" in a NON-ironic manner. NOW what do we do?



  • @SQLDave said:

    @fire2k said:

    -  symbols

    Most of that compiler-building stuff is only taught when you decide to do embedded systems in the 6th term; there are three other subjects to choose from: advanced networks, AI, or application software development.

     

    Holy meme-fuck! He used "embedded systems" in a NON-ironic manner. NOW what do we do?

    Tell him the most vital aspect of embedded systems is getting the filesystem working properly?


  • @SQLDave said:

    Holy meme-fuck! He used "embedded systems" in a NON-ironic manner. NOW what do we do?
     

    Remove the filesystem! Quick, you fool!



  • @fire2k said:

    Last year's software ergonomics exam simply featured naming GUI elements and visual phenomena, while the one I took had C# coding, memorizing and naming oddities, and cultural differences with common GUI elements (like how you're not supposed to show a picture of a mushroom inside a Japanese program). Which was actually positive, because I learned new things.

    Oh?  What's the deal with the mushroom?



  • @Xyro said:

    @fire2k said:

    Last year's software ergonomics exam simply featured naming GUI elements and visual phenomena, while the one I took had C# coding, memorizing and naming oddities, and cultural differences with common GUI elements (like how you're not supposed to show a picture of a mushroom inside a Japanese program). Which was actually positive, because I learned new things.

    Oh?  What's the deal with the mushroom?

    Maybe they see it as a phallic symbol?



  • @Xyro said:

    @fire2k said:

    Last year's software ergonomics exam simply featured naming GUI elements and visual phenomena, while the one I took had C# coding, memorizing and naming oddities, and cultural differences with common GUI elements (like how you're not supposed to show a picture of a mushroom inside a Japanese program). Which was actually positive, because I learned new things.

    Oh?  What's the deal with the mushroom?

     

     Ever since Hiroshima, a mushroom is seen as a symbol of very bad things. And nuclear explosions. As opposed to my home country, where it means "very good fortune." And mixing those two up can be quite bad.

      Additional software ergonomics fact: the October Revolution was in November, and some people still use the Julian calendar.

     @Wikipedia said:

    Sources differ about whether Greece adopted the Gregorian calendar or the Revised Julian Calendar. Blackburn and Holford-Strevens (2003, p. 687) indicate Greece uses the Gregorian calendar for civil purposes, while the Nautical Almanac Offices of the UK and the US (1961, p. 416) indicate Greece uses the Revised Julian calendar.

      Source



  • @fire2k said:

     Ever since Hiroshima, a mushroom is seen as a symbol of very bad things. And nuclear explosions. As opposed to my home country, where it means "very good fortune." And mixing those two up can be quite bad.

    In my country, faulty floor mats have come to be a symbol of very bad things.  In Japan however...yeah, still bad things.

    @fire2k said:

     @Wikipedia said:

    Sources differ about whether Greece adopted the Gregorian calendar or the Revised Julian Calendar. Blackburn and Holford-Strevens (2003, p. 687) indicate Greece uses the Gregorian calendar for civil purposes, while the Nautical Almanac Offices of the UK and the US (1961, p. 416) indicate Greece uses the Revised Julian calendar.

     

    What we can all agree on is that both calendars are gay.



  • @fire2k said:

    @Xyro said:

    @fire2k said:

    Last year's software ergonomics exam simply featured naming GUI elements and visual phenomena, while the one I took had C# coding, memorizing and naming oddities, and cultural differences with common GUI elements (like how you're not supposed to show a picture of a mushroom inside a Japanese program). Which was actually positive, because I learned new things.

    Oh?  What's the deal with the mushroom?

     

     Ever since Hiroshima, a mushroom is seen as a symbol of very bad things. And nuclear explosions. As opposed to my home country, where it means "very good fortune." And mixing those two up can be quite bad.

     

     

    If that's true, then how do you explain all of the positive mushroom references in Super Mario Bros. and its sequels?

     

    Unless SMB was meant to have taken place in a post-apocalyptic wasteland where all of the enemies were victims of radiation-induced mutations... hmm...



  • @Tyler said:

    @fire2k said:

    @Xyro said:

    @fire2k said:

    Last year's software ergonomics exam simply featured naming GUI elements and visual phenomena, while the one I took had C# coding, memorizing and naming oddities, and cultural differences with common GUI elements (like how you're not supposed to show a picture of a mushroom inside a Japanese program). Which was actually positive, because I learned new things.

    Oh?  What's the deal with the mushroom?

     

     Ever since Hiroshima, a mushroom is seen as a symbol of very bad things. And nuclear explosions. As opposed to my home country, where it means "very good fortune." And mixing those two up can be quite bad.

     

     

    If that's true, then how do you explain all of the positive mushroom references in Super Mario Bros. and its sequels?

     

    Unless SMB was meant to have taken place in a post-apocalyptic wasteland where all of the enemies were victims of radiation-induced mutations... hmm...

    SMB is actually an LSD-induced trip. That's why it's so colorful and cheery and surreal.



  • @Mason Wheeler said:

    Good.  People like that shouldn't be writing code, and if a class manages to weed them out early on, that makes it that much better for the rest of us who might otherwise end up having to use something they wrote!

    They will just start writing code before the rest of the class graduates.



  • @Tyler said:

    If that's true, then how do you explain all of the positive mushroom references in Super Mario Bros. and its sequels?

    Unless SMB was meant to have taken place in a post-apocalyptic wasteland where all of the enemies were victims of radiation-induced mutations... hmm...

     Drugs. Lots of them.

    @Wikipedia said:


    Shigeru Miyamoto stated in an interview that the Super Mushroom was created by chance. The first sketches of Mario turned out to be too big, and they were forced to shrink them. Then the development team thought it would be interesting to have Mario grow and shrink by eating a magic mushroom.

    Source
