"The Real World" vs "School"...what I've learned...


  • I survived the hour long Uno hand

    I don't remember. I wrote down some of the studies we went over in class but not the name of the textbook.



  • Poop.


  • I survived the hour long Uno hand

    I did find this amusing excerpt from my notes for week two:

    HCI wk 2

    don’t be late
    Master’s students should be abl to do a literature review

    • not 3 pages of definitions or 3 pages regurgitated from slides
    • not making use of wikipedia
    • this guy is an asshole
    • :(

  • ♿ (Parody)

    @Yamikuronue said:

    I wrote down some of the studies we went over in class but not the name of the textbook.

    Selling back the books for your main curriculum is TRWTF.


  • I survived the hour long Uno hand

    I think I got that one from the library in the first place. After a number of dry textbooks I figured if something struck me as useful I'd buy a copy once I moved back home rather than try to cart them overseas.



  • @boomzilla said:

    a guy took out an overcooked cake (looked like brownies to me, but he called it "cake") from the pan and used it to hammer a nail into the door jamb.

    I used to know someone who claimed to have broken a window with a pancake once. (Apparently they’d forgotten that it was on the stove, and then found it was hard enough to play frisbee with.)



  • @boomzilla said:

    a guy took out an overcooked cake (looked like brownies to me, but he called it "cake") from the pan and used it to hammer a nail into the door jamb

    The cake is a lie.



  • @dkf said:

    You're a young whippersnapper! In my day, we had to make whatever we were running fit into 640kB because working with high memory was so awful.

    In my day, we had to write overlay managers to manage the bottom two 4KiB banks of RAM available in the Language Card.



  • I like building brainfuck interpreters in XSLT too, but I don't try to use it as a solution to an actual problem.


  • :belt_onion:

    @Yamikuronue said:

    I did all those in school.

    Same here. These must be some crappy schools if they don't teach ANYTHING useful.... This topic is making me seriously respect my university's apparent quality of CS education. We had 2 required classes that were essentially semester-long projects that were run like a business, which were supposed to simulate real-world experience. I had no problem fitting in with senior developers within a few months on my first job.

    There are also a lot of problems I've solved at work using knowledge garnered from university about low level design and architecture that a lot of my co-workers don't have due to the lack of CS/IT focused college studies.

    The main thing that wasn't taught in school is the difficulty of workplace bureaucracy... but that also depends on where you work... and can be learned in a few Dilbert strips ;)


  • :belt_onion:

    @darkmatter said:

    The main thing that wasn't taught in school is the difficulty of workplace bureaucracy

    oh, and various things like Sarbanes-Oxley compliance bullshit requirements.

    I think they STILL print every single motherfucking code change on the payroll/accounting side of things. As if it matters what's on a sheet of paper in a folder in a cabinet somewhere when we have a HUGE MOTHERFUCKING CHANGE MANAGEMENT SYSTEM SPECIFICALLY FOR TRACKING THE CHANGES TO CODE.

    deep breath

    God that still blows my mind.

    I had one of the financial system auditors actually ask me to print an RPG-IV program for him, and then request that I "write in the margins to describe how it works" so he could audit whether it did what it says it does..... I asked him how familiar he was with RPG, and he basically had no fucking clue. Mind... Blown...... talk about creating work just to validate his own existence, as if there's any fucking chance he could tell whether that program works as intended as opposed to dumping all our corporate income into my personal account in the Caymans. I assume he was figuring it'd be all commented like

    //begin fraudulent stuff here
    save(moneyFromCompany) into (My Bank Account)
    //end stealing

    or some shit. I don't even.



  • @John2 said:

    Things like refactoring, design patterns, unit testing, and so forth. A lot of those things aren't learned in the "classroom" as far as I can tell.

    I must have been taking classes in an alternate universe because I've taken most of those courses.



  • @accalia said:

    but that's a think that schooling should teach you, no?

    This @accalia seems oddly apropos.



  • @NedFodder said:

    they could care less

    AAAAAAGGGGGGHHHHHHH!!!



  • I could care more about the degree.



  • I didn't, but I did Electronic, Electrical, Mechanical and Software Engineering.
    The emphasis was on the hardware, e.g. bridges and killbots.
    So take solace in the likelihood that the killbot chasing you will probably have poorly factored firmware.



  • As I said, though, I'm just getting started with this. I haven't seen the entire curriculum. As far as I know, the things I've mentioned could indeed be taught later.



  • @John2 said:

    refactoring

    We did that.

    @John2 said:

    design patterns,

    We did that.

    @John2 said:

    unit testing

    We did that with two different frameworks.

    @accalia said:

    Database design, and why 3NF is sometimes the wrong choice...

    Okay, what the fuck are they teaching you over there in the US? How to boot a computer?


  • FoxDev

    @Maciejasjmj said:

    How to boot a computer?

    oddly, no. you're assumed to already know how to do that.

    it's not so much a case of not teaching, but of teaching the how without teaching the why.

    so the school churns out a bunch of people filled to the brim with design patterns and fancy new languages/techniques, with no sense of when to apply them and when to adapt to existing conditions.

    For example: They know the Factory pattern, but not how to recognize the problem the factory pattern is designed to solve.


  • ♿ (Parody)

    @accalia said:

    so the school

    Plus, there are a lot of schools. Well, my degree wasn't in CS (or any other IT-related thing). I took, like, CS101 and CS102 from that department, but that's about it.



  • @accalia said:

    oddly, no. you're assumed to already know how to do that.

    I did fail to do so in one course in the sixth semester. The power button on the case was deviously hidden.

    Anyway, here they did tell us what problem the Factory or Decorator or whatever deals with. Actually recognizing that problem when you encounter it is not a skill that can be taught.



  • @blakeyrat said:

    I actually don't believe in denormalization.

    Is this a bit of an unsuccessful troll?

    I slightly furthered the denormalizing of our databases to avoid joining two more tables, since a common query just wanted to see if certain rows existed in the indirect table. Much quicker to just add another column that gets populated as the other table is updated instead of having to join many tables on every request!
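
    Roughly like this, if you want a sketch (table and column names invented for the example, not our actual schema):

    -- The common query only asked "does this customer have any order lines?",
    -- which used to mean joining through two extra tables:
    SELECT c.id,
           EXISTS (SELECT 1
                     FROM customer_order co
                     JOIN order_line ol ON ol.order_id = co.order_id
                    WHERE co.customer_id = c.id) AS has_lines
      FROM customer c;

    -- Instead, keep a flag column, set from the same code path that inserts order lines:
    ALTER TABLE customer ADD COLUMN has_lines BOOLEAN NOT NULL DEFAULT FALSE;
    UPDATE customer SET has_lines = TRUE WHERE id = :customer_id;  -- :customer_id is a bind parameter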

    As an aside, we have a position open for a database administrator ("proper DBA" - none of us can seriously claim that title)



  • 3NF is for lazy people. 5NF is better...

    About denormalizing: it's not only the data that matters but also the expected queries, which are even less likely to appear in the specifications (if you're lucky enough to have any). Schools look at the data but usually forget to consider any queries against it.

    The one big rule of databases: do not store an Age, store a timepoint!
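
    A minimal sketch (the person table is invented, and the date functions shown are PostgreSQL's):

    CREATE TABLE person (
        name          TEXT,
        date_of_birth DATE  -- the timepoint; never a precomputed age
    );

    -- Derive the age at query time instead of storing a value that goes stale:
    SELECT name,
           date_part('year', age(current_date, date_of_birth)) AS age
      FROM person;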



  • @accalia said:

    oddly, no. you're assumed to already know how to do that.

    Funny, "CSC101: How to Turn On Your Computer" is the name of the course that getting a perfect score on my AP Computer Science exam got me out of! It got me into "CSC102: Creating a New Word Document"!

    actually it got me out of Fundamentals of Computing and put me in an Intro to Programming class, but it still felt that way. Luckily, I was able to cut a deal with the department head -- do the final project in a day -- and made it into a real programming class instead.



  • @Zemm said:

    I slightly furthered the denormalizing of our databases to avoid joining two more tables, since a common query just wanted to see if certain rows existed in the indirect table.

    Before or after benchmarking?

    @Zemm said:

    Much quicker to just add another column that gets populated as the other table is updated instead of having to join many tables on every request!

    And you have the data to prove it?

    Look, I'm not saying you're doing it wrong. Necessarily. But virtually always, when I meet someone who's really excited about denormalization, they've done zero benchmarking, zero research. It's the worst kind of premature optimization.

    And 99% of the time, when these people claim it's faster, they have no numbers to prove it.
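
    And "numbers" means at least a before-and-after measurement on realistic data volumes, something like this (PostgreSQL shown, reusing the hypothetical customer tables sketched above):

    -- The join version:
    EXPLAIN ANALYZE
    SELECT c.id
      FROM customer c
     WHERE EXISTS (SELECT 1
                     FROM customer_order co
                     JOIN order_line ol ON ol.order_id = co.order_id
                    WHERE co.customer_id = c.id);

    -- The denormalized version:
    EXPLAIN ANALYZE
    SELECT id FROM customer WHERE has_lines;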



  • Anything other than Java is a rare thing to learn in CS schools these days...



  • @mott555 said:

    About the only thing I learned in school that I wouldn't have learned in the real world is proper 3NF database design. That's literally it.

    To add on to this, I had plenty of decent classes in college. The problem was I was also hired as a programmer by the university I was attending, and I was learning things on the job long before we'd cover them in class. The little 35-line homework assignments to demonstrate switch statements in Programming II are pretty much child's play when you're already dealing with university records via ADO.NET.



  • @Yamikuronue said:

    It was called Human-Centipede Interaction

    How I initially read it.



  • @accalia said:

    ... How to think critically about a problem....

    Oh that's universally discouraged.

    If anyone does think of a new way to do something and it gets the stamp of approval, it just becomes the next thing to suppress critical thinking.

    Seriously, I'd rather a child fail to solve a problem after trying 100 ways than learn to solve it by an exact formula.

    Because last time I checked, WD-40 gets its name from 39 failures to displace water.



  • @boomzilla said:

    I'm not sure how it's taught or if it's even generally possible

    You throw someone into a lake and say "swim", ready to pull them out if needed, without letting them realize this.

    You give them a small set of example problems that shows each mechanic needed to solve that class of problem, then give them a massive version and get the hell out of their way.

    Give them an impossible test that counts towards their grade, but secretly weight it so low that it doesn't affect them.

    You put up boundaries, and say "this area inside is safe", then let them fall all over themselves running around, and exploring what they can do.

    So, basically....

    you give them room to fail.



  • BCNF for life!



  • Our first year of Compsci in NZ was 'how to use Java and Swing' - yes, it really was that bad.

    Second year, we had random courses which taught us basic things about C, Ruby, Bash, discrete mathematics, networking, databases, and whatever else they thought might fit into a semester.

    Third year, AI, OS, and a few other, worse classes. We got two assignments in Prolog, without knowing it existed, and got a basic introduction to 'How C# and Java differ, and by the way this is Linq it's important because reasons'.

    That's a bachelor's degree over there.



  • @TwelveBaud said:

    CSC102: Creating a New Word Document

    That's actually CS 101 here.



  • i have no data. but when i denormalize it's for convenience, not speed.
    i keep aggregates which shouldn't exist if you blindly follow the rules, but make your queries simpler.
    if you denormalize for speed, the odds are that you're doing it wrong.



  • normalizing is overrated.

    data should be divided into logical entities that accept actions.

    completely normalizing can result in chopstick race conditions.



  • You are doing it wrong.

    "Convenience" isn't a good reason to throw data integrity into the crapper.



  • Hold up here.... looking through my notes.

    Data.... should be... blah blah.... and... blah blah...

    Oh yes.

    Programming is a balance between ensuring the ability of the programmer to maintain the code, and the ability of the computer to efficiently and precisely execute it.

    No mention of "convenience" here either. I mean, it's of no concern to me how convenient it is for the computer to execute the code.

    @Jarry said:

    but make your queries simpler.

    Unless that means, "make it easier for a programmer to understand the query so they can correctly modify it" or "ensure a computer can execute a change on it with less risk for error"...

    I'm afraid blakey is right.



  • @xaade said:

    "make it easier for a programmer to understand the query so they can correctly modify it"

    that, easier for me and my team to develop and maintain

    @blakeyrat said:

    throw data integrity into the crapper.

    i get your point. but the integrity doesn't go to the crapper. the not denormalized parts have the same integrity. in the other parts, you have to keep the integrity in the application code, it's not so hard if you have only one application accessing the db (as is the case with almost all the systems i have to do).

    and, in case of failure, the denormalized data can always be recalculated.

    all in all, it's a trade-off. i know what i'm giving up. but it seems the right choice given the things i'm doing.
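
    for the record, the recalculation is a single statement, reusing the same made-up names from the sketches above:

    -- rebuild the flag from the source tables if it ever drifts
    UPDATE customer c
       SET has_lines = EXISTS (SELECT 1
                                 FROM customer_order co
                                 JOIN order_line ol ON ol.order_id = co.order_id
                                WHERE co.customer_id = c.id);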



  • @Jarry said:

    denormalized data can always be recalculated

    Have you tried RavenDB?

    No.

    Go and try it. Get back to me.



  • @Jarry said:

    that, easier for me and my team to develop and maintain

    has to be balanced, in opportunity cost and risk, against

    @Jarry said:

    in case of failure



  • @ben_lubar said:

    That's actually CS 101 here.

    Okay, technically it's CSC1100 here.



  • ok. looks interesting.



  • facepalm



  • @Jarry said:

    that, easier for me and my team to develop and maintain

    Show me the numbers to back that up. You're asserting it like it's a fact, so you must have data backing it... right?

    @Jarry said:

    i get your point. but the integrity doesn't go to the crapper.

    Yes, it does. Either that or you're reimplementing normalization (poorly) using, for example, triggers or long-winded stored procedures.

    @Jarry said:

    the not denormalized parts have the same integrity.

    "not denormalized".

    Anyway, I'd love to see a demonstration of what you're talking about, because I don't see how that could possibly be correct.

    @Jarry said:

    in the other parts, you have to keep the integrity in the application code, it's not so hard if you have only one application accessing the db (as is the case with almost all the systems i have to do)

    If you're using NoSQL, this is the preferred way to do it. Not because you want to, but because you have to.

    You're using a system designed for consistency and removing its consistency. Regardless of how many applications are using the database, that strikes me as a bad idea.

    @Jarry said:

    all in all, it's a trade-off. i know what i'm giving up. but it seems the right choice given the things i'm doing.

    Show me the data. I don't live my life by "seems to be".



  • 🚎 ?



  • RavenDB is a document-style database.
    It has you distributing changes to contained "references" by trigger.

    This happens to fail spectacularly, which is proof that cascade updating isn't something to be thoughtlessly automated. Lock contention, oh my.

    Which is blakey's point.

    If you need this kind of functionality, you need to use a database that was designed to operate in this manner.

    I'm not saying that normalization is the end-all. I'm saying that in SQL it almost always is.

    If you need another way to store and interact with data, you need another database provider.


  • ♿ (Parody)

    @xaade said:

    I'm not saying that normalization is the end-all. I'm saying that in SQL it almost always is.

    Yeah, I'm not clear on what sort of thing he's denormalizing. If it's...removing a phone number table and sticking that on a person's record...OK, probably no big deal, and possibly less of a hassle to deal with. But talking about "aggregates" makes me suspicious.



  • @boomzilla said:

    But talking about "aggregates" makes me suspicious.

    These "JOIN" things get confusing. If I LEFT JOIN, sometimes the data is null in some of the columns, but if I don't, I won't have access to the data from the left side of the relationship for records that don't have a foreign key record. I need to have a list of every customer regardless of whether they input a phone number.

    Um, just query the original table?

    But then I have more than one view to maintain. Let's just denormalize the many-to-many relationship. It's OK, I can SELECT DISTINCT on the customer side of the data.
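
    For reference, the LEFT JOIN already does exactly what our confused developer wants (hypothetical customer and phone_number tables):

    -- every customer appears; p.number is simply NULL for those who never entered one
    SELECT c.name, p.number
      FROM customer c
      LEFT JOIN phone_number p ON p.customer_id = c.id;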


  • Java Dev

    I suspect he's optimizing away the inline subquery in:

    SELECT parent.id,
        (SELECT COUNT(*) FROM child WHERE parent_id=parent.id) num_children
    FROM parent;
    


  • Why would you write that with a subquery? Ugh.

    SELECT parent.id, COUNT(child.id)
    FROM parent
    LEFT JOIN child ON blah = blah
    GROUP BY parent.id
    

    Maybe the argument for denormalization is "we're really bad at writing SQL".

