Bad Textbook



  • The textbooks for both my "Object Oriented Programming With C++" class and my data structures and algorithms class were written by the same author. 

     

    In the object-oriented class he explained composition and inheritance by saying that one is an "is-a" relationship and the other is a "has-a" relationship, but then in the first homework assignment after that we were supposed to derive a circle from the point class. 
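
    The homework contradicts the chapter's own rule: a circle is not a point, but it has one (its center), so composition fits better than inheritance. A minimal sketch of the "has-a" version (my own code, not the textbook's):

```cpp
#include <cmath>

// Hypothetical "has-a" fix: Circle holds a Point instead of deriving from
// it. (Point and Circle here are my illustration, not the book's classes.)
struct Point {
    double x = 0.0, y = 0.0;
};

class Circle {
public:
    Circle(Point center, double radius) : center_(center), radius_(radius) {}

    // True if p lies inside or on the circle.
    bool contains(Point p) const {
        double dx = p.x - center_.x, dy = p.y - center_.y;
        return std::sqrt(dx * dx + dy * dy) <= radius_;
    }

private:
    Point center_;   // composition: a Circle *has a* Point
    double radius_;
};
```

    Deriving Circle from Point would claim every circle *is* a point, which is exactly the relationship the chapter said inheritance should express only when it is true.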

    In the data structures and algorithms class, one of the examples was an EmployeeLinkedList class that looked almost exactly like a Coding Horror.

     Here are the queue ADT header and the n-queens example he wrote:

     

    //Header file: stackADT.h
    
    #ifndef H_queueADT
    #define H_queueADT
      
    //*************************************************************
    // Author: D.S. Malik
    //
    // This class specifies the basic operations on a queue.
    //*************************************************************
    
    template <class Type>
    class queueADT
    {
    public:
        virtual bool isEmptyQueue() const = 0;
          //Function to determine whether the queue is empty.
          //Postcondition: Returns true if the queue is empty,
          //    otherwise returns false.
    
        virtual bool isFullQueue() const = 0;
          //Function to determine whether the queue is full.
          //Postcondition: Returns true if the queue is full,
          //    otherwise returns false.
    
        virtual void initializeQueue() = 0;
          //Function to initialize the queue to an empty state.
          //Postcondition: The queue is empty
    
        virtual Type front() const = 0;
          //Function to return the first element of the queue.
          //Precondition: The queue exists and is not empty.
          //Postcondition: If the queue is empty, the program 
          //    terminates; otherwise, the first element of the queue
          //    is returned.  
    
        virtual Type back() const = 0;
          //Function to return the last element of the queue.
          //Precondition: The queue exists and is not empty.
          //Postcondition: If the queue is empty, the program 
          //    terminates; otherwise, the last element of the queue
          //    is returned.
    
        virtual void addQueue(const Type& queueElement) = 0;
          //Function to add queueElement to the queue.
          //Precondition: The queue exists and is not full.
          //Postcondition: The queue is changed and queueElement is 
          //    added to the queue.
    
        virtual void deleteQueue() = 0;
          //Function to remove the first element of the queue.
          //Precondition: The queue exists and is not empty.
          //Postcondition: The queue is changed and the first element
          //    is removed from the queue.
    };
    
            
    #endif
    
    #include <iostream>
    #include <cmath>
    #include "nQueenPuzzle.h"
     
    using namespace std;
    
    nQueensPuzzle::nQueensPuzzle()
    {
    	noOfQueens = 8;
    	queensInColumn = new int[8];
    	noOfSolutions = 0;
    }
    
    nQueensPuzzle::nQueensPuzzle(int queens)
    {
    	noOfQueens = queens;
    	queensInColumn = new int[noOfQueens];
    	noOfSolutions = 0;
    }
    
    bool nQueensPuzzle::canPlaceQueen(int k, int i)
    {
    	for(int j = 0; j < k; j++)
    		if((queensInColumn[j] == i)
    			|| (abs(queensInColumn[j] - i) == abs(j-k)))
    			return false;
    	return true;
    }
    
    void nQueensPuzzle::queensConfiguration(int k)//, int queens)
    {
    	for(int i = 0; i < noOfQueens; i++)
    	{
    		if(canPlaceQueen(k, i))
    		{
    			queensInColumn[k] = i;
    			if(k == noOfQueens - 1)
    				printConfiguration();
    			else
    				queensConfiguration(k + 1);
    		}
    	}
    }
    
    void nQueensPuzzle::printConfiguration()
    {
    	noOfSolutions++;
    	cout<<"(";
    	for(int i = 0; i < noOfQueens - 1; i++)
    		cout<<queensInColumn[i]<<", ";
    
    
    	cout<<queensInColumn[noOfQueens - 1]<<")"<<endl;
    }
    
    int nQueensPuzzle::solutionsCount()
    {
    	return noOfSolutions;
    }
     

    One of the examples reminded me of the first time I decided that my one-main-method "programs" weren't right. I had a vague idea that they needed more functions, so I rewrote one, then realized that it was basically exactly the same because I was just passing all the data from function to function. 

    One example looked like:

    initialize some variables

     calculations to assign them values

    initialize another variable

    print the previous values

    calculations involving the next variable.

     

    I feel sorry for students who haven't read any other books, so they have nothing to compare this to. 

    I kept wondering if the ijk variables, weird interfaces, and logic explained why he needed to make the text so fluffy and repetitive.

     



  • TRWTF is teaching C++ to students in 2013.



  • How are we supposed to prove that we can use pointers and recursion?



  • @Chame1eon said:

    How are we supposed to prove that we can use pointers and recursion?

     

    I think Go has those.

     



  • @Chame1eon said:

    How are we supposed to prove that we can use pointers and recursion?

     

    Because obviously no other language has these.


     



  • @BC_Programmer said:

    @Chame1eon said:

    How are we supposed to prove that we can use pointers and recursion?

     

     

    Because obviously no other language has these.


     

     

    I was half kidding.  I'm not actually sure why they seem to test this ability for jobs that don't require it.  

    More seriously, I actually liked the C++ Primer because it described everything (lvalues, integral types, etc.) that Java books seem to skim over.  

    I thought C and Python might make more sense. C has low-level stuff in more detail, and Python is very expressive and simple but has functional features and objects.

     

    After I started reading that, I started reading similar chapters in the C++ Primer instead. I tried the same with Intro to Algorithms, but they were not similar enough.



  • @Chame1eon said:

    I thought C and Python might make more sense. C has low-level stuff in more detail, and Python is very expressive and simple but has functional features and objects.

    If it was up to me, colleges would teach assembly, C, C# and Javascript, to give a full range of experience. They'd also teach stuff like SQL, VCS, automated testing, documentation and working on a software engineering team.



  • @morbiuswilters said:

    @Chame1eon said:
    I thought C and Python might make more sense. C has low-level stuff in more detail, and Python is very expressive and simple but has functional features and objects.

    If it was up to me, colleges would teach assembly, C, C# and Javascript, to give a full range of experience. They'd also teach stuff like SQL, VCS, automated testing, documentation and working on a software engineering team.

     

    I've looked at the assembler produced by C compilers and in debuggers while reading buffer-overflow articles, and I seem to know about as much about x86 as MIPS, which was covered in one of my classes, but I don't know why I would want to know more.  I can't think of any way to manipulate anything at that level.

     

    Do you mind explaining more about this?



  • @morbiuswilters said:

    If it was up to me, colleges would teach assembly, C, C# and Javascript, to give a full range of experience. They'd also teach stuff like SQL, VCS, automated testing, documentation and working on a software engineering team.

    If I had my way, incoming freshmen would play Rocky's Boots for the Commodore 64 (not that shitty Apple ][) until they could beat every challenge, then read "The Way Things Work", the chapter on how to build a computer out of half-adders, then skip directly to memory-managed languages (probably C#) with a gigantic warning to never use a non-managed language unless you had no alternative.

    Plus everything in Morbs' second sentence there. The SQL/data storage philosophy class I took was the only thing of use I got from my university career.



  • @morbiuswilters said:

    If it was up to me, colleges would teach assembly, C, C# and Javascript, to give a full range of experience. They'd also teach stuff like SQL, VCS, automated testing, documentation and working on a software engineering team.

    My university has classes on all of those (except C#, Java instead), all in the first or second year to give you a basic understanding of software engineering. They don't all have those?



  • @Chame1eon said:

    ...and I seem to know about as much about x86 as MIPS which was covered in one of my classes...

    It seems you learned assembly, then, right?

    @Chame1eon said:

    Do you mind explaining more about this?

    I think it's useful to understand how higher-level languages are translated into machine instructions.



  • @dtech said:

    They don't all have those?

    HA HA HA HA HA HA.



  • @blakeyrat said:

    Rocky's Boots

    I had a game for DOS sorta like that. You had to wire up robots using logic gates, and make them complete tasks for you. Also, it was set in some hellish future where robots ruled the world and would kill you on-sight, so you had to use your robot creations to hide you.

    Edit: Holy crap, I was able to find it easily on Google: Robot Odyssey. Apparently it was made by the same company that made Rocky's Boots and it used the same game engine.



  • @morbiuswilters said:

    @Chame1eon said:
    ...and I seem to know about as much about x86 as MIPS which was covered in one of my classes...

    It seems you learned assembly, then, right?

    I can read sections of it with a reference card. So..

    @Chame1eon said:

    Do you mind explaining more about this?

    I think it's useful to understand how higher-level languages are translated into machine instructions.

     

    I meant the list of languages. Is it because the languages themselves are useful, or is it because they would be ideal for learning certain concepts or features in the general sense? 

    Javascript seems kind of annoying to me, and I don't know of any unique abilities that I would want, but it's useful.

     



  • Discourse touched me in a no-no place

    @Chame1eon said:

    both written by the same author
    Herbert Schildt? Yashwant Kanetkar?



  • @PJH said:

    @Chame1eon said:
    both written by the same author
    Herbert Schildt? Yashwant Kanetkar?
     

    John McAfee.



  • @Chame1eon: I never spent any real time learning assembly or C, so I can't comment.

    But C# is the best statically typed language out there. The framework is, for the most part, well designed.

    SQL and understanding at least third normal form is pretty important if you are doing any database work (which is pretty much a given if you are doing business applications). I am not amazing with databases, but I can usually do a decent job if I am required to create one for a web app.

    JavaScript is one of those languages that is extremely expressive, but unless you know what you are doing you are probably creating a WTF in the process. I had a really hard time getting my head around it until I read JavaScript Patterns.



  • @lucas said:

    @Chame1eon:

     

    Does this HONESTLY look like Twitter to you?  Jesus Marion Christ, social networks are ruining pure-topic discussion forums.

     



  • @Chame1eon said:

    Is it because the languages themselves are useful or is it because they would be ideal for learning certain concepts or features in the general sense?

    Both. C is the grandfather to so many modern languages, and C is still used so widely. It forms the basis of practically every OS in existence. And understanding fundamental C concepts can help you understand things that the OS provides, like file systems, memory management, networking.

    C# is a good example of a modern, high-level, strictly-OO, class-based language. It's in the same vein as Java, although with fewer rough edges. Most development is done in Java or C#, and if you already know one you have a leg up on learning the other. Learning either is going to be a big benefit for finding employment.

    JS is almost on the opposite end of the spectrum from C#. It has a different OO paradigm, being prototype-based instead of class-based. There are few prototypal languages out there and I think learning the fundamentals not only broadens your horizons, it helps you understand class-based OO better, too. Also, JS is a very, very dynamic language with many unique features, such as closures. It's a great way to get exposure to dynamic programming, and to learn both its benefits and pitfalls. Also, JS is used in a lot of development nowadays, so it's a useful job skill.
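
    For what it's worth, closures aren't JS-only: even the C++ this thread started with has had an equivalent since C++11, a lambda capturing local state. A toy sketch (makeCounter is my own example, not from any book mentioned here):

```cpp
#include <functional>

// A lambda that captures `count` by value keeps its own copy alive after
// makeCounter returns -- the same closure mechanism JS relies on.
std::function<int()> makeCounter() {
    int count = 0;
    return [count]() mutable { return ++count; };  // mutable: may modify the captured copy
}
```

    Each call to makeCounter() produces an independent counter, just as a JS function returning an inner function would.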



  • @PJH said:

    @Chame1eon said:
    both written by the same author
    Herbert Schildt? Yashwant Kanetkar?

    The Deitels. (Actually, their books are really good.)



  •  @drurowin said:

    Does this HONESTLY look like Twitter to you?  Jesus Marion Christ, social networks are ruining pure-topic discussion forums. 

     No, but I did quick reply and there were two additional posts by the time I typed mine. It was a quick way for me to indicate who I was replying to. Anyway, using "@" to refer to someone didn't come from using twatter.

     



  • I wouldn't teach any C++ though. I'd use C as an example of "what goes on under the hood" since it's rather simple versus C++ (I still don't know how classes in C++ work, but C's structures make sense to me).

    I haven't found anyone who can explain why we still need pointers and manual memory management in modern coding, or who can back up the necessity of things that C abstracts away, such as jumps and registers.

    We need a low-level language that compiles to native machine code like C, only one that takes care of memory management itself. I mean, Go does it, so why can't a sane language do so?

    Also, JavaScript is on the way to becoming the new One True Language™, so otherwise you'll end up like me limiting your career because you refuse to work with or even learn the current One True Language™.



  • @lucas said:

     @drurowin said:

    Does this HONESTLY look like Twitter to you?  Jesus Marion Christ, social networks are ruining pure-topic discussion forums. 

     No, but I did quick reply and there were two additional posts by the time I typed mine. It was a quick way for me to indicate who I was replying to. Anyway, using "@" to refer to someone didn't come from using twatter.

     

     

    @ as a prefix means they're an operator on the IRC channel you're on.

     



  • @MiffTheFox said:

    I wouldn't teach any C++ though.

    That was the first thing I said.

    @MiffTheFox said:

    We need a low level language that compiles into machine-level bytecode like C, only that takes care of memory management itself. I mean, Go does it, so why can't a sane language do so?

    GC is a complex topic. A good, GC'd systems language could probably work (not Go) but there'd still be a need for C. Some things require manual memory management.
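
    In C++ terms there is already a middle ground between a collector and raw new/delete: scope-bound ownership (RAII), where memory is released deterministically when its owner goes out of scope. A sketch under that assumption (sumFirst is my own example name):

```cpp
#include <cstddef>
#include <memory>

// unique_ptr owns the allocation; no delete[] appears anywhere, yet the
// memory is freed the moment `b` goes out of scope -- no GC pause needed.
int sumFirst(std::size_t n) {
    std::unique_ptr<int[]> b(new int[n]);
    int total = 0;
    for (std::size_t i = 0; i < n; ++i) {
        b[i] = static_cast<int>(i);
        total += b[i];
    }
    return total;  // allocation released here, deterministically
}
```

    That determinism is one reason manual (or at least non-GC) management stays necessary for things like drivers and tight embedded loops.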



  • I agree 100% with what morbius has said, with Python as a potential alternative to C#.

    I'd also throw a completely different language like Prolog or Haskell to the mix, just to show there is life beyond imperative programming.

    Implementing a Lisp would be a good exercise too, perhaps even in the shape of a dumb JIT to some fictional ISA.



  • @morbiuswilters said:

    TRWTF is teaching C++ to students in 2013.

    Eh, C++ is good as an intro to C, with cute little couts and a string class instead of those weird printfs. I started with C++ and it's a good first language - it has a lot of capabilities that you don't necessarily have to use.

    And C was pretty damn useful in our college, when one of the courses basically started with "Here's a not-even-marginally useful textbook, our first project will be a uC-based line-following robot, due next month, good luck". All we knew about electronics was what a resistor looks like, so it was really helpful to at least not have to learn a whole new language.

    That's not to say my WTFU doesn't have its WTFs. "Low-level programming" test consisted of finding bugs in convoluted C code, "Operating systems" consisted of using way-deprecated SysV mechanisms, and "Intro to Programming" was led by a man who still writes a Delphi book every year, since the time it was still "Object Pascal". But we've covered C, C++, assembly, SQL, Java - sadly not C#, but you can't have everything - and

    adfghajrbgajergiaebrohmygodtheflashbacksfafjaafdconc([X|L1],L2,[X|L3]):-conc(L1,L2,L3).gaaaaah

    ...and yes, Prolog.



  • @drurowin said:

    @ as a prefix means they're an operator on the IRC channel you're on.

     

    The total number of fucks I give is zero. Everyone knew what I meant, stop being a bellend.

     



  • Imperial College's [url=http://www3.imperial.ac.uk/computing/teaching/ug/mengcompse]Computing degree[/url] attempts to just throw stuff at you - in the first year they teach [url=http://www3.imperial.ac.uk/computing/teaching/courses/120_1]Haskell[/url], [url=http://www3.imperial.ac.uk/computing/teaching/courses/120_2]Java[/url] and [url=http://www3.imperial.ac.uk/computing/teaching/courses/120_3]C[/url], but then the second year has [url=http://www3.imperial.ac.uk/computing/teaching/courses/275]C++[/url] and [url=http://www3.imperial.ac.uk/computing/teaching/courses/276]Prolog[/url].

    I'm not sure how well the "throwing everything at the wall and seeing what sticks" approach works, though. Hopefully I'll find out next year :P

    As for the programming robots thing, my school (UK Year 6/5th Grade in the States?) taught basic LOGO/BASIC stuff alongside touchtyping. The standard easy drag 'n' drop stuff with traffic lights and buttons and making traffic lights go different colours.

    EDIT: Wait, CS removes line breaks from posts?...



  • @lucas said:

    The total number of fucks I give is zero.
     

    Is your FuckArray zero-indexed? Do you give no fucks, or do we owe you a given fuck?



  • @blakeyrat said:

    @morbiuswilters said:
    If it was up to me, colleges would teach assembly, C, C# and Javascript, to give a full range of experience. They'd also teach stuff like SQL, VCS, automated testing, documentation and working on a software engineering team.

    If I had my way, incoming freshmen would play Rocky's Boots for the Commodore 64 (not that shitty Apple ][) until they could beat every challenge, then read "The Way Things Work", the chapter on how to build a computer out of half-adders, then skip directly to memory-managed languages (probably C#) with a gigantic warning to never use a non-managed language unless you had no alternative.

    And then they'd all go on to write Shlemiel the Painter's Algorithm over and over, and every embedded controller would have to have at least a gig of memory, a full pre-emptive MT OS, and a multi-GHz CPU.

    You gotta understand what's going on under the hood in much more depth than just knowing a few boolean logic gates. 
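
    For anyone who hasn't met Shlemiel: the classic instance is C's strcat(), which rescans the destination from the beginning on every call, turning n appends into O(n^2) work; remembering where the string ends makes it O(n). A sketch (the function names are mine):

```cpp
#include <cstring>
#include <string>
#include <vector>

// Shlemiel's way: strcat() walks dst from the start on every append.
std::size_t shlemielAppend(char *dst, const std::vector<std::string> &pieces) {
    dst[0] = '\0';
    for (const auto &p : pieces)
        std::strcat(dst, p.c_str());       // O(length so far) each call
    return std::strlen(dst);
}

// The fix: keep a pointer to the current end and write there directly.
std::size_t linearAppend(char *dst, const std::vector<std::string> &pieces) {
    char *end = dst;
    for (const auto &p : pieces) {
        std::memcpy(end, p.c_str(), p.size());
        end += p.size();
    }
    *end = '\0';
    return static_cast<std::size_t>(end - dst);
}
```

    Both produce the same string; only the amount of re-walking differs, which is exactly the kind of cost a purely managed-language education can leave invisible.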




  • @morbiuswilters said:

    @Chame1eon said:
    I thought C and Python might make more sense. C has low-level stuff in more detail, and Python is very expressive and simple but has functional features and objects.

    If it was up to me, colleges would teach assembly, C, C# and Javascript, to give a full range of experience. They'd also teach stuff like SQL, VCS, automated testing, documentation and working on a software engineering team.

    ...and team leadership/project management (you know, for the incompetent programmers who will end up managing some day).



  • @morbiuswilters said:

    Edit: Holy crap, I was able to find it easily on Google: Robot Odyssey. Apparently it was made by the same company that made Rocky's Boots and it used the same game engine.

    Robot Odyssey was the sequel, numbnuts.

    Rocky's Boots was the original and BEST.



  •  I do teach C++ to my students, as well as Java (caveat: I teach high school, not college).  We are also going to be using some Python for introductory students.  Speaking from experience, students who take C++ get a much more complete picture of what's happening in memory than students who take only Java, and have a much easier time managing references once they've formed that mental picture.  I'm not necessarily claiming that C++ is the only or ideal vehicle to promote that understanding, but it's not a bad language as long as you stay out of the darker corners (and at the level we teach, we don't have any reason to poke into those corners).



  • @DaveK said:

    And then they'd all go on to write Shlemiel the Painter's Algorithm over and over, and every embedded controller would have to have at least a gig of memory, a full pre-emptive MT OS, and a multi-GHz CPU.

    This is what people DON'T FUCKING GET. Writing software for embedded controllers is domain knowledge. There's no reason, NO REASON, universities should teach to your particular domain (embedded controllers) over my particular domain (web/database app development) over NASA's domain (literal rocket science.)

    Guess how much knowing about embedded programming in C has advanced my career? Hint: FUCKING ZERO. If I thought like you, I'd say, "oh universities should teach web programming because obviously it's the most useful" but guess what? I don't think like you; I understand that programming for the web is domain-specific knowledge and nothing universal to programming itself.

    Something everybody listing bullshit so far has missed: usability. Also the thing programmers are weakest at. And something with UNIVERSAL applicability. (Unlike the embedded bullshit or the web programming bullshit.)



  • @blakeyrat said:

    I understand that programming for the web is domain-specific knowledge and nothing universal to programming itself.
     

    Where do you draw the line between specific knowledge and generic knowledge?



  • @blakeyrat said:

    @DaveK said:
    And then they'd all go on to write Shlemiel the Painter's Algorithm over and over, and every embedded controller would have to have at least a gig of memory, a full pre-emptive MT OS, and a multi-GHz CPU.

    This is what people DON'T FUCKING GET. Writing software for embedded controllers is domain knowledge. There's no reason, NO REASON, universities should teach to your particular domain (embedded controllers) over my particular domain (web/database app development) over NASA's domain (literal rocket science.)

    Guess how much knowing about embedded programming in C has advanced my career? Hint: FUCKING ZERO. If I thought like you, I'd say, "oh universities should teach web programming because obviously it's the most useful" but guess what? I don't think like you; I understand that programming for the web is domain-specific knowledge and nothing universal to programming itself.

    Something everybody listing bullshit so far has missed: usability. Also the thing programmers are weakest at. And something with UNIVERSAL applicability. (Unlike the embedded bullshit or the web programming bullshit.)

    Understanding HTML would always be a plus as long as your company has (or uses) a website.



  • @dhromed said:

    Where do you draw the line between specific knowledge and generic knowledge?

    Generic knowledge: how to write a program, manage the project, create a usable interface, create a sane data store, etc.

    Domain knowledge: the stuff the program *does*.

    For example, if you're writing "I Need a Budget", you need all of the stuff in the top bucket, but you also need domain knowledge of: accounting, interfacing with bank networks, etc. If you're writing Outlook, you need all the stuff in the top bucket, but you also need domain knowledge of: email protocols, scheduling techniques, how people work together, etc.

    The problem I had in school is they taught programming as if the sole use of writing a computer program was doing orbital calculations for NASA. So we learned very little in the way of useful skills about programming, but we had SHITLOADS OF ADVANCED MATH. (In fact, a friend and I worked it out: once you did the CS program you were literally 1 class away from a Math minor.) So it completely alienated me as a student, considering my interests were in making computers easier to use. (Something which they didn't even attempt to teach.)

    And this is the attitude I'm still seeing: the only difference is that DaveK wants to teach programming as if its sole use is programming microcontrollers. Well, it's not. It's a problem solving technique that can be used to solve ANY problem, and schools shouldn't alienate students who aren't interested in the class of problems they "like", or problems that seem "more scholarly", or whatever.

    The real problem is that universities are always teaching 25 years out-of-date. I don't think that problem can be solved.



  • BTW Rocky's Boots was written by Warren Robinett. You could do much worse than emulate that dude.



  • @Lorne Kates said:

    @lucas said:

    The total number of fucks I give is zero.
     

    Is your FuckArray zero-indexed? Do you give no fucks, or do we owe you a given fuck?

    Sorry, it's just an integer value of fucks.

     



  • @blakeyrat said:

    @DaveK said:
    And then they'd all go on to write Shlemiel the Painter's Algorithm over and over, and every embedded controller would have to have at least a gig of memory, a full pre-emptive MT OS, and a multi-GHz CPU.

    This is what people DON'T FUCKING GET. Writing software for embedded controllers is domain knowledge. There's no reason, NO REASON, universities should teach to your particular domain (embedded controllers) over my particular domain (web/database app development) over NASA's domain (literal rocket science.)

    Guess how much knowing about embedded programming in C has advanced my career? Hint: FUCKING ZERO. If I thought like you, I'd say, "oh universities should teach web programming because obviously it's the most useful" but guess what? I don't think like you; I understand that programming for the web is domain-specific knowledge and nothing universal to programming itself.

    Here's what you don't fucking get: the English language.  I didn't say universities should teach embedded programming.  I said they should teach the underlying fundamentals, which apply to every domain, and then listed some of the domain-specific consequences of not doing so.  This was a counterargument to your suggestion that they should skip everything between boolean logic and high-level managed-memory languages.  You were preaching ignorance as a strategy, I pointed out some of the disadvantages of that strategy.

    Isn't it you who was arguing that they should teach domain-specific stuff like SQL?  I think it is! 




  • @DaveK said:

    Isn't it you who was arguing that they should teach domain-specific stuff like SQL?  I think it is! 
     

    Oh look at that; subjective disagreement on whether something should be filed under domain-specific or generic knowledge. Who would have thought! Not me! Oh wait! Me! I did!

    Predicted Blakeyresponse: "SQL isn't domain-specific; it's generic"  OR "I never said that"

     place your bets


  • Winner of the 2016 Presidential Election

    It's funny (though not surprising) that each of us has strong biases as to what the most influential or quintessential computer languages are, and they mostly seem to be based on our own personal histories with them.

    I would like to see the languages mentioned as well as a template language like XSLT and also Lisp.

    Clearly, students should have to learn all languages, ever.


  • Discourse touched me in a no-no place

    @dhromed said:

    Predicted Blakeyresponse: "SQL isn't domain-specific; it's generic"  OR "I never said that"
    I'm going for (b).



  • @PJH said:

    @dhromed said:
    Predicted Blakeyresponse: "SQL isn't domain-specific; it's generic"  OR "I never said that"
    I'm going for (b).
    I'm going for A.  He already specifically said "creating a sane data store."  And I agree.  You need to store data in pretty much all programs, not just things in specific domains.



  • It wasn't clear to me that you were making that argument.

    In any case, I disagree. The Blakeyrat method does teach the basics. Teaching a memory-managed language isn't mutually-exclusive with teaching efficient algorithms... on the contrary, it makes teaching efficient algorithms easier because you don't need to worry about all the boilerplate bullshit.

    There's no reason to assume a student who started with C# would be more or less likely to pick a bad algorithm over one who learned C. If you have evidence otherwise, please share.



  • @Sutherlands said:

    @PJH said:
    @dhromed said:
    Predicted Blakeyresponse: "SQL isn't domain-specific; it's generic"  OR "I never said that"
    I'm going for (b).
    I'm going for A.  He already specifically said "creating a sane data store."  And I agree.  You need to store data in pretty much all programs, not just things in specific domains.

    Exactly. Maybe "SQL" is too specific, and I should have said "data storage philosophy", but right now AFAIK the only class teaching the latter also teaches the former.



  • I certainly would have benefited from project management classes. E.g., it's ridiculous that I've only learned about Gantt charts in the wild.

    Here's a serious question: how do you teach usability? Other than what can be summarized as "test with your users and get their feedback"? Do you teach metrics and stuff? Give good and bad examples of interfaces, and analyse them? Am I answering my own question here? I am, aren't I?



  • @Zecc said:

    I certainly would have benefitted from project management classes. Eg, it's ridiculous that I've only learned about Gantt charts in the wild.

    Here's a serious question: how do you teach usability? Other than what can be summarized as "test with your users and get their feedback"? Do you teach metrics and stuff? Give good and bad examples of interfaces, and analyse them? Am I answering my own question here? I am, aren't I?

    That's about how my usability course went. Add some very low-level psychology (extent of short-term memory, how to make things appear related etc.) and some general UI concepts. Also 3 hours spent on fonts for some arcane reason.



  • @witchdoctor said:

    That's about how my usability course went. Add some very low-level psychology (extent of short-term memory, how to make things appear related etc.) and some general UI concepts. Also 3 hours spent on fonts for some arcane reason.

    Seems a lot like marketing / graphic design there. Except you spent 3 hours on fonts when the crux of typography can be summed down to a single word: Helvetica.



  • @lucas said:

    @drurowin said:

    @ as a prefix means they're an operator on the IRC channel you're on.

     

    The total number of fucks I give is zero. Everyone knew what I meant, stop being a bellend.

     

     

    Nevair.

     

