On Education



  • By consensus:

    @dhromed said:

    @Sunstorm said:
    If anything, unmanaged languages should always be taught and used earliest in one's programming life, and only when one gets fairly competent with that should one move on (or not, depending on the needs) to the safer ones. Not only does a good background closer to the memory help you understand the underlying nature and caveats of managed languages, but most mistakes are made and learned from before you get the chance to blow up something important.

    I agree with your case, but not quite with the order of learning things. 

    I'd say a new learner needs to get comfortable dishing out a logical, comprehensible and (most of all!) maintainable script before even knowing what the hell malloc() does. If you teach someone about memory management first, and good coding later, there's an increased risk of blowing out not only the lungs, but also the eyes and eardrums -- through the little toe.

    Or are you arguing that they learn to appreciate a katana's sharpness, without actually being given a chance to swing it around, chopping off bits left, right & center?

     

    Hm.

    This might be a new thread.
     

    @Welbog said:

    @dhromed said:
    This might be a new thread.

    I support this plan. Arguments of the educated about how those who aren't educated should be educated are always entertaining, if not educational.

     



  • In university we covered everything from Java to Intel assembly at different stages and in different courses. Different languages are better at teaching different things. Java is good for a high-level understanding of algorithms and object-oriented design. C++ is good for understanding optimization and low-level interaction. Lisp is good for demonstrating that not all languages are C variants. Assembly is great when paired with learning about CPU architecture. As long as you don't come out of university/college having studied only one language, you're in good shape.

    As for which language is best to start with, that's probably a different story. I learned languages before I learned the technology and implementations behind them. I know that I didn't turn out weird as a result. My university started with Java, diving right into object-oriented programming. The pace was really slow; that's pretty much all I remember about the beginning. I didn't really appreciate what compilers do for me until I had to write a simple one. Same thing with operating systems. I think writing both of those is very, very important for anyone who writes code on a regular basis, but I definitely wouldn't want to start with them.

    I don't know. What do you think?



  • @Welbog said:

    Lisp is good for demonstrating that not all languages are C variants.

    WTF!

    If that's all your university managed to impress upon you about Lisp, then you went to the wrong school!



  • @djork said:

    @Welbog said:

    Lisp is good for demonstrating that not all languages are C variants.

    WTF!

    If that's all your university managed to impress upon you about Lisp, then you went to the wrong school!

    You mean trying to show us that not all our computers should necessarily behave like Von Neumann machines and that Code is Data is Code? Or the bits about the universality of lambda calculus in CS...
     



  • @dhromed said:

    I'd say a new learner needs to get comfortable dishing out a logical, comprehensible and (most of all!) maintainable script, before even knowing what the hell malloc() does. If you teach someone about memory management first, and good coding later, there's an increased risk of blowing out not only the lungs, but also the eyes and eardrums -- through the little toe.

    Or are you arguing that they learn to appreciate a katana's sharpness, without actually being given a chance to swing it around, chopping off bits left, right & center?

    The way I see things is that programming environments (generic ones; I'm not talking about niches) evolve in order to make things easier than they were before, by abstracting or automating them. Let's say we had assembler. It was fine, but eventually requirements asked for faster and easier development. So you take the most common tasks, automate them behind a compiler, and bam, you have C. And it's fine. But again, data organizational issues bring up the need for even more automation, so C evolves into C++ with the concept of objects and so on. And again, it's fine. But it still has its problems, like the trouble of manual memory allocation and cleanup. So managed languages like Java step in to automate that as well. And so on and so forth. But all of these are still solutions based on the same model; they are simply refinements to it.

    Now, you can start at any one point and work from there, and you'll probably be fine. But let's say you start with Java. Java hides away all of the problems of memory management, and puts you in a world where you never have to worry about it, ever. So you never end up understanding and appreciating how much work Java is actually doing in the background to automate the process, and so you have no concept of consequence when you create and discard 20 objects in a tight loop. Appreciating this is invaluable for good construction. Ultimately, all these automations are there to make it easier to do something that was originally hard or cumbersome. For someone who has learned the way things used to be done, these new features will be welcomed (or not) as a faster or simpler way to do something they were already doing before, and so they'll know much better the implications of taking that "shortcut".

    A personal example: I started my web development career with ASP. Not very good, but it's still pretty close to the simple, stateless, single-page request-response HTTP that it services. There's not much automation going on there (though still quite a bit compared to, say, pure CGI), beyond header parsing, cookie handling, and a few other such things. I worked with it for a year or so, and I got to understand it pretty well. In the process I learned how HTTP works, how to preserve state between pages, and a lot of other useful things, simply because understanding them was a requirement of working well with ASP.

    Eventually, I picked up ASP.NET. ASP.NET is a completely different world, because instead of a single-page, request-response model, it tries to pass itself off as event-based. But in reality, it's still just an automation wrapper over the same HTTP request-response model, which doesn't change. ASP.NET simply takes the common things that are done in webpages and provides shortcuts for them, abstracting (or trying to) the underlying, unstructured model. But this was just fine, because I was aware of it. I was aware, for example, that pages need to restore their state on every load and calculate what happened with the new input data, which naturally had to be passed back from the browser somehow, and that all of the controls eventually, at some point, had to render out HTML, which is basically text and doesn't work like a WinForms control. And because I understood how it made my life easier compared to ASP, I was glad.

    Now, a friend of mine, who is going through university right now, had the choice between learning web development through ASP.NET or PHP. I recommended starting with PHP because, like ASP, it's closer to the HTTP model which it's servicing. He didn't take my advice, though, and went for ASP.NET, because there's more market for it nowadays (at least in this country). And last I checked, thanks to ASP.NET's magic, he wasn't even aware that the web isn't an event-based model, and the concept of 'stateless' confused him. He would code pages using Visual Studio's automatic layout designer (which works with absolute CSS values), and would then wonder why he couldn't combine components like in WinForms. You can just imagine the amount of WTFs that could come from someone being that ignorant about the web and HTML. (In his case, however, I trust that he has by now overcome these delusions and will hopefully never show up on this site.)

    Back to your idea: scripts help you learn the concepts of a structured language (variables, conditions, looping, functions, etc.), but starting there you'll get used to the idea that things really are that easy, and then when someone tells you that you have to allocate and return memory, and check buffers, and all that, you'll just wonder why you have to do it if you never had to worry about it before. So if you're going to start with a scripting language, at least use one that forces you to understand and use the most valuable concepts of the model on which you're working. In the case of programming for PCs, understanding how memory works and where programs get it is probably one of the most important things you can learn, since everything else basically hinges on it.

    All in all, working up from hard to easy always gives the best results. If you learn to wield and inflict damage with a wooden training sword, then you will be even deadlier when you get to use a real katana. But if you start with the katana, you'll never learn to use it so effectively that it would be deadly even if it were as blunt as wood.



  • I don't think I have anything to add to that, except that in my analogy, assembly is the "hard" katana, the real deal; and the wooden sword is the "easy" Java/.NET, the foot-protector.

    Ah, the dangers of analogies. :)
     



  • @dhromed said:

    I don't think I have anything to add to that, except that in my analogy, assembly is the "hard" katana, the real deal; and the wooden sword is the "easy" Java/.NET, the foot-protector.

    Ah, the dangers of analogies. :)

    Swords perhaps aren't the best analogy for this, even if I do like to imagine my job as something so romantic. Maybe compare a sword to some kind of surgical knife: you can probably do the same things with either, but it's sure going to take you a lot longer to conquer the Temple of Enterprise armed with just a knife.

    It would be a lot more impressive in the end though...
     



  • @Sunstorm said:

    @dhromed said:

    I don't think I have anything to add to that, except that in my analogy, assembly is the "hard" katana, the real deal; and the wooden sword is the "easy" Java/.NET, the foot-protector.

    Ah, the dangers of analogies. :)

    Swords perhaps aren't the best analogy for this, even if I do like to imagine my job as something so romantic. Maybe compare a sword to some kind of surgical knife: you can probably do the same things with either, but it's sure going to take you a lot longer to conquer the Temple of Enterprise armed with just a knife.

    It would be a lot more impressive in the end though...

    And what would a rocket launcher be in that analogy? 



  • @Anonymouse said:

    And what would a rocket launcher be in that analogy? 

     

    JAVASCRIPT.

    The rockets are XML. 
     



  • @Sunstorm said:

    @Anonymouse said:

    And what would a rocket launcher be in that analogy? 

     

    JAVASCRIPT.

    The rockets are XML. 
     


    I find the concept of a rocket launcher that works differently depending upon who's holding it to be frightening.  But not so frightening as the rockets that more often than not wind up killing the user.



  • I think this all depends on the goal of the education.  Yes, even education has a goal.

    There are probably at least five educational areas for programming (the first four probably overlap a good bit): theoretical computer science, building hardware to execute programs, developing tools to create programs that run on hardware, true software engineering, and then the "just use the tool" class.

    Most people, though probably not on this forum, fall into the "just use the tool" class. These are the people who don't realize that there are high-temperature reactions happening inside their car to make it go; the people who just know that a hammer hitting a nail through two things makes them stick together; and so on. The engineers are those who actually know how to design a structure that will let someone build it without hassle. The toolmakers are those who run the mills to shape the wood into boards and beams; the hardware folks are the guys who build the milling machinery; and the theoretical guys are those studying statics of solid materials, chemistry, and such.

    So, yes, if you don't care about why a piece of wood is strong, or why you need three nails instead of two to hold things together, you end up with structures that are not very safe: bad code. And it doesn't matter whether you had good blueprints, good wood, wood made on the best mills, or people who fully understand the material properties involved.

    Just as it's not feasible to expect everyone who plays with a hammer to know exactly what is good and what isn't, it's not feasible to expect everyone who programs to know what's good and not.

    However, I think production code should somehow be certified (note I specifically didn't say the programmer needs to be certified, but the code; I think certifying laborers is really also amiss: if the structure is certifiable, who cares who built it?) so that people know what to expect. In aviation, for instance, code must be certified. However, I'm not aware that any code in a financial system has to be certified to any testing standard, and we all know that there are basically no standards for desktop applications, web interfaces, and the like.

    So, essentially, the level and order of education really does depend on the end result desired. Also remember that, unlike activities which involve "matter bashing", programming has a very fast turn-around with very little risk attached to a bad decision; if a program crashes, there (usually) aren't dire physical consequences. It's probably mostly because of that last point that so many programs are in the state they are in.



  • Computer education is one area that I am really passionate about. I think that most modern education systems absolutely fail to deliver the basic knowledge of computers that they should, or even [i]used to[/i]. Today, computer education starts with Microsoft Office outside of comp sci curriculum. For most CS students it is Java. We all know this. We are, or have seen the products of, the infamous JavaSchools. When I was in the third grade we did programming in Logo, and we were all encouraged to explore on our own (and we did!), but by the time I got to middle school, their computer courses barely got any deeper than changing fonts. Most kids couldn't do anything but the step-by-step instructions that the teacher gave out. Guess what, though? The application-based classes were easy for those who did Logo programming, and they used applications to a fuller capacity. Why is this?

    Educators have given up on the absolutely fundamental concept that [b]computers only do exactly what they are told[/b]. Somewhere people lost this and started believing that computers only do what the application says you can do.

    But what language to teach? I think that all that matters when you start teaching computers is that they are [b]just programmable machines, and nothing more[/b]. They could start with any language and get that point across. But which one is the best?



  • In my three-year programming course, the first year consisted of Boolean algebra and other such logical concepts, then x86 assembler, then finally Pascal, all the way up to pointers (no objects). The second year we did mostly Delphi, with very little actual mention of objects. I don't remember whether databases and SQL were second or third year. In the third year, we did VB.NET (although I insisted on using C# for all the projects) and Java.

    It was a good course and it worked well for me, but just to prove that sometimes it's really because of the person rather than the method, by the end of the second year there were people that still didn't know what an array was or how to declare one.

    But that's a course specifically for programmers. I don't know how one would go about properly educating someone who only wants to use a computer, beyond the science of it all. Probably start with OS basics, maybe even from console operation, and only once they understand fundamental concepts like how computers work with files and such, move on to working with Windows and its applications.
     



  • @Sunstorm said:

    But that's a course specifically for programmers. I don't know how one would go about properly educating someone who only wants to use a computer, beyond the science of it all. Probably start with OS basics, maybe even from console operation, and only once they understand fundamental concepts like how computers work with files and such, move on to working with Windows and its applications.

    Usage of a new interface is hardly different from usage of a new programming language. If they cannot learn (and some people cannot, as a paper linked in this forum indicated), then you can teach them to fish, but only with that rod and that net in that lake. If they pick it up quickly, they're likely to be able to learn programming as well.

    Still on the topic of Those Who Cannot Learn Programming, and assuming that in general they can be taught, then maybe they need to be taught something else first. Some basic way of thinking.



  • @djork said:

    But what language to teach? I think that all that matters when you start teaching computers is that they are [b]just programmable machines, and nothing more[/b]. They could start with any language and get that point across. But which one is the best?

    A programmer who knows only one language knows nothing of any value. You don't start to learn anything until about the third or fourth.



  • @dhromed said:

    Still on the topic of Those Who Cannot Learn Programming, and assuming that in general they can be taught, then maybe they need to be taught something else first. Some basic way of thinking.

    I propose that all computer-related education begin with all participants being doused with petrol and handed a box of matches. All those who cannot figure out what to do will eliminate themselves.
     



  • @asuffield said:

    @djork said:

    But what language to teach? I think that all that matters when you start teaching computers is that they are [b]just programmable machines, and nothing more[/b]. They could start with any language and get that point across. But which one is the best?

    A programmer who knows only one language knows nothing of any value. You don't start to learn anything until about the third or fourth.

    You can at least grasp the concept that the computer is under your direct control. You've seen failure to understand this manifested as people who "aren't good at computers," and are afraid to touch anything. You can learn that there are facilities in nearly all languages for logic (control structures), abstraction (variables, functions), and I/O.



  • @djork said:

    @asuffield said:
    @djork said:

    But what language to teach? I think that all that matters when you start teaching computers is that they are [b]just programmable machines, and nothing more[/b]. They could start with any language and get that point across. But which one is the best?

    A programmer who knows only one language knows nothing of any value. You don't start to learn anything until about the third or fourth.

    You can at least grasp the concept that the computer is under your direct control. You've seen failure to understand this manifested as people who "aren't good at computers," and are afraid to touch anything. You can learn that there are facilities in nearly all languages for logic (control structures), abstraction (variables, functions), and I/O.

    Well, you don't actually learn anything about "nearly all languages" from your first language - after all, everything _could_ be an idiosyncrasy of that language.

    And, on the other hand, until you step outside a (sometimes very large) family of languages, you won't learn that some things _aren't_ universal (think of how many languages have the C-style for loop). That's why something like Lisp is so good: it's so different from C, Basic, etc. (which are what most people these days start out learning) that it makes you reexamine those assumptions.

    My first "really different" language was the RPL language [a kind of pseudo-forth built in to HP-28, -48, and -49 calculators] - anyone else?



  • @dhromed said:

    Still on the topic of Those Who Cannot Learn Programming, and assuming that in general they can be taught, then maybe they need to be taught something else first. Some basic way of thinking.

    I believe that there's actually something biological involved in whether people can or cannot use computers, sorta like how some people can draw really well without learning. There's just something fundamentally different about the way we think, visualize, or organize data in our heads, compared to our non-techie peers. While it's something that can and should be trained, I believe that some people are just born with their brain wired in a way that's good for it. Almost makes you feel special.



  • @Random832 said:

    My first "really different" language was the RPL language [a kind of pseudo-forth built in to HP-28, -48, and -49 calculators] - anyone else?

    The strangest language I learned was Miranda. For those who don't know it, it's a purely functional language (link). I found it strange mostly because it was the very first language we learned in my Computer Science degree, and I think the only reason they chose it was that the Dean of the department wrote a book on it. One of the stronger points of the language was its ease of performing recursion. You'd think they could start with something a little less abstract than a semester of recursive programming for people just learning about programming. The only programming I had ever done before that was BASIC and my TI-81 calculator. Luckily I didn't have too much trouble with the course, but a good chunk of the people in my class sure did.

     
    ... Now that I think about it more, maybe Prolog was stranger... but after taking enough courses on predicate logic, it started to make sense.

     

