Inflicting Haskell on N00bs for Science



  • @boomzilla said:

    I think assuming people must have programming knowledge to start a college curriculum of programming sort of stuff is pretty crazy. It's not the same realm as being proficient in math before beginning an engineering or physics program to me.

    I think it's not even programming knowledge, it's having a reasonable mental model of how computers work -- you'll struggle mightily at CS if you haven't figured out that computers are idiot logic savants only capable of following step-by-step directions instead of black magic boxes...


  • FoxDev

    The way I think of it is computers only do two things:

    1. Add two numbers
    2. Copy a number

    Everything else just comes from the fact that computers do those two things billions of times a second


  • ♿ (Parody)

    That's probably true. But then, you have to pick that up from somewhere. If it comes from a 101 style course or fiddling around, I'm not too concerned either way.

    And if people can't hack it, they can go study underwater basket weaving or whatever the SJWs are pushing today.



  • @Salamander said:

    That's not a mutually exclusive category. They're styled after underwear, people sometimes wear them as underwear, and they're generally not socially acceptable to wear as your only clothing anywhere other than at a beach or pool.

    https://www.youtube.com/watch?v=xW4qptCre8M



  • @tarunik said:

    I think it's not even programming knowledge, it's having a reasonable mental model of how computers work -- you'll struggle mightily at CS if you haven't figured out that computers are idiot logic savants only capable of following step-by-step directions instead of black magic boxes...

    I thought they were magic boxes... until my university lecturer told me they weren't.



  • The closest I got to programming before university was making WC3 maps, where usually, whenever I asked a question, I was told 'learn JASS, it's better than using the trigger editor', which I dutifully ignored, because learning a variant of ActionScript no one uses anywhere else seemed like a bad idea at the time. I don't want to know what would have happened if I had given in.

    Seriously, triggers in Staredit and then Worldedit aren't the worst possible introduction. Partly because they essentially say what they do in English.


  • kills Dumbledore

    @Magus said:

    Partly because they essentially say what they do in English

    My first experience with anything resembling programming was AppleScript. I had a script that was something like

        tell application "Internet Connect"
            connect
        end tell
        delay 30
        tell application "Firefox"
            activate
        end tell

    That syntax is probably still a bit off; I haven't touched AppleScript in years, but it was good for getting the idea into my head that a computer is a stupid thing that does exactly what you tell it, without having to learn all that curly-brace stuff



  • The trigger editor is more like:

    {Target} uses {Skill} on {Target}

    Where clicking on a link brings you to a screen with dropdowns, where you select something like:

    Target unit

    Which then changes the previous sentence to:

    Target unit uses {Skill} on {Target}

    And so on. I rather like the way they set it up. SC2's system is infinitely more powerful, but much more opaque. I can use it as a programmer, but it's really bad for people new to the stuff.


  • kills Dumbledore

    So that's more like setting Outlook rules?



  • Perhaps. I haven't done that. But it's a good system for noobs. It also has support for loops, conditionals, logical operators, and even arrays, which were difficult to understand back then, but once you'd tried them a couple of times, you got the idea. Honestly, the editor made WC3.


  • I survived the hour long Uno hand

    oh man, I'd almost forgotten how much time I spent tweaking Starcraft maps



  • Honestly, one of my cousins introducing me to staredit when I was like 10 may be the reason I ended up studying programming.


  • I survived the hour long Uno hand

    I was doing batch scripts before I had starcraft. Simple things, like changing the prompt, but then I had this idea where you'd type your name and it'd customize your prompt for you, and maybe some other settings -- a naive user management system. My parents gave me Dummies books and sent me to computer camp where I learned enough BASIC and C++ to get a feel for programming, all before I ever learned algebra. I had no trouble with letters standing for numbers; they were just variables, to me.



  • @Magus said:

    SC2's system is infinitely more powerful, but much more opaque.

    Huh, I didn't even know Star Control 2 had an editing system (not an official one anyway; I have seen hacked ship files). I'll have to give it a go.

    I have in fact bought Starcraft 2. I haven't actually installed it yet because I need to upgrade my system a bit first. But SC2 still expands to Star Control 2 for me.


  • Java Dev

    @Magus said:

    SC2's system is infinitely more powerful, but much more opaque.

    I haven't used it, but I seem to recall SC2 has two systems, a trigger editor and a 'traditional' code editor, with limited conversion abilities?

    I know WC3 triggers could be written more clearly than SC1 triggers in many cases because you could attach triggers to events, rather than only having the conditions checked once every 2 seconds.



  • @tarunik said:

    @boomzilla said:
    I think assuming people must have programming knowledge to start a college curriculum of programming sort of stuff is pretty crazy. It's not the same realm as being proficient in math before beginning an engineering or physics program to me.

    I think it's not even programming knowledge, it's having a reasonable mental model of how computers work -- you'll struggle mightily at CS if you haven't figured out that computers are idiot logic savants only capable of following step-by-step directions instead of black magic boxes...

    My former uni had CS students first attend a functional programming course (Haskell in fact, I think), and then a low-level programming course, where they'd get to play with a microchip first in assembly and then in C.

    For all its faults, I actually think having to mess with assembly on an 8-bit chip is a good thing. There's no way you'll ever end up thinking that computers are somehow magical and/or "smart" after that.

    (The Haskell thing makes things a bit fun, though. After being told that everything(tm) can be solved by recursion, they get to learn (in the C part) that recursion, <2kB of stack and a shitty compiler without tail recursion optimizations don't really mix. Typically the hard way -- namely by overwriting the executable code that is placed in RAM, not too far from the stack. ;-))


  • FoxDev

    @cvi said:

    For all its faults, I actually think having to mess with assembly on an 8-bit chip is a good thing. There's no way you'll ever end up thinking that computers are somehow magical and/or "smart" after that.

    Doesn't have to be an 8-bit chip; I did a first-year uni course using 32-bit ARM. Still had the desired effect though; even writing something as simple as a bubblesort teaches you just how little a processor actually does.

    Why ARM? Because it's the best for learning, or because several lecturers at the University of Manchester were involved in creating it?


  • Discourse touched me in a no-no place

    @RaceProUK said:

    Doesn't have to be an 8-bit chip; I did a first-year uni course using 32-bit ARM.

    Mine was with MIPS, but I'd already written Z80 and 6502 machine code by hand at that point, so I didn't learn that much. :)



  • @RaceProUK said:

    Why ARM? Because it's the best for learning, or because several lecturers at the University of Manchester were involved in creating it?

    I personally would go with ARM (more precisely, Thumb-2 unified) as well, but that's for practicality's sake -- not only is ARM ubiquitous in mobile and other sorts of "rich device" applications, the ARM Cortex-M family has been taking the true embedded world by storm with its performance/price ratio, availability (including low-power versions that are competitive with the best 8-bit parts), and relative ease of programming.


  • Discourse touched me in a no-no place

    Some of the ARMs are extremely good in terms of their electrical noise. That's another distinct plus point for embedded use cases. (Some of the high-performance CPUs out there also function as reasonably powerful — for the size — radio transmitters.)



  • @dkf said:

    Some of the ARMs are extremely good in terms of their electrical noise. That's another distinct plus point for embedded use cases. (Some of the high-performance CPUs out there also function as reasonably powerful — for the size — radio transmitters.)

    That's a very interesting point -- I wonder if it has to do with how the on-chip clock generation works? (Clocks and other highly periodic signals are the main noise generators in a digital system.)


  • Discourse touched me in a no-no place

    @tarunik said:

    I wonder if it has to do with how the on-chip clock generation works?

    I'm nearly 15 years out of date on that front. Sorry…



  • @RaceProUK said:

    ARM

    Ours had us do Alpha, and lc3 or something; the second of those only ever had emulators, because no one wants to build the hardware. They chose them because x86 is "messy".

    WC3 had a code editor as well. SC2 is just far more complicated in every respect.


  • FoxDev

    To be honest, we didn't use hardware ARMs, just an emulator. But still, the lessons are the same 😄



  • Yes, but at least ARM has hardware. It's a real thing people use.



  • @cvi said:

    For all its faults, I actually think having to mess with assembly on an 8-bit chip is a good thing.

    Abso fucking lutely.


  • FoxDev

    @Magus said:

    lc3

    Googling for that got me immunobiology and sofas :wtf:



  • @RaceProUK said:

    Why ARM?

    Because it's a fairly clean architecture, meaning that using it gives you the opportunity to learn more about what a CPU does in general terms than you would if distracted by the Burgess Shale of fossilized backward compatibility that is x86.



  • I mean, it's been years, I could have the name wrong, but I probably don't. The things really never had more than experimental hardware. And then you have Alpha, which my lecturer wished had won, because it was so much better than x86...


  • FoxDev

    The tenth result was an LC3 emulator, so I think you got it right ;)


    @flabdablet said:

    Because it's a fairly clean architecture, meaning that using it gives you the opportunity to learn more about what a CPU does in general terms than you would if distracted by the Burgess Shale of fossilized backward compatibility that is x86.

    That too 😄



  • Was it perhaps on the University of Auckland's website? Because I have a feeling no one else would touch it.


  • FoxDev

    McGraw-Hill Higher Education, apparently



  • Maybe my lecturer got it there...



  • @lightsoff said:

    Unless you were born in the 70s or earlier in which case that would be fair.

    Born at the very end of the '50s; my high school had a course by the time I graduated in '77. Now they have it in primary school (real stuff in 3rd grade).



  • @RaceProUK said:

    Doesn't have to be an 8-bit chip; I did a first-year uni course using 32-bit ARM. Still had the desired effect though; even writing something as simple as a bubblesort teaches you just how little a processor actually does.

    True.

    I think that having the chip connected to a few LEDs, a simple display, and some other peripherals was pretty educational. You don't get to use any magic stuff to display text/images; rather, you turn a few electric signals on and off (in the right sequence) to control the LEDs/display segments, and that's it. Nothing magic there either.

    (Now that I think of it, the chip wasn't purely 8-bit either; it had a 16-bit address bus for instance. Not that it really matters.)



  • @cvi said:

    I think that having the chip connected to a few LEDs, a simple display, and some other peripherals was pretty educational. You don't get to use any magic stuff to display text/images; rather, you turn a few electric signals on and off (in the right sequence) to control the LEDs/display segments, and that's it. Nothing magic there either.

    Yeah -- a Cortex-M3 based microcontroller could be used to build a very nice trainer-type platform...


  • Discourse touched me in a no-no place

    @cvi said:

    (Now that I think of it, the chip wasn't purely 8-bit either; it had a 16-bit address bus for instance. Not that it really matters.)

    Having an 8-bit address bus would be rather constraining. A total of 256 bytes of code and data really isn't very much. (It would have driven me absolutely round the bend back when I wrote Z80 code.)



  • @Magus said:

    then you have Alpha, which my lecturer wished had won, because it was so much better than x86

    That's a very low bar.



  • @dkf said:

    A total of 256 bytes of code and data

    128 (12-bit) words was the page size on a PDP-8. You could only access page 0 and your current page. Yet these machines accomplished much (and were in use for machinery control/CNC until the early 1990s).


  • Java Dev

    @Magus said:

    wc3 had a code editor as well. Sc2 is just far more complicated in every respect.

    Yeah, just took a quick peek. I recall in SC1 and WC3 you could just slap a trigger into a map to create some dynamic functionality. The screen SC2 calls trigger editor contains whole loads of stuff, and I don't seem to be able to add things at all on the empty default map... I'll believe it's more powerful though, and it still offers the select-and-customize-actions model to construct stuff. Here's a screenshot from what appears to be a default compound action.



  • That's not too dissimilar from WC3's, and better than I remember. I'll probably mess with it again when they release part 3. The biggest difference between the three games is the unit editing. In SC1, you could create differently named units with custom armor, life, and damage values. In WC3, you could customize most things.

    In SC2, you really can customize anything, including putting crazy shaders on your units, but instead of just 'new unit based on tauren', you spend an hour or so getting all the parts ready to make it show up ingame.

    I like what they've done; I intended to make a Dune map, before I formatted and lost the files, and you could do it pretty well. It just takes so much longer to do things that were easy in WC3.


  • Java Dev

    I believe it's also modular - you can create a custom race once, then import it into each of a series of maps, or possibly even use it in a standard map.



  • That is definitely one of the best features. If I end up making that Dune map, perhaps I can make more maps with it... It keeps you from catching DotA syndrome: "the map is perfect, why would you ever want a different, obviously inferior one?"

