Shameless Promotion of LISP - FOR KIDS!



  • @antiquarian said in Shameless Promotion of LISP - FOR KIDS!:

    You do know there are more Lisp programs than Emacs, right?

    In 2016? No.


  • Impossible Mission - B

    @Captain said in Shameless Promotion of LISP - FOR KIDS!:

    This sounds more like a critique of C than LISP.

    How? I said nothing about C.

    Yes, this is more a critique of C, isn't it? Instead of first class functions, C has function pointers, and the programmer has to explicitly de-reference a function before calling it.

    ...

    bwuh?

    I'm not even a C coder and I know that's not true.

    Why even have pointers at all? I know what they're for, but why expose raw pointers to the user?

    Because Ritchie was an idiot who ignored decades of well-established research and best practices in language design.

    Why use that ridiculous syntax where the function name goes on the left of a tuple of arguments?

    What's ridiculous about saying "this is the X function and it takes Y and Z as arguments"? That seems like an extremely straightforward way to do it. (And doesn't Lisp do essentially the same thing? Can you think of any high-level language that doesn't, for that matter?)

    Why use all those crazy and useless braces? (I know, it's because it makes writing the parser easier!)

    Because it makes writing the parser possible. You need a way to denote the end of a block, and AFAIK there are only three ways we've found to do it:

    1. enclose the block in an opening and closing token pair, as Lisp, Pascal and C do
    2. assume all things that can possibly be a block are a block, and then have a block-end token (Modula and Ruby)
    3. use indentation, Python-style

    Why use all those crazy loops? (I know, it's because it's easier to normalize a for-loop into assembly than to normalize a recursive loop! Except for the programmer, who now has to deal with boundary conditions and null pointers and all sorts of garbage K&R let into their language)

    1. Because iteration is far easier to correctly reason about than recursion in most cases
    2. Because on real-world hardware, stack limitations are a thing
    3. Because transforming an intuitive function into a tail-recursive equivalent in order to avoid problem #2, just so the compiler can then further transform it into proper iteration, can be a non-trivial task in and of itself and it tends to clutter up the code, with the details of the mechanism obscuring the details of the algorithm. (See also: CPS)




  • @dkf said in Shameless Promotion of LISP - FOR KIDS!:

    The 6 year olds of my acquaintance aren't really very interested in programming, not when there's trees to climb.

    Kids: We like trees better.
    DKF (hopeful): Balanced binary trees?
    Kids: Sigh. We are gonna climb some trees now, later gramps.
    DKF: You mean like OUTSIDE!?


  • BINNED

    @Jaloopa said in Shameless Promotion of LISP - FOR KIDS!:

    I think we should teach all children assembly for a long extinct processor family

    We were taught assembly for a processor that doesn't exist, and probably never will, at uni. Does that count?


  • BINNED

    @cartman82 said in Shameless Promotion of LISP - FOR KIDS!:

    @dkf said in Shameless Promotion of LISP - FOR KIDS!:

    The 6 year olds of my acquaintance aren't really very interested in programming, not when there's trees to climb.

    Kids: We like trees better.
    DKF (hopeful): Balanced binary trees?
    Kids: Sigh. We are gonna climb some trees now, later gramps.
    DKF: You mean like OUTSIDE!? ON MY LAWN?

    FTF:belt_onion:


  • kills Dumbledore

    @Onyx said in Shameless Promotion of LISP - FOR KIDS!:

    Does that count?

    Only if you were at uni at 6 years old


  • Notification Spam Recipient

    @Onyx said in Shameless Promotion of LISP - FOR KIDS!:

    @Jaloopa said in Shameless Promotion of LISP - FOR KIDS!:

    I think we should teach all children assembly for a long extinct processor family

    We were taught assembly for a processor that doesn't exist, and probably never will, at uni. Does that count?

    Yeah, that's some grade-B BS right there. Everything is theoretical, so much so that I strongly believe my degree is theoretical and useless in practice...



  • @Yamikuronue said in Shameless Promotion of LISP - FOR KIDS!:

    4 * 5 but instead * 4 5 to teach kids.

    I wonder if it works better than PEMDAS though...

    * 4 + - 2 - 4 1 * 1 1
    
    * 4 + - 2 3 1
    
    * 4 + (-1) 1
    * 4 0
    0

  • Discourse touched me in a no-no place

    @Onyx No climbable trees in the vicinity of my lawn unless you're a cat. We've tried telling the cat to get off the lawn, but she doesn't listen.



  • @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    Because Ritchie was an idiot who ignored decades of well-established research and best practices in language design.

    What "decades of well-established research and best practices in language design" would that be, in 1972? By then, there had only been just barely two decades of digital electronic computer programming, much less "well-established research and best practices in language design".


  • Discourse touched me in a no-no place

    @Steve_The_Cynic said in Shameless Promotion of LISP - FOR KIDS!:

    What "decades of well-established research and best practices in language design" would that be, in 1972?

    Well, they already had FORTRAN, COBOL, PL/I and LISP at that point! 🐠

    C is good at what it does — being a fairly thin layer over assembly language to get rid of the really annoying bits — and it doesn't try to be anything else. That's what's good about it; it knows its place, and it does very well at filling out that genuinely useful niche. Other languages attempt to do more, and while that's all very nice, sometimes what you really need is something that will let you get low level work done without having lots of complicated extra abstractions in the way.


  • Impossible Mission - B

    @Steve_The_Cynic said in Shameless Promotion of LISP - FOR KIDS!:

    What "decades of well-established research and best practices in language design" would that be, in 1972? By then, there had only been just barely two decades of digital electronic computer programming, much less "well-established research and best practices in language design".

    Tony Hoare, talking about implementing ALGOL 60 in 1961:

    In that design I adopted certain basic principles which I believe to be as valid today as they were then.

    1. The first principle was security: The principle that every syntactically incorrect program should be rejected by the compiler and that every syntactically correct program should give a result or an error message that was predictable and comprehensible in terms of the source language program itself. Thus no core dumps should ever be necessary. It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time. A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to - they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law

    (emphasis added)


  • kills Dumbledore

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    every syntactically correct program should give a result or an error message that was predictable and comprehensible in terms of the source language program itself

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time

    They solved the halting problem?


  • area_pol

    @Jaloopa said in Shameless Promotion of LISP - FOR KIDS!:

    They solved the halting problem?

    I think they mean "no undefined behaviour".



  • @masonwheeler I have this "law":

    In software, it takes between 10 and 40 years for the proper solution to a problem to actually get accepted by most.


  • Discourse touched me in a no-no place

    @Jaloopa said in Shameless Promotion of LISP - FOR KIDS!:

    They solved the halting problem?

    It's possible to define programming languages that can't run for infinitely long. They're just not going to be Turing Complete. Basically, if the language you define is such that its programs are always equivalent to a primitive-recursive function, then the language is always going to terminate (and won't be TC). This is done mainly by only allowing iteration over finite fixed sets/sequences; the number of iterations can be very large, but is still going to be provably finite as there will always be a metric on the size of the remaining work to do to evaluate a particular program that will be monotonically decreasing.

    Some noted functions are primitive-recursive yet compute huge numbers; the classic example is the Ackermann function.


  • kills Dumbledore

    @dkf That wiki says it's not primitive recursive

    one of the simplest and earliest-discovered examples of a total computable function that is not primitive recursive. All primitive recursive functions are total and computable, but the Ackermann function illustrates that not all total computable functions are primitive recursive.


  • Discourse touched me in a no-no place

    @Jaloopa Hmm. I thought it was. Oh well…



  • @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    @Steve_The_Cynic said in Shameless Promotion of LISP - FOR KIDS!:

    What "decades of well-established research and best practices in language design" would that be, in 1972? By then, there had only been just barely two decades of digital electronic computer programming, much less "well-established research and best practices in language design".

    Tony Hoare, talking about implementing ALGOL 60 in 1961:

    In that design I adopted certain basic principles which I believe to be as valid today as they were then.

    1. The first principle was security: The principle that every syntactically incorrect program should be rejected by the compiler and that every syntactically correct program should give a result or an error message that was predictable and comprehensible in terms of the source language program itself. Thus no core dumps should ever be necessary. It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time. A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to - they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law

    (emphasis added)

    Oh, I wasn't saying that nobody did useful stuff like that, even before 1972. (But I also note that the emphasised sentence talks about 1980, which is a little bit after 1972.)

    What I meant was that calling it "decades of well-established ..." makes those things sound a whole lot more mature and widespread than they were. One might also argue that the checks that Tony Hoare mentions are an early example of shifting the need to be careful away from the programmer, and I'd observe that this is not necessarily a good thing.

    C is, as @dkf says, good at what it does, and is best left to be what it is.



  • @Adynathos said in Shameless Promotion of LISP - FOR KIDS!:

    @Jaloopa said in Shameless Promotion of LISP - FOR KIDS!:

    They solved the halting problem?

    I think they mean "no undefined behaviour".

    Indeed, that is exactly what is meant. (And the point in the original text about "at compile time" is important. Certain constructs in C++, while valid, can cause issues in compilation, especially when templates are involved. Most of them are given prominent display in the classic "Modern C++ Design".)



  • @Maciejasjmj said in Shameless Promotion of LISP - FOR KIDS!:

    want to insert more than 10 lines of code in the middle of your program? Too bad, you should've thought of that the first time around!

    Discipline schmiscipline: CALL -10531


  • Impossible Mission - B

    @Steve_The_Cynic said in Shameless Promotion of LISP - FOR KIDS!:

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    @Steve_The_Cynic said in Shameless Promotion of LISP - FOR KIDS!:

    What "decades of well-established research and best practices in language design" would that be, in 1972? By then, there had only been just barely two decades of digital electronic computer programming, much less "well-established research and best practices in language design".

    Tony Hoare, talking about implementing ALGOL 60 in 1961:

    In that design I adopted certain basic principles which I believe to be as valid today as they were then.

    1. The first principle was security: The principle that every syntactically incorrect program should be rejected by the compiler and that every syntactically correct program should give a result or an error message that was predictable and comprehensible in terms of the source language program itself. Thus no core dumps should ever be necessary. It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time. A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to - they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980, language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law

    (emphasis added)

    Oh, I wasn't saying that nobody did useful stuff like that, even before 1972. (But I also note that the emphasised sentence talks about 1980, which is a little bit after 1972.)

    This was from a talk he gave in 1980; this part was about something he did back in 1961, well before the creation of C.

    What I meant was that calling it "decades of well-established ..." makes those things sound a whole lot more mature and widespread than they were. One might also argue that the checks that Tony Hoare mentions are an early example of shifting the need to be careful away from the programmer, and I'd observe that this is not necessarily a good thing.

    I wouldn't agree. You still need to be careful and get the code right if you want to get correct results, which is the whole point of writing and running a program in the first place. It just reduces the damage for when you mess up--and we must always remember errare humanum est--from "potentially catastrophic" to "annoying".

    It's the same principle as saying "cars should come with seat belts, airbags and crumple zones." Only a moron would use the safety features as an excuse to drive recklessly; they're there because accidents still happen and people shouldn't have to die from them.

    C is, as @dkf says, good at what it does, and is best left to be what it is.

    No, it's not. The Morris Worm proved that. C's original purpose was explicitly for building operating systems, and Morris showed that it's a horrible language for that purpose because the language's design promotes insecurity, and OSes--particularly network-facing ones--need strong security more than almost anything else in computing!



    @masonwheeler Jesus, what the hell are you people talking about, and what does it have to do with a site about learning LISP for 8-year-olds?

    You know what, nevermind, I'm just going to hit ignore.


  • Impossible Mission - B

    @blakeyrat Goodbye. 😄


  • Discourse touched me in a no-no place

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    the language's design promotes insecurity

    Technically, it's the language's standard library that is a pile of junk; lots of things in there either can't ever be used safely or can't be used without terrible performance. Or both. Always room for things to be wrong in several ways at once.

    But when someone goes round saying that that means that we need something that is eternally provably safe, they're missing the point. C is one of the tools that you build things out of, and if you make all your tools so that they can't ever injure anyone, you have many things that you can't build. Something has to make the safety systems themselves; that layer is typically written in an unsafe language (usually C or something even lower level than that). You can build safe systems in C, but you have to be very disciplined in how you go about it.



  • @dkf said in Shameless Promotion of LISP - FOR KIDS!:

    Technically, it's the language's standard library that is a pile of junk

    It's at least arguable that C's fundamental safety problems are rooted not so much in bad library routines but in the language's exposure of raw pointers and its complete lack of bounds checking on indexed dereference operations.

    Mind you, this argument is most often put by people who have missed the point that C, like BLISS and BCPL before it, is essentially a portable assembly language. It was designed to do the kind of work that people were already using assembler to do, while allowing that work to hop across architectures with less pain. It was originally a very small language capable of generating reasonably performant code even with a fairly simple-minded compiler, simply because the abstractions it provided were not very far from those of the underlying machine.

    The more control you get over exactly what the underlying machine is going to do, the more careful you need to be. In assembler you have to be exceedingly careful. In C you have to be very careful.

    It really is unfortunate that so much C has been written by people who seem to expect it to hold their hands as much as, say, Pascal would. But I don't think that makes the language bad; I think that makes the language choice process for those projects bad.


  • Discourse touched me in a no-no place

    @flabdablet said in Shameless Promotion of LISP - FOR KIDS!:

    It's at least arguable that C's fundamental safety problems are rooted not so much in bad library routines but in the language's exposure of raw pointers and its complete lack of bounds checking on indexed dereference operations.

    Also the unchecked casts. So useful. So evil. So dangerous.



  • @dkf Yes. The original C type system is more a collection of implementation hints than an actual abstraction.

    ANSI C tightened that up some, which I consider to have been a mistake. False sense of security and all that. If you're going to put guard rails on your meat slicer, it's best not to make them out of tinfoil.


  • Impossible Mission - B

    @dkf said in Shameless Promotion of LISP - FOR KIDS!:

    You can build safe systems in C, but you have to be very disciplined in how you go about it.

    At the risk of sounding like Bill Clinton, "can" can mean so many different things.

    Is it theoretically possible? Sure!

    Is it realistically achievable in practice? Show me a piece of network-facing C code of non-trivial complexity, at least 10 years old and widely adopted by a large number of users, that does not have a long history of security issues (and/or patches to fix security issues), and I'll concede that it is.


  • Dupa

    @Captain said in Shameless Promotion of LISP - FOR KIDS!:

    @blakeyrat Didn't you LOGO as a kid? Wasn't it awesome?

    I had it worse: LAMP. PHP!!! ;(


  • Winner of the 2016 Presidential Election

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    Is it realistically achievable in practice? Show me a piece of network-facing C code written in any language of non-trivial complexity, at least 10 years old and widely adopted by a large number of users, that does not have a long history of security patches, and I'll concede that it is.

    FTFY



  • @pydsigner C# code holds up well if it wasn't written by morons. The fixes are in the framework, not in the individual code.



  • they say that children need their parentheses


  • Discourse touched me in a no-no place

    @blakeyrat said in Shameless Promotion of LISP - FOR KIDS!:

    C# code holds up well if it wasn't written by morons.

    True that. Also Java code and a few other languages too. Heck, it's not that hard in C as long as you switch to async I/O mode and don't use stdio. Stdio is shit, almost as bad as C's string handling (which is genuinely awful). Nobody sane uses strcat() or gets().


  • Impossible Mission - B

    @dkf said in Shameless Promotion of LISP - FOR KIDS!:

    @blakeyrat said in Shameless Promotion of LISP - FOR KIDS!:

    C# code holds up well if it wasn't written by morons.

    True that. Also Java code and a few other languages too. Heck, it's not that hard in C as long as you switch to async I/O mode and don't use stdio. Stdio is shit, almost as bad as C's string handling (which is genuinely awful). Nobody sane uses strcat() or gets().

    This is because C's strings are awful.

    Is this the only way to store strings? No, in fact, it's one of the worst ways to store strings. For non-trivial programs, APIs, operating systems, class libraries, you should avoid [C-style] strings like the plague.

    -- Joel Spolsky, Back to Basics


  • Discourse touched me in a no-no place

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    This is because C's strings are awful.

    Not just that. There's no real integration between the string handling, the IO layer, and the memory manager. Those need to work together well for IO to not suck for normal work.



  • @masonwheeler @dkf "C strings" may be the worst thing ever in the history of computing


  • Discourse touched me in a no-no place

    @groo said in Shameless Promotion of LISP - FOR KIDS!:

    "C strings" may be the worst thing ever on the story of computing

    The original Pascal strings were contenders for that too. Having the length and the string together is great, but only using one byte for it? :headdesk:



  • @dkf It made tons of sense in 1968, when the typical mainframe user process was allotted less than 16K of memory and some minis had about that much total memory. Also, the language was meant for students to practice with, so no one expected anyone to use anything close to 255 characters. Hell, many input libraries of the time were based on the assumption of an 80-character line limit because that was slightly larger than the page width of an ASR-33.

    EDIT: Yeah, I muffed that, the ASR-33 actually had a 72-character line limit in the most common configurations.


  • BINNED

    @dkf said in Shameless Promotion of LISP - FOR KIDS!:

    The original Pascal strings were contenders for that too. Having the length and the string together is great, but only using one byte for it?

    No one will ever need more than 255 characters. 🐠



  • @ScholRLEA *mumble mumble* tldRLEA



  • @ScholRLEA said in Shameless Promotion of LISP - FOR KIDS!:

    the assumption of an 80-character line limit because that was slightly larger than the page width of an ASR-33

    I think you'll find that the 80-character convention originates from IBM punch cards, not those newfangled electric typewriters.


  • Discourse touched me in a no-no place

    @ScholRLEA said in Shameless Promotion of LISP - FOR KIDS!:

    Yeah, I muffed that, the ASR-33 actually had a 72-character line limit in the most common configurations.

    I think that it was because the first 8 positions of the punched card were for various sorts of control indicators, such as we know from the bad old days of FORTRAN. (There are some things that are not missed at all.)



  • @flabdablet said in Shameless Promotion of LISP - FOR KIDS!:

    @ScholRLEA said in Shameless Promotion of LISP - FOR KIDS!:

    the assumption of an 80-character line limit because that was slightly larger than the page width of an ASR-33

    I think you'll find that the 80 character convention originates from IBM punch cards, not those newfangled electric typewriters.

    Oh, that's something I hadn't thought of, yeah that's probably true. I wonder, does that go back all the way to Hollerith, or maybe even Jacquard? checks Wicked-Pedo Nope, Hollerith's original cards had 24 columns (later increased to 45), the 80-column format was from a 1928 redesign. Gotcha.


  • Impossible Mission - B

    @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    It was logically impossible for any source language program to cause the computer to run wild, either at compile time or at run time

    They solved the halting problem?

    1. They solved the "undefined behavior due to invalid memory access" problem.

    2. The Halting Problem is a bit of logical trickery that says that you can't write a Halts() function that will always return the correct value for every piece of code fed to it, because it's always possible for some joker to troll the algorithm by asking it to perform an operation equivalent to computing the truth value of "this sentence is false". It has very little applicability to real-world coding.


  • Discourse touched me in a no-no place

    @masonwheeler While that's true, it's also possible to show that you can write programs that only terminate if some unsolved mathematical hypothesis is proven. At that point, proving whether the program terminates is equivalent to solving something that the best minds in mathematics are still working on, and the termination problem is truly awful without needing to rely on encoding the program within itself.



  • @masonwheeler said in Shameless Promotion of LISP - FOR KIDS!:

    troll the algorithm by asking it to perform an operation equivalent to computing the truth value of "this sentence is false"

    ...as expressed in some arbitrary coding.

    That truth value is, of course, file_not_found.

    Perhaps one could attempt to work around this difficulty by writing something that analyzes whether or not any given program does in fact make its own ability to halt conditional on its never doing so.


  • Discourse touched me in a no-no place

    @flabdablet said in Shameless Promotion of LISP - FOR KIDS!:

    Perhaps one could attempt to work around this difficulty by writing something that analyzes whether or not any given program does in fact make its own ability to halt conditional on its never doing so.

    If you allow a three-valued result (yes, no, no-idea-lol 🐄) then it's pretty easy to write a termination checker; you could knock one up in the next 20 seconds or so that never gives a truly incorrect result. :p The problem lies in the sheer diversity of the space covered by the third option; there's some horrible things in there.



  • @dkf That kind of depends on the logical strength of no-idea-lol. If it means not-proven then fair enough. If it means not-provable then it's harder.

