Evolution of a Programmer



  • Found via digg.com, mildly amusing.

    Evolution of a Programmer



  • Hehehe.

    Especially the Guru hacker.



  • Nice!

    Forgot an Enterprise Programmer though... :)



  • Pseuds and academics may also like this version:

    http://www.willamette.edu/~fruehr/haskell/evolution.html




  • I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.



  • @dhromed said:

    I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.


    All of the ones I bothered to look at did factorials. Basically the Functional version of "Hello, World!"
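    For the curious, the computation every one of those snippets performs can be sketched in a few lines (using Python here rather than Haskell, purely for illustration):

```python
def fact(n: int) -> int:
    """Factorial by direct recursion -- the functional 'Hello, World!'."""
    return 1 if n <= 1 else n * fact(n - 1)

print(fact(10))  # 3628800
```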



  • @dhromed said:

    I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.

    "Norrisly" is an awesome word.  Is it based on the Chuck Norris jokes?  (And Haskell is a really cool language - you ought to give it a try some time.  http://www.haskell.org/ has links to tutorials and stuff.)



  • @iwpg said:

    @dhromed said:
    I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.

    "Norrisly" is an awesome word.  Is it based on the Chuck Norris jokes?  (And Haskell is a really cool language - you ought to give it a try some time.  http://www.haskell.org/ has links to tutorials and stuff.)


    1. Yes.

    2. Thanks.


  • Discourse touched me in a no-no place

    @dhromed said in Evolution of a Programmer:

    I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.

    It's even more scary if you know what those bits of code do.



  • @antiquarian

    Yeah, defining a programming language with catamorphism combinators, just to calculate a factorial, is crazy. Lots of those approaches are crazy unless you're defining a DSL (and even then they're kind of nutty at this point).

    I still like the "interpretive" approach. Very literally algebraic, even if verbose for this problem.
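    For contrast, the plain "fold" that those combinator-heavy definitions ultimately collapse to is tiny. A sketch in Python (with `functools.reduce` standing in for Haskell's fold) of factorial as a catamorphism over the list 1..n:

```python
from functools import reduce
from operator import mul

def fact(n: int) -> int:
    """Factorial as a fold (catamorphism) over the list 1..n --
    roughly what the combinator-based definitions boil down to."""
    return reduce(mul, range(1, n + 1), 1)

print(fact(5))  # 120
```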



  • @dhromed said in Evolution of a Programmer:

    I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.

    One common thing all programming languages must do, even if implicitly, is make a "parse tree" for a program and then traverse and interpret it. Most programming languages are "imperative", which effectively means that the parse tree is almost fully explicit (but "transparent" to you -- your code is evaluated from top to bottom, left to right; except at "jumps" but even then you can think about inlining yadda-yadda...).

    Many of the Haskell strategies are ways of encoding parse trees, and then defining their own traversals (and interpreters) for that flavor of tree. They can be hard to understand, especially if you don't know what the heck is going on to start with (even if only at a high level).

    That said, Haskell has two (or three-ish) idiomatic ways to encode "parse trees," and they're not as crazy as most of those. There's:

    1. data constructors as nodes
    2. monads to encode imperative-style parse trees
    3. the interpretive pattern (which is really the same as 1, except you explicitly write an interpreter as opposed to using data structures that come with interpreters, like lists)
    4. folds
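    A minimal sketch of strategies 1 and 3 together, in Python for illustration (the node and function names here are made up, not from any particular library): data constructors become tree nodes, and the "interpreter" is just an explicit traversal that assigns each node a meaning.

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Lit:
    """A leaf node holding a literal value (a 'data constructor')."""
    value: int

@dataclass
class Mul:
    """An interior node: multiply the meanings of the two subtrees."""
    left: "Expr"
    right: "Expr"

Expr = Union[Lit, Mul]

def interpret(e: Expr) -> int:
    """Traverse the tree and assign each node a meaning (strategy 3)."""
    if isinstance(e, Lit):
        return e.value
    return interpret(e.left) * interpret(e.right)

# 3! written out as an explicit parse tree
tree = Mul(Lit(3), Mul(Lit(2), Lit(1)))
print(interpret(tree))  # 6
```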


  • So much love for the 'void main()' in the case of 'new professional'. Sadly, that mistake is still abundant in the wild...



  • Oh, and to keep all those interested in the evolution of my language's design (which is probably no one, though @ben_lubar and @antiquarian might have some passing curiosity I guess), here's my latest insult to Alonzo Church's memory:

    novice Thelema programmer

    (def fact 
      (fn |n|
        (if (or (= 0 n) (= 1 n))
            1
            (* n (fact (- n 1))))))
    

    junior Thelema programmer

    (def fact
      (fn |n #[Integer (in #<0 1>)]|
        :Integer
        1))
    
    (def fact
       (fn |n #[Integer]|
         #[Integer]
         (* n (fact (dec n)))))
    

    "Experienced" Thelema Programmer - uses the basic Math library from the standard Akashic repo (assuming it is ever written)

    (lib myFactorial #<1 0 0 0 :production>
      (import ((Math%product)
               (functions%modifiers%memo)))
      (export (fact))

      (def fact
        "(fact n) - computes the factorial of n"
        (fn |n #[Integer]| #[Integer]
          :memo factorial-table
          (Math%product (range 1 n)))))
    

    Nothing funny here, really, unless you count the delusion that I am ever going to finish this turd.


  • Discourse touched me in a no-no place

    @Captain Typically with programs the interesting things don't happen on the parse trees, so much as the abstract objects you construct from them. Haskell is one of the languages that is relatively interesting for compiler writers to work in, though most actually work with Datalog and C++, the former because it's really good for figuring out the analysis parts of compilation, and the latter because there's some really good libraries out there that mean you don't have to reinvent lots of stuff (the LLVM optimiser is really good, provided you can accurately describe the semantics of your program).



  • @ScholRLEA That looks interesting.

    Looks like you're allowing overloading of user functions, what about primitives?



  • @ScholRLEA said in Evolution of a Programmer:

    "Experienced" Thelema Programmer - uses the basic Math library from the standard Akashic repo (assuming it is ever written)

    "Hipster" Thelema Programmer - uses twenty-nine micromodules

    ...etc...
    

  • :belt_onion:

    @boomzilla Those factorials aren't even properly left-padded!



  • @tufty said in Evolution of a Programmer:

    @ScholRLEA That looks interesting.

    Looks like you're allowing overloading of user functions, what about primitives?

    All forms are generics, so yes. However, there's a bit of sleight of hand involved in my current design plans - the only actual primitive forms in the language are for extending the compiler, and the only data type is $Raw. There isn't even a primitive lambda form.

    My current ideas for Thelema are very meta, sort of like how XML is both a data definition language and a language for defining DDLs. All the syntax aside from basic sexps comes from read macros and lexical specializers (similar to read macros, except that they run through hooks in the lexical analyzer, actually changing the FSA when they are in use and getting unloaded when they go out of scope). The compile-time transforms are done through macros (though for raisins I prefer the term 'gsexprs', short for 's-expression generator') and parse specializers (which have the same parallel to standard macros as lexical specializers do to read macros). Finally, code specializers determine the inline code generated - these can chain to existing code specializers, but can also have specific transforms to optimize for a specific target architecture and even a specific CPU model.

    I still need to work out how that last part will work, though, since code generation is (primarily) a JIT function. The AOT part of the compiler will have to detect when one is being used, and deactivate any AST optimizations that could mask the use of the specializable code. The relics for libraries using code specializers will have to carry the whole set of specialized generators, too, not just the AST. Gotta think about this one...

    Whatever. Most of the standard library - including all the other data types and forms - would be implemented in whole or in part as specializers. Since both gsexprs and specializers are first class objects with lexical scope (and can be defined with dynamic scope as well, indirectly), they only apply when referenced.

    The average programmer would have even less reason to write their own specializers than they would for gsexprs, and for release statuses higher than :beta, the compiler would actually give a warning when sacra containing one are compiled.

    Yeah, experimental system is experimental. Or maybe just mental.


  • Discourse touched me in a no-no place

    Do we have a zombie reanimation badger yet?



  • @lolwhat said in Evolution of a Programmer:

    Do we have a zombie reanimation badger yet?

    /me looks at the dates for the early posts in thread

    Ye gods and little fishes, that's some epic necromancy. I didn't even notice that earlier.


  • :belt_onion:

    @antiquarian said in Evolution of a Programmer:

    @dhromed said in Evolution of a Programmer:

    I'm now really frightened of Haskell because I have no idea what those bits of code do, even after staring at them norrisly.

    It's even more scary if you know what those bits of code do.

    Aw, your necromancy just makes me remember that @dhromed no longer posts...



  • @ScholRLEA said in Evolution of a Programmer:

    Yeah, experimental system is experimental. Or maybe just mental.

    Experimental, sure. Not mental.

    There's some interesting discussion going on at LtU at the moment, and you might want to look at John Shutt's stuff with fexprs.



  • @tufty Thank you, that does sound like an interesting debate.

    EDIT: I think I was looking at the wrong article, I'll keep looking.

    There's a discussion on optional static typing in dynamic languages I need to check into. Obviously, I have my own ideas on this subject; I see static type declarations less as a matter of type checking (since types are a property of values, not variables) and more as a way of enforcing contract constraints, and in my planned design, release levels will affect the degree to which they are checked.



  • On the issue of fexprs, macros, etc., I have some ideas of my own, though I would be surprised if no one had ever thought of them before - they seem to be pretty basic ideas, which makes me wonder if I have missed them in previous literature.

    First, I need to explain that (fn) is more than a shorter name for (lambda) - since the standard form is going to actually be (λ) anyway, and the name just an alias to that, there's no need to be any more concise in that regard - but is actually a fairly complex specializer, and has hooks to allow it to get modified even further.

    (Yes, the potential for abuse in the form of write-only code is a serious issue; I can't see a way to prevent that in any language, though I am trying to find ways to discourage it by making the more straightforward solutions easier and more efficient.)

    Hang on, I'm getting to the point eventually.

    Among the extensions in (fn) is that at compile time, it can generate four different forms for the parameters: the dirty parameters, which are the evaluated values, named by the user-provided symbols; the hygienic parameters, which are gensyms with the parameter values, named by the symbols enclosed in a < > pair; the lazy parameters, which are gensyms bound to a thunk of the parameter, named by the symbols enclosed in a | | pair; and the parameter literal, which is an unevaluated list of the original parameters. A code walker in the (fn) specializer checks to see which of these the function uses and generates those forms at compile time as needed, with functions using the lazy or literal versions marked so that call sites know whether they need to pass a thunk or not.
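    (The "lazy parameters bound to a thunk" idea above can be mimicked in any strict language; here is a hypothetical Python sketch, where the thunk is just a zero-argument closure that is only forced on demand - `my_if` is an invented name, not part of any library:)

```python
def my_if(cond, then_thunk, else_thunk):
    """A conditional that takes its branches as thunks: only the chosen
    branch is ever evaluated, mimicking a lazy parameter."""
    return then_thunk() if cond else else_thunk()

# The untaken branch would divide by zero, but it is never forced.
result = my_if(True, lambda: "taken", lambda: 1 // 0)
print(result)  # taken
```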

    Getting there, getting there...

    Basically, a gsexpr is exactly what the name implies - a function that takes a group of values and returns a new s-expression. The process of a HOF returning an s-expression is logically separate from the use of that function to generate code at compile time, and binding it to the macro evaluator is a separate action (though there will be shortcuts for it).

    While using the dirty params in a gsexpr is permissible, if you bind a gsexpr to compile time evaluation, a warning will be given.

    I'll get back later, gotta go.



  • In the East there is a shark which is larger than all other fish. It changes into a bird whose wings are like clouds filling the sky. When this bird moves across the land, it brings a message from Corporate Headquarters. This message it drops into the midst of the programmers, like a seagull making its mark upon the beach. Then the bird mounts on the wind and, with the blue sky at its back, returns home.

    The novice programmer stares in wonder at the bird, for he understands it not. The average programmer dreads the coming of the bird, for he fears its message. The Master Programmer continues to work at his terminal, unaware that the bird has come and gone.

    From here:

    Unfortunately the nonsense held within makes more sense the longer I am in the industry.


  • Discourse touched me in a no-no place

    @ScholRLEA said in Evolution of a Programmer:

    Among the extensions in (fn) is that at compile time, it can generate four different forms for the parameters: the dirty parameters, which are the evaluated values, named by the user-provided symbols; the hygienic parameters, which are gensyms with the parameter values, named by the symbols enclosed in a < > pair; the lazy parameters, which are gensyms bound to a thunk of the parameter, named by the symbols enclosed in a | | pair; and the parameter literal, which is an unevaluated list of the original parameters. A code walker in the (fn) specializer checks to see which of these the function uses and generates those forms at compile time as needed, with functions using the lazy or literal versions marked so that call sites know whether they need to pass a thunk or not.

    Have you considered contributing to an existing Lisp dialect?



  • @ScholRLEA said in Evolution of a Programmer:

    Among the extensions in (fn) is that …

    Seems like an awful lot of work to combine things that I wouldn't have given the same name.



  • @antiquarian said in Evolution of a Programmer:

    Have you considered contributing to an existing Lisp dialect?

    Yes, actually, and in fact others have said the same thing (just as others have asked why I am writing Kether instead of contributing to Linux), but there are a number of reasons why I don't really think that would be very effective.

    First off, this is mostly a personal 'climb the mountain because it is there' project. For better or worse, ego satisfaction is a, if not the, major motivation in this.

    Second, I'm looking to experiment with several factors, some of which straddle the line between implementation concerns and language design concerns, and between coding concerns and community concerns. I would have to discard some or all of any existing language I would choose to work with anyway.

    Third, when I have implemented or proposed implementing some of my ideas in, say, Scheme or Common Lisp, the usual reactions are "Why would you do something like that?", "That's a terrible fit for the rest of the language," "You really don't get the point of the language as it is now," or "Go away, you are distracting us from the ideas we want." Admittedly, this response was most often on c.l.s., where any proposed changes set off a war of words, but still.

    Finally, some of my ideas are really, really bizarre. Like the whole system of reliquaries (library repositories) having versioning at each level, so they can be manipulated either as whole libraries or as individual sacra (source code elements - functions, data type definitions, macros, specializers, etc.). Or integrating a hypertextual LP documentation system into the compiler. Or building almost everything that goes into a conventional language - including things like arithmetic and basic data types - as library specializers, to a far greater degree than even Ada-84 did (yes, the compiler will have weave and tangle built into it, but not plus and minus). Or having release status be part of the definition of a library, in the source code (or alternatively, in a resource document), and having it affect the warnings and optimizations the compiler performs. Stuff I am not even sure is possible, some of it.

    EDIT: And before anyone asks, the Akashic%math library, among others, is imported by default by the listener, so code that is run interactively can skip a lot of the boilerplate that stand-alone programs (especially ones in status :production) would require.

    (Akashic being the base Thelema language reliquary; think CPAN or PyPI, but more integral to the language itself. Client-programmers can create their own reliquaries, but Akashic is meant to be carefully managed, and will only add :production sacra which have been vetted by other developers.)



  • @tufty said in Evolution of a Programmer:

    Seems like an awful lot of work to combine things that I wouldn't have given the same name.

    That's a fair point, and one I'm trying to keep in mind as I work through things. I'm really not sure if these are good ideas at all yet.

    However, a lot of the goal is to see if there are other ways of organizing these things, and to see if those ways are worth exploring. Whereas Lisp programmers sometimes use things like (LOOP) to destructure lists or objects, or (syntax-case) to destructure macro arguments, I'm working on destructuring the whole process of evaluating programs themselves, both at compile time and at run time. Is it worth doing? I don't know yet.

    Also, much of the intent is to make it easier to wrap these things up in a way that abstracts them from the programmer's view whenever they don't need to see it, and to limit (or at least monitor) the amount of abstraction leakage going on. There will be plenty of sugar coated over these things for day to day use, but the option will be there to go back to the underlying mechanisms when you need to, without having to either rewrite the compiler itself or use blocks of assembly code in otherwise high-level programs.




  • Discourse touched me in a no-no place

    @ScholRLEA said in Evolution of a Programmer:

    First off, this is mostly a personal 'climb the mountain because it is there' project. For better or worse, ego satisfaction is a, if not the, major motivation in this.

    That's one of the best reasons for doing it! It's not some business bullshit.

    Or building almost everything that goes into a conventional language - including things like arithmetic and basic data types - as library specifiers, to a far greater degree than even Ada-84 did (yes, the compiler will have weave and tangle built into it, but not plus and minus).

    There are languages where there are no keywords at all and virtually everything that most ordinary programmers consider to be the language is actually a standard library (which happens to have extra support from the compiler, but that's an implementation aspect only). You really don't need all that much at the baseline level.



  • @dkf said in Evolution of a Programmer:

    You really don't need all that much at the baseline level.

    Damn straight. McCarthy's LISP fits on half a page of paper.

