The Future of Computer Programming
-
BTW the present in this video is the 1970s.
-
@lucas1 Looks like fake 1970s, what with all the laughing.
-
That is kind of the point. I believe he is supposed to be dressed like a 1960s/1970s IBM "company man", but I am not 100% sure that is the right term.
-
@CreatedToDislikeThis said in The Future of Computer Programming:
@lucas1 Looks like fake 1970s, what with all the laughing.
And the cell phones being held in audience?
-
@dcon you are supposed to use your imagination
-
I remember PLATO...
-
@dcon well I don't because it was before I was born
-
@lucas1 said in The Future of Computer Programming:
@dcon well I don't because it was before I was born
I was at orchestra camp at UoI back in 1975 (I think - can't remember exactly). Played games on it...
-
@dcon I always find it quite funny how Americans have these camps for activities. In England you just used to either irritate your parents or hang around with your mates for weeks trying not to be bored.
-
Anything good in the video aside from the "hurr hurr computers were so big" laughs?
-
@CreatedToDislikeThis GRAIL looks pretty bloody amazing considering the tech at the time.
-
@CreatedToDislikeThis It was actually pretty interesting... It's being presented as though it's in the 70s - kind of a review - but the point is that we get too much into a 'I know what I'm doing' rut and don't look at other things. He summarizes it pretty well starting at about 29m.
-
@dcon the most interesting takeaway for me was that APIs and API integration over a network are a dumb way to get stuff to communicate.
It is usually the most tedious part of software dev, and some of the stuff IDEs do these days takes the pain away, but it is always a hassle.
-
As I watched, I thought about how familiar the places this guy was going were, so I googled his name. Lo and behold, it is the very same guy.
-
@pydsigner other than that I hate how his website works, is that a bad thing?
-
@dcon the other takeaway for me was that just because things are newer doesn't necessarily mean they are better.
-
Thank you for posting this! I found this a year ago but wasn't able to find it again.
-
@CreatedToDislikeThis said in The Future of Computer Programming:
Anything good in the video aside from the "hurr hurr computers were so big" laughs?
He talks about a bunch of early experiments in HMI/computer inputs. They're actually pretty cool and impressive considering their age. I feel that the points he raises about how interaction could stand to be better are something to at least think about. Don't get turned off by the intro, his talk is pretty interesting IMO
-
I now require an overhead projector with a camera conversion in my life.
Fun fact: My team's conference room has a light fixture hanging over the table that takes 3 overhead projector bulbs.
-
This talk is really cool. It touches on a lot of "COMING SOON!" stuff from the '70s that didn't come true, and some that did (but not the way anyone thought it would at the time).
There are three lessons here:
- Anyone talking about how programming is going to work in the future is pretty well full of shit.
- The more we know about where we've been, the better we can properly steer where we're going.
- We are literally doing the exact same fucking thing they did 40 fucking years ago, but it's shinier now. And that's depressing.
-
@Weng said in The Future of Computer Programming:
This talk is really cool. It touches on a lot of "COMING SOON!" stuff from the '70s that didn't come true, and some that did (but not the way anyone thought it would at the time).
There are three lessons here:
- Anyone talking about how programming is going to work in the future is pretty well full of shit.
- The more we know about where we've been, the better we can properly steer where we're going.
- We are literally doing the exact same fucking thing they did 40 fucking years ago, but it's shinier now. And that's depressing.
As he was talking, it seemed to me like he was over-simplifying and eliding a lot of constraints in order to score points. Much of the snarky "surely we won't be doing this in 40 years" is wrong in its snark (a lot of that stuff can be or is done... in certain domains) or sensibility (we do do the graphical/spatial coding he was talking about - it's just a lot more subtle because we have better UI).
As some examples:
-
Spatial coding: look at all the neat spatial info/manipulations a modern IDE and language provide. Code folding is a major, popular example. There's also the modern practice of separate files for separate objects/function-collections, which maps pretty well to the Pascal UI he was talking about.
-
Multiprocessors: The snark about Intel having a stranglehold is misplaced - GPUs have been around a long time... but a lot of day-to-day processes are pretty much constrained to linear operation, which is best served by a linear-optimized processor.
His talk says "These are good ideas. We should be using them everywhere in (pretend) 40 years." Reality says "A: We are. B: A lot of what we're doing simply doesn't need that. Choose the right tools for the tasks."
The end was basically "don't fall into the blub paradox", which is good advice.
It was interesting to see and hear about some of the things that were going on back during prehistory and early history, but I wasn't fond of his pretending like we never made use of nor expanded on those ideas.
-
@lucas1 said in The Future of Computer Programming:
BTW the present in this video is the 1970s.
Considering the HD camera pointing down at the projector and the HD quality, I doubt that.
Well, unless you mean it's supposed to be representative of what a talk during that time might have sounded like.
-
Haven't had time to watch the video but just before I rush out...
@Dreikin said in The Future of Computer Programming:
Code folding is a major, popular example.
And existed in SPF, circa 1975. It was a major feature of ISPF, which was around 1980.
There's also the modern practice of separate files for separate objects/function-collections, which maps pretty well to the Pascal UI he was talking about.
That's not all that modern either. It became mainstream with structured programming; again, see SPF. Also see the CADR machine source, which IIRC is around 1976-1980.
-
@Dreikin I don't think it is that important how accurate it is. It is just supposed to make you think a bit differently before firing up a text editor and just hacking away. That was my takeaway from it anyway.
-
@lucas1 said in The Future of Computer Programming:
It is just supposed to make you think a bit differently before firing up a text editor and just hacking away.
Yeah, like that's a thing that's gonna happen. We've been doing more or less the same thing, making all the same mistakes, for 40 years and over several generations of programmers. We pretty much know what we're doing wrong, but we do it anyway.
Even tw@wood, who thought he was making something new and shiny, reinventing things, was doing the same fucking thing. Admittedly, he added new ways of doing it wrong...
-
@tufty well it is a relatively young profession compared to others that have been around for centuries.
I am sure there were Roman architects bemoaning the lack of innovation in construction.
-
@lucas1 said in The Future of Computer Programming:
I am sure there were Roman architects bemoaning the lack of innovation in construction.
And other Roman architects trying to invent lawns so they could tell the plebs to get off them…
-
@dkf said in The Future of Computer Programming:
invent lawns
why else did they need those aqueducts... those lawns needed water!
-
@Dreikin said in The Future of Computer Programming:
I wasn't fond of his pretending like we never made use of nor expanded on those ideas.
For example, HTTP: clients first ask each other which version of the protocol the other supports before sending requests. I think TCP and IP do that as well... my point is, I think we've partially embraced some of the ideas he mentioned. Ofc we haven't gotten to the point where you don't need to implement an API client, but it's better than nothing ¯\_(ツ)_/¯
-
@bb36e said in The Future of Computer Programming:
For example, HTTP: clients first ask each other which version of the protocol the other supports before sending requests.
No. Nice try.
HTTP is designed specifically so that the client can just splat the request at the server and the server then just splats the response back to the client. There's no need for complicated negotiation at the protocol level itself. (The client can ask for info first, but really doesn't have to.) Instead, the negotiation is done by careful formatting of requests and responses, making for really complicated code (HTTP clients tend to get gnarly, and HTTP servers are much more complicated than that) but very fast communication.
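As a toy illustration (the helper below is made up, not from any real client library): the version rides along on the request line itself, and "negotiation" is just headers sent in the same round trip.

```python
# Hypothetical sketch: no up-front version handshake in HTTP.
# The client splats the whole request, version included, and the
# server answers in kind; content negotiation is done via headers.

def build_request(host, path="/"):
    # The protocol version is stated on the request line itself.
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Accept: text/html\r\n"   # negotiation happens in-band, via headers
        "\r\n"
    )

print(build_request("example.com").splitlines()[0])  # GET / HTTP/1.1
```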
-
@dkf Yeah, and for IP/TCP the client decides the version, and if it's unacceptable it'll never get a response. TLS is the only one I know of that negotiates (the client sends minimum and maximum protocol versions, and the server picks one).
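A toy model of that pick - client offers a [min, max] range, server takes the highest version it also supports, or the handshake fails. The numeric codes mirror TLS's wire values, but this is a sketch, not a TLS stack.

```python
# Illustrative TLS-style version negotiation; not real TLS code.
TLS_1_2, TLS_1_3 = 0x0303, 0x0304      # wire codes for TLS 1.2 / 1.3
SERVER_SUPPORTED = {TLS_1_2, TLS_1_3}  # hypothetical server config

def pick_version(client_min, client_max, server_supported=SERVER_SUPPORTED):
    # Server scans its supported versions from highest to lowest and
    # takes the first one inside the client's offered range.
    for v in sorted(server_supported, reverse=True):
        if client_min <= v <= client_max:
            return v
    return None  # no overlap: the handshake fails

print(hex(pick_version(TLS_1_2, TLS_1_3)))  # 0x304
```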
-
@dkf :x
-
@bb36e said in The Future of Computer Programming:
@CreatedToDislikeThis said in The Future of Computer Programming:
Anything good in the video aside from the "hurr hurr computers were so big" laughs?
He talks about a bunch of early experiments in HMI/computer inputs. They're actually pretty cool and impressive considering their age. I feel that the points he raises about how interaction could stand to be better are something to at least think about. Don't get turned off by the intro, his talk is pretty interesting IMO
Yes, I think these things are interesting, but he seems to suggest that the reason we haven't switched to using these cool new systems that he professionally researches is that programmers are resistant to change. My counter-question is, "if all this technology was created 30 years ago, why didn't we switch?" Maybe the problem is that a lot of that stuff is harder than he suggests?
And that brings me to...
@lucas1 said in The Future of Computer Programming:
@pydsigner other than that I hate how his website works, is that a bad thing?
Yes. He's an experimental developer guy who gets paid to faff around with cool stuff, but he's bought into it so much - because there are these little niches where this stuff works - that he's convinced himself everyone else is just a bunch of backwards idiots because we aren't using it in all the areas where it doesn't.
-
@Dreikin said in The Future of Computer Programming:
Multiprocessors: The snark about Intel having a stranglehold is misplaced - GPUs have been around a long time....but a lot of day-to-day processes are pretty much constrained to linear operation, which is best served by a linear-optimized processor.
Actually, the architecture he was trying to convey there isn't the one used by GPUs - there's still a giant pool of shared memory being accessed by more than one processor.
The model he was talking about has a bunch of processing units with independent memory talking to each other over an interconnect fabric.
High performance scientific computing has been there for ages. But those techniques have only barely made their way back down into commercial computing. Part of it is that the problem space combined with Moore's law hasn't required it yet. But as we run out of physics, we're going to have to start thinking about it.
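A toy Python simulation of that shape - private memory per node, with messages over a fabric as the only way to share anything. Nothing here is from the talk; all names are made up for illustration.

```python
# Sketch of a message-passing machine: nodes own their memory and
# communicate only via inboxes registered on a shared "fabric".
from collections import deque

class Node:
    def __init__(self, name, fabric):
        self.name = name
        self.memory = {}            # private memory; no other node touches it
        self.inbox = deque()
        fabric[name] = self.inbox   # register with the interconnect fabric
        self.fabric = fabric

    def send(self, dest, payload):
        # The only cross-node operation: post a message onto the fabric.
        self.fabric[dest].append((self.name, payload))

    def step(self):
        # Process one pending message, updating only local memory.
        if self.inbox:
            sender, payload = self.inbox.popleft()
            self.memory[sender] = self.memory.get(sender, 0) + payload

fabric = {}
a, b = Node("a", fabric), Node("b", fabric)
a.send("b", 2)
a.send("b", 3)
b.step(); b.step()
print(b.memory)  # {'a': 5}
```

Real message-passing systems (MPI clusters, for instance) look nothing like this internally, but the discipline is the same: no shared pool of memory, only explicit sends.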
-
@pydsigner said in The Future of Computer Programming:
Maybe the problem is that a lot of that stuff is harder than he suggests?
A lot of it comes down to politics.
-
@Weng said in The Future of Computer Programming:
The model he was talking about has a bunch of processing units with independent memory talking to each other over an interconnect fabric.
-
@flabdablet That's one of my favorites. TL;DR: everything we know and love is an inferior solution that beat out the better solution because of buttweaselry, and now we're fucking stuck with it forever.
-
@Weng Like VHS?
-
@PleegWat or TCP/IP.
-
@Weng said in The Future of Computer Programming:
TCP/IP.
Which “better” solution were you thinking of there? (I've seen a few back in the early '90s and they sucked.)
-
@dkf CLNP (DECnet ripped off a bunch of the key concepts from it and can be considered mostly equivalent) beat the pants off of IP on virtually every front. DHCP is a dumb hack, NAT is a hack, ARP is a hack. None would have been necessary if politics hadn't played into the alleged meritocracy of internet standards.
Seriously. Watch that Radia Perlman talk. It's awesome.
TCP itself is fine, you can run it on top of CLNP. People did. It worked.
-
@Weng said in The Future of Computer Programming:
Seriously. Watch that Radia Perlman talk. It's awesome.
Seriously. Watch any Radia Perlman talk. She's awesome.
-
@flabdablet said in The Future of Computer Programming:
Seriously. Watch any Radia Perlman talk.
If it was 5 minutes, yes. About 75 minutes? Meh. I'll read it some other time…
-
@dkf I watched it at the gym.
-
@Weng Did you become a double dick super predator: