I met a Real Programmer recently



  • I met an old man I'll call Dave who was recently laid off from his job (one he'd had for decades) programming for IBM mainframes. I know next to nothing about that sort of programming -- I write enterprisey web-based information systems (TRWTF ROFLMAO mirite). I have all sorts of respect for his kind though, so I talked with him about programming for a while. He was just as hardcore as I had hoped when I heard the word "mainframe" -- his primary languages were Assembly, FORTRAN, and COBOL. What surprised me was his description of modern mainframes. Apparently they're not so "old hat" as us young'uns like to think, and new stuff is made for them all the time. It's a bit different, though: he said that the hip new way of using a mainframe was no longer to write one great big mainframe application that runs on the bare metal and handles everything. Rather, the new way was to take a mainframe and make it run ~200 virtual machines of Linux, and within those, run ordinary modern C++ applications that all work together to serve the users. He would have none of this, though, as he regarded both C and C++ as "toy languages". Rather than get into that, he's now starting a rather low-class non-technical job, as it's all he can find that he's willing to do. Hardcore, I told you.

    In response to the 200-virtual-machines thing, I said that it kinda sounded like the reverse of cloud computing. He got visibly irritated and explained that everyone has a different definition of "cloud computing" and therefore the entire concept is useless to talk about.

    He railed against the evils of PC hardware in a server role -- the administration complexity, the high per-capacity cost at larger scales, the security weaknesses, the myriad OS licenses needed. He's right, of course -- one person can admin a mainframe, whereas a large farm of PC servers (especially when their roles are diverse and the network is complex) can be difficult to maintain, with many opportunities for mistakes. Potential hackers are familiar with how a PC operates, but most know nothing about even using a mainframe. However, the way he explained all this was akin to a kid proclaiming the merits of his Xbox 360 to a rival PS3 owner. He decried the reluctance of companies to use mainframes, making fun of their sticker shock at the price of one and citing the comparatively low per-seat cost of connected thin clients. He gave a very vague description of how a mainframe has myriad "layers" of security. Then there's this gem:

    "You know all those big financial companies that went down recently? Well, we built some things for their competitors. That's right: the ones that won used mainframes. And you know what the failures used, the ones like AIG that needed the government bailouts 'cause they were doing so bad? THEY USED PCs. Yep. It just goes to show ya."

    He even bashed the PC in general:

    "PCs today are thousands of times more powerful than the machines I grew up working with, but they're still slow as hell and you have to wait so long to do the simplest things sometimes. The reason Windows is so slow these days is that Microsoft refuses to prune or refactor old code. They're still using the same kernel they had in DOS."

    He asked me what I did, and I briefly mentioned my C++/Java/PHP work. His immediate reaction was "So, you make toys, pretty much." I was a bit taken aback, but I suppose I can't be too offended at this since I have the same opinion about Visual Basic and friends. Anyway, after hearing more about my PHP web systems he said "So, you use a NETWORK for that, right?" I had to pause before confirming -- after all, it's not often that I have to think about the actual network being used when I'm writing a PHP app. This prompted another gem from him:

    "In the system I was working on, we had around 20 databases, each with over 7 million records, and thousands of concurrently active users, and we were able to keep the response time for all requests under FOUR SECONDS. Let's see you do THAT with a NETWORK."

    ... ending with a smug grin, as he actually leaned back in his chair in victory. I had an immediate thought about the Amazon cloud services, but couldn't bring myself to say anything. I can't remember if it was because I was intimidated, or because I was afraid of making his brain explode.

    This reminds me of 2.3 from the Tao of Programming.

     A programmer from a very large computer company went to a software conference and then returned to report to his manager, saying: ``What sort of programmers work for other companies? They behaved badly and were unconcerned with appearances. Their hair was long and unkempt and their clothes were wrinkled and old. They crashed our hospitality suite and they made rude noises during my presentation.''

    The manager said: ``I should have never sent you to the conference. Those programmers live beyond the physical world. They consider life absurd, an accidental coincidence. They come and go without knowing limitations. Without a care, they live only for their programs. Why should they bother with social conventions?

    ``They are alive within the Tao.''

    I wonder if Dave was one of those guys.



  • Well, Dave would obviously feel and express some bitterness (due to his current situation). He had to develop some fairly mind-numbingly complex skill sets to do his job, and has been obsoleted by "simpler" technologies and platforms. He isn't necessarily JUSTIFIED in his assertions, but they are to be expected considering his position.

    And we'll all be in the same boat eventually. I shudder at the thought of "cloud" computing really taking hold, and was at one time worried about Java and C# making me obsolete. Thankfully anything that can be done in C# can be done in VB, CLRs are making more and more programming tasks language-agnostic, and Java is about to be locked tight behind an exorbitant licensing-fee wall, delaying my obsolescence.

    We all will eventually look at the current state of our fields with the same disdain and harken back to the "glory days".

    "So, you use a NETWORK for that, right?"



  • He's got a point or two, though. I remember working on my university's 1MB DEC mainframe with 100 other users, and it responded just fine. I have a quad-core PC with Vista, which sometimes locks up for a few seconds. It can run games at a high frame rate, but it cannot multitask properly. And it uses 1GB or so just to start up. That's not good.

    About the database response: it depends on how many records you need for your response, of course. Our clients require reports with aggregates over 100k+ records within 10 seconds. That's a very different kind of sport than serving 100k+ clients who all want to see 1 record. But if I remember the stories about DB2 on those MVS things, it was even more hateable than Oracle. And The Real Toy Language (TRTL) is of course COBOL. That's one cluster-fuck of a language. Good riddance. Now we only have to get rid of Perl...



  • @TGV said:

    He's got a point or two, though. I remember working on my university's 1MB DEC mainframe with 100 other users, and it responded just fine. I have a quad-core PC with Vista, which sometimes locks up for a few seconds. It can run games at a high frame rate, but it cannot multitask properly. And it uses 1GB or so just to start up. That's not good.

     

    But today's computers are actually doing more while "doing nothing", which I think is part of the problem.

    If I boot my computer into a single-user shell, it will be doing nothing but waiting for my input. But if I boot into a normal session, the computer will be waiting for my input while syncing my files, updating some search index, checking my mail, scanning for wifi access points, calculating my battery time, etc.

    Now while I hate those unresponsive moments just as much as the next person, I do have to say that at the end of the day I also like and expect it to do all those other things. Yes, there are still stone-age critters running about who prefer to dictate 100% when and if their computer should do things, but I think the majority will prefer their computer doing stuff in the background, and will unfortunately have to live with the occasional spot of unresponsiveness.



  • @stratos said:

    But if I boot into a normal session, the computer will be waiting for my input while syncing my files, updating some search index, checking my mail, scanning for wifi access points, calculating my battery time, etc.

    I'm not at all saying I don't want my computer to do that: my whole point was that modern PC OSes are pretty bad in multi-tasking, whereas old mainframes could serve over 100 processes without a hiccup. The BeOS was an example of multi-tasking made important in OS design. Way back (1996), it could run several movies simultaneously and still be responsive, with two 133MHz processors. Apparently, it was a bitch to program, but to the user it behaved better than a 2.somethingGHz quad core.



  • Sorry, that wasn't actually targeted at you; I just went off on a small rant.



  • @TGV said:

    The BeOS was an example of multi-tasking made important in OS design. Way back (1996), it could run several movies simultaneously and still be responsive, with two 133MHz processors.
     

    Pretty sure that wasn't 720p.



  • @TGV said:

    The Real Toy Language (TRTL) is of course COBOL. That's one cluster-fuck of a language. Good riddance. Now we only have to get rid of Perl...

     

    1.2

    The Tao gave birth to machine language. Machine language gave birth to the assembler.
    The assembler gave birth to the compiler. Now there are ten thousand languages.
    Each language has its purpose, however humble. Each language expresses the Yin and Yang of software. Each language has its place within the Tao.
    But do not program in COBOL if you can avoid it.



  • Fuck you guys I love COBOL.

    Congrats, however, on capitalizing it properly.



  • That's a cool story, and while you can have a sense of respect for the guy's experience and seniority, at the same time he sounds like a pompous ass, and a bit of a misguided one at that. I don't know how he could draw the conclusion that the companies which went bust or needed bailouts, like AIG, did so because they used PC hardware instead of mainframes (correlation does not imply causation). Those companies failed or struggled because they had fundamentally bad business models, not slow hardware.



  • Perhaps his point about using a network was a poorly-worded way of saying that HTTP takes a lot of processing the way we use it in web frameworks and that sometimes you can do it faster and easier by blasting char streams at 3270s, which allow a surprising level of autonomous operation.



  • @arty said:

    Perhaps his point about using a network was a poorly-worded way of saying that HTTP takes a lot of processing the way we use it in web frameworks and that sometimes you can do it faster and easier by blasting char streams at 3270s, which allow a surprising level of autonomous operation.


    Yeah, web frameworks add major overhead. I really don't like HTTP. Why couldn't we establish a nicer protocol for sending text?

    @TGV said:

    And The Real Toy Language (TRTL) is of course COBOL. That's one cluster-fuck of a language. Good riddance.



    Not quite yet. Last term we had IBM over for a presentation, and they're pretty ambitious about teaching the young folks old languages, COBOL being their first choice. One of their newer products includes an Eclipse-based IDE for building COBOL with pointy-clicky interface tools.

    And what's wrong with Perl? I'd take that over C any time.



  • @Shortjob said:

    I really don't like HTTP. Why couldn't we establish a nicer protocol for sending text?

    There is the gopher protocol, which is better if all you need to do is send text to the client based on a selector string. And for other uses there are other protocols, such as SMTP, POP3, and Telnet. Sometimes HTTP is good for some things too, but not everything. They tend to use HTTP for a lot of things these days, though, and I think that is not quite the best way to do so! And yes, 3270 might be a better way for some things, even.

    @Shortjob said:

    Last term we had IBM over for a presentation and they're pretty ambitious about teaching the young folks old languages, COBOL being their first choice.

    COBOL stands for "common business-oriented language", so perhaps some people like to use it for common business functions. (I don't find it necessary because I can write my own instead; but some people can have a different opinion, please.)

    @Shortjob said:

    And what's wrong with Perl? I'd take that over C any time.

    I use C any time. Actually, I usually use Enhanced CWEB, which allows your program to have a table of contents and index, allows including TeX code in the file as well, allows writing code in any order, and allows writing code for code generation (it has a C interpreter built in, so that you can write code to generate the other code, if you want to), and more....
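    As a minimal sketch of the difference being pointed at here -- a gopher request is just the selector string plus CRLF, while a comparable HTTP/1.1 request adds a request line and headers -- the two can be compared byte for byte. (The hostname and path below are purely illustrative, and nothing is actually sent over a network.)

```python
# Compare the on-the-wire size of a gopher request with a minimal
# HTTP/1.1 request for the same document. A gopher client connects to
# TCP port 70 and sends only the selector followed by CRLF; an HTTP
# client must send a request line plus headers before the blank line.

def gopher_request(selector: str) -> bytes:
    # Everything a gopher client sends before reading the response.
    return selector.encode("ascii") + b"\r\n"

def http_request(path: str, host: str) -> bytes:
    # A comparable bare-bones HTTP/1.1 request for the same resource.
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "User-Agent: example/1.0\r\n"
        "Accept: text/plain\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")

g = gopher_request("/motd.txt")
h = http_request("/motd.txt", "example.org")
print(len(g), len(h))  # the gopher request is a small fraction of the size
```

    Of course, per-request bytes are only part of HTTP's overhead (parsing, statelessness, and framework layers are the rest), but it illustrates why a selector-string protocol feels lighter for plain text.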


  • @TGV said:

    I'm not at all saying I don't want my computer to do that: my whole point was that modern PC OSes are pretty bad in multi-tasking, whereas old mainframes could serve over 100 processes without a hiccup. The BeOS was an example of multi-tasking made important in OS design. Way back (1996), it could run several movies simultaneously and still be responsive, with two 133MHz processors. Apparently, it was a bitch to program, but to the user it behaved better than a 2.somethingGHz quad core.

    BeOS was good at multitasking because it foisted all the work on the application programmer. (For example, you couldn't create a BeOS application with fewer than three threads, IIRC. It forced you to put your UI in a different thread from the rest of the application logic, which is great if you're writing, say, Photoshop, but awful if you just want to crap out a quick-and-dirty business app or teach a kid BASIC on the thing.)

    It wasn't as if Be created some miracle OS; they just created a large set of guidelines (ones that would lead to good multitasking on *any* OS) and forced application developers to use them. If your Windows apps/services/drivers followed the same BeOS guidelines, they'd run significantly more smoothly too.

    Also, people who wax poetic about BeOS have a tendency to forget its shortcomings. For example, it somehow claimed to be POSIX-compatible while having absolutely zero multi-user support. They didn't have printing support until version 4. Their file format system, while extremely clever, was pretty much designed to put software companies out of business -- why would Adobe ever port Photoshop to the thing? Or Microsoft port Word? Or pretty much any complex app? (Note: at the same time, Apple was working on OpenDoc, which had the same problem. The only application they ever got ported to it was ClarisWorks, and that's only because they owned it at the time!)

    I mean, on the one hand, BeOS didn't deserve to be killed off the way it was, but on the other hand they obviously had no business sense whatsoever. It's an OS design which made applications hard to program, and it also spit all over third-party applications.

    And even then, we'd be using BeOS *right now* if they weren't stupid enough to throw out Apple's extremely generous offer. Imagine an OS X based on the beautiful and clean BeOS kernel instead of on that shitty NeXT/BSD/Mach/whatever mess they have now!
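    The guideline described above -- keep the UI thread doing nothing but UI, push the real work onto a worker thread, and have the two talk only through messages -- can be sketched in plain Python. (The names here are illustrative, not real BeOS APIs.)

```python
import queue
import threading

# BeOS-style thread separation, sketched: the "UI" thread only drains a
# message queue and so stays responsive, while a worker thread performs
# the slow computation and posts its result back as a message.

def worker(msgs: queue.Queue) -> None:
    total = sum(i * i for i in range(100_000))  # stand-in for slow work
    msgs.put(("done", total))                   # report back; never touch the UI

msgs: queue.Queue = queue.Queue()
threading.Thread(target=worker, args=(msgs,), daemon=True).start()

# The "UI loop": it blocks only on messages, never on the computation
# itself, so between polls it would be free to repaint and handle input.
kind, value = msgs.get(timeout=5)
print(kind, value)
```

    The design choice is the point: nothing here is BeOS-specific, which is exactly the argument -- the same message-passing discipline yields smooth multitasking on any OS, BeOS just made it mandatory.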



  • @blakeyrat said:

    Imagine an OS X based on the beautiful and clean BeOS kernel instead of on that shitty NeXT/BSD/Mach/whatever mess they have now!
     

    Isn't OSX a Debian core?



  • Darwin, not Debian. IIRC, the (no-longer-micro)kernel was forked from Mach, with the userspace forked from FreeBSD somewhere around the 4.x release.



  • @bannedfromcoding said:

    Darwin, not Debian.
     

    Oh, ofc, yes. I half-knew it wasn't Debian, but it was something D...a...ish.

