Saving the World from Code


  • Notification Spam Recipient

    Breaking news: The Atlantic has discovered that bugs are bad and that they don't understand programming. Or performance, or economics, or the existence of GameMaker Studio, or a bunch of other things. A real gem is when they start applauding the existence of 'proven fixes', and another would be when they start comparing flowcharting vs textual programming to hex vs assembly.



  • Some gems in here.

    The attempts now underway to change how we make software all seem to start with the same premise: Code is too hard to think about. Before trying to understand the attempts themselves, then, it’s worth understanding why this might be: what it is about code that makes it so foreign to the mind, and so unlike anything that came before it.

    lolwhat?

    And yet we have hardly noticed, because all of that complexity is packed into tiny silicon chips as millions and millions of lines of code

    Funny story, I once took scissors to a processor and all of a sudden reams and reams of code just came pouring right out.

    Anyone looking over a programmer’s shoulder as they pored over line after line like “100001010011” and “000010011110” would have seen just how alienated the programmer was from the actual problems they were trying to solve

Pity technology never advanced from the 1950s though. Really thought it would have.

    But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”

    Because it's written by a single person.

    Since the 1980s, the way programmers work and the tools they use have changed remarkably little

    C, maybe.

    Chris Granger, a software developer who worked as a lead at Microsoft on Visual Studio, an IDE that costs $1,199 a year and is used by nearly a third of all professional programmers.

    [Citation needed]

    The findings surprised him. “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code."

    So, it's not even close then? Apparently cars are 100m lines and Windows 95 alone had something like 28m.

    Victor has the mien of David Foster Wallace, with a lightning intelligence that lingers beneath a patina of aw-shucks shyness. He is 40 years old, with traces of gray and a thin, undeliberate beard. His voice is gentle, mournful almost, but he wants to share what’s in his head, and when he gets on a roll he’ll seem to skip syllables, as though outrunning his own vocal machinery.

    Does his inner goddess sing when he's with Christian Grey? What a shit paragraph stuffed with purple prose that serves no purpose.

    That code now takes the form of letters on a screen in a language like C or Java (derivatives of Fortran and ALGOL)

    ALGOL, yes, via B, but not Fortran.

    Could keep going but this article is awful.


  • Discourse touched me in a no-no place

    We can do software that is provably correct if we have a specification that is exactly correct and nailed down. There's quite a bit of tooling out there that enables this sort of thing. The usual problem is that the specifications are rotten and highly variable, and the people buying the software's creation aren't willing to pay what it takes to do it right in the first place.


  • SockDev

    Visual Studio, an IDE that costs $1,199 a year

(image attachment)


  • Notification Spam Recipient

    @thegoryone said in Saving the World from Code:

    Could keep going but this article is awful.

    I diligently read the whole thing before posting. Either the "experts" they're talking about were wildly misquoted, or I'm not sure I want to use development tools created by them.



  • Got hung up on their example further down the article:

    Suppose you wanted to design a level where Mario, jumping and bouncing off of a turtle, would just make it into a small passageway. Game programmers were used to solving this kind of problem in two stages: First, you stared at your code—the code controlling how high Mario jumped, how fast he ran, how bouncy the turtle’s back was—and made some changes to it in your text editor, using your imagination to predict what effect they’d have. Then, you’d replay the game to see what actually happened.

    That might have been the case when the original Mario was created (probably not), but I'd guess most level design uses graphical editors these days. Also, it's unlikely that you'd tweak the (global) gravity; you'd arrange the level in such a way that the jump is possible. No programming involved if you have a graphical level editor. (You should. Otherwise, your artists/level designers will rightly hate you.)

    It's hopefully just a misguided example, though.
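As a side note, the guess-and-replay loop the quoted passage describes can often be shortcut with a back-of-the-envelope calculation: a jump is just projectile motion, so the peak height follows from v²/2g. A minimal sketch (all the numbers and the bounce model are made up for illustration, not taken from any real game):

```python
def max_jump_height(jump_speed, gravity):
    """Peak height of a ballistic jump: v^2 / (2g)."""
    return jump_speed ** 2 / (2 * gravity)

def clears_passage(jump_speed, gravity, bounce_boost, passage_height):
    """Crude model: a bounce off the turtle adds to launch speed.
    Returns whether the resulting peak height reaches the passage."""
    return max_jump_height(jump_speed + bounce_boost, gravity) >= passage_height

# 15^2 / (2 * 25) = 4.5, which clears a passage at height 4.0
print(clears_passage(jump_speed=10.0, gravity=25.0,
                     bounce_boost=5.0, passage_height=4.0))  # True
```

Which is to say: you don't need a live-coding environment to stop guessing, just the formula.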


  • Impossible Mission Players - A

    Nearing-retirement fuddy duddy gets assigned to write an article on technology because that's all the cool kids click on these days. News at 11.





  • Operated by a systems provider named Intrado

    Is that like the Mexican subsidiary of Initech?

    Software is different. Just by editing the text in a file somewhere, the same hunk of silicon can become an autopilot or an inventory-control system.

Yes. That's exactly what caused Malaysia Airlines Flight 370. See, they accidentally changed the autopilot system to an inventory system in an update to the Boeing 777 and the algorithm determined it had two engines in surplus, so it decided to detach them. True story.



  • @thegoryone said in Saving the World from Code:

    But these systems have become so complicated that hardly anyone can keep them straight in their head. “There’s 100 million lines of code in cars now,” Leveson says. “You just cannot anticipate all these things.”

    Because it's written by a single person.

    And all at the same time.



  • This article is total blakey bait. Did anything ever happen with Light Table?

    Chris Granger, who had worked at Microsoft on Visual Studio, was likewise inspired. Within days of seeing a video of Victor’s talk, in January of 2012, he built a prototype of a new programming environment. Its key capability was that it would give you instant feedback on your program’s behavior. You’d see what your system was doing right next to the code that controlled it. It was like taking off a blindfold. Granger called the project “Light Table.”

    @cvi said in Saving the World from Code:

    It's hopefully just a misguided example, though.

I see stuff like that and it's all horribly specific and contrived. I can't see how to generalize that to make sense with anything but very, very similar cases. To which I expect the response: Don't use "It's hard" as an excuse! Yeah, you can fuck off, too.

    But seeing the impact that his talk ended up having, Bret Victor was disillusioned. “A lot of those things seemed like misinterpretations of what I was saying,” he said later.

    Yeah, trying to turn science fiction into reality is often underwhelming.

    Spoilering big quotes below for brevity:

    More visual, less code

    In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface. Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.

    This makes me think of someone looking at how complicated letters and words and stuff are and thinking that we don't need that. We should just use pictures. I mean, it's true that a lot of stuff that programmers work with lend themselves to graphical / visual sorts of editors and it's good to build specific tools for those specific problems. They aren't general tools, though.

    Model based design / flowcharts

    Bantégnie’s company is one of the pioneers in the industrial use of model-based design, in which you no longer write code directly. Instead, you create a kind of flowchart that describes the rules your program should follow (the “model”), and the computer generates code for you based on those rules. If you were making the control system for an elevator, for instance, one rule might be that when the door is open, and someone presses the button for the lobby, you should close the door and start moving the car. In a model-based design tool, you’d represent this rule with a small diagram, as though drawing the logic out on a whiteboard, made of boxes that represent different states—like “door open,” “moving,” and “door closed”—and lines that define how you can get from one state to the other. The diagrams make the system’s rules obvious: Just by looking, you can see that the only way to get the elevator moving is to close the door, or that the only way to get the door open is to stop.

    Oh, god, make it stop! Yeah, another thing that can work well for simple systems with restricted domains. It doesn't scale and you're really not being more expressive like you think you are.
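For what it's worth, the elevator "model" from the quoted passage is just a small state-transition table, and writing it as ordinary code is hardly harder than drawing the boxes. A minimal sketch (state and event names invented for illustration, not from any real tool):

```python
# Elevator as a plain state machine: the article's "diagram" is this dict.
# Keys are (current state, event); values are the next state.
TRANSITIONS = {
    ("door_open", "lobby_button"): "door_closed",
    ("door_closed", "start"): "moving",
    ("moving", "arrive"): "door_closed",
    ("door_closed", "open"): "door_open",
}

def step(state, event):
    """Return the next state; events not allowed in this state are ignored."""
    return TRANSITIONS.get((state, event), state)

state = "door_open"
state = step(state, "lobby_button")  # -> door_closed
state = step(state, "start")         # -> moving
print(state)  # moving
```

The table makes the rules exactly as "obvious" as the boxes-and-lines version: the only way to reach "moving" is from "door_closed".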

    Now, the stuff on TLA+ is pretty cool:

    TLA+, which stands for “Temporal Logic of Actions,” is similar in spirit to model-based design: It’s a language for writing down the requirements—TLA+ calls them “specifications”—of computer programs. These specifications can then be completely verified by a computer.

    I'm guessing that was some of the stuff that @dkf was talking about. Of course, WRT TFA, it's...writing more code! But it also touches on something I've argued about a lot around here:

    Writing a program is doing math!

    Most programmers aren’t very fluent in the kind of math—logic and set theory, mostly—that you need to work with TLA+. “Very few programmers—and including very few teachers of programming—understand the very basic concepts and how they’re applied in practice. And they seem to think that all they need is code,” Lamport says. “The idea that there’s some higher level than the code in which you need to be able to think precisely, and that mathematics actually allows you to think precisely about it, is just completely foreign. Because they never learned it.”


  • Grade A Premium Asshole

I'm not done reading yet, but... is it just me or does the first example undermine the premise of the article? The whole thing seems to be saying that "everyone should just be defining requirements and somehow magically some tool would automatically translate those into autogenerated code", but the 911 example isn't an example of bad code, it's a bad (or nonexistent) requirement. Either the requirements for that system said to use a 32-bit signed integer as a counter to give unique IDs to calls, or - more likely - the requirements just said to have a unique ID for each call, and the people implementing that used a 32-bit signed int because that seemed OK and they didn't think about it overflowing at all. The code was correct.

    The end result, refusing 911 calls because of an internal counter, sounds like another bad or missing requirement, too - you don't want that to happen under any circumstances, someone just didn't think about what happens when there are no more IDs left to assign.

    The whole example is a show of shitty design, not shitty code, and the author uses it as a point to argue for getting rid of code.
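For the record, the counter failure under discussion is trivial to reproduce; a minimal sketch, assuming (as the thread speculates) a 32-bit signed counter:

```python
def next_call_id(counter):
    """Increment a call counter stored as a 32-bit signed int.

    Wraps exactly the way a fixed-width signed integer does: one past
    2147483647 becomes -2147483648, which downstream code may well
    reject as an invalid ID.
    """
    value = (counter + 1) & 0xFFFFFFFF               # keep only 32 bits
    return value - 0x100000000 if value >= 0x80000000 else value

print(next_call_id(100))          # 101 - fine
print(next_call_id(2147483647))   # -2147483648 - overflow
```

Nothing in that code is "wrong" line by line, which is the point: the failure lives in the missing requirement about what happens when IDs run out.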


  • Grade A Premium Asshole

    Also this is amusing, although I'd very much like to know what the dude considers relevant:

    “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. (...)"



  • Solution: all new programs will be made in Scratch. Imagine that, enterprise-grade XML being read by Scratch! Browsers in Scratch!



  • @boomzilla said in Saving the World from Code:

More visual, less code

In a pair of later talks, “Stop Drawing Dead Fish” and “Drawing Dynamic Visualizations,” Victor went one further. He demoed two programs he’d built—the first for animators, the second for scientists trying to visualize their data—each of which took a process that used to involve writing lots of custom code and reduced it to playing around in a WYSIWYG interface. Victor suggested that the same trick could be pulled for nearly every problem where code was being written today. “I’m not sure that programming has to exist at all,” he told me. “Or at least software developers.” In his mind, a software developer’s proper role was to create tools that removed the need for software developers. Only then would people with the most urgent computational problems be able to grasp those problems directly, without the intermediate muck of code.

    This makes me think of someone looking at how complicated letters and words and stuff are and thinking that we don't need that. We should just use pictures. I mean, it's true that a lot of stuff that programmers work with lend themselves to graphical / visual sorts of editors and it's good to build specific tools for those specific problems. They aren't general tools, though.

    I was going to comment on that too, but then forgot about it. But, essentially what you say.

Plus, these tools do exist, and in some cases have for quite some time. I doubt (most) animators write code these days; that's what you've got big-name software like Maya and whatnot for. Related: writing shaders (i.e., for creating material descriptions) has gotten much more visual too in recent years (often by connecting boxes to create something like a flow-chart, from which the actual shader code is generated). If you have a decent tool, you'll also get previews (assuming that's possible). (Being able to quickly show accurate previews is often considered a big deal(tm).)

    Yeah, another thing that can work well for simple systems with restricted domains. It doesn't scale and you're really not being more expressive like you think you are.

    Example: LabView (think: programming by placing boxes and drawing wires). Works well (I suppose) for simple problems. But if you add complexity, it'll start to suffer from the same problem as any old program: things get complex. If you're not careful, you'll not only end up with spaghetti-code in the traditional sense, but also because you have a pile of interconnected virtual wires.


  • :belt_onion:

    @cvi said in Saving the World from Code:

    That might have been the case when the original Mario was created (probably not)

I'm pretty sure there were game development kits by the time of the NES, but from what I've been told, there were times before that when developers didn't get to actually play and test the games until they'd basically shipped. They just had to be really sure of their code. If there ended up being bugs in the game, they'd essentially get documented as "Easter eggs" in the manual.


  • Grade A Premium Asshole

    Then WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”

    I have never called them "wizzywig". I have never heard anyone call them that. Is that a thing? Do people say that?


  • kills Dumbledore

    @polygeekery said in Saving the World from Code:

    Do people say that?

    I do. How do you say it if you're talking about WYSIWYG?


  • Impossible Mission - B

    @polygeekery said in Saving the World from Code:

    I have never called them "wizzywig". I have never heard anyone call them that. Is that a thing? Do people say that?

    Yes. That's been the standard pronunciation for decades.



  • @polygeekery said in Saving the World from Code:

    Then WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”

    I have never called them "wizzywig". I have never heard anyone call them that. Is that a thing? Do people say that?

Please tell me you haven't been pronouncing it double-yew-why-ess-eye-double-yew-why-gee. If you said that to me, I'd walk away halfway through as you stumbled over that tongue twister.



  • @cvi said in Saving the World from Code:

    Yeah, another thing that can work well for simple systems with restricted domains. It doesn't scale and you're really not being more expressive like you think you are.

    Example: LabView (think: programming by placing boxes and drawing wires). Works well (I suppose) for simple problems. But if you add complexity, it'll start to suffer from the same problem as any old program: things get complex. If you're not careful, you'll not only end up with spaghetti-code in the traditional sense, but also because you have a pile of interconnected virtual wires.

My company was looking to develop a small logic-based language that would be simple enough for non-programmers to write (it wouldn't be intended to create large-scale stuff, just simple if statements and the like). For April Fools I sent people screenshots from a prototype of what we were thinking of using, which were shots of complicated LabView constructs. I nearly got shot in the face for the idea. It was worth it for the giggles, though.


  • Grade A Premium Asshole

    @jaloopa said in Saving the World from Code:

    @polygeekery said in Saving the World from Code:

    Do people say that?

    I do. How do you say it if you're talking about WYSIWYG?

    I estimate I only have to refer to them a couple of times a year in IRL conversation, so I just say the words the initialism stand for.


  • Discourse touched me in a no-no place

    @boomzilla said in Saving the World from Code:

    I'm guessing that was some of the stuff that @dkf was talking about.

    I was actually thinking about VDM and the B-Method, but it's part of the same milieu. The aim is that you convert a specification (which is sufficiently mathematical to be tractable to prove correct) through proven-correct transforms into a program that implements the specification, and it's one of the cheapest ways of producing correct code for safety-critical applications since you greatly reduce the amount of effort needed on testing.

Even Eiffel's design-by-contract approach is pretty good. Heck, it's probably a good idea for programmers to think that way in their own classes anyway, whether or not they discuss it with management, since it's a good way to identify what your precondition guards and postcondition guarantees are, as well as what invariants you want to actually maintain.
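A rough illustration of the design-by-contract style using plain assertions (Eiffel has dedicated require/ensure/invariant clauses for this; the example class here is made up just to show the flavor):

```python
class Account:
    """Toy account with Eiffel-style contracts expressed as assertions."""

    def __init__(self, balance=0):
        self.balance = balance
        self._check_invariant()

    def _check_invariant(self):
        # class invariant: the balance never goes negative
        assert self.balance >= 0, "invariant violated: negative balance"

    def withdraw(self, amount):
        # precondition: caller must request a positive, covered amount
        assert 0 < amount <= self.balance, "precondition violated"
        old_balance = self.balance
        self.balance -= amount
        # postcondition: exactly `amount` was removed
        assert self.balance == old_balance - amount, "postcondition violated"
        self._check_invariant()
        return self.balance

acct = Account(100)
acct.withdraw(30)     # fine
# acct.withdraw(200)  # would fail the precondition
```

The point of writing the guards down is exactly the one above: it forces you to decide who is responsible for what (caller for preconditions, class for postconditions and invariants) before the bug report forces the question.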



  • @boomzilla said in Saving the World from Code:

    This article is total blakey bait. Did anything ever happen with Light Table?

    Chris Granger, who had worked at Microsoft on Visual Studio, was likewise inspired. Within days of seeing a video of Victor’s talk, in January of 2012, he built a prototype of a new programming environment. Its key capability was that it would give you instant feedback on your program’s behavior. You’d see what your system was doing right next to the code that controlled it. It was like taking off a blindfold. Granger called the project “Light Table.”

    Yes. The article doesn't even come close to the truth, however. The amount of misrepresentation, both of the goals and of its success, is annoying as fuck.

    It's basically "Emacs re-written in Clojure with modernized default keybindings, running on top of Electron as a webtop app". It's not bad, for what it is, but not great; it is in permanent beta and not really anything new at all. It sure as shit isn't Visual Studio, or even Eclipse, though it has some potential if they ever, you know, put some effort into it.

    So basically only Lisp weirdos like me care, and more because it is Lisp than because it is good. It could be good, but... probably won't?

    The "seeing the results as you worked on it" part? I'm pretty sure they are talking about being able to launch the program from inside the editor. Seriously. What modern programmer's editor can't do that, with a little coaxing? Notepad, maybe, if we reeeeeeally stretch the definition of "programming editor"? Even Notepad++ can. That's, like, 1960s tech - it was a core function in TECO, FFS. I can't imagine the programmers gushing about something that mundane and basic.

    Maybe they have something bigger in mind than that, but if so, it's not clear - I am guessing that the authors had explained something about it that was new, or at least uncommon (being able to run selected code in a background REPL, maybe, something common in the Emacs world but rare elsewhere because, fuck interpreters) and the 'journalist' completely didn't get it and so he made some shit up that sounded sort of like what he'd been told.



  • @blek said in Saving the World from Code:

    I'm not done reading yet, but... is it just me or does the first example undermine the premise of the article? The whole thing seems to be saying that "everyone should just be defining requirements and somehow magically some tool would automatically translate those into autogenerated code", but the example with the 911 isn't an example of bad code, it's a bad (or nonexistent) requirement.

    Depends on the level of requirements you're talking about. Imagining a system where the requirements got to that level of implementation detail strikes me as a big old ball of undefined.



  • @blek said in Saving the World from Code:

    Also this is amusing, although I'd very much like to know what the dude considers relevant:

    “Visual Studio is one of the single largest pieces of software in the world,” he said. “It’s over 55 million lines of code. And one of the things that I found out in this study is more than 98 percent of it is completely irrelevant. (...)"

    It reminded me of the anecdata about how most users use only a tiny fraction of the features in, say, MS Word, but they all use different fractions. Given the lack of detail in TFA on this point, I have no idea if the two are in any way related.



  • @polygeekery said in Saving the World from Code:

    Then WYSIWYG (pronounced “wizzywig”) came along. It stood for “What You See Is What You Get.”

    I have never called them "wizzywig". I have never heard anyone call them that. Is that a thing? Do people say that?

    I never hear it in conversations anymore - so Get Off My Grass !!!


  • Grade A Premium Asshole

    @boomzilla This is a 911 service provider we're talking about - I'd expect the requirements to be several truckload of binders.



  • @polygeekery said in Saving the World from Code:

    Do people say that?

    I do. From now on.



  • @blek said in Saving the World from Code:

    @boomzilla This is a 911 service provider we're talking about - I'd expect the requirements to be several truckload of binders.

    Right, but as you kind of noted in the rest of the post, that's a design time requirement / element. Not a business level requirement, which was the sort of requirement that TFA was talking about.

    Also: I wonder if the actual number was oddly specific.

    ETA: If my business users tried to get into the business of dictating data types I would start some fights.



  • The electromechanical interlockings that controlled train movements at railroad crossings, for instance, only had so many configurations; a few sheets of paper could describe the whole system, and you could run physical trains against each configuration to see how it would behave. Once you’d built and tested it, you knew exactly what you were dealing with.

Right. Because failure states in mechanical systems never involve things breaking in an unpredictable fashion, people doing stupid things that never occurred to the designers, incorrect values for load capacity or load requirements ("Wait, did you say that the Tacoma Narrows regularly get 40MPH winds, I thought it was only 25MPH!"), failures to make necessary engineering calculations ("I'm sure that the connection between these hanger rods whose layout we changed for aesthetic reasons, and the box girders we redesigned for aesthetic reasons, can hold the load; no reason to re-calculate that"), manufacturing defects or shoddy production processes, maintenance errors ("Ah, I don't need to look up the correct parts in the documentation, matching these replacement screws for this airliner's windshield by eye has always worked before"), undocumented engineering shortcuts ("Hey, this procedure of using a forklift when re-attaching the engine nacelles in a process requiring millimeter precision works great and saves a ton of time, let's tell everyone else about it, but don't bother writing it down or anything"), business cutbacks leading to incomplete maintenance ("Ah, what harm could there possibly be in going six months between times lubricating the tail section's elevator jack screws rather than the mandated six weeks?"), managers making asinine decisions without looking at the consequences ("sure, you can just slide those 20 ton air conditioning units across the roof, no need for a crane or even rollers"), or anything else that can't be anticipated. Good to know.


  • Discourse touched me in a no-no place

    @the_quiet_one said in Saving the World from Code:

    complicated LabView constructs

    You get similarly complex things in web service automation (especially once you start trying to build proper tracking and failure recovery pathways in), except there people tend to be a bit happier with not seeing everything at once so you can hide a bit more hierarchically.

Fundamentally, the real problem is that to make code work right you need to use a clear-enough description to give to the computer. That clear-enough description is exactly what a program is, and anyone who is skilled enough at thinking of the consequences of what they wish to ask for to write one that stands a reasonable chance of working is someone who has the right mental skills to be a programmer. If you can't organise your thoughts well enough to write a clear description of what you want, you're not going to be able to make a computer reliably do it. Graphical tools can sometimes help with removing unnecessary complexity, but they can very easily add complexity too (alas!) and the necessary complexity in what you're trying to do will still be there.


  • Grade A Premium Asshole

    @boomzilla I guess you're right.

Regarding the number, I was actually surprised that the author just wrote millions instead of writing 2147483647 (which is 101% what the number was) and then remarking on how oddly specific that is. I knew I saw that somewhere before...


  • Discourse touched me in a no-no place

    @scholrlea said in Saving the World from Code:

    Because failure states in mechanical systems never involve things breaking in an unpredictable fashion

    Nobody ever made a building without tying the outer skin of the walls to the inner structure. Nobody ever specified it and for sure no builder ever cut that corner so that nobody ever had to worry about bricks falling out of the facade of a 20 storey building and hitting people below on the head. Never ever happened. No way.

    Also, the problems with the building next to this one are entirely coincidental and have no correlation with what I was just saying. 😒



  • Just to make my point clear: I am not saying that this isn't a problem in software development. It's a massive problem. But it is a problem in other engineering fields as well.

    The difference is that if a civil engineer fucks up in designing a levee, or a maintenance company fails its due diligence leading to a bridge collapse, or a chemical plant releases 40 tons of toxic chemicals into the atmosphere right next to a slum, there's a reasonable expectation of prosecution, fines, jail time, or worse. Justice may not actually come, or may be slow in coming (especially if the owners of said chemical plant are in another country which refuses to extradite them because they're buddies with the politicians in their home country), but it is normal to think that they will face the consequences of their failures.

Nothing like that exists for software, and it is hard to see how it could, because programs are so interconnected that pinning the blame becomes quixotic in many cases. There's also the view that bad software isn't seriously harmful, which is bullshit but remains the general attitude of most developers, and even most users.

    Everyone knows it needs to be fixed, even if they don't know how.



I have posted a different version of this in the SoTD thread before, but here it is again, because in the end, the problem isn't the software, it's the people making and using it - and there are no available engineering statistics for the load capacity of the human mind.

    Hymn to Breaking Strain -- EPIC VERSION -- CJ Cherryh tribute – 04:03
    — Per Malm



  • @dkf Good example.

Thing is, the article has a kernel of truth to it. However, not only is it misrepresenting that kernel, it is presenting it as some major revelation - as opposed to a problem that has been known for almost fifty years, and which is almost certainly immune to any silver bullet, especially one which has been tried before - and the ones they mention have been.

    I am pissed at how it misrepresents Bret Victor's work, too, though given his over-the-top rhetoric about eliminating programming, he's sort of set himself up for it.



  • @scholrlea said in Saving the World from Code:

    Maybe they have something bigger in mind than that, but if so, it's not clear -

    They always showed off demos of inline calculations, and automatically updating visualizations as you edit the code. It seemed mildly interesting.



  • @dkf said in Saving the World from Code:

    @scholrlea said in Saving the World from Code:

    Because failure states in mechanical systems never involve things breaking in an unpredictable fashion

    Nobody ever made a building without tying the outer skin of the walls to the inner structure. Nobody ever specified it and for sure no builder ever cut that corner so that nobody ever had to worry about bricks falling out of the facade of a 20 storey building and hitting people below on the head. Never ever happened. No way.

    Also, the problems with the building next to this one are entirely coincidental and have no correlation with what I was just saying. 😒

    I also liked the building where they designed the glass front such that it acted as a giant concave mirror - and the focal points were at street level, which promptly resulted in partially melted / scorched parked cars.

    Good thing it was in London, so it wasn't a regular problem.


  • kills Dumbledore

    @rhywden said in Saving the World from Code:

    @dkf said in Saving the World from Code:

    @scholrlea said in Saving the World from Code:

    Because failure states in mechanical systems never involve things breaking in an unpredictable fashion

    Nobody ever made a building without tying the outer skin of the walls to the inner structure. Nobody ever specified it and for sure no builder ever cut that corner so that nobody ever had to worry about bricks falling out of the facade of a 20 storey building and hitting people below on the head. Never ever happened. No way.

    Also, the problems with the building next to this one are entirely coincidental and have no correlation with what I was just saying. 😒

    I also liked the building where they designed the glass front such that it acted as a giant concave mirror - and the focal points were at street level, which promptly resulted in partially melted / scorched parked cars.

    Good thing it was in London, so it wasn't a regular problem.



  • @sumireko said in Saving the World from Code:

    Solution: all new programs will be made in Scratch. Imagine that, enterprise-grade XML being read by Scratch! Browsers in Scratch!

    NO GOD! PLEASE NO!!! NOOOOOOOOOO
    — Retro Pep


  • Impossible Mission - B

    @dkf said in Saving the World from Code:

    Fundamentally, the real problem is that to make code work right you need to use a clear-enough description to give to the computer. That clear-enough description is exactly what a program is, and anyone who is skilled enough at thinking through the consequences of what they wish to ask for to write one that stands a reasonable chance of working is someone who has the right mental skills to be a programmer. If you can't organise your thoughts well enough to write a clear description of what you want, you're not going to be able to make a computer reliably do it.

    That's basically the point Joel Spolsky made in this article:

    The problem, here, is very fundamental. In order to mechanically prove that a program corresponds to some spec, the spec itself needs to be extremely detailed. In fact the spec has to define everything about the program, otherwise, nothing can be proven automatically and mechanically. Now, if the spec does define everything about how the program is going to behave, then, lo and behold, it contains all the information necessary to generate the program! And now certain geeks go off to a very dark place where they start thinking about automatically compiling specs into programs, and they start to think that they’ve just invented a way to program computers without programming.

    Now, this is the software engineering equivalent of a perpetual motion machine. It’s one of those things that crackpots keep trying to do, no matter how much you tell them it could never work. If the spec defines precisely what a program will do, with enough detail that it can be used to generate the program itself, this just begs the question: how do you write the spec? Such a complete spec is just as hard to write as the underlying computer program, because just as many details have to be answered by spec writer as the programmer. To use terminology from information theory: the spec needs just as many bits of Shannon entropy as the computer program itself would have. Each bit of entropy is a decision taken by the spec-writer or the programmer.

    So, the bottom line is that if there really were a mechanical way to prove things about the correctness of a program, all you’d be able to prove is whether that program is identical to some other program that must contain the same amount of entropy as the first program, otherwise some of the behaviors are going to be undefined, and thus unproven. So now the spec writing is just as hard as writing a program, and all you’ve done is moved one problem from over here to over there, and accomplished nothing whatsoever.


  • Impossible Mission - B

    @scholrlea said in Saving the World from Code:

    Everyone knows it needs to be fixed, even if they don't know how.

    First things first, fix the 2 largest causes of problems: buffer overflows and SQL injection.

    Buffer overflows are a uniquely C problem. While it's theoretically possible to have problems with them in languages other than C and its closest relatives, in practice they almost never occur outside of the C realm.

    The Morris Worm, way back in 1988, should have put the programming world on notice that C was not suitable for the purpose of designing secure software. Unfortunately, we didn't listen, and buffer exploits have caused billions of dollars in damages ever since. And it's not getting any better. Despite almost 30 years passing since the Morris Worm, we still get patches every month to fix buffer exploits. This is not an easy thing to get right in a language that makes doing it wrong easy and intuitive. And in the modern world, we simply can't afford to continue with a system that doesn't take the simple reality of errare humanum est into account.

    If we want this to stop, we need to abandon C. Better alternatives exist. (Heck, better alternatives existed at the time of the Morris Worm!) Fixing that would fix a massive amount of security issues right away.
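
    For contrast, here is a minimal sketch of what those "better alternatives" do differently (Python used purely as an illustration of a bounds-checked language; the point is the runtime check, not this particular language):

```python
# In a memory-safe language, every buffer access is bounds-checked:
# a write one past the end raises an error instead of silently
# corrupting whatever happens to sit next to the buffer in memory.
buf = bytearray(8)

def write_byte(buffer, index, value):
    """Write one byte; an out-of-range index is caught, not exploited."""
    try:
        buffer[index] = value
        return "ok"
    except IndexError:
        return "rejected"

print(write_byte(buf, 7, 0xFF))  # last valid slot: "ok"
print(write_byte(buf, 8, 0xFF))  # one past the end: "rejected"
```

    In C the second write would compile and run, and whatever lives after the buffer gets overwritten; that is the entire mechanism behind classic stack-smashing exploits.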

    The second large class of errors we see again and again is SQL Injection. Unlike with C, there isn't a better alternative in the space that SQL occupies. But unlike buffer overflows, SQL Injection is a trivial problem to solve: use parameters instead of string-building to put outside values into your query. Use parameters, use parameters, use parameters. It really is that simple, every time. Just use parameters.
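
    What "use parameters" looks like in practice, sketched with Python's stdlib sqlite3 driver (the table and the malicious input are made up for illustration):

```python
import sqlite3

# In-memory demo database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Attacker-controlled input that breaks out of a string-built query.
user_input = "alice' OR '1'='1"

# String building: the input is spliced into the SQL text itself,
# so the quote characters become part of the query's logic.
unsafe = "SELECT count(*) FROM users WHERE name = '%s'" % user_input
# conn.execute(unsafe) matches every row -- classic injection.

# Parameters: the driver passes the value separately from the SQL
# text, so it can only ever be treated as data.
safe = conn.execute(
    "SELECT count(*) FROM users WHERE name = ?", (user_input,)
).fetchone()[0]
print(safe)  # 0 -- nobody is literally named "alice' OR '1'='1"
```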

    Unfortunately, people continue to not use parameters, most of the time because they're not aware of parametrized queries and how they work. This would be simple enough to fix with a change to the DBMS: create a setting, which is on by default, which will cause a query to fail with an error if it contains any literal value written inline rather than as a parameter. This could be disabled for ad-hoc queries with DB admin tools, but it would need to be on by default, so that people who write their queries wrong will hit the error, look it up on Google, discover what they're doing wrong, and start using parameters.

    Make that one simple change, and SQL injection would become a thing of the past.
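
    The proposed setting could be sketched, very naively, as a guard that rejects any query text containing an inline literal. A real DBMS would do this on the parser's token stream rather than with a regex; the function name and pattern here are hypothetical stand-ins:

```python
import re

# Naive sketch: flag single-quoted strings and bare numbers in the
# query text. (Deliberately oversimplified -- a real implementation
# would inspect parser tokens, not raw text.)
LITERAL = re.compile(r"'[^']*'|\b\d+\b")

def check_query(sql):
    """Reject query text that contains an inline literal."""
    if LITERAL.search(sql):
        raise ValueError("inline literal found; use a parameter instead")
    return sql

check_query("SELECT name FROM users WHERE id = ?")  # passes
try:
    check_query("SELECT name FROM users WHERE id = 42")
except ValueError as e:
    print(e)  # inline literal found; use a parameter instead
```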

    These two fixes wouldn't fix all of our security problems, but they'd fix the vast majority of them, and possibly buy us a bit of respite while we take a hard look at the rest of it.



  • @masonwheeler said in Saving the World from Code:

    This would be simple enough to fix with a change to the DBMS: create a setting, which is on by default, which will cause a query to fail with an error if it contains any literal value written inline rather than as a parameter.

    That is the worst idea ever. Not every bit of a where or join clause is coming from user input.


  • Impossible Mission - B

    @boomzilla said in Saving the World from Code:

    That is the worst idea ever. Not every bit of a where or join clause is coming from user input.

    Why does that make it a bad idea?

    Also, join clauses? If you're putting literals into a join clause, you're doing it wrong!



  • @masonwheeler said in Saving the World from Code:

    @boomzilla said in Saving the World from Code:

    That is the worst idea ever. Not every bit of a where or join clause is coming from user input.

    Why does that make it a bad idea?

    Also, join clauses? If you're putting literals into a join clause, you're doing it wrong!

    No, sometimes those kinds of things make sense. Especially when you're dealing with outer joins. But seriously, are all of the things in your where clauses always coming from dynamic sources / user input? It sounds like you have some very simple queries to me, if that's the case.


  • Impossible Mission - B

    @boomzilla said in Saving the World from Code:

    No, sometimes those kinds of things make sense. Especially when you're dealing with outer joins.

    ...such as?

    But seriously, are all of the things in your where clauses always coming from dynamic sources / user input?

    No, but most of the literals do. And if this system makes your software more secure at the expense of having to add a couple more parameters, is that really a problem?

    Perhaps this would make the text of programs longer. Never mind! Wouldn’t you be delighted if your Fairy Godmother offered to wave her wand over your program to remove all its errors and only made the condition that you should write out and key in your whole program three times?
    -- The Emperor's Old Clothes, Tony Hoare, 1980 Turing Award lecture



  • @masonwheeler said in Saving the World from Code:

    @boomzilla said in Saving the World from Code:

    No, sometimes those kinds of things make sense. Especially when you're dealing with outer joins.

    ...such as?

    Conditions for the join. Stuff that matches (or doesn't!) the data in a column in the table.

    But seriously, are all of the things in your where clauses always coming from dynamic sources / user input?

    No, but most of the literals do. And if this system makes your software more secure at the expense of having to add a couple more parameters, is that really a problem?

    Yes.
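
    A sketch of the kind of query being described, with a literal condition inside an outer join's ON clause (SQLite via Python; the schema is made up). The literal filters which notes participate in the join while the LEFT JOIN still keeps every order:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT);
    CREATE TABLE notes  (order_id INTEGER, kind TEXT, body TEXT);
    INSERT INTO orders VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO notes  VALUES (1, 'complaint', 'late delivery'),
                              (1, 'praise', 'nice packaging');
""")

# Because the literal 'complaint' lives in the ON clause, it only
# filters the joined side; every order still comes back, with NULL
# where no matching note exists.
rows = conn.execute("""
    SELECT o.customer, n.body
    FROM orders o
    LEFT JOIN notes n ON n.order_id = o.id AND n.kind = 'complaint'
    ORDER BY o.id
""").fetchall()
print(rows)  # [('alice', 'late delivery'), ('bob', None)]
```

    Moving that same condition into the WHERE clause would discard the NULL row and silently turn this into an inner join, which is the outer-join case being argued about here.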



  • @masonwheeler TL;DR: a spec complete enough to describe the complete program would be the program, in a different language.


  • Impossible Mission - B

    @boomzilla said in Saving the World from Code:

    Conditions for the join. Stuff that matches (or doesn't!) the data in a column in the table.

    If it's supposed to be (not) matching against the data in a column, you're fine. That's how joins normally work. It's only when you're matching against a literal that this becomes an issue.

    No, but most of the literals do. And if this system makes your software more secure at the expense of having to add a couple more parameters, is that really a problem?

    Yes.

    Why?

