What if we just didn't write the bugs, then we wouldn't have buggy software
-
I don't think my eyes are physically capable of rolling any harder.
-
@pie_flavor said in What if we just didn't write the bugs, then we wouldn't have buggy software:
I don't think my eyes are physically capable of rolling any harder.
I don't know how it's possible to miss the point so hard. We already have techniques and standards for this (I've done work to DO-178C). But that's only viable when your development cost can be literally orders of magnitude higher than for commercial products. In my day job, doing commercial/light-industrial stuff, we have customers asking us to value-engineer pence out of the product, and that's not a joke.
It's also the norm that dev cost is amortised into production cost, so one and t'other are very closely linked. If you want this level of development you have to legislate first, and then you will pay for it, one way or another.
The Chinese will also continue to flog bargain-basement stuff that doesn't conform, because they don't care and RAPEX and similar schemes can only catch a tiny fraction of it. And your potential customers will buy the tat because they don't care either.
That's why we got out of the lighting sector. Don't buy cheap Chinese LED lights...they'll burn your house down.
-
@Cursorkeys said in What if we just didn't write the bugs, then we wouldn't have buggy software:
The Chinese will also continue to flog bargain-basement stuff that doesn't conform, because they don't care and RAPEX and similar schemes can only catch a tiny fraction of it. And your potential customers will buy the tat because they don't care either.
That's why we got out of the lighting sector. Don't buy cheap Chinese LED lights...they'll burn your house down.
That's pretty much on point. In electronics, safety is regulated like that, because you don't want to buy little @Polygeekery gadgets for 3 bucks on Amazon, but the Chinese still produce crap like that and plaster it with fake regulation stickers. On Amazon. And for some reason they're not even liable for it.
-
@pie_flavor said in What if we just didn't write the bugs, then we wouldn't have buggy software:
I don't think my eyes are physically capable of rolling any harder.
We can quite easily produce correct software from mathematically-defined specifications. Producing correct specifications in the first place…
Also, the economic/engineering case matters, and not all aspects of correctness are subject to mathematical proof precisely because human factors also matter a lot, and those are linked to physiology and psychology, not mathematics. Because computing is not (just) mathematics!
-
I think the point of the article is that time is spent in testing when that time should be spent actually engineering the software to be correct. AFAIK, a lot of software isn't even really tested though. So it's wishful thinking.
And I don't get why shops can sell non-compliant stuff. Doesn't the final seller carry responsibility for its own sales?
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Doesn't the final seller carry responsibility for its own sales?
In Europe, yes. In the US, often not.
-
@dkf but we get the Chinese crap over here too. With those CE stickers.
-
@dkf structural safety is mostly mathematics though. I mean, one thing is a badly designed, dunno, accelerator, another is a badly designed differential.
-
@topspin said in What if we just didn't write the bugs, then we wouldn't have buggy software:
fake regulation stickers
When we started doing domestic LED lighting, we bought a ton of products so we could see what the other manufacturers were doing. One Chinese LED light (domestic) was marked as BS EN 12100...which is for farm machinery.
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
China Export
That's just a myth; they're supposed to be knock-off CE marks. But they didn't even care enough to get the font/spacing correct, so it gave rise to the view that they were some cunning double-meaning not-quite-CE mark instead of just pure laziness.
-
@Cursorkeys I knew that. It's still funny.
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Cursorkeys I knew that. It's still funny.
Fair enough
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
but we get the Chinese crap over here too
Sure, but you get to force the retailer to replace or reimburse, and they can't fob you off by just saying “you should sue the manufacturer (in China)”.
-
@dkf I know this spells "bureaucracy" and people don't like that, but shouldn't the stuff be certified locally first?
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@dkf I know this spells "bureaucracy" and people don't like that, but shouldn't the stuff be certified locally first?
For the EU at least, that's how it's supposed to work. You can't sell anything unless it conforms to the CE requirements, which apply to the majority of products.
Self-certification is a bit of a curse here. You're allowed, as a manufacturer/importer, to declare unilaterally that your product is compliant. You are supposed to be able to back this up with product design files and internal pre-compliance testing. This is how my company does things: only a minority of our products are externally tested by verified test facilities. I do the rest of the testing in our internal lab; our equipment is nearly as good as a verified test lab's, and we apply a safety factor to our measurements such that we're 99% confident a product is compliant. We spend about 25k a year just on calibration for our internal EMC/safety lab equipment.
Of course, if you don't care about ethics you can just slap a 'compliant' EC DoC out for your thing and the chance of it being pulled up by customs for an external lab to prove your self-certification was full of shit is tiny.
My SME company would be majorly impacted if self-certification went away, but it's a massive loophole for bad actors.
-
@Cursorkeys said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Don't buy cheap Chinese LED lights...they'll burn your house down.
Even more effectively than those expensive combustible lemons? I see why this is a problem...
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
shouldn't the stuff be certified locally first?
@Cursorkeys said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Self-certification is a bit of a curse here.
Even in countries which don't allow this, independent testing by a national company or organization is not always required, as long as the manufacturer provides proof it has been tested by a certification lab.
-
@dkf said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
but we get the Chinese crap over here too
Sure, but you get to force the retailer to replace or reimburse, and they can't fob you off by just saying “you should sue the manufacturer (in China)”.
Unless you made the purchase on AliExpress or similar.
-
@dkf said in What if we just didn't write the bugs, then we wouldn't have buggy software:
We can quite easily produce correct software from mathematically-defined specifications. Producing correct specifications in the first place…
Even if you have perfect specs (which is never the case), how can you be sure that the coders understood them perfectly, made no implementation mistakes, and that the hardware/software the code relies on is flawless?
If your reply is "manual testing" or "automated verification", you've not solved the problem, only moved it elsewhere.
The logical conclusion is that it's impossible to produce 100% correct software. Even if you throw buckets of resources at it (and the costs balloon quickly), you can never be sure.
-
18 posts and not a single mention of Rust?
-
@Cursorkeys said in What if we just didn't write the bugs, then we wouldn't have buggy software:
BS
Different meaning, I'm sure.
-
@Gąska Well now you've done it.
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
I think the point of the article is that time is spent in testing when that time should be spent actually engineering the software to be correct.
: What if we just didn't write the bugs, then we wouldn't have buggy software.
: I don't know how it's possible to miss the point so hard.
: Hold my beer.
-
@Zerosquare said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@dkf said in What if we just didn't write the bugs, then we wouldn't have buggy software:
We can quite easily produce correct software from mathematically-defined specifications. Producing correct specifications in the first place…
Even if you have perfect specs (which is never the case), how can you be sure that the coders understood them perfectly, made no implementation mistakes, and that the hardware/software the code relies on is flawless?
There exist some automated tools that take raw specification (in appropriate format) as input and tell you whether given implementation is perfect in this regard. They are of course limited in what kinds of requirements can appear in the spec, and thus what part of behavior can be checked, but still.
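A toy sketch of that idea in Python (all names made up for illustration): treat the spec as an executable reference and exhaustively compare the implementation against it over a bounded input domain. Real verification tools do this symbolically over all inputs rather than by brute force, but the shape of the question is the same.

```python
def spec_abs(x: int) -> int:
    """Specification: |x|, written as naively and readably as possible."""
    return x if x >= 0 else -x

def impl_abs(x: int) -> int:
    """'Clever' implementation under test (branchless bit trick)."""
    mask = x >> 31
    return (x + mask) ^ mask

def check(domain):
    """Return every input where the implementation disagrees with the spec."""
    return [x for x in domain if impl_abs(x) != spec_abs(x)]

# An empty list means no counterexample was found in the checked domain
counterexamples = check(range(-1000, 1000))
```

A clean result only says something about the checked domain, of course: the bit trick above assumes 32-bit-sized values and would fail for much larger magnitudes, which is exactly the kind of gap a real formal tool is supposed to catch.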
-
Yes, indeed. But how do you know that those tools themselves are correct? And that the hardware they're running on is correct, too?
Whatever you do, you can't completely eliminate the potential for an error; you can only lower it to the point where going further would blow your budget, your schedule, or both.
-
@Zerosquare said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Yes, indeed. But how do you know that those tools themselves are correct?
By performing formal verification on the formal verification software. You can, for example, use the formal verification software you're verifying to verify itself. Also traditional testing. Also manual verification. Of course you will never have 100% certainty (if you account for the possibility that every single human who ever attempted to formally verify software made an error that resulted in a false positive), but 99.999% is totally within grasp.
And that the hardware they're running on is correct, too?
Automated formal hardware verification actually is a thing that the industry does, and the results are very good.
-
@Gąska I didn't miss the point. Why should you hold my beer? I appreciate that what he's saying is wishful thinking, because not even testing is carried out as it should be. Still, where testing is the primary form of QC/QA, how much does it cost to test and how much does it cost to design software (anything really) better from the start?
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Gąska I didn't miss the point. Why should you hold my beer?
If you think the article was about anything else than formal verification, then you missed the point.
Still, where testing is the primary form of QC/QA, how much does it cost to test and how much does it cost to design software (anything really) better from the start?
Ignoring for a moment that this question is completely unrelated to anything said in the article - there are two things at play: the law of diminishing returns, and short-term vs. long-term benefits. Testing is not a substitute for design, and design is not a substitute for testing. They're completely independent of each other, and their effects are also completely independent of each other - the only reason you have to choose between them is that both require budget, and budget is a limited resource. Design works especially well in the long term; testing has immediate results. More design is a lot better than less design, but only up to a certain point - once you reach it, more design is only slightly better. More tests are a lot better than fewer tests, but there, too, is a point after which more is only slightly better. The optimal allocation of budget between design and testing (and everything else that requires budget) depends on how important the long term is versus the short term, and three dozen other factors.
-
@pie_flavor I hate that first article for its almost invisible in-text links.
And also, I like this bit:
Recent advances have reached a point where formal methods’ capacity to check and verify code can be applied at scale with powerful automated tools.
I don't see any links to corroborate or even be helpful and provide a link to a tool. Whining is all fine and good, but it doesn't really change anything.
And as others have said, there is a substantial cost to mathematically proving the correctness of software. And for it to mean anything, every release of every library must be verified as well, because it means fuck all if your own code is verified but it depends on libraries that are not. And for that matter, you might be using a library in a way its designer never thought of, and thus get bugs that way, while it all looks mathematically correct because there's a disconnect between the two proofs.
-
@pie_flavor well, before formally verifying my code I would first need to have ever been given a spec that covered everything the program needed to do. I would then need the spec to not change between the time it was given to me and the time it is ready to be verified.
Hell, give me those two things and I would write bug free code even without formal verification methods.
-
@Kian said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@pie_flavor well, before formally verifying my code I would first need to have ever been given a spec that covered everything the program needed to do. I would then need the spec to not change between the time it was given to me and the time it is ready to be verified.
Hell, give me those two things and I would write bug free code even without formal verification methods.
Amen!
-
@Zerosquare said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Even if you have perfect specs (which is never the case), how can you be sure that the coders understood them perfectly, made no implementation mistakes, and that the hardware/software the code relies on is flawless?
There are techniques that actually do enforce correct transformations between spec and implementation. The only place I've ever heard of them being used is in safety-critical applications such as railway signalling and avionics, and even then not always. (The key is to stick exactly to things that are Obviously Correct. That's a hell of a lot more restrictive than you might think.) The conclusion from decades of research in that field is that this problem is "solved", but the problem of getting the specifications right is nowhere near solved.
I'll note here that these are the sorts of software systems where dynamic allocation and other modern OS features are not used at all. You certainly would have great trouble using these techniques to deliver user-facing software (and mathematics doesn't say much about the correctness of UIs).
Hardware is a whole 'nother ball of wax.
-
@Carnage said in What if we just didn't write the bugs, then we wouldn't have buggy software:
And for that matter, you might be using the library in a way that is not in accordance with what the designer of it though, and thus get bugs that way but it's all mathematically correct by the looks of things because there will be a disconnect between the two proofs.
That's one which shouldn't apply. Part of the correctness proof of a library will be to define an interface specification for that library, and to check that it actually obeys that interface specification. Part of the correctness proof of a client of that library will be to check that it only ever uses that library in ways that are correct according to the interface specification. It's really just more of the same sort of thing that you do to verify a program function and its uses.
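A crude runtime analogue of that, as a Python sketch (the decorator and all names here are invented for illustration): the library states its precondition as part of its interface, and a client call that falls outside the contract gets flagged at the boundary instead of producing a silently wrong answer deep inside the library. Formal tools discharge the same obligation statically, at verification time rather than at run time.

```python
import functools

def requires(pred, message):
    """Attach a stated precondition to a library entry point."""
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            # The client's obligation: this must hold at every call site.
            if not pred(*args, **kwargs):
                raise ValueError(f"contract violated: {message}")
            return fn(*args, **kwargs)
        return wrapper
    return decorate

@requires(lambda xs: len(xs) > 0, "mean() needs a non-empty sequence")
def mean(xs):
    """Library function, correct for any input satisfying its contract."""
    return sum(xs) / len(xs)

# mean([1, 2, 3]) is fine; mean([]) raises instead of dividing by zero
```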
-
@Kian said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@pie_flavor well, before formally verifying my code I would first need to have ever been given a spec that covered everything the program needed to do. I would then need the spec to not change between the time it was given to me and the time it is ready to be verified.
Hell, give me those two things and I would write bug free code even without formal verification methods.
And that is the problem with formal methods!
-
@dkf said in What if we just didn't write the bugs, then we wouldn't have buggy software:
There are techniques that actually do enforce correct transformations between spec and implementation.
That only sounds good superficially until you realize that you then have to program "the spec" instead of "the implementation", and you're back at square one. Or, if you'd call your source code "the spec" and the resulting binary "the implementation" the statement wouldn't be any stronger than "the compiler isn't buggy". Okay, that's something at least, but far from what it sounded like.
-
@ben_lubar said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Not even drunk cats?
-
@Kian said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Hell, give me those two things and I would write bug free code even without formal verification methods.
Are you a pope? Because it looks like you believe in your own infallibility.
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
And I don't get why shops can sell non-compliant stuff. Doesn't the final seller carry responsibility for its own sales?
Oh, you're selling software? That's Different™
-
I'm not going to say "just formally verify everything lol", but I do think programming needs to get a lot more formal.
- Start explicitly defining invariants, pre-, and post-conditions in every loop and function.
- Static analyzers should be common tools in IDEs, so you can write at least basic conditions like "this function never gets called with a NULL parameter" or "this variable is never less than 0" and get a warning the instant the analyzer detects code that may break them.
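For what it's worth, the first bullet doesn't even need new tooling. Here's a plain-Python sketch of the discipline, with asserts standing in for the annotations a static analyzer would check:

```python
def int_sqrt(n: int) -> int:
    """Largest r such that r*r <= n."""
    assert n >= 0, "precondition: n must be non-negative"
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
        # loop invariant: r*r <= n after every iteration
        assert r * r <= n
    assert r * r <= n < (r + 1) * (r + 1), "postcondition"
    return r
```

A checker that understands these conditions can warn the moment an edit makes one of them unprovable, instead of waiting for a failing test.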
-
@Cursorkeys said in What if we just didn't write the bugs, then we wouldn't have buggy software:
BS EN 12100...which is for farm
Bull Shit Engineering Norm - quite appropriate in the context of a farm.
-
@Gąska said in What if we just didn't write the bugs, then we wouldn't have buggy software:
By performing formal verification on the formal verification software.
That sounds ... incestuous.
-
@Steve_The_Cynic standard practice. It's like compiling a compiler, or unit testing a unit test framework.
-
@Gąska said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Kian said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Hell, give me those two things and I would write bug free code even without formal verification methods.
Are you a pope? Because it looks like you believe in your own infallibility.
I don't have a cool hat, but I'm just saying that given an impossible requirement I can produce an impossible outcome, which is the same thing proponents of formal verification do. After all, verification only gives you confidence in your implementation; it doesn't write it for you. If you can write a bug-free implementation with formal methods, you can write it without them; it will just take longer to find the bugs you missed.
-
@BernieTheBernie said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Cursorkeys said in What if we just didn't write the bugs, then we wouldn't have buggy software:
BS EN 12100...which is for farm
Bull Shit Engineering Norm - quite appropriate in the context of a farm.
BS 5502-50:1993+A2:2010
Buildings and structures for agriculture. Code of practice for design, construction and use of storage tanks and reception pits for livestock slurry
There's a standard for everything
Edit: There's even a standard for writing standards: BS 0:2011
-
Coming to a "know-it-all-but-really-kinda-dumb-and-feeding-on-YouTube-only mama" YouTube video near you:
How programmers deliberately insert bugs into their programs in order to guarantee themselves job security for life
Make the title a little more clickbaity and conspiracy-theory-ish, and bang, 100k views guaranteed.
-
@Kian said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Gąska said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Kian said in What if we just didn't write the bugs, then we wouldn't have buggy software:
Hell, give me those two things and I would write bug free code even without formal verification methods.
Are you a pope? Because it looks like you believe in your own infallibility.
I don't have a cool hat, but I'm just saying that given an impossible requirement I can produce an impossible outcome, which is the same thing that proponents of formal verification do.
You said that if you were given a perfect spec, you'd never make any error in code. Which is either extreme arrogance or a serious misunderstanding. People will always make errors. That's human nature. No one is perfect. You could have the best requirements in the world and zero change requests throughout eternity, and you would still make some errors in implementation, because that's how humans work.
Compare that to automated tools, which - assuming the underlying logic of the CPU architecture is deterministic, at least to the extent the tool relies on - will consistently produce the same output every time. It might be wrong output if there are bugs in the tool, but if a scenario has been proven (e.g. by testing) to work with a given build of the tool, then that build will always work in that scenario and never make an error in it, no matter what.
After all, the verification simply gives you confidence on your implementation, it doesn't write it for you.
Scientists are working on that too.
If you can write a bug free implementation with formal methods, you can write it without them, it will just take longer to find the bugs you missed.
There's an even more substantial difference - if you don't use formal methods, then even if by some miracle you get everything right and your implementation is absolutely bug-free, it's impossible for you to know that's the case.
-
@Gąska said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
@Gąska I didn't miss the point. Why should you hold my beer?
If you think the article was about anything else than formal verification, then you missed the point.
What's the point, then, of quoting Dijkstra (or however the hell you spell it) and babbling so much about testing being so prevalent and formal verification being so overlooked in practice?
Still, where testing is the primary form of QC/QA, how much does it cost to test and how much does it cost to design software (anything really) better from the start?
Ignoring for a moment that this question is completely unrelated to anything said in the article - there are two things at play: the law of diminishing returns, and short-term vs. long-term benefits. Testing is not a substitute for design, and design is not a substitute for testing. They're completely independent of each other, and their effects are also completely independent of each other - the only reason you have to choose between them is that both require budget, and budget is a limited resource. Design works especially well in the long term; testing has immediate results. More design is a lot better than less design, but only up to a certain point - once you reach it, more design is only slightly better. More tests are a lot better than fewer tests, but there, too, is a point after which more is only slightly better. The optimal allocation of budget between design and testing (and everything else that requires budget) depends on how important the long term is versus the short term, and three dozen other factors.
Testing should be done like it is done in the "real stuff" industry. Testing physical items is inherently destructive, be it a car, a tool, or a loudspeaker (the modern standard for rating loudspeakers is something like "maximum power sustained for a minimum of two hours before destruction"). Testing is expensive in this case (more so for expensive stuff). You only test when you're sure you've done at least a decent job, or when you actually want to see how your design fails - not as a cure-all. Adopt the same approach in software and bam! Problem solved. (Yeah, don't get all worked up, I'm half joking.) We're spoiled by rapid iteration. There is no need for such speed. Of course, I can see it when it comes to scratching your own itch, and for software that is provided with no guarantees at all. Which is fine; stuff that goes fast and breaks often is fine when it's free (in both meanings).
-
@admiral_p said in What if we just didn't write the bugs, then we wouldn't have buggy software:
We're
Aren't you a teacher, though, not a software developer?
-
@Gąska the users, not the developers.
-
@admiral_p so that sentence should read "the users are spoiled by rapid iteration"? TDEMSYR. Users don't benefit from rapid iteration. In most cases, they're not even aware of it - why would they be? It's an internal matter of the development team. The user has neither knowledge of nor interest in those processes.
And anyway. If you think testing has substituted for designing and planning in software development in any way... you'd be better off not talking about software development at all, because it's clear you know nothing about it.