Is Object Oriented Programming the worst thing since the Spanish flu?



  • I was thinking about this bloated mess of a system that I am working on. Its purpose is to collect a bit of data. An old 386 would probably be up to the task if it was written dumb and simple, but between all the XML serialization and deserialization, the generated wrappers for inter-process procedure calls, and all the rest, a (ok, single-core) modern ARMv7 system with 256 M of memory is struggling with rather simple tasks for tens of seconds.

    The libraries are C++, so they have the option to just pass things by value and call plain old functions, because there isn't a lot of actual polymorphism going on, but it is written with the object-oriented brainworm, with plenty of interfaces and auxiliary objects and heap allocations. And it's not actually making anything easier to understand. Maybe it would be if you had designed it and knew where you put what, but as a maintenance programmer who just needs to know where to add handling of this new message, you get bogged down in a quagmire of layers that just forward to each other.

    Ok, so it's crap. But it's not unusual crap. A lot of software is bloated like this these days, always growing to gobble up all the available resources of ever faster computers. So I am wondering:

    • Is it object-oriented programming itself, luring architecture astronauts to add plenty of interfaces and wrappers in the hope they can separate blocks that are intimately related anyway due to the inherent complexity of the task?

      Object-oriented programming uses classes both as data types and as units of encapsulation. This suggests to the developer that each class should be encapsulated separately, encouraging extra layers even where the problem being solved can't be cleanly separated. It also uses classes as the tool for code reuse, but a lot of logic is not tied to one and only one entity, which leads on one hand to a rather arbitrary distribution of methods and on the other to helper classes that make even less sense.

      Couple that with the fact that there are few good indicators of whether the chosen layout is sane, and nothing to slap your wrist when it isn't, and wannabe architects can design code structure that is totally bogus without ever realizing they are doing anything wrong.

    • Is it the popular Design Patterns: Elements of Reusable Object-Oriented Software book, and all the follow-ups building on it, that caused code monkeys to blindly slap interfaces everywhere and build those horrendous TLAs?

      I find it funny that the authors of that book are nicknamed the “Gang of Four”, which has some not-so-nice connotations (maybe I should have titled this post “Is OOP the worst thing since the Cultural Revolution”).

    • Is it something else that makes these bloated designs so popular?

    Of course there are other issues at play here, especially the love of XML. But I would still note that XML is most associated with the culture of Java where all this Object Oriented Plague is also spread the most.

    By the way, we had those oop-are- tags already…


  • ♿ (Parody)

    @Bulb how many endocategories does it have?



  • I like OOP if it's done minimally. But it seems to be common practice--especially among C++ programmers--to put fifty billion layers of interfaces and abstractions in, which makes everything unnecessarily complicated. Someone creates a ticket to make one minor change and you think it'll take you a half hour, then six weeks later you're still trying to figure out which layer/interface you missed during your updates because your simple change still doesn't work due to some crazy inheritance thing that isn't obvious unless you're the code's original author, and when you think you finally fixed it, you end up breaking a trillion other things that inherit your change but don't need it. OOP should be used to encapsulate things and keep them separate, not tangle everything up into a big nasty ten-dimensional ball of string.


  • ♿ (Parody)

    @mott555 said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    But it seems to be common practice--especially among C++ programmers--to put fifty billion layers of interfaces and abstractions in, which makes everything unnecessarily complicated.

    Laughs in java.humor.laugh.factory



  • @boomzilla said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Bulb how many endocategories does it have?

    I am not sure what you mean. I did have category theory lectures back in the day, but while all the stuff was either ‘trivial’ or ‘trivially trivial’ according to it, I never understood most of it.


  • ♿ (Parody)

    @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I am not sure what you mean.

    It's a Haskell joke.



  • @Bulb I think he just puts the prefix "endo-" on random words because it's funny or something.


    Filed Under: I'm endolaughing through my endofunctors



  • @Bulb As a C# dev almost exclusively, I of course cannot fully agree with you.

    I've seen what you're talking about many times, especially at my last job, but I honestly don't think object orientation is the core of the problem.

    I think the biggest problem is books about architecture and design patterns. You yourself touched on why, but at my last job we had a product written by a group of people who read a book on domain-driven design, and they managed to create a system that could take hours to process 5 MB of JSON, using 30 microservices.

    I'm still in awe of how incomprehensible and incompetent that mess was.

    I will say that while object orientation is a tool that can be used for horrible things, the core of the problem is focus: I seriously believe that you should focus on making things as simple as you possibly can while minimizing the chances that changing them ends up destroying the entire system. Not on implementing some stupid pattern.

    I like SOLID quite a bit, and recommend that people read the Wikipedia page on it. But at the end of the day: you keep the thing simple, you write tests, you make sure you understand the scope of your problem, and you don't build in some stupid webscale database when your maximum customer count will only ever be in the hundreds.


  • BINNED

    @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @boomzilla said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Bulb how many endocategories does it have?

    I am not sure what you mean. I did have category theory lectures back in the day, but while all the stuff was either ‘trivial’ or ‘trivially trivial’ according to it, I never understood most of it.

    Not too long ago there was a thread about this medium article ranting that people still use OOP when, obviously, FP is the one true paradigm. It quickly devolved into people talking out of their endofunctors and a long discussion of monads and cricket.



  • @mott555 said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    especially among C++ programmers

    Trust me, C++ is one of the languages with lower prevalence of this plague… let's call it, say, layeritis.



  • @topspin I miss SCP. But you can't very well do that sort of work when every job description demands a hundred tools just to get past ATS so you can interview with somebody that doesn't know what most of them even are but somehow won authority to unilaterally decide everything.

    In theory, OOP isn't terrible but it leads exactly to what @mott555 describes, TLAs where every interface has exactly one object that implements it and you're changing dozens of files to get a column added to some table.



  • @Bulb Any programming paradigm can be abused to write big balls of mud. Just look at the mess that is Rx.
    There is no replacement for hiring good developers: not agile, not FP, not OOP, nor any other silly thing dreamt up in the last 50 or so years.
    There is a reason that many of the latest generation of languages aren't trying to be purely one or the other, but pick the best parts of each and try to be useful instead.


  • BINNED

    @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I like SOLID quite a bit, and recommend that people read the Wikipedia page on it

    I always found the description to be too abstract. Specifically, the wording of the open-closed principle is perfectly inscrutable.



  • @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I honestly don't think object orientation is the core of the problem.
    I think the biggest problem is books about architecture and design patterns.

    Yes, but the books about design patterns exist to a large extent to patch up deficiencies in the basic methodology. Nobody seemed to create design patterns for C or FORTRAN or Perl. It boomed with object oriented programming because that seemed like a nice and simple approach that turned out not to be actually simple at all.

    So I agree the architecture and design patterns are an issue, but I am still blaming OOP for their existence.

    @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    simple

    That basically means breaking off from the traditional understanding of OOP, decoupling data types from encapsulation, polymorphism from code reuse and forgetting about inheritance altogether. The Simple Made Easy presentation is pretty ancient already.

    (It also happens to be what Go and Rust do by design)
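    For illustration, here is roughly what that decoupling can look like, sketched in Python with typing.Protocol (structural interfaces in the spirit of Go's); all of the names here are made up:

```python
from typing import Protocol

# Polymorphism comes from a structural interface: no base class needed.
class HasArea(Protocol):
    def area(self) -> float: ...

# Plain data types; neither declares any relationship to HasArea.
class Square:
    def __init__(self, side: float) -> None:
        self.side = side

    def area(self) -> float:
        return self.side * self.side

class Circle:
    def __init__(self, radius: float) -> None:
        self.radius = radius

    def area(self) -> float:
        return 3.14159 * self.radius ** 2

# Code reuse is a free function over the interface, inherited from nowhere.
def total_area(shapes: list[HasArea]) -> float:
    return sum(s.area() for s in shapes)

print(total_area([Square(2.0), Circle(1.0)]))  # → 7.14159
```

    Encapsulation stays per type, polymorphism lives in the interface, and reuse is just a function: three separate tools instead of one class hierarchy.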



  • @topspin yeah, but that way people can't just go implement it like it says on the page.

    There's a reason half the programming articles on the internet are "x is not a silver bullet"

    That principle in particular should be abstract: sometimes, you may have to break it. But you should at least TRY to design your system such that adding stuff is easy, and making random sweeping architectural changes is rare.



  • @Bulb yeah no.

    For the first part, design patterns are meant to be a descriptive thing: some guys solved a lot of problems and named some styles of solving some of them, which made them easier to talk about. The second design patterns become scripture, you have problems.

    But as for the second part? Designing large, complex systems is hard. OOP is a good tool to have for dealing with some of them. If you don't know what you're doing, you'll make any system unusable.

    There are people who believe they can just stick to some books and not plan their architecture, or that doing the same thing everywhere will work. They reach for abstraction when they don't need to. If you aren't one of them, and could write a simple app in an OO language, you have just proved that the problem isn't OO. The problem is what it always is.

    The source of every :wtf:

    HUMANS



  • @Bulb The sad thing is XML is just S-expressions...



  • Maybe not OOP in itself (as @Magus said, it's good to have (for the types of problems it's supposed to address)), but (a) object-oriented languages that force everything into the OO mould, and (b) programmers who think everything is an object-orientable problem.

    OOP wants everything it deals with to be nouns, but sometimes you want more than just nouns to express ideas. FP verbs everything, and ... um, maybe it's table-oriented programming where everything is an adjective.



  • @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @mott555 said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    especially among C++ programmers

    Trust me, C++ is one of the languages with lower prevalence of this plague… let's call it, say, layeritis.

    I've heard "lasagna code" (as opposed to "spaghetti code")



  • @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    For the first part, design patterns are meant to be a descriptive thing: some guys solved s lot of problems, and named some styles of solving some of them.

    And that's good and well. But many of the problems are problems, or are more difficult, because of OOP. Not because the idea exists, but because of how it was sold as the end of all things and the only one you need.

    @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    The source of every :wtf:: HUMANS

    Of course. Only humans sell this or that as a silver bullet, so…

    @Watson said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    (a) object-oriented languages that force everything into the OO mould, and (b) programmers who think everything is an object-orientable problem.

    … yes, this. And (c) the designers of OOP languages, who sold the approach as a silver bullet and the one true way, designed their languages that way, and put that idea in programmers' heads.

    The tools of OOP are good and useful. Polymorphism and methods are great things. But OOP as a uniform closed solution, not so much.



  • @hungrier said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I've heard "lasagna code" (as opposed to "spaghetti code")

    Lasagna code is the effect, or symptom. Layeritis is the disease.



  • @Bulb why yes, any time anyone tries to sell The One True Solution™ they are either complete idiots, lying or both.
    And if people believe them, it usually ends up in tears.



  • @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    That basically means breaking off from the traditional understanding of OOP, decoupling data types from encapsulation, polymorphism from code reuse and forgetting about inheritance altogether.

    Interestingly, one of the more well-known books about design patterns (Clean Architecture), which you seem to blame for messy OOP code, says basically exactly that. And pretty much everyone else who has ever written a book on the subject agrees that introducing inheritance relationships just to share common code is fundamentally wrong.

    I wouldn't say that either OOP, books on design patterns or the idea of design patterns are the culprit. What's horribly broken is the way we teach OOP to newcomers. First of all, we have to start accepting that OOP is an advanced tool that novice programmers will not truly understand until they've written a few larger programs. And then we have to nuke all tutorials that introduce inheritance using biological classifications from orbit.


  • 🚽 Regular

    @topspin said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I like SOLID quite a bit

    I always found the description to be too abstract.

    That's because it's meant to be a base for your implementations, not something you can instantiate directly.

    (I'll show myself out)


  • BINNED

    @Zecc said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @topspin said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Magus said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I like SOLID quite a bit

    I always found the description to be too abstract.

    That's because it's meant to be a base for your implementations, not something you can instantiate directly.

    (I'll show myself out)

    I struggled with phrasing that, being aware that "abstract" doesn't just have the usual connotations but also fails to express what I meant: the definition doesn't explain anything at all. Compare with Liskov, which, of the 5, I find the most clearly defined.

    Also, I'm not the only one frustrated by the wording, smarter people than me have written about it:

    Now I’m not naïve enough to expect everything in a principle to be clear just from the title, but I do expect some light to be shed. In this case, unfortunately I’m none the wiser.



  • @dfdub said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    more well-known books about design patterns (Clean Architecture)

    That was published in 2017, which is really, really new in this context. It deviates from the original pitch a lot.

    @dfdub said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I wouldn't say that either OOP, books on design patterns or the idea of design patterns are the culprit.

    You are basing that idea on the new publications that are already undoing the bigger faults of the original approach. See, the Simple Made Easy presentation is from 2012, and the Singletons: Solving Problems You Never Know You Never Had Since 1995 article, which I currently can't pry even from the Wayback Machine, is even older. Before that, Object-Oriented Programming was very much inheritance for everything.

    @dfdub said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    What's horribly broken is the way we teach OOP to newcomers. First of all, we have to start accepting that OOP is an advanced tool that novice programmers will not truly understand until they've written a few larger programs.

    The newer architecture materials basically did a heel-face turn on inheritance. It used to be the silver bullet but it's the naughty word now.

    But because nobody had the guts to stop calling it object-oriented programming now that it's completely different from what it was in the hype period of ~1995–2005, the tutorials from that time stick around.

    @dfdub said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    And then we have to nuke all tutorials that introduce inheritance using biological classifications from orbit.

    Or using anything else, really. We should focus on teaching Liskov's Substitution Principle and how almost nothing can really be a subtype of anything else once it's mutable, since you need covariance for getters but contravariance for setters.
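    The variance point is visible even in Python's type annotations: a read-only Sequence is covariant (reading is safe), while a mutable container has to be invariant, precisely because writing would need contravariance. A small sketch, with made-up functions:

```python
from typing import MutableSequence, Sequence

# Reading is covariant: a function that only *reads* floats
# happily accepts a sequence of ints.
def total(xs: Sequence[float]) -> float:
    return sum(xs)

ints: list[int] = [1, 2, 3]
print(total(ints))  # → 6

# Writing needs contravariance: a function that *appends* a float
# must not receive a list of ints, or 0.5 sneaks into it.
def pad(xs: MutableSequence[float]) -> None:
    xs.append(0.5)

# pad(ints)  # a type checker rejects this; at runtime it would corrupt `ints`
```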


  • 🚽 Regular

    @topspin said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I struggled with phrasing that, being aware that "abstract" doesn't just have the usual connotations but also fails to express what I meant: the definition doesn't explain anything at all.

    Jokes aside, I concur with the choice of "abstract". Alternatively... hmm... "vague"?

    @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    We should focus on teaching Liskov's Substitution Principle

    In my opinion, LSP, insofar as it is defined as "it's okay to substitute derived classes for their bases", should come as a consequence of the Dependency Inversion principle (insofar as it is defined as "you should depend on interfaces, not implementations") and the Single Responsibility principle (i.e., "mind your own business; if you depend on the other guy doing something in a certain way, you're :doing_it_wrong:"). Ideally, anyway; reality has a tendency to get in the way.



  • @Zecc said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    LSP insofar as it is defined as "it's okay to substitute derived classes for their bases"

    I understand it as saying “if it's not ok to substitute it for the base class in any and every context, don't make it a derived class”. And the main point here is that many things that look like specializations, and would therefore seem possible to implement by derivation, actually are not when you consider the invariants of all the methods. There is the classic example with a circle and an ellipse: an immutable circle is a specialization of an ellipse, but a mutable one is not (because you can set its major and minor axes to different values via the base class, and then it's not a circle anymore).

    (Of course most interfaces should only have one method, and if you follow the single responsibility principle you'll usually end up with just that.)
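    The circle/ellipse trap fits in a few lines of Python (hypothetical classes, purely for illustration):

```python
class Ellipse:
    def __init__(self, major: float, minor: float) -> None:
        self._major, self._minor = major, minor

    def set_axes(self, major: float, minor: float) -> None:
        self._major, self._minor = major, minor

    def axes(self) -> tuple[float, float]:
        return self._major, self._minor


class Circle(Ellipse):
    """Geometrically a circle IS an ellipse, so derivation looks natural."""
    def __init__(self, radius: float) -> None:
        super().__init__(radius, radius)


def stretch(e: Ellipse) -> None:
    e.set_axes(4.0, 2.0)  # perfectly legal against the Ellipse contract...


c = Circle(1.0)
stretch(c)       # ...but now this "Circle" has unequal axes
print(c.axes())  # → (4.0, 2.0): the circle invariant is silently broken
```

    No method lied on its own; the violation only shows up when you check the subclass invariant against everything the base class allows.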



  • @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    The newer architecture materials basically did a heel-face turn on inheritance. It used to be the silver bullet but it's the naughty word now.

    I don't have enough :belt_onion: to know what used to be taught in 1995, but I've definitely heard and read "composition over inheritance" everywhere in the first decade of the new millennium. If inheritance was ever sold like a silver bullet (which I doubt), then it certainly hasn't been for years.

    But because nobody had the guts to stop calling it object-oriented programming now that it's completely different from what it was in the hype period of ~1995–2005, the tutorials from that time stick around.

    No, it's not just old tutorials that are the problem. People are still trying to teach newbies who can barely write FizzBuzz about the benefits of inheritance and they necessarily have to use stupid analogies for that which violate all OOP design principles.

    The actual problem is the idea that every programming concept should be oversimplified in training materials so that absolute novices can understand it. That particular brain worm is way too widespread. For some reason, people refuse to accept that training programmers takes time and a lot of practical experience.

    We expect people to start hacking together programs using advanced techniques and interacting with complex frameworks way too quickly instead of taking the time to teach each programming concept and its proper use incrementally. Just gluing stuff together until your program works is encouraged way too much and it takes new programmers years to un-learn that later and actually write good code.



  • @dfdub said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    No, it's not just old tutorials that are the problem. People are still trying to teach newbies who can barely write FizzBuzz about the benefits of inheritance and they necessarily have to use stupid analogies for that which violate all OOP design principles.

    Yeah, teachers who themselves stopped at those old teaching materials. And people who keep teaching what they themselves were taught, without realizing they are not actually using it anymore because it turned out to be a pain in the arse (it's a weird tendency, but it seems rather prevalent).

    And of course trying to teach the benefits of inheritance leads to stupid analogies that violate good design principles, simply because inheritance does not actually have many benefits and was long ago declared evil and replaced with implementation of interfaces and composition.

    @dfdub said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    The actual problem is the idea that every programming concept should be oversimplified in training materials so that absolute novices can understand it. That particular brain worm is way too widespread. For some reason, people refuse to accept that training programmers takes time and a lot of practical experience.

    You are on to something here. Teaching programmers always seemed to suck and still seems to suck.



  • @dfdub I'm glad to note that when I'm teaching introductory programming classes (Python mainly), I don't even mention the idea of inheritance except with particular people who are doing more advanced things like interacting with GUI frameworks (where inheritance is inevitable IMX). I don't even use frameworks (except for those aforementioned advanced GUI students)--everything is hand-typed + standard library.

    HTML/CSS isn't programming in the same way, but I do the same thing there. I teach basic principles and they hand-write everything (ok, boilerplate HTML is done with a template, but that doesn't get them very far)



  • @Benjamin-Hall said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    GUI frameworks (where inheritance is inevitable IMX)

    Well, there are some emerging new ones where it ain't, either because they are based on the web or use a similar kind of structure where you are just binding data and callbacks, or because they go full ECS. And with the traditional ones, well, it's not always needed even there, if they utilize signals & slots properly.


  • Discourse touched me in a no-no place

    @Bulb Very large GUIs can be done without inheritance. The only point when you actually need inheritance in a GUI is when you're doing an actually new control, and that's an extremely advanced topic (assuming you want to do it right). It's not something that most programmers should ever take on.

    A dialog with a bunch of standard controls in it? No need for inheritance there.



  • @dkf said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    The only point when you actually need inheritance in a GUI is when you're doing an actually new control

    Is it inheritance when you are doing a new control in a HTML-based GUI by decorating a <div> with a class= and creating a template with some standard subelements? It's called a ‘class’, but it's not really a type in the first place, and the stylesheet inherits values, but in a way that's very unlike, say, Java.


  • Banned

    @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    Is it object-oriented programming itself, luring architecture astronauts to add plenty of interfaces and wrappers in the hope they can separate blocks that are intimately related anyway due to the inherent complexity of the task?

    With great power comes great responsibility. Conversely, with little responsibility comes little power. The only languages even worth looking at for serious projects are ones where such abominations are possible. No matter which paradigm you choose, there's always going to be a lot of shitty lasagna code written by other devs that you then have to deal with. The object-oriented lasagna is actually one of the easier kinds to dig through.

    Object-oriented programming uses classes both as data types and as units of encapsulation. This suggests to the developer that each class should be encapsulated separately, encouraging extra layers even where the problem being solved can't be cleanly separated. It also uses classes as the tool for code reuse, but a lot of logic is not tied to one and only one entity, which leads on one hand to a rather arbitrary distribution of methods and on the other to helper classes that make even less sense.

    For this reason, I think the greatest sin of Java was disallowing free functions. It makes teaching other paradigms much harder than it should be. Objects are cool, but only where they make sense.

    Side-μrant: if you have mocks in your unit tests, your design most likely sucks.


  • BINNED

    @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    Side-μrant: if you have mocks in your unit tests, your design most likely sucks.

    Care to elaborate?

    No, I don't feel bitten by that statement, I'm just curious. Having "mocks in your unit test" would probably be a step up around here.


  • Discourse touched me in a no-no place

    @Bulb said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    Is it inheritance when you are doing a new control in a HTML-based GUI

    I don't know. It depends on what operations and behaviours you support on the control. Static controls really don't need much, and controls that are dynamic but not under direct user control quite possibly don't need much else either (they should be just tracking their model object, after all) as the standard components have that sort of thing largely covered. But anything with actual serious interactivity probably does need inheritance and other OO stuff; it's also hard to write anyway (because there's just… so… many… ways… that users can interact with stuff; it's not unusual for a medium-complexity control to need 30 or more bindings for events).



  • @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    the greatest sin of Java was disallowing free functions.

    Oh gods tits yes.


  • Discourse touched me in a no-no place

    @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    I think the greatest sin of Java was disallowing free functions.

    The sin is reduced from Java 8 onwards. The mess is still there, but you don't have to actually manually pay the piper quite so much (except for the initial main call, but that's a tiny part of all but the most trivial program).


  • Banned

    @topspin not a universal rule by any means, but it's a good first approximation.

    There's basically two kinds of code:

    1. Code that calculates something.

      • If you don't have clearly defined inputs and outputs of the calculation, your design sucks.
      • If you have clearly defined inputs and outputs, but they're passed differently than through arguments and return values, your design sucks.
      • If you have clearly defined inputs and outputs and they're passed through arguments and return values, there's nowhere to put a mock in.
    2. Code that coordinates calling other code.

      • Here, mocks are fine.
      • But usually, this code is so trivial that a couple very basic tests give you 100% coverage.
      • If the code isn't so trivial, it's possible you have your calculating code mixed inside of your coordinating code. Which means your design sucks.
      • In the aforementioned trivial cases, usually you don't need full-blown mocks - a simple stub or fake will do just fine, and leads to more manageable test code. But I admit this point isn't strictly about design.

    Also, I should note I'm not a fan of "one test unit is always one class" approach. I often write tests where the lowest level of granularity is several objects working together, usually in different combinations - and I'm not talking about data objects. #unpopularopinion
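    A minimal Python sketch of those two kinds of code (every name here is made up): the calculation is a pure function you test directly, and the coordinator is trivial glue covered with a hand-written fake rather than a mock framework.

```python
# 1. Code that calculates: inputs in, outputs out. Nothing to mock.
def apply_discount(total: float, loyalty_years: int) -> float:
    rate = min(0.05 * loyalty_years, 0.25)  # capped at 25%
    return round(total * (1 - rate), 2)

# 2. Code that coordinates: trivial glue, covered with a hand-written fake.
class OrderService:
    def __init__(self, repo) -> None:
        self.repo = repo

    def checkout(self, order_id: str, loyalty_years: int) -> float:
        return apply_discount(self.repo.total_for(order_id), loyalty_years)

class FakeRepo:
    """A fake, not a mock framework: just enough behavior for the test."""
    def total_for(self, order_id: str) -> float:
        return 100.0

assert apply_discount(100.0, 2) == 90.0                     # pure, direct
assert OrderService(FakeRepo()).checkout("A1", 10) == 75.0  # glue, via fake
```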



  • @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    Also, I should note I'm not a fan of "one test unit is always one class" approach.

    Indeed. A test unit usually should coincide with a unit of encapsulation. Since it turns out a class is often not the right granularity for encapsulation, it's no longer the right unit of test either.



  • @dkf said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Bulb Very large GUIs can be done without inheritance. The only point when you actually need inheritance in a GUI is when you're doing an actually new control, and that's an extremely advanced topic (assuming you want to do it right). It's not something that most programmers should ever take on.

    A dialog with a bunch of standard controls in it? No need for inheritance there.

    A lot of (Python) GUI frameworks require that your main class inherit from the base "window" class, at least if you want to stay sane. And all the widgets inherit from each other, so you have to understand that concept (even if you don't have to write the class hierarchy yourself) to know whether you can pass a FooWidget into a parameter that wants a BaseWidget or not. That's about the level I talk about it.



  • @Gąska
    To add another unpopular opinion: I also don't think everything must be covered by unit tests. It's perfectly fine to write unit tests for the complex computations and a few high-level integration tests for the rest of your application. Tests should serve as documentation and brittle tests that need to mock complex objects are just dead weight that's going to make refactoring and evolving your application harder instead of easier.



  • @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    Code that coordinates calling other code.

    Here, mocks are fine.
    But usually, this code is so trivial that a couple very basic tests give you 100% coverage.

    That is more or less the exact opposite of my experience: the individual computations, where you have a limited, clearly defined set of inputs and outputs, are the parts where it is comparatively easy to identify, and devise test cases for, all possible execution paths. The complex, hard-to-verify parts are the points where everything comes together: the required behavior inherently depends on bits and pieces of information from all over the place, and it's impossible to reasonably separate these into independent parts because the inter-dependencies are inherent to the problem domain.


  • BINNED

    @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    If you don't have clearly defined inputs and outputs of the calculation, your design sucks.

    I have that problem quite often when the computation is some numerics stuff and nobody can tell me what the output should be. ☹
    Usually there are some simplistic cases where you can test against analytical solutions, but other than that my tests just consist of "results didn't change".
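
    A sketch of that two-tier approach (the integrator here is just a stand-in for "some numerics stuff"; the pinned value is the kind of number you'd record from a trusted run):

```python
import math

def trapezoid(f, a, b, n):
    # Composite trapezoidal rule, standing in for the real computation.
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

# Tier 1: a simplistic case with a known analytical solution.
# The integral of sin over [0, pi] is exactly 2.
assert abs(trapezoid(math.sin, 0.0, math.pi, 1000) - 2.0) < 1e-5

# Tier 2: no analytical answer, so just pin the current behavior
# ("results didn't change"). GOLDEN was recorded from a trusted run;
# 0.7468241 matches the known value of the integral of exp(-x^2) on [0, 1].
GOLDEN = 0.7468241
assert abs(trapezoid(lambda x: math.exp(-x * x), 0.0, 1.0, 1000) - GOLDEN) < 1e-5
```

    The second assertion won't tell you the code is *right*, only that a refactor didn't silently change the numbers - which is exactly what it's for.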


  • Banned

    @ixvedeusi sounds like you could use the command design pattern or something similar: converting path selection into value computation and a trivial switch. Easier to maintain, easier to test.
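
    Roughly this kind of thing (a hedged sketch of the suggestion, with made-up commands; here plain functions play the role of command objects):

```python
# Each execution path becomes a small, separately testable command.
def cmd_scale(value, factor=2):
    return value * factor

def cmd_offset(value, delta=10):
    return value + delta

# Path selection collapses into one table lookup - the "trivial switch".
COMMANDS = {
    "scale": cmd_scale,
    "offset": cmd_offset,
}

def run(name, value):
    try:
        return COMMANDS[name](value)
    except KeyError:
        raise ValueError(f"unknown command: {name}")
```

    Each command is a pure value computation with obvious inputs and outputs, and the dispatcher itself needs only a couple of trivial tests.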


  • Banned

    @topspin said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    If you don't have clearly defined inputs and outputs of the calculation, your design sucks.

    I have that problem quite often when the computation is some numerics stuff and nobody can tell me what the output should be. ☹
    Usually there are some simplistic cases where you can test against analytical solutions, but other than that my tests just consist of "results didn't change".

    I know that feel. Data scientists and business analysts rarely care about the exact functionality of the product, as long as the numbers "feel" correct.



  • @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @topspin said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    If you don't have clearly defined inputs and outputs of the calculation, your design sucks.

    I have that problem quite often when the computation is some numerics stuff and nobody can tell me what the output should be. ☹
    Usually there are some simplistic cases where you can test against analytical solutions, but other than that my tests just consist of "results didn't change".

    I know that feel. Data scientists and business analysts rarely care about the exact functionality of the product, as long as the numbers "feel" correct.

    I even think that most devs who have been around for a few projects have been asked to make things less correct for raisins.
    And even being asked to break the law is all too common.



  • @Gąska said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    @ixvedeusi sounds like you could use the command design pattern or something similar: converting path selection into value computation and a trivial switch.

    Yeah well, that sounds like a really nice idea in theory, and would work really well if you actually had a set of cleanly separable paths, in which case you wouldn't have had the problem in the first place. In the other cases, that leads straight to Codethulhu Spaghetti Lasagna.


  • Discourse touched me in a no-no place

    @Benjamin-Hall said in Is Object Oriented Programming the worst thing since the Spanish flu?:

    At least if you want to stay sane.

    Why would you want to do that???