Standarditis



  • "Is Object-Oriented Programming the best programming paradigm? No. It’s fucking stupid. But it’s the industry standard. It’s a terrible standard. But it’s the standard. The most popular languages are all object-oriented. The vast majority of employed programmers are paid to write object-oriented code.

    It gets worse. If you’re a software company, you can attempt to buck the trend by writing your code in a non-OOP language. That’s often a superior technical decision. But you will find it almost impossible to find talent. It’s very difficult for oopers to unlearn OOP. You’re better off hiring people who have never written a single line of code ever, because their brain hasn’t been poisoned with OOP “design patterns”. So, from the perspective of the company, you can either accept the OOP technical overhead (which all of your competitors also suffer), or accept the human resources overhead (typically much larger).

    Crucially, that HR overhead is convex. If your company becomes sufficiently large, you no longer have the capacity to judge potential employees as individuals. You have to pick an arbitrary standard and judge them against that standard. That becomes extremely difficult when you’re not following the same standard as everyone else.

    The cost of having no standard is far greater than the cost of having a bad standard.

    Usual disclaimers about broad generalizations apply. But, the programmer job market tends to demand interchangeable programmers with a more or less standardized set of skills.

    So…it makes sense for CS departments to train their students in Object-Oriented Programming. The absolute best thing you can say about contemporary universities is that they are expensive trade schools. Universities long ago abandoned the Ciceronean idea of education as a means of sharpening the mind and creating the “ultimate individual.” There’s a lot more money to be made in white-collar vocational training and communist indoctrination. Universities charge students outrageous quantities of money with the promise of delivering an upper-middle-class income at the end of the rainbow. In order to pretend to meet this promise, everyone with a BS in Computer Science needs to be trained against the industry standard.

    The software industry demands standardized inputs, and so the universities respond by standardizing their outputs."



  • @jinpa like everything, OOP is a tool to be used. Do not blame the tool, blame the tools who misuse the tool.

    There are times I go very functional, times I lean into OOP and times :eek: i do both in the same project. Correct tool for job.

    But I’ll give you that the industry is fucked in general and the universities are making it worse.
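    A toy sketch (all names made up) of what that mix-and-match can look like in Python: a class where state and invariants matter, plain functions where it's just a data transformation.

```python
from dataclasses import dataclass

# OOP where state and an invariant live together...
@dataclass
class Account:
    owner: str
    balance: float = 0.0

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

# ...and a plain function where it's just a data transformation.
def total_balance(accounts: list) -> float:
    return sum(a.balance for a in accounts)

accounts = [Account("alice"), Account("bob")]
accounts[0].deposit(100.0)
accounts[1].deposit(50.0)
print(total_balance(accounts))  # 150.0
```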



  • @jinpa found this on your link

    Can you imagine if a country had two incompatible power grids? That would be retarded.

    down here it's either 220v or 127v depending on the city, and it's not uncommon to have both in the same building in case you have some appliance using the wrong type for your city


  • Discourse touched me in a no-no place

    @jinpa The useful bits of OOP are encapsulation and naming. These basic values belong together with this profile of operations that apply to them. I've worked with codebases where everything was basic types (tuples, dictionaries, etc.) and that is not a fun experience at all; so easy to get lost. Which is why giving those things names helps.

    Going all in on inheritance for everything is... a mark of someone likely using the wrong tool.
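    A minimal Python illustration of the point (hypothetical names): the same data as a bare tuple versus a named type that carries its operations with it.

```python
from dataclasses import dataclass

# Bare basic types: easy to get lost in.
point_t = (3.0, 4.0)  # which slot is x? which is y?

# Encapsulation plus naming: the values and the operations
# that apply to them travel together.
@dataclass(frozen=True)
class Point:
    x: float
    y: float

    def magnitude(self) -> float:
        return (self.x ** 2 + self.y ** 2) ** 0.5

p = Point(3.0, 4.0)
print(p.magnitude())  # 5.0
```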



  • @dkf said in Standarditis:

    Going all in on inheritance for everything is...

    a mark of someone straight out of school.

    Focusing on a single tool at a time is a good way to get a better understanding of it. It's just that nobody bothers telling students that mix-and-match is a thing.


  • BINNED

    @jinpa said in Standarditis:

    "Is Object-Oriented Programming the best programming paradigm? No. It’s fucking stupid. But it’s the industry standard. It’s a terrible standard. But it’s the standard. The most popular languages are all object-oriented. The vast majority of employed programmers are paid to write object-oriented code.

    These are not all the same.
    Extremely overdoing it with OOP like it's the 90s is certainly "fucking stupid" in many cases, but the language supporting OOP doesn't mean you have to go all in with it, including crazy inheritance hierarchies and OOP-pattern madness. Well, maybe unless it's early Java.

    OOP is a tool, "no silver bullet" applies, don't use it for everything. The industry treated it as one until a decade ago, I guess, but I don't think that's still true everywhere.

    I don't want to disagree too much, though, because personally I like to chime in with the "fuck OOP" sentiment.

    Crucially, that HR overhead is convex.

    I have no idea what he means by "convex" here, which is surprising considering he comes from a math background.

    So…it makes sense for CS departments to train their students in Object-Oriented Programming. The absolute best thing you can say about contemporary universities is that they are expensive trade schools. Universities long ago abandoned the Ciceronean idea of education as a means of sharpening the mind and creating the “ultimate individual.” There’s a lot more money to be made in white-collar vocational training

    That does seem to be a problem. Trade schools and universities are both needed, but universities have, ostensibly, different goals. They shouldn't be trade schools for programmers.

    and communist indoctrination.

      :rolleyes:  

    Every math major has the following experience:

    • is good at and enjoys ACT-type BS problems
    • majors in math so he can spend 4 years and $200,000 of his parents’ money solving ACT-type BS problems
    • discovers at some point that “math” is an entirely different subject from ACT-type BS problems
    • likes the new subject
    • graduates
    • has spent 4 years and $200,000 learning a largely useless subject that most people don’t even know exists
    • I guess I’ll go get a PHD?

    I want to convince you that there is an entirely different subject that is also called “mathematics”. Let’s call it Mathematics 2. Mathematics 1 is horrible, ugly, confusing, and is intentionally designed to maximize human suffering. Mathematics 2 is simple, beautiful, full of wonder and mystery, and yet still makes an unbelievable amount of sense.

    I'm not sure if "Mathematics 1" here refers to what he earlier calls "ACT-type BS problems" or not. If not, he actually seems to be talking about three types. I also have to disagree on the "largely useless subject" part, especially considering that a university is not a trade school.



  • @sockpuppet7 said in Standarditis:

    @jinpa found this on your link

    Can you imagine if a country had two incompatible power grids? That would be retarded.


  • 🚽 Regular

    @topspin said in Standarditis:

    I have no idea what he means by "convex" here

    I don't either, but I'm going to wild ass-guess it refers to the shape of the graph of "more HR" vs overhead. In other words, first derivative does not go brrrt.

    INB4 convex is concave upside-down.
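    For what it's worth, one hedged reading of "convex", sketched with a made-up cost model (not a claim about real HR data): the marginal cost of vetting each additional hire keeps rising, so the total-overhead curve bends upward.

```python
# Hypothetical model: total vetting overhead as a convex
# function of headcount n; any upward-bending curve will do.
def cost(n: int) -> int:
    return n * n

# Convexity in one line: the marginal cost is strictly increasing.
marginal = [cost(n + 1) - cost(n) for n in range(5)]
print(marginal)  # [1, 3, 5, 7, 9]
```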



  • @topspin Yeah, I don't agree with everything he said, but I know him from another forum and thought he had some interesting thoughts. He's a mathematician, not a full-time programmer. I would agree with you that OOP is mainly a problem when it's dogmatic. I think there are fewer OOP purists than there were at one time.



  • @Gurth said in Standarditis:

    @sockpuppet7 said in Standarditis:

    @jinpa found this on your link

    Can you imagine if a country had two incompatible power grids? That would be retarded.

    the ones in kV are probably distribution lines, you won't have those in your sockets



  • @sockpuppet7 and you probably don't want to, unless your name is Porygeekely.



  • @jinpa I often code in Python. While most of the time I can get away with using a bunch of functions, when the code gets sufficiently complex, I find that OOP really helps to organize it.



  • @jinpa Oop can easily get stoopid, but that's true for every programming paradigm. Shit programmers write shit code. I like the new generation of languages that simply pick parts of all paradigms and let you work in whatever way makes your current problem easiest to solve.
    And even Java is slowly changing in that direction as well.


  • Java Dev

    @sockpuppet7 said in Standarditis:

    @Gurth said in Standarditis:

    @sockpuppet7 said in Standarditis:

    @jinpa found this on your link

    Can you imagine if a country had two incompatible power grids? That would be retarded.

    the ones in kV are probably distribution lines, you won't have those in your sockets

    The voltages don't matter that much. But converting between 50Hz and 60Hz is very nontrivial. If the generator in the power plant runs at 3000 RPM, then everything is 50Hz. If it runs at 3600 RPM, then everything is 60Hz. To do anything else is madness.



  • @jinpa said in Standarditis:

    Is Object-Oriented Programming the best programming paradigm? No. It’s fucking stupid. But it’s the industry standard.

    It's funny that this thread is titled "Standarditis". The exact problem here isn't the value of OOP, it's that programmers are disengaging their brains and "doing what everyone else is doing", aka following a so-called standard.

    Let's assume everyone else that lives on your block works in construction and they buy big trucks. You are a programmer. If you buy a truck, you aren't "following the standard", you are engaging in cargo-cult behavior.

    This applies to the original analysis. The people making disasters are the ones simply doing what other people do rather than learning a set of tools and applying them appropriately to their work.

    OOP training isn't helping here. It's really easy to explain inheritance by using real-world analogies like animals, but inheritance doesn't come in all that handy in the real world, and it is actually pretty hard to do correctly in non-trivial cases. Interfaces, on the other hand, are harder to explain and have few easily graspable real-world analogies, so OOP students sometimes come out of training with a lesser grasp of interfaces than of inheritance. Yet, interfaces are the real workhorse of OOP. Writing base classes intended for inheritance is usually best left to framework developers and architects.
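    A sketch of that last point in Python (names invented): the consumer codes against an interface-like Protocol, and any type with the right method satisfies it, no base class needed.

```python
from typing import Protocol

# The interface: just a contract, no inherited behavior.
class Comparable(Protocol):
    def less_than(self, other: "Comparable") -> bool: ...

# Satisfies the contract simply by having the right method.
class Version:
    def __init__(self, major: int, minor: int):
        self.major, self.minor = major, minor

    def less_than(self, other: "Version") -> bool:
        return (self.major, self.minor) < (other.major, other.minor)

# Written against the interface, not any concrete class.
def smallest(items):
    result = items[0]
    for item in items[1:]:
        if item.less_than(result):
            result = item
    return result

vs = [Version(2, 1), Version(1, 9), Version(2, 0)]
print(smallest(vs).major, smallest(vs).minor)  # 1 9
```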



  • @Jaime said in Standarditis:

    This applies to your original analysis.

    The author is not me. :pendant:



  • @jinpa My apologies, I corrected "your original analysis" to "the original analysis".


  • Notification Spam Recipient

    @jinpa you forgot the bit where most oop shops could go functional and their codebases wouldn’t change drastically. I suspect that there is a good market out there for C but with classes. Go maybe? It’s got Google all over it so it’s going to be taken out and shot soon.



  • @DogsB said in Standarditis:

    @jinpa you forgot the bit where most oop shops could go functional and their codebases wouldn’t change drastically. I suspect that there is a good market out there for C but with classes. Go maybe? It’s got Google all over it so it’s going to be taken out and shot soon.

    Google already abandoned it once before, didn't they?
    Kotlin native is another alternative, but it needs a lot of 3rd party libraries.



  • @DogsB said in Standarditis:

    I suspect that there is a good market out there for C but with classes.

    Nearly all of the C++ I've done has been just "C with classes" and using almost none of the other features of C++, not even the I/O operators (because it's running in an environment where there is no real stdout; printf is implemented to write its output into a buffer from which the host OS can read and display it). And even that is mostly just using (maybe occasionally extending) classes provided by the framework.



  • @sockpuppet7 said in Standarditis:

    @Gurth said in Standarditis:

    the ones in kV are probably distribution lines, you won't have those in your sockets

    The important line there is the red one, between light green on one side, and orange and yellow on the other: the net is 60 Hz to the west of it, but 50 Hz to the east. Most modern equipment should be fine if you plug it in on either side, but I don’t think I would trust any random electrical clock I just plugged in if I were in Japan.


  • Discourse touched me in a no-no place

    @Gurth Modern switch mode power supplies will work fine wherever; they just sense what they receive and make do. (This is so useful when travelling with a laptop.) Devices that take timings from the AC cycle will care. High power devices like heaters and air conditioning units will care more about the differences in voltage.

    Korea is another country that has these fun differences. And different sockets too, sometimes three different ones in the same room. I have no good explanation for this.



  • @Jaime other than cargo cult, there are the people that design a single-controller REST service with a trillion abstraction layers, hexagonal architecture, clean architecture, DDD, etc.

    Then you spend more time searching for where everything is than coding.

    I guess it's OK if it's code you're working on frequently, but I have these small services that rarely change, and when they do I have this



  • @dkf said in Standarditis:

    sometimes three different ones

    Welcome to my home. Not US ones specifically, but three different ones still.

    They say you shouldn't chain extensions/power bars. I laugh at that idea .. (but also stick to reputable brands).



  • @cvi said in Standarditis:

    power bars

    3cb556d5-01ee-4f5e-b64f-4ceb129b6dcb-image.png



  • @HardwareGeek :mlp_eww: (Specifically on the flavour, but also on the whole thing.)

    Wtf do you call the electrical things again? (ESL strikes, and googling on mobile is a pain.)



  • @cvi I've always called them power strips, but searching for "power bar" finds them, too. It's about a 50:50 mix of food and electrical.


  • 🚽 Regular

    @Jaime said in Standarditis:

    Interfaces, on the other hand, are harder to explain

    Au contraire. You can use the same animal analogy, and in fact it's better because then you can show how ITamable and IWillEatIt implementations intersect but don't overlap.
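    Something like that, in Python terms (Protocols standing in for interfaces, all names invented): the sets of implementors intersect at Chicken, but neither contains the other.

```python
from typing import Protocol

class Tamable(Protocol):
    def tame(self) -> str: ...

class WillEatIt(Protocol):
    def cook(self) -> str: ...

class Horse:        # tamable, not on the menu
    def tame(self) -> str:
        return "horse saddled"

class Chicken:      # both: this is where the sets intersect
    def tame(self) -> str:
        return "chicken in a coop"
    def cook(self) -> str:
        return "roast chicken"

class Pufferfish:   # edible (carefully), never tame
    def cook(self) -> str:
        return "fugu"

def dinner(animal: WillEatIt) -> str:
    return animal.cook()

print(dinner(Chicken()))     # roast chicken
print(dinner(Pufferfish()))  # fugu
```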


  • Notification Spam Recipient

    @jinpa said in Standarditis:

    It’s very difficult for oopers to unlearn OOP.

    I had an 'opposite' experience - learning OOP after low level non OO stuff. It was not a pleasant one. I remember constantly asking colleagues 'why is this simple thing so complicated in this language' and sometimes 'why is this so fucking stupid?'.

    their brain hasn’t been poisoned with OOP “design patterns”.

    I find people using design patterns on a daily basis the hardest to work with - it signifies a peculiar intellectual laziness that's a chore to deal with.

    @Jaime said in Standarditis:

    programmers are disengaging their brains and "doing what everyone else is doing", aka following a so-called standard.

    Which was always the case, because programmers are morons just like everyone else. Unfortunately, in the last ~8 years the phenomenon seems to have reached all-time heights. I blame javascript.

    Let's assume everyone else that lives on your block works in construction and they buy big trucks. You are a programmer. If you buy a truck, you aren't "following the standard", you are engaging in cargo-cult behavior.

    Yep, but it's even worse than classic cargo cult. Original cultists imitated actually useful actions of more advanced civilization and their rituals have known and sensible goal: cargo/wealth. In programming people imitate actions of others with no clear idea of any gains. Not only that, you can show them that their rituals have no value or even make things worse - and they will still do it.



  • @jinpa said in Standarditis:

    The most popular languages are all object-oriented.

    That's, fortunately, ceasing to be true. Go and Rust are not object-oriented and as far as I can tell (I never cared to learn much about it) Swift is not either.


    @Arantor said in Standarditis:

    @jinpa like everything, OOP is a tool to be used. Do not blame the tool, blame the tools who misuse the tool.

    OOP is, well, a buzzword with no universal understanding of what it means, but usually when a language is called object-oriented, it means it has a particular mash-up of the tools where most of them are shoehorned onto one basic category, the class. That restricts how the tools can be used.

    In the newer languages, the tools are still there (well, often inheritance is interface-only), but encapsulation is by module or scope, interfaces can be added to data types ex-post etc., and it's no longer called object-oriented.


    @dkf said in Standarditis:

    @jinpa The useful bits of OOP are encapsulation and naming. These basic values belong together with this profile of operations that apply to them. I've worked with codebases where everything was basic types (tuples, dictionaries, etc.) and that is not a fun experience at all; so easy to get lost. Which is why giving those things names helps.

    We had named structure types long before “OOP”. Yes, it's an important ingredient in that tool mash-up, but OOP didn't come up with it. And encapsulation OOP does wrong, because it ties it to the data types, but often you have several tightly coupled types that would be better put in a common capsule.


    @Jaime said in Standarditis:

    OOP training isn't helping here. It's really easy to explain inheritance by using real-world analogies like animals, but inheritance doesn't come in all that handy in the real world, and it is actually pretty hard to do correctly in non-trivial cases.

    The animal and geometric examples, notably, almost always violate the Liskov substitution principle. And the training rarely even mentions it. But if you don't follow it, you'll make a mess of things.

    Of course the gist of the Liskov substitution principle is that unless your object is immutable, you most likely can't treat it as its base class anyway.

    Which is why the language designers are abandoning class inheritance and keeping it only for interfaces. Interfaces are the subsets of methods that can be used safely across the different types of objects.
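    The classic square/rectangle version of that violation, as a Python sketch:

```python
class Rectangle:
    def __init__(self, w: int, h: int):
        self.w, self.h = w, h

    def set_width(self, w: int) -> None:
        self.w = w

    def area(self) -> int:
        return self.w * self.h

class Square(Rectangle):
    """'A square IS-A rectangle' - until mutation breaks the contract."""
    def __init__(self, side: int):
        super().__init__(side, side)

    def set_width(self, w: int) -> None:
        self.w = self.h = w  # must keep the sides equal

def stretch(r: Rectangle) -> int:
    # Callers of Rectangle may assume set_width leaves h alone.
    r.set_width(10)
    return r.area()

print(stretch(Rectangle(2, 5)))  # 50
print(stretch(Square(5)))        # 100, not 50: substitution broke
```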



  • None of this is helped by the amount of online material “explaining” these things that in fact mis-explains them, or by the amount of “here is how to build a thing” material that never explains why anything is so, so that all you have learned is how to copy/paste the next thing of the same shape.



  • @MrL said in Standarditis:

    @jinpa said in Standarditis:
    learning OOP after low level non OO stuff. It was not a pleasant one. I remember constantly asking colleagues 'why is this simple thing so complicated in this language' and sometimes 'why is this so fucking stupid?'.

    This is the level I have been at for about twenty years. I use OOP stuff when necessary to interact with the language, like Number.parseInt(), but don’t think of it as that — to my mind, I’m just calling a function there, that returns an integer when I feed it a string. I soon get very frustrated with languages that want me to make objects for everything so that those objects can then be fed to other objects which are needed by yet other objects to do something that in my mind should be very simple: “WTF can I not just tell myWindow to be h pixels high?!”

    (Yes, you’re right: other than a few months of an hour of Pascal lessons a week over 30 years ago, nobody ever actually taught me programming.)



  • @Arantor said in Standarditis:

    None of this is helped by the amount of online material “explaining” these things that in fact mis-explains them, or by the amount of “here is how to build a thing” material that never explains why anything is so, so that all you have learned is how to copy/paste the next thing of the same shape.

    Can I upvote this more than once?



  • @Arantor said in Standarditis:

    None of this is helped by the amount of online material “explaining” these things that in fact mis-explains them, or by the amount of “here is how to build a thing” material that never explains why anything is so, so that all you have learned is how to copy/paste the next thing of the same shape.

    Before learning programming, the closest thing I had done was play chess. I was the type of player where I didn't like to read books to learn famous opening moves, I liked to just use brute intellect to figure out the possible responses to a move.

    The first language I learned was C. All of the instructions were how to do things - I would have preferred a list of the rules of C, which would then allow me to figure out how to do things on my own.


  • I survived the hour long Uno hand

    @jinpa said in Standarditis:

    the rules of C

    aa625814-4d79-4147-969f-b1041413c060-image.png


  • Notification Spam Recipient

    @Gurth said in Standarditis:

    @MrL said in Standarditis:

    @jinpa said in Standarditis:
    learning OOP after low level non OO stuff. It was not a pleasant one. I remember constantly asking colleagues 'why is this simple thing so complicated in this language' and sometimes 'why is this so fucking stupid?'.

    This is the level I have been at for about twenty years. I use OOP stuff when necessary to interact with the language, like Number.parseInt(), but don’t think of it as that — to my mind, I’m just calling a function there, that returns an integer when I feed it a string. I soon get very frustrated with languages that want me to make objects for everything so that those objects can then be fed to other objects which are needed by yet other objects to do something that in my mind should be very simple: “WTF can I not just tell myWindow to be h pixels high?!”

    (Yes, you’re right: other than a few months of an hour of Pascal lessons a week over 30 years ago, nobody ever actually taught me programming.)

    Someone has had a run in with a dependency injection enthusiast. You have my sympathies.


  • Notification Spam Recipient

    @Arantor said in Standarditis:

    None of this is helped by the amount of online material “explaining” these things that in fact mis-explains them, or by the amount of “here is how to build a thing” material that never explains why anything is so, so that all you have learned is how to copy/paste the next thing of the same shape.

    The cult of Bob would like to give you all the answers. Pity most of their examples wouldn’t survive a code review outside their toy box or deadlines. Good start but most of it doesn’t scale. I will concede the books probably have pulled up the standard of output for some cultists.



  • @jinpa I learned PHP from the manual which is mostly “here is the syntax, here are the functions, here are brief snippets of how these functions could, but probably shouldn’t, be used”.

    I think this saved me from a lot of the brain measles. That and growing up in the era of books of type-in listings where they’d have breakdowns of code, explanations of why, and then a glorious “going further” chapter that came with such wonderful suggestions as “user input shouldn’t break your program, check that the user input is what you expect”. That guidance got into my head aged about 7 and has stayed with me.



  • @Arantor I would imagine that there was a big advantage in learning programming when you were 7, rather than waiting until you were in your forties. 😁



  • @jinpa well, yes, when I was 7 the world was as complex as ZX Basic, a 256x192 screen with colour clash and loading off 3 inch disks.



  • @Gurth said in Standarditis:

    @MrL said in Standarditis:

    @jinpa said in Standarditis:
    learning OOP after low level non OO stuff. It was not a pleasant one. I remember constantly asking colleagues 'why is this simple thing so complicated in this language' and sometimes 'why is this so fucking stupid?'.

    This is the level I have been at for about twenty years. I use OOP stuff when necessary to interact with the language, like Number.parseInt(), but don’t think of it as that — to my mind, I’m just calling a function there, that returns an integer when I feed it a string.

    This example nicely demonstrates the limitation of “functions belong to the data”—should a function that parses a string into an integer belong to the string class, because it parses it, or to the integer class, because it constructs it (from a string)?

    And yes, to me it's also still “just a function”, because I also learned OOP as an add-on to procedural languages, not as the start and end of it all.

    I soon get very frustrated with languages that want me to make objects for everything so that those objects can then be fed to other objects which are needed by yet other objects to do something that in my mind should be very simple: “WTF can I not just tell myWindow to be h pixels high?!”

    This isn't so much languages as (standard) libraries and frameworks. Even independent of OOP, there are two kinds: the ones designed by engineers and the ones designed by architects. Engineers will be content with a hodgepodge of functions that gets the job done. It will have a lot of weird edge cases and unspoken assumptions, but it will get the typical job done in a couple of lines. Architects will instead insist on making something elegant and consistent, which will, however, require a long set-up as you state all the unspoken assumptions of the common use-case before you get to the real work.

    OOP can be credited with really letting the architecture astronauts soar—and, due to the complex (entangled; everything's tied to classes) nature of OOP, they end up with a mess anyway.

    I tend to have this feeling with .нет a lot—the base classes are nice and flexible, but would really need some helper wrappers or default parameters for the one or two use-cases that are vastly more common than all the combinations they support.

    @DogsB said in Standarditis:

    Someone has had a run in with a dependency injection enthusiast. You have my sympathies.

    Some companies are, unfortunately, run over by them. Worse, it's some of those creating major programming languages like C# and (I suppose; I don't use it much) Java.

    @Gurth said in Standarditis:

    (Yes, you’re right: other than a few months of an hour of Pascal lessons a week over 30 years ago, nobody ever actually taught me programming.)

    I had various lessons. I consider most useful the ones about data structures and complexity—which didn't touch any programming at all, just the theory—and the introduction to databases, which did get into SQL, but we started with the relational algebra concept.

    But really, you learn the most by reading a lot of other people's code and trying to fix a bug or add a small feature here or there. Open source is good for this. Just pick something that has a reputation for thorough code review¹, so you know most of the code is actually sensible.


    ¹ e.g. Linux (meaning the kernel); the 400-something-mail thread where they kicked out Hans Reiser with his reiserfs4 because it violated layering and had some odd intractable edge cases was epic².
    ² It's been over 20 years (and I stopped following Linux since), but I see you said 30.


  • I survived the hour long Uno hand

    @Arantor said in Standarditis:

    “user input shouldn’t break your program, check that the user input is what you expect”

    Also known as The Idiotitis Principle



  • @Arantor said in Standarditis:

    PHP ... these functions could, but probably shouldn’t, be used”.

    Yes.



  • @Arantor said in Standarditis:

    @jinpa well, yes, when I was 7 the world was as complex as ZX Basic, a 256x192 screen with colour clash and loading off 3 inch disks.

    :belt_onion: When I was 7, the world was as complex as
    fc951b96-1aee-422a-84a7-1a2a11f432d2-image.png



  • @izzion Guidelines are exactly what I didn't want. Suppose you're reading the rules for a new game, and some %@$+@$# mixes strategies or suggestions in with the rules.

    If I later decided I wanted to see strategies, that would be fine. But I wanted to know the rules separately from guidelines and suggested strategies.



  • Many people get so hung up on "I don't know where to start" that they forget that these are tools, and they give away all of their power of choice to whatever template they find first.

    This has come up repeatedly in the history of the profession. I remember back in the ASP.Net WebForms days, asmx based Web Services came out and were a viable way for a browser or another application to call a web application. It served a purpose and was a workable early example of one possible way to get it done.

    However, ASMX Web Services were created by inheriting from a base class, and that base class did all the interaction with the platform. Sounds great, and it worked. However, in order to run your service code, you had to create an instance of a class with inherited functionality that expected to be running on a web server. This complicated automated testing. For example, if your code touched the Session property it would work just fine on a real server, but it wasn't worth the extensive customization needed to make it work in a unit test. Now you had to abstract out session access. Repeat for the User property.

    The test-driven guys wanted something better abstracted from the web server innards. They came up with WebAPI. They succeeded and the integration was abstracted out to the router and the DI container. To this day, 95% of .Net guys will tell you that WebAPI controllers are better than Web Services, but most of them can't tell you specifics about how and why they are better. When I ask, the most common answers I get include faster, "cleaner", or "more modern". Wrong, meaningless, or more meaningless.
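    A hypothetical Python sketch of the difference Jaime describes (not the real ASMX or WebAPI APIs, and all names invented): the first handler reaches into server machinery that only exists at runtime, while the second has its dependency handed in, so a plain unit test can fake it.

```python
class ServerCoupledHandler:
    """ASMX-style sketch: live_server_session() stands in for server
    innards that only exist on a real web server, so greet() cannot
    run in a unit test. (It is deliberately never called below.)"""
    def greet(self) -> str:
        session = live_server_session()  # NameError when off-server
        return "hello " + session["user"]

class InjectedHandler:
    """WebAPI-ish sketch: the session is injected, so a test passes a dict."""
    def __init__(self, session: dict):
        self.session = session

    def greet(self) -> str:
        return "hello " + self.session["user"]

# A unit test needs no web server at all:
print(InjectedHandler({"user": "alice"}).greet())  # hello alice
```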



  • @Jaime said in Standarditis:

    Wrong, meaningless, or more meaningless.

    Just like the web itself.



  • @DogsB said in Standarditis:

    Someone has had a run in with a dependency injection enthusiast. You have my sympathies.

    I have no clue what you’re talking about here :) My example was a rough approximation of a memory of trying to get a macOS window to change size in reaction to a button click (“show/hide details” type of thing). IIRC, it involved having to set up an NSFrame object with the x and y coordinates as well as width and height of the window, then feeding that object to the NSWindow object. And as I recall, you couldn’t just give the NSFrame some numbers either but they had to be another NSThingamajig. (Couple that to the idiocy of having the screen’s origin at lower left when I would think windows usually get resized from code in such a manner that the title bar stays where it is …)

    @Bulb said in Standarditis:

    I soon get very frustrated with languages that want me to make objects for everything

    This isn't as much languages as (standard) libraries and frameworks.

    True, but the two often go hand in hand in my experience (which, admittedly, is fairly limited — partly because I just give up and think, “I can do without this shit”).

    reading a lot of other people's code

    Yeah, that’s one of my problems here :) When I read explanations with sample code, I read the text and mostly skip over the code to the next part of the text, like I would a photograph in the text. Even when I make myself read the code, I find it very difficult to follow what it does. That also goes for stuff I wrote myself longer ago than fairly recently … I tend to comment my code fairly extensively as a result.



  • @Gurth said in Standarditis:

    @Bulb said in Standarditis:

    I soon get very frustrated with languages that want me to make objects for everything

    This isn't as much languages as (standard) libraries and frameworks.

    True, but the two often go hand in hand in my experience (which, admittedly, is fairly limited — partly because I just give up and think, “I can do without this shit”).

    Well, yes, they do. At least the standard library always comes with the compiler/interpreter; that's what makes it a standard library. I still make the distinction for several reasons, though:

    • In some languages, like C, C++ or Rust, you can program without the standard library, and do it when targeting unusual environments like small embedded devices.
    • You are always restricted by what the language does or does not support, but you don't have to follow the conventions of the standard library in your code and can reimplement or wrap anything in it in what you consider more convenient interfaces.
    • Things being in the standard library rather than built into the language indicates its power. Go has associative arrays (dictionaries) built in, but until recently you couldn't implement anything similar in a library, which to me was an important reason to ignore that language.

    reading a lot of other people's code

    Yeah, that’s one of my problems here :) When I read explanations with sample code, I read the text and mostly skip over the code to the next part of the text, like I would a photograph in the text. Even when I make myself read the code, I find it very difficult to follow what it does. That also goes for stuff I wrote myself longer ago than fairly recently … I tend to comment my code fairly extensively as a result.

    I don't mean explanations with sample code. Sample code is almost always contrived. I mean actual production code. Maybe look whether some library you are using has some bugs in its bug-tracker that you can try to fix.

    And on commenting: some time ago I saw code that was very extensively commented. The guy who wrote it apparently followed a suggestion, which I read somewhere back in the day, to describe the task in words first and then insert the code, step by step. So every line was commented. Except the code then got fixed and modified and the comments never were, so they ended up being misleading. Oh, and the one thing that would have made the most sense to comment never was commented: the variables, a lot of which were i, a, b, tmp, etc.


  • Notification Spam Recipient

    @Gurth said in Standarditis:

    @DogsB said in Standarditis:

    Someone has had a run in with a dependency injection enthusiast. You have my sympathies.

    I have no clue what you’re talking about here :)

    Run Gurth! Run and never look back you beautiful summer child!

