Object-Oriented Programming is...



  • This kind of bullshit only exists in the OOP world:

    try
    {
        html = node.OuterHtml;
    }
    catch (ArgumentOutOfRangeException ex)
    {
        // "Use HtmlAgilityPack", they said. "It's the best HTML parser around", they said.
        if (ex.ParamName != "length" || ex.TargetSite.Name != "Substring")
        {
            throw;
        }
        html = "<" + node.OriginalName + ">";
    }
    

  • BINNED

    @ben_lubar said in Object-Oriented Programming is...:

    // "Use HtmlAgilityPack", they said. "It's the best HTML parser around", they said. I'd rather use regular expressions.
    

    FTFY.

    🐠


  • Discourse touched me in a no-no place

    @RaceProUK You make it a method of the object that is the class.
    OK, I think I need to explain this better. 😄

    You've written something like this (in some language that might not exist in reality; this isn't really a syntax discussion):

    class FooBar {
        # Definitions go here; syntax not important for this discussion
    }
    

    and you want to instantiate it. That means you call:

    var fred = FooBar.new();
    

    What happens? Well, the FooBar there gets you the class object that you're talking about — it's an instance of Class that represents the FooBar class, so in Java this is the thing you get from FooBar.class because Java isn't awesome enough — and then you invoke an instance method, new, on it. There's some magic going on to handle matching up the arguments with the constructor, but that's really just the language's general varargs mechanism at the fundamental semantic level (a compiler or an IDE can do smarter stuff). It's the documented behaviour of that method that it constructs a new instance of the class.

    It's not a static method, though it looks quite a lot like one, because it's just using the normal variable mechanism (or perhaps we're talking constant handling; again, that's off to the side of what I'm talking about) and the normal instance-method dispatch system.

    What is special here is not new but rather the class FooBar { … } before it. That really ought to be more like this:

    var FooBar = class.new({
        # Definitions go here; syntax not important for this discussion
    });
    

    But that's only possible to do nicely in some languages (typically the ones that allow switching to using a different semantic-level parser in subcontexts like that); it won't work well in languages that derive from C.

    Note that on that last example, neither class nor new are actually keywords. One is just a special value in the language (you typically have to have such things for other reasons) and the other is just a well-known method name.
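
    Python, as it happens, works almost exactly like this, which makes for a handy sketch (the greet method is just filler):

    FooBar = type('FooBar', (), {
        'greet': lambda self: 'hello',   # filler method for the sketch
    })

    fred = FooBar()          # "instantiating" = invoking the class object
    print(type(FooBar))      # <class 'type'> -- the class of classes
    print(fred.greet())      # hello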

    This post seems to be really confusing the syntax highlighter! Silly javascript!



  • @MathNerdCNU said in Object-Oriented Programming is...:

    @Dreikin What if those languages didn't want to use Declare in memory(Dim) syntax because it was associated with VB? 🚎

    You do know that the whole 'declare in memory' backronym is an urban legend, right? DIM actually was an abbreviation for 'DIMENSION', and it came from the FORTRAN array declaration syntax - in older BASIC dialects, scalars weren't declared at all. The original syntax was:

    <line number> DIM <var>(<size>)

    Where both <line number> and <size> were integers and <var> was a single letter; for example, 10 DIM A(8).



  • @RaceProUK Here's a hint: the reason why that particular name is used predates C++, and it wasn't an operator in the language it came from.



  • @dkf to replicate the behaviour of the new operator, you would also need some way of guaranteeing that the object returned from the factory method is A) not null and B) an instance of precisely the class in question, not a subclass.
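
    For illustration, a hypothetical Python sketch (Foo and _FooSubclass are made up) of what a plain factory method is free to do:

    class Foo:
        @classmethod
        def create(cls, broken=False):
            if broken:
                return None        # nothing stops a factory returning null
            return _FooSubclass()  # ...or an instance of some subclass

    class _FooSubclass(Foo):
        pass

    f = Foo.create()
    print(type(f) is Foo)          # False: exact-class is not guaranteed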



  • Here are some more hints:

    Bjarne was basing C++ not on two languages, but three, and wanted the syntax to resemble that of the youngest one, while using semantics closer to the oldest one, because he wanted to bolt them both onto the middle one.

    Java and C# could have avoided having it be an operator if the designers had really wanted to, because both from the beginning had a feature that C++ originally didn't, but it would have been awkward to do since the implementations of said feature were something of a handwave in both of them. Besides, the C++ syntax was familiar to a lot of programmers, and changing it would have been confusing, especially since it would look too much like something else (and actually would be, if the implementation were more literal).


  • I survived the hour long Uno hand

    @dkf said in Object-Oriented Programming is...:

    OK, I think I need to explain this better.

    That's how Javascript works, and it's disgusting. It leads to confusion, chaos, and wasted memory.


  • kills Dumbledore

    @ScholRLEA said in Object-Oriented Programming is...:

    Here are some more hints:

    How about just explaining it for those of us who don't have the knowledge of programming language history, or the time to go researching it? Case in point, I didn't even know C++ was based on as many as 2 languages. I thought it was all about extending C.


  • Discourse touched me in a no-no place

    @Yamikuronue said in Object-Oriented Programming is...:

    That's how Javascript works, and it's disgusting. It leads to confusion, chaos, and wasted memory.

    But Javascript doesn't really use classes as such. Or didn't; don't know what the most recent standards have bolted onto the Frankenstein…


  • Discourse touched me in a no-no place

    @Jaloopa I think one of the languages might have been Simula.


  • I survived the hour long Uno hand

    @dkf To get around that, people did exactly what you're explaining.

    Take Backbone for a widely used example. Backbone.Model is a "class object"; to make a specific model, you do var Book = Backbone.Model.extend({[options here]}). new becomes syntactic sugar for creating an object from that prototype, so you can do var twentyKLeagues = new Book(), but that Book is just an object itself, sitting around in memory, waiting to be used.



  • OK, fine, fine. Here's the deal.

    First off, you need to know that OOP originally came out of two unrelated ideas: the Actor model, which was something developed by Carl Hewitt at MIT around 1964 to study ways to simulate physical phenomena by encapsulating complex data structures into self-contained units running as separate processes (well, threads really, but that distinction hadn't arisen yet), and classes, which were developed by a pair of Norwegian professors named Nygaard and Dahl that same year as a way of writing simulations of complex real-world objects by modeling them as units that could be hierarchically categorized and had fixed interfaces.

    While the various Actors experiments were written as libraries and DSLs in Lisp (because, well, it was MIT and that's what you worked in when you were with Project MAC; the best known of these were Planner and Conniver), Nygaard's work took the form of a series of languages named Simula, which were implemented as a pre-processor over ALGOL-60. While they didn't have much direct impact, they did lead to two other development branches: Scheme, which was originally an experiment on how best to implement actors efficiently (among other things; the 'other things' were what led to Steele and Sussman noticing later that their implementations of actors and closures were virtually identical, leading to the discovery of the relation between the two seemingly unconnected ideas), and Smalltalk at Xerox PARC.

    And here is where things get a bit ironic, because Alan Kay wasn't setting out to create something for professional programmers to use, but rather something that non-programmers could understand; a large part of the goal of the Smalltalk project, and the PARC development projects in general, was to make the entire idea of professional application programming obsolete. He had previously worked on both Logo and Planner at MIT, and after reading about Simula, he connected Nygaard's class concept to the Actor model and realized it was something an average user - especially a child - could grasp more easily than something less structured, i.e., Dartmouth BASIC.

    He started off with a proof-of-concept language called Flex, but he didn't really hit his stride until he went to The Mother Of All Demos in the summer of 1968 with several other Stanford grad students who had just been hired by Xerox. To say that they were impressed with Engelbart's developments is an understatement; they immediately decided that this was The Future of Computing, and set to work on their own versions of bitmapped windows (which Kay had seen at MIT, after Corbato borrowed it from Engelbart a few years earlier, but the NLS team had significantly improved on their original idea since then), mice, and all the other shiny new toys Engelbart had amazed everyone with.

    Kay's role was to create a language that would allow casual users to automate the GUI, and he came up with a doozy: Smalltalk, a language with minimal syntax based around the idea that everything consisted of objects passing messages to each other. Since the 'everything is an object' principle extended to classes themselves, creating a new object was simply a case of passing a message to the class object for the kind of object you wanted to create:

    myInteger := Integer new: 5.
    

    This same syntax (with some minor special cases for things like blocks of code and infix mathematical operators) would be used for all method invocations, and since every class inherited the new: method from Object, and could specialize it as needed, that seemed to work fine. Because it was used mainly for automation, it had a lot of visual tools, including a variant on the Logo turtle called a pen, and most people who didn't know programming could work with it pretty quickly.

    The problem was, it was slow. Really slow, even on the custom, hand-wrapped Alto workstations they were building in the same office where Kay was working. And it was considered weird; this was still the era of punch cards and batch processing mainframes, so the whole idea of having individual workstations talking over a network and being used by ordinary people through a graphical interface was something no one outside of teams like NLS, Project MAC, Bell Labs, and Circle Graphics Habitat had even imagined before. So while it got a lot of press, most working programmers just scratched their heads over it and went back to work writing COBOL and FORTRAN programs.

    Now fate takes another twist, as a Dane named Bjarne Stroustrup gets a job at Bell Labs. Bjarne had happened to attend some guest lectures by Nygaard while he was attending Aarhus University, and used Simula for his doctoral dissertation at the University of Cambridge, so when he saw C and UNIX, he immediately realized that C, being an ALGOL derivative like Simula, could be preprocessed the same way to provide support for classes. Deciding that C was nice and all, but sort of primitive, he began writing the Cfront pre-processor for his own use, and soon others were using 'C with Classes' on their larger projects as well.

    Stroustrup was aware of Smalltalk, and liked the funky SVO syntax, but decided that having message dispatch and first-class class objects would be a bit much, so he compromised: messages would be simulated through function calls with an implicit this parameter, and objects would receive messages by applying the already existing structure member operators. To get around the need for a Class class, he created a special-purpose syntax that called the class's constructor through the name of the class. However, he needed to be able to do this with dynamically allocated memory as well, and since he wanted to avoid using automatic memory management (e.g., garbage collection), he decided to create an operator, new, for that purpose (and to combine the processes of allocation and construction, since he didn't want to have to use malloc() as a separate step).

    There's more to it, but that's the historical part as I understand it.



  • @ScholRLEA That's some interesting history.

    The thing that confuses me in these discussions is that I can't tell when people are talking about syntax issues or the bit manipulation that the machine actually does (or why these should be linked so closely). Why should it matter that new is an operator versus a method on an abstract class object? OOP criticism will use an example like

    Foo foo;
    foo.method(arg);
    

    being replaced with

    Foo foo;
    function(foo, arg);
    

    To which I say, "so?" The dot is replaced with a comma. What is gained or lost when a compiler can do whatever it wants with the characters typed into a source file? Why would

    myInteger := Integer new: 5.
    

    be slow and (I'm guessing)

    myInteger := new Integer: 5.
    

    be faster?



  • @MZH

    The syntax isn't what causes the slowdown. It's the semantics of the language, and the time it takes to build and query the data structures that do all the work to support the language semantics.
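
    Roughly what "query the data structures" means at each call site, as a simplified Python model (it ignores instance dicts and descriptors):

    def send(obj, name, *args):
        # obj.method(arg) makes the runtime do searches like this
        for klass in type(obj).__mro__:      # walk the class hierarchy
            if name in klass.__dict__:       # query each class's method table
                return klass.__dict__[name](obj, *args)
        raise AttributeError(name)

    class A:
        def method(self, arg):
            return arg + 1

    assert send(A(), 'method', 41) == 42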



    @MZH because the new keyword is giving the compiler information that a regular method call doesn't. Think about the semantics of a constructor call: first, space is allocated for the fields of the object being constructed, then each constructor is called in turn until the object is fully initialized. If you're pulling that out of the language definition and putting it all in userspace, you're making the language a lot harder to optimize in exchange for a syntactical consistency that I don't even see the benefit of in the first place. And let's be real, in a language this abstract, they're probably not even allocating struct space anyway; their objects are probably implemented as hashmaps or linked lists or some such.
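
    Python, for what it's worth, spells those steps out explicitly (and really does implement objects as hashmaps); a minimal sketch:

    class Base:
        def __init__(self):
            self.base_field = 0

    class Derived(Base):
        def __init__(self):
            super().__init__()       # each constructor runs in turn, base first
            self.derived_field = 1

    obj = Derived.__new__(Derived)   # step 1: allocate
    obj.__init__()                   # step 2: run the constructor chain
    print(obj.__dict__)              # {'base_field': 0, 'derived_field': 1}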



  • @MZH Actually, the remark about slow performance was about the Smalltalk-[76|80] implementation (and the Alto system software) in general, not that specific feature, and I only brought it up to mention how it affected acceptance of the language and the workstation. It wasn't really a technical issue with Smalltalk, as later Smalltalk implementations were able to iron out a lot of the performance issues.

    Smalltalk-80 ran slow in part because, realistically, all software of the time was slow, due to hardware limitations - remember, this was a time when Pascal was considered too non-performant for regular use, and assembly code in application software was still common even though it had been on the way out for over a decade. Also, it was a case of unfair comparisons, as the performance of a relatively small and unoptimized experimental workstation with experimental software running through a pseudomachine interpreter was being measured up against far larger production-quality minis and mainframes running highly tweaked code (usually written in assembly).

    However, the main reason it was perceived as being slow was because they implemented the message passing system in a fairly direct and literal manner, with the objects receiving the messages at run time and performing dispatch by evaluating the message itself first. While they eventually worked out improved ways of implementing this dispatch, the core mechanism was still basically a Lisp style eval-apply loop. Given that even Lisp is rarely implemented in this manner - both Common Lisp and Scheme are designed for compile-and-go, and most production implementations only use eval for performing code generated at runtime - the lack of a compiler was a major performance hit, even with the dispatch operation itself written in assembly and encoded as a bytecode operation in the pseudomachine.

    And before anyone asks, no, bytecode virtual machines were not invented for Smalltalk, and in fact were already well-trodden ground when Wirth used one for the Pascal p-machine in 1968. The first bytecode interpreters predate compilers and even assemblers, and were in use going back to the late 1940s, though those were mostly specialized floating-point interpreters used to simplify using the rather cumbersome FP libraries of the day (hardware FP wasn't really a thing yet) and reduce the memory footprint (important in a system with less than 1K of memory, a cycle speed in the low kilohertz, and an MTBF measured in hours).
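
    For anyone who hasn't met one, a toy Python sketch of what a bytecode interpreter boils down to (the opcodes are made up): opcodes are just numbers dispatched through a loop, with no compilation to machine code anywhere.

    PUSH, ADD, PRINT, HALT = range(4)
    code = [PUSH, 2, PUSH, 3, ADD, PRINT, HALT]

    def run(code):
        stack, pc = [], 0
        while True:
            op = code[pc]; pc += 1
            if op == PUSH:
                stack.append(code[pc]); pc += 1
            elif op == ADD:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == PRINT:
                print(stack.pop())
            elif op == HALT:
                return

    run(code)   # prints 5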


  • Discourse touched me in a no-no place

    @Captain said in Object-Oriented Programming is...:

    It's the semantics of the language, and the time it takes to build and query the data structures that do all the work to support the language semantics.

    It's particularly an issue for the sorts of types that less pure languages use primitives for. It turns out that it's really quite important to do well at handling arithmetic if you want speed, and arrays are fantastic for fast collections. This isn't to say that you can't use an object for everything at the abstract level, but rather that the implementation that you use needs to do different things for certain special types.
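
    One concrete example of that (a CPython implementation detail, not a language guarantee): integers are full objects to the programmer, but the implementation preallocates the small ones so arithmetic doesn't hammer the allocator.

    print(int('256') is int('256'))   # True: one shared, cached object
    print(int('257') is int('257'))   # False: outside the small-int cache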

    Modern compilers are much better than any preceding generation, with far fewer weird bugs and a greater range of safe optimisations.



  • @Buddy I don't know what you mean. If your language specification says that a method called new, init, [classname] or whatever is a constructor, then the compiler must know it too. It wouldn't be able to do anything with it if it didn't.



  • @Salamander this is what i thought we were talking about:

    @dkf said in Object-Oriented Programming is...:

    var FooBar = class.new({
        # Definitions go here; syntax not important for this discussion
    });
    

    ...

    Note that on that last example, neither class nor new are actually keywords. One is just a special value in the language (you typically have to have such things for other reasons) and the other is just a well-known method name.

    So they're specifically saying that the new method doesn't get any special treatment by the compiler, meaning it would presumably need to allocate its result by itself somehow.

    But when we're talking about cpp-style ctors, those do get special treatment by the compiler (and language), and behave differently from method calls. So, given that they have different semantics than method calls, it would be a poor stylistic decision to give them the same syntax as regular method calls.


  • Discourse touched me in a no-no place

    @Buddy said in Object-Oriented Programming is...:

    So they're specifically saying that the new method doesn't get any special treatment by the compiler, meaning it would presumably need to allocate its result by itself somehow.

    The method dispatch mechanism isn't required to be special in this case. The new method might be documented to do something that isn't otherwise available to the language (with the mechanism for this concealed) and it is entirely possible that the class of classes itself is something that can't be constructed within the language itself. But then that's really what C does with malloc(), so there's no real change there; that's how you dynamically allocate memory, yet it is “just” a function.

    Error handling would be a separate matter, as is enforcement of a non-null result. That sort of thing is actually better off being available as general capabilities; no reason for new to be very special there at all.
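
    A sketch of that arrangement in Python (Meta is made up for the example): new is a perfectly ordinary method on the class object, and only the built-in machinery it delegates to can actually allocate, much as user code bottoms out in malloc().

    class Meta(type):
        def new(cls, *args, **kwargs):
            # ordinary method dispatch; allocation itself stays concealed
            return cls(*args, **kwargs)

    class FooBar(metaclass=Meta):
        pass

    fred = FooBar.new()              # plain instance-method call on the class
    print(isinstance(fred, FooBar))  # True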

    when we're talking about cpp-style ctors

    Chase off into the weeds if you must…



  • @dkf said in Object-Oriented Programming is...:

    The new method might be documented to do something that isn't otherwise available to the language (with the mechanism for this concealed)

    Why?


  • Discourse touched me in a no-no place

    @Buddy said in Object-Oriented Programming is...:

    Why?

    Why “Why?”?



  • Why not “Why?”?



  • I only have one complaint about POOOOP: it's always taught as classes representing, well, classes (types) of objects, and objects representing real or logical things. You know that example with class Dog; Dog Spot = new Dog(); Spot.bark().

    Then in the real world, you find out classes are commonly used to group any bunch of functions or data, even when they don't represent a class of objects.

    For example, I remember a Python module to access the reddit.com API that started with "class Reddit: ..." Really? So I can create my own reddits now? Complete with users and posts? And they fit entirely in my computer's memory? No, what happened is for some reason you used a class as a way to group variables and functions.



    @anonymous234 the way I picture that is that the object is backed not by your computer's memory, but by the real-world thing it's modelling. Like, in JOP (the Java Optimized Processor), the way you write a device driver is you create a regular class, but pin some of the fields to specific locations in memory (which will be mapped to ports on the device). So a user can instantiate that class, and read and write to the device by reading from or assigning the fields of that object (or preferably, call methods that would do the bit-banging for you).
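
    A hypothetical Python sketch of the same idea (mmio_read/mmio_write and the register address are all made up; real memory-mapped I/O would use whatever primitive the platform provides):

    _fake_device = {}                    # stub backing store, just for the sketch

    def mmio_read(addr):
        return _fake_device.get(addr, 0)

    def mmio_write(addr, value):
        _fake_device[addr] = value

    DATA_ADDR = 0xFF04                   # made-up device register

    class UartDevice:
        @property
        def data(self):
            return mmio_read(DATA_ADDR)  # reading the field queries the device

        @data.setter
        def data(self, value):
            mmio_write(DATA_ADDR, value) # assigning the field pokes the device

    uart = UartDevice()
    uart.data = 0x41                     # looks like a plain field assignment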



    @anonymous234 There are a number of issues there: One, people generally don't understand OOP, particularly because it's taught wrong. They're taught what a hierarchy is, with the typical example of class Animal being inherited by class Dog and class Cat, but not really why you'd want a hierarchy or how to use them. So people go off and try to model the real thing, instead of focusing on what the needs of the program are. For example, if you are writing software for a veterinary clinic, you might not care that Dogs and Cats are Animals. You certainly don't need to have a bark method in Dog. But people will go and do it, because they were taught that OOP models real things as objects.

    Another thing is that using classes doesn't mean you're writing OOP code. If you just think of a class as a type, ignoring all the hierarchical relationships and polymorphic behavior, you can use classes to make more expressive procedural code, package related bits of data to cut down on the number of arguments, etc. That's not really bad in itself.
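
    For instance, in Python (RenderOptions is just an illustrative name):

    from dataclasses import dataclass

    # a class used purely as a type: no hierarchy, no polymorphism, just a
    # named bundle that cuts a long argument list down to one parameter
    @dataclass
    class RenderOptions:
        width: int
        height: int
        antialias: bool = True

    def render(scene, opts):
        print(f"rendering {scene} at {opts.width}x{opts.height}")

    render("teapot", RenderOptions(640, 480))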

    And then of course, you have the over-engineered OOP approach to trivial things, where you just want to write "Hello world!" to the console and they create an OutputSelector, MessageSelector, MessageFormatter, OutputWriter, and five hundred more classes where a single call would have sufficed.


  • Discourse touched me in a no-no place

    @Kian said in Object-Oriented Programming is...:

    Another thing is that using classes doesn't mean you're writing OOP code. If you just think of a class as a type, ignoring all the hierarchical relationships and polymorphic behavior, you can use classes to make more expressive procedural code, package related bits of data to cut down on the number of arguments, etc. That's not really bad in itself.

    Yes/no. I agree with most of what you're saying, but it helps to think of classes not as types, but rather as stereotypes. A class defines a particular stereotypical variety of object, a pattern of interactions, that sort of thing. It's then often convenient to form a type from that so that references to the instances can have their interaction patterns moulded by what the object should be, though that is definitely not actually necessary for an OOP system, especially in systems where objects are capable of responding to messages that aren't described by their class stereotype.

    Not that that's a universal capability of all object systems.
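
    Python is one system where an object can field messages its class stereotype never declared; a small sketch:

    class Proxy:
        def __getattr__(self, name):
            # intercepts any message the class itself doesn't describe
            def handler(*args):
                return f"got message {name!r} with {args}"
            return handler

    print(Proxy().frobnicate(42))   # no frobnicate method exists anywhere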


  • Winner of the 2016 Presidential Election

    @Kian said in Object-Oriented Programming is...:

    One, people generally don't understand OOP, particularly because it's taught wrong.

    QFT



  • @asdf said in Object-Oriented Programming is...:

    @Kian said in Object-Oriented Programming is...:

    One, people generally don't understand OOP, particularly because it's taught wrong.

    QFT

    I concur. So very, very much so.

    (Though I should probably say more than just 'AOL!' when I have the chance.)



  • @Kian said in Object-Oriented Programming is...:

    @anonymous234 There are a number of issues there: One, people generally don't understand OOP, particularly because it's taught wrong. They're taught what a hierarchy is, with the typical example of class Animal being inherited by class Dog and class Cat, but not really why you'd want a hierarchy or how to use them. So people go off and try to model the real thing, instead of focusing on what the needs of the program are. For example, if you are writing software for a veterinary clinic, you might not care that Dogs and Cats are Animals. You certainly don't need to have a bark method in Dog. But people will go and do it, because they were taught that OOP models real things as objects.

    I'm sure I was taught OOP using the typical animal analogy (or something like it), but I grasped that it was for grouping programming pieces (methods / properties) rather than real world things.

    So, I'm not so sure it's the way it's being taught so much as it is the students not being able to grasp that programming doesn't strictly involve real world things.

    @Kian said in Object-Oriented Programming is...:

    And then of course, you have the over-engineered OOP approach to trivial things, where you just want to write "Hello world!" to the console and they create an OutputSelector, MessageSelector, MessageFormatter, OutputWriter, and five hundred more classes where a single call would have sufficed.

    There are languages that do this?

    I mean, even in verbose languages like Java and .NET, it's a single method call for printing formatted messages to the console (System.out.printf or Console.Write for the previously mentioned languages).

    Edit: Or did you mean the people using the language?



  • @powerlord said in Object-Oriented Programming is...:

    Edit: Or did you mean the people using the language?

    I meant the people. The example of writing to the console was just an example of something trivial that can be made more enterprisey.


  • BINNED

    The argument against OOP kind of makes sense but only if your language does not let you mix in some procedural code. I see the problem if I have to stick to pure OOP, but same is true for pure functional or pure procedural programming.
    I like to use OOP+libraries, and find this approach extremely clean in Python. Libraries are separate pieces of code that do not get entangled with my object hierarchy to make it unnecessarily convoluted, and can be purely procedural. If I see that some code can be cleanly written using a function, I write a function in a module/library, and reserve objects/classes for things that need state. This also means I rarely use static methods, unless there is some metaclass magic.
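
    A minimal sketch of that split (the names are made up):

    # geometry.py - stateless helpers live in a plain module
    def area(width, height):
        return width * height

    # classes are reserved for things that actually carry state
    class Rectangle:
        def __init__(self, width, height):
            self.width = width
            self.height = height

        def area(self):
            return area(self.width, self.height)   # leans on the library function

    print(Rectangle(3, 4).area())   # 12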



  • @Kian said in Object-Oriented Programming is...:

    @powerlord said in Object-Oriented Programming is...:

    Edit: Or did you mean the people using the language?

    I meant the people. The example of writing to the console was just an example of something trivial that can be made more enterprisey.

    Ooh, so "enterprisey" can now also refer to something over-engineered. I always thought that "enterprisey" meant something hacked together to fulfill baseless needs demanded by those damn sales people.