Super constructor failure antipattern?



  • I've noticed this pattern in the auto-generated init methods (known as "constructors" in sane programming languages) in our Objective-C codebase.

    -(id)init
    {
        self = [super init];
        if (self)
        {
            // Initialize instance variables.
        }
        return self;
    }

    Or here's the Java/C# equivalent:

    public class MyClass extends Object
    {
        public MyClass()
        {
            if (this != null)
            {
                //Initialize instance variables.
            }
        }
    }

    I really have to wonder what the point is. I've been taking the null checks out and have told the rest of my team to do the same, because it seems like a stupid waste of a few instructions. Even if we left them in, lots of odd errors would occur if the NSObject superconstructor somehow failed: either we leave the null check in, skip initializing the instance variables, and get a messed-up object that causes failures in the app, or we don't null check and the app fails because an invalid pointer gets dereferenced.
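
    To spell out the two failure modes (a sketch; "count" is a hypothetical instance variable):

    // With the check: if [super init] somehow returns nil, the body is
    // skipped, the ivars stay unset, and the caller just gets nil back.
    -(id)init
    {
        self = [super init];
        if (self)
        {
            count = 42; // never runs when self is nil
        }
        return self;
    }

    // Without the check: "count = 42" compiles to "self->count = 42",
    // so a nil self means writing through an invalid pointer.
    -(id)init
    {
        self = [super init];
        count = 42; // crashes if [super init] returned nil
        return self;
    }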



  • I would ask one of them under what circumstances they expect the initializer to be called when the object was not null. Figure out the source of the misconception instead of fixing the symptoms.



  • @blakeyrat said:

    I would ask one of them under what circumstances they expect the initializer to be called when the object was not null. Figure out the source of the misconception instead of fixing the symptoms.
     

    Ask Apple? This is boilerplate code autogenerated by Xcode.



  • Perhaps Objective-C is so terrible that sometimes the constructor just fails. I wouldn't doubt it.

    Or maybe it's there to catch cases where the machine runs out of memory and can't create the object.



  • Gosh, every time I see a piece of Objective-C code, I feel like gouging my eyes out of my face.

    Anyway... Apparently there IS a reason for that. Although I still don't see the reasoning behind it returning null. Or rather, calling the constructor of null. But hey, I'm not an Obj-C programmer...



  • @LegacyCrono said:

    Anyway... Apparently there IS a reason for that.

    So...instead of throwing an exception if you can't find a file, Objective-C just sets the created object to null (nil, whatever) and returns it? Damn that's terrible.



  • @lettucemode said:

    So...instead of throwing an exception if you can't find a file, Objective-C just sets the created object to null (nil, whatever) and returns it? Damn that's terrible.

    If you're not using GC, it's the only way to keep it from leaking non-managed resources. It's basically similar to the reason why exceptions suck in C++.
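
    A sketch of what I mean (hypothetical names, manual retain/release assumed): if the initializer threw instead of returning nil, the exception would unwind past every frame that could clean up.

    -(id)initWithPath:(NSString *)path
    {
        self = [super init];
        if (self)
        {
            buffer = malloc(1024);      // unmanaged resource (hypothetical ivar)
            if (![self parseFile:path]) // hypothetical helper
            {
                // This unwinds straight out of init: nothing releases the
                // half-built self, and the malloc'd buffer leaks with it.
                [NSException raise:@"ParseError" format:@"can't parse %@", path];
            }
        }
        return self;
    }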



  • @lettucemode said:

    @LegacyCrono said:
    Anyway... Apparently there IS a reason for that.

    So...instead of throwing an exception if you can't find a file, Objective-C just sets the created object to null (nil, whatever) and returns it? Damn that's terrible.

    That was my thought initially... So rather than having an exception thrown (remember, exceptions are for exceptional situations), it will silently swallow the failure and you're still screwed, just two lines further down in your code... Makes perfect sense!
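
    For the non-Obj-C crowd, here's a sketch of what "two lines further down" looks like (assume path holds a file name). Messaging nil is a silent no-op, so the failure just propagates:

    // initWithContentsOfFile: hands back nil when the file can't be read...
    NSString *contents = [[NSString alloc] initWithContentsOfFile:path
                                                         encoding:NSUTF8StringEncoding
                                                            error:NULL];
    // ...and every later message to nil quietly answers 0 or nil:
    NSUInteger length = [contents length];        // 0: no error, no crash
    NSString *upper = [contents uppercaseString]; // nil, and it keeps spreading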



  • @morbiuswilters said:

    If you're not using GC, it's the only way to keep it from leaking non-managed resources. It's basically similar to the reason why exceptions suck in C++.

    Can you elaborate?



  • @morbiuswilters said:

    If you're not using GC, it's the only way to keep it from leaking non-managed resources. It's basically similar to the reason why exceptions suck in C++.

    I vaguely remember something about throwing exceptions from constructors and then not being able to delete the object. I also remember a horrible error in the VC runtime that did throw exceptions during construction when the app ran out of memory, plus an exception handler that duplicated the message into a string object. That doesn't go very well when there's no memory left.

    But the reason for this code in Cocoa is that these objects can be constructed by the GUI environment while it reads the nib files, and that code expects a null rather than an exception, if I remember correctly. And I don't think the one or two cycles you lose there should cause you any loss of sleep; the thought of impossible crashes that can take days or weeks to resolve, on the other hand, should.


  • @morbiuswilters said:

    If you're not using GC, it's the only way to keep it from leaking non-managed resources. It's basically similar to the reason why exceptions suck in C++.
     

    Have you heard of RAII and destructors?

    (Not that I'd dispute that exceptions suck in C++. They always give me a tingly feeling that I've missed one of the billion things to worry about.)



  • @mott555 said:

    I've been taking the null checks out and have told the rest of my team to do the same, because it seems like a stupid waste of a few instructions.

    Yet in doing so you are going against the entire body of Objective-C programming standards. That makes you TRWTF in this case, for not understanding why the checks are there in the first place and for causing issues down the line when your company hires programmers who write Objective-C like 99.99% of other programmers do.

    BTW a better explanation of this pattern is The how and why of cocoa initializers
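
    The short version of what that article says (a sketch; the ivar and the failure case are made up): [super init] may return nil on failure, or even a different object from the one it was sent to (class clusters do this), which is why you both assign and check self. And when your own initialization can fail, the convention is to clean up and return nil rather than throw:

    -(id)initWithCapacity:(NSUInteger)capacity
    {
        // Capture whatever the superclass hands back: it may be nil,
        // or a substitute instance.
        self = [super init];
        if (self)
        {
            if (capacity == 0) // hypothetical failure condition
            {
                [self release]; // manual retain/release cleanup
                return nil;     // caller sees nil, per the Cocoa convention
            }
            items = [[NSMutableArray alloc] initWithCapacity:capacity];
        }
        return self;
    }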



  • @topspin said:

    Have you heard of RAII and destructors?

    Yeah, we read this forum. Since RAII comes up EVERY FUCKING TIME someone so much as mentions C++ in passing, yes we've fucking heard of it. The surprise would be a discussion of C++ without 47 idiots bringing up RAII.

    Like in this example, it's always cited with a hint of desperation, kind of like, "ok I know we're morons for programming in a shitty-ass language, but here's a technique that makes C++ almost one-fifth as good as C# maybe if you tilt your head and squint!"


  • @blakeyrat said:

    yes we've fucking heard of it

    That was meant to be more of a rhetorical question.

    But even if you don't like it, it is kind of the best answer for dealing with unmanaged resources when you can have exceptions.



  • @morbiuswilters said:

    It's basically similar to the reason why exceptions suck in C++.

    Exceptions in C++ have always seemed to me a bit of an afterthought. After all, you can write something like throw 5 and a bit further on catch an integer.



  • @OzPeter said:

    BTW a better explanation of this pattern is The how and why of cocoa initializers

    Jesus fuck WHAT THE FUCK IS THIS SHIT. Constructors that aren't constructors? Nil checks instead of exceptions? "The superclass initializer could return a different instance of your class"? Even PHP, the biggest joke language that's ever existed, has a better, more logical concept of classes than this.



  • @Severity One said:

    @morbiuswilters said:

    It's basically similar to the reason why exceptions suck in C++.

    Exceptions in C++ have always seemed to me a bit of an afterthought. After all, you can write something like throw 5 and a bit further on catch an integer.

    Actually, the constraint to use a common base class (such as System.Exception in .NET) imposes some limitations. The ability to throw "anything" can easily be abused, but it also doesn't impose constraints.



  • @blakeyrat said:

    Like in this example, its always cited with a hint of desperation, kind of like, "ok I know we're morons for programming in a *****-***language, but here's a technique that makes C++ almost one-fifth as good as C# maybe if you tilt your head and squint!"

    1) This post has been about Objective-C, which is not C++.
    2) For every use case where C# has an advantage over C++, there is a use case where the inverse is true. At a number of conferences, I have sat on either side of the "debate", and in one case sat on opposite sides during two sessions [that got some "double-takes" from people who attended both sessions...]



  • @TheCPUWizard said:

    @Severity One said:
    Exceptions in C++ have always seemed to me a bit of an afterthought. After all, you can write something like throw 5 and a bit further on catch an integer.

    Actually, the constraint to use a common base class (such as System.Exception in .NET) imposes some limitations. The ability to throw "anything" can easily be abused, but it also doesn't impose constraints.

    Sure, but what's the point? C doesn't place any restrictions on you trashing your memory; I wouldn't call that an advantage.

    This 'restriction' is that an exception is a full-blown class with a couple of well-defined methods. I don't know .NET, but I would imagine it's similar to Java's java.lang.Exception, which lets you cascade exceptions and produce a stack trace.

     



  • @The_Assimilator said:

    <stryhf> why do people persist in continuing to write SHIT APPLICATIONS IN JAVA
    <stryhf> writing SHIT in JAVA doesn't make it ANY LESS SHITTY
    <stryhf> IT JUST MAKES IT SLOW ASS SHIT

    Funny you should say that. The other day, I wrote a trie in Java that is optimised for indexing by an array of integers as a prefix. So if you give it an international phone number, it'll tell you what country it belongs to, but you could also use it for ASN.1 or anything else that produces a prefix that can be treated as an array of integers. Obviously, it's completely generic, so you can choose any kind of class for both key and value.

    After running it a bit through a profiler, I ended up with 70 million lookups per second on a Core i7 860 (2.8 GHz); 40 million if it is thread-safe. This was with a two-plus-three level deep search, which is the only factor that determines its speed.

    No doubt it could be made doubleplusfast in C, C++ or some other 733t language, but guess what? Nobody cares! Because the database lookups that accompany this sort of thing are much slower.



  • @Severity One said:

    After running it a bit through a profiler, I ended up with 70 million lookups per second on a Core i7 860 (2.8 GHz); 40 million if it is thread-safe. This was with a two-plus-three level deep search, which is the only factor that determines its speed.

    No doubt it could be made doubleplusfast in C, C++ or some other 733t language, but guess what? Nobody cares! Because the database lookups that accompany this sort of thing are much slower.

    Then you put a GUI on it, and suddenly it's slower than molasses and uglier than mole asses.

    As long as you're writing uber-nerd programs you don't expect actual human beings to use, I'm sure Java performs just fine. The problem shows up when you decide to make a futile attempt to make a Java program usable. Usable UIs are apparently Java's kryptonite.



  • @TheCPUWizard said:

    @blakeyrat said:

    Like in this example, its always cited with a hint of desperation, kind of like, "ok I know we're morons for programming in a hunter-twolanguage, but here's a technique that makes C++ almost one-fifth as good as C# maybe if you tilt your head and squint!"

    (something)
     

    I don't get why blakey's post and your quoting of it were different?


     



  • @blakeyrat said:

    As long as you're writing uber-nerd programs you don't expect actual human beings to use, I'm sure Java performs just fine.

    Well, sure, but the thing is, I get paid to write über-nerd programs that aren't used by actual humans. It's all web services and back-ends for me. The front-ends are off-the-shelf, or outsourced to people that like to use some Microsoft version of a language that I stopped using after the age of 15, around the time that Miami Vice was in its second season.

     



  • @Severity One said:

    @blakeyrat said:

    As long as you're writing uber-nerd programs you don't expect actual human beings to use, I'm sure Java performs just fine.

    Well, sure, but the thing is, I get paid to write über-nerd programs that aren't used by actual humans. It's all web services and back-ends for me. The front-ends are off-the-shelf, or outsourced to people that like to use some Microsoft version of a language that I stopped using after the age of 15, around the time that Miami Vice was in its second season.

     

    You probably should have paid attention to the other part of his post, the part where he makes the not unreasonable claim that Java GUI toolkits are, to paraphrase, absolute garbage.



  • @Severity One said:

    @blakeyrat said:
    As long as you're writing uber-nerd programs you don't expect actual human beings to use, I'm sure Java performs just fine.

    Well, sure, but the thing is, I get paid to write über-nerd programs that aren't used by actual humans. It's all web services and back-ends for me. The front-ends are off-the-shelf, or outsourced to people that like to use some Microsoft version of a language that I stopped using after the age of 15, around the time that Miami Vice was in its second season.

    Yeah but... who gives a shit? That says nothing about Java. Any language can do uber-nerd back-end stuff equally well. There's no differentiation in that market.

    BTW, even if you're ok with Java GUI toolkits being shit, doesn't it bother you that Java doesn't have a single IDE that isn't shit? Or, let me guess, you're too macho to use an IDE, you just code directly from VIM and debug using-- oh wait you're too macho to debug.



  • @blakeyrat said:

    BTW, even if you're ok with Java GUI toolkits being shit, doesn't it bother you that Java doesn't have a single IDE that isn't shit?

    But there's Ecli--

    Fuck it, I couldn't follow through on it.



  • @Severity One said:

    Sure, but what's the point? C doesn't place any restrictions on you trashing your memory; I wouldn't call that an advantage.

    As with nearly everything, it depends on circumstances. While .NET makes it fairly easy to generate code at runtime, it is virtually impossible to write self-modifying code. "Trashing your memory" in a controlled fashion allows for this. It's all a set of trade-offs rather than a set of absolutes.



  • @Zemm said:

    @TheCPUWizard said:

    @blakeyrat said:

    Like in this example, its always cited with a hint of desperation, kind of like, "ok I know we're morons for programming in a hunter-twolanguage, but here's a technique that makes C++ almost one-fifth as good as C# maybe if you tilt your head and squint!"

    (something)
     

    I don't get why blakey's post and your quoting was different?

    Not sure where "hunter-too" came from, but yes, I did censor the quote as I see no value in the profanity, so common these days on forums such as this.



  • @Severity One said:

    @TheCPUWizard said:
    @Severity One said:
    Exceptions in C++ have always seemed to me a bit of an afterthought. After all, you can write something like throw 5 and a bit further on catch an integer.

    Actually, the constraint to use a common base class (such as System.Exception in .NET) imposes some limitations. The ability to throw "anything" can easily be abused, but it also doesn't impose constraints.

    Sure, but what's the point? C doesn't place any restrictions on you trashing your memory; I wouldn't call that an advantage.

    This 'restriction' is that an exception is a full-blown class with a couple of well-defined methods. I don't know .NET, but I would imagine it's similar to Java's java.lang.Exception, which lets you cascade exceptions and produce a stack trace.

    Presumably, the point is that in C++, throw is a language construct, whereas exception classes are purely a library construct. Coming as it does from a C philosophy, it wouldn't make sense to have those tightly coupled unless it were absolutely necessary. Managed languages do it differently.


  • @blakeyrat said:

    BTW, even if you're ok with Java GUI toolkits being shit, doesn't it bother you that Java doesn't have a single IDE that isn't shit? Or, let me guess, you're too macho to use an IDE, you just code directly from VIM and debug using-- oh wait you're too macho to debug.

    Funny you should say that, because that's actually how I started out, about 12 years ago, using vim and hand-building ant scripts. For some reason, my fellow developers had some problems adjusting to it.

    These days, Java IDEs are much better than they were back then, and I'm using Netbeans, which is as fine an IDE as one could wish for. Well, provided you don't mind excessive memory usage, one part saying that your code has syntax errors while another part says it doesn't (solution: stop Netbeans, clear the 50+ MB cache and start it again), proxy settings that used to be local but are now global so that working behind a firewall becomes troublesome, and the habit of re-introducing bugs that had been fixed in a previous version.

     



  • @Severity One said:

    These days, Java IDEs are much better than they were back then, and I'm using Netbeans, which is as fine an IDE as one could wish for. Well, provided you don't mind excessive memory usage, one part saying that your code has syntax errors while another part says it doesn't (solution: stop Netbeans, clear the 50+ MB cache and start it again), proxy settings that used to be local but are now global so that working behind a firewall becomes troublesome, and the habit of re-introducing bugs that had been fixed in a previous version.

    ... oookay. I can't tell if you're trying to be funny or genuinely of the opinion that Netbeans is good despite the long list of stuff wrong with it.

    I can tell you I've tried it, and I found something like 6 deal-breaker bugs in the first couple minutes:

    @Me said:

    * Wherever the hell Aptana is getting its wrong-ass paths from, Netbeans is using the same wrong-ass paths. Right here, that's a deal-breaker (but let's keep going)

    * There's a mysterious red line down the middle of the window.... why? Also WTF?

    * There's a bar to the right of the scroll bar which contains mysterious orange blobs. Clicking an orange blob takes me to a line of code with a message on the left, signified by a yellow lightbulb. The message is "code has no side effects." Uh... thanks? Also WTF? But not having the scrollbar on the right destroys my muscle-memory and is another deal-breaker.

    * "Print to HTML"... seriously? The word "Export" or "Save" doesn't exist in their little world?

    * Setting the font to Consolas 14pt doesn't actually set it to 14pt. It looks like the size is hard-coded to 12pt, even though it lets you "change" it. Deal-breaker, I can't read this tiny text comfortably.



  • @blakeyrat said:

    I can tell you I've tried [Netbeans], and I found something like 6 deal-breaker bugs in the first couple minutes...


    Netbeans is truly awful. Among my gripes is that what you get from the WYSIWYG GUI builder is not what you see in the editor, and because of the ridiculous layout it creates, the Java generated by the tool is damn near impossible to tweak.


  • @TheCPUWizard said:

    Not sure where "hunter-too" came from, but yes, I did censor the quote as I see no value in the profanity, so common these days on forums such as this.

    Yeah, that old password joke/troll wasn't funny even when it was new.



  • @PedanticCurmudgeon said:

    @TheCPUWizard said:

    Not sure where "hunter-too" came from, but yes, I did censor the quote as I see no value in the profanity, so common these days on forums such as this.

    Yeah, that old password joke/troll wasn't funny even when it was new.

    I find this very ironic coming from the guy who is always open to chucking out the old "your not too smart" line in the comments section. :)



  • @TheCPUWizard said:

    @Severity One said:

    Sure, but what's the point? C doesn't place any restrictions on you trashing your memory; I wouldn't call that an advantage.

    As with nearly everything, it depends on circumstances. While .NET makes it fairly easy to generate code at runtime, it is virtually impossible to write self-modifying code. "Trashing your memory" in a controlled fashion allows for this. It's all a set of trade-offs rather than a set of absolutes.

    Depending on exactly what you mean by "self-modifying", you could have a piece of code that stores a modified version of itself in a delegate. Heck, if you were masochistic enough, you could use reflection to decompile your own source, make changes, and re-emit it, and then store that in the same delegate.



  • @C-Octothorpe said:

    ... chucking out the old "your not too smart" line in the comments section. :)

    Your not too smart.... brother-in-law???



  • @pkmnfrk said:

    @TheCPUWizard said:

    @Severity One said:

    Sure, but what's the point? C doesn't place any restrictions on you trashing your memory; I wouldn't call that an advantage.

    As with nearly everything, it depends on circumstances. While .NET makes it fairly easy to generate code at runtime, it is virtually impossible to write self-modifying code. "Trashing your memory" in a controlled fashion allows for this. It's all a set of trade-offs rather than a set of absolutes.

    Depending on exactly what you mean by "self-modifying", you could have a piece of code that stores a modified version of itself in a delegate. Heck, if you were masochistic enough, you could use reflection to decompile your own source, make changes, and re-emit it, and then store that in the same delegate.

    If I read you right, then both of those involve executing NEW code (i.e. emitting a new code block that gets executed). This can have a high overhead, whereas doing something like changing a single op-code in the middle of a block of code to change the functionality would not. This is the distinction I was making in my previous comment (see quoted text above). I did an audio processing application (real-time, 128+ channels) that made heavy use of this ability to change filter algos on the fly and meet the true real-time demands.


  • @C-Octothorpe said:

    @PedanticCurmudgeon said:
    @TheCPUWizard said:

    Not sure where "hunter-too" came from, but yes, I did censor the quote as I see no value in the profanity, so common these days on forums such as this.

    Yeah, that old password joke/troll wasn't funny even when it was new.
    I find this very ironic coming from the guy who is always open to chucking out the old "your not too smart" line in the comments section. :)

    Sometimes the old memes are best. It seems to really throw off the newbies, though.



  • @blakeyrat said:

    I can't tell if you're trying to be funny or genuinely of the opinion that Netbeans is good despite the long list of stuff wrong with it.

    I don't need to try to be funny, because I'm rather witty, even if I say so myself.

    @blakeyrat said:

    * Wherever the hell Aptana is getting its wrong-ass paths from, Netbeans is using the same wrong-ass paths. Right here, that's a deal-breaker (but let's keep going)

    Never used Aptana, don't know what you mean by "wrong-ass paths".

    @blakeyrat said:

    * There's a mysterious red line down the middle of the window.... why? Also WTF?

    That's 80 characters. People in Africa have to deal with screens that are no more than 80 characters wide, and anyway, for über-nerds it's good to know that they can wrap lines before the red line, when their code is inevitably taken to a Unix command line.

    @blakeyrat said:

    * There's a bar to the right of the scroll bar which contains mysterious orange blobs. Clicking an orange blob takes me to a line of code with a message on the left, signified by a yellow lightbulb. The message is "code has no side effects." Uh... thanks? Also WTF? But not having the scrollbar on the right destroys my muscle-memory and is another deal-breaker.

    Perhaps you missed the wiggly lines that accompany source code WTFs? The lightbulb is to fix code, not to highlight it. I've never seen "code has no side-effects", but then again I find writing PHP beneath my dignity.

    @blakeyrat said:

    * "Print to HTML"... seriously? The word "Export" or "Save" doesn't exist in their little world?
    Export to what? Excel?

    @blakeyrat said:

    * Setting the font to Consolas 14pt doesn't actually set it to 14pt. It looks like the size is hard-coded to 12pt, even though it lets you "change" it. Deal-breaker, I can't read this tiny text comfortably.

    Works for me.

    There's plenty wrong with Netbeans, but then again there's plenty wrong with any Java IDE out there. From what I hear, Visual Studio is way better, but I've never liked the look of it (it looks like it comes from the early nineties), and it implies working with Microsoft's APIs – no thank you.

     



  • @Severity One said:

    From what I hear, Visual Studio is way better, but I've never liked to look of it (like it comes from the early nineties), and it implies working with Microsoft's APIs – no thank you.

    Visual Studio 2010 is great if you do .NET development (web or Windows); the IDE has been redesigned with an advanced presentation-layer technology (WPF) that is very fluid, and all the features like code completion are snappy (except Help, which is now broken like in all Microsoft products). If you work in C++, it's very bad. In fact, it's one of the most requested features on Microsoft's wishlist (Connect): have two separate products for .NET and C++.

    Of course, now that the product was getting somewhat stable and mainstream, they are moving to something else with Metro (Windows 8), where regular .NET applications look like a retarded cousin.

