Java degree desirable



  • http://www.theitjobboard.co.uk/IT-Job/Java-Developer-MidLevel/8711552/en/

    Desirable skills/qualities include:
    • 2-3 years experience of Java.
    • A 2:1 degree in an IT Related discipline (preferably Java).
    • Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.

    I don't know what's worse: the fact that they'd prefer it if I had a "Java degree", or the fact that there was so much Java in my "Computer Science degree" that it probably fulfills this.


  • ♿ (Parody)

    Aren't they just being honest about the state of typical CS education today?



  •  For the benefit of folks outside the UK, "2:1" is roughly equivalent to "bachelor's degree with minimum 3.0 GPA" as I understand it.

     



  •  you mean bac+3 ?



  • @GNU Pepper said:

    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.

    I like how they list out languages and frameworks in the same 'breath' as a development methodology ... or do they think there's an "Agile" plugin for Eclipse/NetBeans?



  • I haven't seen a true Computer Science degree in decades (below the Masters or Doctorate level).

    I do have fun with candidates who focus on their degree though. Have them explain what a Turing Machine is and why it is relevant to some real world situations, or explain the ramifications of multi-level cache in a multi-core or multi-processor situation.

    The most common (>90%) effect is a deer-in-the-headlights type stare.



  • @zelmak said:

    @GNU Pepper said:
    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.
    I like how they list out languages and frameworks in the same 'breath' as a development methodology

    Meaning what? Why is this an issue?

    If I say, "knowledge of cats and cars", do you assume cats are the same thing as cars? Is that the problem?

    Look, maybe I'm the space alien. I don't understand what anybody on this forum is talking about.


  • Trolleybus Mechanic

    @blakeyrat said:

    @zelmak said:
    @GNU Pepper said:
    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.
    I like how they list out languages and frameworks in the same 'breath' as a development methodology

    Meaning what? Why is this an issue?

     

    It implies the employer doesn't know the difference between a concept, a methodology and a specific implementation.

    Knowledge of "carpentry / Frank Gehry / hammers / DeWalt power drills"

    Then you expect the person hiring you to say: "I need you to build a Frank Gehry house using a DeWalt carpentry."

    Versus: "We're building a three story townhouse using the style and philosophy of Frank Gehry. There's a lot of woodwork involved, so you have to have a lot of hands-on carpentry experience, but we won't expect you to do any roofing. Also, we have an exclusive contract with DeWalt tools. I see you know how to use power tools, but just be aware they have a specific requirement for changing the battery on all cordless drills."

    Or to use a car analogy, imagine someone said their harddrive isn't making noise, so it must be crashing because it was hacked, when in reality, the power cord is unplugged from their computer case.



  • @TheCPUWizard said:

    I haven't seen a true Computer Science degree in decades (below the Masters or Doctorate level).

    I do have fun with candidates who focus on their degree though. Have them explain what a Turing Machine is and why it is relevant to some real world situations, or explain the ramifications of multi-level cache in a multi-core or multi-processor situation.

    The most common (>90%) effect is a deer-in-the-headlights type stare.

    I have a lowly bachelor's degree in computer science, and I'd be able to answer these questions. Lo and behold, I must be one of the 10%!

    But I learned about Turing Machines and processor caching many years after uni.



  • @Lorne Kates said:

    @blakeyrat said:

    @zelmak said:
    @GNU Pepper said:
    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.
    I like how they list out languages and frameworks in the same 'breath' as a development methodology

    Meaning what? Why is this an issue?

    It implies the employer doesn't know the difference between a concept, a methodology and a specific implementation.

    But that's coming from YOUR BRAIN, not their text. It's the exact same discussion as in the other thread.



  • @blakeyrat said:

    @Lorne Kates said:

    @blakeyrat said:

    @zelmak said:
    @GNU Pepper said:
    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.
    I like how they list out languages and frameworks in the same 'breath' as a development methodology
    Meaning what? Why is this an issue?
    It implies the employer doesn't know the difference between a concept, a methodology and a specific implementation.
    But that's coming from YOUR BRAIN, not their text. It's the exact same discussion as in the other thread.

    Yes, it's based on assumptions that we are making and not on things explicitly stated, which can bite you in the ass, but the ability to make assumptions that are "reasonable" (don't feel like getting in a debate on what is and isn't reasonable) is something most will assume others have. These assumptions may not always be right, but the amount of time saved by not having to spell everything out every time (like you seem to think people should be doing) is greater than that lost due to incorrect assumptions.



  • @blakeyrat said:

    @zelmak said:
    @GNU Pepper said:
    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.
    I like how they list out languages and frameworks in the same 'breath' as a development methodology

    Meaning what? Why is this an issue?

    If I say, "knowledge of cats and cars", do you assume cats are the same thing as cars? Is that the problem?

    Look, maybe I'm the space alien. I don't understand what anybody on this forum is talking about.

     

    Yes, you are a space alien, blakey. But endearing nonetheless.

    Thomas

     

     


  • BINNED

    @locallunatic said:

    Yes, it's based on assumptions that we are making and not on things explicitly stated, which can bite you in the ass, but the ability to make assumptions that are "reasonable" (don't feel like getting in a debate on what is and isn't reasonable) is something most will assume others have. These assumptions may not always be right, but the amount of time saved by not having to spell everything out every time (like you seem to think people should be doing) is greater than that lost due to incorrect assumptions.

    Blakeyrat has something approaching a point here. It is normal to make reasonable assumptions. However, a lot of people on the internet are not very good at that and therefore make unreasonable assumptions instead. Spelling everything out is time-consuming, and the people who make unreasonable assumptions will misinterpret what you spelled out anyway, but your intent and the misinterpretation will be clear to everyone else.



  •  @locallunatic said:

    @blakeyrat said:

    @Lorne Kates said:

    @blakeyrat said:

    @zelmak said:
    @GNU Pepper said:
    Beneficial - Knowledge of Spring / HTML / CSS / Javascript / Agile.
    I like how they list out languages and frameworks in the same 'breath' as a development methodology
    Meaning what? Why is this an issue?
    It implies the employer doesn't know the difference between a concept, a methodology and a specific implementation.
    But that's coming from YOUR BRAIN, not their text. It's the exact same discussion as in the other thread.

    Yes, it's based on assumptions that we are making and not on things explicitly stated, which can bite you in the ass, but the ability to make assumptions that are "reasonable" (don't feel like getting in a debate on what is and isn't reasonable) is something most will assume others have. These assumptions may not always be right, but the amount of time saved by not having to spell everything out every time (like you seem to think people should be doing) is greater than that lost due to incorrect assumptions.

    And these assumptions are a product of cynicism that's been rightfully developed in the minds of the TDWTF readership. It's why I come to this site. Nothing wrong with some cynical views that have been founded out of a lifetime of experiencing WTFs. Your assumptions may be wrong, but they're probably right. At some point, the end user must prove to the cynic that they're not a fucking dumbass. What I can't stand is the person that will always point out that you're assuming and equates that to something bad. No shit I'm assuming, but life experience tells me I'm right. Right enough to say it. Prove me wrong, and I'll tuck tail and admit it. The cynic puts himself in this position because he's truly convinced of his beliefs.

    There is too much pussy-footing, political correctness, lack of common sense, word game bullshit these days.  People obsessed with metrics and scapegoats.  These people are so out of touch with reality, it makes a person wonder how they manage to fuck or shit.  HR people seem to fall into this category, so I think the assumption is correct here.

     


  • BINNED

    @TheRider said:

    @TheCPUWizard said:

    I haven't seen a true Computer Science degree in decades (below the Masters or Doctorate level).

    I do have fun with candidates who focus on their degree though. Have them explain what a Turing Machine is and why it is relevant to some real world situations, or explain the ramifications of multi-level cache in a multi-core or multi-processor situation.

    The most common (>90%) effect is a deer-in-the-headlights type stare.

    I have a lowly bachelor's degree in computer science, and I'd be able to answer these questions. Lo and behold, I must be one of the 10%!

    But I learned about Turing Machines and processor caching many years after uni.

    Seriously? We covered that in the first 2 semesters here.
    Turing Machines in the theory lectures, caches both in the technical ones (how they work, physical addressing, associativity, etc.) and the theoretical and practical ones (effects on software runtime).

     



  • @TheCPUWizard said:

    I haven't seen a true Computer Science degree in decades (below the Masters or Doctorate level).

    I do have fun with candidates who focus on their degree though. Have them explain what a Turing Machine is and why it is relevant to some real world situations, or explain the ramifications of multi-level cache in a multi-core or multi-processor situation.

    The most common (>90%) effect is a deer-in-the-headlights type stare.

     

     Discrete Mathematics, Theory of Computing, and Computer Architecture are still required degree courses at Iowa State.

     



  • @TheCPUWizard said:

    I haven't seen a true Computer Science degree in decades (below the Masters or Doctorate level).

    I do have fun with candidates who focus on their degree though. Have them explain what a Turing Machine is and why it is relevant to some real world situations, or explain the ramifications of multi-level cache in a multi-core or multi-processor situation.

    The most common (>90%) effect is a deer-in-the-headlights type stare.


    This is the same mindset that most hipsters have with music. There's no reason to assume things were "better" ~20 years ago. This site is proof: you'll see as many WTFs about older coders as younger ones.

    My university taught me all those things and more. Some may have simply forgotten the term "Turing Machine", but they'll still know the definition. Others simply don't pay attention, don't care, or are too arrogant to learn, and I guarantee people like that have always existed and always will.

    Some knowledge is shifting into .NET and other technologies that are in high demand today and didn't exist some time ago, but that's not a bad thing. I still spent a semester on assembly (an Intel VM, can't remember which), and I loved what I learnt about the internals of the CPU, but I don't think I'll ever use it professionally. Am I not a computer scientist because I didn't learn Pascal?

    Also, why does everyone hate Java? I honestly have no idea... Practically speaking it's among the most powerful languages out there. What's the crime in teaching it? (Edit: This statement isn't meant to justify a degree in Java, but to say that it should definitely have a large amount of time devoted to it, similar to C++.)

     



  • @topspin said:

    first 2 semesters

    In my first semester of college* "computer science", I "learned" how to "program" in Alice. In the second semester, I "learned" by re-typing code my teacher had explicitly (and with many syntax errors) typed out in Microsoft Word and then printed.

    *British meaning of college = American meaning of High School*

    *I am not British but I wish I was, and not an American*

    *Mitt Romney



  • @Adanine said:

    Also, why does everyone hate Java?

    You can't build a non-shitty desktop app with it.



  • @blakeyrat said:

    @Adanine said:
    Also, why does everyone hate Java?

    You can't build a non-shitty desktop app with it.

    Minecraft?



  • @Adanine said:

    @blakeyrat said:

    @Adanine said:
    Also, why does everyone hate Java?

    You can't build a non-shitty desktop app with it.

    Minecraft?


    HTFY



  • @Ben L. said:

    @Adanine said:

    @blakeyrat said:

    @Adanine said:
    Also, why does everyone hate Java?

    You can't build a non-shitty desktop app with it.

    Minecraft?


    HTFY

    There was a time I didn't like Minecraft, so I understand if you don't like it. Doesn't change the fact that it's fun for an overwhelmingly large audience, even if it's a bit annoying dealing with people who won't shut up about it.

    Also, why am I naming applications to defend a language? Do people hate Java just because they've seen crap applications developed in Java? I'm not being cheeky or anything, I've just been baffled by how everyone in the IT field that I know thinks Java is crap.

    Is it because of its pointers/garbage handling? I was under the impression that any garbage handler-related issues were worked out by the early-to-mid 2000s, and I'm well aware of how pointers work and how they can be used, but Java doesn't really take any of that away, just hides it.

    I think it's because of the tools people use for Java? I've used NetBeans, and ever since I found it I actually like coding in Java. Not a real fan of Eclipse, though.



  • @Adanine said:

    There was a time I didn't like Minecraft, so I understand if you don't like it.

    Just because it's fun does not make it less shitty.

    That thing manages to use more CPU and RAM than World of Warcraft does. And World of Warcraft has [insert long list of graphical buzzwords].



  • @Adanine said:

    There was a time I didn't like Minecraft, so I understand if you don't like it. Doesn't change the fact that it's fun for an overwhelmingly large audience, even if it's a bit annoying dealing with people who won't shut up about it.

    "liking it" has nothing to do with "shitty". Lots of people like shitty things. And Minecraft is shitty in numerous ways. The UI is awful, even by video game standards. It takes more resources to run than Skyrim, despite looking like crap and having not-especially-impressive AI.

    The whole concept of Java, the "write once, run everywhere" thing, that was a great idea when Sun started the whole project. But what happened? The Java community never produced any decent GUI tools, or even decent IDEs. When Microsoft had the "gall" to add features to the JVM so that Java apps could better integrate with Windows*, Sun threw a hissy-fit until Microsoft basically said, "fuck you guys" and got out. (Which is exactly 180 degrees away from how they should have reacted.) The JVM somehow got more bloated and insecure year-over-year. Web developers, after seeing that Microsoft stopped shipping it, after seeing that Java was giving their clients computer viruses, and after seeing that Java web applets were still sluggish and useless even after 5 years of development, ran away in droves. Then it got taken over by Oracle, where once-promising software goes to die a lingering, painful death.

    Just a couple months ago we were talking about how the JVM still doesn't, after 15+ years of development, understand how named folders work in Windows. If they haven't fixed that extremely basic and simple bug in 15 years, what are the odds it'll be fixed tomorrow? I hate to break this to you, but it's dead. Doornail dead.

    The only place Java was ever used to make non-WTF software was when it was used for web server development on Linux. But those days are long past-- Ruby, Python, PHP, even Mono are offering the same features with less overhead and significantly easier coding. Because in that environment, none of Java's advantages actually matter. There's no difference between Java bytecode and just-in-time compiled "anything else" code. There's no point to Java's portability because the software never gets distributed off its own servers. There's no advantage to using Java in the one place where using Java isn't full of WTFs. So why use it?

    * BTW Microsoft didn't pick fights with Java because it saw Java as a threat, or because it wanted to clear the marketplace for .net. Microsoft was just trying to make Java suck less on Windows; that's all. If they saw Java as a threat, they wouldn't have sunk millions of dollars writing their own JVM in the first place.

    @Adanine said:

    Also, why am I naming applications to defend a language? Do people hate Java just because they've seen crap applications developed in Java?

    It goes further than that: I've never seen a GOOD application developed in Java.

    @Adanine said:

    Is it because of its pointers/garbage handling? I was under the impression that any garbage handler-related issues were worked out by the early-to-mid 2000s, and I'm well aware of how pointers work and how they can be used, but Java doesn't really take any of that away, just hides it.

    No, that stuff's fine. I think part of what you're missing is that the WTFs in Java aren't really in the language, they're in the runtime. If C# had to run in the goddamned JVM and Java ecosystem, it'd have most of the exact same problems Java has now.

    @Adanine said:

    I think it's because of the tools people use for Java?

    Now you're getting warmer.

    @Adanine said:

    I've used NetBeans, and ever since I found it I actually like coding in Java. Not a real fan of Eclipse, though.

    NetBeans is better than Eclipse, but both are shit.



  • Oh blakeyrat... When will you ever learn?



  • @TheCPUWizard said:

    I do have fun with candidates who focus on their degree though. Have them explain what a Turing Machine is and why it is relevant to some real world situations, or explain the ramifications of multi-level cache in a multi-core or multi-processor situation.
     

    A Touring machine is a big blue bus; it is relevant when some idiot asks you about it in a job interview.

    The second question contains the word "multi-" three times; a sure sign that the software is going to blow up assynchronously.

     



  • @TheCPUWizard said:

    I do have fun with candidates who focus on their degree




    I interpreted that to mean, and I hope I'm right in thinking, that you are more interested in someone's ability to do the job. If that is the case, then +1 to you!


    I don't even have a degree and, apart from the social aspect of that (it becomes a class distinction), it hasn't made any practical difference. Sure, it means I don't have as much knowledge, but the knowledge I have has been built up from a genuine enthusiasm for what I do, combined with the occasional practical reason to learn new concepts as they crop up.


    I find that in the real world project managers ask questions like "Who can get the USB feature working by Friday?" not "We need to get the USB feature working by Friday, who has a degree?"



  • At my interview for this job, I was asked why I didn't have a degree. I said that in IT, I felt that the parts of a degree relevant to an entry level job would be outdated by the time I finished the degree, and for non-entry level work, I had the experience to back up my abilities. The hiring manager then told me they had one guy in the IT department who actually had an IT-related degree.

    His job was to answer the phones.



  • @EncoreSpod said:

    @TheCPUWizard said:

    I do have fun with candidates who focus on their degree




    I interpreted that to mean, and I hope I'm right in thinking, that you are more interested in someone's ability to do the job. If that is the case, then +1 to you!


    I don't even have a degree and, apart from the social aspect of that (it becomes a class distinction), it hasn't made any practical difference. Sure, it means I don't have as much knowledge, but the knowledge I have has been built up from a genuine enthusiasm for what I do, combined with the occasional practical reason to learn new concepts as they crop up.


    I find that in the real world project managers ask questions like "Who can get the USB feature working by Friday?" not "We need to get the USB feature working by Friday, who has a degree?"

    You are correct. My intent was specifically those who focus on the existence of their degree, rather than actual experience/knowledge. I too am non-degreed (I did finish all of my major's requirements for a Physics degree, but never got around to completing the other courses), and occasionally this still comes up. Hard to imagine how a degree from over 30 years ago would specifically influence my abilities today...

    The topics I mentioned do have real-world impact and actually come up in conversations I have with team members during design sessions. The depth of a person's required knowledge does vary significantly depending on the type of project and the role of the person within the team.



  • Wait... so am I the only person with a PhD here...? lonely


  • Discourse touched me in a no-no place

    @serguey123 said:

    Wait... so am I the only person with a PhD here...? lonely

    Probably. The highest academic achievement I can put on my CV is UK A levels. I started a Comp-Science BSc, but they pushed for all students to do a 'work experience' year for year 3, with the intention that the students complete the course in year 4.



    In the first 3 months I'd learnt more 'on the job' than I'd actually learnt in the preceding 2 years at Uni. Towards the end of the year placement I indicated a desire to continue working with the company and postpone the final year, and the company indicated a desire that I stay on.





    Long story short, my 'placement' lasted 10 years, I didn't finish the BSc, and I got the next job because of experience, not because a piece of paper indicated I'd wasted 3 years of my life in academia.



  • @PJH said:

    @serguey123 said:
    Wait... so am I the only person with a PhD here...? lonely

    Probably. The highest academic achievement I can put on my CV is UK A levels. I started a Comp-Science BSc, but they pushed for all students to do a 'work experience' year for year 3, with the intention that the students complete the course in year 4.



    In the first 3 months I'd learnt more 'on the job' than I'd actually learnt in the preceding 2 years at Uni. Towards the end of the year placement I indicated a desire to continue working with the company and postpone the final year, and the company indicated a desire that I stay on.





    Long story short, my 'placement' lasted 10 years, I didn't finish the BSc, and I got the next job because of experience, not because a piece of paper indicated I'd wasted 3 years of my life in academia.

    I wouldn't say I wasted them, but then I worked mostly in research, so it was handy sometimes and I enjoyed academic life; however, I do agree that some companies and people put too much faith in a piece of paper instead of others' achievements.



  • @Ben L. said:

    *British meaning of college = American meaning of High School*
     

    I don't think this is right.



  • @AndyCanfield said:

    A Touring machine is a big blue bus; it is relevant when some idiot asks you about it in a job interview.

    The second question contains the word "multi-" three times; a sure sign that the software is going to blow up assynchronously.

     

    :D

     



  • @justanotheradmin said:

    At my interview for this job, I was asked why I didn't have a degree. I said that in IT, I felt that the parts of a degree relevant to an entry level job would be outdated by the time I finished the degree, and for non-entry level work, I had the experience to back up my abilities. The hiring manager then told me they had one guy in the IT department who actually had an IT-related degree.

    His job was to answer the phones.

     

     Yes, so much this. I quit university before I even got a bachelor's because it felt like a giant waste of time. Sure, some things were definitely nice to learn, but staying longer would have killed me. Now I've made it an important issue not to have a degree, or at least not to tell any prospective employers. Because the places that won't hire you only because of your lack of academic credentials aren't worth working at.

     That said, I started following some Stanford lectures online about a year ago and realized that I wasn't an arrogant ADD freak after all. I just went to a really bad university.



  • @Adanine said:

    Do people hate Java just because they've seen crap applications developed in Java? I'm not being cheeky or anything, I've just been baffled by how everyone in the IT field that I know thinks Java is crap.

    It's all the little things put together which make it bad.

    Firstly, there's the boilerplate. This is somewhat mitigated if you have a really good IDE, but not completely.

    Secondly, Checked Exceptions. This is a crime against programs everywhere. There are many cases where an exception can't (or shouldn't) actually exist, but you have to make sure it doesn't occur anyway because the compiler will complain otherwise. This leads to try {...} catch(Exception e) {} everywhere, and actual exceptions may get ignored.

    Thirdly, until the most recent version of Java (which was only about a year ago), there was no equivalent of a using() or with() statement. Closeable resources had to be done somewhat like:

    Closeable x = null;
    try {
        // ... do something with x ...
    }
    finally {
        if (x != null) {
            try { x.close(); } catch (Exception e) {}
        }
    }

    And yes, that try block inside the finally is necessary: Closeable.close() throws IOException, a checked exception, and the compiler will complain if it's not there. What the hell is actually going to throw an exception when you close a resource? Closing a resource that's already been closed? A sane library would treat that as a no-op. For anything serious there is always Error.

    Fourthly, it has anonymous classes but no anonymous functions. If you want to use callbacks, you need to define an interface and fill out an entire class to use it (a sketch follows after this post). See also: point 1, boilerplate.

    Finally, as has been pointed out, writing GUIs in Java is a massive pain in the ass. And probably broken because of bugs in the Java API itself.
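
    A minimal sketch of that callback boilerplate, for illustration only: the Swing listener interface is the real one, but the class and button here are made up.

    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import javax.swing.JButton;

    public class CallbackBoilerplate {
        public static void main(String[] args) {
            JButton saveButton = new JButton("Save");
            // No anonymous functions, so even a one-line callback needs a
            // full anonymous class implementing the listener interface.
            saveButton.addActionListener(new ActionListener() {
                @Override
                public void actionPerformed(ActionEvent e) {
                    System.out.println("Save clicked");
                }
            });
            saveButton.doClick(); // fire the listener once, just to show it runs
        }
    }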



  • @Salamander said:

    It's all the little things put together which make it bad.

    Firstly, there's the boilerplate. This is somewhat mitigated if you have a really good IDE, but not completely.

    The JVM scripting languages (Groovy, Scala, Clojure) do a much better job of this. For example, in Groovy, to set the contents of a file:

    def file = new File("somefile.txt")
    file.text = "Set file contents"




    And there's no need to define a main method or add most of the class, method and field boilerplate.

    @Salamander said:



    Secondly, Checked Exceptions. This is a crime against programs everywhere. There are many cases where an exception can't (or shouldn't) actually exist, but you have to make sure it doesn't occur anyway because the compiler will complain otherwise.

    This leads to try {...} catch(Exception e) {} everywhere, and actual exceptions may get ignored.
    Thirdly, until the most recent version of Java (which was only about a year ago), there was no equivalent of a using() or with() statement. Closeable resources had to be done somewhat like:

    Closeable x = null;
    try {
        // ... do something with x ...
    }
    finally {
        if (x != null) {
            try { x.close(); } catch (Exception e) {}
        }
    }

    And yes, that try block inside the finally is necessary: Closeable.close() throws IOException, a checked exception, and the compiler will complain if it's not there. What the hell is actually going to throw an exception when you close a resource? Closing a resource that's already been closed? A sane library would treat that as a no-op. For anything serious there is always Error.

    Generally I agree. The main advantage is forcing programmers to consider whether they can recover from a likely failure point, but it can make programming with files and JDBC connections hairy. Project Coin in Java 7 cleans up a lot of that (see the try-with-resources sketch after this post).

    @Salamander said:



    Fourthly, it has anonymous classes but no anonymous functions. If you want to use callbacks, you need to define an interface and fill out an entire class to use it. See also: Point 1; Boilerplating.

    The scripting languages have had closures for a long time, and lambdas are slated for Java 8.


    @Salamander said:
    Finally, as has been pointed out, writing GUIs in Java is a massive pain in the ass. And probably broken because of bugs in the Java API itself.


    Swing apps are miserable to write from scratch and Java has barely approached a GUI designer as slick as Visual Studio (Matisse is okay). Again, the scripting languages help quite a bit. Groovy SwingBuilder cuts out most of the boilerplate (an example).
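
    A minimal sketch of the Java 7 try-with-resources form mentioned above (Project Coin); the file name is just an example.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class TryWithResources {
        public static void main(String[] args) throws IOException {
            // The reader is closed automatically when the block exits, normally
            // or via an exception -- no null check, no nested try around close().
            try (BufferedReader in = new BufferedReader(new FileReader("somefile.txt"))) {
                System.out.println(in.readLine());
            }
        }
    }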



  • @cmccormick said:

    Swing apps are miserable to write from scratch and Java has barely approached a GUI designer as slick as Visual Studio (Matisse is okay).

    The problem isn't the designer, it's the fact that the end result of the designer is still awful.



  • @cmccormick said:

    @Salamander said:


    Secondly, Checked Exceptions. This is a crime against programs everywhere. There are many cases where an exception can't (or shouldn't) actually exist, but you have to make sure it doesn't occur anyway because the compiler will complain otherwise.

    This leads to try {...} catch(Exception e) {} everywhere, and actual exceptions may get ignored.
    Thirdly, until the most recent version of Java (which was only about a year ago), there was no equivalent of a using() or with() statement. Closeable resources had to be done somewhat like:

    Closeable x = null;
    try {
        // ... do something with x ...
    }
    finally {
        if (x != null) {
            try { x.close(); } catch (Exception e) {}
        }
    }

    And yes, that try block inside the finally is necessary: Closeable.close() throws IOException, a checked exception, and the compiler will complain if it's not there. What the hell is actually going to throw an exception when you close a resource? Closing a resource that's already been closed? A sane library would treat that as a no-op. For anything serious there is always Error.

    Generally I agree. The main advantage is forcing programmers to consider whether they can recover from a likely failure point, but it can make programming with files and JDBC connections hairy. Project Coin in Java 7 cleans up a lot of that.

    Checked exceptions are good. In my opinion, exceptions like NumberFormatException and IllegalArgumentException should be checked as well, since they mark places where the developer should be forced to deal with a special case (a sketch follows below). The specific use of a checked exception on Closeable.close() is bad. Closing an already-closed handle shouldn't do anything, as was stated. Don't confuse a bad use of an idea with a bad idea.
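
    A minimal sketch of the kind of case meant here, with a made-up port-parsing example: NumberFormatException is unchecked, so nothing forces a caller to write the handler below.

    public class ParsePort {
        public static void main(String[] args) {
            String input = (args.length > 0) ? args[0] : "eighty"; // deliberately bad input
            int port;
            try {
                // Integer.parseInt throws the unchecked NumberFormatException,
                // so the compiler never insists that bad input be handled.
                port = Integer.parseInt(input);
            } catch (NumberFormatException e) {
                port = 8080; // fall back to a default, but only because we remembered to
            }
            System.out.println("Using port " + port);
        }
    }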



  • @Ben L. said:

    Checked exceptions are good. In my opinion, exceptions like NumberFormatException and IllegalArgumentException should be checked as well, since they mark places where the developer should be forced to deal with a special case.

    Unless they already have (or think they have) dealt with any special cases. If they actually haven't, the program should die horribly and loudly, so they can fix it.
    The moment you add in checked exceptions, you get try {...} catch(Exception e) {} blocks which will "never occur" and silence any problem, making the program die in mysterious ways. If you're lucky.



  • @Adanine said:

    @blakeyrat said:

    @Adanine said:
    Also, why does everyone hate Java?

    You can't build a non-shitty desktop app with it.

    Minecraft?

     

     

    Minecraft isn't a desktop app. Pretty sure the reference was to creating a desktop application, specifically the fact that it's impossible to create a desktop application in Java that looks like it belongs in any OS. They always have this "look" to their UI that makes it clear they are a Java program regardless of the platform you run it on.

     



  • @pauly said:

    What I can't stand is the person that will always point out that you're assuming and equates that to something bad. 
     

    Well, actually... you're assuming it equates to something bad, when it could equate to something good.

    </dickweed>

    Assumptions are only made through lack of information. If more information were forthcoming then all recipients wouldn't need to assume - they'd actually know and not have to cope with vagueness.


  • BINNED

    @Ben L. said:

    Checked exceptions are good.
    Explain, please.



  • @PedanticCurmudgeon said:

    @Ben L. said:
    Checked exceptions are good.
    Explain, please.
     

     

    I have to join in on this. If there is one thing I miss in .NET, it's checked exceptions. With them, you can define what exceptions are thrown, and that's remembered, taken care of, and reflected in the documentation. In .NET, you end up hoping that the possible exceptions are documented, but there's no check to make sure.

    The argument that "well if you have checked exceptions you end up with a lot of try {..} catch (Exception e) {} blocks" is just a bad excuse. It's like saying "well, you know, people are stupid so we decided to go for a solution that isn't really that good but makes the losers look better". Hell, it isn't even a compiler switch.

    Sure, it's nice to be able to prototype without thinking about exceptions. But when you then want to do things properly (and you usually do that from the start anyway), you just cross your fingers hoping that you're catching what needs to be caught. If you remember to do it where you should, because there's no compiler telling you where and when.

    It's like saying strictly typed variables are bad because you don't get all the freedom you want. Again, for prototyping it's nice (PHP), but try building something complex and you start failing pretty quickly.


  • BINNED

    Thanks for that, but maybe someone who knows of objections other than the empty-catch-block straw man can explain?



  • @arh said:

     With it, you can define what exceptions are thrown...

    The problem with checked exceptions is that you cannot define what exceptions may be thrown by a piece of code as soon as it calls into any other code. Even if you could determine all of the possible exceptions today, a library routine could be updated to introduce a new exception your code is not aware of.

    I am a strong believer that exceptions should only be caught if there is something that can be done to address the situation that caused the exception (sketched below). All other exceptions should be totally invisible and propagate to the next higher level, and if there is no level which can do something "graceful" in the event of the exception, the application should be torn down immediately and rudely.
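
    A minimal sketch of that philosophy, with a made-up loadConfig helper: catch only the failure you can actually do something about, and let everything else propagate.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class CatchOnlyWhatYouCanHandle {
        static String loadConfig(String path) throws IOException {
            return new String(Files.readAllBytes(Paths.get(path)), StandardCharsets.UTF_8);
        }

        public static void main(String[] args) throws Exception {
            String config;
            try {
                config = loadConfig("app.conf");
            } catch (IOException e) {
                config = "defaults"; // a failure we can genuinely do something about
            }
            // Anything else (NullPointerException, OutOfMemoryError, ...) is not
            // caught here; it propagates up and tears the program down loudly.
            System.out.println(config);
        }
    }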



  • @TheCPUWizard said:

    @arh said:

     With it, you can define what exceptions are thrown...

    The problem with checked exceptions is that you cannot define what exceptions may be thrown by a piece of code as soon as it calls into any other code. Even if you could determine all of the possible exceptions today, a library routine could be updated to introduce a new exception your code is not aware of.

    I am a strong believer that exceptions should only be caught if there is something that can be done to address the situation that caused the exception. All other exceptions should be totally invisible and propagate to the next higher level, and if there is no level which can do something "graceful" in the event of the exception, the application should be torn down immediately and rudely.

    Compile-time errors! How's that for fail-fast?



  • @Ben L. said:

    Compile-time errors! How's that for fail-fast?

    I am a big fan of compile-time errors whenever possible. In this area, however (and as I previously pointed out), it is impossible to know (at compile time) what exceptions may be thrown by any code you call, since the called code may very well have changed. Any technique which partially addresses a scenario, without being robust, has a high probability of causing false security and resulting in even more severe problems.



  • @TheCPUWizard said:

    @Ben L. said:

    Compile-time errors! How's that for fail-fast?

    I am a big fan of compile-time errors whenever possible. In this area, however (and as I previously pointed out), it is impossible to know (at compile time) what exceptions may be thrown by any code you call, since the called code may very well have changed. Any technique which partially addresses a scenario, without being robust, has a high probability of causing false security and resulting in even more severe problems.


    You're compiling against code you don't have? If an exception isn't caught, it isn't caught. It'll keep popping the stack until someone catches it, no matter if it's a checked or an unchecked exception.



  • @TheCPUWizard said:

    @arh said:

     With it, you can define what exceptions are thrown...

    The problem with checked exceptions is that you cannot define what exceptions may be thrown by a piece of code as soon as it calls into any other code. Even if you could determine all of the possible exceptions today, a library routine could be updated to introduce a new exception your code is not aware of.

    I am a strong believer that exceptions should only be caught if there is something that can be done to address the situation that caused the exception. All other exceptions should be totally invisible and propagate to the next higher level, and if there is no level which can do something "graceful" in the event of the exception, the application should be torn down immediately and rudely.

     

    Whoah. If you replace a library you'll get a compile-time error when you rebuild your application. (If you are replacing libraries without doing a rebuild, you are doing it WTF style and deserve whatever befalls you.)

    I don't know your background, but I can tell you that tearing the application down "immediately and rudely" is pretty much 100% not the right thing to do with any GUI-based application or server/service based application. Maybe you are in a different neck of the woods where that's the expected behavior.

    Here's why I like checked exceptions. Let's say I have a method called "void PayrollSystem.printPaycheck(Employee emp)". Now, we can have all kinds of fun discussions about what class this method should be in, and what parameters it should take, but that isn't the point here. The point here is to think about what can go wrong when we call printPaycheck(), and how we might handle those problems. If we're running in a development/test environment, there are some silly things that can go wrong, like passing null, or maybe a divide by zero because the employee was on vacation for the entire pay period (and is paid hourly). These kinds of failures are the programmer's fault. They're bugs in the logic of the code.

    There's another set of failures that can occur when the application is actually deployed and running in a customer's environment. Let's say that printPaycheck() has to go to the database to fetch some data, invoke a web service, create a PDF file, and then send that PDF to the printer. Obviously there are all kinds of things that can go wrong here. Without checked exceptions, I'm going to end up with something like:

    try {
        payrollSystem.printPaycheck(emp);
    } catch (Exception e) {
        // show a dialog box indicating things went wrong
    }

    Why? Because I have no idea what exceptions are being thrown by printPaycheck() and no way to find out. But if I have checked exceptions I can do something more user-friendly:

    try {
        payrollSystem.printPaycheck(emp);
    } catch (DatabaseException e) {
        // let the user know the database appears to be offline and they should try again later
    } catch (NetworkException e) {
        // let the user know there was an issue connecting to the network
    } catch (PrintException e) {
        // let the user know there was a problem printing, maybe the printer is offline or needs paper?
    }

    Further, within printPaycheck(), if I get a DatabaseException I might want to try again, or try connecting to the backup database. But to do that I need the database driver to give me some kind of hint -- did I get a timeout while connecting, or could the URL of the database not be resolved?

    Summary: without checked exceptions, I just have to guess at what might have gone wrong. With checked exceptions, I have a decent idea of the type of problem that occurred, and that gives me options in handling it.

    Thomas
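
    A minimal sketch of how a method like the one above might advertise those failure modes; the exception and domain classes here are the hypothetical ones from the example, not real library types.

    // Hypothetical checked exceptions, mirroring the catch blocks above.
    class DatabaseException extends Exception {}
    class NetworkException extends Exception {}
    class PrintException extends Exception {}

    class Employee {}

    class PayrollSystem {
        // The throws clause is verified by the compiler, so every caller is told
        // exactly which failure modes it must either handle or declare in turn.
        void printPaycheck(Employee emp)
                throws DatabaseException, NetworkException, PrintException {
            // fetch data, render the PDF, send it to the printer...
        }
    }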

