Java reflection WTF



  • just a note: Orbitz was written with LISP.

     However, I must say, I think Java is an extremely useful language and would recommend its use over ML, LISP, or Scheme in almost all situations, especially to companies that aren't startups.  Startups can get away with using languages like that because they have so few programmers, and if they lose their core programmers they don't have to worry about losing the talent because the company is gone anyway.  In large companies you have to hire lots of programmers and deal with turnover, something that is easy with Java, difficult with languages like Scheme, Python and Ruby, and almost impossible with languages that should ship with an anger management book (for instance: ML).



  • @amischiefr said:

     No, I'm just bashing you like you bashed Java, because it's fun.  How exactly did I "fail" at the argument?  You cannot do in ML what I can do in Java, so how am I the one that fails?  If you made the argument that I should use PHP or .NET instead of Java you might actually have presented an intelligent argument, but instead you suggested that we all code in ML.  Therefore you sir fail.

    I bashed Java with reason whereas you had nothing except flaming.

    By the way, you sir fail at reading. Please find the exact quote where I suggested that "we all code in ML". Even if you hate ML that much, .NET does the type system MUCH, MUCH BETTER than Java.

    But no, Java is perfect. We can't make any improvements to the language. It's got the best object system. It's got the best type system. It's got all the mechanisms of abstraction that anyone ever wants. It has no WTFs. It even prevents programmers from committing WTFs. It's really easy to get everything done in Java. Programmers can express the logic directly in Java instead of bothering with the details and pitfalls of the language. Java is the greatest invention in PL history.



  • @HypocriteWorld said:

    Java is perfect. We can't make any improvements to the language. It's got the best object system. It's got the best type system. It's got all the mechanisms of abstraction that anyone ever wants. It has no WTFs. It even prevents programmers from committing WTFs. It's really easy to get everything done in Java. Programmers can express the logic directly in Java instead of bothering with the details and pitfalls of the language. Java is the greatest invention in PL history.
    Finally, something we can all agree to!



  • @HypocriteWorld said:

    .NET does the type system MUCH, MUCH BETTER than Java.
     

     

    Not being that experienced with .NET, I'm genuinely interested... how?



  • @MrWiggles said:

    @HypocriteWorld said:

    .NET does the type system MUCH, MUCH BETTER than Java.
     

     

    Not being that experienced with .NET, I'm genuinely interested... how?

    Me too. As far as I remember, .NET does the same thing; the only thing I've seen "different" is Exception throwing and the get {} and set {} stuff.

    I won't go again on the "C# is MS' Java" argument, but back in that 2001 MS conference, it was pretty obvious, even to the MS presenter.

    I really don't get why there's so much hate with Java. Some hate it because of that "typed languages r evil!"; others because of the VM. But I really don't see anything inherently "evil" about the language itself, and I'd rather have a language where 2 + 2 = 4 all the time, and not "22".



  • @danixdefcon5 said:

    I really don't get why there's so much hate with Java.
     There are many valid criticisms of Java, but one of the key things that engenders genuine hatred is the hype compared to what you're really getting.

    @danixdefcon5 said:

    Some hate it because of that "typed languages r evil!"; others because of the VM.
    These aren't flaws in Java so much as they're preferences.  Although the whole "let's be really strict about types, but then pass around Objects all the time" thing is just moronic.  Yeah!  Casting code is awesome!
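    For anyone who never suffered pre-1.5 Java, here's a minimal sketch of the "strict types, but pass around Objects" dance being mocked (class and variable names are invented for illustration):

    ```java
    import java.util.ArrayList;
    import java.util.List;

    public class CastingDemo {
        public static void main(String[] args) {
            // Pre-1.5 style: the List holds bare Objects, so every read needs a cast.
            List names = new ArrayList();          // raw type, as in Java 1.4
            names.add("Alice");
            names.add("Bob");

            for (int i = 0; i < names.size(); i++) {
                String s = (String) names.get(i);  // cast required; wrong guesses fail only at runtime
                System.out.println(s.toUpperCase());
            }

            // Nothing stops a mistake at compile time:
            names.add(Integer.valueOf(42));        // compiles fine...
            // String boom = (String) names.get(2); // ...and would throw ClassCastException here
        }
    }
    ```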

    @danixdefcon5 said:

    But I really don't see anything inherently "evil" about the language itself, and I'd rather have a language where 2 + 2 = 4 all the time, and not "22".
    Again, that's a preference.  Of course, you're overstating it, because I'm not aware of any language where 2 + 2 = "22"; maybe "2" + 2 = "22".  Anyway, if you want to consider some real criticisms of Java, here's a quick, incomplete list in no particular order:

    • Lack of closures
    • Generics were late to arrive, and still aren't comparable to other languages
    • Autoboxing was another late-arriving solution, this time to a problem that should have never existed.
    • The type system really is overly strict.  Try doing binary math on bytes sometime; you'll get so many "possible loss of precision" errors (not warnings) that you'll want to stab someone (or just start suppressing the errors).
    • The security model is cumbersome.
    • On that point, a lot of things are cumbersome.  I'm looking at you, JAXP, JDBC, and just about every other 3- or 4-letter acronym starting with a 'J'.
    •  Reflection is a joke.
    • The control structures blow.  Why did it take so long to get a goddamn foreach construct?  Why can't I use switch on anything but primitives?
    • Strip manipulation is weak.
    • Why does I/O require this sort of monstrosity?  Go to hell and die, PushbackInputStream!
    • You can't even parse an integer out of a friggin' string without a damned try/catch clause.
    Some of these apply to other languages, too, and there's really no such thing as a perfect programming language (Preemptive strike: LISP is not perfect.  STFU, LISP fans.).  I post it only so that you can start to open your eyes and evaluate the language critically for yourself.

     



  • @bstorer said:

  • Strip manipulation is weak.
  • Did you mean "string manipulation" but were thinking about naked ladies, or am I missing something here?


  • @bstorer said:

    • Generics were late to arrive, and still aren't comparable to other languages

     

    Who cares if they were late.  They're here, not perfect by any means, but they are here.  No sense in bitching about when exactly they came about implementing it.  

    @bstorer said:

    • Autoboxing was another late-arriving solution, this time to a problem that should have never existed.

     Granted.

    @bstorer said:

    • The type system really is overly strict.  Try doing binary math on bytes sometime; you'll get so many "possible loss of precision" errors (not warnings) that you'll want to stab someone (or just start suppressing the errors).

     Alright, now what kind of complex math are you using bytes for instead of shorts, ints or BigDecimal?  If you are performing math that extends beyond the range of -128 to 127 then maybe, just maybe you should use something other than a byte.  I'm not sure I understand your argument here.

    @bstorer said:

    •  Reflection is a joke.

     Absolutely.

    @bstorer said:

    • The control structures blow.  Why did it take so long to get a goddamn foreach construct?  Why can't I use switch on anything but primitives?

     Um... I don't have a good answer to either of those.  They are supposedly fixing the switch problem in Java 7.  Why so late?  Was C++ developed overnight and complete?

    @bstorer said:

    • Why does I/O require this sort of monstrosity?  Go to hell and die, PushbackInputStream!

     Can't speak for PushbackInputStream, but as for the multitude of different ways to send files, well, I can't see a problem with having a large library that can take care of multiple types of I/O needs as opposed to one do-it-all class.

    @bstorer said:

    • You can't even parse an integer out of a friggin' string without a damned try/catch clause.

     Would you rather have a compile time error or a run time error when it goes into production and you forgot to check?  I personally think it helps to ensure that programmers are making the checks that they "should" be making anyway.  It doesn't make sense to parse "134SLF9" into an integer now does it?  C might let you do that, but it would be wrong.  Java forces you to ensure correctness.  Besides personal pride in thinking "why is the program doing my job, I'M GOOD AT WHAT I DO AND DON'T MAKE MISTAKES!!!", I don't see a problem with it.



  • @Zecc said:

    @bstorer said:

  • Strip manipulation is weak.
  • Did you mean "string manipulation" but were thinking about naked ladies, or am I missing something here?
    Have you ever tried to manipulate a stripper through Java?  The API's awful.

    Also, String manipulation is weak.



  • I realize that this is really fucking pointless if you continue being blindly defensive of Java, but, hey, I'm a glutton for punishment.

    @amischiefr said:

    Who cares if they were late.  They're here, not perfect by any means, but they are here.  No sense in bitching about when exactly they came about implementing it.  

    Sure there is.  It's indicative of how Sun approaches Java: Developers bitch for years because it doesn't implement something blatantly obvious -> Sun responds with a half-hearted solution -> People like you somehow warp that into a defense of the language -> The rest of us roll our eyes and wonder how this became the language of choice.

    @amischiefr said:
     Alright, now what kind of complex math are you using bytes for instead of shorts, ints or BigDecimal?  If you are performing math that extends beyond the range of -128 to 127 then maybe, just maybe you should use something other than a byte.  I'm not sure I understand your argument here.
      If you have a line like a = a << 2, or a = a & b, where a and b are bytes, javac generates "possible loss of precision" errors.  The same will happen if they're shorts, for that matter.  There's strong typing and then there's needless hand-holding.

    @amischiefr said:

     Um... I don't have a good answer to either of those.  They are supposedly fixing the switch problem in Java 7.

      It's about time!  I mean, it's been almost 2 years since the last time they changed the language!

    @amischiefr said:

    Why so late?  Was C++ developed overnight and complete?
    What C++ has or hasn't done has no bearing on Java.  As I said, some of these could apply to many languages; that doesn't make them any less valid as criticism of Java.



  •  Yes, those are valid criticisms of Java, as far as I know (haven't used it since 1.4). But it's rare that they are stated in a sober tone, far more often people just reiterate "Java iz teh suck" without giving any reasons at all. I don't understand the intensity of hatred towards Java that is often displayed either, even taking intensified reactions due to the Java hype (btw. hasn't that pretty much abated by now?) into account. But I suspect it's actually more of a religious thing (we also have many Java zealots on the other side) than an argument about the reality of the language.*

     

    Oh, and the "possible loss of precision" when doing bit operations on bytes and shorts, that's real: the arguments are promoted to int for the operation, so storing the result back in a byte or short could indeed lose information. You therefore have to explicitly tell the compiler that's what you want. Yes, it is quite annoying. 
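    To make the promotion concrete, here's a minimal sketch; the commented-out lines are the ones javac rejects:

    ```java
    public class BytePromotion {
        public static void main(String[] args) {
            byte a = 0x1F;
            byte b = 0x0C;

            // byte a2 = a << 2;   // error: possible loss of precision (int -> byte)
            // byte c  = a & b;    // same error: & promotes both operands to int

            // Both operands are promoted to int, so the result must be cast back down:
            byte a2 = (byte) (a << 2);
            byte c  = (byte) (a & b);

            System.out.println(a2); // 124
            System.out.println(c);  // 12
        }
    }
    ```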

     

    (*) Not the posts I'm replying to 



  •  @Ilya Ehrenburg said:

    Yes, those are valid criticisms of Java, as far as I know (haven't used it since 1.4). But it's rare that they are stated in a sober tone, far more often people just reiterate "Java iz teh suck" without giving any reasons at all.
    Which is what I want to avoid.  If you think your programming language is the best thing around, then you're delusional.  They all suck in different ways.  I could just as easily give a list of criticisms for C++, Ruby, Python, etc.  I just don't get the zealotry; they're all tools, use the best one for the job.  Do you think there are people somewhere arguing back and forth like this over Phillips vs. flat-head screwdrivers?

    @Ilya Ehrenburg said:

    I don't understand the intensity of hatred towards Java that is often displayed either, even taking intensified reactions due to the Java hype (btw. hasn't that pretty much abated by now?) into account.
      The more I think about it, the more I agree with you.  Why doesn't C++, for example, get this sort of scorn?  It's certainly deserving.  And I rarely, if ever, hear anything against C# that isn't just an anti-Microsoft comment.



  • @bstorer said:

    I realize that this is really fucking pointless if you continue being blindly defensive of Java, but, hey, I'm a glutton for punishment.

     

    I'm not blindly defensive of Java, just questioning some of your complaints, of which you chose to respond to only a few.  I completely agree with you about it being a tool and that it is right for certain jobs.  I agree more with Ilya Ehrenburg, though, in that most ranting about "Java is teh suk" is pointless and meaningless.  

    @bstorer said:


    Sure there is.  It's indicative of how Sun approaches Java: Developers bitch for years because it doesn't implement something blatantly obvious -> Sun responds with a half-hearted solution -> People like you somehow warp that into a defense of the language -> The rest of us roll our eyes and wonder how this became the language of choice.

    ...

    What C++ has or hasn't done has no bearing on Java.  As I said, some of these could apply to many languages; that doesn't make them any less valid as criticism of Java.

    It probably became the language of choice because of its ease of use and straightforward approach compared to $(.#fkldsjflj).#"something"@@$foo, or the complexities of C.  Let's face it: Americans are lazy and stupid these days and Java is a LOT easier to pick up and learn than C or PHP or Python are.

    As far as the timeliness of development, my point was that C++ wasn't perfected in a day.  Hell, you could even compare C++ to Java 1.X in the sense that C++ came from C which came from B which came from BCPL which...  C++ was a project in the making for what, 15 years (I know, 2 years of direct development, but being an extension of C, which is an extension of B...)?  So there is relevance.  You have to remember what Java was originally developed for and that it has, over the years, developed into much more.  

    Sure, I agree: since most have known about the pitfalls of programming languages and have known about the types of things that make a good language you could ask "why didn't Sun put them all in at the beginning?"  I can't help you with that one.

     

    I agree with you: if there is going to be Java bashing, why aren't there similar complaints about other languages that have plenty of WTFs themselves?  I think you are misunderstanding my position on Java.  I do not blindly support it; hell, I'd rather be using .NET for the webapps I write. Drag n drop baby!!!  It is just when people bitch about specific things like "oh java sucks because you can't access the kernel directly..." that I go WTF? Who cares, use C or whatever if you need to do that.  I think we all finally agree: all languages have WTFs, pick your best choice for the job.  End of story.



  • @amischiefr said:

    or the complexities of C

    C is 'complex' now??

    @amischiefr said:

    Americans are lazy and stupid

    Speak for yourself.



  • @amischiefr said:

    I'm not blindly defensive of Java, just questioning some of your complaints, of which you chose to respond to only a few.
    I tried to answer the ones that were less about opinion.  The others I let go because I really don't care.  I don't have it in for Java; I use it when I need it, stay away when I don't.

    @amischiefr said:

    As far as the timeliness of development, my point was that C++ wasn't perfected in a day.  Hell, you could even compare C++ to Java 1.X in the sense that C++ came from C which came from B which came from BCPL which...  C++ was a project in the making for what, 15 years (I know, 2 years of direct development, but being an extension of C, which is an extension of B...)? 
    I don't think that's a valid line of reasoning.  By that logic, every programming language has been in development since Fortran came out in the 1950's, because they're all based on the ones that came before.



  • @bstorer said:

    I don't think that's a valid line of reasoning.  By that logic, every programming language has been in development since Fortran came out in the 1950's, because they're all based on the ones that came before.

    Exactly. And Java just came from C++ anyway!



  • @bstorer said:

    @Ilya Ehrenburg said:

    Yes, those are valid criticisms of Java, as far as I know (haven't used it since 1.4). But it's rare that they are stated in a sober tone, far more often people just reiterate "Java iz teh suck" without giving any reasons at all.

    Which is what I want to avoid.

    And I praise you for that.

    @bstorer said:

    If you think your programming language is the best thing around, then you're delusional.  They all suck in different ways. 

    And in different degrees. That's why sober criticism is so important to find the best tool for the job.

    @bstorer said:

    I could just as easily give a list of criticisms for C++, Ruby, Python, etc.  I just don't get the zealotry; they're all tools, use the best one for the job.  Do you think there are people somewhere arguing back and forth like this over Phillips vs. flat-head screwdrivers?

    Remembering VHS vs. betamax, I wouldn't bet against it.


  • @Farmer Brown said:

    @bstorer said:
    I don't think that's a valid line of reasoning.  By that logic, every programming language has been in development since Fortran came out in the 1950's, because they're all based on the ones that came before.
    Exactly. And Java just came from C++ anyway!
    And C# is just Java!



  • @bstorer said:

    And C# is just Java!

    Exactly. M$ stole the ideas for generics from Sun and then patented it to cover their tracks. That explains Java's generics fail.



  • It's just that Java's development was extremely lame. Sure Java 5 fixed a bunch of things that are essential, but the amount of time it took for this to happen is a WTF. C++ evolved over a long time because it added a crapload of things over C (classes, templates, etc), whereas Java is more or less a C++ that has low-level pointer manipulation and memory management encapsulated. Sun managed to fix the lack of generics in 5, but it really shouldn't have taken THAT LONG to do it, when the language that Java came from (C++) had it, and the entire type-polymorphism idea has been around since 1977. Same thing with the autoboxing, which was an even larger problem in the absence of generics.

    Of course, there's also: Amount of Flame = O(Amount of Hype / Actual Quality).



  • @bstorer said:

    Do you think there are people somewhere arguing back and forth like this over Phillips vs. flat-head screwdrivers?

     

    http://idlogger.wordpress.com/2008/03/22/which-is-better-phillips-or-flat-head/





  • @amischiefr said:

    I agree more with Ilya Ehrenburg though in the fact that most ranting about "Java is teh suk" is pointless and meaningless.  

     

    Which is probably why on this forum every time we argue about languages we usually offer many specific details about the language that make it teh suk.

     So here is my list of why Java sucks, and also some reasons specifically .NET is better.

    1. Since generics didn't come until 1.5, there is a shit-ton of code that is 1.4.  This code takes a lot of time to update to 1.5 so that you can start using the fun new features of 1.5 and 1.6 in your project. 
    2. It's easy to get into a jumbled mess of generics when you start using wildcards and shit like <T extends Foo>.  
    3. Getting the fucking Java RMI to work the first time you do it is harder than losing your virginity was back in high school.
    4. You can only pass around fucking objects (and primitives).  Want to pass a function pointer/callback?  Create a new class (or use an anonymous class) which inherits from an interface, instantiate it, and pass it in.  This is one place .NET is head and shoulders above Java.
    5. Visual Studio is far and away better than Java's (big 2) IDEs.  The only way either compares is if you load Eclipse up with 20 plug-ins.  Then you better have at least 1 gig of memory just for Eclipse.
    6. The default look-and-feel is god-awful.  Seriously, why make us copy and paste the same code for every app we write, just so I don't have to look at that "metal" shit?
    7. While we're on the subject.  Which GUI framework would you like to use? AWT, SWT, or Swing? 
    8. The syntax kind of sucks. for instance, how did they come up with:
      for (Foo foo : foos)?
      C# makes more sense:
      foreach (Foo foo in foos).
      not that it matters, most of the code that you work with won't work in 1.5 so you don't have to worry about the foreach loop since you don't have it.
    9. How do you check if obj isn't an instance of Foo?
      if (!(obj instanceof Foo))
      why doesn't instanceof bind more closely than '!'? (PS, I know C# isn't any better)
    10. No multiple inheritance.
    11. No explicit override until 1.5.
    12. No other annotations until 1.5
    13. no operator overloading
    14. No pass by reference.
    15. missing equivalent of the C# keyword "using" (very useful)
    16. Extremely easy to use Windows API in .NET, more difficult in Java.  This isn't a problem with Java, it's just a place where .NET excels.
    17. Since Java objects only have methods and members, you miss out on goodies like .NET properties and events that make code more expressive.

    I just want to point out that there are a lot of things about Java which I like and think are better than .NET.  However, my point is that there are lots of problems with it and most people on this forum that say java is teh suk can at least back that up.
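    Point 4 in concrete form, a minimal sketch of the anonymous-class ceremony (interface and method names are invented for illustration):

    ```java
    public class CallbackDemo {
        // The only way to pass behaviour around (pre-lambdas): define an interface...
        interface Callback {
            void onDone(String result);
        }

        static void fetchData(Callback cb) {
            // ... pretend work happens here ...
            cb.onDone("42 rows");
        }

        public static void main(String[] args) {
            // ...then instantiate an anonymous class just to hand over one method.
            fetchData(new Callback() {
                public void onDone(String result) {
                    System.out.println("got: " + result);
                }
            });
            // The C# equivalent is a one-line delegate/method-group reference.
        }
    }
    ```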

    As an aside, NumberFormatException extends RuntimeException, so if I'm not mistaken, you can do parseInt without a try/catch block, but I'm too lazy to fire up Eclipse to check it out.
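    For the record, that aside checks out: NumberFormatException is unchecked, so the compiler never forces the try/catch; you only write one if you want to survive bad input. A quick sketch:

    ```java
    public class ParseDemo {
        public static void main(String[] args) {
            // No try/catch required by the compiler: NumberFormatException is unchecked.
            int ok = Integer.parseInt("134");
            System.out.println(ok); // 134

            // But bad input still blows up at runtime unless you catch it yourself:
            try {
                Integer.parseInt("134SLF9");
            } catch (NumberFormatException e) {
                System.out.println("not a number: " + e.getMessage());
            }
        }
    }
    ```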


  • And you worked so hard to prove my point "most ranting about 'Java is teh suk' is pointless and meaningless".  Alright, I'll bite... one more time...

    @tster said:

    Since generics didn't come until 1.5, there is a shit-ton of code that is 1.4.  This code takes a lot of time to update to 1.5 so that you can start using the fun new features of 1.5 and 1.6 in your project. 

     

    Which is why they developed it so that you didn't have to change every piece of code over to meet generic standards in order for it to work.  Leaving your 1.4 explicit casting still works in 1.5 and look, all you have to do is suppress the warning for it (I know, too much typing for a drag n drop .NET guy).

    @tster said:

    It's easy to get into a jumbled mess of generics when you start using wildcards and shit like <T extends Foo>.  

     

    It is there for people who are smart enough to use it.  For those like you who can't comprehend it: leave it alone, you don't have to use it. 

    @tster said:

    Getting the fucking Java RMI to work the first time you do it is harder than losing your virginity was back in high school. 

     

    Personally I didn't have a problem with that, at least the latter.

    @tster said:

    You can only pass around fucking objects (and primitives).  Want to pass a function pointer/callback?  Create a new class (or use an anonymous class) which inherits from an interface, instantiate it, and pass it in.  This is one place .NET is head and shoulders above Java.

     

     +1 for you.

    @tster said:

    Visual Studio is far and away better than Java's (big 2) IDEs.  The only way either compares is if you load Eclipse up with 20 plug-ins.  Then you better have at least 1 gig of memory just for Eclipse.

     

    How much does it cost?

    @tster said:

    The default look-and-feel is god-awful.  Seriously, why make us copy and paste the same code for every app we write, just so I don't have to look at that "metal" shit?

     

    Do you use the default settings for every app you do in .NET?  If so then you have a valid point, otherwise you're just bitching mindlessly since you change the look and feel anyway.

    @tster said:

    While we're on the subject.  Which GUI framework would you like to use? AWT, SWT, or Swing?  

     

    Personally none of the above.  I develop J2EE Web applications.  So, JSP with Grails, Struts or Spring MVC.

    @tster said:

    The syntax kind of sucks. for instance, how did they come up with:
    for (Foo foo : foos)?
    C# makes more sense:
    foreach (Foo foo in foos).
    not that it matters, most of the code that you work with won't work in 1.5 so you don't have to worry about the foreach loop since you don't have it.

     

    Now this is the type of stupid rant that I don't like.  Six of one, half a dozen of the other...  The difference between the two syntaxes is just a personal preference, not a reason that .NET is better than Java.  Your last sentence doesn't even make any sense.  Are you implying that my old for(int i = 0; i < arr.length; ++i) won't work in 1.5, or that the foreach loop designed for 1.5 doesn't work in 1.5?

    @tster said:

    (PS, I know C# isn't any better) 

     

    Then why mention it?

    @tster said:

    No multiple inheritance. 

     

    I am actually glad that it doesn't support multiple inheritance.  I have seen some really fucked up C, C++ code where it took way longer than it should have to track down methods and member variables.  Hell, there have even been some front page stories about just such a thing right here on TDWTF.  Look, personally I don't have the need to place constants and globals everywhere.  If you create good OOD then you don't have to worry about this.  Show me an example where multiple inheritance is a must.

    @tster said:

    No explicit override until 1.5.

     

    No explicit override is required, never has been.  The @Override annotation is just a way to force the compiler to check to see whether you got the method signature right.  Do you really need the compiler to hold your hand and change your diaper too?  .NET... nevermind, I know the answer :)
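    Both sides are half right here, so a small sketch may help: @Override is purely a compile-time safety net, never a requirement (class and method names invented for illustration):

    ```java
    public class OverrideDemo {
        static class Animal {
            public String speak() { return "..."; }
        }

        static class Dog extends Animal {
            @Override                       // compiler verifies this really overrides Animal.speak()
            public String speak() { return "woof"; }

            // Without the annotation, a typo'd signature is silently accepted as a
            // brand-new method; with @Override it is a compile error:
            // @Override
            // public String speek() { return "woof"; }   // error: method does not override
        }

        public static void main(String[] args) {
            Animal a = new Dog();
            System.out.println(a.speak()); // woof
        }
    }
    ```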

    @tster said:

    No other annotations until 1.5

     

     http://java.sun.com/j2se/1.5.0/docs/guide/language/annotations.html

    3rd paragraph.  Java has always had an annotation system (@deprecated).  The new 'metadata' types allow you to define your own new annotations as opposed to only using the ones defined in the JDK.  So, once again a completely ignorant argument.
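    Since the linked page may rot, here's a minimal sketch of what the 1.5 'metadata' facility actually added: defining your own annotation and reading it back via reflection (the Audited annotation and Service class are invented for illustration):

    ```java
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.reflect.Method;

    public class AnnotationDemo {
        // A user-defined annotation -- possible only since the 1.5 metadata facility.
        @Retention(RetentionPolicy.RUNTIME)
        @interface Audited { String by(); }

        static class Service {
            @Audited(by = "ops")
            public void transfer() { }
        }

        public static void main(String[] args) throws Exception {
            // Read the annotation back at runtime via reflection.
            Method m = Service.class.getMethod("transfer");
            Audited a = m.getAnnotation(Audited.class);
            System.out.println(a.by()); // ops
        }
    }
    ```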

    @tster said:

    no operator overloading

     

    I can't tell you the last time that I needed to add another meaning to the + operator, or hell I could be REALLY cool like the LISP guys and completely redefine what + means!!! Yay cause that's oh so useful...

    @tster said:

    No pass by reference.

     

    Only for primitive types.

    @tster said:

    missing equivalent of the C# keyword "using" (very useful)

     

    Are you comparing Java to every language in the world picking out the best parts of each language and comparing them to Java?  Good lord.


    @tster said:

    we argue about languages we usually offer many specific details about the language that make it teh suk.

     

    I don't think that you should classify yourself in the "we" category.  I have seen others here with valid arguments; most of yours are meaningless bitches and rants.

     

    @tster said:

    ...most people on this forum that say java is teh suk can at least back that up.

     

    I agree, you are just not one of them :)


  • :belt_onion:

    @amischiefr said:

    @tster said:

    Visual Studio is far and away better than Java's (big 2) IDEs.  The only way either compares is if you load Eclipse up with 20 plug-ins.  Then you better have at least 1 gig of memory just for Eclipse.

     

    How much does it cost?

    Visual Studio Express is free and gives you the tools and features you need to write decent code.


  • @tster said:

    Visual Studio is far and away better than Java's (big 2) IDEs.  The only way either compares is if you load Eclipse up with 20 plug-ins.  Then you better have at least 1 gig of memory just for Eclipse.
     

     This is something that's always got me.  Every time I've used Visual Studio I've found it to be severely lacking in terms of functionality compared to Eclipse.  Admittedly it's been a couple of years since I had a look, so it may have gotten better, but I could never understand how some people considered that IDE to be one of the strong points of .NET.



  • @HypocriteWorld said:

    Sun managed to fix the lack of generics in 5, but it really shouldn't have taken THAT LONG to do it, when the language that Java came from (C++) had it, and the entire type-polymorphism idea has been around since 1977.

    Actually, [url=http://www.cs.gmu.edu/~sean/stuff/java-objc.html]according to Patrick Naughton[/url] (one of the creators of Java), Java 'came from' Objective-C and Smalltalk, not C++. It just happens to have a more C++ish syntax.



  • @amischiefr said:

    @tster said:

    Since generics didn't come until 1.5, there is a shit-ton of code that is 1.4.  This code takes a lot of time to update to 1.5 so that you can start using the fun new features of 1.5 and 1.6 in your project. 

     

    Which is why they developed it so that you didn't have to change every piece of code over to meet generic standards in order for it to work.  Leaving your 1.4 explicit casting still works in 1.5 and look, all you have to do is suppress the warning for it (I know, too much typing for a drag n drop .NET guy).

    That doesn't make it any less of a WTF. Especially considering that parametric polymorphism is 30 years old.

    @amischiefr said:


    @tster said:

    It's easy to get into a jumbled mess of generics when you start using wildcards and shit like <T extends Foo>.  

     

    It is there for people who are smart enough to use it.  For those like you who can't comprehend it: leave it alone, you don't have to use it.

    LOL, seems like you're the only person here that understands parametric polymorphism, out of the entire world.

    <snip syntax/IDE stuff that's irrelevant to the language>
    @amischiefr said:

    @tster said:

    (PS, I know C# isn't any better) 

     

    Then why mention it?

    Because there are more languages than just Java and C#.

    @amischiefr said:


    @tster said:

    No multiple inheritance. 

     

    I am actually glad that it doesn't support multiple inheritance.  I have seen some really fucked up C, C++ code where it took way longer than it should have to track down methods and member variables.  Hell, there have even been some front page stories about just such a thing right here on TDWTF.  Look, personally I don't have the need to place constants and globals everywhere.  If you create good OOD then you don't have to worry about this.  Show me an example where multiple inheritance is a must.

    Nothing is a must (by Greenspun's Tenth Rule of Programming). You can do OO in C if you really want, just roll your own dynamic dispatch function. When you have an object that needs behaviour from two classes, sure you can inherit one and use something else for the other (like object composition), but then you're essentially Greenspunning your own multiple inheritance.
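    A minimal sketch of the Greenspunned "multiple inheritance" described above, using composition plus interface delegation (all names invented for illustration):

    ```java
    public class MixinDemo {
        interface Swimmer { String swim(); }
        interface Flyer   { String fly(); }

        static class SwimEngine implements Swimmer {
            public String swim() { return "paddling"; }
        }

        static class FlyEngine implements Flyer {
            public String fly() { return "flapping"; }
        }

        // A Duck "inherits" both behaviours by holding the two engines and delegating.
        static class Duck implements Swimmer, Flyer {
            private final Swimmer swimPart = new SwimEngine();
            private final Flyer   flyPart  = new FlyEngine();

            public String swim() { return swimPart.swim(); }
            public String fly()  { return flyPart.fly(); }
        }

        public static void main(String[] args) {
            Duck d = new Duck();
            System.out.println(d.swim() + " and " + d.fly());
        }
    }
    ```

    Every delegating method has to be written out by hand; with real multiple inheritance the compiler would do that wiring for you.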

    BTW: I don't know how you can see fucked up C programs that have hard-to-track methods and member variables.

    @amischiefr said:


    @tster said:

    No explicit override until 1.5.

     

    No explicit override is required, never has been.  The @Override annotation is just a way to force the compiler to check to see whether you got the method signature right.  Do you really need the compiler to hold your hand and change your diaper too?  .NET... nevermind, I know the answer :)

    By your logic, do you really need the compiler to keep track of data types for you? Do you really need the compiler to do subtype polymorphism for you? Do you really need the runtime to do memory deallocation for you?
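    A minimal sketch of the failure mode @Override guards against (Shape/Circle are made-up names). Without the annotation, a signature typo silently declares a new overload instead of overriding, and nothing complains until runtime behaviour surprises you:

    ```java
    // Without @Override, a signature mistake silently creates a brand-new
    // method instead of an override -- the compiler says nothing.
    class Shape {
        public String describe() { return "shape"; }
    }

    class Circle extends Shape {
        // Mistake: describe(int) is an overload, not an override.
        // Writing @Override above this method would turn it into a
        // compile-time error instead of a silent bug.
        public String describe(int detail) { return "circle"; }
    }

    public class OverrideDemo {
        public static void main(String[] args) {
            Shape s = new Circle();
            // Prints "shape": the base method runs, nothing was overridden.
            System.out.println(s.describe());
        }
    }
    ```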

    @amischiefr said:


    @tster said:

    No other annotations until 1.5


     http://java.sun.com/j2se/1.5.0/docs/guide/language/annotations.html

    3rd paragraph.  Java has always had an annotation system (@deprecated).  The new 'metadata' types allow you to define your own new annotations as opposed to only using the ones defined in the JDK.  So, once again, a completely ignorant argument.

    One annotation doesn't constitute an "annotation system". By the same logic, Smalltalk/Ruby has a static type system, with just one type: Object.

    @amischiefr said:


    @tster said:

    no operator overloading

     

    I can't tell you the last time that I needed to add another meaning to the + operator, or hell I could be REALLY cool like the LISP guys and completely redefine what + means!!! Yay cause that's oh so useful...

    BigInteger.

    @amischiefr said:


    @tster said:

    No pass by reference.

     

    Only for primitive types.

    And objects. Java is strictly pass-by-value: object references are themselves passed by value, so a function cannot change a variable passed in as an argument, no matter the type.
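    A small demo of the distinction, under the usual semantics: a method can mutate the object it is handed, but reassigning its parameter never touches the caller's variable.

    ```java
    // Java is pass-by-value for everything: object *references* are copied,
    // so a method can mutate the shared object but cannot rebind the
    // caller's variable.
    public class PassByValueDemo {
        static void reassign(StringBuilder sb) {
            sb = new StringBuilder("reassigned"); // only the local copy changes
        }

        static void mutate(StringBuilder sb) {
            sb.append(" world"); // visible to the caller: same object
        }

        public static void main(String[] args) {
            StringBuilder sb = new StringBuilder("hello");
            reassign(sb);
            System.out.println(sb); // still "hello"
            mutate(sb);
            System.out.println(sb); // "hello world"
        }
    }
    ```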

    @amischiefr said:


    @tster said:

    missing equivalent of the C# keyword "using" (very useful)

     

    Are you comparing Java to every language in the world picking out the best parts of each language and comparing them to Java?  Good lord.

    Because tster is arguing that C# is better than Java, so he picks out C# features. Pretty normal to me.



  • @NSCoder said:

    Actually, according to Patrick Naughton (one of the creators of Java), Java 'came from' Objective-C and SmallTalk, not C++. It just happens to have a more C++ish syntax.

    This point doesn't matter, since almost every OO language comes from Smalltalk. Java was designed to be a better version of C++, which succeeded in some ways and failed in other ways.



  • @HypocriteWorld said:

    @amischiefr said:
    @tster said:
    no operator overloading
    I can't tell you the last time that I needed to add another meaning to the + operator, or hell I could be REALLY cool like the LISP guys and completely redefine what + means!!! Yay cause that's oh so useful...
    BigInteger.
    I can't say "indeed" hard enough to this one. The fact is, + is already overloaded in Java, for use with doubles, ints, longs, shorts, bytes and floats. I don't think anyone who deals with graphics would say that operator overloading is a bad thing. Graphics involve a lot of matrix operations, most of which are written on paper using ordinary operators, but since Java doesn't have operator overloading you end up having to write it all out as method calls, which is really verbose for something that takes just three printed characters on paper. Vectors, matrices, BigIntegers, complex numbers, etc. Any time you want to deal with math, you want operator overloading to make it more intuitive, and this was recognized by Java's designers, since the operators are already overloaded for ints and doubles!
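    The verbosity complaint in concrete form: the same arithmetic written once with the built-in operator overloads on longs, and once spelled out on BigInteger, where no overloading is available.

    ```java
    import java.math.BigInteger;

    // x * y + z two ways: operators on primitives vs. method calls on
    // BigInteger, which has no operator syntax available to it.
    public class BigIntegerDemo {
        public static void main(String[] args) {
            long x = 3, y = 4, z = 5;
            long primitive = x * y + z;            // overloaded + and * for free

            BigInteger a = BigInteger.valueOf(3);
            BigInteger b = BigInteger.valueOf(4);
            BigInteger c = BigInteger.valueOf(5);
            BigInteger big = a.multiply(b).add(c); // same math, method-call soup

            System.out.println(primitive); // 17
            System.out.println(big);       // 17
        }
    }
    ```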



  • @HypocriteWorld said:

    @amischiefr said:


    @tster said:

    No multiple inheritance. 

     

    I am actually glad that it doesn't support multiple inheritance.  I have seen some really fucked up C, C++ code where it took way longer than it should to track down methods and member variables.   Hell there have even been some front page stories about just such a thing right here on TDWTF.  Look, personally I don't have the need to place constants and globals everywhere.  If you create good OOD then you don't have to worry about this.  Show me an example where multiple inheritance is a must.

    Nothing is a must (by Greenspun's Tenth Rule of Programming). You can do OO in C if you really want, just roll your own dynamic dispatch function. When you have an object that needs behaviour from two classes, sure you can inherit one and use something else for the other (like object composition), but then you're essentially Greenspunning your own multiple inheritance.

    BTW: I don't know how you can see fucked up C programs that have hard-to-track methods and member variables.

    C is incredibly bogged down by hard-to-track functions, though; there's no clear way to determine where a specific function is defined.  I've always found this to be a weakness of C/C++'s include system.


  • @HypocriteWorld said:

    This point doesn't matter, since almost every OO language comes from Smalltalk.
    Glaring exception to this statement: C++.  It is blatantly copied from Simula.



  • @HypocriteWorld said:

    @NSCoder said:

    Actually, according to Patrick Naughton (one of the creators of Java), Java 'came from' Objective-C and SmallTalk, not C++. It just happens to have a more C++ish syntax.

    This point doesn't matter, since almost every OO language comes from Smalltalk. Java was designed to be a better version of C++, which succeeded in some ways and failed in other ways.

    It wasn't designed to be a better version of C++, just a better language than C++ (well, that probably wasn't an explicit design goal, but it's implied, since Naughton hated C++ and would be unlikely to create a language which he considered worse than it.) I just figured that this might explain why it took a while to get certain features that C++ had, since it was in no way based on C++. What would be surprising would be if it took a long time to get some feature that existed in Objective-C.

    (Incidentally, there are those who would argue that C++ is not an OO language, and didn't come from Smalltalk, but there's no way I'm going to light that particular flamewar, so I'll stop at explaining why I mentioned the lack of C++ descent.)



  • @Welbog said:

    The fact is, + is already overloaded in Java, for use with doubles, ints, longs, shorts, bytes and floats.
    Not to mention String!



  • @bstorer said:

    C is incredibly bogged down by hard-to-track functions, though; there's no clear way to determine where a specific function is defined.  I've always found this to be a weakness of C/C++'s include system.

    True, #include really is a lame form of modules. Although this kind of fail can happen in any language with modules as well: after a few "import blahblahblah" statements, it's quite easy to end up with a huge mess in your module namespace.



  • HypocriteWorld did a pretty good job, but I want to follow up on a few points:

    @amischiefr said:

    Which is why they developed it so that you didn't have to change every piece of code over to meet generic standards in order for it to work.  Leaving your 1.4 explicit casting still works in 1.5 and look, all you have to do is suppress the warning for it (I know, too much typing for a drag n drop .NET guy).

      Except for all those Lists that are being passed around without any type parameter that you must now work with even though you are in 1.5

     

    @amischiefr said:

    It is there for people who are smart enough to use it.  For those like you who can't comprehend it: leave it alone, you don't have to use it.  

      Same could be said for multiple inheritance.  The problem is that you start with using tame generics.  And then you start refactoring your code and adding factories and handlers and shit, and the generics you used early on start to clutter your functions because of the way generics work in Java.

     

    @amischiefr said:

    Java has always ad an annotation system (@depricated).  The new 'metadata' types allow you to define you own new annotations as oposed to only using the ones defined in the JDK.  So, once again a completely ignorant arguement.

      It would be ignorant other than the fact that having @deprecated (which is only a hint to your IDE/compiler, BTW) doesn't constitute having annotations (note the plural).  That's just a deprecation marker; an annotation system implies something more powerful and useful.

     

    @amischiefr said:

    Look, personally I don't have the need to place constants and globals everywhere.  If you create good OOD then you don't have to worry about this.

      Multiple inheritance would be REALLY useful in pub-subscribe systems.  In fact, Java's included pub-subscribe library sucks because java.util.Observable is a class, so if you want one of your objects to descend from something and be observable you're fucked.  Time to start rolling your own pub-subscribe system to get around not having multiple inheritance and having a badly implemented default pub-subscribe system.
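    A minimal sketch of that hand-rolled workaround: because java.util.Observable is a class, a type that must extend something else composes a publisher instead. All names here (Publisher, Listener, Account) are made up for illustration.

    ```java
    import java.util.ArrayList;
    import java.util.List;

    // Interface-based observer: the thing Observable-as-a-class forces
    // you to reinvent when your class already extends something else.
    interface Listener {
        void updated(String event);
    }

    class Publisher {
        private final List<Listener> listeners = new ArrayList<>();

        void subscribe(Listener l) { listeners.add(l); }

        void publish(String event) {
            for (Listener l : listeners) l.updated(event);
        }
    }

    // Account already extends some base class, so it *composes* a
    // Publisher instead of extending java.util.Observable.
    class Account /* extends SomeBaseClass */ {
        final Publisher events = new Publisher();

        void deposit(int amount) {
            events.publish("deposited " + amount);
        }
    }
    ```

    With an interface the observable role mixes into any class, which is exactly the slot multiple inheritance (or C#-style events) would have filled.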

    @amischiefr said:

    No explicit override is required, never has been.  The @Override annotation is just a way to force the compiler to check whether you got the method signature right.  Do you really need the compiler to hold your hand and change your diaper too?  .NET... nevermind, I know the answer :)

      This makes me more productive, so yes, I need it.  Without it I would make less money for myself and the company I work for.

     

    @amischiefr said:

    I can't tell you the last time that I needed to add another meaning to the + operator, or hell I could be REALLY cool like the LISP guys and completely redefine what + means!!! Yay cause that's oh so useful...

    You do realize there are other operators right?  My favorite override in .NET is the array index operator [<index>] for the IDictionary<Foo, Bar> interface.

    @amischiefr said:

    Are you comparing Java to every language in the world picking out the best parts of each language and comparing them to Java?  Good lord.

    Any good thing that Java doesn't have is a con.  I add up all the cons and decide that Java has a lot of cons.  I still like Java though because all languages have a lot of cons.  That doesn't mean I can't argue that Java sucks in at least some ways.

    @amischiefr said:

    @tster said:
    ...most people on this forum that say java is teh suk can at least back that up.
    I agree, you are just not one of them :)
     

    Considering that most of your post was just blatant personal attacks on me, and since you have yet to really offer any reasons why my language of choice (C#) isn't as good as Java, I'll assume that you are incapable of making any arguments about .NET other than "drag n drop .NET guy", and ".NET,nevermind I know the answer."




  • @HypocriteWorld said:

    Java was designed to be a better version of C++

    See now, that is just going to get you laughed at.



  • @amischiefr said:

    Let's face it: Americans are lazy and stupid these days and Java is a LOT easier to pick up and learn than C or PHP or Python are.
     

    Your remark made me do a double take, not the part about Americans being lazy, but rather the one about which language is easiest to learn.  Trying to say Java is easier to learn than Python sounds pretty ridiculous to me.  I've been using Python for about a year now and found it to be vastly easier to understand than my first forays into Java, C# and other strongly typed languages.  Not that I mind strongly typed languages, but because Python is dynamic, syntactically very clean with no verbose cruft, and has a small but powerful command set, learning the language and applying its features typically feels easier and more elegant than, say, needing generics, having to manually box value types, having to use complex reflection classes to do simple dynamic operations, etc.



  • @tster said:

    @amischiefr said:

    Which is why they developed it so that you didn't have to change every piece of code over to meet generic standards in order for it to work.  Leaving your 1.4 explicit casting still works in 1.5 and look, all you have to do is suppress the warning for it (I know, too much typing for a drag n drop .NET guy).

      Except for all those Lists that are being passed around without any type parameter that you must now work with even though you are in 1.5

     

    @amischiefr said:

    It is there for people who are smart enough to use it.  For those like you who can't comprehend it: leave it alone, you don't have to use it.  

      Same could be said for multiple inheritance.  The problem is that you start with using tame generics.  And then you start refactoring your code and adding factories and handlers and shit, and the generics you used early on start to clutter your functions because of the way generics work in Java.

     

    This isn't intended to be a personal attack, but perhaps this is more to do with the way you write your code than a failing with the language.  I've personally never had a problem with generics cluttering the code, but then again I favour encapsulating my logic into a business object rather than passing around raw data structures.  Think Library rather than List<Book>.



  • @Soviut said:

    It means learning the language and applying its features typically feel easier and more elegant than, say, needing generics, having to manually box value types, having to use complex reflection classes to do simple dynamic operations, etc.
    To be fair, Java no longer requires manually boxing values, and you don't have to do complex reflection because Java's reflection is too weak.  Okay, that didn't turn out as fair as I planned...

     The key reason I oppose statically-typed languages is that there's some misconception by their creators that static typing somehow gives you more protection.  If misassigning types were my biggest problem, this programming stuff would be comically easy.  Dynamic languages push the onus back on the developers: you have to test your code and test it completely in order to be sure you've not made a mistake.  Well, isn't that necessary anyway?  Static typing doesn't remove the need to test, or even lessen the importance of good testing.  So what does it get you?



  • @MrWiggles said:

    @tster said:

    @amischiefr said:

    Which is why they developed it so that you didn't have to change every piece of code over to meet generic standards in order for it to work.  Leaving your 1.4 explicit casting still works in 1.5 and look, all you have to do is suppress the warning for it (I know, too much typing for a drag n drop .NET guy).

      Except for all those Lists that are being passed around without any type parameter that you must now work with even though you are in 1.5

     

    @amischiefr said:

    It is there for people who are smart enough to use it.  For those like you who can't comprehend it: leave it alone, you don't have to use it.  

      Same could be said for multiple inheritance.  The problem is that you start with using tame generics.  And then you start refactoring your code and adding factories and handlers and shit, and the generics you used early on start to clutter your functions because of the way generics work in Java.

     

    This isn't intended to be a personal attack, but perhaps this is more to do with the way you write your code than a failing with the language.  I've personally never had a problem with generics cluttering the code, but then again I favour encapsulating my logic into a business object rather than passing around raw data structures.  Think Library rather than List<Book>.

     

    But what if one of your methods should only deal with List<HardbackBook>, then you need HardbackLibrary.  The whole point of generics is so that you don't have to create a new container class for all of your objects.   Furthermore, you might have a Library class, and you just want to deal with a list of books (for instance, when someone is checking out books you are dealing with a random collection of books).  I guess you could make ListOfBook or CollectionOfBook, but you'll spend so much time making new classes for everything that you won't get anything done.
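    This is where wildcards come in, for better or worse: one method signature accepts a list of any Book subtype, with no HardbackLibrary or ListOfBook classes. (Book and HardbackBook are hypothetical stand-ins from the discussion above.)

    ```java
    import java.util.List;

    class Book {
        final String title;
        Book(String title) { this.title = title; }
    }

    class HardbackBook extends Book {
        HardbackBook(String title) { super(title); }
    }

    public class CheckoutDemo {
        // Accepts List<Book>, List<HardbackBook>, and so on. The wildcard
        // is exactly the feature that gets noisy once it spreads through
        // every signature in a refactored codebase.
        static int checkout(List<? extends Book> books) {
            return books.size();
        }
    }
    ```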



  • @bstorer said:

    @Soviut said:

    It means learning the language and applying its features typically feel easier and more elegant than, say, needing generics, having to manually box value types, having to use complex reflection classes to do simple dynamic operations, etc.
    To be fair, Java no longer requires manually boxing values, and you don't have to do complex reflection because Java's reflection is too weak.  Okay, that didn't turn out as fair as I planned...

     The key reason I oppose statically-typed languages is that there's some misconception by their creators that static typing somehow gives you more protection.  If misassigning types were my biggest problem, this programming stuff would be comically easy.  Dynamic languages push the onus back on the developers: you have to test your code and test it completely in order to be sure you've not made a mistake.  Well, isn't that necessary anyway?  Static typing doesn't remove the need to test, or even lessen the importance of good testing.  So what does it get you?

     

    It catches typos at compile time rather than run time, so you don't even have to run your tests to find them.  In some cases a typo might cause your program to stop at a bad time and leave your environment in a bad state.  Not that you can't make the same mistakes in statically typed languages, but at least little typos like that are taken off the table.



  • @tster said:

    It catches typos at compile time rather than run time, so you don't even have to run your tests to find them.  In some cases a typo might cause your program to stop at a bad time and leave your environment in a bad state.  Not that you can't make the same mistakes in statically typed languages, but at least little typos like that are taken off the table.
    A fair point, but then you have to decide whether the tradeoff is worth it.  I say no.



  • @bstorer said:

    A fair point, but then you have to decide whether the tradeoff is worth it.  I say no.
     

    In my experience with static / dynamically typed languages (use Java at work, use PHP / Python at home) I'm inclined to agree. Type errors are the least of my worries - even in PHP (which is pretty loosely typed) it rarely causes an issue. 

    Although having said that, the projects I work on at home are usually smaller. If I had to re-do our work apps in Python or PHP my opinion would probably change!



  • @PhillS said:

    @bstorer said:

    A fair point, but then you have to decide whether the tradeoff is worth it.  I say no.
     

    In my experience with static / dynamically typed languages (use Java at work, use PHP / Python at home) I'm inclined to agree. Type errors are the least of my worries - even in PHP (which is pretty loosely typed) it rarely causes an issue. 

    Although having said that, the projects I work on at home are usually smaller. If I had to re-do our work apps in Python or PHP my opinion would probably change!

    If you are dealing with large nested data structures, static types come in very handy because it's easy to forget a layer of unwrapping or something. However, the Java type system tends to really get in the way when using any sort of polymorphism, so that's probably why you feel dynamic types are better.


  • @bstorer said:

    @tster said:
    It catches typos at compile time rather than run time, so you don't even have to run your tests to find them.  In some cases a typo might cause your program to stop at a bad time and leave your environment in a bad state.  Not that you can't make the same mistakes in statically typed languages, but at least little typos like that are taken off the table.
    A fair point, but then you have to decide whether the tradeoff is worth it.  I say no.
     

    It also depends heavily on what type of software you're writing.

    In a web app, where I'm the sole maintainer? Runtime errors are little problem since the entire distribution is under my control.

    In a distributed app, especially shrink-wrap software? I'll take the compile-time constraint, thank you. 



  • @sootzoo said:

    It also depends heavily on what type of software you're writing.

    In a web app, where I'm the sole maintainer? Runtime errors are little problem since the entire distribution is under my control.

    In a distributed app, especially shrink-wrap software? I'll take the compile-time constraint, thank you. 

    I fail to see what the correlation is between the kind of app and the preferred type system.


  • @bstorer said:

    Static typing doesn't remove the need to test, or even lessen the importance of good testing.  So what does it get you?
     

    The biggest benefit I see is the contractual agreements it forges.  This isn't so important within a single application where you control most parts.  But it becomes a bigger issue if your application is just a single component in a much larger system.  There is no language or methodology that removes the need for testing, but from an interop standpoint, it can help with predictability and reliability.



  • @Soviut said:

    @bstorer said:

    Static typing doesn't remove the need to test, or even lessen the importance of good testing.  So what does it get you?
     

    The biggest benefit I see is the contractual agreements it forges.  This isn't so important within a single application where you control most parts.  But it becomes a bigger issue if your application is just a single component in a much larger system.  There is no language or methodology that removes the need for testing, but from an interop standpoint, it can help with predictability and reliability.

    Ah, but you should be programming defensively anyway.  The only people less trustworthy than users are fellow developers, as this site aptly demonstrates.


  • @bstorer said:

    @Soviut said:

    @bstorer said:

    Static typing doesn't remove the need to test, or even lessen the importance of good testing.  So what does it get you?
     

    The biggest benefit I see is the contractual agreements it forges.  This isn't so important within a single application where you control most parts.  But it becomes a bigger issue if your application is just a single component in a much larger system.  There is no language or methodology that removes the need for testing, but from an interop standpoint, it can help with predictability and reliability.

    Ah, but you should be programming defensively anyway.  The only people less trustworthy than users are fellow developers, as this site aptly demonstrates.
     

    That's sort of the point of static types in the first place.  It more or less forces you to program defensively.  You have to explicitly declare what types can go in, what types will come out, what parts can be extended, etc.

    While Python is dynamic and strongly typed, it still doesn't have a generic list that checks types on set, unless you specifically build one I guess.

