They can't force me to use Java



  • I joined a new team recently and have just inherited the system from the author, who is moving "up". He hates Java and is not afraid to say so. However, the powers that be wanted the system written in Java. His response? The data structures, validation and data transformations are described in an excel spreadsheet (name, type, field size/# chars, range, enum, etc.), which is exported to several xml docs which are sliced and diced through several passes of xslt to produce a system comprised ENTIRELY of generated code, from main() to exit(). Naturally, the generated code has zero comments. You want an explanation? Read the text in the spreadsheet!

    He says it's "simpler this way". No messy classes to debug, or trying to find all the places where some variable is used to change its type. Just change the spreadsheet and regenerate the entire system. The entire source repository contains a single excel spreadsheet and the half dozen xslt files ("we don't need to check in generated code"). If you want to see how the logic got a certain way, just look at the difference between the various versions of the spreadsheet.

    Naturally, the whole thing is built using DOS batch files (we have eclipse and ant, but it's simpler this way).
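    To give you a flavor, here is roughly what one of the generated classes looks like once the xslt has chewed through the spreadsheet. The field names and limits here are invented and the comments are mine; the real output has zero comments:

    // Hypothetical reconstruction of one generated class. Field names, sizes and
    // ranges come straight from spreadsheet columns (name, type, size, range, enum).
    public final class CustomerRecord {
        private String name;    // spreadsheet row: name, string, max 40 chars
        private int age;        // spreadsheet row: age, int, range 0..130
        private String status;  // spreadsheet row: status, enum {ACTIVE, CLOSED}

        public void setName(String value) {
            if (value == null || value.length() > 40) {
                throw new IllegalArgumentException("name");
            }
            this.name = value;
        }

        public void setAge(int value) {
            if (value < 0 || value > 130) {
                throw new IllegalArgumentException("age");
            }
            this.age = value;
        }

        public void setStatus(String value) {
            if (!"ACTIVE".equals(value) && !"CLOSED".equals(value)) {
                throw new IllegalArgumentException("status");
            }
            this.status = value;
        }

        public String getName()   { return name; }
        public int getAge()       { return age; }
        public String getStatus() { return status; }
    }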

    W-T-F!?

     



  • I dare say: if someone succeeds in producing non-trivial java code generated by xslt from excel... it might be a very well designed WTF. And if you hate java, you don't even have to look at it.

    Personally, I'd port it to OpenOffice, though... You can just unpack the xml files, skipping the risky excel->xml conversion step.



  • On the one hand, I've got to admire anyone who hates Java.

    On the other hand... I've got to admire anyone who can dynamically generate an entire app from an Excel spreadsheet.

     



  • Generated code does have its place, but not as a replacement for an entire developer.

    The generated code may have produced a small bug.  Trying to find out how to fix the generator so it doesn't create this bug could be nearly impossible, as the generator may have built-in limitations.  If you try to fix the bug directly in the code because it is only a few simple lines of change, it will only re-appear the next time the code is generated.  Re-fixing this bug the same way every time now becomes part of this "build process."

     It's time to either toss the generator and start working in the code itself, making your changes with the support of upper management, or back out while you still can and duck out the door quickly and quietly.
     



  • I have actually dealt with someone who did this sort of thing. You should go to a program lead or other manager and explain that this guy's work is guaranteed to need a rewrite, and not only will that rewrite cost money, but the "code" as it stands now is going to cost the company money every day until that rewrite, because everyone who has to work with it will spend three times as much time (at least) learning to work with it.

    That guy shouldn't be on the project if he doesn't want to write Java. The "anything that gets the job done" attitude only works when the author is expected to maintain his own work forever.

    If it isn't possible to generate javadoc (with actual documentation, not just class and method names), his work isn't Java, it's amateur nonsense, and it's proof that he doesn't understand Java in the least.



  • @KattMan said:

    Generated code does have its place, but not as a replacement for an entire developer.

    The generated code may have produced a small bug.  Trying to find out how to fix the generator so it doesn't create this bug could be nearly impossible, as the generator may have built-in limitations.  If you try to fix the bug directly in the code because it is only a few simple lines of change, it will only re-appear the next time the code is generated.  Re-fixing this bug the same way every time now becomes part of this "build process."

     It's time to either toss the generator and start working in the code itself, making your changes with the support of upper management, or back out while you still can and duck out the door quickly and quietly.
     

    Actually, management seems supportive of the idea of rewriting it all in actual Java. But first I need to understand what the thing is supposed to do. It's just amazing that people do stuff like this. BTW, the code is essentially a table-driven engine, where the "table" is also in the spreadsheet, and generated into java too. The "logic" is sort of like: Rule x, boolean-java-snippet [ and | or ] more rules/snippets, etc., so every if, switch, and loop can be "configured" in the spreadsheet. It's like getting paid to be entertained!
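    To give you an idea, the generated engine boils down to something like the sketch below. This is a heavy paraphrase from memory; the real rule names and snippets are different, and of course there are no comments:

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    // Paraphrased sketch of the generated table-driven engine: each spreadsheet row
    // becomes one Rule whose condition is the "boolean-java-snippet", and the
    // [ and | or ] column decides how the snippets are combined.
    public final class GeneratedRuleEngine {

        interface Rule {
            boolean matches(Map<String, Object> record);
        }

        static final List<Rule> RULES = new ArrayList<Rule>();
        static {
            // hypothetical row 1: status == ACTIVE and age >= 18
            RULES.add(new Rule() {
                public boolean matches(Map<String, Object> r) {
                    return "ACTIVE".equals(r.get("status"))
                            && ((Integer) r.get("age")).intValue() >= 18;
                }
            });
            // hypothetical row 2: vip or balance > 10000
            RULES.add(new Rule() {
                public boolean matches(Map<String, Object> r) {
                    return Boolean.TRUE.equals(r.get("vip"))
                            || ((Integer) r.get("balance")).intValue() > 10000;
                }
            });
        }

        // The generated main() just pushes every record through every rule.
        public static void main(String[] args) {
            Map<String, Object> record = new HashMap<String, Object>();
            record.put("status", "ACTIVE");
            record.put("age", Integer.valueOf(42));
            record.put("vip", Boolean.FALSE);
            record.put("balance", Integer.valueOf(500));
            for (int i = 0; i < RULES.size(); i++) {
                System.out.println("rule " + (i + 1) + ": " + RULES.get(i).matches(record));
            }
        }
    }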

     



  • I can't see what's wrong with it - make him add a column in the spreadsheet for comments/documentation and make the code generator generate the comments and documentation. I mean, developing without actual coding - isn't that what we're all dreaming of? And it's not even some blown-up visio diagram thing - it's just excel and XML.

    Wow.

    And - seriously a "WTF", not for "nobody should be doing it", but for "amazing you can even do that". Tell that guy there's somebody out there who's giving him great kudos...


    PS: what language did he write the code generator in? You're not telling me XSLT is the code generator. Now that would be a WTF maybe..




  • So let me get this... That guy finds Java messy, so instead, he decides to write his app as a bunch of XSLT sheets?



  • @luketheduke said:

    I can't see what's wrong with it - make him add a column in the spreadsheet for comments/documentation and make the code generator generate the comments and documentation. I mean, developing without actual coding - isn't that what we're all dreaming of? And it's not even some blown-up visio diagram thing - it's just excel and XML.

    Wow.

    And - seriously a "WTF", not for "nobody should be doing it", but for "amazing you can even do that". Tell that guy there's somebody out there who's giving him great kudos...


    PS: what language did he write the code generator in? You're not telling me XSLT is the code generator. Now that would be a WTF maybe..


    The code generator is an xslt transform...

     



  • I would like to be a voice of reason here. There are valid reasons to dislike Java. The most important of these is that the execution of Java is non-deterministic. Java also forces (or attempts to, anyway) a programming paradigm which is not appropriate for all modules or applications. Java remains challenging to debug, even though it has gotten much better. The Java standard library continues to change bi-annually, leaving no expectation of even code-level forward compatibility. All that being said, Java sounds like a much better solution than the described system. Also, you should not force an editor (i.e. Eclipse) on developers. ANT is not as commonly available as Make and offers no obvious advantage.



  • I have one word for this: Enterprisey!



  • I would love to see a bit of this beautiful monstrosity in action. It's fascinating, like a well-coded virus or a complicated piece of spyware that rootkits your computer in under 15 KB.



  • Well, at least it isn't written in Java.



  • @ARBaboon said:

    I would like to be a voice of reason here. There are valid reasons to dislike Java. The most important of these is that the execution of Java is non-deterministic....

    What?  Personally I haven't used Java much because I think it's an abuse of object-oriented concepts.  I've heard all the fluff about being slow because it's interpreted and all that, and that doesn't even bother me.  The not-really-platform-independent bit is also a factor (how platform dependent is the standard C library anyway, folks!?).

    But non-deterministic execution!? That just sounds scary. Do you have any examples of this, and what you mean by 'non-deterministic execution' exactly?  (In my mind, "non-deterministic" means "if you provide identical inputs in the same exact temporal sequence, you will get different results." That's markedly something you *don't* want computers to do!)

     



  • I'm not even a good programmer, and I know that Java isn't interpreted.  Look up "JIT".



  • @too_many_usernames said:

    @ARBaboon said:

    I would like to be a voice of reason here. There are valid reasons to dislike Java. The most important of these is that the execution of Java is non-deterministic....

    What?  Personally I haven't used Java much because I think it's an abuse of object-oriented concepts.  I've heard all the fluff about being slow because it's interpreted and all that, and that doesn't even bother me.  The not-really-platform-independent bit is also a factor (how platform dependent is the standard C library anyway, folks!?).

    But non-deterministic execution!? That just sounds scary. Do you have any examples of this, and what you mean by 'non-deterministic execution' exactly?  (In my mind, "non-deterministic" means "if you provide identical inputs in the same exact temporal sequence, you will get different results." That's markedly something you don't want computers to do!)

     

     

    The actions of the garbage collector make the exact execution of a Java process non-deterministic.  However, you can actually change the settings to make it use a type of garbage collection that IS deterministic.  So I have no idea what he is talking about.
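    If anyone wants to see the kind of jitter being argued about here, a crude sketch (the numbers will vary wildly between JVMs and GC settings):

    import java.util.ArrayList;
    import java.util.List;

    // Crude demo: allocate steadily and record the slowest single iteration.
    // The outliers are mostly the garbage collector pausing the program; when and
    // how long it pauses depends on the JVM and its GC configuration.
    public class GcJitterDemo {
        public static void main(String[] args) {
            List<byte[]> live = new ArrayList<byte[]>();
            long worstNanos = 0;
            for (int i = 0; i < 200000; i++) {
                long start = System.nanoTime();
                live.add(new byte[1024]);       // keep some data live
                if (live.size() > 8192) {
                    live.clear();               // drop it, creating garbage to collect
                }
                long elapsed = System.nanoTime() - start;
                if (elapsed > worstNanos) {
                    worstNanos = elapsed;
                }
            }
            System.out.println("worst iteration: " + worstNanos + " ns");
        }
    }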



  • If this actually works, it sounds pretty damn cool. Java is a general-purpose programming language. Some applications don't need to be written in a general-purpose language. The concept of "domain-specific languages" is getting a lot of attention right now. What this guy has done is create a DSL based on spreadsheets. This is actually a business person's dream. Every PHB in the world is going to want to buy a product that allows him to eliminate developers from the payroll and do all the programming himself in Excel. Brilliant!

    I am only half joking.

     



  • @operagost said:

    I'm not even a good programmer, and I know that Java isn't interpreted.  Look up "JIT".

    I consider "JIT" a slightly advanced form of interpreted, because (as far as I know) it "recompiles" every time I run the program fresh - which means the first execution is much slower than subsequent executions (which is really strange in my opinion).  Of course, "compiled" programs are "interpreted" every time you build them.

    I guess I only want my programs to compile when the source changes, not because I decide to run them (also, source code is usually larger than the binary, and I'm still one of those efficiency freaks that doesn't think that it's ok to use vast amounts of storage space just because it's inexpensive).

    I guess the only "non-interpreted" language is machine language! 



  • Picking at Java for interpretedness, garbage-collected-ness, and alleged slowness is old news, guys. In the modern age, the new hotness is bashing Java for being statically typed, not having closures, having a less capable system of generics than some of us would like, still having those weird little types that aren't full objects, contributing to the over-XML-ification of the world, and generally being Enterprisey. Get with the current half-decade.



  • @operagost said:

    I'm not even a good programmer, and I know that Java isn't interpreted.  Look up "JIT".

    While the Sun JVM does have a JIT engine, it also has an interpreter, and most desktop Java applications run most of their code on the interpreter. The JIT engine is only applied to functions which are heavily used, in the default configuration.

    (You can tell the JVM to do something more useful, but most applications don't)
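    For what it's worth, you can watch what your own JVM does: run something like the sketch below with -XX:+PrintCompilation (a standard HotSpot flag) and you'll see which methods get compiled and when. The method and the numbers here are invented:

    // Warm-up sketch: the same method tends to get faster once HotSpot decides it
    // is hot and compiles it. Run with -XX:+PrintCompilation to see the compiler
    // log the method as it gets compiled.
    public class WarmUpDemo {
        // Arbitrary busywork for the JIT to chew on.
        static long work(int n) {
            long sum = 0;
            for (int i = 0; i < n; i++) {
                sum += (i ^ (i << 3)) % 7;
            }
            return sum;
        }

        public static void main(String[] args) {
            for (int batch = 0; batch < 5; batch++) {
                long start = System.nanoTime();
                long sink = 0;
                for (int i = 0; i < 20000; i++) {
                    sink += work(1000);
                }
                long millis = (System.nanoTime() - start) / 1000000;
                System.out.println("batch " + batch + ": " + millis + " ms (checksum " + sink + ")");
            }
        }
    }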



  • @too_many_usernames said:

    I guess the only "non-interpreted" language is machine language! 

     What do you think the CPU does to machine language then?
     



  • @too_many_usernames said:

    The not-really-platform-independent bit is also a factor (how platform dependent is the standard C library anyway, folks!?).

    Yeah, having different int sizes everywhere so that every platform has a bunch of uint32_t-like typedefs. Or my favorite example - every C++ library seems to have its own version of Unicode strings (incompatible with each other, of course).

    And Java also abstracts stuff like filenames and slashes. And you don't have the problem where a developer using a cross-platform library makes Windows and Mac binaries for a proprietary application but is too lazy to recompile the same stuff for Linux.
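    For example (class and path names invented), the same code produces the right separators everywhere:

    import java.io.File;

    // Builds <home>/.myapp/config.xml with the platform's own separator,
    // so the same class runs unchanged on Windows, Mac and Linux.
    public class PathDemo {
        public static void main(String[] args) {
            File dir = new File(System.getProperty("user.home"), ".myapp");
            File config = new File(dir, "config.xml");
            System.out.println(config.getPath());
            System.out.println("separator is: " + File.separator);
        }
    }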



  • @luketheduke said:

    PS: what language did he write the code generator in? You're not telling me XSLT is the code generator. Now that would be a WTF maybe..

    I once had the idea to write an XSLT to JavaScript+DOM compiler... in XSLT.



  • @asuffield said:

    @operagost said:

    I'm not even a good programmer, and I know that Java isn't interpreted.  Look up "JIT".

    While the Sun JVM does have a JIT engine, it also has an interpreter, and most desktop Java applications run most of their code on the interpreter. The JIT engine is only applied to functions which are heavily used, in the default configuration.

    (You can tell the JVM to do something more useful, but most applications don't)

    Holy cow. Do you really believe this? No wonder you believe Java is slower than C++. Please bring your information up-to-date by reading this.

    All Java code is JIT'd. The JIT compiler is on by default, for every class, unless explicitly disabled.



  • @VGR said:

    @asuffield said:

    @operagost said:

    I'm not even a good programmer, and I know that Java isn't interpreted.  Look up "JIT".

    While the Sun JVM does have a JIT engine, it also has an interpreter, and most desktop Java applications run most of their code on the interpreter. The JIT engine is only applied to functions which are heavily used, in the default configuration.

    (You can tell the JVM to do something more useful, but most applications don't)

    Holy cow. Do you really believe this? No wonder you believe Java is slower than C++. Please bring your information up-to-date by reading this.

    The information on that page is either out of date or wrong (I'm not sure if it was ever accurate). One of us has read the source code to the Sun JVM and knows what it actually does.



  • What's wrong with generated code? Do you also not use GUI designers in your IDE, because they generate code? Or any kind of DAL generators, because it's so much more fun to write a DAL anytime you need a database-driven application? Or even any sort of language that is compiled, because your source code turns into generated machine language?



  • @why? said:

    What's wrong with generated code? Do you also not use GUI designers in your IDE, because they generate code? Or any kind of DAL generators, because it's so much more fun to write a DAL anytime you need a database-driven application? Or even any sort of language that is compiled, because your source code turns into generated machine language?

    Generated code is wrong if you are forced to manually modify it, especially when

    a) it is so complex that you do not fully understand it

    and/or

    b) it is likely that you have to do the generation step again. 


     



  • @zlogic said:

    @too_many_usernames said:

    The not-really-platform-independent bit is also a factor (how platform dependent is the standard C library anyway, folks!?).

    Yeah, having different int sizes everywhere so that every platform has a bunch of uint32_t-like typedefs. Or my favorite example - every C++ library seems to have its own version of Unicode strings (incompatible with each other, of course).

    And Java also abstracts stuff like filenames and slashes. And you don't have the problem where a developer using a cross-platform library makes Windows and Mac binaries for a proprietary application but is too lazy to recompile the same stuff for Linux.

    Yeah, I still want to smack K&R for not picking size-defined data types; that's probably the only language-specific issue I have with C and its variants (from a portability standpoint only!!! There are a couple other oddities here and there).

    The bit about incompatible libraries for things like Unicode: well, I blame the implementers on that one, not the "language".  Of course, that said, while the purist in me wants to say "that's the library, not the language" I do realize that when people talk about "computer language" they generally refer both to the pure syntax and semantics as well as the base libraries, because without the libraries most languages are just academic exercises.

    That said, it would be interesting to see more efforts in actually standardizing the "standard" libraries, as well as pushing for size-standard data primitives.
     

    However, for most applications, even the data-type discrepancies aren't a big deal, and if they are, you're not going to use standard libraries anyway because they probably don't correctly handle the specific corner cases you need.
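    For contrast, Java does pin its primitive widths down in the language spec, regardless of platform; a trivial check:

    // Primitive widths are fixed by the Java language spec on every platform.
    public class PrimitiveSizes {
        public static void main(String[] args) {
            System.out.println("byte : " + Byte.SIZE + " bits");      // always 8
            System.out.println("short: " + Short.SIZE + " bits");     // always 16
            System.out.println("int  : " + Integer.SIZE + " bits");   // always 32
            System.out.println("long : " + Long.SIZE + " bits");      // always 64
            System.out.println("char : " + Character.SIZE + " bits"); // always 16
        }
    }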



  • @too_many_usernames said:

    @zlogic said:

    Yeah, having different int sizes everywhere so that every platform has a bunch of uint32_t-like typedefs. Or my favorite example - every C++ library seems to have its own version of Unicode strings (incompatible with each other, of course).

    Yeah, I still want to smack K&R for not picking size-defined data types; that's probably the only language-specific issue I have with C and its variants (from a portability standpoint only!!! There are a couple other oddities here and there).

    ...

    That said, it would be interesting to see more efforts in actually standardizing the "standard" libraries, as well as pushing for size-standard data primitives.

    C99 requires definitions of uint8_t, uint16_t, uint32_t, int8_t, int16_t and int32_t, and optionally uint64_t and int64_t if the platform supports 64-bit math.

    Most such common issues with C were fixed in C99 (you guys do realise that 1999 was 8 years ago, right?)



  • @asuffield said:

    C99 requires definitions of uint8_t, uint16_t, uint32_t, int8_t, int16_t and int32_t, and optionally uint64_t and int64_t if the platform supports 64-bit math.

    Well, not quite. You’re right, except that the “optionally ... if the platform supports N-bit math” applies to all of those, not just 64. It’s just that systems that don’t support 8-, 16-, or 32-bit math are rather less common these days.



  • @Random832 said:

    @asuffield said:

    C99 requires definitions of uint8_t, uint16_t, uint32_t, int8_t, int16_t and int32_t, and optionally uint64_t and int64_t if the platform supports 64-bit math.

    Well, not quite. You’re right, except that the “optionally ... if the platform supports N-bit math” applies to all of those, not just 64. It’s just that systems that don’t support 8-, 16-, or 32-bit math are rather less common these days.

    C99 requires that all platforms implement 8-bit and 32-bit integer modes. I know offhand that 64-bit mode is optional. I'm not sure about 16-bit mode.



  • What's with the obsession with putting "_t" on the end of new types? What's wrong with just int8?



  • @Thief^ said:

    What's with the obsession with putting "_t" on the end of new types? What's wrong with just int8?

    The suffix _t has been explicitly reserved for use by the language and system specifications since (at least) the 1980s. Anybody who creates their own type ending in _t, and finds it conflicts with a type introduced into a new version of the specifications, has only themselves to blame. (Also, anything beginning with an underscore is reserved). All other type names are open for use by software, so new versions of the specifications can't use any other names.



  • @asuffield said:

    @Random832 said:
    @asuffield said:

    C99 requires definitions of uint8_t, uint16_t, uint32_t, int8_t,
    int16_t and int32_t, and optionally uint64_t and int64_t if the
    platform supports 64-bit math.

    Well, not quite. You’re right, except that the “optionally ... if the platform supports N-bit math” applies to all of those, not just 64. It’s just that systems that don’t support 8-, 16-, or 32-bit math are rather less common these days.

    C99 requires that all platforms implement 8-bit and 32-bit integer modes. I know offhand that 64-bit mode is optional. I'm not sure about 16-bit mode.

    @N1124 said:

    7.18.1.1 Exact-width integer types

    1 The typedef name intN_t designates a signed integer type with width N, no padding bits, and a two’s complement representation. Thus, int8_t denotes a signed integer type with a width of exactly 8 bits.

    2 The typedef name uintN_t designates an unsigned integer type with width N. Thus, uint24_t denotes an unsigned integer type with a width of exactly 24 bits.

    3 These types are optional. However, if an implementation provides integer types with widths of 8, 16, 32, or 64 bits, no padding bits, and (for the signed types) that have a two’s complement representation, it shall define the corresponding typedef names.

    The types that are required are the int_leastN_t and int_fastN_t ones; is that what you’re thinking of? C99 doesn’t even require that implementations use two’s complement, though; it’s a step up from C89, since it at least says that implementations must use either two’s complement, one’s complement, or signed-magnitude, and (I believe) must specify which they are using. Or perhaps you’re referring to the requirement that implementations provide types that can support up to at least 64* bits; and, implicitly, they are also required not to provide types that can support less than eight bits (though, beyond that, they are not required to support any particular number of bits, or to support multiple distinct sizes of integers).

    *Support for long long, a type that must be at least 64 bits, is not optional. 



  • [accidentally hit reply instead of edit above]



  • @Random832 said:

    @N1124 said:

    7.18.1.1 Exact-width integer types

    1 The typedef name intN_t designates a signed integer type with width N, no padding bits, and a two’s complement representation. Thus, int8_t denotes a signed integer type with a width of exactly 8 bits.

    2 The typedef name uintN_t designates an unsigned integer type with width N. Thus, uint24_t denotes an unsigned integer type with a width of exactly 24 bits.

    3 These types are optional. However, if an implementation provides integer types with widths of 8, 16, 32, or 64 bits, no padding bits, and (for the signed types) that have a two’s complement representation, it shall define the corresponding typedef names.

    The types that are required are the int_leastN_t and int_fastN_t ones; is that what you’re thinking of?

    I distinctly remember that somewhere there is a requirement for an implementation to provide at least one type that's an 8-bit integer, and at least one type that's a 32-bit integer; combined with 7.18.1.1.3, that implies a requirement for the relevant *intN_t types to exist. I can't find that rule offhand, though... it's not something you can easily look up in the index. It may also be an implication itself - sometimes I find myself wishing for a concordance to the C99 spec; the people who wrote it were being far too clever in some parts.

    I suppose a conforming implementation that didn't use 2's complement could get away with leaving them out.

