RIP Java in the Enterprise



  • @Bort Yeah, no. The JVM's features exist purely to make Java work as a language, and while you can write something like Scala to work on it, you're restricted in some odd ways. CIL supports more things than C#, VB, or F# even use.

    I mean, maybe in recent years that has changed a bit. I don't know.


  • Impossible Mission - B

    @Yamikuronue said in RIP Java in the Enterprise:

    Suggestions?

    Personally, I'm partial to Boo.

    It's a CLR language whose syntax was heavily inspired by Python (and Ruby to a lesser degree), but semantically it's a statically typed language with (almost) all the same core features as C#*. But the thing that really sets it apart is its metaprogramming support.

    It's got four different mechanisms to run your own code within the compiler pipeline, in which your code receives an AST and gets to operate on it. And because it's based on a statically typed object-oriented class hierarchy, you can use the Visitor Pattern to narrow down exactly what you care about working on, instead of having to use pattern matching etc. (Or you can use pattern matching inside your visitors for even more fine-grained control.)

    The desire for metaprogramming ability drives a lot of the decisions in the language design. The entire AST hierarchy is built with the Visitor Pattern and code rewriting in mind; the parser has a syntax for AST literals, with language-level support for quasiquotation and interpolation; and the compiler pipeline itself is highly flexible, to the point where you can specify a custom pipeline as a compiler argument in which you've added, removed, or replaced various passes.

    (Disclosure: As of about a year ago, I'm a contributor to the project.)

    *Notably missing: fixed and pinning, and async/await. I'm currently working on revamping the backend, eliminating a few obstacles that get in the way of implementing the latter feature, because it's really awesome to have.


  • Notification Spam Recipient

    @masonwheeler said in RIP Java in the Enterprise:

    @Yamikuronue said in RIP Java in the Enterprise:

    PRs into my brain

    I'm sure @Tsaukpaetra could find some way to :giggity: that...

    Kink 5b1 (PiE) has got you covered!



  • @Magus said in RIP Java in the Enterprise:

    CIL supports more things than C#, VB, or F# even use.

    Example?

    The CLR was designed to run C# and VB.Net. And VB.Net was made more like C# to facilitate this. They're not that different as languages. So it's built around one tight-knit family of 2 languages instead of a single language. Features have been added to the CLR over time to make it support a wider variety of programming forms - like the DLR - but only when relevant features were added to the flagship language (C#'s dynamic type). The JVM has done similar things (invokedynamic).

    I'll grant that I too prefer the CLR as a runtime (reified generics, stack-allocated custom types), but these features, again, are there for C# (generics, structs), not for some hypothetical, future, 3rd party language. Other languages have to work within the bounds of C# or make themselves more C#-like (VB -> VB.Net, OCaml -> F#), if only for the sake of interop, which would apply no matter the platform.

    ...

    In fact, wait... Java doesn't have a dynamic type like C#. It looks like invokedynamic was added in 1.7 even though Java didn't need it yet (it now uses it to build lambdas in 1.8). The original reason for adding it was to support dynamic languages - 3rd party languages. So Oracle has actually gone farther out of its way to support 3rd party languages than Microsoft, and your earlier comment is the opposite of true.

    If 3rd party support was the intent, it was probably because they noticed people don't like Java the language and there are several substitutes now (Groovy, Scala, Clojure, Kotlin, JRuby), and they license the JVM, not the Java language.



  • @Bort Example? Sure, CIL has builtin support for the tail-recursion optimization required to make recursion safe from stack overflows. They never used that in C# or VB.NET by choice, but they were careful to allow such facilities to be used by other languages.


  • area_pol

    @Yamikuronue said in RIP Java in the Enterprise:

    I do want to learn Lisp someday. I like that feeling of my brain stretching and expanding and digesting new ways of working, and I think Lisp will probably provide me that.

    I used to think that, but when I read some basics about Lisp, it turned out the core idea is trivial:
    the code (not just functions, but the full abstract syntax tree) is represented as a runtime object, and can be modified at runtime.
    It is not even statically typed. My impression was that Lisp was designed to be easy for the interpreter/runtime but not for the user.

    If you are looking for a challenge, try something like Haskell. Functional programming makes much more sense with static typing.



  • @Magus said in RIP Java in the Enterprise:

    Sure, CIL has builtin support for the tail-recursion optimization required to make recursion safe from stack overflows.

    Neat.

    @Magus said in RIP Java in the Enterprise:

    They never used that in C# or VB.NET by choice, but they were careful to allow such facilities to be used by other languages.

    Oh, please...

    I can't find any explanation as to why they included the tail. instruction. Could also be vestigial.

    Anyway, my reaction is to your painting Microsoft as this forward-thinking, inclusive company, and Sun/Oracle or whoever else as not.

    Sun developed a language and vm that ran on multiple platforms (Windows, Solaris, Linux), with some consistency.

    Microsoft made their own version of Java and tried to tie it to Windows only, to exclude other platforms.

    When Microsoft made the CLR, they designed it, in principle, to run on multiple platforms, and that's great, but they again excluded other platforms by only making a Windows implementation.

    Not to mention C# and the CLR are a blatant rip-off of Java, but like mason said above, MS did improve on it.

    Java was GPL-licensed and had the JCP instituted. C# started under a shared ECMA/ISO standard, but since 3.0 it has been all Microsoft.

    I work on and prefer C#/.Net to Java/JVM. It just sounds like you're full of it is all.



  • @Bort We aren't really saying very different things. The only point I've been trying to make is that the CLR is better for new languages than the JVM, because it was designed to be. You're bringing in a whole lot of other things about which is the better platform overall, still coming up with the CLR, but only after making sure to point out that it's bad. Sure, great, but my point stands. I even provided an example when you asked for one, and all you did was try to punch more holes in things I didn't say. Have fun?


  • ♿ (Parody)

    @Magus said in RIP Java in the Enterprise:

    Sure, great, but my point stands. I even provided an example when you asked for one, and all you did was try to punch more holes in things I didn't say. Have fun?

    And he provided a counter example to your point. Your point may "stand" but I don't see how it's really very convincing.



  • @boomzilla He said that the JVM, in recent years, has added things not needed for other languages. You might also notice that I never said anything to the contrary.

    Sure, I'd say that means they're playing catchup now, but I also don't consider that negative. I'm happy to see new things supported wherever they are.


  • BINNED

    @Onyx said in RIP Java in the Enterprise:

    @Yamikuronue said in RIP Java in the Enterprise:

    I like that feeling of my brain stretching and expanding

    Sounds like a good way to get a headache, honestly.

    🐠

    At the very least it will be pure.

    @Yamikuronue said in RIP Java in the Enterprise:

    I do want to learn Lisp someday. I like that feeling of my brain stretching and expanding and digesting new ways of working, and I think Lisp will probably provide me that. I doubt it'll be particularly useful, but not everything has to be useful.

    C++14 lambdas can do monads nomads too :giggity: and at least it will increase your market value.


  • Trolleybus Mechanic

    Glad I'm a .Net coder and not tied to a closed-source, profit-by-license driven framework.


  • Discourse touched me in a no-no place

    @masonwheeler said in RIP Java in the Enterprise:

    in which your code receives an AST and gets to operate on it

    How does that sort of thing square with handling of custom keywords and syntactically-custom operators? The AST stage usually comes a bit too late for that as you're already tokenized and parsed at that point. (The really interesting semantics requires relevant syntax.)

    I'm entirely happy with the idea of embedding one syntax within another. It's really nice to delegate a problem to the right level of abstraction for it.



  • @Adynathos said in RIP Java in the Enterprise:

    @Yamikuronue said in RIP Java in the Enterprise:

    I do want to learn Lisp someday. I like that feeling of my brain stretching and expanding and digesting new ways of working, and I think Lisp will probably provide me that.
    

    I used to think that, but when I read some basics about Lisp, it turned out the core idea is trivial:
    the code (not just functions, but the full abstract syntax tree) is represented as a runtime object, and can be modified at runtime.
    It is not even statically typed. My impression was that Lisp was designed to be easy for the interpreter/runtime but not for the user.

    The interesting thing (to me, anyway, but then I am a Lisp Weenie) is that this is purely a historical accident.

    Strap yourself in, folks, this is gonna be a long one.

    The original design for Lisp was far more conventional, at least by today's standards; while they never finished the design of the actual language, McCarthy did publish his designs using the so-called meta-expression syntax, which shows where he was going with it. An example given in m-expression LISP from his paper, with the implicit cond syntax he was using at the time, was:

    subst [x; y; z] =
        [atom [z] ->
            [eq [z; y] -> x; T -> z];
         T -> cons [subst [x; y; car [z]]; subst [x; y; cdr [z]]]]
    

    This is only an approximation of the intended LISP syntax; McCarthy's original paper used the sort of algebraic syntax that the Algol committee - which he was on - would later call a 'publication language', which wasn't limited to the character sets or parsing abilities of the time. Part of the notation was the 'implicit conditional', which said that you could have a conditional expression simply by having a right arrow after an arbitrary expression - if the expression had what we today would call a truthy value, then the first expression in the (implicit) list following the right arrow would be evaluated, otherwise the second value in the list would be. Just how this would have worked in the actual compiled code isn't entirely clear.

    However, the expectation was that writing the Lisp compiler would take years, if not decades, so he originally started out by having his graduate student Steve 'Slug' Russell hand-compile the m-expression code into IBM 704 assembly language as they were experimenting with it, so that they could get a better feel for how the compiler should work, and also to test his then-novel idea of a self-hosted language.

    This is where things take an odd turn. In his original paper, McCarthy wanted to show that LISP would be a 'universal' language in the sense of a Turing Machine or the λ-calculus (which he was drawing heavily on), so he devised two functions for this purpose: apply, which could take a function name and a list of symbols and apply that function to that list as the function's arguments, and eval, which could evaluate a list as a series of symbolic functions (s-functions) representing a program and call apply on them to evaluate the list as function calls. He called this the 'symbolic expression' form, and the expressions became known as s-expressions (later usually abbreviated as sexps or sexprs).

    The functions as given in the paper were meant solely as a proof of concept, to show that he could express the core idea of λ-calculus in his new language.

    However, as he was grinding along through the experimental code he and McCarthy were writing, Slug decided that it would be an interesting experiment to implement these two functions as actual programs. Once he had done this, he realized immediately that what he had done was write an interpreter for the s-expression language. He immediately set up a simple but functional scaffolding to let the program read in and execute a series of expressions in a continuous loop - the first Read-Eval-Print Loop (REPL), which would become the hallmark of LISP and most other interactive languages to follow - and showed McCarthy the results.

    This changed everything: they now had a (mostly) working language using just the tools they'd already developed. The compiler was put on hold while they experimented with this new thing.

    But fate had another twist in store: while the IBM 704 ran these tests in batch mode, a new computer had just been delivered to MIT, the Lincoln Labs TX-0. The Tixo had been built as a test-bed for another system, the TX-1 (renamed the TX-2 after a comprehensive redesign), which was to use two untried technologies: transistorized logic and ferrite-core memory. The TX-1 in turn was to test new innovations meant for the SAGE early warning system, which was in development at the time.

    The TX-0 was a much simpler system than the TX-2, meant mainly to confirm that the transistor systems were reliable and then test the core memory racks before they were installed in the TX-2. To make this easier, the TX-0 could be operated from the console interactively, and even had a simple vector display that could be manipulated using a light gun/light pen.

    When the TX-2 was running, the TX-0 was donated to MIT on a long-term loan (with no end date). It was set up in a spare office - it was small enough to fit in a spare office - and pretty much any MIT student who wanted to could use it.

    Slug immediately saw that he could re-write the interpreter for this new toy (even given the limitations of its memory and instruction set) and be able to do something previously unimaginable: type code at the console and have it respond immediately. This Holy Shit! moment, and his subsequent development of the first garbage collector to cope with the 4Kword (18-bit) memory that they had left the Tixo with, became one of the defining aspects of LISP for the next twenty years.

    One side effect of all this was that anyone who wanted to write their own LISP dialect - in LISP itself as a meta-circular evaluator, or converting it to something else first - need only take McCarthy's paper and tweak the design to their fancy. While the oft-repeated claim that you could implement LISP in one page of code is a bit of an exaggeration (the pair of eval and apply do indeed fit on half a page of LISP code, but that assumes that you have a number of functions already on hand), the McCarthy-Russell design has been the go-to example of how to work in LISP ever since.

    Anyway, if you want to learn LISP, the basic model is simple: you have atoms, which are things like symbols (variables), numbers (usually bignums with an implicit conversion from big integers to big flonums when needed), and strings (technically, you don't even need a literal form for those, so long as you can convert symbols to strings, but that's a pain); and lists, which are an open parenthesis, one or more atoms or lists separated by spaces, and a close paren:

    THIS-IS-AN-ATOM
    (THIS IS A LIST)
    (THIS IS A LIST (WITH A NESTED LIST))

    An expression is simply a list in which the first element is a symbol bound to a function (that is to say, it's the name of a function) and the rest of the elements are the function's arguments:

    (plus 2 2)
    (+ 2 2)       ; same thing
    
    (sqrt (+ (* 3 3) (* 4 4)))
    
    (display 'an-atom)     ;  I will explain the quote mark in a moment
    

    There are only a few special cases that LISP languages usually have; the interpreter has to check the first element (called the CAR of the list, for historical reasons relating to the IBM 704 mainframe's register set) for each of these first, and only try to apply the car if it isn't one of those.

    First, because the interpreter will by default try to use any list it comes across as a function application, there is the (quote) special form, which can be either written out or abbreviated as a single quote character:

    (quote an-atom)    ; gives the quoted symbol "an-atom"
    'an-atom                  ; the same quoted value
    

    Next, the (if) and (cond) forms have to be special forms, because they don't evaluate the consequent expressions unless the conditions they are testing are true:

    (if (nullp a-list)     ; is a-list empty?
         nil                        ; if so, return an empty list
        (foo a-list))       ; otherwise, apply foo to a-list
    

    The binding form, usually named something like (define), takes an unquoted symbol and a value and binds that value to the symbol:

     (define one 1)
    

    The (lambda) form takes a list of arguments and a list of expressions, and returns a function using the expressions as the function body:

    (lambda (opp adj)
      (sqrt (+ (* opp opp) (* adj adj))))
    

    Now, depending on the language (and specifically depending on whether it treats functions as normal variables or puts them in a separate namespace), you might be able to use define to bind a function to a symbol, or you might have to use a special function such as Common Lisp's (defun):

    ;; in Common Lisp
    (defun hypo (opp adj)
        (sqrt (+ (* opp opp) (* adj adj))))
    
    ;; in Scheme using an explicit lambda
    (define hypo 
      (lambda (opp adj)
        (sqrt (+ (* opp opp) (* adj adj)))))
    
     ;;; in Scheme using the abbreviated syntax
     (define (hypo opp adj)
        (sqrt (+ (* opp opp) (* adj adj))))
    

    The (setq) (or (setf) or (set!), depending on the dialect) form changes a value previously assigned by (define); in some Lisp languages, it is one of the few instances where the value of one of the arguments is changed. The values of pairs (a pair of pointers) can also be assigned by functions, which are called (rplaca) and (rplacd) in Common Lisp, and (set-car!) and (set-cdr!) in Scheme.
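
    For instance (a minimal Scheme-flavored sketch; counter and p are made-up names, just for illustration):

    (define counter 0)
    (set! counter (+ counter 1))   ; counter ==> 1

    (define p (cons 'a 'b))        ; p ==> (a . b)
    (set-car! p 'c)                ; p ==> (c . b)
    (set-cdr! p '(d e))            ; p ==> (c d e)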

    The (cons) (pair constructor), (list) (list constructor), (append), (car) (first element of a list), and (cdr) (rest of a list) functions are usually built-in primitives as well, for a number of reasons. The (cons) form takes two values and puts them into a pair of pointers to those values; this pair is often called a cons cell for this reason.

       (cons 'an-atom 'another-atom)
    

    ==> (an-atom . another-atom) ; note the period

    Now, it does not build a list per se; a list is actually a chain of pairs with the cdr element of the last pair being a null pointer. If you use (cons) on values where the second value is a list, it in effect creates a new list with the second list as its cdr element:

    (cons 'this '(is a list))
    

    ==> (this is a list)

    Pairs are normally written as the elements separated by a space, a period, and another space; lists can be displayed this way too, but are usually shown without the periods or the terminating null pointer.

    (this . (is . (a . (list . nil))))  ; list in paired form
    (this is a list)    ;; same list in standard form
    

    The (car) and (cdr) forms - which are often aliased as (first) and (rest), for obvious reasons - get the first and second parts of a pair, respectively.

    (car '(an-atom . another-atom))
    

    ==> an-atom

    (cdr   '(an-atom . another-atom))
    

    ==> another-atom

    Since lists are just chains of pairs, these are often used for walking through a list:

      (car '(this is a list))
    

    ==> this

      (cdr '(this is a list))
    

    ==> (is a list)
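
    Putting those together, here is a minimal Scheme-flavored sketch of walking a list recursively (my-length is a made-up name; real Lisps already provide length):

      (define (my-length lst)
        (if (null? lst)                     ; empty list?
            0                               ; if so, its length is zero
            (+ 1 (my-length (cdr lst)))))   ; else count this pair and recurse on the cdr

      (my-length '(this is a list))

    ==> 4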

    There are usually a number of other special forms, including those relating to macros (which were developed around 1963), but that's the general layout of things.
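
    To give just a taste of the macro side (a minimal Common Lisp sketch; my-when is a made-up name, and real Lisps already provide when): a macro receives its arguments as unevaluated code and returns a new expression that is compiled in its place.

    (defmacro my-when (test &rest body)
      `(if ,test
           (progn ,@body)   ; run the body only when the test is true
           nil))

    (my-when (> 3 2) (print "yes"))   ; expands to (if (> 3 2) (progn (print "yes")) nil)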


  • I survived the hour long Uno hand

    @ScholRLEA said in RIP Java in the Enterprise:

    Part of the notation was the 'implicit conditional', which said that you could have a conditional expression simply by having a right arrow after an arbitrary expression - if the expression had what we today would call a truthy value, then the first expression in the (implicit) list following the right arrow would be evaluated, otherwise the second value in the list would be. Just how this would have worked in the actual compiled code isn't entirely clear.

    Ternaries!



  • @Yamikuronue said in RIP Java in the Enterprise:

    Ternaries!

    I learnt the other day that these are also called Elvis Operators, which pleased me. ?:


  • Grade A Premium Asshole

    @Yamikuronue said in RIP Java in the Enterprise:

    I like that feeling of my brain stretching and expanding and digesting new ways of working

    Learning or aneurysm? With LISP it could go either way.



  • @Yamikuronue Chains of ternaries, actually. Go figure. In most modern Lisps, it comes out as something like:

    (cond
      ((is-this-true? foo) (bar quux))
      ((is-this-other-thing-true? baz) (corge fred))
      (else (bing bang)))
    

    Which makes one version of the Scheme (eval) just:

    (define (eval exp env)
      (cond ((self-evaluating? exp) exp)
            ((variable? exp) (lookup-variable-value exp env))
            ((quoted? exp) (text-of-quotation exp))
            ((assignment? exp) (eval-assignment exp env))
            ((definition? exp) (eval-definition exp env))
            ((if? exp) (eval-if exp env))
            ((lambda? exp)
             (make-procedure (lambda-parameters exp)
                             (lambda-body exp)
                             env))
            ((begin? exp) 
             (eval-sequence (begin-actions exp) env))
            ((cond? exp) (eval (cond->if exp) env))
            ((application? exp)
             (apply (eval (operator exp) env)
                    (list-of-values (operands exp) env)))
            (else
             (error "Unknown expression type -- EVAL" exp))))
    

    Mind you, all these predicates have to be defined somewhere (which the book it is from, Structure and Interpretation of Computer Programs, does); a messier version that does a lot of these tests explicitly is given in the Abelson-Sussman lectures, which went with an earlier version of that book.

    BTW, if you want to learn the Lisp (or more specifically, Scheme) core language, the first four or so lectures are still pretty damn good despite their age. They only show the bare minimum of the language they need for the lectures, though, which given that it is a teensy language to begin with means that they spend a lot of time reinventing the wheel, but they do such a good job of it, and show such enthusiasm about it, that it doesn't seem forced (to me, at any rate).


  • Grade A Premium Asshole

    @ScholRLEA said in RIP Java in the Enterprise:

    Strap yourself in, folks, this is gonna be a long one.

    Most of your posts are.

    Also, :giggity:



  • @Adynathos said in RIP Java in the Enterprise:

    • if you have a desktop Java program, its best to bundle the JRE inside it (after all you don't install Oracle's JRE for Eclipse / Android IDE)

    Who doesn't install the Oracle JDK for Eclipse? 'Cuz I know if I don't, Eclipse starts bitching at me.


  • Impossible Mission - B

    @dkf said in RIP Java in the Enterprise:

    @masonwheeler said in RIP Java in the Enterprise:

    in which your code receives an AST and gets to operate on it

    How does that sort of thing square with handling of custom keywords and syntactically-custom operators? The AST stage usually comes a bit too late for that as you're already tokenized and parsed at that point.

    It really doesn't.

    (The really interesting semantics requires relevant syntax.)

    Not necessarily. A lot of things that are implemented as syntax in C# are implemented as metaprogramming in Boo. For example, the using construct:

    macro using:
    	expansion = using.Body
    	for expression as Expression in reversed(using.Arguments):
    		temp = ReferenceExpression(Context.GetUniqueName("using", "disposable"))
    		assignment = [| $temp = $expression as System.IDisposable |].withLexicalInfoFrom(expression)
    		
    		expansion = [|
    			$assignment
    			try:
    				$expansion
    			ensure:
    				if $temp is not null:
    					$temp.Dispose()
    					$temp = null
    		|]
    		
    	return expansion
    

    If you know about [|quasiquotes with $interpolation|], it's not hard to figure out exactly what this code is doing.


  • Garbage Person

    [Image: a "my other car is a cdr" bumper sticker]


  • Discourse touched me in a no-no place

    @masonwheeler said in RIP Java in the Enterprise:

    A lot of things that are implemented as syntax in C# are implemented as metaprogramming in Boo.

    Yes, but can you also implement your own entirely new operators? 👿

    Looking at the using definition you present, it's mostly obvious but not entirely; what does withLexicalInfoFrom actually copy across? (I'm also not sure why you set $temp to null at the end of the ensure block, given that user code cannot refer to it, but that's nothing to do with me.)

    By comparison, here's what I'd do in a language that I like…

    proc using {variable disposeable body} {
        upvar 1 $variable v
        set v $disposeable
        try {
            uplevel 1 $body
        } finally {
            $disposeable destroy;  # Uses a different method name “because”
            unset v
        }
    }
    

    It doesn't handle doing a collection of assignments, but there's no particular reason why that's necessary in any case. (I do know how to fix it.)


  • Impossible Mission - B

    @dkf said in RIP Java in the Enterprise:

    Yes, but can you also implement your own entirely new operators? 👿

    No. For that you need to change the parser.

    Looking at the using definition you present, it's mostly obvious but not entirely; what does withLexicalInfoFrom actually copy across?

    The lexical info from the old AST node. (It's used by the codegen to generate debug info so you can map program execution position to source code locations.)

    (I'm also not sure why you set $temp to null at the end of the ensure block, given that user code cannot refer to it, but that's nothing to do with me.)

    Good point. I didn't write that, but you're right that that's probably not needed.

    @dkf said in RIP Java in the Enterprise:

    By comparison, here's what I'd do in a language that I like…

    What language is that?


  • Discourse touched me in a no-no place

    @masonwheeler said in RIP Java in the Enterprise:

    What language is that?

    Tcl.

    I might've written quite a bit of the modern implementation of it…


  • Discourse touched me in a no-no place

    @masonwheeler said in RIP Java in the Enterprise:

    What language is that?

    A misspelled one.

    @dkf said in RIP Java in the Enterprise:

    disposeable


  • :belt_onion:

    @Adynathos said in RIP Java in the Enterprise:

    there seems to be no GUI library

    O.o


  • BINNED


  • Discourse touched me in a no-no place

    @loopback0 I could have called it BertTheLonelyDonkey too, but then the syntax errors given to users would have been a bit less discoverable. ;)



  • @dse 🤦

    Another proof that when a well-reputed news source starts to publish faulty content, the others still follow quickly.



  • @sloosecannon said in RIP Java in the Enterprise:

    @Adynathos said in RIP Java in the Enterprise:

    there seems to be no GUI library

    O.o

    You know he's talking about .NET Core and not .NET Framework? While .NET Framework has 3 GUI APIs built in, .NET Core has none.


  • :belt_onion:

    @powerlord welllllll....
    OK that makes more sense.

    But .NET Core != C#



  • @sloosecannon said in RIP Java in the Enterprise:

    @powerlord welllllll....
    OK that makes more sense.

    But .NET Core != C#

    It's the cross-platform version of the C# Standard Library.


  • BINNED

    @powerlord said in RIP Java in the Enterprise:

    @sloosecannon said in RIP Java in the Enterprise:

    @powerlord welllllll....
    OK that makes more sense.

    But .NET Core != C#

    It's the cross-platform version of the C# Standard Library.

    And sooner or later will be the C#



  • @powerlord C# is the language, not the platform/framework.



  • @dse said in RIP Java in the Enterprise:

    And sooner or later will be the C#

    Nope. You still need GDI+ and related functions to code WinForms applications, something that will never be added to .NET Core. (Actually, the original goal of .NET Core was to create a lightweight framework that does not need graphics support, so it could be installed on Windows Server Core editions, which lack GDI+ and the like. It only shaped up as a portable version of the .NET Framework after Xamarin entered the picture.)

    Btw, at some point in the future both of them will merge into .NET Standard.


  • FoxDev

    @powerlord said in RIP Java in the Enterprise:

    C# Standard Library

    There's no such thing: you're thinking of the .NET Base Class Library.



  • @RaceProUK Here's your :pendant:



  • @cheong Do you usually make a habit of trying to program in a language without its standard library?

    'cause I'd hate to have to write my own low-level file I/O routines, among other things.



  • @cheong said in RIP Java in the Enterprise:

    Nope. You still need GDI+ and related functions to code WinForms applications, something that will never be added to .NET Core. (Actually, the original goal of .NET Core was to create a lightweight framework that does not need graphics support, so it could be installed on Windows Server Core editions, which lack GDI+ and the like. It only shaped up as a portable version of the .NET Framework after Xamarin entered the picture.)

    To be fair, it wouldn't hurt for .NET to have a proper image manipulation component that isn't dependent on GDI. Things like "loading a bitmap, turning it into greyscale, scaling it down and saving it to a file" really shouldn't need a dependency on any Windows components.



  • @Maciejasjmj Back in the .NET 2.0 days you could manipulate images without GDI components, but it was 100 times slower than using PInvoke madness. There are 3rd party libs but they all either cost money or are just a wrapper around PInvoke, I believe.

    I remember doing a lot of work 10 years ago at a cartography company and PInvoke was the fastest way to manipulate images without having to resort to expensive 3rd party components.


  • Discourse touched me in a no-no place

    @Maciejasjmj said in RIP Java in the Enterprise:

    To be fair, it wouldn't hurt for .NET to have a proper image manipulation component that isn't dependent on GDI. Things like "loading a bitmap, turning it into greyscale, scaling it down and saving it to a file" really shouldn't need a dependency on any Windows components.

    It's amazing just how many image handling libraries across many languages have that exact flaw. 🤷



  • @dkf said in RIP Java in the Enterprise:

    It's amazing just how many image handling libraries across many languages have that exact flaw.

    I wonder how much of it is because bitmap operations are really optimization-sensitive - iterating over several million pixels, potentially in non-sequential patterns, is where the costs of virtualized memory accesses, boundary checking, cache prefetches, and all those other things that usually are a non-issue start coming into play.

    Since the native OS bitmap-handling code is probably as aggressively optimized as it gets, and can work at a low enough level for the abstractions to not get in the way, perhaps it's better not to reinvent the wheel?



  • @powerlord said in RIP Java in the Enterprise:

    @cheong Do you usually make a habit of trying to program in a language without its standard library?

    'cause I'd hate to have to write my own low-level file I/O routines, among other things.

    No, but just to let you know, there are 20+ languages that also use the .NET Framework runtime.


  • Discourse touched me in a no-no place

    @Maciejasjmj said in RIP Java in the Enterprise:

    Since the native OS bitmap-handling code is probably as aggressively optimized as it gets, and can work at a low enough level for the abstractions to not get in the way, perhaps it's better not to reinvent the wheel?

    OTOH, it does mean that your web server that gives dynamically-generated images to clients over the web now needs a GUI to talk to even though it never needs to draw anything meaningful on its own screen.


  • Fake News

    @dkf It needs all the GUI libraries but it doesn't need to run any GUI. That's still quite a difference e.g. on Linux when running Java in "headless" mode.

