Javascript is to Java as JScript is to J?



  • @delta534 said:

    There is near feature parity between OpenGL 4.2 and DirectX 11, so you could write a wrapper targeting both. The major issue with OpenGL is that there is a lot of cruft and bad ways of doing things in the API that the Khronos Group would like to get rid of but cannot for various reasons. Hell, if Minecraft had to follow the DirectX way of doing things it would be much faster.
     

     Well, "bad way" is kind of harsh - I attended a university lecture in Oregon where the professor was just ranting about how he needed the features they are trying to discourage people from using. Simple matrix transformations, simple kinematics, stuff like that. I would agree that the API is a pain in the ass, but I always assumed it's that way because people designed it to be - the redbook for example used page after page to explain why complete insane things are a good idea. Kind of how KDE-developers decided that JavaScript would be a good language to base desktop-widgets on.

    It's more of a chicken-and-egg problem right now: OpenGL performance is abysmal, so no gaming studio thinks it's worth doing (excluding OpenGL ES and certain engine developers targeting the indie market). No OpenGL games use the stuff that makes money (powerful hardware), so hardware vendors don't bother with good drivers. Add to that the fact that they actually sell cards that are the same hardware, but with optimized drivers, at a much higher cost (FirePro, Quadro), and, well, I don't think anybody even wants to change the situation.


     



  • An obvious solution would be to have the cleaner stuff in the newer version of OpenGL that's versioned, with a caveat that older stuff won't run with newer OGL.

    I know this has been done in Java, .NET, iOS and various other frameworks/systems... I presume DX has also faced this point and made a decision. Are there reasons[1] why they don't draw a line under the old version and make a distinct break to move with newer developments and drop deprecated stuff?

    [1] political, design, otherwise. 



  • What I call the bad way is immediate mode and display lists. The rest I can understand wanting to use but I also see why they would want to remove it.

    As for the issue of performance, I agree. All drivers suck, they just suck in different ways.
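
    To make the immediate mode vs. buffer objects point above concrete, here is a minimal, illustrative C sketch (not from this thread; it assumes a working GL context and, on Windows, that the buffer-object entry points have been loaded). Immediate mode pays a driver call per vertex every frame; a buffer object is uploaded once and drawn with a single call:

    ```c
    /* Illustrative only - assumes a current GL context; glGenBuffers & co. are
       GL 1.5 and may need to be fetched through a loader on Windows. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* The "bad way": immediate mode, one driver call per vertex, every frame. */
    static void draw_triangle_immediate(void)
    {
        glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();
    }

    /* The retained way: upload once into a vertex buffer object... */
    static GLuint upload_triangle(void)
    {
        static const GLfloat verts[] = { -0.5f, -0.5f, 0.5f, -0.5f, 0.0f, 0.5f };
        GLuint vbo;
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
        return vbo;
    }

    /* ...then draw it with a single call per frame. */
    static void draw_triangle_vbo(GLuint vbo)
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);          /* a vertex attrib in core GL */
        glVertexPointer(2, GL_FLOAT, 0, (const void *)0);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableClientState(GL_VERTEX_ARRAY);
    }
    ```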



  • @Cassidy said:

    An obvious solution would be to have the cleaner stuff in the newer version of OpenGL that's versioned, with a caveat that older stuff won't run with newer OGL.

    I know this has been done in Java, .NET, iOS and various other frameworks/systems... I presume DX has also faced this point and made a decision. Are there reasons[1] why they don't draw a line under the old version and make a distinct break to move with newer developments and drop deprecated stuff?

    [1] political, design, otherwise. 

     

     Well, first of all, Nvidia and others spoke out that they would support the discontinued stuff even after it was removed, as a special addition to their drivers. Apparently nobody wants to see all of their OpenGL programs broken. Which is why nobody believes the deprecated stuff will actually be removed anytime soon, especially with the speed with which new OpenGL versions are released...

     Second of all, it's not like this is all software - hardware vendors have to actually tailor their hardware around the needs of this. Which makes a lot of the older design choices a bit dated... a big problem with writing OpenGL drivers for ATI cards after the 5xxx series was that they dropped a lot of the 2D-specific circuits and instead just emulated them. And with OpenGL being designed with the long term in mind, it's hard to say which parts of the hardware will actually remain the same. I remember when different shaders for different tasks were all the rage - vertex shaders, pixel shaders, even geometry shaders. A few years later everybody decided it wasn't worth it and instead put a ton of unified shader units on there.

     A cleaned-up version exists, btw - OpenGL ES, which is the go-to API for mobile development. I wonder how that will pan out, but then again, I'm still confused about what exactly Microsoft's strategy in the mobile market is.

     



  • Are you people seriously discussing Minecraft's graphics? Is that some weird humor I don't get, or are you just bloody crazy? Minecraft was never meant to be good looking. If you don't get that... well, you'd have to be retarded. Next, I know hating Java is hip, but try at least not to be ridiculous about it. It's the language of choice for non-toy projects for a reason.



  • and when I say non-toy, I mean Cassandra (the storage engine behind your beloved Facebook), Hadoop (the distributed filesystem and MapReduce engine behind Yahoo), Android, etc.



  • @veggen said:

    Are you people seriously discussing Minecraft's graphics?

    Yes.

    @veggen said:

    Is that some weird humor I don't get or are you just bloody crazy?

    Are those two things mutually-exclusive?

    @veggen said:

    Minecraft was never meant to be good looking.

    And we're the crazy ones?

    @veggen said:

    It's the language of choice for non-toy projects for a reason.

    Habit? Ignorance of better alternatives? Some vague desire to "stick it to Microsoft"?



  • They want to make a buggy non-standard UI? A desire to introduce security holes into web browsers? A desire for the application to incorrectly support named folders, multiple monitors, spaces in filenames, etc. on Windows? They think non-Java software runs too fast? They love giving money to Oracle Corporation?

    Stop me if I get near the answer...



  • Then I won't be stopping you any time soon.

    I do agree Java is far from the best solution for desktop apps. Its natural habitat, where it kicks ass, is the server side. Still, the UI has no reason to be buggy. The libraries are all quite mature. No idea about the multi-monitor thing; it might be a limitation of the "has to run anywhere" philosophy. Of course, it can't look native, but I, for one, very much appreciate being able to run the same copy of the app on my multiboot machine, or support only a single version of the app for my clients (yes, I'm working on a Java project).

    Spaces in filenames is bull. Java installs itself in Program Files by default and it has a space.

    Browser security is bull as well. Of course it can introduce insecurity when it's supposed to run code in the browser. Language X can not harm the browser when it doesn't have a browser plugin. What else would one use to run in a browser? ActiveX? Flash might work, but depending on what needs to be done, it might lack power (as you'd need another language anyway for the back end). Silverlight? Seriously, who even develops that let alone installs that?

    Next, why would anyone be giving any money to Oracle? Java is free in every sense.

    Being slow is quite close to being nonsense as well. It's slower than native code, but just as fast as any managed language and much, much faster than the hip scripting langs. I've worked on a variety of platforms, PHP, .NET and Java being more recent, and nothing compares to Java on the server side (Google said something very similar when they were choosing the platform for Android).

    People really need to stop judging Java by its desktop capabilities, but even there it's receiving undeserved flak. Because it's hip to hate Java.
    Btw, posting thanks to Opera Mini, which is Java, and thus can run even on my old phone.

     

    mod: linebreaks. I forgive you because the WYSIWYG editor is probably not available on a mobile device.  –dh



  • Bonus points for Java bashers if you can name any of the following:

    1. an ORM that can compare to Hibernate

    2. full-text search that can compare to Lucene and Solr

    3. anything that will let you do SmartCard authorization through the browser alone

    4. anything that can compare to Hadoop

    5. anything that can do distributed transactions without breaking a sweat

    6. libraries for everything from form validation to molecular biology readily available

    7. all of the above on any platform you want

    Mindless bashing made me the Java defender I am.



  • @veggen said:

    Java is free in every sense.

    Yeah, and when you use this wonderfully free technology on a highly popular smartphone OS you risk getting sued over patent infringement. Wait, what?



  • @Cassidy said:

    An obvious solution would to be to have the cleaner stuff in the newer version of OpenGL that's versioned, with a caveat that older stuff won't run with newer OGL.

    That was done a few years ago when OpenGL 3.0 introduced a deprecation model that allows dropping ancient stuff from the spec. Hardware vendors still have to provide support for OpenGL 2.1 though, since there are a lot of existing applications targeting that version (and even new ones being written, since not all hardware supports OpenGL 3.0 yet [I'm looking at you, Intel]).
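
    As an illustration of what that deprecation model means for application code, here is a hedged sketch using GLFW (my choice of windowing library, not something mentioned in the thread) that asks for a forward-compatible 3.2 core context, i.e. one with the deprecated functionality removed. If the driver only exposes 2.1, window creation simply fails and the application can fall back:

    ```c
    /* Illustrative only - GLFW is assumed here purely for context creation. */
    #include <GLFW/glfw3.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        /* Ask for a 3.2 core, forward-compatible context: deprecated calls are gone. */
        glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
        glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
        glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
        glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE);

        GLFWwindow *win = glfwCreateWindow(640, 480, "core profile", NULL, NULL);
        if (!win) {
            /* Driver only offers 2.1? Fall back to a compatibility context or bail. */
            glfwTerminate();
            return 1;
        }
        glfwMakeContextCurrent(win);
        /* ... render loop would go here ... */
        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }
    ```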

    @Cassidy said:

    I presume DX has also faced this point and made a decision.

    There's no API compatibility between different versions of DirectX. Presumably some parts look the same, but they're still different APIs. Contrast to OpenGL, which is actually a single evolving API with stuff being added and dropped. OpenGL's extension framework also provides greater flexibility as hardware vendors can add support for cool new features right away, rather than waiting for Microsoft to come up with a new DirectX version where the stuff is supported (in theory at least; unfortunately it's likely that in reality the hardware vendors wait for a new DirectX version and then implement the stuff therein on their chips).
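
    A rough sketch of what that extension mechanism looks like from the application side (assuming an OpenGL 3.0+ context; on Windows, glGetStringi itself has to be fetched through wglGetProcAddress, which is omitted here). The helper name and the fallback functions in the usage comment are made up for the example:

    ```c
    /* Illustrative only - assumes a 3.0+ context. */
    #include <string.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    static int has_extension(const char *name)
    {
        GLint count = 0, i;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (i = 0; i < count; i++) {
            const char *ext = (const char *)glGetStringi(GL_EXTENSIONS, i);
            if (ext && strcmp(ext, name) == 0)
                return 1;
        }
        return 0;
    }

    /* Hypothetical usage:
       if (has_extension("GL_ARB_tessellation_shader"))
           enable_fancy_terrain();
       else
           use_plain_vertex_path();
    */
    ```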



  • @tdb said:

    There's no API compatibility between different versions of DirectX. Presumably some parts look the same, but they're still different APIs. Contrast to OpenGL, which is actually a single evolving API with stuff being added and dropped.

    Okay.. that's interesting. Backwards compatibility is a strong selling point of many Microsoft products, but in this case they've deliberately broken it between versions. Preventing something written for an older version from running on a newer version means there's no chance of something running slowly/badly simply because it hasn't been updated and therefore doesn't take advantage of improved features. It does also mean increased revenue streams, with updated products appearing as new... but cynicism aside, I agree with this decision to make a "clean break".

    @tdb said:

    OpenGL's extension framework also provides greater flexibility as hardware vendors can add support for cool new features right away, rather than waiting for Microsoft to come up with a new DirectX version where the stuff is supported (in theory at least; unfortunately it's likely that in reality the hardware vendors wait for a new DirectX version and then implement the stuff therein on their chips).

    This flexibility can also be its downfall - thinking of deviations and variations in HTML specs introduced by browser vendors that made code reliant upon features found in that particular browser. I recall some articles on Tom's Hardware where a certain game looked nicer on a certain card, thanks to leveraging these customisations and accusations of unfair advantage aplenty.

    It sounds like OpenGL could benefit from following the DirectX dev model of different versions being revolutionary rather than evolutionary. Apache did this between v1 and 2, as did Java, to convey the right message.



  • @fatbull said:

    I have seen that message box before. I believe the automatic Java updater (jusched.exe) causes it.

     

    Tell that to the hordes of know-it-all gamers who seem to have total faith in the fact that a game 'engine' produces the graphics, not graphics artists. Yes, slap the Unreal Tournament 3 SDK in your project and TADA!... you've got nothing. Well, perhaps you can make some colored particles float around.

    About installation of Java games: I tend to find that it is easiest if you don't attempt to install anything at all. Just zip it all up and make sure the app/game runs as an executable jar out of the box - for the less than computer savvy people you could generate a bootstrapper executable using one of the many jar wrapper tools available.

     



  • @tdb said:

    @veggen said:
    Java is free in every sense.

    Yeah, and when you use this wonderfully free technology on a highly popular smartphone OS you risk getting sued over patent infringement. Wait, what?

    Well, not exactly true. Google took pieces out of Oracle's JDK, not the open one (OpenJDK). The fact that non-open implementations exist doesn't make Java non-free. There are non-free Linux distros, you know, but that doesn't make Linux non-free.



  • @Cassidy said:

    @tdb said:

    There's no API compatibility between different versions of DirectX. Presumably some parts look the same, but they're still different APIs. Contrast to OpenGL, which is actually a single evolving API with stuff being added and dropped.

    Okay.. that's interesting. Backwards compatibility is a strong selling point of many Microsoft products, but in this case they've deliberately broken it between versions. Preventing something written for an older version from running on a newer version means there's no chance of something running slowly/badly simply because it hasn't been updated and therefore doesn't take advantage of improved features. It does also mean increased revenue streams, with updated products appearing as new... but cynicism aside, I agree with this decision to make a "clean break".

    Well, obviously the older versions continue to be supported. We're at DirectX 11 currently, but you can still run programs built on any earlier version, right down to the first ones (though I don't remember anything before DX5 being too popular). However, if you have a game engine using DirectX 9 and want to take advantage of new features in 10 and 11, you'll essentially have to rewrite large parts of the engine. Down at the driver level, similar features of the different versions will likely get dispatched to the same parts of the hardware, so this API segregation feels like an unnecessary complexity. Compare this with OpenGL, where you can just start using the new functions without touching unrelated code.

    @Cassidy said:

    @tdb said:

    OpenGL's extension framework also provides greater flexibility as hardware vendors can add support for cool new features right away, rather than waiting for Microsoft to come up with a new DirectX version where the stuff is supported (in theory at least; unfortunately it's likely that in reality the hardware vendors wait for a new DirectX version and then implement the stuff therein on their chips).

    This flexibility can also be its downfall - thinking of deviations and variations in HTML specs introduced by browser vendors that made code reliant upon features found in that particular browser. I recall some articles on Tom's Hardware where a certain game looked nicer on a certain card, thanks to leveraging these customisations and accusations of unfair advantage aplenty.

    On the other hand, it enables games to gracefully degrade their rendering quality if they encounter older hardware that doesn't support all the shiny new features (as opposed to not running at all). Admittedly this requires a certain amount of skill to get right.
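
    Purely as an illustration of that kind of graceful degradation (the enum and path names here are invented for the example), a game might pick its render path from the version the driver reports:

    ```c
    /* Illustrative only - a real engine would also check individual extensions
       rather than just the version string. */
    #include <stdio.h>
    #include <GL/gl.h>

    typedef enum { PATH_FIXED_FUNCTION, PATH_SHADERS, PATH_TESSELLATION } render_path;

    static render_path pick_render_path(void)
    {
        int major = 1;
        /* GL_VERSION looks like "3.3.0 NVIDIA 295.20" or "1.4 Mesa ..." */
        const char *version = (const char *)glGetString(GL_VERSION);
        if (version)
            sscanf(version, "%d", &major);

        if (major >= 4) return PATH_TESSELLATION;   /* all the shiny new features */
        if (major >= 2) return PATH_SHADERS;        /* basic programmable pipeline */
        return PATH_FIXED_FUNCTION;                 /* ancient hardware still gets a picture */
    }
    ```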

    @Cassidy said:

    It sounds like OpenGL could benefit from following the DirectX dev model of different versions being revolutionary rather than evolutionary. Apache did this between v1 and 2, as did Java, to convey the right message.

    Each new version of OpenGL does tend to bring considerable improvements, especially major versions. Version 2.0 added shaders, NPOT textures and point sprites, among other things. Version 3.0 introduced framebuffer objects and floating-point textures. Version 4.0 brought tessellation shaders and a whole bunch of other shader improvements. They just don't break compatibility of the parts that didn't change.
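
    For a flavour of one of those 3.0-era additions, here is a hedged sketch of creating a framebuffer object that renders into a texture, assuming a 3.0+ context with the entry points already loaded; if the completeness check fails, a real application would fall back to rendering straight to the window:

    ```c
    /* Illustrative only - assumes a 3.0+ context with the FBO entry points loaded. */
    #include <stddef.h>
    #include <GL/gl.h>
    #include <GL/glext.h>

    /* Create a texture and a framebuffer object that renders into it.
       Returns the FBO name, or 0 so the caller can fall back to the window. */
    static GLuint make_render_target(int w, int h, GLuint *out_tex)
    {
        GLuint tex, fbo;

        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);

        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex, 0);

        if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
            glDeleteFramebuffers(1, &fbo);
            glDeleteTextures(1, &tex);
            return 0;
        }
        *out_tex = tex;
        return fbo;
    }
    ```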



  • @veggen said:

    @tdb said:
    @veggen said:
    Java is free in every sense.

    Yeah, and when you use this wonderfully free technology on a highly popular smartphone OS you risk getting sued over patent infringement. Wait, what?

    Well, not exactly true. Google took pieces out of Oracle's JDK, not the open one (OpenJDK).

    That would be copyright infringement. Google is getting sued over patents, which would apply even if they wrote their code from scratch (unless software patents are even more screwed up than I thought). As another example, xiph.org's audio and video codecs (Vorbis and Theora, respectively) are facing problems with device manufacturers being unwilling to use them because no one will guarantee they won't get sued over patents. So while something may be free both as in speech and as in beer, it might still not be free as in innocent.



  • @tdb said:

    Well, obviously the older versions continue to be supported. We're at DirectX 11 currently, but you can still run programs built on any earlier version, right down to the first ones (though I don't remember anything before DX5 being too popular). However, if you have a game engine using DirectX 9 and want to take advantage of new features in 10 and 11, you'll essentially have to rewrite large parts of the engine. Down at the driver level, similar features of the different versions will likely get dispatched to the same parts of the hardware, so this API segregation feels like an unnecessary complexity. Compare this with OpenGL, where you can just start using the new functions without touching unrelated code.

    Most games/usages aren't meant to be evolutionary anyway. You target a feature set, code it, then release it. Maybe if you're at a transition point (like Crysis 2) you drop a patch with a few rewrites to make use of a lot of the new stuff. I like the idea of evolving games, but most of them are static products, and for actual continuing software the new features are a big enough selling point not to care (Cinema, Maya and so on).

    It's even easier when using a third-party engine, where you can just start using the new stuff.

    On that note - DirectX 9.0 was released in 2002, 9.0c in 2004. If your engine is from then, you're in big need of a rewrite anyway...


    @tdb said:

    On the other hand, it enables games to gracefully degrade their rendering quality if they encounter older hardware that doesn't support all the shiny new features (as opposed to not running at all). Admittedly this requires a certain amount of skill to get right.

    To be quite honest, the only DirectX program that ever even had this as a problem (I'm excluding everything that is just plain wasteful with hardware specs) was Crysis 2. So examples would be welcome here... And seeing how OpenGL hits a lot of performance issues, things are actually even worse there...

    @tdb said:

    Each new version of OpenGL does tend to bring considerable improvements, especially major versions. Version 2.0 added shaders, NPOT textures and point sprites, among other things. Version 3.0 introduced framebuffer objects and floating-point textures. Version 4.0 brought tessellation shaders and a whole bunch of other shader improvements. They just don't break compatibility of the parts that didn't change.

     

    As an addendum for other users, we should add that all of these features came way after DirectX already had them. OpenGL has been in need of a better release model for a while, and is getting beaten to all the "new shiny stuff" all the time. And tessellation is a nice thing if ATI is throwing a hissy fit in their driver...

     



  • @fire2k said:

    On that note - DirectX 9.0 was released in 2002, 9.0c in 2004. If your engine is from then, you're in big need of a rewrite anyway...
     

    What games right now do not run with DX9?



  • @dhromed said:

    @fire2k said:

    On that note - DirectX 9.0 was released in 2002, 9.0c in 2004. If your engine is from then, you're in big need of a rewrite anyway...
     

    What games right now do not run with DX9?

     

    It's not that they do not run with DX9, it's just that they have a lot of redundancy to run both - DX10/DX11 Patches make away with DirectSound or DirectInput for example...

    The problem with making a game DX10-only is that the cards that support it don't necessarily have the horsepower to run it - I have a DX10-compatible HD5850, but it is too slow to actually use the features found in DX10 in most games... So game studios will keep a second path with DX9, especially with games being in development for at least a year (if they're rushed out...), when the card situation was still a little different...

     



  • @fire2k said:

    DX10/DX11 Patches make away with DirectSound or DirectInput
     

    I can't parse this sentence fargment.

     



  • @dhromed said:

    @fire2k said:

    DX10/DX11 Patches make away with DirectSound or DirectInput
     

    I can't parse this sentence fargment.

     

     

    http://idioms.yourdictionary.com/make-away-with

    Third meaning. I could have said "kills with fire". A ye olde gods.

     



  •  Ok



  • @veggen said:

    Spaces in filenames is bull. Java installs itself in Program Files by default and it has a space.

    Admittedly that was probably some badly-written software. But I've encountered more than one Java app that doesn't like being fed paths with spaces in them, most recently Amazon's AWS CLI tools.

    @veggen said:

    Browser security is bull as well. Of course it can introduce insecurity when it's supposed to run code in the browser.

    I'm not griping because it can, I'm griping because it does. Anyway, Java in the browser is dead, it's been dead for ages.

    @veggen said:

    Silverlight? Seriously, who even develops that let alone installs that?

    Silverlight is amazing. I wish a lot more sites used it.

    @veggen said:

    Next, why would anyone be giving any money to Oracle? Java is free in every sense.

    Really? Then why is Google getting sued?

    @veggen said:

    Being slow is quite close to being nonsense as well. It's slower than native code, but just as fast as any managed language and much, much faster than the hip scripting langs.

    Please. It takes somewhere around 45 seconds just to close NetBeans. You honestly think that's as fast as "any managed language"? As for "hip scripting langs", look into the newer generation of JavaScript interpreters; they basically kick ass and take names.

    @veggen said:

    I've worked on a variety of platforms, PHP, .NET and Java being more recent, and nothing compares to Java on the server side

    The server side is where Java is least-weak.

    The problem is that when you say "Java" to me, I think WebEx. I think NetBeans. I think Eclipse. I think Amazon AWS CLI tools. I think Lotus Notes. And all those products suck shit. And they're all in Java.

    @veggen said:

    Google said something very similar when they were choosing the platform for Android

    Yeah, and because it's so "free in every way", right? Oh wait...

    @veggen said:

    People really need to stop judging Java by its desktop capabilities,

    When I run a single Java desktop program that doesn't suck shit, maybe I'll start changing my mind. If you want people to think your little pet language is good, you need to get your community to stop releasing shit software written in it.



  • @Cassidy said:

    Okay.. that's interesting. Backwards compatibility is a strong selling point of many Microsoft products, but in this case they've deliberately broken it between versions.

    They do the same thing with .NET, in case you haven't noticed. That's the new strategy for "relatively painless" backwards compatibility. Better than writing 50,000 shims for every new version, at least...

    @Cassidy said:

    This flexibility can also be its downfall - thinking of deviations and variations in HTML specs introduced by browser vendors that made code reliant upon features found in that particular browser. I recall some articles on Tom's Hardware where a certain game looked nicer on a certain card, thanks to leveraging these customisations and accusations of unfair advantage aplenty.

    Supposedly it's a huge pain to make bleeding-edge games in OpenGL for exactly this reason... all of that DirectX backwards compatibility stuff we just talked about has to be in your game engine, because you can't make it rely on any particular combination of extensions. I've seen several articles calling out OpenGL because of this from some big names... Carmack wrote one, if I'm not mistaken... yeah: http://www.bit-tech.net/news/gaming/2011/03/11/carmack-directx-better-opengl/1

    EDIT: This whole Google Java thing reminds me of when Adobe was going on and on and on about how open the PDF standard is - anybody can put PDF in their application, it's so open open open! Then Microsoft went, "ok then we'll put it in Office" and Adobe ran to their lawyers so fast they made smoke trails. Yeah, Java being "open" is kind of like that. Words don't matter; actions do.



  • @blakeyrat said:

    @veggen said:
    Spaces in filenames is bull. Java installs itself in Program Files by default and it has a space.

    Admittedly that was probably some badly-written software. But I've encountered more than one Java app that doesn't like being fed paths with spaces in them, most recently Amazon's AWS CLI tools.

    Is there any language where this isn't so? IME, this is always a problem with the way the software is written, or the way it is used, not with the underlying language used. Obviously, CLI users have to escape or quote spaces, and there's not much of anything a program can do to overcome user error.
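
    To illustrate the point with a generic (non-Java) C sketch - the path and command here are made up - the bug only appears when a program flattens the path into a shell string; passed along as a single argument, spaces are harmless:

    ```c
    /* Illustrative only - hypothetical path, hypothetical command. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        const char *path = "/home/alice/My Documents/report.txt";  /* hypothetical */

        /* Broken: once flattened into a shell string, the shell re-splits on spaces
           and "wc" would be handed two bogus file names. */
        char cmd[256];
        snprintf(cmd, sizeof cmd, "wc -l %s", path);
        (void)cmd;  /* left unexecuted on purpose */

        /* Fine: the path travels as one argv element all the way to the child. */
        char *const argv[] = { "wc", "-l", (char *)path, NULL };
        execvp("wc", argv);
        perror("execvp");
        return 1;
    }
    ```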



  • OH YAY IT'S BOOMZILLA

    @boomzilla said:

    Is there any language where this isn't so?

    Maybe, maybe not. The point is I never see this bug in programs that aren't written in Java, but I frequently see it in programs that are. Maybe the problem isn't Java-the-language but Java-the-culture. Either way, it's a Java problem.

    @boomzilla said:

    IME, this is always a problem with the way the software is written, or the way it is used, not with the underlying language used.

    Except that, as we've discussed on this site before, the Java runtime has critical bugs in it that make it virtually impossible to write correct software on Windows*. The named folders thing we went over in that NetBeans thread, for example. WebEx's inability to see what's on my second monitor -- those bugs are specifically due to the software being written in Java, because they are inherited from the JRE. Those are Java problems. The named folders one is particularly noteworthy because it was broken even on the JRE's release in 1995, and it's still equally broken now in 2012.

    *) It probably has critical bugs that make it impossible to write correct OS X or Linux software, also, but I'm not as familiar with those environments so I will reserve judgement.

    @boomzilla said:

    Obviously, CLI users have to escape or quote spaces, and there's not much of anything a program can do to overcome user error.

    Yes, well, problem number one is deciding to write a CLI interface in the first fucking place. Thankfully, Amazon has finally gotten around to writing a web interface for all their services and their CLI tools sit around unused.



  • @fire2k said:

    @dhromed said:

    @fire2k said:

    DX10/DX11 Patches make away with DirectSound or DirectInput
     

    I can't parse this sentence fargment.

     

     

    http://idioms.yourdictionary.com/make-away-with

    Third meaning. I could have said "kills with fire". A ye olde gods.

     

    I've never heard it used that way. 1st meaning - to steal - is what's normally meant. Either 'do away with' or 'make an end of' for 'kill'.



  • I think the troll (veggen) is hilarious. Actually quoting Facebook for choosing Java while a few simple googles could've told you that Facebook is written in PHP and compiles it to C++.



  • @blakeyrat said:

    When I run a single Java desktop program that doesn't suck shit, maybe I'll start changing my mind. If you want people to think your little pet language is good, you need to get your community to stop releasing shit software written in it.

     

     SoapUI is pretty nice.  Well, it would probably be much better if it weren't written in Java, but... still, it doesn't suck.



  • @blakeyrat said:

    @Cassidy said:
    This flexibility can also be its downfall - thinking of deviations and variations in HTML specs introduced by browser vendors that made code reliant upon features found in that particular browser. I recall some articles on Tom's Hardware where a certain game looked nicer on a certain card, thanks to leveraging these customisations and accusations of unfair advantage aplenty.

    Supposedly it's a huge pain to make bleeding-edge games in OpenGL for exactly this reason... all of that DirectX backwards compatibility stuff we just talked about has to be in your game engine, because you can't make it rely on any particular combination of extensions. I've seen several articles calling out OpenGL because of this from some big names... Carmack wrote one, if I'm not mistaken... yeah: http://www.bit-tech.net/news/gaming/2011/03/11/carmack-directx-better-opengl/1


    @Linked article said:

    'The actual innovation in graphics has definitely been driven by Microsoft in the last ten years or so,' explained AMD's GPU worldwide developer relations manager, Richard Huddy. 'OpenGL has largely been tracking that, rather than coming up with new methods. The geometry shader, for example, which came in with Vista and DirectX 10, is wholly Microsoft's invention in the first place.'

    This is unfortunately true. Development of 3D graphics hardware is driven by games these days, and all the big development studios are targeting Windows[1]. Microsoft has a monopoly on the DirectX specification, so there's little point for graphics chip vendors to innovate themselves. And since Microsoft isn't collaborating with the Khronos group, they have to wait for Microsoft to release the new version of DirectX before they can update the OpenGL spec to match.

    While the specification being in the hands of a single company may be a good thing in some respects (there's exactly one feature set to target, and no need to do any runtime checks), it's also a seriously limiting factor in others. If you write games using DirectX, they will run on Windows and Xbox, period. OpenGL implementations can be found on Windows, Linux, Mac OS X, Android and iOS at least.

    It's also true that despite deprecating most of the ancient cruft, OpenGL still has some baggage from its early days. I'm told the approach it takes to specifying which GPU resources (textures, shaders, buffers, etc.) to use for rendering is a rather poor match to how graphics hardware actually works. The API also allows modifying several types of objects which in fact are immutable for the GPU (texture dimensions, framebuffer object attachments), so the drivers have to dance around these. Back when OpenGL 3.0 was in development, it was supposed to take care of these issues as well, but eventually the committee chickened out and chose compatibility over technical advancement.

    [1] The last game studio to consider releasing a major title on Linux (that I know of) was Epic with Unreal Tournament 3. At first the Linux installer was supposed to be featured on the game DVD, then to be released as a separate download shortly after the retail release of the game, then "as soon as it's ready". After two or three years of procrastination they finally admitted there won't be a Linux version. Supposedly Ryan Gordon got the game ported, but it ran into some middleware licensing issues. There hasn't been an official statement on that, so we'll probably never know.



  • @blakeyrat said:

    When I run a single Java desktop program that doesn't suck shit, maybe I'll start changing my mind. If you want people to think your little pet language is good, you need to get your community to stop releasing shit software written in it.

     

    JDownloader?

     



  • @blakeyrat said:

    Yes, well, problem number one is deciding to write a CLI interface in the first fucking place. Thankfully, Amazon has finally gotten around to writing a web interface for all their services and their CLI tools sit around unused.

     

    What is your weird obsession with all things CLI? I fail to see the problem with command line interfaces (there's a WTF in that you called it a "CLI interface", +1000 stupid points for the redundant word) - they actually make some things a hell of a lot easier to do.

     



  • @dtech said:

    Actually quoting Facebook for choosing Java while a few simple googles could've told you that Facebook is written in PHP and compiles it to C++.
     

    Veggen was referring to Cassandra, not Facebook.



  • @ASheridan said:

    @blakeyrat said:

    Yes, well, problem number one is deciding to write a CLI interface in the first fucking place. Thankfully, Amazon has finally gotten around to writing a web interface for all their services and their CLI tools sit around unused.

     

    What is your weird obsession with all things CLI? I fail to see the problem with command line interfaces (there's a WTF in that you called it a "CLI interface", +1000 stupid points for the redundant word) - they actually make some things a hell of a lot easier to do.

     

    NOT THIS SHIT AGAIN!



  • @blakeyrat said:

    They do the same thing with .NET, in case you haven't noticed.

    Nope, hadn't - but then I'm not a .NET user, so ta for the info.

    @blakeyrat said:

    Supposedly it's a huge pain to make bleeding-edge games in OpenGL for exactly this reason...

    .. and once a much easier method is provided (DirectX), people find it hard to justify going down the more arduous route. Surely OpGL will die if it retains this mentality?

    @tdb said:

    And since Microsoft isn't collaborating with the Khronos group, they have to wait for Microsoft to release the new version of DirectX before they can update the OpenGL spec to match.

    I'm not convinced about that - browser vendors didn't wait for updates to IE6 so they could play catchup, they went ahead and innovated anyway, and it suddenly became Microsoft that began implementing other browser functionality into their products.

    To make OpGL successful, two things need to happen:

    • give a reason to continue to use it - clear out cruft, make things better and easier for existing users
    • give a reason to begin using it - don't wait for Microsoft to dictate the next thing in DX so OpGL can clone it, start coming up with new and exciting ways of graphics rendering NOW that arouse juices of the games industry and give developers a reason to choose it over DX (or at least offer it as a parallel option).

    @ASheridan said:

    What is your weird obsession with all things CLI? ...they actually make some things a hell of a lot easier to do.

    Blakey's a UI bod, so he prefers mouse-driven over keyboard-driven. I also think some past (bad) experiences with running a MUD on Linux have traumatised him away from the command line. You've got to read those rants as reasons why he won't use the CLI, rather than reasons why it shouldn't be used.



  • @dtech said:

     I think the troll (veggen) is hilarious. Actually quoting Facebook for choosing Java while a few simple googles could've told you that Facebook is written in PHP and compiles it to C++.

    S/he said Cassandra, which is used by Facebook and written in Java (not the Facebook website itself).



  • @Cassidy said:

    Blakey's a UI bod

    ... what the hell does that mean?

    @Cassidy said:

    You've got to read those rants as reasons why he won't use the CLI, rather than reasons why it shouldn't be used.

    The CLIs, at least all the ones I've been exposed to (with the possible exception of PowerShell -- although I don't have enough experience with that to really make the call), are shit UIs. That's the reason I come here and say they're shit UIs.

    I have no issues with the idea of a command line interface. It's definitely something that can be done well. The problems are:
    1) Much like the Java thing, I've never seen a CLI that wasn't shit
    2) CLIs are completely, 100% stagnant. The primary reason PowerShell is a possible exception is that it's not 30 years old

    Stagnation is death.

    But the idea that there's no such thing as a "friendly" CLI is stupid. You can make a friendly CLI if you actually gave a shit and tried to make one. It's probably harder than making a friendly GUI, but it's certainly not impossible-- it's just nobody's fucking tried.



  • @blakeyrat said:

    But the idea that there's no such thing as a "friendly" CLI is stupid. You can make a friendly CLI if you actually gave a shit and tried to make one. It's probably harder than making a friendly GUI, but it's certainly not impossible-- it's just nobody's fucking tried.

    Out of curiosity, what would be required of a good CLI? I've been using Linux for the past 10 years and gotten used to the way its CLI tools work, so I might be stuck inside the box. I'm interested to hear your ideas though, so that I could write better software in the future.



  • @blakeyrat said:

    ... what the hell does that mean?

    User-Interface person. I've read many of your rants about crap implementations of the meatspace<->cyberspace layer. They're quite a fascinating insight into how presentation and data capture is quite important.

    @blakeyrat said:

    But the idea that there's no such thing as a "friendly" CLI is stupid. You can make a friendly CLI if you actually gave a shit and tried to make one. It's probably harder than making a friendly GUI, but it's certainly not impossible-- it's just nobody's fucking tried.

    There doesn't seem to be a requirement for it: the Linux/Unix shell isn't user-hostile, it's more noob-intolerant in favour of speed and simplicity. Changing it to improve the user environment is largely a matter of subjective taste, so it's considered part of user-controlled customisations. Windows is aimed at the more novice computer user, for whom pointy-clicky methods work better than typed commands.

    .. unless we factor in SpagettiSwamp. Obviously then all bets are off.

    @tdb said:

    I've been using Linux for the past 10 years and gotten used to the way its CLI tools work, so I might be stuck inside the box.

    If you've used HP-UX, Solaris, or SCO (spit) beforehand, you'll see the advances in Linux towards friendliness. But sometimes it amounts to nothing more than giving the cluebat a fresh lick of paint - the pain is identical, but you're impressed at the way it shines when glancing off your forehead.





  • @Cassidy said:


    I'm not convinced about that - browser vendors didn't wait for updates to IE6 so they could play catchup, they went ahead and innovated anyway, and it suddenly became Microsoft that began implementing other browser functionality into their products.

    To make OpGL successful, two things need to happen:

    • give a reason to continue to use it - clear out cruft, make things better and easier for existing users
    • give a reason to begin using it - don't wait for Microsoft to dictate the next thing in DX so OpGL can clone it, start coming up with new and exciting ways of graphics rendering NOW that arouse juices of the games industry and give developers a reason to choose it over DX (or at least offer it as a parallel option).

     

    Completely different situation:

     That won't happen, because updates to the specification are decided by a committee which consists of:

    - major vendors (ATI, Nvidia, Intel), which naturally hate each other, don't want to share any knowledge and don't want to collaborate unless either forced (DirectX, the freshness of the feature gone) or they see an advantage in playing nice (OpenCL, CUDA). Why do you think the whole accelerating-H.264 thing is such a hassle?

    - embedded systems vendors that actually really care about OpenGL ES, or WebGL, or other shiny new stuff (TI, Google with Android/ChromeOS, ARM)

    - operating system vendors that only vaguely profit from actual innovation. Mac never had a major focus on high-end gaming, and the Mesa/Linux people are still trying to catch up with OpenGL 3.0

     Please note that even if anybody had the actual manpower for your ideas (which was the case with Firefox), you can't just kick out an API and call it a day. You need hardware for it. Which has to sell. Which it won't, because there are neither vendors, nor games, nor infrastructure.

     

     



  • @Cassidy said:

    @blakeyrat said:

    ... what the hell does that mean?

    User-Interface person. I've read many of your rants about crap implementations of the meatspace<->cyberspace layer. They're quite a fascinating insight into how presentation and data capture is quite important.

    Which of these definitions of "bod" were you going for?

    @Cassidy said:

    There doesn't seem to be a requirement for it: the Linux/Unix shell isn't user-hostile, it's more noob-intolerant in favour of speed and simplicity.

    You're just replacing one stereotype with another. "CLI = hard to use" is turning into "easy to use = inefficient." Neither of those stereotypes is true. (Rather, they don't have to be true.)

    Mac Classic was, in its era, by far the most productive computer system while simultaneously being the easiest to use.

    @Cassidy said:

    Changing it to improve the user environment is largely a matter of subjective taste, so it's considered part of user-controlled customisations.

    That's exactly like saying "we can fix all our usability problems if we add theming to our application." Dead. Fucking. Wrong. Worse than wrong, because it gets people writing code in the wrong direction.



  • @tdb said:

    @blakeyrat said:

    But the idea that there's no such thing as a "friendly" CLI is stupid. You can make a friendly CLI if you actually gave a shit and tried to make one. It's probably harder than making a friendly GUI, but it's certainly not impossible-- it's just nobody's fucking tried.

    Out of curiosity, what would be required of a good CLI? I've been using Linux for the past 10 years and gotten used to the way its CLI tools work, so I might be stuck inside the box. I'm interested to hear your ideas though, so that I could write better software in the future.

    Thing is, I'm probably in the box as well. What you really need is to define a few tasks a CLI would be useful for (and this is where I am in the box), and then start from scratch and do frequent and regular user testing. The main point is to do testing with human beings, not uber-nerd "high priesthood of technology" idiots. There's no reason your grandma shouldn't be able to use the CLI to do a mail-merge for her knitting club.

    The cultural problem is that the only group with any interest in the CLI whatsoever is the uber-nerd "high priesthood of technology" idiots, which is exactly why it's completely stagnant and will never improve. As long as the CLI is difficult, their position at the top of their bullshit fake hierarchy where they can pretend to be better than everybody else is secure.



  • @Cassidy said:

    I'm not convinced about that - browser vendors didn't wait for updates to IE6 so they could play catchup, they went ahead and innovated anyway, and it suddenly became Microsoft that began implementing other browser functionality into their products.

    I'm not convinced that this is a valid comparison. HTML was already an existing standard, and all the browsers were able to do the basic things. What you described would be more akin to different hardware vendors making their own extensions to OpenGL rather than waiting for Khronos to update the core spec - and this is exactly what's happening.

    @Cassidy said:

    To make OpGL successful, two things need to happen:

    • give a reason to continue to use it - clear out cruft, make things better and easier for existing users
    • give a reason to begin using it - don't wait for Microsoft to dictate the next thing in DX so OpGL can clone it, start coming up with new and exciting ways of graphics rendering NOW that arouse juices of the games industry and give developers a reason to choose it over DX (or at least offer it as a parallel option).

    I find it hard to imagine what new things OpenGL could offer that wouldn't have equivalents in DirectX by the time they're actually usable performance-wise. Consider that DirectX 10 was released in 2006, and there are still recently published games (http://en.wikipedia.org/wiki/Civilization_V) as well as upcoming ones (http://us.battle.net/support/en/article/diablo-iii-system-requirements) that use DirectX 9. One possible thing would be real-time radiosity, but I'm not sure if that needs new hardware/API features as much as new methods and algorithms for using what's already available. Maybe something to control really huge amounts of data and render trees with individual leaves?

    Besides the available features, there are other costs associated with switching APIs. If your developers are all of the DirectX variety, they're not going to learn OpenGL overnight. And you'll need new systems for handling audio and input as well, if you truly want to leverage OpenGL's greatest advantage which is its portability. Vendor loyalty should not be underestimated either. I've heard that Windows Phone is a good choice for users because it's "safe and familiar" - as far as I can see, the only familiar thing about it is the name "Windows". And I don't even know what kind of incentives Microsoft might be offering game developers to use its APIs.

    If you really want to build up pressure for an API switch, you'll need something revolutionary; preferably a complete paradigm switch. Affordable hardware that can do real-time ray-tracing could be such a thing. That'd need an entire new API and a new way of thinking about the 3D data, but it would provide huge advantages in fields like reflections, refractions and radiosity which are notoriously hard to do with a polygon-based rasterizer.



  • @blakeyrat said:

    But the idea that there's no such thing as a "friendly" CLI is stupid. You can make a friendly CLI if you actually gave a shit and tried to make one. It's probably harder than making a friendly GUI, but it's certainly not impossible-- it's just nobody's fucking tried.

    I disagree, but only because I go much further than you. CLIs are bad and will always be bad (as user interfaces), although they could be better. They're not really user interfaces so much as opportunities to enter a single line of code at a time, though.

    And GUI is a complete misnomer, in my book - the user interfacing is graphical only on the output side. I'm not just being pedantic about language there, by the way: I mean that until you're interacting directly with the displayed objects - which will require proper 3d displays and motion tracking - it's a temporary workaround at best.

    Although technically even a row of dip-switches is a user interface, in the truer meaning of the phrase there is very little which actually fits the bill. Noteworthy exceptions are the accelerometers in smartphones, and laptops which detect when they're closed. Laptop screens and smartphones are actual physical objects, but the point is that we only ever 'interface' with things using our actual physical bodies. A decent UI would allow us to interact with virtual objects as simply, easily and intuitively as with actual ones - which means that whilst they don't have to have mass or solidity, they do have to occupy volume and detect when they are 'touched'.

    Whilst keyboards aren't going to be beaten any time soon as a means of entering text, entering text is an inherently bad idea to base a user interface on.



  • @tdb said:

    I'm not convinced that this is a valid comparison. HTML was already an existing standard, and all the browsers were able to do the basic things. What you described would be more akin to different hardware vendors making their own extensions to OpenGL rather than waiting for Khronos to update the core spec - and this is exactly what's happening.

    Which is why we have one unified syntax for CSS. Oh wait, no we don't.

    Pretty much what's going on is what you're describing. Browser vendors are making their own extensions to CSS and JS (which is what most people mean when they talk about HTML5) without waiting for the W3C to finish the spec - and suddenly there are five different ways to create a gradient.



  • @blakeyrat said:

    There's no reason your grandma shouldn't be able to use the CLI to do a mail-merge for her knitting club.

    No, but isn't it always the case that it would be easier to teach her to do it with a GUI? Multiple choice tests are always easier than those with essay questions.



  • @tdb said:

    One possible thing would be real-time radiosity, but I'm not sure if that needs new hardware/API features as much as new methods and algorithms for using what's already available. Maybe something to control really huge amounts of data and render trees with individual leaves?

    If you really want to build up pressure for an API switch, you'll need something revolutionary; preferably a complete paradigm switch. Affordable hardware that can do real-time ray-tracing could be such a thing. That'd need an entire new API and a new way of thinking about the 3D data, but it would provide huge advantages in fields like reflections, refractions and radiosity which are notoriously hard to do with a polygon-based rasterizer.

     

    This kind of talk made me hot, and also made me realise I've proficiated in the wrong part of the field, programming-wise.

    Time to jump ship.

    Any recommendations for starting out painting some pixels on the screen with room for evolving into letting my code have long conversations with the video card?

    ... I feel like I've asked this before.



  • @fterfi secure said:

    @blakeyrat said:
    There's no reason your grandma shouldn't be able to use the CLI to do a mail-merge for her knitting club.
    No, but isn't it always the case that it would be easier to teach her to do it with a GUI? Multiple choice tests are always easier than those with essay questions.

    That's exactly why I'm in the box. As I mentioned. Because I don't see a need for the CLI existing at all. I would definitely be the wrong person to design the "usable CLI".

