Which language is the least bad?



  • PyCharm. Made by JetBrains, so you get Resharper Quality™.

    Agree with you on points 2 and 3, to a point. I don't use Python for anything past basic sites, so I haven't had the need for what you do in #2. As for #3, it's still bad, but we live in a world where MUMPS exists.

    EDIT: and all the other things that everyone has already said :facepalm:



  • Can we agree that any language that uses type inference is right out, when it comes to this question?

    I like Python, but I just can't say it doesn't have that inherent flaw.



  • Haskell uses static type inference all over the place. But the compiler throws errors if it could possibly be ambiguous. I'd say Haskell is a contender for top 5 least bad languages.

    I think your problem is better described as dynamic type inference or duck typing. (yuck)
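
    For the avoidance of doubt, this is the sort of thing meant by duck typing; a minimal Python sketch with invented names:

    def describe(thing):
        # No declared types anywhere: this accepts any object that happens to
        # have a .quack() method, and blows up at runtime for anything else.
        return "it quacks: " + thing.quack()

    class Duck:
        def quack(self):
            return "quack"

    class Brick:
        pass

    print(describe(Duck()))   # fine
    print(describe(Brick()))  # AttributeError, discovered only at runtime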



  • @antiquarian may disagree with you about that.



  • I'm pretty sure he'd say Haskell is a contender for top 2 least bad.

    Haskell is definitely my favorite. I can't think of another language I actually want to use, though sometimes I have to use JavaScript.



    Ooooooh, I just misread what you were saying; I somehow managed to miss that this was 'least bad'. *facepalm*



  • I'd say that being niche can certainly make a language bad.



  • It depends on the niche.

    "All" the recent CS grads know it or ML. Academia probably isn't going to change to another paradigm until research opportunities in Haskell run out. That's going to be a while.

    Honestly, IMO, OO is on its way to becoming "legacy", just through sheer force of training inertia and supply and demand. Note that this is how OO became dominant, as well.

    In fact, Haskell programmers currently get paid less than OO programmers, because there is a glut of people who want to do Haskell and are willing to take a pay cut for it.



    Supply and demand. It's crazy that what you get paid to use a language depends in large part on someone who knows very little about it having chosen it for their company.



  • Go is the least bad.</thread></topic>



  • @ben_lubar said:

    Go is the least bad.</thread></topic>

    I'm surprised you didn't react earlier.


  • Discourse touched me in a no-no place

    @Captain said:

    "All" the recent CS grads know it or ML. Academia probably isn't going to change to another paradigm until research opportunities in Haskell run out. That's going to be a while.

    It depends on what you're doing. I could well believe the algorithms side of things being keen on Haskell (and it's a great teaching language, as it's conceptually similar to maths but different to anything students have programmed with before) but the data-heavy side of CS research mostly seems to favour Java or C# (depending on platform) with quite a lot of Ruby and Python too.


  • 🚽 Regular

    @cartman82 said:

    Mongo: Doesn't scale as well as people hoped, it's nice to play around, but would you really trust your core data to it?



  • So in short...

    @Zecc said:

    would you really trust your core data to it?

    No.


  • BINNED

    @Captain said:

    Haskell uses static type inference all over the place. But the compiler throws errors if it could possibly be ambiguous. I'd say Haskell is a contender for top 5 least bad languages.

    The last time I did anything in Haskell, I declared types for everything before putting in any real function definitions and used the compiler to check for errors.

    I'm seriously looking into Ada these days for a home project. It has a lot of the type safety features of Haskell, but the code is more readable (even other people's code).


  • Discourse touched me in a no-no place

    @antiquarian said:

    I'm seriously looking into Ada these days for a home project. It has a lot of the type safety features of Haskell, but the code is more readable (even other people's code).

    ROFL--that's because Ada is written like English was 100 years ago: overly verbosely.

    Also: "Oh, I'm sorry, you only instantiated Text_IO for 23 out of the 24 integer subtypes you've defined? Too bad! Here's 6 pages of compiler diagnostics that will only suggest what went wrong." Although maybe that was just the VMS version. Had I been writing the compiler, that error would've been replaced with "you probably forgot to instantiate Text_IO for the integer subtype declared on line 23."


  • BINNED

    Badge Request: Resident Troll



  • @tarunik said:

    JPA gets the single biggest thing wrong it can, and that's putting the query syntax outside the language it's operating in. Then again, that's because Java threw the overloaded operator baby out with the bathwater, I bet...

    I personally don't like overloaded operators for query syntax (my only gripe with Python), but that's because I'm used to C#'s gratuitous use of lambdas.


  • Discourse touched me in a no-no place

    @FrostCat said:

    Also: "Oh, I'm sorry, you only instantiated Text_IO for 23 out of the 24 integer subtypes you've defined? Too bad! Here's 6 pages of compiler diagnostics that will only suggest what went wrong." Although maybe that was just the VMS version. Had I been writing the compiler, that error would've been replaced with "you probably forgot to instantiate Text_IO for the integer subtype declared on line 23."

    That brings back some memories. My first boss had a hard-on for Ada's complicated integer subtypes, and so our first attempt at writing software to realize his vision had all that complicated stuff and a whole custom type algebra to deal with it. Brain-bending stuff, especially once you start trying to make everything work sensibly with arithmetic; I still don't know what would have happened when we got around to looking at dealing with division.

    Then the more practical members of the team (everyone except my boss) decided that this was just nuts and switched integers to being a simpler model (with variants based on a count of how many bits were used and whether the value was signed) which gave something we could actually work with and which our users greatly preferred.



  • @dkf said:

    My first boss had a hard-on for Ada's complicated integer subtypes

    Completely unfamiliar with Ada.

    Named subtype
    A subtype which has a name assigned to it. “First subtypes” are created with the keyword type (remember that types are always anonymous, the name in a type declaration is the name of the first subtype), others with the keyword subtype. For example:

    Stopped reading there. Headache.



  • @Shoreline said:

    I would like to unify this forum's opinions and decipher the least bad programming language.

    That's easy: HQ9+


  • BINNED

    One of the things you can use subtypes for is distinguishing numbers that are used as measurements from numbers that are used as prices. The compiler will then give you a slap on the wrist if you mistakenly use one where the other is needed.


  • Discourse touched me in a no-no place

    "Ada's complicated integer subtypes"

    The trick is[1] to declare subtypes of Integer instead of distinct integer types. Subtypes are type-compatible; types aren't.

    [1] it was 20 years ago when I was using Ada in college. It's possible--but unlikely--that's been changed.


  • Discourse touched me in a no-no place

    Heh. I had to go look up the syntax, it's been so long.

    declare
       type X1 is range 1 .. 10;  -- you'd use this frequently as array bounds
       type X2 is range 1 .. 10;
       y1 : X1;
       y2 : X2;
    begin
       y1 := 3;
       y2 := y1;  -- compile-time error! X1 and X2 are distinct types
    end;

    But:

    declare
       subtype W1 is Integer range 1 .. 10;
       subtype W2 is Integer range 1 .. 10;
       u1 : W1;
       u2 : W2;
    begin
       u1 := 3;
       u2 := u1;  -- NO PROBLEM! W1 and W2 are both just views of Integer
    end;


  • Discourse touched me in a no-no place

    @FrostCat said:

    it was 20 years ago when I was using Ada in college.

    I was just trying to make an IDE and graphical programming language that used them, not understand that damned type logic. As noted above, the users also hated the complexity; it went…



  • Not surprisingly, no one has mentioned my own preferred language (old-timers on the forum may or may not recall what that is, but I'm not saying lest I provoke a firestorm). I suppose it is just as well.

    As for worst, I vote for COBOL, though I understand MUMPS was even worse. For more modern work, I am currently suffering under Apex, a proprietary Java-esque mess that is used in SalesForce CRM. Given that CRM is a crappy idea to begin with, and that SleazeFarce, being the first ones to bring that bad idea to market, made all the mistakes possible to make it even worse, it is no surprise that I am on the verge of a nervous breakdown daily. If it weren't for the fact that I have a really good boss and teammates, and a decent overall work situation, I'd have walked by now.



    IMHO worrying about spaces once (when you configure your editor not to be a dick) is a small price to pay for all the ambiguity that Python removes.

    There are few languages out there with as unambiguous, and yet reasonably concise syntax as Python.


  • Discourse touched me in a no-no place

    @unwesen said:

    There are few languages out there with as unambiguous, and yet reasonably concise syntax as Python.

    Whereas I think Python's got too much syntax and too many inconsistencies.



  • That makes you a parser, not a human being ;)

    I mean, fair enough, everyone is entitled to their opinion. I just don't understand what you're thinking of when you're making that statement. Perhaps you would like to elaborate?

    FWIW, I can only agree with your comment about Python threads. It's the one big issue I have with the language.


  • Discourse touched me in a no-no place

    @unwesen said:

    Perhaps you would like to elaborate?

    Python's got too complicated a syntax for my taste. (I like Tcl, and that has fewer formal syntax rules than Lisp.)

    But more to the point, Python's got a complicated emergent syntax, at the level above the “how to parse these letters” level. Things like the _ prefixes for lots of things are a part of this, as is the way you handle passing a reference to an object instance to a method. This all makes Python code rather noisy. Ruby did a better job. (Ruby's key problems are at the semantic level. And the formal syntax is still too fussy IMO.)

    (The whitespace-indent thing is exactly what I won't criticise Python for. I don't like it much, but it's an entirely reasonable thing to do.)
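
    To make the "noisy" complaint concrete, here's a throwaway sketch (class and names invented for illustration) of the underscore-and-explicit-self boilerplate being described:

    class Account:
        def __init__(self, balance):     # dunder method, explicit self
            self._balance = balance      # leading underscore: "private" by convention only

        def _apply_fee(self, fee):       # every method spells out self again
            self._balance -= fee

        def __repr__(self):              # another dunder, this one for display
            return "Account(%r)" % self._balance

    Whether that reads as noise or as healthy explicitness is, evidently, a matter of taste.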



  • oh good lord. Fuck you, Discourse, fuck your lying preview panel, and fuck everyone involved in your development.



  • Oh, no, COBOL was sweet!

    Whenever your current block (paragraph) of code started to get too obscure, or if you repeated some logic, you just pulled it out and threw in a new paragraph!

    No fussy overhead with parameters, call by ref vs. value, blah, blah..

    Granted we had to work on our paragraph number/naming standards to make it all simple, but it worked well for what we were doing, and it was super easy to follow someone else's code.


    Filed under: the only really bad bug we ever had was when somebody left a period in column 127 by accident.



  • @ScholRLEA said:

    For more modern work, I am currently suffering under Apex, a proprietary Java-esque mess that is used in SalesForce CRM. Given that CRM is a crappy idea to begin with, and that SleazeFarce, being the first ones to bring that bad idea to market, made all the mistakes possible to make it even worse, it is no surprise that I am on the verge of a nervous breakdown daily. If it weren't for the fact that I have a really good boss and teammates, and a decent overall work situation, I'd have walked by now.

    VisualForce, Apex, and SOQL can all go DIAF. I'm never touching SFDC code again.


  • BINNED

    @chubertdev said:

    VisualForce, Apex, and SOQL can all go DIAF. I'm never touching SFDC code again.

    @ScholRLEA said:

    I am currently suffering under Apex,

    I am now starting to suffer under Apex... And at least SalesForce is not involved (a small kindness).
    Misery does love company.





  • Doctor Zoidberg, soaking in brine.



    Last time I had to use Haskell, I had to figure out which of the dozens* of feature-incomplete string implementations had the method I needed (something as esoteric as substring replacement) and how to go from one to another and back. IIRC you have to go from whatever you're using to the right flavour of Data.Text and back, unless your string replacement is printf-style, in which case Data.Text doesn't work and you should use [Char] instead. All the while hoping that the conversion doesn't lose unnecessary information such as encoding...

    It's not like Haskell lacks the necessary sophistication to abstract those differences away (I suppose you could store strings in Either [Char] Text and transparently use whichever implementation can do each operation more efficiently and then convert the result from one representation to the other for a little performance penalty).

    This doesn't strike me as great language design. My beloved Python has very similar woes with date-time representations, all alike, all different, all incomplete.

    *slight exaggeration


  • Discourse touched me in a no-no place

    @bp_ said:

    My beloved Python has very similar woes with date-time representations, all alike, all different, all incomplete.

    That's understandable, as date-time is hard. Genuinely hard. Lots of arbitrary annoyances (timezones are ground-zero for that) and what's worse, lots of people assume that it's simple because they can read a calendar or something…



  • Possibly, but why have both time.struct_time and datetime.datetime? Just as an example.



  • @bp_ said:

    Possibly, but why have both time.struct_time and datetime.datetime? Just as an example.

    You can use time.struct_time to interact with the underlying buggy libc implementation; when you need something that actually works, use the other one. Choice is good, right?
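
    For anyone who hasn't hit this, a minimal sketch of the duplication in question (standard library only, nothing hypothetical):

    import time
    from datetime import datetime, timedelta

    # The libc-flavoured way: a read-only, tuple-like struct with no arithmetic.
    t = time.localtime()
    print(t.tm_year, t.tm_mon, t.tm_mday)

    # The object-flavoured way: arithmetic, comparisons, and formatting built in.
    d = datetime.now()
    print((d + timedelta(days=7)).isoformat())

    Converting between the two (time.mktime, datetime.fromtimestamp, datetime.timetuple) is its own little adventure.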




  • Discourse touched me in a no-no place

    @VinDuv said:

    You can use time.struct_time to interact with the underlying buggy libc implementation; when you need something that actually works, use the other one.

    Heh heh. It seems to be a rite of passage for languages to rewrite their datetime handling from scratch a few times just to try to avoid the problems of the older code (e.g., the libc code, which isn't just broken, but is broken differently on different platforms). At least then they get to own the bugs properly…



  • There should be one — and preferably only one — obvious way to do it.

    Especially when it gets down to a language's standard library. I'm saddened Python 3 didn't deprecate the time module then.



  • Surprised no one mentioned the bestest language EVA!

    PAL-8


  • kills Dumbledore

    @TheCPUWizard said:

    PAL-8

    No, it could never reach the heights of PAL-5. By PAL-7 they'd completely jumped the shark



  • @Jaloopa said:

    No, it could never reach the heights of PAL-5. By PAL-7 they'd completely jumped the shark

    Clearly you were not doing software development 40-45 years ago. Neither of those ever existed. The "8" refers to the hardware the code ran on, and was not a version number.


  • kills Dumbledore

    You're showing your lack of experience here. Obviously the 8 was the hardware, but how do you think that hardware was versioned?



  • No; the Python way is to keep all broken obsolete libraries around for the purposes of pointlessly wasting hours of developer time every time one needs to, for example, find a library to talk to a SOAP web service.

    After using the damned thing, I honestly don't get the Python praise. It gets a lot wrong. It's been around since 1991 and has tooling far more primitive than C# (which has only been around since 2001). The authors fucked-over all their devs with the 3.0 thing. The state of the libraries is ludicrously awful... like even worse than JavaScript libraries.

    The sad thing is, the language itself is pretty great. All the fuck-ups are peripheral to the language itself. But they're still fuck-ups.



  • @blakeyrat said:

    No; the Python way is to keep all broken obsolete libraries around for the purposes of pointlessly wasting hours of developer time every time one needs to, for example, find a library to talk to a SOAP web service.

    I hope that someday you'll be forced to find a different example in deficiencies in the Python libraries than SOAP web services :)

    There was a timid attempt to improve tooling by extending Python function definition syntax in version 3 to include arbitrary parameter and return value annotations:

    def foo(bar: int, baz: [2], *, egg: {"whatever": "really"}) -> "stuff":
        ...
    

    ...with the idea that somebody would take the syntax and the reflection APIs and build libraries around them.

    AFAIK, it didn't quite happen. It does get in the way of testing tools such as monkeypatching.
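
    For the record, the reflection half does exist: the annotation expressions are evaluated once at definition time and stored on the function object, where any library (or curious developer) can pick them up. A minimal sketch, reusing the hypothetical foo from above:

    def foo(bar: int, baz: [2], *, egg: {"whatever": "really"}) -> "stuff":
        ...

    # Nothing is enforced or even interpreted; the values just sit here:
    print(foo.__annotations__)
    # {'bar': <class 'int'>, 'baz': [2], 'egg': {'whatever': 'really'}, 'return': 'stuff'}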



  • urllib. urllib2. urllib3.

