Can I borrow an apology?


  • Banned

    @Bulb you are reading "actually designed" way too literally. Or not literally enough, that's also an option - ignoring the less convenient of two words. Being public doesn't make design better, I agree - look at C++ for example. But what does make design better is spending time on analyzing the many alternative approaches and thinking about how the language will be used in practice.

    There is very strong evidence JS had nothing of the sort. There is strong evidence that Python had nothing of the sort until they started planning 3.0. There is a fair amount of circumstantial evidence that neither Go nor Kotlin had such design phase, whether public or private. A language that did go through design phase doesn't drastically pivot in another direction 2 years down the line.



  • @dkf said in WTF Bites:

    really big errors (such as in the async code

There are some small errors in the async code due to immovable values not being included earlier, leading to the ugly Pin type. But otherwise … it is basically the only way that is compatible with the way ownership works and does not rely on a specific runtime to be useful in “non-hosted” environments. See Why async Rust?.

Note that it also works pretty much exactly the same way as in Python—though nobody ever talks about that, perhaps because few people actually use async Python. Except Rust still hasn't got the other half Python has had for ages: generators. That's a big omission.

    @dkf said in WTF Bites:

    big errors (in […] the floating point handling

    Which ones?



  • @Gustav said in WTF Bites:

    Being public doesn't make design better, I agree - look at C++ for example.

I consider the design additions to C++ that were added through the public process a couple of bits better than the base design that was done by one man alone. Though the rigidity of a formal ISO process is probably getting in the way a bit.

    @Gustav said in WTF Bites:

    A language that did go through design phase doesn't drastically pivot in another direction 2 years down the line.

    Like Rust? The Rust proposals of around 2012 have nothing in common with what eventually became Rust 1.0 in 2015. The final design is a lot simpler and cleaner, but also very different.

    It was probably a mistake to call Go version 1.0 before that phase was complete, but that phase itself is basically always there.


  • Banned

    @Bulb said in WTF Bites:

    @Gustav said in WTF Bites:

    A language that did go through design phase doesn't drastically pivot in another direction 2 years down the line.

    Like Rust? The Rust proposals of around 2012 have nothing in common with what eventually became Rust 1.0 in 2015.

Rust hadn't gone through its design phase by 2012. It WAS in the design phase.



@Gustav Yes. And the phase Go went through after 1.0, because the closed team couldn't get it right, was effectively also a design phase, though the authors had already declared it production ready—which it totally wasn't.


  • Banned

@Bulb also, there was no ground-up rewrite of Go (like the one in Rust pre-release), so to this day they are bound to the bad decisions they made in 2012. Everything that happened after Go 1.0 was incremental and on an as-needed basis.

The point is, Go had orders of magnitude less effort put into design than Rust - before, during, and after release - and that's a major reason why it sucks so much. Same with Kotlin. Same with Python. Same with JS.


  • BINNED

    @Zerosquare said in WTF Bites:

    @Bulb said in WTF Bites:

    I think it was … in JWZ's head, basically.

    Don't you mean Brian Eich's?

    :um-pendant:^2: *Brendan


  • ♿ (Parody)

    @dkf said in WTF Bites:

    The saddest thing about Rust is how little the core developers of it are aware of other programming languages.

    Interview with Senior Rust Developer in 2023 – 09:46
    — Programmers are also human


  • BINNED

    @Bulb said in WTF Bites:

    @Gustav said in WTF Bites:

    Being public doesn't make design better, I agree - look at C++ for example.

I consider the design additions to C++ that were added through the public process a couple of bits better than the base design that was done by one man alone. Though the rigidity of a formal ISO process is probably getting in the way a bit.

    Eh, kind of. It’s also lacking the coherent vision of the BDFL approach, and a lot of the additions that are made piecemeal need another minor release to actually interact well.

    Heck, a lot of the things added / proposed, including some things I desperately want, are only proposed for the standard library because of the horrible build / package management tooling story. And once they’re added, they’re inferior to their boost-or-whatever counterpart, and stuck that way forever because can’t-break-ABI. See regex.
    Now they’re proposing to add linalg. I already use a library for that (Eigen) and a BLAS implementation (MKL), I doubt the 3 major stdlib implementers have the manpower to beat what’s already available.


  • ♿ (Parody)

    @Bulb said in WTF Bites:

    @dkf said in WTF Bites:
    "big errors (in […] the floating point handling"

    Which ones?

    I assume this is referring to the "can't compare floats" brain fart.


  • Discourse touched me in a no-no place

    @boomzilla Exactly that. Someone read about NaN and didn't think of it as an error value.



  • This post is deleted!


  • @dkf said in WTF Bites:

    @boomzilla Exactly that. Someone read about NaN and didn't think of it as an error value.

Whether NaN is an error value or not does not change the way the comparison operators behave. There is no language where floats satisfy total ordering, because the IEEE 754 standard actually requires that NaN compares unordered with everything, including itself.

Now Rust could do better. It could have the base float type guaranteed non-NaN and use Option&lt;f64&gt; with NaN represented as None, but then almost everything would work on optional floats rather than plain floats, and the way it is is no worse than every other language, where floats are not totally comparable either (note that the &lt;, &gt; and == operators use the PartialOrd/PartialEq traits, so they work on floats just fine).
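A quick check of how this plays out in current Rust (everything here is std; total_cmp has been stable since 1.62):

```rust
fn main() {
    let nan = f64::NAN;
    // PartialEq/PartialOrd: NaN is unordered, even against itself.
    assert!(nan != nan);
    assert!(!(nan < 1.0) && !(nan > 1.0) && !(nan == 1.0));
    // On non-NaN values the operators behave as expected.
    assert!(1.0_f64 < 2.0);
    // When a total order really is needed, f64::total_cmp provides
    // the IEEE 754 totalOrder predicate (a positive NaN sorts last).
    let mut v = [3.0_f64, f64::NAN, 1.0];
    v.sort_by(|a, b| a.total_cmp(b));
    assert_eq!(v[0], 1.0);
    assert_eq!(v[1], 3.0);
    assert!(v[2].is_nan());
}
```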


  • ♿ (Parody)

    @Bulb said in WTF Bites:

    Now Rust could do better.

    Yeah. It's the same sort of :galaxy_brain: that goes, There was no year zero, so the decade of the 20s goes from 21 - 30. And then instead of just having one special case everything is lame.



  • @boomzilla said in WTF Bites:

    Yeah. It's the same sort of :galaxy_brain: that goes, There was no year zero

    Whenever someone says that I always think "OK. What was in between 1 BC and 1 AD ?"



  • @Gern_Blaanston said in WTF Bites:

    @boomzilla said in WTF Bites:

    Yeah. It's the same sort of :galaxy_brain: that goes, There was no year zero

    Whenever someone says that I always think "OK. What was in between 1 BC and 1 AD ?"

    Nothing. 1 AD immediately followed 1 BC in the usual way historians number years.



  • @boomzilla said in WTF Bites:

    @Bulb said in WTF Bites:

    Now Rust could do better.

    Yeah. It's the same sort of :galaxy_brain: that goes, There was no year zero, so the decade of the 20s goes from 21 - 30. And then instead of just having one special case everything is lame.

It would actually be quite impractical, and it wouldn't follow the IEEE 754 standard. Because while the most common way to get a NaN is division by zero, all arithmetic operations can result in a NaN if Inf(inity) is involved.

    It would make some sense to encode floats in the type system as enum Float { NaN(Signalling), AN(DefiniteFloat) }, enum DefiniteFloat { Inf(Sign), Fin(FiniteFloat) } (enum Signalling { Signalling, Quiet }, enum Sign { Positive, Negative }). But most of the time it would get in the way and you can define the restricted DefiniteFloat and FiniteFloat types in a library for the few cases when you need them.
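Spelled out as a compilable sketch, that encoding could look like the following. All of these types merely mirror the pseudo-definitions above (none of them are std types), and `classify` is a hypothetical helper; note a raw f64 can't portably distinguish signalling from quiet NaNs, so that arm is a simplification:

```rust
#[derive(Clone, Copy)]
enum Signalling { Signalling, Quiet }

#[derive(Clone, Copy)]
enum Sign { Positive, Negative }

#[derive(Clone, Copy)]
struct FiniteFloat(f64); // invariant: neither NaN nor infinite

#[derive(Clone, Copy)]
enum DefiniteFloat { Inf(Sign), Fin(FiniteFloat) }

#[derive(Clone, Copy)]
enum Float { NaN(Signalling), AN(DefiniteFloat) }

// Hypothetical conversion from a plain f64 into the layered encoding.
fn classify(x: f64) -> Float {
    if x.is_nan() {
        Float::NaN(Signalling::Quiet) // simplification, see above
    } else if x.is_infinite() {
        let sign = if x > 0.0 { Sign::Positive } else { Sign::Negative };
        Float::AN(DefiniteFloat::Inf(sign))
    } else {
        Float::AN(DefiniteFloat::Fin(FiniteFloat(x)))
    }
}

fn main() {
    assert!(matches!(classify(f64::NAN), Float::NaN(_)));
    assert!(matches!(classify(f64::INFINITY),
                     Float::AN(DefiniteFloat::Inf(Sign::Positive))));
    assert!(matches!(classify(1.5), Float::AN(DefiniteFloat::Fin(_))));
}
```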


  • 🚽 Regular

    @Gern_Blaanston said in WTF Bites:

    Whenever someone says that I always think "OK. What was in between 1 BC and 1 AD ?"

    New Year's Eve.


  • Banned

    @dkf personally, I'm way more interested in the async part of your post than the float part. I'm honestly super curious about which language they should've copied the solution from and what problem it would avoid in your opinion.



  • Are we still arguing about NaN? Makes PHP’s bullshit look quite sane 🍹


  • Discourse touched me in a no-no place

    @Bulb said in WTF Bites:

    It would actually be quite impractical indeed, and wouldn't be following the IEEE 754 standard. Because while the most common way to get a NaN is from division by zero, all arithmetic operations can result in a NaN if Inf(inity) is involved.

    Specifically, it's from 0.0 / 0.0 and other entirely undefined things (like taking the logarithm of a negative number) where you're outside the domain of the operation. Things that produce an infinity (or, more obscurely, a denormalised number) aren't a big problem.
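Those cases can be checked directly in Rust; the behaviour follows IEEE 754:

```rust
fn main() {
    // NaN comes from operations outside their domain:
    assert!((0.0_f64 / 0.0).is_nan());                 // 0/0 is undefined
    assert!((-1.0_f64).ln().is_nan());                 // log of a negative number
    assert!((f64::INFINITY - f64::INFINITY).is_nan()); // inf - inf
    // Plain division by zero is defined: it produces an infinity, not a NaN.
    assert_eq!(1.0_f64 / 0.0, f64::INFINITY);
}
```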

    The real place they come from most commonly is javascript because the idiots there don't understand jack shit about values that shouldn't be used.

    Operations on a NaN ought to be an error. An ideal use case for an unchecked exception (and the programmer damn well shouldn't divide by zero).

    (How does Rust handle the equivalent issues in integer math? 1 / 0 has no way to succeed there. NaNs ought to use the same model of failure handling.)

    @Gustav said in WTF Bites:

    @dkf personally, I'm way more interested in the async part of your post than the float part. I'm honestly super curious about which language they should've copied the solution from and what problem it would avoid in your opinion.

Async is one of those things that is quite complicated! The base reality is that you have functions that need to do work, stop and wait for something to occur, and then do some more processing. This can be done by arranging for another function to be called at that point, but doing that gets really confusing; the name of the game here is how to get the compiler to generate the confusing parts for us and instead have simpler structured code that includes points where you can suspend to wait for the triggering event.

    There are several possible strategies: Rust has gone with async functions and await which stops the current async function while another async function runs; that async function knows to resume its caller when it is done. The problem with that is the function colouring apocalypse; libraries end up having to do async versions of their functions just in case they are called through to a waiting task, despite not actually doing any async operations themselves.

I prefer to make the stacks more explicit; if running on such a stack, code can suspend and resume without ceremony, whereas if it is on the main stack then it runs synchronously. And only the functions actually doing the suspending and resuming need to know; their callers do not, except for the top-level operation of enabling the stack. (That way ends up being very much like green threads...) Colouring of functions is not needed (for this at least).

    My complaint about Rust's developers was that they come from backgrounds where this wasn't a common choice, so they didn't consider it as an option. Everyone makes that class of error from time to time — you don't know what you don't know — but they're arrogant enough to assert that they got it right for everything anyway and in some areas that's hitting the :doubt: of experienced devs with other backgrounds.


  • 🚽 Regular

    @dkf said in WTF Bites:

    a demoralised number

    Are they demoralised because they feel like they aren't normal?


  • Discourse touched me in a no-no place

    @Zecc Damnit! Autocarroted!



  • To be fair, the impact on performance of such numbers is pretty demoralizing. (They're a lot slower than regular floats.)


  • Banned

    @dkf said in WTF Bites:

    There are several possible strategies: Rust has gone with async functions and await which stops the current async function while another async function runs; that async function knows to resume its caller when it is done. The problem with that is the function colouring apocalypse; libraries end up having to do async versions of their functions just in case they are called through to a waiting task, despite not actually doing any async operations themselves.

It's true function coloring is a problem, but you've got the reason backwards. An async function can call a normal function directly just fine. Just think about it. Sorting a vector is a normal function. Can you sort vectors in async functions? Is there a special async version of the sort function to support it?

Coloring is a problem when you want to call an async function from a regular non-async function. It isn't exactly impossible - nothing stops you from making the call and repeatedly polling the returned future in place, effectively making a local impromptu scheduler - but it does mean an async function calling a non-async function calling an async function that blocks is a deadlock. Also, arguably more importantly, you cannot pass an async lambda to a function expecting a regular lambda. This means an async function cannot use iterator adapters (map, filter etc.) and must do the iteration manually if it wants to await mid-iteration, leading to much more tedious code than the non-async equivalent.
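That "local impromptu scheduler" can be sketched with nothing but std. Here `compute` is a hypothetical stand-in for a real async operation, and `block_on_naive` is what a non-async function has to improvise to drive a future; a real executor would park the thread instead of spinning:

```rust
use std::future::Future;
use std::pin::pin;
use std::task::{Context, Poll, RawWaker, RawWakerVTable, Waker};

// Hypothetical stand-in for an async operation; real code would await I/O.
async fn compute() -> i32 {
    21 * 2
}

// A waker that does nothing, so sync code can poll a future in place.
fn noop_waker() -> Waker {
    fn clone(_: *const ()) -> RawWaker {
        RawWaker::new(std::ptr::null(), &VTABLE)
    }
    fn noop(_: *const ()) {}
    static VTABLE: RawWakerVTable = RawWakerVTable::new(clone, noop, noop, noop);
    unsafe { Waker::from_raw(RawWaker::new(std::ptr::null(), &VTABLE)) }
}

// The impromptu scheduler: a non-async function driving a future to completion.
fn block_on_naive<F: Future>(fut: F) -> F::Output {
    let mut fut = pin!(fut);
    let waker = noop_waker();
    let mut cx = Context::from_waker(&waker);
    loop {
        if let Poll::Ready(v) = fut.as_mut().poll(&mut cx) {
            return v;
        }
        // A real executor would park here until woken; blocking in place
        // like this is exactly how the nested async-sync-async deadlock arises.
    }
}

fn main() {
    assert_eq!(block_on_naive(compute()), 42);
}
```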

    My complaint about Rust's developers was that they come from backgrounds where this wasn't a common choice, so they didn't consider it as an option.

Except they did. They considered pretty much every option under the sun. I believe you're talking about what the literature calls stackful coroutines, right? AFAIK it was their first idea; they actually had a working implementation back when Rust had a GC. They had some very good reasons to move away from that model - I think it was the reliance on a runtime, which they wanted to get rid of, and probably also something about cross-stack references (what if the pointed-at object disappears because the owning coroutine drops a stack frame?). After much deliberation they decided stackless coroutines are better after all for their case, and not because they don't know Fortran. I can send you links to the discussions when I find them, if you're interested.


  • BINNED

    Thanks to The Jeffening, I lost my place in re-reading the linked Zig thread, to catch up if something interesting has happened since.

    All hail ⛔ 👶 bugs.


  • ♿ (Parody)

    @topspin said in Can I borrow an apology?:

    I lost my place in re-reading the linked Zig thread

    It's for the best.



  • @boomzilla said in Can I borrow an apology?:

    @topspin said in Can I borrow an apology?:

    I lost my place in re-reading the linked Zig thread

    It's for the bestgreat justice.

    🔧



  • @Gustav said in Can I borrow an apology?:

    Coloring is a problem

    Especially if you're a crayon-eating :ralph:.


  • Banned

    @HardwareGeek that's why Rust haters are the ones complaining about it the most 🚜


  • Discourse touched me in a no-no place

    @Gustav said in Can I borrow an apology?:

    They had some very good reasons to move away from that model, I think it was reliance on runtime which they wanted to get rid of, also probably something about cross-stack references (what if pointed-at object disappears because the owner coroutine drops a stack frame?).

    You guys like keeping boxes directly on threads' stacks? Much is explained. :tro-pop:

    For async to be relevant, you need a runtime anyway for the event sources if nothing else. It would be different if you were doing generators, as those are meaningful without a runtime, but anything with timers or IO events will end up tangling into the OS (or hardware directly) and that implies a runtime.

    But it could be worse. Have you seen the horrible mess that C++ offers for doing standard coroutine support? Severe yikes there!


  • Banned

    @dkf said in Can I borrow an apology?:

    @Gustav said in Can I borrow an apology?:

    They had some very good reasons to move away from that model, I think it was reliance on runtime which they wanted to get rid of, also probably something about cross-stack references (what if pointed-at object disappears because the owner coroutine drops a stack frame?).

    You guys like keeping boxes directly on threads' stacks? Much is explained. :tro-pop:

    Of course we do. It's free performance. Why wouldn't you want it?

    For async to be relevant, you need a runtime anyway for the event sources if nothing else. It would be different if you were doing generators, as those are meaningful without a runtime, but anything with timers or IO events will end up tangling into the OS (or hardware directly) and that implies a runtime.

Well, yeah, but also no. Depends on what you mean by runtime. I meant the mandatory language runtime that you cannot build your programs without. Runtime-less is actually more like runtime-implemented-in-application-code-directly. Also called bare metal. You know, like an OS or an embedded program. Something you've done for decades.

    If you want async to work in embedded context, you must go runtime-less route. And if you go runtime-less, you must go stackless, and once you go stackless, you pretty much must do coloring.

    I'm still missing which literature they should've read to design it better, and how it all relates to LISP.

    But it could be worse. Have you seen the horrible mess that C++ offers for doing standard coroutine support? Severe yikes there!

    I did. I'm happy I managed to forget everything about it. Please refrain from reminding me thank you.


  • ♿ (Parody)

    @HardwareGeek said in Can I borrow an apology?:

    @Gustav said in Can I borrow an apology?:

    Coloring is a problem

    Especially if you're a crayon-eating :ralph:.

    Semper Fi!


  • BINNED

    @dkf said in Can I borrow an apology?:

    But it could be worse. Have you seen the horrible mess that C++ offers for doing standard coroutine support? Severe yikes there!

    Just this week!
    I mean, I have remotely followed it for a long time, but I just got my hands on it for the first time, after I had long concluded that C++20 was mostly DOA on all fronts.
    Having a bit of a downtime, a colleague had a puzzle and, as usual, I was more interested in programming than in mental arithmetic, so I coded up a solution in python using recursion and generators. But when I decided to crunch it for all possible inputs I figured I should rewrite it in C++ for better performance. Now, I know that C++20 didn't actually deliver anything useful except the compiler machinery, but generators are the simplest case. Hmm, std::generator is C++23, but surely that's already implemented, right?
    LOL, NOPE:

(two screenshots: compilers reporting std::generator as not implemented)

    So I had to download a coroutine library from github first, compile that, which then didn't work ... and yeah, that's for the trivial case. The Python version was more fun, this felt like a chore.


  • Banned

    @topspin said in Can I borrow an apology?:

    But when I decided to crunch it for all possible inputs I figured I should rewrite it in

    :coding-yay:

    C++

    :coding-ohno:


  • BINNED

    @Gustav From memory:

    Python stupid solution (building a string representation and using eval to check if it's actually the solution): 20s
    Python non-stupid solution (building both an evaluated expression for arithmetic and a string representation for output): 1s
    Compiling with Cython: 300ms
    Rewriting in C++: ~20ms
    Rewriting as a tangled mess of a state machine in C without any memory allocations: fast exercise left for the reader.

    🏆


  • Banned

    @topspin I meant you almost did the meme.

    17c8599b-8917-4734-95ca-59686a39a208-54575453-bec6fc80-49b9-11e9-862a-2560348dcc4b.png


  • BINNED

    @Gustav it actually would have been a good learning opportunity. Alas, it was also a learning opportunity for C++ coroutines, and not a pleasant one.



  • @Medinoc said in Can I borrow an apology?:

    In the end, seriously, among the slew of "new" programming languages, is Rust worth learning? (and if not Rust, which one?)

    I don't want to fall too behind technologically, since I'd rather not be fired and replaced by an upcoming youth in twenty-plus years when I'm five years away from retirement.

    It is fun, but I don't see many job posts asking for it and they're mostly about Blockchain things



  • @dkf said in WTF Bites:

    @Bulb said in WTF Bites:

    It would actually be quite impractical indeed, and wouldn't be following the IEEE 754 standard. Because while the most common way to get a NaN is from division by zero, all arithmetic operations can result in a NaN if Inf(inity) is involved.

    Specifically, it's from 0.0 / 0.0 and other entirely undefined things (like taking the logarithm of a negative number) where you're outside the domain of the operation. Things that produce an infinity (or, more obscurely, a denormalised number) aren't a big problem.

    The real place they come from most commonly is javascript because the idiots there don't understand jack shit about values that shouldn't be used.

Those NaNs will either fail to deserialize into integers (because javascript does not distinguish integers and floats, so they'll appear even in places that are supposed to be integers), or be caught by validation, because data received from another system should be validated anyway.

    Operations on a NaN ought to be an error. An ideal use case for an unchecked exception (and the programmer damn well shouldn't divide by zero).

    (How does Rust handle the equivalent issues in integer math? 1 / 0 has no way to succeed there. NaNs ought to use the same model of failure handling.)

    1/0 will panic. 🤔 … and so will integer overflow, which is a deviation from the standard platform behaviour, though for performance reasons that check is turned off in release builds.

There are also checked, unchecked, saturating, overflowing and wrapping variants of each operation, so you can specify the behaviour explicitly in your code if you care enough.

    Which means, nobody cared enough about the floating point arithmetic to give it the same treatment, but it probably would be accepted if someone proposed it.
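For instance, the explicit integer variants look like this (all methods are std; the u8 value is arbitrary):

```rust
fn main() {
    let x: u8 = 250;
    assert_eq!(x.checked_add(10), None);          // overflow reported as None
    assert_eq!(x.saturating_add(10), 255);        // clamped at u8::MAX
    assert_eq!(x.wrapping_add(10), 4);            // modular: 260 mod 256
    assert_eq!(x.overflowing_add(10), (4, true)); // wrapped value plus a flag
    // Division by zero has no wrapping escape hatch; checked_div reports it.
    assert_eq!(1_i32.checked_div(0), None);
}
```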


  • Discourse touched me in a no-no place

    @Bulb You just state that any floating point operator or (standard) function may choose to panic if it would otherwise have produced a NaN as a result, unless otherwise explicitly stated (allowing an is_nan() to be defined). Which damn well should be a good hint to people to check the damn domain if they care.


  • 🚽 Regular

    @dkf said in WTF Bites:

    Which damn well should be a good hint to people to check the damn domain if they care.

    Found your problem.



  • @bulb, @dkf: check the above posts ; boomzilla jeffed that topic.



  • @dkf said in WTF Bites:

    allowing an is_nan() to be defined

    We never define is_grandad() though. I feel this is a missed opportunity.


  • 🚽 Regular

    @Zerosquare said in Can I borrow an apology?:

    @bulb, @dkf: check the above posts ; boomzilla jeffed that topic.

    Yes, into this topic. :wtf-whistling:


  • Discourse touched me in a no-no place

    @Zecc said in Can I borrow an apology?:

    @Zerosquare said in Can I borrow an apology?:

    @bulb, @dkf: check the above posts ; boomzilla jeffed that topic.

    Yes, into this topic. :wtf-whistling:

    The world vs the Rust-edons.


  • ♿ (Parody)

    @Arantor said in Can I borrow an apology?:

    @dkf said in WTF Bites:

    allowing an is_nan() to be defined

    We never define is_grandad() though. I feel this is a missed opportunity.

    pop() is pretty common though. You should be able to use that twice in many circumstances.



  • @boomzilla said in Can I borrow an apology?:

    @Arantor said in Can I borrow an apology?:

    @dkf said in WTF Bites:

    allowing an is_nan() to be defined

    We never define is_grandad() though. I feel this is a missed opportunity.

    pop() is pretty common though. You should be able to use that twice in many circumstances.

    And sometimes you can find spawn() to go the other way.



  • @boomzilla that would only cover half the cases



  • Stuff that reminds me of this topic:

