WTF Bites


  • Considered Harmful

    @cvi said in WTF Bites:

    @error said in WTF Bites:

    The weirdest thing is when you go to a live sportsball event but you're so high up in the rafters that the players look like ants, so you watch them on the jumbotron.

    Yes. In this particular case it was combined with the fact that none of the people I was there with really had any clue as to what was going on. It probably would have been a better experience overall if that hadn't been the case (and if somebody in our crowd had actually given a damn about the outcome).

    But, even so, it seemed that the game wasn't really that important. At least half of the "entertainment" seemed to stem from other sources (various shoutouts, crowd games, the "hug cam", etc etc.).

    Baseball is about combining beer with mild heatstroke.


  • Considered Harmful

    @dcon said in WTF Bites:

    @cvi said in WTF Bites:

    @Zerosquare said in WTF Bites:

    "But the transcript didn’t make any sense."

    How bad can it be? Well:

    "So humidity is desk a beat-up. Sociology, does it iron? Mined material nematode adapt. Secure location, mesons the first half gamma their Fortunes in for IMD and fact long on for pass along to Eurasia and Z this particular location mesons."

    Still more sensible than @Gribnit.

    Proof:

    @Gribnit said in WTF Bites:

    Avoid cackling, at all costs. Any peculiarly long, deep, or hollow laughter is also probably weighted as negative, most likely. Screams in the background can also indicate potential workplace fit issues.

    Ah. The subclause "most likely" is redundant here. Thank you.



  • @cvi said in WTF Bites:

    @TimeBandit said in WTF Bites:

    I left in the middle of the 3rd inning

    IIRC we were told they'd stop serving alcoholic beverages at/after the Xth inning. They indeed stopped serving alcoholic beverages at some point, but we never quite figured out the correlation to anything that was happening in the game.

    I would suspect that the Xth inning is Y units of time before the end of a "typical" game, so that gives people time to sober up slightly before attempting to drive home. Whether that achieves its presumed purpose is an entirely different question.


  • Considered Harmful

    @HardwareGeek said in WTF Bites:

    @cvi said in WTF Bites:

    @TimeBandit said in WTF Bites:

    I left in the middle of the 3rd inning

    IIRC we were told they'd stop serving alcoholic beverages at/after the Xth inning. They indeed stopped serving alcoholic beverages at some point, but we never quite figured out the correlation to anything that was happening in the game.

    I would suspect that the Xth inning is Y units of time before the end of a "typical" game, so that gives people time to sober up slightly before attempting to drive home. Whether that achieves its presumed purpose is an entirely different question.

    Are you talking about the Flask Innings?


  • I survived the hour long Uno hand

    @HardwareGeek said in WTF Bites:

    @cvi said in WTF Bites:

    @TimeBandit said in WTF Bites:

    I left in the middle of the 3rd inning

    IIRC we were told they'd stop serving alcoholic beverages at/after the Xth inning. They indeed stopped serving alcoholic beverages at some point, but we never quite figured out the correlation to anything that was happening in the game.

    I would suspect that the Xth inning is Y units of time before the end of a "typical" game, so that gives people time to sober up slightly before attempting to drive home. Whether that achieves its presumed purpose is an entirely different question.

    That, and given the pacing of a baseball game, they don't want to risk the game going into extra innings and fans dropping due to alcohol poisoning :half-trolling:


  • Considered Harmful

    @Gribnit said in WTF Bites:

    @Gąska said in WTF Bites:

    @LaoC said in WTF Bites:

    I can't explain the offside rule either

    Are we still talking about men pretending to be women?

    Best to assume all 4 contexts simultaneously from this point forward, really.

    I am now drained, in every sense of the word.


  • Notification Spam Recipient

    @cvi said in WTF Bites:

    Filed under: Framesets, another 90s technology

    Yes, you need to open the page in an era-appropriate browser.


  • Notification Spam Recipient

    @HardwareGeek said in WTF Bites:

    Or do you just spend 3 hours wishing it would kill you and end the pain?

    They say seasickness works in 2 stages:

    Stage 1: You are scared you are going to die
    Stage 2: You are scared you aren't going to die


  • Discourse touched me in a no-no place

    @HardwareGeek said in WTF Bites:

    Or do you just spend 3 hours wishing it would kill you and end the pain?

    It's a form of sportsball, so …


  • ♿ (Parody)

    @Gribnit said in WTF Bites:

    I wonder if there's a decent anime with Bob Newhart.

    Such a thing could be what gets me to finally watch some japanimation as an adult, now that you mention it.


  • :belt_onion:

    Error message upon booting up

    explorer.png



  • WTF of my day: So, I've run out of USB slots on my mainboard. No worries, simply add a PCI-USB-card, plug in the proper SATA connector so it gets 5 V and everything is hunky-dory, right?

    Except for this very weird bug I just ran across. I did some more cleaning up of my case and, for that, disconnected every cable. After I was done, I reconnected everything, USB3 cables going into USB3 slots, the rest wherever.

    PC boots up, everything is fine. Except that when I tried to play a YouTube video in Edge Chromium, the mouse cursor first freezes, then vanishes. Everything else works fine, just the mouse cursor disappears. It's also not merely "invisible" but is completely gone, no further mouse interactions possible until I ALT-F4 the browser window.

    Weird. A bug in Edge Chromium? Let's do a web search for that. Turns out that it's not YouTube which triggers it but any kind of video overlay in the browser. And guess what search results nowadays return? Helpful videos related to your search. So, basically, I could see the search results but not really interact with them because the cursor always disappeared.

    Alright, very annoying. Let's switch to Firefox?

    Same thing. Seems like the browser merely triggers the bug but the actual problem sits somewhere else.

    So I retraced my steps, looking at everything I'd done that was video/graphics- or mouse-related.

    Turns out that the culprit was the mouse being plugged into the PCI-USB-card. When I moved it to one of the motherboard's USB slots, the problem vanished.

    :wtf:



  • Uh-oh. I'm getting @Tsaukpaetra vibes from your machine. This can't be good...



  • @Zerosquare said in WTF Bites:

    Uh-oh. I'm getting @Tsaukpaetra vibes from your machine. This can't be good...

    I've ruined a USB port on the motherboard with a short circuit / water (tea, technically). Aside from that USB port specifically, there's definitely been some weirdness with USB after that (I think it might be when I'm drawing a lot of power from the ports).

    Since then, I've been looking at swapping out the motherboard/upgrading the CPU (the current one is relatively old at this point). But :kneeling_warthog:. I don't trust Windows to handle the swap gracefully, and reinstalling+restoring is way too much effort. (Even though getting rid of the accumulated garbage on the Windows side would likely be rather good.)



  • @cvi said in WTF Bites:

    Since then, I've been looking at swapping out the motherboard/upgrading the CPU (the current one is relatively old at this point). But :kneeling_warthog:. I don't trust Windows to handle the swap gracefully, and reinstalling+restoring is way too much effort. (Even though getting rid of the accumulated garbage on the Windows side would likely be rather good.)

    I actually did that. It works way better than expected - you might just have to install some drivers manually afterwards. After all, it's hard to download drivers from Windows Update when the driver for the network card is missing.

    Oh, and Windows might complain about it being unregistered afterwards. But you can usually transfer the license.


  • Considered Harmful

    @error said in WTF Bites:

    you can find the most depraved shit imaginable in anime form

    Sadly, this is false.


  • Considered Harmful

    @Gąska said in WTF Bites:

    crossdressing to hide identity is far more common there than being trans.

    Also the case for The Three Stooges. And Harpo.


  • Considered Harmful

    @MrL said in WTF Bites:

    @Gąska said in WTF Bites:

    Do you watch any TV series? Anime isn't much different than any other kind of TV series. Except that it's Japanese, so the common tropes and cultural references are going to be totally different than in US productions.

    I'm not an anime fan. That would be like saying I'm a cinema fan, or a cartoon fan, or a prose fan. Anime can be anything, just like a book can be anything - from comedy to sci-fi drama to psychological horror. Anime just describes the animation style that's most popular in Japan. I'm a fan of a few TV shows that just so happen to be anime, because they're good shows.

    I used to say that I hate anime, because I don't like Japanese culture. I always thought that Ghost in the Shell is one of the best movies ever made, but in my mind it was the single outlier.

    During the pandemic I started watching movies 'considered to be the best', to kill time and get a breather from Netflix's mediocre mush. Among them was some anime, and I must say I changed my mind on anime completely. Sure, there is a lot of anime that is not worth my time, but the good stuff can be really great.

    GITS, GITS SAC, Akira, Cowboy Bebop, Armitage III, Howl's Moving Castle, Kiki's Delivery Service, My Neighbor Totoro, Porco Rosso - all really worth seeing.

    I'm about to start Planetes now.

    IMO, "Vampire Hunter D" and the "Bastard!!!" series are both worth seeking out.


  • Considered Harmful

    @TimeBandit said in WTF Bites:

    @HardwareGeek said in WTF Bites:

    you think curling is exciting.

    I don't even call that a sport 🤷♂

    Seems to be more of a sort of cold-themed religious ritual.



  • @Rhywden said in WTF Bites:

    Turns out that the culprit was the mouse being plugged into the PCI-USB-card. When I moved it to one of the motherboard's USB slots, the problem vanished.

    Ugh, hate that stuff. I have one USB port on my motherboard I can't use or Windows will occasionally think whatever is connected to it has been removed and plugged back in. Hope you didn't have an external drive on that port.

    Moving your mouse and keyboard (if USB) to the motherboard is for the best, though; if they're not there you might not be able to use them in UEFI/BIOS, Windows Recovery, OS setup, etc.

    I've got enough USB 3 drives now that I ended up buying a powered USB 3 hub for them. They almost never get used at the same time so I haven't seen any performance issues, and now I've got plenty of free ports.



  • @Parody said in WTF Bites:

    Moving your mouse and keyboard (if USB) to the motherboard is for the best, though; if they're not there you might not be able to use them in UEFI/BIOS, Windows Recovery, OS setup, etc.

    Naw, that actually worked fine - both mouse and keyboard worked in the motherboard's setup menus.




  • Considered Harmful

    @Rhywden it ruins it



  • I'm defining some data as compile-time constants. The data has some nested structures, some of whose members have different (but known) lengths. My first attempt compiled "fine" with GCC and Clang, but Visual Studio made a mess of it (possibly rightly; I was relying on some objects' lifetimes being longer than they are guaranteed to be).

    Second attempt. Less neat, more messy, and without the sketchy object lifetime assumptions. Compiles fine with GCC and Clang. Compiles fine with Visual Studio 19.27. Visual Studio 19.28:

    fatal error C1001: Internal compiler error.
    

    £$€¡@$£@#€&%!!.
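
    For flavour, a minimal sketch of the kind of construct involved -- the names and layout here are made up, not the actual code: nested compile-time data where the inner arrays have different but known lengths, with the arrays given static storage duration so nothing depends on temporaries outliving their scope.

        #include <cstddef>

        // Hypothetical sketch only -- not the code that triggered the ICE.
        struct Entry
        {
            int          id;
            const int*   values;  // points at a named constexpr array
            std::size_t  count;
        };

        // Giving the inner arrays static storage duration avoids relying on
        // the lifetime of temporaries (the sketchy part of the first attempt).
        inline constexpr int valuesA[] = { 1, 2, 3 };
        inline constexpr int valuesB[] = { 4, 5, 6, 7, 8 };

        inline constexpr Entry entries[] = {
            { 0, valuesA, 3 },
            { 1, valuesB, 5 },
        };

        static_assert(entries[1].count == 5, "lengths are known at compile time");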



  • @cvi Microsoft didn't throw the hairball that is MSC into the trash where it belongs and switch to clang yet? :half-trolleybus-l:


  • Considered Harmful

    @Bulb said in WTF Bites:

    @cvi Microsoft didn't throw the hairball that is MSC into the trash where it belongs and switch to clang yet? :half-trolleybus-l:

    Just as soon as nothing depends on it.


  • 🚽 Regular

    @Gribnit said in WTF Bites:

    @Bulb said in WTF Bites:

    @cvi Microsoft didn't throw the hairball that is MSC into the trash where it belongs and switch to clang yet? :half-trolleybus-l:

    Just as soon as nothing depends on it.

    Behold! The rare sensible @Gribnit post!


  • BINNED

    @Bulb said in WTF Bites:

    @cvi Microsoft didn't throw the hairball that is MSC into the trash where it belongs and switch to clang yet? :half-trolleybus-l:

    They do support (some kind of?) clang build now.
    As far as I know, their C++20 support for modules is ahead of clang/gcc though. Or maybe it was coroutines, or both.



  • @topspin I heard similar, but haven't tried myself. Not super excited about modules. They smell of build trouble and so far I'm not convinced they are worth it. (Isolating stuff from cluttering the namespace is neat, but a strictly ordered build is a high price to pay.)
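
    To make the "strictly ordered build" point concrete, a minimal hypothetical module split (the file names and the MSVC-style .ixx extension are assumptions, not anything from a real project): the interface unit has to be compiled before anything that imports it, which is exactly the ordering constraint that plain headers don't impose.

        // math.ixx -- module interface unit (hypothetical)
        export module math;

        export int add(int a, int b)
        {
            return a + b;
        }

        // main.cpp -- can only be compiled after math.ixx has been built
        import math;

        int main()
        {
            return add(1, 2);
        }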


  • BINNED

    @cvi said in WTF Bites:

    @topspin I heard similar, but haven't tried myself. Not super excited about modules. They smell of build trouble and so far I'm not convinced they are worth it. (Isolating stuff from cluttering the namespace is neat, but a strictly ordered build is a high price to pay.)

    I haven't used any of the "Big Four" yet. They all sound nice on the surface, but they are not entirely convincing on closer inspection.

    Modules: half-baked, mess with your build system.
    Co-routines: no library support.
    Ranges: "Oh, this looks nice", includes 7 billion lines to compile.
    Concepts: great in theory, maybe not quite as brain-dead mind-bending to use as the preliminary proposals made it look, but I didn't keep up.



  • @topspin Same-ish. With the exception of ranges, those are the features that would require committing to >= C++20, which I'm not quite yet ready to do.

    I'll have to catch up on co-routines. I saw an early demo that used them to hide stalls on reads from memory - that seems interesting (and didn't sound like it'd need much library support).

    Ranges ... well, apparently you get to pay for those 7 billion lines either way, since they're in <algorithm>, <utility> and whatnot. As you say, they look nice (which isn't to be underestimated). There's nanorange, so it might be usable without going full C++20.
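
    For reference, the pipeline style that "looks nice", as a minimal C++20 sketch (nothing project-specific assumed):

        #include <iostream>
        #include <ranges>
        #include <vector>

        int main()
        {
            std::vector<int> v{ 1, 2, 3, 4, 5, 6 };

            // Lazily keep the even values and square them; no temporary containers.
            auto evenSquares = v
                | std::views::filter([](int i) { return i % 2 == 0; })
                | std::views::transform([](int i) { return i * i; });

            for (int x : evenSquares)
                std::cout << x << ' ';   // prints: 4 16 36
        }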

    As for concepts: I have at least one very specific use case for them. It's currently a mess of SFINAEd functions, which doesn't exactly make the code look great, and has some implications for compile times. It's also one of the places that used to ICE VS a few versions ago ... so, in conclusion, I'll let the concepts implementations mature a bit before considering updating that code.
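
    As a minimal sketch of the kind of cleanup concepts allow -- the constraint and names below are invented for illustration, not the actual use case: the constrained template states the same restriction as the SFINAE version, just directly in the signature.

        #include <concepts>
        #include <type_traits>

        // SFINAE style (pre-C++20): the constraint hides in a defaulted template parameter.
        template <typename T,
                  typename = std::enable_if_t<std::is_integral_v<T>>>
        constexpr T twiceSfinae(T v) { return v + v; }

        // Concepts style (C++20): the same restriction, stated up front.
        template <std::integral T>
        constexpr T twice(T v) { return v + v; }

        static_assert(twiceSfinae(21) == 42);
        static_assert(twice(21) == 42);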



  • @cvi said in WTF Bites:

    Not super excited about modules. They smell of build trouble and so far I'm not convinced they are worth it.

    You are not going to introduce them into a running project, and I wouldn't recommend starting a new project in C++ at this time either way.



  • @Bulb said in WTF Bites:

    I wouldn't recommend starting a new project in C++ at this time either way.

    Not like there are too many other options.



  • @cvi You don't need too many other options, you only need one that is better. And for most cases where managed languages were not superior already, either Rust or Go is going to be better (as in will reach the desired stability and performance targets sooner).


  • BINNED

    @Bulb said in WTF Bites:

    Go

    cb4326f8-08f6-4fb2-8aec-9c7eb0dc4209-grafik.png



  • @Bulb Well, yes, one clearly better option would be sufficient. Alas...

    Rust is the one that comes close - some colleagues are using it. But for the most part, it just seems like trading one set of problems for another. For the actually difficult problems, it seems to be a crapshoot.

    Go, as far as I can tell, is focused on a different audience entirely. When I checked it out last, it seemed to try to solve problems that I'm not having, and not to give a shit about the ones that I do have. As for managed languages ... I'm mainly working on the stuff that managed languages tend to outsource to non-managed ones.



  • @cvi Which actually difficult problems? The biggest problem I usually have with C++ is that the thread synchronization sucks, and Rust is a massive improvement there.

    @topspin
    I am not really a fan of Go either, but for things that were in C++ mainly for portability and/or to avoid the huge Java runtime, it is a viable option. Yes, it is actually managed.


  • Considered Harmful

    @Zecc said in WTF Bites:

    @Gribnit said in WTF Bites:

    @Bulb said in WTF Bites:

    @cvi Microsoft didn't throw the hairball that is MSC into the trash where it belongs and switch to clang yet? :half-trolleybus-l:

    Just as soon as nothing depends on it.

    Behold! The rare sensible @Gribnit post!

    You think that was sensible? Bahahhahaha!

    That was just a long spelling of "never".



  • @Gribnit That's how we understood it. Old kludges never die.


  • Considered Harmful

    @Bulb said in WTF Bites:

    @Gribnit That's how we understood it. Old kludges never die.

    Ah - you are also "many fingers"? We smelled you was "tiny bubbles".



  • @Bulb said in WTF Bites:

    Which actually difficult problems? The biggest problem I usually have with C++ is that the thread synchronization sucks, and Rust is a massive improvement there.

    I think one of the main problems is ensuring that the data is in the right place, in the right form, at the right time. To give a bit of context, I'm mainly working on design of efficient algorithms/data structures. The motivation is either handling moderate amounts of data really fast (so a few ~100k to 100M elements in ~millisecond), or large-ish amounts of data not terribly slow (I'm mainly sticking to single machines and avoid clusters when possible, so this typically involves dealing with compressed data all the way -- we can manipulate datasets with >=2^48 "elements" in tens of milliseconds on a single machine). Either way, nailing down the layout of the data is half of the work.

    Fair part of this takes place on GPUs, so C++'s thread synchronization doesn't really matter too much. "Massively parallel" (in the GPU sense) stuff is largely about avoiding traditional synchronization in the first place, because you're more or less guaranteed to have bad contention if you start locking. (Not to mention that most synchronization primitives are quite expensive -- for example, a std::mutex is 40+ bytes, you're not going to embed those into a tree with a few million nodes).

    For more coarse-grain parallelism, C++'s stuff is ... adequate. There are 3rd party libraries that make it more tolerable (and some of them are quite good). I don't usually need to deal with this as much, so YMMV.

    How's Rust with stuff like figuring out core layout (hyper threading siblings, or big-little stuff) and binding threads to specific cores based on that? One of the places where I couldn't find any good C++ options, and parsing cpuid on x86 sucks.
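
    (For the binding part, at least, there is the Linux-specific pthread affinity API -- a minimal sketch, which deliberately skips the hard part of working out which logical cores are hyperthreading siblings:)

        // Linux/glibc only: pin the calling thread to one logical core.
        // g++ defines _GNU_SOURCE by default; otherwise define it before the includes.
        #include <pthread.h>
        #include <sched.h>

        bool pinCurrentThreadTo(int core)
        {
            cpu_set_t set;
            CPU_ZERO(&set);
            CPU_SET(core, &set);
            return pthread_setaffinity_np(pthread_self(), sizeof(set), &set) == 0;
        }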

    Edit: One of the features that I frequently wish for is the ability to define data structures more logically, without tying them to a specific representation. Simple example: a struct with two members can be stored either as SoA or AoS (plus a pile of other options, like alignment and padding). I'd like a language that lets me decouple the logical structure/interface from the in-memory storage more easily. No, I'm not yet sure what that would look like in practice either.
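
    As a concrete (if trivial) illustration of that example -- the names here are invented, not anything from an actual codebase: the logical content is the same "sequence of (key, weight) records" either way, but the memory layout differs, and the calling code currently has to be written against one or the other.

        #include <vector>

        // Array-of-Structures: one contiguous record per element.
        struct RecordAoS
        {
            int   key;
            float weight;
        };
        using TableAoS = std::vector<RecordAoS>;

        // Structure-of-Arrays: one contiguous array per member.
        struct TableSoA
        {
            std::vector<int>   keys;
            std::vector<float> weights;
        };

        // The logical operation is identical; only the access pattern changes.
        float sumWeightsAoS(const TableAoS& t)
        {
            float s = 0.f;
            for (const RecordAoS& r : t) s += r.weight;  // strided: keys are pulled in too
            return s;
        }

        float sumWeightsSoA(const TableSoA& t)
        {
            float s = 0.f;
            for (float w : t.weights) s += w;            // dense: only the weights stream
            return s;
        }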


  • Discourse touched me in a no-no place

    @topspin said in WTF Bites:

    As far as I know, their C++20 support for modules is ahead of clang/gcc though. Or maybe it was coroutines, or both.

    Probably coroutines. MS have been working on that for a while, including contributing the bits of clang/LLVM (key intrinsics and optimizers) to support them.



  • @dkf Kinda both, but more modules than co-routines. See here. VS supports both fully, GCC claims full support for co-routines and partial for modules, and Clang has partial for both.



  • @cvi said in WTF Bites:

    I think one of the main problems is ensuring that the data is in the right place, in the right form, at the right time. To give a bit of context, I'm mainly working on design of efficient algorithms/data structures. The motivation is either handling moderate amounts of data really fast (so a few ~100k to 100M elements in ~millisecond), or large-ish amounts of data not terribly slow (I'm mainly sticking to single machines and avoid clusters when possible, so this typically involves dealing with compressed data all the way -- we can manipulate datasets with >=2^48 "elements" in tens of milliseconds on a single machine). Either way, nailing down the layout of the data is half of the work.

    Rust gives you all the same control as C++ in this regard. By default it allows the compiler to reorder struct members, but you can always tell it to follow the C rules with an attribute if you find that's pessimising things. And I find the lifetimes and Cow allow removing copies more aggressively while staying confident that it won't crash in some odd case (Cow is basically a pointer with a flag whether you have to free it, so you can easily pass a copy if you had to modify it and a reference to the original if you didn't).

    @cvi said in WTF Bites:

    Fair part of this takes place on GPUs, so C++'s thread synchronization doesn't really matter too much. "Massively parallel" (in the GPU sense) stuff is largely about avoiding traditional synchronization in the first place, because you're more or less guaranteed to have bad contention if you start locking. (Not to mention that most synchronization primitives are quite expensive -- for example, a std::mutex is 40+ bytes, you're not going to embed those into a tree with a few million nodes).

    The main benefit of Rust is that it checks that you are not writing into the shared parts. It does restrict the code structure somewhat, but usually the things it prevents you from writing are the least readable, and there are far fewer hard-to-track issues left for the testing phase. The default synchronization is quite a bit better optimized too.

    Someone already mentioned a GPU target too (so you can have the kernels in Rust as well), but I am not working with that, so I don't know how far along it is.

    @cvi said in WTF Bites:

    For more coarse-grain parallelism, C++'s stuff is ... adequate. There are 3rd party libraries that make it more tolerable (and some of them are quite good). I don't usually need to deal with this as much, so YMMV.

    I learned to be rather careful about races a long time ago; now I often encounter coders who mess them up, and then it takes a lot of effort to refactor the code into something that is both readable and, with reasonable confidence, correct.

    @cvi said in WTF Bites:

    How's Rust with stuff like figuring out core layout (hyper threading siblings, or big-little stuff) and binding threads to specific cores based on that? One of the places where I couldn't find any good C++ options, and parsing cpuid on x86 sucks.

    There are some libraries for cpuid, but I don't know how far into the layout they get as I never needed that.


  • Considered Harmful

    @Bulb said in WTF Bites:

    Cow is basically a pointer with a flag whether you have to free it,

    Neat, it does the thing that people suck at and sneaks in Copy-On-Write at the same time. Clever.

    504.R summary: moo! Moooo.


  • Discourse touched me in a no-no place

    @Bulb said in WTF Bites:

    Cow is basically a pointer with a flag

    c5af5db9-f242-44f1-acf0-b6025fa23830-image.png


  • BINNED

    @cvi said in WTF Bites:

    we can manipulate datasets with >=2^48 "elements" in tens of milliseconds on a single machine

    😲

    Assuming, for lack of anything smaller, 1 bit per "element", that's 32 TiB in "tens of milliseconds."

    722c4e72-b375-47ec-944b-6cf6a1e93017-grafik.png


  • Considered Harmful

    @topspin said in WTF Bites:

    @cvi said in WTF Bites:

    we can manipulate datasets with >=2^48 "elements" in tens of milliseconds on a single machine

    😲

    Assuming, for lack of anything smaller, 1 bit per "element", that's 32 TiB in "tens of milliseconds."

    722c4e72-b375-47ec-944b-6cf6a1e93017-grafik.png

    Compression and parallelism are featured. So, all in all, apparently.



  • @topspin The data is sparse and is stored in a heavily compressed form, so despite there being that many elements, we can fit the whole data structure into a few GB of memory. The key property is that we can access the data structure in its compressed form, so we never need to decompress it. (Yes, you can come up with data sets that would cause the method to keel over, but that's true for all types of compression to some degree).

    For now, the operations that we've been interested in only affect a small subset of the elements (a few K up to maybe a G or so). Conceivably you could do global modifications too (e.g., matching patterns), but we've not really had any reason to look at that yet.
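
    As a toy illustration of the general principle (this is not the data structure in question, just a run-length-encoded sequence): individual elements can be read directly out of the compressed representation, without ever materialising the decompressed data.

        #include <algorithm>
        #include <cstdint>
        #include <vector>

        struct RleSequence
        {
            std::vector<std::uint64_t> runEnd;  // exclusive end index of each run (cumulative lengths)
            std::vector<int>           value;   // the value stored in each run

            // Read element i without decompressing; assumes i is within range.
            int at(std::uint64_t i) const
            {
                // The run containing i is the first one whose end is past i.
                auto it = std::upper_bound(runEnd.begin(), runEnd.end(), i);
                return value[static_cast<std::size_t>(it - runEnd.begin())];
            }
        };

        // Example: 4x 7 followed by 3x 42 -- s.at(5) returns 42.
        // RleSequence s{ { 4, 7 }, { 7, 42 } };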

