WTF Bites


  • Java Dev

    @Bulb said in WTF Bites:

    @Zerosquare said in WTF Bites:

    The copy of the 2nd edition K&R I have is a French translation, but it says this about the bit-shifting operators:

    The result is undefined if the right-hand operand is negative, or is greater than or equal to the number of bits in the left-hand operand's type.

    Looks like they tightened that for unsigned integers in C99 and above?

    I doubt it. The behaviour of the left-shift instruction in the CPU didn't change because of C99, and C never specified things that didn't actually behave consistently on common CPUs. And I've checked the actual behaviour: only the low 5 bits seem to be taken into account, so x << 32 behaves as x << 0. (I was writing some bit packing and it would have been easier if x << 32 consistently returned 0, but it does not, so I had to add some special cases.)

    64-bit Intel has a shift instruction which uses 5 bits of the shift register when operating on a 32-bit value register, but I believe there are other 64-bit architectures which use 6 bits of the shift register even when operating on a 32-bit (or smaller) value register.
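    The special-casing Bulb describes can be sketched like this (hypothetical helper, not his actual code; the point is that shifting a 32-bit value by 32 or more is UB in C and C++, so the count has to be guarded explicitly):

    ```cpp
    #include <cassert>
    #include <cstdint>

    // Hypothetical helper: a left shift that returns 0 for full-width
    // shift counts instead of invoking undefined behaviour.
    std::uint32_t shl32(std::uint32_t x, unsigned n)
    {
        return (n >= 32) ? 0 : (x << n);  // guard before shifting
    }

    int main()
    {
        assert(shl32(1u, 0)  == 1u);
        assert(shl32(1u, 31) == 0x80000000u);
        assert(shl32(1u, 32) == 0u);  // plain 1u << 32 would be UB
    }
    ```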


  • Considered Harmful

    @dkf said in WTF Bites:

    @topspin said in WTF Bites:

    They want me to type that on a phone?

    That's what you have a USB keyboard in your other pocket for, amirite?

    Actually he was just happy to see you.



  • @PleegWat I was not saying it was consistent across architectures. It isn't. That's part of the reason C and C++ give up and declare it undefined behaviour.



  • @Bulb said in WTF Bites:

    x << 32 behaves as x << 0

    Not really. x << 32 is UB, and some compilers will do really surprising stuff when they encounter UB. I distinctly remember being bitten by that once, with a template containing something like 1ULL << NBits where for some template arguments NBits could be 64, resulting in the compiler just going "LOL, UB!" and skipping over a whole bunch of code because "can't ever happen". The effect definitely wasn't the same as 1ULL << 0 would have had.
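    One way to make that full-width case well defined is to branch on the template argument at compile time (the name `low_mask` and the `NBits == 64` branch are illustrative, not the original code):

    ```cpp
    #include <cassert>
    #include <cstdint>

    // Illustrative sketch (C++17): a mask of the low NBits bits, with the
    // NBits == 64 case handled explicitly because 1ULL << 64 is UB.
    template <unsigned NBits>
    constexpr std::uint64_t low_mask()
    {
        static_assert(NBits <= 64, "mask wider than the type");
        if constexpr (NBits == 64)
            return ~0ULL;                 // all bits set, no UB shift
        else
            return (1ULL << NBits) - 1;
    }

    int main()
    {
        assert(low_mask<0>()  == 0);
        assert(low_mask<8>()  == 0xFF);
        assert(low_mask<64>() == ~0ULL);
    }
    ```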



  • @ixvedeusi Yes, I know. GCC is mean: when it notices a bug in your code, it invents consequences that make absolutely no sense, just to make fun of you ;-).

    However, I was describing the behaviour of the underlying x86 instruction, which you see when the compiler compiles it straight because the code doesn't offer many shortcuts to take. And that behaviour is consistent, but surprising and useless.

    Of course, not offering many shortcuts means the right-hand argument must be a variable. With a constant I would expect GCC to simply ignore a whole bunch of code around it, because by definition nothing depends on it, despite all indications to the contrary.
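    That consistent-but-surprising hardware behaviour can be emulated in well-defined code (hypothetical function name; a 32-bit x86 shift masks the count to its low 5 bits):

    ```cpp
    #include <cassert>
    #include <cstdint>

    // Emulates what a 32-bit x86 SHL actually does: the count is masked
    // to its low 5 bits, so a shift by 32 acts like a shift by 0.
    std::uint32_t shl_x86(std::uint32_t x, unsigned n)
    {
        return x << (n & 31);  // n & 31 is always < 32, so no UB
    }

    int main()
    {
        assert(shl_x86(5, 1)  == 10);
        assert(shl_x86(5, 32) == 5);   // not 0: the count wraps to 0
        assert(shl_x86(5, 33) == 10);  // count wraps to 1
    }
    ```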



  • Today's WTF Bite: Email from Google warning me that my AdWords impressions are suddenly plummeting and I should pay them to have an "expert" look at my ad settings and make corrections to bring them back up.

    No, thanks. My ad impressions dropped because I turned my ads off, because AdWords ignores keywords, budgets, and location settings, and racks up $80/week bills by showing my sci-fi novel ad to people looking for English test answer keys or incest porn in a country I specifically blacklisted. And I'm definitely not paying an "expert" to click the slider to re-enable them.



  • @mott555 said in WTF Bites:

    showing my sci fi novel ad to people looking for English test answer keys or incest porn

    So Google shows you exactly what search phrases trigger your ads. When I started looking closely at the reports, my first reaction was this:

    My second reaction was "Why the F:wtf::wtf::wtf: is that triggering my ad when there is literally zero intersection between the set of words in those search phrases and the set of keywords I configured?"


  • I survived the hour long Uno hand

    @mott555 said in WTF Bites:

    Why the F:wtf::wtf::wtf: is that triggering my ad when there is literally zero intersection between the set of words in those search phrases and the set of keywords I configured?

    Because your ad campaign wasn't earning Google money at the rate that Bob in Accounting decided that it should.



  • @izzion Well thanks to Bob in Accounting, my ad campaign isn't earning Google any money now.


  • I survived the hour long Uno hand

    @mott555
    Hence Bob's referring your account to Larry the Ad Settings Expert and his Cluebat.



  • @hardwaregeek said in WTF Bites:

    Table service is far too fancy for any McDonald's I've ever eaten at.

    Optional table service at the one near my place. I usually just wait at the counter though.



  • @cartman82 said in WTF Bites:

    Gee, if only there were some way for the user to supply parameters to a program other than writing them into the source code.

    Worse, they're already using it for other parameters.



  • I recently bought one of these

    It's a simple adapter so you can use the mini-PCIe wifi cards from laptops in a desktop. Technically it works for any mini-PCIe device, but 99% of those are WiFi adapters, so they market it that way and they add the external antennas.

    Of course, the customers don't always understand what they're buying...
    [screenshot of a confused customer review]


  • BINNED

    @anonymous234 said in WTF Bites:

    I recently bought one of these

    a triple black dildo?


  • Considered Harmful

    @Luhmann for what animal?



  • @anonymous234 Why would you buy that instead of that but with the wifi chip actually on it? I mean I get that reviewer's confusion: why does this thing even exist?



  • @blakeyrat said in WTF Bites:

    @anonymous234 Why would you buy that instead of that but with the wifi chip actually on it? I mean I get that reviewer's confusion: why does this thing even exist?

    Why would you buy an extension cord without a toaster on the other end?



  • @ben_lubar I wouldn't.


  • Discourse touched me in a no-no place

    @Zerosquare said in WTF Bites:

    much of the industry has migrated to GCC (either as the standard or as an option)

    Unfortunately, it's pretty shit at really getting optimisations sorted. It's not too critical on desktop systems, where pipelining and so on mean that instructions really don't all take the same amount of time at all even before you start to think about the accessing of memory, but it's a total nightmare on embedded platforms where every last instruction can count when in that key inner loop in a fast interrupt service routine… One ends up having to write far more direct machine code than you might hope, in order to get that last order of magnitude of speed (yes, GCC is that bad).



  • @blakeyrat said in WTF Bites:

    @anonymous234 Why would you buy that instead of that but with the wifi chip actually on it? I mean I get that reviewer's confusion: why does this thing even exist?

    I've got a few of the mini cards lying about, I've wanted to buy an adapter like that one.
    I don't really know how I got the mini cards, probably from 'old' laptops.


  • Discourse touched me in a no-no place

    @PleegWat said in WTF Bites:

    In practice, the compiler can (and an optimising compiler will) assume any code path containing undefined behaviour is unreachable.

    That's only an assumption that is available to some compilers. Others define what the behaviour is in that situation.

    It varies by language. C (and C++ and Objective-C) compilers mark it as undefined, but languages where the underlying logical model of arithmetic is based on arbitrary precision integers will make different choices (and I wouldn't be at all surprised if Java and C# use a different approach again in order to make all non-division operations be total functions over the types of integers supported). Authors of optimising compiler cores must not make that assumption on behalf of the language front-ends; it is up to those front-ends to add in the explicit assertions that some situation is logically unreachable (that enables the optimisers to do dirty tricks).
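    For instance, a front-end can feed the optimiser exactly such an explicit unreachability assertion through a compiler intrinsic (GCC/Clang shown; the function itself is purely illustrative):

    ```cpp
    #include <cassert>

    // Illustrative: tell the optimiser (GCC/Clang builtin) that falling
    // off the end of the if-chain cannot happen, so it may assume this
    // point is never reached and optimise accordingly.
    int sign(int x)
    {
        if (x > 0) return 1;
        if (x < 0) return -1;
        if (x == 0) return 0;
        __builtin_unreachable();  // every int is covered above
    }

    int main()
    {
        assert(sign(42) == 1);
        assert(sign(-7) == -1);
        assert(sign(0)  == 0);
    }
    ```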


  • Discourse touched me in a no-no place

    @HardwareGeek said in WTF Bites:

    We still have thousands of warnings from compiling Verilog

    Everyone has warnings from compiling Verilog. Apparently, according to one of the hardware guys at work, you can't even do basic arithmetic like addition with it without getting warnings.


  • Discourse touched me in a no-no place

    @Luhmann said in WTF Bites:

    @anonymous234 said in WTF Bites:

    I recently bought one of these

    a triple black dildo?

    It looks a bit small for that. :takei:



  • @dkf It's not for him, it's for the mouse.



  • @ben_lubar said in WTF Bites:

    @blakeyrat said in WTF Bites:

    @anonymous234 Why would you buy that instead of that but with the wifi chip actually on it? I mean I get that reviewer's confusion: why does this thing even exist?

    Why would you buy an extension cord without a toaster on the other end?

    I admire your ability to come up with good metaphors.



  • @anonymous234 said in WTF Bites:

    @ben_lubar said in WTF Bites:

    @blakeyrat said in WTF Bites:

    @anonymous234 Why would you buy that instead of that but with the wifi chip actually on it? I mean I get that reviewer's confusion: why does this thing even exist?

    Why would you buy an extension cord without a toaster on the other end?

    I admire your ability to come up with good metaphors.

    I can also come up with very bad ones.



  • @ben_lubar Bad metaphors are like bad eggs. They give you a stomachache and smelly gas.


  • Java Dev

    @dkf said in WTF Bites:

    @Zerosquare said in WTF Bites:

    much of the industry has migrated to GCC (either as the standard or as an option)

    Unfortunately, it's pretty shit at really getting optimisations sorted. It's not too critical on desktop systems, where pipelining and so on mean that instructions really don't all take the same amount of time at all even before you start to think about the accessing of memory, but it's a total nightmare on embedded platforms where every last instruction can count when in that key inner loop in a fast interrupt service routine… One ends up having to write far more direct machine code than you might hope, in order to get that last order of magnitude of speed (yes, GCC is that bad).

    Apparently clang is better at it than gcc?

    It's sure better at tossing up warnings for my codebase. Mostly 'cast to type of different alignment', as I recall, in wacky serialized-data code that has no good business being untyped, but I don't have a good use case to improve upon it. And since clang isn't even in the Red Hat repos, trying to fix the build under it is not worth the bother.



  • @dkf said in WTF Bites:

    Unfortunately, it's pretty shit at really getting optimisations sorted. It's not too critical on desktop systems, where pipelining and so on mean that instructions really don't all take the same amount of time at all even before you start to think about the accessing of memory, but it's a total nightmare on embedded platforms where every last instruction can count when in that key inner loop in a fast interrupt service routine… One ends up having to write far more direct machine code than you might hope, in order to get that last order of magnitude of speed (yes, GCC is that bad).

    Which CPU/µP are you targeting?

    GCC is not the best compiler in the world, but I've not experienced it being that bad, especially compared to the nasty compilers that used to be the norm in embedded development. Like Microchip's C18 I mentioned above, which is so hopeless I had to rewrite a part of the code in assembly. Usually I wouldn't mind too much (I like assembly), but PIC assembly and the general architecture are painful.



  • [image]

    Why do people use Azure? It seems like the lame duck of cloud services.


  • Considered Harmful

    @cartman82 Easier lift and shift of existing WTFs. Well. At least that's the reason they start.



  • @dkf said in WTF Bites:

    @HardwareGeek said in WTF Bites:

    We still have thousands of warnings from compiling Verilog

    Everyone has warnings from compiling Verilog. Apparently, according to one of the hardware guys at work, you can't even do basic arithmetic like addition with it without getting warnings.

    Verilog does have some stupidly pedantic warnings, like ignoring the return value of a function with a side-effect, but it is possible to silence them with some effort (e.g., casting every single -$&(_#& call to void, or turning the warning off with a pragma if you really want it to never complain, including places where it might actually be wrong). And there are a lot of those in our code. However, there are also a lot of things like pins that aren't connected, which is almost certainly an error for input pins and may or may not be intended for outputs; it's a really good idea to tell both the compiler and other humans that you are intentionally not using it.



  • @cartman82 said in WTF Bites:

    Why do people use Azure? It seems like the lame duck of cloud services.

    We're actively migrating customers away.
    It's more expensive than dedicated hosting, and there aren't really any benefits. Everything can be controlled by PowerShell, true, but who has weeks of time to automate tasks I could do in 5 minutes on my own server?



  • @cartman82 said in WTF Bites:

    Why do people use Azure? It seems like the lame duck of cloud services.

    You apparently haven't been introduced to Google Cloud Platform!


  • Considered Harmful

    @blakeyrat Oracle, man. Oracle.



  • @blakeyrat said in WTF Bites:

    You apparently haven't been introduced to Google Cloud Platform!

    You come to defend a Microsoft product while bashing a Google product.

    How surprising 🍹



  • @blakeyrat said in WTF Bites:

    You apparently haven't been introduced to Google Cloud Platform!

    That's what the people in the article were migrating to :/



  • @TimeBandit Oh no, Azure is utter garbage. But then so is AWS.

    Google Cloud Platform is equally garbage, but has the added problem that they shut down people's accounts with no recourse when malfunctioning bots think they're spamming. And they have downtime every fucking month when Azure and AWS have gone years without downtime.



  • @blakeyrat said in WTF Bites:

    Why do people use Azure? It seems like the lame duck of cloud services.

    You apparently haven't been introduced to Google Cloud Platform!

    I actually tried using it one time. A tutorial of sorts suggested it as a "quick alternative" to setting up your own hosting. I gave up after 3-4 hours. My own hosting: done in 5 minutes.



  • @swayde said in WTF Bites:

    It's more expensive than dedicated hosting

    I have never done anything with Azure (nor other cloud hosting, actually), but lately I've always been told that they are rather cheap. So I am somewhat surprised (it probably depends on utilization too; clouds are interesting if you need large peaks, but your general utilization is low).

    @blakeyrat said in WTF Bites:

    You apparently haven't been introduced to Google Cloud Platform!

    And that is the home of this Kubernetes madness.

    While I didn't try to deploy it to either cloud, I did build a Kubernetes deployment recently. It is a shame that this appears to be the best thing out there when you need to orchestrate a bunch of containers.



  • @ben_lubar A metaphor is like a used car. Sometimes it doesn't work.



  • @Bulb said in WTF Bites:

    I have never done anything with Azure (nor other cloud hosting, actually), but lately I've always been told that they are rather cheap. So I am somewhat surprised (it probably depends on utilization too; clouds are interesting if you need large peaks, but your general utilization is low).

    My understanding is, all these cloud providers kill you on the bandwidth. So compute = go cloud, media storage = go dedicated.



  • @cartman82 Compute with high utilization = go dedicated too.

    And it makes a lot of sense. The cloud is only cheaper because resources are shared: any resources, so bandwidth, CPU, and memory. If you have high utilization, there is not much reason for the cloud to be cheaper, and indeed it isn't.


  • Notification Spam Recipient

    @cartman82 said in WTF Bites:

    [image]

    Why do people use Azure? It seems like the lame duck of cloud services.

    Because it's one of the only ones I know of that can take an arbitrary VHD file and boot it.

    And that's pretty much the way it has to be until I can dedicate resources to making UE4 play with MS SQL Server on Linux properly.


  • Considered Harmful

    @cartman82

    How in the hell did they even? Thought if you could get the VM to crash from a container you'd found a vulnerability.


  • Notification Spam Recipient

    @cartman82 TRWTF is immediately switching everything over without testing or (apparently) anything other than "a marketing email said so, let's do it".



  • @Bulb said in WTF Bites:

    I have never done anything with Azure (nor other cloud hosting, actually), but lately I've always been told that they are rather cheap. So I am somewhat surprised (it probably depends on utilization too; clouds are interesting if you need large peeks, but general utilization is low).

    Their pricing model is weird. You don't pay for DB storage, you pay for the number of DBs. So if an application has 10 DBs you pay 10x the DB fee. Conversely, one 200GB DB is cheaper than ten 2MB DBs.
    You don't only pay for VM performance, you also pay for the app service slots that your application goes in.
    So the singular VM we have on Azure costs approx 100 USD/month, which is just like regular (shared) hosting.
    The app services have an "app service plan" with the same performance as the VM. They all share this VM. This now costs 300 USD/month.
    Then each app service costs 10 USD/month.
    Each DB costs 10 USD/month.
    This means our customer pays like
    One pays for storage of TLS certs, and for access to them.
    It's a PITA to configure HTTPS certs for app services (even worse for VMs), and they have to be rotated manually (or you spend weeks automating it!).
    They call HTTPS/TLS certificates "SSL certificates" in their UI. SSL certs haven't been used in, what, 5 years now?
    Azure AD is shit; we keep getting locked out of accounts/ADs for no discernible reason. We now have people sharing accounts/passwords.
    On top of all of that, you also have to pay for storage, and it's ALSO a pain in the ass to configure correctly. And you pay to read/write that storage.
    Did I mention support is useless?

    An example:
    A customer who requires azure now pays approx 2k USD/month to host 10 teeny sites. On top of this comes HTTPS certs and administration.

    More magic:
    The VMs we use are 'old', so I'd like to upgrade them to the new, better v2 VMs.
    MS: That's easy. Just upgrade.
    :swayde: mmmkay. I can't seem to find the upgrade button.
    MS: You just have to redeploy or clone the app service plan.
    Me: How do I do that?
    MS: POWERSHELL!
    :headdesk:
    I refuse to fuck with redeployments on live sites. Didn't MS consider that most people don't want to spend that effort on upgrades?



  • @swayde IOW, dedicated hardware is cheaper 🤷



  • @swayde said in WTF Bites:

    On top of this comes HTTPS certs

    Guess how much money I've spent on TDWTF SSL certificates.

    Here's a hint: It's between $0.01 and $-0.01.

