WTF Bites



  • Some time ago I set up a team at bitbucket.org, adding myself as the sole member.

    Yesterday I decided this makes things more difficult for me so I deleted the team.

    Today I noticed I'm still a member, and the test project and repo I added are still on my dashboard.

    However, as the only way to manage anything is to first go to the relevant team/project/repo (which all 404) and click the buttons, there seems to be no way to fix this :(


  • Notification Spam Recipient

    0_1534645015272_c33103e2-21ad-4c8f-a8fd-cf9c679cb7e8-image.png

    "You don't have DirectX. Change your monitor settings to something worse, and try again!"


  • Considered Harmful

    https://i.imgur.com/sQkIgPt.png
    I managed to back-button this into showing me the 23 unread topics, which linked correctly.


  • Notification Spam Recipient

    @pie_flavor said in WTF Bites:

    back-button

    Aye, for the purposes of sanity I treat the forum as a SPA and pretend history buttons do not exist.


  • Considered Harmful

    @Tsaukpaetra That's part of being an SPA, the forward and back buttons are integrated.


  • Notification Spam Recipient

    @pie_flavor said in WTF Bites:

    @Tsaukpaetra That's part of being an SPA, the forward and back buttons are integrated.

    Integrated to what?


  • Considered Harmful

    @Tsaukpaetra The SPA.


  • 🚽 Regular

    @Tsaukpaetra said in WTF Bites:

    Aye, for the purposes of sanity I treat the forum as a SPA and pretend history buttons do not exist.

Back-buttoning back to the notifications page from a thread seems to work well enough for me.


  • Discourse touched me in a no-no place

    @Tsaukpaetra said in WTF Bites:

    for the purposes of sanity

    :rofl:



  • @Tsaukpaetra Is it even possible to change a monitor to 16-bit color anymore? I guess maybe NVidia's control panel might have a super-buried setting for it...



  • @blakeyrat said in WTF Bites:

    @Tsaukpaetra Is it even possible to change a monitor to 16-bit color anymore? I guess maybe NVidia's control panel might have a super-buried setting for it...

    Not on mine, according to the adapter settings...
    0_1534693844304_09679011-65e8-4920-9e98-5a1ab6a590bc-image.png
    (the scrolled off ones are all lower resolutions - all at 32bit - I have a nVidia GTX 1060)


  • Considered Harmful

    This post is deleted!

  • Notification Spam Recipient

    0_1534740335471_7fc7ec87-03ee-4bb8-9737-6311c68505b9-image.png

    What's "broken" about my account?

    0_1534740380164_08d5d787-5edf-421c-a6f5-a0409e013c77-image.png

    Nothing, apparently.


  • Notification Spam Recipient

    @dcon said in WTF Bites:

    @blakeyrat said in WTF Bites:

    @Tsaukpaetra Is it even possible to change a monitor to 16-bit color anymore? I guess maybe NVidia's control panel might have a super-buried setting for it...

    Not on mine, according to the adapter settings...
    0_1534693844304_09679011-65e8-4920-9e98-5a1ab6a590bc-image.png
    (the scrolled off ones are all lower resolutions - all at 32bit - I have a nVidia GTX 1060)

    Echoed. nVidia GeForce 735M (passthrough by the Intel integrated). Except, Windows' Settings App claims something different:

    0_1534745710251_76bf1a6f-925d-4aba-9149-682ed3ed96e1-image.png

    Edit: And someone actually left feedback for this. How nice:

    feedback-hub:?contextid=109&feedbackid=de569b4a-8622-4edf-97bb-a07533848f95&form=1&src=1

    Apparently, feedback-hub makes not-short-links that don't appear to be URLs. Oh well.

    Edit: And in case you're not on Windows, have a lovely screenshot!

    0_1534746176573_907b1b24-8fb0-4d16-865c-68ba54a0243a-image.png


  • BINNED

    @Tsaukpaetra You need to take it to the vet to get it fixed.


  • kills Dumbledore

    Representative lines from the settings file for Tabs Studio

    <ArrayOfArrayOfArrayOfString xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
      <ArrayOfArrayOfString>
        <ArrayOfString>
          <string>c:\path\to\the\source\someFile.vb</string>
    

    0_1534752182009_aeacb87e-1bc2-440a-b5c2-c44474153967-image.png


  • Notification Spam Recipient

    @topspin said in WTF Bites:

    @Tsaukpaetra You need to take it to the vet to get it fixed.

    But what if I want it to stay fuckable and keep the desire to fuck around?



  • @Jaloopa said in WTF Bites:

    Tabs Studio

So you spent $49 on an extension that lets you manage the insane number of Visual Studio tabs that you can't keep track of in your mind anyway?


  • Banned

    @Tsaukpaetra said in WTF Bites:

    0_1534645015272_c33103e2-21ad-4c8f-a8fd-cf9c679cb7e8-image.png

    "You don't have DirectX. Change your monitor settings to something worse, and try again!"

    This makes more sense than you think. I mean, it's still weird given the setup runs at all, but at least there's a perfectly logical explanation why the proposed solution might work.


  • :belt_onion:

    @Gąska said in WTF Bites:

    at least there's a perfectly logical explanation why the proposed solution might work.

    So... you want to tell us what that is, or...?


  • Banned

@heterodox I'm not old enough to know the exact details, and googling such ancient stuff up is very difficult, but back in the past, desktop bit depth determined the bit depth of screen buffers available to applications, and apps had to either support all possible options (greyscale, 256 colors, 16-bit, 32-bit) or couldn't be run in some of those modes. The good news is, if you stuck with system APIs only, all that was provided for free, so there's a chance you might've never noticed this; but whenever you had to mess with drawing manually, you had to take bit depth into account. Seems like in this particular case, the installer runs some secondary utility that can only be run in 16-bit color, but has a fallback option when the desktop is set to 32 bits - and this fallback requires DirectX 8 to work. The "or better" is there because DirectX 8.1 most likely already existed back then and worked with this program too, and they didn't expect (or didn't care) that 15 years down the line, Microsoft would break backward compatibility.


  • :belt_onion:

    @Gąska Explanation makes sense; thanks. 👍🏼


  • I survived the hour long Uno hand

    @Tsaukpaetra
Basically that message means it couldn't authenticate when it was trying to connect to the Shared Experiences server. I get that every start-up, since my account is MFA enabled (I'd only have to fix it every 2 weeks, but I usually don't bother fixing it at all, since I don't use multiple computers with that account). If you had a network connectivity issue on start-up, that will also cause the pop-up.



  • @Tsaukpaetra One of my huge pet peeves is software that gives really nasty messages JUST BECAUSE THE USER CHANGED THEIR PASSWORD.

    If I change my password, please spare me the giant red "ACCESS DENIED" bullshit, buddy. I have access, I just have to type in a new password, relax a bit, ok? Christ.



  • @Gąska said in WTF Bites:

    @heterodox I'm not old enough to know exact details, and googling such ancient stuff up is very difficult, but back in the past, desktop bit depth determined the bit depth of screen buffers available to applications,

    That's not true, though.

    Unless you're defining "screen buffer" in some extremely specific way, like "screen buffer created via this particular API function" or "screen buffers that can blit back to the screen at a higher rate of speed because the OS doesn't have to perform color dithering".

But you can use any arbitrary block of memory as a screen buffer, and the OS doesn't have anything to say about that normally. (Of course now you're better off putting the buffer in GPU memory, but you're talking about back in the days when computers would frequently be running in 256-color mode, so.)


  • Banned

    @blakeyrat said in WTF Bites:

    @Gąska said in WTF Bites:

    @heterodox I'm not old enough to know exact details, and googling such ancient stuff up is very difficult, but back in the past, desktop bit depth determined the bit depth of screen buffers available to applications,

    That's not true, though.

    Cool, learning time!

    Unless you're defining "screen buffer" in some extremely specific way, like "screen buffer created via this particular API function" or "screen buffers that can blit back to the screen at a higher rate of speed because the OS doesn't have to perform color dithering".

    But you can use any arbitrary block of memory as a screen buffer, and the OS doesn't have anything to say about that normally. (Of course now you're better-off putting the buffer in GPU memory, but you're talking about back in the days when computers would frequently be running in 256-color mode, so.)

...Oh. I thought you were about to say that my explanation was completely wrong and the reason why desktop bit depth matters is totally different. But instead you latched onto a meaningless technicality about word usage, because I said "screen buffer" when I actually meant "screen buffer used by the OS to draw the window". Bummer.



  • @pie_flavor said in WTF Bites:

    0_1534387131124_Screenshot_20180814-072505_Device maintenance.jpg
    fascinating

    Pssh, yeah, right! If only it actually sent SMS messages. If one of its messages gets sent to a cell phone that has only SMS (and doesn't support MMS, but sometimes even if it does), the text appears to be some scrambled combination of wrong encoding and binary data. Almost every other character is @, though, so I don't know what the encoding might be. If it's binary data, it might be a location reference into the emoji library that doesn't exist on the receiving phone...? Or perhaps a reference to the prior conversation? Whatever it is, it's all wrong, and texting the sender back to say "Hey, your message came through as a bunch of random garbage" doesn't fit my idea of sociable interaction. And calling them defeats the purpose of text messages.

    (On the other hand, the receiving phone is a Samsung, so... 🤷🏻♂ )


  • Java Dev

@djls45 Standard SMS messages use a 7-bit character set called GSM 03.38, which is stored packed (8 characters per 7 octets). In this character set the @ symbol has codepoint zero.
The other common character set for SMS messages is UCS-2. If the message text is in a Latin language, it's not inconceivable that Apple is sending UTF-16 or UCS-2 data but specifying GSM 03.38 in the headers. Or, alternatively, your Samsung device does not correctly support SMS character sets.
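A quick sketch of that theory (an illustration only, not verified against a real handset; the GSM table below is deliberately partial - just letters, digits, space, and codepoint 0): ASCII text encoded as UTF-16LE but decoded octet-by-octet as GSM 03.38 turns every NUL padding byte into '@', which matches the "almost every other character is @" symptom:

```python
# Sketch: why UTF-16 text misread as GSM 03.38 shows '@' at every other
# position. In GSM 03.38, codepoint 0 is '@'; A-Z, a-z, 0-9 and space
# share their ASCII values. (Partial table - only enough for this demo.)

def gsm_decode_byte(b: int) -> str:
    if b == 0x00:
        return '@'                      # GSM 03.38 codepoint zero
    if chr(b).isalnum() or b == 0x20:   # letters/digits/space match ASCII
        return chr(b)
    return '?'                          # everything else: not modeled here

def misread_as_gsm(text: str) -> str:
    # A phone told "GSM 03.38" in the headers but handed UTF-16LE bytes
    # would decode each octet as its own character (unpacked form).
    return ''.join(gsm_decode_byte(b) for b in text.encode('utf-16-le'))

print(misread_as_gsm("Hi there"))  # H@i@ @t@h@e@r@e@
```

A real handset decoding the packed 7-bit form would scramble things further; this unpacked view just shows where the alternating '@'s come from.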


  • ♿ (Parody)

    I remember someone pointing out recently that when Windows "sleeps" or whatever, shit keeps running, like when you close the lid on a laptop. Which explains why if I leave mine like that for a few days the battery is always drained. :wtf:


  • Considered Harmful

    @boomzilla or it might not. You can usually set the lid close response to either "sleep" or "hibernate". Now, Windows et. al. will fuck up hibernate sometimes, but, it sounds like you wanted hibernate.



  • @PleegWat said in WTF Bites:

@djls45 Standard SMS messages use a 7-bit character set called GSM 03.38, which is stored packed (8 characters per 7 octets). In this character set the @ symbol has codepoint zero.
The other common character set for SMS messages is UCS-2. If the message text is in a Latin language,

    USA-ian English.

it's not inconceivable that Apple is sending UTF-16 or UCS-2 data but specifying GSM 03.38 in the headers.

    So it's possible that iOS and Android (or their messaging apps at least) are both :doing_it_wrong: (i.e. "No one would ever be using that setup!")

    Is there an easy way to reverse the incorrect encoding to get the original message?

    Or, alternatively, your Samsung device does not correctly support new SMS character sets.

    Perhaps so. The phone is several years old.


  • ♿ (Parody)

    @Gribnit said in WTF Bites:

    @boomzilla or it might not. You can usually set the lid close response to either "sleep" or "hibernate". Now, Windows et. al. will fuck up hibernate sometimes, but, it sounds like you wanted hibernate.

    Yeah, probably. It's probably locked down by group policy, though. Everything else on it seems to be.


  • Java Dev

    @djls45 said in WTF Bites:

    @PleegWat said in WTF Bites:

    Or, alternatively, your Samsung device does not correctly support new SMS character sets.

    Perhaps so. The phone is several years old.

    UCS-2 SMS is pretty old (the wiki link says UCS-2 not UTF-16, showing age). However it's uncommon in the west because it means you get 70 characters instead of 160, and apparently SMS never really caught on in the east.

    We have code that can generate SMS messages at work and about 8-9 years ago I worked on unicode support for that. I tested it and hence know my own phone at the time could work with them (probably a motorola, possibly a sony). But I don't have the code to hand, it takes some setting up to send SMS messages, and I've only got 1 SIM card available to me.
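For reference, the 160/70 limits above fall straight out of the fixed 140-octet SMS payload (a back-of-envelope sketch that ignores concatenation headers and GSM escape sequences):

```python
# Sketch: where the 160 / 70 character limits come from. A single SMS
# payload is 140 octets; GSM 03.38 packs 7-bit characters, while UCS-2
# spends 2 octets per character.

SMS_PAYLOAD_OCTETS = 140

def max_chars(encoding: str) -> int:
    if encoding == "gsm":
        return SMS_PAYLOAD_OCTETS * 8 // 7   # 7 bits per char, packed
    if encoding == "ucs2":
        return SMS_PAYLOAD_OCTETS // 2       # 16 bits per char
    raise ValueError(encoding)

print(max_chars("gsm"))   # 160
print(max_chars("ucs2"))  # 70
```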



  • :wtf: SQL Server Management Studio

    I tried to do something with a server that I could no longer connect to (VPN related stuff). It hung for a while and I realized "Oh, right, VPN" and tried to stop the action. Nothing happened, I switched to some other window while waiting for it to time out. But now, the SSMS window is unresponsive and just dings no matter what I try to click on. I'm guessing it popped up a modal dialog, informing me of the fucked-ness of my connection, but that modal is somewhere where I can't see it, as it has no taskbar icon, alt-tabbable window or anything else I can use to find it and hit ok.


  • Notification Spam Recipient

    @Gribnit said in WTF Bites:

    @boomzilla or it might not. You can usually set the lid close response to either "sleep" or "hibernate". Now, Windows et. al. will fuck up hibernate sometimes, but, it sounds like you wanted hibernate.

⬆ this. Microsoft, on their mobile-is-everything train, said that if a device thinks it can save as much power by turning off most everything (like the display and storage), it should do that instead of going to S5. Except "everything" has this qualification by default it seems, so sleep isn't sleep anymore.



  • @blakeyrat said in WTF Bites:

    @Tsaukpaetra Is it even possible to change a monitor to 16-bit color anymore? I guess maybe NVidia's control panel might have a super-buried setting for it...

    Are you talking about the monitor, or the driver? Because the monitor doesn't really care how many colors you're sending it... when the video signal goes across the wire, it doesn't really matter. The number of colors is more of a memory limitation on the sending side than it is anything that the receiving side has to care about.



  • @blakeyrat said in WTF Bites:

    @Tsaukpaetra One of my huge pet peeves is software that gives really nasty messages JUST BECAUSE THE USER CHANGED THEIR PASSWORD.

    If I change my password, please spare me the giant red "ACCESS DENIED" bullshit, buddy. I have access, I just have to type in a new password, relax a bit, ok? Christ.

    Some of them are actually quite nice and go "Incorrect password (your password was changed 11 days ago)"...



  • @boomzilla said in WTF Bites:

    I remember someone pointing out recently that when Windows "sleeps" or whatever, shit keeps running, like when you close the lid on a laptop. Which explains why if I leave mine like that for a few days the battery is always drained. :wtf:

Well, yeah. It's keeping the information in RAM hot so it can resume quickly when you wake it up. It's supposed to automatically hibernate, too, so if the battery reaches a critically low level or you lose power while it's sleeping, it'll still be able to resume (just from disk, since it let the RAM go cold). The really fun part is that if you disable "hybrid sleep" on Windows, so that you actually can tell your machine to hibernate, then it completely disables automatic hibernation during sleep, so if you let your PC go to sleep and the battery gets drained (or you lose power) you lose your work.

    At least, that's what it seemed to be saying when I read about hybrid sleep when trying to figure out why my PC wouldn't let me tell it to hibernate.


  • Notification Spam Recipient

    @anotherusername said in WTF Bites:

    The really fun part is that if you disable "hybrid sleep" on Windows, so that you actually can tell your machine to hibernate, then it disables completely the automatic hibernation during sleep, so if you let your PC go to sleep and the battery gets drained (or you lose power) you lose your work.

You know, that one made me scratch my head as well. Like, WTF, it's not obvious what it's doing...


  • kills Dumbledore

    @Bulb said in WTF Bites:

    @Jaloopa said in WTF Bites:

    Tabs Studio

So you spent $49 on an extension that lets you manage the insane number of Visual Studio tabs that you can't keep track of in your mind anyway?

    When you work on a project designed by architecture astronauts you tend to need a lot of tabs open to track all but the most trivial operations


  • Discourse touched me in a no-no place

    @Jaloopa said in WTF Bites:

    When you work on a project designed by architecture astronauts you tend to need a lot of tabs open to track all but the most trivial operations

    👍 :(



  • My new Alcatel phone has an incredibly sensitive touchscreen. A slight contact with the earphone cable (or pretty much anything) will always trigger a tap.

    So I did a test by putting some kitchen paper in front of it. Turns out it can perfectly detect my fingers through 4 layers of (double-ply) kitchen paper. And if I press it a bit harder against the screen, it goes up to 8.

    Edit: I tried with normal (notebook) paper. It's usable with up to 14 sheets (around 1mm) on top, but will still detect taps with about 4 times as much. Yes, it detects taps through about 0.5cm of paper.


  • Considered Harmful

    @anonymous234 My phone is even better; it will register a tap without anything touching the screen at all! Usually in the region of a "confirm" button.

    I need a new phone.



  • GMail just decided that an English-language email from a coworker was actually written in Polish, and auto-translated it to English for me. The original email and the translated email are identical so it didn't screw anything up for me, but :wtf:



  • @mott555 Plot twist! Coworker's last name happens to be of Polish origin! So obviously to Google that means he only types in Polish!

    :facepalm:


  • Considered Harmful

    @mott555 That's :arrows:!


  • Considered Harmful

    0_1534916623923_Screenshot_20180821-220807_Google.jpg



  • @boomzilla But that's the thing, there is absolutely no chance that would have worked.


  • BINNED

    qint64 QIODevice::readLine(char *data, qint64 maxSize)

This function reads a line of ASCII characters from the device, up to a maximum of maxSize - 1 bytes, stores the characters in data, and returns the number of bytes read. If a line could not be read but no error occurred, this function returns 0. If an error occurs, this function returns the length of what could be read, or -1 if nothing was read.

    This was likely a cause of a bug I was hunting for. So, let me get this straight:

    • if it reads a line without an error, it returns the number of bytes read
    • if it reads a line, but there's an error, it still returns the number of bytes read
    • if it can't read a line, but there was no error, it returns 0
• if it can't read a line, but there is an error, it returns -1

    Whose demented brainchild is this? What in the holy fuck?

    Anyway, changed while(socket->readLine(buf, MAX_LINE_SIZE)) to while(socket->readLine(buf, MAX_LINE_SIZE) > 0). Now, we hope. Because fuck if I can figure out why it fucking fails in this particular case (on an install that's been there for like 2 years and suddenly decided it's now the time to become a dick).
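The mechanism, sketched for brevity (the simulated return values below are made up; only the return contract is taken from the quoted docs): in C++, while (readLine(...)) treats -1 as true, so only the "0 = no line, no error" case ever stops the original loop, whereas comparing > 0 also stops on error:

```python
# Sketch of the quoted readLine() contract: >0 = bytes read, 0 = no
# line but no error, -1 = error with nothing read. In C++,
# `while (readLine(...))` only stops on 0, because -1 is truthy - so an
# error can spin the loop forever. The fix compares > 0, stopping on
# both 0 and -1.

def lines_handled(returns):
    # `returns` simulates successive readLine() results (hypothetical).
    handled = 0
    for n in returns:
        if not (n > 0):     # the fixed condition: stop on 0 *and* -1
            break
        handled += 1
    return handled

print(lines_handled([10, 12, 0]))    # 2: normal end of input
print(lines_handled([10, -1, 10]))   # 1: an error now stops the loop
```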



@cartman82 I can confirm that I substantially outdo Google. I run over a trillion containers per day on a few servers as such: while true docker run hello-world * cores * 10. I was talking to my friend at Google though about this and he talked about this even more popular technology called "processes". Apparently Google runs these ten thousand times more than they run docker containers! What's more is that apparently there's this even more popular tech called threads. Apparently Google runs yet fifty times more of those than they do processes! Even more impressive are these things called "subroutines". Apparently Google runs a thousand times more of these than threads! If that's not impressive however, I've heard of the ultimate new technology called "instructions". Apparently Google runs a thousand times more of these than even subroutines! Obviously docker is very low down in the food chain and people ought simply be using instructions instead.

