WTF Bites



  • @cvi in current versions of the Windows 10 first boot process, you have to click Yes to set a PIN, then click Cancel, then click "Yes, I meant to cancel", and then it finally gives you the option to skip setting a PIN. No idea why they want people to set a PIN so badly.


  • Notification Spam Recipient

    @lb_ said in WTF Bites:

    first boot process

    Aka OOBE?



  • @lb_ said in WTF Bites:

    @anotherusername said in WTF Bites:

    @lb_ Can you provide an example?

    The post you quoted was the example. Because of the warning you now have to write something ? true : false or !!something instead of the clearer static_cast<bool>(something) or bool b = something;.

    The question was why you need a bool. All you're doing is adding some syntactic sugar to an int and some logic, probably completely unnecessary, to make sure it's only allowed to be 0 or 1.
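    For reference, the spellings under discussion all behave identically at runtime; a minimal sketch (whether a given compiler warns on each, e.g. MSVC's C4800, varies by version, so the comments are indicative only):

    ```cpp
    #include <cassert>

    int main() {
        int something = 42;

        // All of these normalize a nonzero int to exactly true/1:
        bool a = something ? true : false;      // ternary spelling
        bool b = !!something;                   // double-negation idiom
        bool c = static_cast<bool>(something);  // explicit cast
        bool d = something;                     // implicit conversion (what the warning targets)

        assert(a && b && c && d);
        assert(static_cast<int>(c) == 1);       // bool converts back to exactly 1
        return 0;
    }
    ```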


  • area_can

    @tsaukpaetra oobey indeed.


  • area_can

    Doing it right: using a forum as a forum
    Doing it wrong: using a forum as a bug tracker
    Doing it :wtf:: using a bug tracker as a forum



  • WTF of my day: So, recently I had the problem on my Surface Book that Simplify3D (a slicer program for 3D printers) crashed on startup. A thorough examination revealed that for some reason OpenGL did not start up properly because, when forced into software rendering, the program did start.
    Further examination revealed that this was not S3D's fault because anything relying on OpenGL (e.g. everything using the Qt framework) would have similar issues. It also wasn't a graphics driver issue because my Surface Book has a discrete nVidia GPU in addition to the Intel one - and programs crashed using either one, with the same symptoms.

    A factory reset did help after everything else failed.

    Fast forward to today when I noticed that my SMART Notebook software complained that DirectX10 was unavailable. Waitaminute.

    Yeah. Everything related to hardware acceleration crashes again. Dammit.

    Anyone else here run into something like this before?


  • Notification Spam Recipient

    @rhywden said in WTF Bites:

    Anyone else here run into something like this before?

    Sounds typical of Windows Update "updating" drivers with a more minimal version for raisins. But that shouldn't typically delete DirectX...



  • @tsaukpaetra said in WTF Bites:

    @rhywden said in WTF Bites:

    Anyone else here run into something like this before?

    Sounds typical of Windows Update "updating" drivers with a more minimal version for raisins. But that shouldn't typically delete DirectX...

    Naw, I checked. The drivers were not updated. Also, something like a driver update shouldn't take out both GPUs.



  • @lb_ Can you still use the other parts of Windows Hello (i.e. biometric stuff) when you do that? You can delete the PIN later on (there's an option in the settings), but apparently that will also disable all of the other Windows Hello methods (i.e., any biometric things).

    Meh. Maybe I will go ahead anyway. Typing the password is not that much of a pain.



  • @rhywden said in WTF Bites:

    Anyone else here run into something like this before?

    No, not quite, at least. I had a bunch of trouble getting the Intel GPU to show up when enumerating GPUs (even with the base disconnected); installing newer Intel drivers fixed that though.



  • @coldandtired said in WTF Bites:

    @anotherusername said in WTF Bites:

    @coldandtired said in WTF Bites:

    Windows put the clock forward two hours last night on the desktop but one on the phone.

    :wtf: Daylight Savings Time was two weeks ago.

    It's a big world out there.

    The rest of the world is :doing_it_wrong:

    Then again DST is :doing_it_wrong: in the first place.



  • @anotherusername said in WTF Bites:

    A bool is an int anyway, and any nonzero value is true.

    That's not true. bool is an integral type that can only be false, which equals 0, and true, which equals 1. Now, the specification does not prescribe the implementation, so it could be that any non-zero value is treated as true, but at least the Microsoft compiler opts for forcing the value to 0 or 1 when converting something else to bool, and then relies on the fact that it can't have any other value.

    I once spent a couple of hours hunting down a strange bug which turned out to have been caused by an uninitialized boolean. It ended up with some invalid value (like 24) and behaved as neither true nor false.

    Maybe it's complaining because you've pointlessly added logic to check to see if the integer value is "truthy" and convert it to either 1 or 0, when the condition that you're using it in would treat any nonzero value as truthy anyway.

    We are talking about conversion to bool outside of a condition.



  • @bulb said in WTF Bites:

    @anotherusername said in WTF Bites:

    A bool is an int anyway, and any nonzero value is true.

    That's not true. bool is an integral type that can only be false, which equals 0, and true, which equals 1.

    For the bool type, yes. For the int type, which is for all intents and purposes just as good, any nonzero value is as good as any other, so casting to bool is a waste of effort.

    Maybe it's complaining because you've pointlessly added logic to check to see if the integer value is "truthy" and convert it to either 1 or 0, when the condition that you're using it in would treat any nonzero value as truthy anyway.

    We are talking about conversion to bool outside of a condition.

    There's never a good reason to convert anything to bool. Prove me wrong.

    Unless you're storing it as a single physical bit, anyway. And a bool isn't that.



  • @bb36e said in WTF Bites:

    @anotherusername Europe does it on the 25th, typical EU bureaucracy

    No, it is not always the 25th. It is the last Sunday in March (i.e. just after the equinox); the switch back is the last Sunday in October, which is about a month too late. Switching roughly equinox to equinox would make the most sense.



  • @anotherusername said in WTF Bites:

    There's never a good reason to convert anything to bool. Prove me wrong.

    There is never a good reason to program in C++ anyway. But when you do, overloading might require that you use it. Plus there is all the self-documentation—some functions simply make much more sense when they accept or return bool than if they used int.

    @anotherusername said in WTF Bites:

    Unless you're storing it as a single physical bit, anyway. And a bool isn't that.

    Individually, you don't. But e.g. std::vector<bool> only uses 1 bit per element.
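    A quick illustration of that bit-packed std::vector<bool> specialization: because elements are stored as individual bits, operator[] has to return a proxy object rather than a bool&.

    ```cpp
    #include <cassert>
    #include <type_traits>
    #include <vector>

    int main() {
        std::vector<bool> v(100, false);
        v[3] = true;

        // operator[] returns std::vector<bool>::reference, a proxy class --
        // a side effect of packing the elements into individual bits:
        static_assert(!std::is_same<decltype(v[0]), bool&>::value,
                      "vector<bool> hands out a proxy, not a bool&");

        assert(v[3] == true);
        assert(v[4] == false);
        return 0;
    }
    ```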


  • BINNED

    @anotherusername said in WTF Bites:

    @lb_ said in WTF Bites:

    @anotherusername said in WTF Bites:

    @lb_ Can you provide an example?

    The post you quoted was the example. Because of the warning you now have to write something ? true : false or !!something instead of the clearer static_cast<bool>(something) or bool b = something;.

    The question was why you need a bool. All you're doing is adding some syntactic sugar to an int and some logic, probably completely unnecessary, to make sure it's only allowed to be 0 or 1.

    You can also do

    int a = 2;
    bool b = a;
    

    but now you're converting int to bool implicitly, and if the compiler warned about that it would at least be reasonably defensible. But when you do bool b = (bool) a (or the static_cast variant) you're telling the compiler "this is explicitly what I want, shut up, I know what I'm doing". Warning you about this is fucking braindead.

    Unless your question is literally "why you need a bool, ever". The answer to that is static typing.



  • @topspin said in WTF Bites:

    Unless your question is literally "why you need a bool, ever". The answer to that is static typing.

    The second answer is that the generated code can differ by some amount. On GCC and Clang, the expression a && b will

    • for a and b being int: test if a is non-zero, remember this; then test if b is non-zero, and combine this with the previous result (~6 insns)
    • for a and b being bool: and a with b (~2 insns, one of them register housekeeping and the other one being the bit-and)

    The latter is possible with bool, because the compiler knows exactly which values the bool can have (and thus knows that it's safe to just bit-and these).

    MSVC produces the same (longer) code for both, though. The impact is also likely to be smaller in real code, especially if the compiler can prove that you're misusing an int as a boolean (that is, it can show that you only ever assign zero or one to it).

    Obligatory compiler explorer link
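    For reference, the two variants under comparison look like this; they behave identically, and the difference only shows up in the generated assembly:

    ```cpp
    #include <cassert>

    // Behaviourally identical; on GCC/Clang the bool version can compile
    // down to a single bit-and, while the int version needs two
    // test-against-zero sequences first.
    bool and_int(int a, int b)    { return a && b; }
    bool and_bool(bool a, bool b) { return a && b; }

    int main() {
        assert(and_int(2, 3) == true);   // any nonzero int counts as true
        assert(and_int(2, 0) == false);
        assert(and_bool(true, true) == true);
        assert(and_bool(true, false) == false);
        return 0;
    }
    ```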



  • @anotherusername said in WTF Bites:

    @lb_ said in WTF Bites:

    @anotherusername said in WTF Bites:

    @lb_ Can you provide an example?

    The post you quoted was the example. Because of the warning you now have to write something ? true : false or !!something instead of the clearer static_cast<bool>(something) or bool b = something;.

    The question was why you need a bool. All you're doing is adding some syntactic sugar to an int and some logic, probably completely unnecessary, to make sure it's only allowed to be 0 or 1.

    Because the interface that gave you the value returned an int and the interface that you're giving the value to wants a bool. It's entirely out of my control most of the time.



  • @cvi said in WTF Bites:

    @topspin said in WTF Bites:

    Unless your question is literally "why you need a bool, ever". The answer to that is static typing.

    The second answer is that the generated code can differ by some amount. On GCC and Clang, the expression a && b will

    • for a and b being int: test if a is non-zero, remember this; then test if b is non-zero, and combine this with the previous result (~6 insns)
    • for a and b being bool: and a with b (~2 insns, one of them register housekeeping and the other one being the bit-and)

    The latter is possible with bool, because the compiler knows exactly which values the bool can have (and thus knows that it's safe to just bit-and these).

    Except that a bitwise and doesn't short-circuit, so if the compiler does that, (bool) a && (bool) b will always have to evaluate both sides. (Or it uses the longer code, which does short-circuit. Realistically, the compiler will probably use the longer, short-circuiting code if it can't prove that evaluating b has no side effects, and use the bit-and only if it can.)



  • @anotherusername Sorry - should have clarified that in that example a and b are either just an int or just a bool, and not sub-expressions. So, short-circuiting doesn't make any sense there. But in cases where it does, you're right - there's probably not much difference in the code for either (at least on x86).

    I admit that it's a bit of a contrived example. Still, the key idea is that giving the compiler more accurate information (i.e., that something is a boolean vs. an int that happens to be used like a boolean) potentially allows it to do a better job. And the example shows that this happens, at least occasionally.
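    A minimal sketch of the short-circuiting behaviour that constrains the compiler here: with a zero left-hand side, the right-hand side of && must never be evaluated, so a plain bit-and is only legal when the right-hand side provably has no side effects.

    ```cpp
    #include <cassert>

    static bool rhs_evaluated = false;

    bool rhs() {
        rhs_evaluated = true;  // observable side effect
        return true;
    }

    int main() {
        int a = 0;
        // && short-circuits: with a == 0, rhs() never runs, so the
        // compiler may only substitute a bit-and when it can prove the
        // right-hand side is side-effect free.
        bool r = a && rhs();
        assert(r == false);
        assert(rhs_evaluated == false);
        return 0;
    }
    ```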



  • @anotherusername said in WTF Bites:

    @coldandtired said in WTF Bites:

    Windows put the clock forward two hours last night on the desktop but one on the phone.

    :wtf: Daylight Savings Time was two weeks ago.

    That varies...


  • BINNED

    @anotherusername It's also about abstraction. bool and int are semantically different, one is for logic and the other for arithmetic. That both are represented as bits, just like everything else, is an implementation detail at some level. enums are ints, too, but types convey certain semantics.

    Shitty made up example, again:

    int a = 1;
    bool b = a;
    std::cout << std::boolalpha;
    std::cout << a << "\n";
    std::cout << b << "\n";
    
    Output:
    
    1
    true
    

  • Discourse touched me in a no-no place

    @topspin IOW, let's describe what we're doing with our program more accurately to the computer so it (and our future selves too) can do a better job with understanding it. The compiler (or interpreter) that understands your code better can potentially generate better code from it, warn about potential problems in it, etc.


  • Considered Harmful

    @dkf Rust does that a lot, which I like. E.g. nearly every language takes a millisecond count in its sleep functions, but not Rust - only std::time::Duration is used. It's zero-cost to convert a millisecond count to a Duration struct, but far more expressive.


  • BINNED

    @bb36e said in WTF Bites:

    Doing it right: using a forum as a forum
    Doing it wrong: using a forum as a bug tracker
    Doing it :wtf:: using a bug tracker as a forum

    0_1522065387023_295a3f9a-ff67-4676-ae87-b2b69df42daf-image.png


  • Java Dev

    @atazhaia said in WTF Bites:

    Good job Microsoft, breaking yet another redirect. Now, typing microsoft.se into the browser used to give you the Swedish Microsoft site as expected. However...

    0_1519898142442_ms-se.png

    As you can see, the URL you get redirected to gives a 404. I used the language picker to manually select the language and...

    0_1519898205158_ms-se-2.png

    So instead of the country name it's the locale. Good job fucking up the redirect to the MAIN PAGE OF YOUR SITE. It's not like that one is at all important, right?

    Oh, it's been 25 days since I posted this? Well, it's still like this. I had to ask a colleague to make sure it wasn't just me, but his Chrome on Windows 10 had the same result as Firefox on Linux Mint, so...


  • BINNED

    @rhywden said in WTF Bites:

    Also, something like a driver update shouldn't take out both GPUs.

    That depends. Pretty much all notebooks with dual graphics today use muxless switching, meaning I'm 99.9% sure 3D stuff gets offloaded to the nvidia card, but is in the end displayed by the Intel card. I never messed with that on Windows, but see if you can disable offloading somewhere in Intel's settings. It is possible that the nvidia driver is busted and the Intel card offloads all 3D to it at all times.

    Might not be the case, but if you can force using only the Intel card you'll know which one fails - if it fixes the problem, it's the nvidia card, if it doesn't it's the Intel.



  • @pie_flavor said in WTF Bites:

    @dkf Rust does that a lot, which I like. E.g. nearly every language takes a millisecond count in its sleep functions, but not Rust - only std::time::Duration is used. It's zero-cost to convert a millisecond count to a Duration struct, but far more expressive.

    So does C++:
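    (The illustration did not survive, but presumably something along these lines, using std::chrono from C++11 onwards:)

    ```cpp
    #include <cassert>
    #include <chrono>
    #include <thread>

    int main() {
        using namespace std::chrono_literals;

        // sleep_for takes a std::chrono::duration, not a raw count:
        std::this_thread::sleep_for(10ms);

        // Durations carry their unit in the type and convert explicitly:
        std::chrono::milliseconds half_second = 500ms;
        assert(half_second.count() == 500);
        assert(std::chrono::duration_cast<std::chrono::seconds>(half_second).count() == 0);
        return 0;
    }
    ```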



  • @onyx said in WTF Bites:

    @rhywden said in WTF Bites:

    Also, something like a driver update shouldn't take out both GPUs.

    That depends. Pretty much all notebooks with dual graphics today use muxless switching, meaning I'm 99.9% sure 3D stuff gets offloaded to the nvidia card, but is in the end displayed by the Intel card. I never messed with that on Windows, but see if you can disable offloading somewhere in Intel's settings. It is possible that the nvidia driver is busted and the Intel card offloads all 3D to it at all times.

    Might not be the case, but if you can force using only the Intel card you'll know which one fails - if it fixes the problem, it's the nvidia card, if it doesn't it's the Intel.

    Unlikely. Because, you see, the Surface Book is a bit of a special beast because I can detach the display from the base unit. And it's the base unit which contains the nVidia GPU.

    Which means that in detached mode, it only has access to the Intel GPU.

    Behaviour is the same in both modes.

    Also, the Surface Book definitely does not use muxless switching for the simple reason that, as soon as the nVidia GPU becomes engaged, undocking from the base is disallowed.


  • BINNED

    @rhywden found online, possibly for a different model, but:

    Microsoft does have Optimus enabled, with the default setting in the custom control panel for Auto-Select on the GPU, but you can also change to integrated or the NVIDIA graphics.

    Do you have that? Maybe force nvidia there, see if anything changes? It is a weird problem either way, but the whole switchable graphics business is finicky as all hell to begin with (don't get me started on trying to get an Intel + AMD combo working on Linux...).



  • @onyx said in WTF Bites:

    @rhywden found online, possibly for a different model, but:

    Microsoft does have Optimus enabled, with the default setting in the custom control panel for Auto-Select on the GPU, but you can also change to integrated or the NVIDIA graphics.

    Do you have that? Maybe force nvidia there, see if anything changes? It is a weird problem either way, but the whole switchable graphics is finnicky as all hell to begin with (don't get me started on trying to get Intel + AMD combo working on Linux...).

    It's not the graphics drivers themselves. I already did all that. It's either something higher-level or in another part of the graphics stack.
    Basically, it has nothing to do with the dual-GPU setup.



  • @pie_flavor said in WTF Bites:

    @dkf Rust does that a lot, which I like. E.g. nearly every language takes a millisecond count in its sleep functions, but not Rust - only std::time::Duration is used. It's zero-cost to convert a millisecond count to a Duration struct, but far more expressive.

    Languages that take a millisecond count as int/unsigned int are :doing_it_wrong:. C++ takes a std::chrono::duration. Granted, it also has a nicer syntax for it, because you can simply write 42ms. At the least, Rust should get a ms constant, so you can write the almost-as-good 42 * ms.
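    Both spellings actually work in C++ already, since chrono durations support scalar multiplication; a quick sketch:

    ```cpp
    #include <cassert>
    #include <chrono>

    int main() {
        using namespace std::chrono_literals;

        auto a = 42ms;            // the literal form
        constexpr auto ms = 1ms;  // a "ms constant", Rust-style
        auto b = 42 * ms;         // scalar * duration is also a duration

        assert(a == b);
        assert(a.count() == 42);
        return 0;
    }
    ```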


  • BINNED

    @rhywden in that case, I recommend taking a crowbar to it.

    What? I'm out of other ideas, and a crowbar is bound to do something at least.


  • area_deu

    @onyx said in WTF Bites:

    a crowbar is bound to do something at least.

    It at least gives you certainty about what the problem actually is. And isn't that the first step to solving it?


  • 🚽 Regular

    @bulb said in WTF Bites:

    Languages that take millisecond count as int/unsigned int are :doing_it_wrong:. C++ takes a std::duration. Granted, it does have a nicer syntax for it, because you can simply write 42ms. At least, Rust should get a ms constant, so you can write almost as good 42 * ms.

    TIL about C++11 user-defined literals.


  • BINNED

    @zecc said in WTF Bites:

    TIL about C++11 user-defined literals.

    Dear lord. Don't tell people about this. I mean, it's cool, but I'm afraid of people using them. I'm pretty sure that the amount of abuse this enables is greater than #define and goto combined...



  • @onyx Here's something for you then:

    // Compile with: g++ -std=c++11 -O2 file.cpp -static
    #include <cstdio>
    #include "evil.hpp"
    
    int main()
    {
    	volatile int i = 0;
    
    	500_label;
    	printf( "Hello %d\n", i );
    	
    	if( i++ < 10 )
    		500_goto;
    
    	return 0;
    }
    
    evil.hpp
    #include <setjmp.h>
    
    template< char... > struct Env_ {
    	static jmp_buf env;
    };
    template< char... tChar > jmp_buf Env_<tChar...>::env;
    
    template< char... tChar > inline 
    void __attribute__((always_inline)) operator ""_label() {
    	// g++ refuses to inline a function with setjmp(). But let's not have that
    	// stop us.
    	__asm__ volatile(
    		"1:\n"
    		"movl %0, %%edi\n"
    		"call _setjmp\n"
    		"test %%eax, %%eax\n"
    		"jne 1b\n"
    		: : "g"(Env_<tChar...>::env) : "eax", "edi"
    	);
    }
    template< char... tChar > inline
    void operator ""_goto() {
    	longjmp( Env_<tChar...>::env, 666 );
    }
    

    It "works". For some definition of "work". (Disclaimer: I don't ever recommend doing this.)

    Filed under: Aiming for the front-page.


  • BINNED

    @cvi For the first time ever, I'm tempted to Godwin the thread...



  • @onyx Well, to fight evil, you have to know evil. (Problem is that the dark side is so much more fun once you get to know it.)



  • @anotherusername said in WTF Bites:

    For the bool type, yes. For the int type, which is for all intents and purposes just as good, any nonzero value is as good as any other, so casting to bool is a waste of effort.

    You're relying on implementation detail. That just happens to be true at the moment; it's not true due to some God-delivered commandment from heaven. Use the correct data types.

    @anotherusername said in WTF Bites:

    There's never a good reason to convert anything to bool. Prove me wrong.
    Unless you're storing it as a single physical bit, anyway. And a bool isn't that.

    Maybe tomorrow a CPU that has a 1-bit bool register will come out, and everybody who foolishly relied on the implementation detail that on older CPUs there was no performance difference between int and bool will find their code is way slower now.

    Use the correct data types. Don't rely on implementation details to make your program work optimally.

    @bulb said in WTF Bites:

    There is never a good reason to program in C++ anyway.

    That's really the better answer. C++ is a shitty language. Stop using it.

    (If only because every time someone posts some stupid C++ trivia, it turns into this long-ass boring conversation about boring C++ features that other, better, languages don't have by boring "standards nazis" that other, better, languages also don't have because other, better, languages are designed well enough to not have hundreds of traps and omissions in their standards. And lo and behold, look what happened in this thread.)

    Stop The Boring: Stop C++.



  • @blakeyrat said in WTF Bites:

    @anotherusername said in WTF Bites:

    For the bool type, yes. For the int type, which is for all intents and purposes just as good, any nonzero value is as good as any other, so casting to bool is a waste of effort.

    You're relying on implementation detail. That just happens to be true at the moment; it's not true due to some God-delivered commandment from heaven. Use the correct data types.

    The "implementation detail" that int types work correctly in conditions isn't going to change.

    @anotherusername said in WTF Bites:

    There's never a good reason to convert anything to bool. Prove me wrong.
    Unless you're storing it as a single physical bit, anyway. And a bool isn't that.

    Maybe tomorrow a CPU that has a 1-bit bool register will come out and everybody who foolishly relied on the implementation detail that in older CPUs there was no performance difference between int and bool, their code is way slower now.

    Sure. I mean, CPUs have gone from 8-bit to 16-bit to 32-bit to 64-bit, but hypothetically speaking, tomorrow someone might build a CPU with dedicated 1-bit registers, and all the 1-bit architecture that it'd need so that everything is bit-addressable and moving bits in and out of those registers isn't a complete waste of time and effort. It would be stupid, and I don't know why they would, but sure.

    So hypothetically speaking, on that CPU, what would sizeof(bool) be?

    Use the correct data types. Don't rely on implementation details to make your program work optimally.

    If you really believed that, you'd have to write if ((bool) a) or if (a != 0) instead of just writing if (a).
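    For what it's worth, the standard only pins down a lower bound on that hypothetical; sizeof is measured in bytes, so sizeof(bool) can never report less than 1. A quick check:

    ```cpp
    #include <cassert>

    int main() {
        // The standard leaves sizeof(bool) implementation-defined, but
        // sizeof yields whole bytes, so it is at least 1 -- even on a CPU
        // with 1-bit registers, bool would still occupy a full byte in
        // memory as far as sizeof is concerned.
        static_assert(sizeof(bool) >= 1, "sizeof reports whole bytes");
        assert(sizeof(bool) == 1);  // true on common desktop platforms (assumption)
        return 0;
    }
    ```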



  • @anotherusername said in WTF Bites:

    So hypothetically speaking, on that CPU, what would sizeof(bool) be?

    Butt.

    sizeof(bool) == "butt".



  • @blakeyrat said in WTF Bites:

    @anotherusername said in WTF Bites:

    So hypothetically speaking, on that CPU, what would sizeof(bool) be?

    Butt.

    sizeof(bool) == "butt".

    CPUs already have several 1-bit registers. They're listed here: https://en.wikipedia.org/wiki/FLAGS_register#FLAGS



  • @cvi said in WTF Bites:

    @onyx Here's something for you then:

    // Compile with: g++ -std=c++11 -O2 file.cpp -static
    #include <cstdio>
    #include "evil.hpp"
    
    int main()
    {
    	volatile int i = 0;
    
    	500_label;
    	printf( "Hello %d\n", i );
    	
    	if( i++ < 10 )
    		500_goto;
    
    	return 0;
    }
    
    evil.hpp
    #include <setjmp.h>
    
    template< char... > struct Env_ {
    	static jmp_buf env;
    };
    template< char... tChar > jmp_buf Env_<tChar...>::env;
    
    template< char... tChar > inline 
    void __attribute__((always_inline)) operator ""_label() {
    	// g++ refuses to inline a function with setjmp(). But let's not have that
    	// stop us.
    	__asm__ volatile(
    		"1:\n"
    		"movl %0, %%edi\n"
    		"call _setjmp\n"
    		"test %%eax, %%eax\n"
    		"jne 1b\n"
    		: : "g"(Env_<tChar...>::env) : "eax", "edi"
    	);
    }
    template< char... tChar > inline
    void operator ""_goto() {
    	longjmp( Env_<tChar...>::env, 666 );
    }
    

    It "works". For some definition of "work". (Disclaimer: I don't ever recommend doing this.)

    Filed under: Aiming for the front-page.

    When I see this, I am saddened that I did not have C++11 to sodomise when I went to university.
    Another guy and I on a big project did our best to outdo each other with ways to use C++ in entirely wrong ways in a single class file.
    The rest of the people on that project didn't know what the hell that file was, and didn't dare touch it. In reality, it was entirely unused, but looked like it was used. :D
    But that there stuff is one step above the shenanigans we pulled.


  • 🚽 Regular

    @anotherusername said in WTF Bites:

    The "implementation detail" that int types work correctly in conditions isn't going to change.

    For some value of "correctly"; where zero is not true but negative integers are.



  • WTF of my day: So, some of my pupils have to take a rather basic math test. Including the analysis of basic diagrams, y'know, pie charts, bar diagrams and stuff.

    So I looked around a bit and found some sites which had practice materials for exactly this. One of those sites was named "www.commoncoresheets.de".

    Now, I know that this common core bit is more of a US'ian thing, but they proclaimed that they had translated everything into German (and other assorted languages).

    I'm not quite sure who exactly did the translation. Or maybe what. Could also have been a bunch of monkeys.

    I mean, take this bar graph:

    0_1522088973730_f5fe399b-5780-47a1-8bef-7b08e664dc7c-image.png

    y-axis being the number of cars for a given colour, with the colours (x-axis) being: silver, red, white and yellow.

    There are some questions you have to answer using the diagram. Those questions contain the following gems:

    1. How many cars were a snake?
    2. Were there more cars in country or more cars in yellow?
    3. How big is the difference between the number of cars in a fish and the number of cars in an interview?
    4. How many more cars are there in interview than in gerbil?
    5. Are there less cars in the 80s than in a gerbil?

    :wtf:


  • Java Dev

    @rhywden Are they testing for teachers who don't pre-check the tests they put out?


  • Notification Spam Recipient

    @pleegwat said in WTF Bites:

    @rhywden Are they testing for teachers who don't pre-check the tests they put out?

    Teachers are expected to give it verbatim, and to mark the students incorrect if they don't give the right answer verbatim.



  • @zecc said in WTF Bites:

    @anotherusername said in WTF Bites:

    The "implementation detail" that int types work correctly in conditions isn't going to change.

    For some value of "correctly"; where zero is not true but ~~negative~~ all other integers are.

    FTFY.


  • :belt_onion:

    @tsaukpaetra said in WTF Bites:

    @pleegwat said in WTF Bites:

    @rhywden Are they testing for teachers who don't pre-check the tests they put out?

    Teachers are expected to give it verbatim, and to mark the students incorrect if they don't give the right answer verbatim.

    :angry:

