Visual Studio WTF


  • Banned

    @PleegWat oh, I see what you're getting at. If p is not inside buf, then the comparisons aren't guaranteed to work. Gotcha.

    This is actually an interesting example of differences between C and C++. In C, comparison between unrelated objects is undefined, while in C++ it's unspecified. Unlikely to make a difference in practice regardless of platform, but technically speaking, a C compiler is allowed to replace that function with return p != buf + bufsz, whereas a C++ compiler cannot do that. This is also why I didn't spot it at first, since I code in C++, not C.



  • @Gąska said in Visual Studio WTF:

    I know the difference is quite subtle, but *p++ = expression with *p is different from "same class defined twice and one of them has extra private:". There's a reason I put emphasis on LITERALLY JUST THIS ONE EXAMPLE AND NOTHING ELSE, I MEAN NOTHING ELSE, LITERALLY JUST THIS ONE EXAMPLE.

    Good grief. It's not "same class defined twice and one has extra private:". It's two different classes with the same name.



  • @Gąska said in Visual Studio WTF:

    @Steve_The_Cynic said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @Steve_The_Cynic said in Visual Studio WTF:

    Have you seen LITERALLY THIS ONE SPECIFIC CASE that I'm talking about? Because if you haven't seen LITERALLY THIS ONE SPECIFIC CASE, then no, you haven't seen anything that disproves anything I say. Because I'm talking about LITERALLY THIS ONE SPECIFIC CASE.

    OK, so let's go back to the specific case. The two classes have a semantically-equivalent but textually-different definition, but their member functions are defined in some .cpp files rather than in the headers that I, as a user of those classes, will #include. When I call a member function on the object, which version of the member function will the linker give me? The two .cpp files both define void Fred::some_function(some_parameters); and I cannot predict which version of that function will be called. That's why it matters.

    The classes are semantically identical, so it doesn't matter which of the functions gets picked each time - the result will always be the same. Now, if the two methods had different code (i.e. the classes DID differ semantically), it would be different.

    Fussy: the definitions of the classes (i.e. their external interfaces) are semantically equivalent. That doesn't mean that the two classes are the same. Presumably the fact that they are defined like that makes them different classes, and therefore we have to assume (because the standard says it's UB) that the code is different as well, and that weird shit will happen.

    Once again, in case you didn't get it the other 57 times I've said it: once you have UB in your program, you can rely on exactly nothing working correctly.

    Once again, in case you didn't get it the other 57 times I've said it: you still haven't shown how THIS PARTICULAR UB can lead a NON-MALICIOUS compiler to generate wrong code. Not even a hypothetical. You just repeat "UB is UB is UB" ad nauseam as if I didn't know the subject of this discussion. But you haven't even attempted to show that an actual problem can result from this code.

    I did give a hypothetical. The class definition begins at the word "class" and ends at the closing brace. The actual code for the member functions is specified elsewhere that I, as an includer of withprivate.hpp or withoutprivate.hpp cannot see. Specifically, it's in withprivate.cpp and withoutprivate.cpp. What will the linker do if the code is different in those two files?

    That is, there is more than just the compiler in play here. The linker might be nice to you and fail immediately, or it might be more like a Unixy dynamic load, and depend on the order in which things are loaded or linked or something. No one element is malicious, but it won't work in a way that you can predict just by looking at the code.
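    To make the scenario concrete, here is a minimal sketch of the clash being described (file names follow the post; the function bodies are placeholders, and it deliberately cannot be built as a single file):

```cpp
// Two translation units, each including a textually different but
// "equivalent" definition of a class named Fred, and each providing
// its own body for the same member function:

// --- withprivate.cpp -------------------------------------------
// #include "withprivate.hpp"
// void Fred::some_function() { /* version A */ }

// --- withoutprivate.cpp ----------------------------------------
// #include "withoutprivate.hpp"
// void Fred::some_function() { /* version B */ }

// Both object files emit the same mangled symbol for
// Fred::some_function. That's an ODR violation, and no diagnostic is
// required: a static linker may report a duplicate-symbol error or
// silently keep whichever definition it encounters first, and with
// dynamic linking the winner can depend on load order. Either way,
// which version a call resolves to is not predictable from the source.
```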


  • Banned

    @Steve_The_Cynic said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @Steve_The_Cynic said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @Steve_The_Cynic said in Visual Studio WTF:

    Have you seen LITERALLY THIS ONE SPECIFIC CASE that I'm talking about? Because if you haven't seen LITERALLY THIS ONE SPECIFIC CASE, then no, you haven't seen anything that disproves anything I say. Because I'm talking about LITERALLY THIS ONE SPECIFIC CASE.

    OK, so let's go back to the specific case. The two classes have a semantically-equivalent but textually-different definition, but their member functions are defined in some .cpp files rather than in the headers that I, as a user of those classes, will #include. When I call a member function on the object, which version of the member function will the linker give me? The two .cpp files both define void Fred::some_function(some_parameters); and I cannot predict which version of that function will be called. That's why it matters.

    The classes are semantically identical, so it doesn't matter which of the functions gets picked each time - the result will always be the same. Now, if the two methods had different code (i.e. the classes DID differ semantically), it would be different.

    Fussy: the definitions of the classes (i.e. their external interfaces) are semantically equivalent. That doesn't mean that the two classes are the same. Presumably the fact that they are defined like that makes them different classes, and therefore we have to assume (because the standard says it's UB) that the code is different as well, and that weird shit will happen.

    Once again, in case you didn't get it the other 57 times I've said it: once you have UB in your program, you can rely on exactly nothing working correctly.

    Once again, in case you didn't get it the other 57 times I've said it: you still haven't shown how THIS PARTICULAR UB can lead a NON-MALICIOUS compiler to generate wrong code. Not even a hypothetical. You just repeat "UB is UB is UB" ad nauseam as if I didn't know the subject of this discussion. But you haven't even attempted to show that an actual problem can result from this code.

    I did give a hypothetical. The class definition begins at the word "class" and ends at the closing brace. The actual code for the member functions is specified elsewhere that I, as an includer of withprivate.hpp or withoutprivate.hpp cannot see. Specifically, it's in withprivate.cpp and withoutprivate.cpp.

    As I explained earlier, this isn't the original scenario. At least not how I understood the original scenario. I thought we're just talking about class definitions and nothing more. But here we're talking about multiple definitions of members, specifically outside of the class definition.

    private: is a red herring. Having two compilation units defining the same functions (as in same-named members of a same-named class) is UB regardless of whether you have two definitions or just one. And anything wrong that happens, will happen due to two compilation units defining the same functions, not due to any difference between class definitions.


  • BINNED

    @dkf said in Visual Studio WTF:

    The point is, if it is defined then it is defined, even if the defining authority is not the main standards committee.

    Reminds me of when Linus herp-derped about people pointing out UB. Along the lines of who cares if it’s UB in C, it’s well defined by gcc and that’s what we‘re compiling with. The idea of making something less dependent on a particular compiler seemed crazy to him.



  • @Gąska said in Visual Studio WTF:

    technically speaking, a C compiler is allowed to replace that function with return p != buf + bufsz

    I didn't understand at first why you wrote that specific example of what it could be replaced with.
    But since it's UB, the compiler can technically replace it with anything.



  • If there are two C++ functions with the same name and signature, but one of them is only used locally and so decorated with static, is it still allowed to be called instead of the other one?


  • Java Dev

    @marczellm Not quite.

    The compiler can assume the inputs will never be such that the function results in UB. Thus, since the block goes from buf to buf + bufsz, the two conditions p >= buf and p <= buf + bufsz can be assumed to always hold.

    In practice, the compiler doesn't actually know about units of allocation, so it cannot do this. And on a hardware level a pointer in 32 or 64 bit x86 architecture is just a 32 or 64 bit integer (respectively) so the code tends to work.

    However in 16-bit x86 (and I'm approximating here based on limited knowledge, I'm sure someone will be along to correct me) a pointer is a 16-bit segment plus a 16-bit offset, which are stored in different registers. It's quite likely a compiler would optimise all pointer math to use only the offset and ignore the segment.


  • Banned

    @marczellm said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    technically speaking, a C compiler is allowed to replace that function with return p != buf + bufsz

    I didn't understand at first why you wrote that specific example of what it could be replaced with.
    But since it's UB, the compiler can technically replace it with anything.

    Here's the funny thing about UB: it's not UB to have a possibility of doing the wrong thing; it's only UB to actually do the wrong thing at runtime. A compliant compiler must generate code such that, if the program only ever provides values that don't trigger UB, everything works correctly: the standard must be followed, and "anything can happen" does not apply.

    Example:

    if (x > 0) {
        printf("%d", *((int*)NULL));
    } else {
        printf("%d", x);
    }
    

    A standard-compliant compiler is allowed to do whatever it wants when x is greater than zero. But when x is less than or equal to 0, there is no UB and the compiler must follow the standard precisely and doesn't have any wiggle room, since UB didn't happen. So the compiler is allowed to remove the first printf, it's allowed to replace the first printf with any nasal demons it wants, it's allowed to remove the conditional entirely, but the second printf must stay intact no matter what.

    Same with @PleegWat's example. When UB happens the compiler can do whatever it wants, but if it doesn't happen, it must follow the standard precisely. If the pointer p is inside the buffer, the function must always return true and do nothing else. If p is one-past-end of the buffer, the function must always return false and do nothing else. Everything else is undefined (in C, but not in C++). An optimizing compiler would look at the code and think, "I need to return false for one-past-end and true for everything else, what's the fastest way to do that?" and you end up with p != buf + bufsz.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    Here's the funny thing about UB: it's not UB to have a possibility of doing the wrong thing; it's only UB to actually do the wrong thing at runtime.

    That seems to miss the point of UB entirely. Which is that there is no wrong thing. Because there's also no right thing. It's undefined.

    The situation you're describing is where the action is felicitous, perhaps by chance and not at all guaranteed to continue (assuming that the particular implementation did not go beyond the standard and officially define that behavior).



  • @TwelveBaud said in Visual Studio WTF:

    @Gribnit From 2002-2007 there was Visual J#, which was a source-compatible Java language implementation on top of the CLR. Aaaaand then it got Oracled.

    I don't know about anything more recent than that, but I suspect not.

    Microsoft Java --> J# --> C#

    They just kept changing the name of their Java shit until they stopped getting sued. And it was Sun not Oracle at the time.


  • Considered Harmful

    @CodeJunkie said in Visual Studio WTF:

    @TwelveBaud said in Visual Studio WTF:

    @Gribnit From 2002-2007 there was Visual J#, which was a source-compatible Java language implementation on top of the CLR. Aaaaand then it got Oracled.

    I don't know about anything more recent than that, but I suspect not.

    Microsoft Java --> J# --> C#

    They just kept changing the name of their Java shit until they stopped getting sued. And it was Sun not Oracle at the time.

    C# is definitely not the same as Java. C# interface names tend to start with an I.


  • Banned

    @boomzilla said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    Here's the funny thing about UB: it's not UB to have a possibility of doing the wrong thing; it's only UB to actually do the wrong thing at runtime.

    That seems to miss the point of UB entirely. Which is that there is no wrong thing. Because there's also no right thing. It's undefined.

    Once again, you completely misr... okay, I feel charitable today so let's try again.

    There are some obviously wrong things to do in a program that should never happen. Accessing uninitialized memory is obviously wrong. Dereferencing a null pointer is obviously wrong. Casting a pointer to an incompatible type is obviously wrong. And so on. These are things that no program should even attempt to do. The problem is, it's impossible to tell whether this is happening until you actually run the code. Take a function accepting a pointer as an argument. It may be a valid pointer, or it may be invalid. At the time of compiling that function, it's impossible to tell which of the two scenarios you're in. The function may be completely fine, or the function may be completely wrong because of a null pointer argument, and you won't know until the function is actually called. From the language spec's point of view, there are three ways to approach this problem:

    • Ensure at compile time that the wrong thing can never ever happen and reject code that leaves any room for error (the Rust approach.)
    • Detect the wrong thing at runtime and make those runtime checks part of the language semantics (the Java approach.)
    • Trust the programmer to never do the wrong thing (the C approach.)

    With the Rust approach, you get a lot of boilerplate related to ensuring that the thing that will never happen anyway actually won't happen, in a way that the compiler can verify at compile time (which, by the way, is the main complaint people have about Rust). With the Java approach, you get a significant performance penalty basically everywhere in the program due to all the runtime checks required. With the last approach, you avoid both issues, and also you get a lot less to worry about when developing the compiler, since you can assume that those obviously wrong things will never happen and so you don't need any special code to handle them (compiler developers, like all developers, are lazy fucks).

    Most UBs aren't about the code itself, but rather about what the code might do under some circumstances. As long as those circumstances don't happen (e.g. the argument in our example function from before is never null at runtime), there is no UB. The semantics of the program are well defined and the program will always behave in a predictable way in line with the language spec, as long as the UB-triggering circumstances don't happen. Signed overflow during addition is UB, but if you take steps to ensure the overflow won't happen, then a + b will always, always result in exactly the sum of a and b. And you don't have to ensure it in the same function. You can do it on the outside, before the function containing the addition is called. Hell, you can even ensure it outside the program itself, by checking the input data before it's even fed into the program. As long as the addition doesn't result in overflow at runtime, the program will always work in a predictable way, because the circumstances that trigger UB never happen.

    And it's not just language lawyering. This has very practical consequences. Every downcast is potentially a bad downcast, and potentially UB - but if you take care to only downcast when you're sure of the actual type, then everything works correctly. Most message-handling-like code relies on this - downcasting based on the message type enum, even though there's no actual guarantee the enum value and the object type will always match. But since in practice it always does, there's no UB.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    Once again, you completely misr... okay, I feel charitable today so let's try again.

    Everything you said confirms my initial read. You're smearing together the run time effects of UB with the definition, as well as talking about different situations and confusing yourself by the fact that they have different triggers and manifestations.


  • Considered Harmful

    @boomzilla said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    Once again, you completely misr... okay, I feel charitable today so let's try again.

    Everything you said confirms my initial read. You're smearing together the run time effects of UB with the definition, as well as talking about different situations and confusing yourself by the fact that they have different triggers and manifestations.

    Alright, lines of C code read + written² + spec lines understood³ + spec lines written⁴ time, gentlemen. No lying, this is the Internet.



  • @Gribnit said in Visual Studio WTF:

    spec lines understood³

    Is this spec lines understood at the moment of reading, or spec lines understood and remembered?


  • Considered Harmful

    @cvi said in Visual Studio WTF:

    @Gribnit said in Visual Studio WTF:

    spec lines understood³

    Is this spec lines understood at the moment of reading, or spec lines understood and remembered?

    Shit. Okay, insert and update all O(0.5) the exponents.


  • Banned

    @boomzilla said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    Once again, you completely misr... okay, I feel charitable today so let's try again.

    Everything you said confirms my initial read.

    No?

    You're smearing together the run time effects of UB with the definition

    No?

    as well as talking about different situations and confusing yourself by the fact that they have different triggers and manifestations.

    No?

    In my last post, I didn't say anything about runtime effects OF UB. It was entirely about runtime behavior that avoids UB entirely. When I say UB didn't happen, it's because it didn't, literally didn't, and it has no effects whatsoever because it literally didn't happen. And I have the C language specification to back my words with. It contains the precise definition of UB, and that definition says that if a function MAY cause integer overflow but DOESN'T ACTUALLY cause integer overflow because of runtime conditions, then IT IS NOT UB.


  • Considered Harmful

    @Gąska I think you are missing an essential point here, since you are not an alien who sits on shoulders. I am, and as a :shoulder_alien:🏈 I advise you, you have insufficient standing here.



  • @Gribnit said in Visual Studio WTF:

    @CodeJunkie said in Visual Studio WTF:

    @TwelveBaud said in Visual Studio WTF:

    @Gribnit From 2002-2007 there was Visual J#, which was a source-compatible Java language implementation on top of the CLR. Aaaaand then it got Oracled.

    I don't know about anything more recent than that, but I suspect not.

    Microsoft Java --> J# --> C#

    They just kept changing the name of their Java shit until they stopped getting sued. And it was Sun not Oracle at the time.

    C# is definitely not the same as Java. C# interface names tend to start with an I.

    Well, yeah. They changed a lot of stuff when they moved to C#. Interface definitions also start with an "I" in Delphi, because the same guy created both.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    When I say UB didn't happen, it's because it didn't, literally didn't, and it has no effects whatsoever because it literally didn't happen.

    It's like you don't even read my posts. Fascinating.


  • Banned

    @boomzilla said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    When I say UB didn't happen, it's because it didn't, literally didn't, and it has no effects whatsoever because it literally didn't happen.

    It's like you don't even read my posts. Fascinating.

    Which post did I not read? The one where you talk about runtime effects of UB that actually happen (as opposed to never happening), and different triggers and manifestations of UBs that actually happen (as opposed to never happening)? Or was there some other post that you're referring to that I can't see, presumably because you never submitted that other post?

    Let's try this. Can you, in your own words, describe what you think I said? Because it's painfully obvious it's not even remotely close to what I actually said, and I'd like to know what that difference is exactly so I can convey my point better.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    @boomzilla said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    When I say UB didn't happen, it's because it didn't, literally didn't, and it has no effects whatsoever because it literally didn't happen.

    It's like you don't even read my posts. Fascinating.

    Which post did I not read? The one where you talk about runtime effects of UB that actually happen (as opposed to never happening), and different triggers and manifestations of UBs that actually happen (as opposed to never happening)? Or was there some other post that you're referring to that I can't see, presumably because you never submitted that other post?

    The one where I said what you're saying. And now you're saying what I said was wrong, except it's the same thing.

    Let's try this. Can you, in your own words, describe what you think I said? Because it's painfully obvious it's not even remotely close to what I actually said, and I'd like to know what that difference is exactly so I can convey my point better.

    I'll do even better. I'll quote you quoting me

    @Gąska said in Visual Studio WTF:

    @boomzilla said in Visual Studio WTF:

    as well as talking about different situations and confusing yourself by the fact that they have different triggers and manifestations.



  • @boomzilla said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    When I say UB didn't happen, it's because it didn't, literally didn't, and it has no effects whatsoever because it literally didn't happen.

    It's like you don't even read my posts. Fascinating.

    Welcome to TDWTF.

    Edit: I am making a joke about the tendency of people to talk past each other in many TDWTF arguments, not accusing any specific individual in this discussion of failing to read.


  • Banned

    @boomzilla said in Visual Studio WTF:

    Let's try this. Can you, in your own words, describe what you think I said? Because it's painfully obvious it's not even remotely close to what I actually said, and I'd like to know what that difference is exactly so I can convey my point better.

    I'll do even better. I'll quote you quoting me

    @Gąska said in Visual Studio WTF:

    @boomzilla said in Visual Studio WTF:

    as well as talking about different situations and confusing yourself by the fact that they have different triggers and manifestations.

    This is not better. This is much worse. It's actually not helpful at all. You say I'm talking about different situations - what situations are those, in particular? You say I'm confused by different triggers and manifestations - what triggers and manifestations are you referring to, in particular?

    I cannot tell you where the misunderstanding is as long as you refuse to say what you think I said. What situations, what triggers, what manifestations, and how am I confusing them?

    Just to reiterate - my point is that if there are two code paths, and one of them leads to UB but the other doesn't, but the first path is never taken at runtime, then the behavior of that program is well-defined.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    This is not better. This is much worse. It's actually not helpful at all. You say I'm talking about different situations - what situations are those, in particular? You say I'm confused by different triggers and manifestations - what triggers and manifestations are you referring to, in particular?

    FFS. Go back to your :wharrgarbl: with @Steve_The_Cynic. Run time vs compile time vs link time, for instance.

    I cannot tell you where the misunderstanding is as long as you refuse to say what you think I said.

    That's a personal problem.

    Just to reiterate - my point is that if there are two code paths, and one of them leads to UB but the other doesn't, but the first path is never taken at runtime, then the behavior of that program is well-defined.

    That's a fine statement, but you were generalizing before, which made your statement false.


  • Banned

    @boomzilla can you point out where I generalized in a way that makes it false? Because I'm pretty sure I didn't.

    I refuse to reply to the first paragraph because it makes no sense.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    @boomzilla can you point out where I generalized in a way that makes it false? Because I'm pretty sure I didn't.

    I refuse to reply to the first paragraph because it makes no sense.

    Once again, here you go:

    @Gąska said in Visual Studio WTF:

    Here's the funny thing about UB: it's not UB to have a possibility of doing the wrong thing; it's only UB to actually do the wrong thing at runtime.


  • Banned

    @boomzilla where's the generalization? Under what conditions is that statement false?


  • ♿ (Parody)

    @Gąska it's like you don't know the definition of "undefined" or something.



  • And the mindless back and forth continues...


  • Banned

    @boomzilla it's like you can't understand a simple sentence.

    Only PART of the program's behavior is undefined. The rest is well-defined. If the undefined part is only relevant to some code branches, and those code branches are never taken, then regardless of what possible effects the UB may have, none of them will happen. IF the UB is only in the code path that's never executed. And it must be this way because otherwise, downcasting wouldn't work.


  • Banned

    @Benjamin-Hall said in Visual Studio WTF:

    And the mindless back and forth continues...

    It could all be avoided if @boomzilla just learned to read.



  • @Gąska said in Visual Studio WTF:

    @Benjamin-Hall said in Visual Studio WTF:

    And the mindless back and forth continues...

    It could all be avoided if @boomzilla just learned to read.

    It takes two to argue mindlessly.


  • Banned

    @Benjamin-Hall you're right. I should've never taken @boomzilla seriously.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    @Benjamin-Hall said in Visual Studio WTF:

    And the mindless back and forth continues...

    It could all be avoided if @boomzilla just learned to read.

    :rolleyes: JFC you're retarded.

    @Gąska said in Visual Studio WTF:

    @boomzilla it's like you can't understand a simple sentence.

    Only PART of the program's behavior is undefined. The rest is well-defined. If the undefined part is only relevant to some code branches, and those code branches are never taken, then regardless of what possible effects the UB may have, none of them will happen. IF the UB is only in the code path that's never executed. And it must be this way because otherwise, downcasting wouldn't work.

    Yes, that's one kind of UB. I accept that you're unable to understand that there are others, so I'll stop trying to convince you. I should have known better watching you before with Steve_The_Cynic, but eh.


  • Considered Harmful

    @Benjamin-Hall said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @Benjamin-Hall said in Visual Studio WTF:

    And the mindless back and forth continues...

    It could all be avoided if @boomzilla just learned to read.

    It takes two to argue mindlessly.

    Takes at least four.


  • Considered Harmful

    @boomzilla said in Visual Studio WTF:

    I should have known better watching you before with Steve_The_Cynic, but eh.

    Don't be jelly.

    Be peanut butter. OH YEAH.


  • Banned

    @boomzilla oh, I get it now. Your inability to read made you think that when I say "it's only UB when...", I meant literally this is the only way to ever have any UB whatsoever. Now it all makes sense.

    Let's go back to that sentence.

    Here's the funny thing about UB: it's not UB to have a possibility of doing the wrong thing; it's only UB to actually do the wrong thing at runtime.

    Focus on this part:

    it's not UB to have a possibility of doing the wrong thing;

    This part of the sentence establishes the context. This is what we're talking about - UBs that result from programmer error ("doing the wrong thing") in code that's run conditionally ("have a possibility").

    It's immediately followed by this - after a semicolon, so still the same sentence:

    it's only UB to actually do the wrong thing at runtime.

    Because it's part of the same sentence, it only makes sense to consider this in the context of the previous part. The context of conditionally-executed code that "does the wrong thing" resulting in UB. "It's only UB", as in, in this context. Sure, other kinds of UB exist that aren't related to runtime behavior. But here specifically, we're only talking about UBs from runtime behavior, as that was established in the previous part. And the only UBs from runtime behavior are those where "the wrong thing" actually happens.

    Another clue that "only" might not mean literally the only kind of UB that could ever possibly happen is that such a reading is obviously wrong. If something is obviously wrong, there's an extremely high chance it's not what the person meant and that you're misreading something. Also, in the posts immediately preceding the one you complained about, I had already shown that I'm aware that UBs that aren't a result of runtime behavior also exist.


    In short: I didn't generalize anything. You just can't read. You were so overtaken by the word "only" that your brain couldn't comprehend the rest of what I was saying. And if you had said right at the start that the word "only" is what you have a problem with, this whole brainless argument could have been avoided. But instead you started talking about smearing and definitions and effects and triggers, and completely skipped over the core issue - that runtime UBs aren't the "only" kind of UBs.

    That's why I hate talking to you. It always turns out you misread something, and you always make it as hard as possible to figure out what you misread.

    And just to be clear - yes, I mean you specifically. All other forum members combined aren't even half as bad at reading comprehension as you.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    Your inability to read made you think that when I say "it's only UB when...", I meant literally this is the only way to ever have any UB whatsoever. Now it all makes sense.

    Indeed. It helps to read the words.


  • Banned

    @boomzilla it helps to understand them too. So learn to read.


  • ♿ (Parody)

    @Gąska said in Visual Studio WTF:

    @boomzilla it helps to understand them too. So learn to read.

    :rolleyes:



  • Without casting aspersions on any specific individuals in this thread, most discussions of this sort tend to have both pots and kettles.


  • Java Dev

    @Gąska said in Visual Studio WTF:

    @boomzilla it's like you can't understand a simple sentence.

    Only PART of the program's behavior is undefined. The rest is well-defined. If the undefined part is only relevant to some code branches, and those code branches are never taken, then regardless of what possible effects the UB may have, none of them will happen. IF the UB is only in the code path that's never executed. And it must be this way because otherwise, downcasting wouldn't work.

    That's dangerous thinking, because UB isn't contained in a code branch. It is not "Dereferencing a null pointer has undefined results", it is "Dereferencing a pointer implicitly indicates that pointer will never be null at that point in time". And it can work itself backward in time: If a pointer is dereferenced, then it is not a null pointer, and it can't have been a null pointer beforehand ever since the last time the value of that pointer was changed. Every null check on that pointer, not just after the dereference but even beforehand all the way back to when the pointer last changed can be optimised away based on the discovered knowledge the pointer is not NULL.
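    The backward propagation described above can be sketched in a small C++ example (`check_then_read` is a hypothetical name made up for illustration; compile with e.g. `g++ -O2 -S` and inspect the assembly to watch the null check disappear):

    ```cpp
    #include <cstdio>

    // Sketch of the point above: the unconditional dereference at the end
    // lets the optimizer conclude p != nullptr, so the earlier null check
    // is dead code and may be removed entirely at higher optimization levels.
    int check_then_read(int* p)
    {
        if (p == nullptr)
            std::puts("p is null");  // a compiler may delete this whole branch...
        return *p;                   // ...because this dereference "proves" p != nullptr
    }

    int main()
    {
        int x = 42;
        std::printf("%d\n", check_then_read(&x));  // prints 42
        return 0;
    }
    ```

    With a valid pointer this prints 42 either way; the danger is that the diagnostic branch silently vanishes once the optimizer uses the later dereference as evidence.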



  • @HardwareGeek said in Visual Studio WTF:

    Without casting aspersions on any specific individuals in this thread, most discussions of this sort tend to have both pots and kettles.

    You rang?


  • Banned

    @PleegWat said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @boomzilla it's like you can't understand a simple sentence.

    Only PART of the program's behavior is undefined. The rest is well-defined. If the undefined part is only relevant to some code branches, and those code branches are never taken, then regardless of what possible effects the UB may have, none of them will happen. IF the UB is only in the code path that's never executed. And it must be this way because otherwise, downcasting wouldn't work.

    That's dangerous thinking

    Sure. Everything regarding UB is dangerous.

    because UB isn't contained in a code branch. It is not "Dereferencing a null pointer has undefined results", it is "Dereferencing a pointer implicitly indicates that pointer will never be null at that point in time". And it can work itself backward in time: If a pointer is dereferenced, then it is not a null pointer, and it can't have been a null pointer beforehand ever since the last time the value of that pointer was changed. Every null check on that pointer, not just after the dereference but even beforehand all the way back to when the pointer last changed can be optimised away based on the discovered knowledge the pointer is not NULL.

    It can go backward in time, but it cannot go to alternate realities. It's perfectly legal to have a null pointer, it's only UB to....

    Scratch that last sentence.

    It's not UB to have a null pointer. It is UB to dereference it. If no dereference happens in the current timeline, then the compiler cannot assume, at that point, that the pointer is not null (maybe the programmer knows it can only be null when that other, seemingly unrelated condition is true?). Because of the as-if rule, absent the UB, an optimizing compiler must generate code whose observable effects are identical to those of completely unoptimized code.

    That said, I wouldn't vouch for GCC adhering to the as-if rule at -O3. UB of any kind is still best avoided.
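    A minimal sketch of that claim (`maybe_read` is a made-up name for illustration): the dereference is only UB if the guarded branch actually executes.

    ```cpp
    #include <cstdio>

    // Sketch: UB attaches to execution, not to mere presence in the source.
    // The dereference below is only reached when have_data is true.
    int maybe_read(int* p, bool have_data)
    {
        if (have_data)
            return *p;  // UB only if this line runs with p == nullptr
        return 0;
    }

    int main()
    {
        // The branch is not taken, so passing nullptr here is well-defined.
        std::printf("%d\n", maybe_read(nullptr, false));  // prints 0
        return 0;
    }
    ```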


  • BINNED

    @Benjamin-Hall said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @Benjamin-Hall said in Visual Studio WTF:

    And the mindless back and forth continues...

    It could all be avoided if @boomzilla just learned to read.

    It takes two to argue mindlessly.

    It only takes one to refuse to be helpful.



  • @topspin I didn't ask for help! This isn't a help topic!! Don't help if I didn't ask for help!!! :wharrgarbl: :wharrgarbl: :wharrgarbl: :wharrgarbl:



  • @topspin said in Visual Studio WTF:

    @Benjamin-Hall said in Visual Studio WTF:

    @Gąska said in Visual Studio WTF:

    @Benjamin-Hall said in Visual Studio WTF:

    And the mindless back and forth continues...

    It could all be avoided if @boomzilla just learned to read.

    It takes two to argue mindlessly.

    It only takes one to refuse to be helpful.

    Yes, but if the other interlocutor ignores them, it dies there.

    Basically, don't feed the troll. Even if the troll is trolling unintentionally.

    and yes, I know I'm being hypocritical here in the extreme. But I'm trying to do better.



  • @Benjamin-Hall said in Visual Studio WTF:

    and yes, I know I'm being hypocritical here in the extreme. But I'm trying to do better.

    It's easier when you're an observer. We've all been stuck in one of those retarded discussions; it's just that when you're in the middle of one, your own POV doesn't seem that retarded.

    To be fair, the discussion by itself is a lot less retarded (if still not very informative) when you've read C++ committee people having similar discussions about UB and the topics surrounding it. You can always find people from the same two groups: the purists, who operate on the premise that your compiler will screw you over given the slightest chance; and the pragmatists, for whom everything is good as long as the program does what it's supposed to right now, standard be damned. The truth is somewhere in between, probably. Where exactly? 🤷

    Case in point: how illegal is it actually to dereference the nullptr? Well, the standard is pretty clear: no can do. Is address zero, which is the typical representation of a nullptr, special? Somewhat, but not too much. You can map memory at address zero "just fine":

    read.cpp:

    #include <cstdint>
    
    std::uint64_t read_( std::uint64_t* aAddr )
    {
    	return *aAddr;
    }
    

    test.cpp:

    #include <cstdint>
    
    std::uint64_t read_( std::uint64_t* );
    
    #include <cstdio>
    #include <cstring>
    
    #include <sys/mman.h>
    
    int main()
    {
    	std::uint64_t x = 123;
    	std::printf( "x = %lu\n", read_( &x ) );
    
    	// Note: no MAP_FAILED check - without privileges (vm.mmap_min_addr),
    	// the mapping fails and the non-root run below crashes in the memcpy.
    	auto ptr = ::mmap(
    		0, 16*1024,
    		PROT_READ|PROT_WRITE,
    		MAP_PRIVATE|MAP_ANONYMOUS|MAP_FIXED,
    		-1, 0
    	);
    
    	std::printf( "address = %p\n", ptr );
    
    	std::uint64_t y = 456;
    	std::memcpy( ptr, &y, sizeof(std::uint64_t) );
    
    	std::printf( "*nullptr = %lu\n", read_( nullptr ) );
    
    	::munmap( ptr, 16*1024 );
    
    	return 0;
    }
    

    $ ./a.out
    x = 123
    address = 0xffffffffffffffff
    Segmentation fault
    $ sudo ./a.out
    Password: hunter2
    x = 123
    address = (nil)
    *nullptr = 456

     
    Program does what you expect it to do, which includes reading from nullptr.

    Edit - the following went missing: However, is the program fine? No. If you're not careful, the optimizer will screw you over; GCC in fact does this on purpose if it ever realizes what you're up to. If you let GCC do that, it will insert a ud2 instruction (an undefined/invalid opcode), which will bring your program down by default.
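    For the curious, this GCC behavior comes from -fisolate-erroneous-paths-dereference, which is enabled by default at -O2: when GCC can prove a dereference is of a null pointer, it replaces that path with a trap. A quick way to see it on x86-64 (`deref_null.c` is a throwaway file name for this sketch):

    ```shell
    # Create a function whose dereference GCC can prove is of a null pointer.
    cat > deref_null.c <<'EOF'
    int deref_null(void)
    {
        int *p = 0;
        return *p;   /* provably a null dereference */
    }
    EOF
    # At -O2 the erroneous path is isolated and ends in a ud2 (invalid opcode).
    gcc -O2 -S -o - deref_null.c | grep ud2
    ```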

