C++ Stockholm Syndrome


  • Garbage Person

    @ben_lubar Yeah, but defer only fires on return from the surrounding function, not the block. Makes them particularly annoying to use in loops.

    And (annoyingly) takes two lines, whereas std::lock_guard only takes one.

    But then Go is good at being annoying.



  • @greybeard said in C++ Stockholm Syndrome:

    @ben_lubar Yeah, but defer only fires on return from the surrounding function, not the block. Makes them particularly annoying to use in loops.

    Sounds like you need to refactor.

    @greybeard said in C++ Stockholm Syndrome:

    And (annoyingly) takes two lines, whereas std::lock_guard only takes one.

    Yes, but std::lock_guard doesn't work with all function calls, just ones that implement the functions it assumes a lock should have. It's a trade-off. It takes two lines to wrap some code in a mutex, but you don't need to make an entirely new class for every different function you want to call that way.
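
    For illustration, a rough sketch of both sides of that trade-off (scope_exit here is hand-rolled for the example, not a standard type):

    #include <mutex>
    #include <utility>
    
    std::mutex m;
    
    // Minimal hand-rolled scope guard: runs any callable when the scope ends.
    template <typename F>
    struct scope_exit
    {
        F f;
        explicit scope_exit(F fn) : f(std::move(fn)) {}
        ~scope_exit() { f(); }
    };
    
    void with_lock_guard()
    {
        std::lock_guard<std::mutex> lock(m); // one line, but needs lock()/unlock()
        // ... critical section ...
    }
    
    void with_generic_guard()
    {
        m.lock();                              // two lines, like defer,
        scope_exit guard([&] { m.unlock(); }); // but works for any call
        // ... critical section ...
    }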


  • Discourse touched me in a no-no place

    @pie_flavor said in C++ Stockholm Syndrome:

    JVM bytecode is just enough to implement Java in.

    Not any more; not for quite a few years now. That's still the enormous majority of the JVM bytecode, but that's arguably because a lot of that's general processing operations which would be there for pretty much any sane language.


  • Discourse touched me in a no-no place

    @ben_lubar From this discussion, I've learned that you're not the best at C# programming but better at Go, and @masonwheeler is better at C# than Go. Which is kind of OK, but not very useful to the rest of us. ;)

    Deadlocks are a mark of the program being wrong. Not (usually) the library.



  • @dkf said in C++ Stockholm Syndrome:

    @ben_lubar From this discussion, I've learned that you're not the best at C# programming but better at Go, and @masonwheeler is better at C# than Go. Which is kind of OK, but not very useful to the rest of us. ;)

    Deadlocks are a mark of the program being wrong. Not (usually) the library.

    Generally, the deadlocks in C# I've run into have all followed the pattern of "multi-layered synchronous wait on task while there is an active HttpContext".


  • Discourse touched me in a no-no place

    @ben_lubar said in C++ Stockholm Syndrome:

    Generally, the deadlocks in C# I've run into have all followed the pattern of "multi-layered synchronous wait on task while there is an active HttpContext".

    IOW, bugs in the program. ;)

    Doing that sort of thing right is possible, but not exactly easy as the primitives aren't quite suitable in C# and add quite a lot of boilerplate.



  • @pie_flavor said in C++ Stockholm Syndrome:

    @pie_flavor said in C++ Stockholm Syndrome:

    @lb_ Well, Rust doesn't have anything like that, but new features are added every so often and I wouldn't be surprised to see it eventually. Right now conditional compilation is limited basically to crate features and a few global flags like operating system or pointer width.

    Actually, never mind, I think you can do exactly that.

    <snip />

    This looks a lot like a C++-style template specialization, i.e., what you needed to do before if constexpr existed.



  • @masonwheeler said in C++ Stockholm Syndrome:

    In Boo, a macro definition (or a meta method or an AST attribute--different metaprogramming constructs for different use cases) is ordinary Boo code that gets compiled and emitted in the output as ordinary classes and methods. Then when you use it, you reference the assembly containing the metaprogramming code, and the compiler loads it and executes it. Metaprogramming is just code that operates on the compiler's ASTs rather than on runtime data. (As a side benefit, this also means that you can attach the debugger to the compiler and set a breakpoint inside your metaprogramming code and see exactly what's going on, a thing that's simply not possible in C++.)

    👍 Sounds like the right way to go about that.

    I'm hoping that C++ will continue to move in that direction (the constexpr stuff that started in C++11 and has been built on ever since is a start). Some of the stuff in the meta-classes proposal also seems to move towards that, i.e., where you can write compile-time code that simply interacts with the compiler via a pre-defined API. I hope that takes off, but it's going to be a while. :-/

    I'm less optimistic about being able to load and run code the way you describe, though. The module stuff might introduce a pre-compiled format for compile-time stuff, but I'm not familiar enough with that part.

    One thing I do like about the C++ templates, though, is the very terse syntax you get for doing simple meta-programming stuff. All other meta-programming environments that use the "normal" language have made that much more painful.


  • Considered Harmful

    @cvi 🤷 There was a use-case and I gave a solution. Compile-time resolution of an if-statement seems much less cromulent than simply making separate methods. I'd like to see a use-case where specialization doesn't cut the mustard.



  • @pie_flavor As said, specialization is what C++ had before, so it's the one I'm still more familiar with. There are a few painful cases for specialization, though (expanding on @LB_'s example):

    #include <iterator>
    #include <type_traits>
    
    template<typename Iterator, typename Other>
    auto some_generic_func(Iterator begin, Iterator end, Other dest)
    {
        // pile of code
    
        if constexpr(std::is_same_v<typename std::iterator_traits<Iterator>::iterator_category, std::random_access_iterator_tag>)
        {
            //optimized for random-access iterators
        }
        else
        {
            //generic 
        }
    
        // pile of code here
    
        if constexpr(std::is_same_v<typename std::iterator_traits<Other>::iterator_category, std::bidirectional_iterator_tag>)
        {
            //optimized for bidirectional iterators
        }
        else
        {
            //..
        }
    
        // pile of code
    }
    

    Essentially, you'd need to factor out the if constexpr parts into separate functions, and specialize those and then call them in the main function. Or, you would have to provide four specializations of the method (Iterator is/is not random access and Other is/is not bidirectional) that all repeat the various piles of code.

    Unless you factor out things into separate functions, the approach scales rather badly with the number of conditions. And factoring stuff out into separate functions isn't that convenient either and has the potential to create a lot of spaghetti.
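
    To make that factoring concrete, here's roughly what one extracted helper looks like with the pre-if constexpr approach, i.e., tag dispatch via overloads (advance_step is a made-up name for the example):

    #include <iterator>
    
    // One extracted helper per compile-time branch, selected by iterator category.
    template<typename Iterator>
    void advance_step(Iterator& it, std::random_access_iterator_tag)
    {
        // optimized for random-access iterators
        it += 1;
    }
    
    template<typename Iterator>
    void advance_step(Iterator& it, std::input_iterator_tag)
    {
        // generic fallback
        ++it;
    }
    
    template<typename Iterator, typename Other>
    auto some_generic_func(Iterator begin, Iterator end, Other dest)
    {
        // pile of code
    
        advance_step(begin,
            typename std::iterator_traits<Iterator>::iterator_category{});
    
        // pile of code; every further compile-time branch needs its own helper pair
    }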


  • Impossible Mission - B

    @cvi said in C++ Stockholm Syndrome:

    I'm less optimistic about being able to load and run code the way you describe, though. The module stuff might introduce a pre-compiled format for compile-time stuff, but I'm not familiar enough with that part.

    That would be tricky, as in order to do that, your code needs an externally visible definition of the compiler's internal AST format. Is any C++ compiler going to want to do that?

    One thing I do like about the C++ templates, though, is the very terse syntax you get for doing simple meta-programming stuff. All other meta-programming environments that use the "normal" language have made that much more painful.

    The classical example of C++ template metaprogramming is the generation of arbitrary Fibonacci numbers at compile time. The first Google result for "C++ template fibonacci number" gives this example, which is rather typical:

    // Calculate the value passed as T
    template <int T>
    struct Fibonacci
    {
        enum { value = (Fibonacci<T - 1>::value + Fibonacci<T - 2>::value) };
    };
    
    // In the template meta-programming, we do not have conditionals, so instead
    // we use struct-overloading like mechanism to set constraints, we do this for
    // numbers 0, 1 and 2, just like our algorithm in the function above.
    template <>
    struct Fibonacci<0>
    {
        enum { value = 1 };
    };
    
    template <>
    struct Fibonacci<1>
    {
        enum { value = 1 };
    };
    
    template <>
    struct Fibonacci<2>
    {
        enum { value = 1 };
    };
    

    Note the comment:

    In the template meta-programming, we do not have conditionals, so instead we use struct-overloading like mechanism to set constraints, we do this for numbers 0, 1 and 2, just like our algorithm in the function above.

    Not only does this make the result look really weird, it also makes the compilation take massively longer than it should, because using naive recursion runs in exponential time (ie O(2^N), which is far worse than O(N^2)).

    Here's an example of how to do the same thing with Boo metaprogramming, where the entire language is available to you and so you can do it with iterative generation, which runs in linear (O(N)) time:

    import System.Linq.Enumerable
    
    [Meta]
    def Fibonacci(value as IntegerLiteralExpression):
       var index = value.Value //unwrap the AST object and retrieve its integer value
       var result = FibonacciSequence().Skip(index).First()
       return IntegerLiteralExpression(value.LexicalInfo, result)
    
    def FibonacciSequence() as long*: //the * means "IEnumerable of" in Boo
       first as long = 0
       second as long = 1
       while true:
          yield first
          yield second
          first += second
          second += first
    

    Much faster at compile-time, and all in 14 lines of code. The C++ template version is 12 lines of code, not counting braces and comments, 20 if you do count the braces, which probably should be counted because they're a syntactical necessity. Either way, the Boo version is at a comparable level of terseness and does an objectively better job of achieving the correct result.


  • Considered Harmful

    @cvi It would be a pile of spaghetti, if Rust didn't have super-powerful macros. I'll bet you a macro if_types_equal! could be written that did the whole specialization thing on one line.
    The primary reason I don't like stuff like if constexpr is because C++ really blurs the line between compile-time code and runtime code. For example, its templates are just that - templates for compilation expansion. They may as well be a different language which compiles down to C++. Rust generics have the same benefits, but they actually get treated like language elements instead of compiler elements. Specialization is clearly part of the language - different types get treated differently. However, if constexpr is a language element that performs conditional compilation - more line-blurring. Rust aggressively optimizes in release mode, so it doesn't need an if constexpr construct at all - an if statement with a compile-time true condition gets removed, a function call returning a value gets collapsed, and a function call that returns another function call directly gets collapsed. All of this is the compiler's job. The language's job is to find a way to, in this instance, check generic types, which it has.



  • @pie_flavor said in C++ Stockholm Syndrome:

    C++ really blurs the line between compile-time code and runtime code.

    And what about the awesome-looking Boo that @masonwheeler posted? Is that line more or less blurry?

    In reality nobody cares, we learn what can be a constant expression and what can only be evaluated at runtime based on language syntax and keywords, and if we really need the former rather than the latter there are easy ways to enforce that or else get a compile error.

    @pie_flavor said in C++ Stockholm Syndrome:

    All of this is the compiler's job.

    The compiler's job is not to make my code more readable and more maintainable. That's the language's job. You're right though that optimization is the compiler's job, but why not make that job easier when the language makes it so convenient to do so?



  • @masonwheeler said in C++ Stockholm Syndrome:

    Exactly. When all you have is a hammer (RAII), everything starts to look like a nail (a class with its own constructor and destructor). Add that into the example and it looks a lot less simple and clean.

    Let's have a comparison within the same hypothetical language. Which of these looks cleaner to you?

    {
        scoped_change frobbed {[&]{ MyWidget.Frob(); }, [&]{ MyWidget.UnFrob(); }};
        DoStuffWith(MyWidget);
    }
    
    {
        MyWidget.Frob();
        try
        {
            DoStuffWith(MyWidget);
        }
        finally
        {
            MyWidget.UnFrob();
        }
    }
    

    To me, the main difference is the amount of space between Frob and UnFrob and the potential for them to be misconstrued as unrelated, or to forget that they need to be paired. In the first example, it's pretty clear that they are related, and you only have to add or remove a single line of code to have the widget in the Frobbed state or not. In the second example, there could be a lot more code involved and a lot more states than just Frobbed, and it would be easy to forget an UnFrob here or a Frob there, or to mix up the order if the order matters. The syntax is less important, and while the syntax of the second example looks better, I'll take the improved maintainability of the first example instead.



  • @pie_flavor said in C++ Stockholm Syndrome:

    Rust aggressively optimizes in release mode, so it doesn't need an if constexpr construct at all - an if statement with a compile-time true condition gets removed [...]

    There's an important difference between if constexpr and if. The if constexpr version discards the branch that isn't taken earlier, because it may contain invalid code. Example:

    template< typename T > int f( T x )
    {
        if constexpr( std::is_pointer_v<T> ) { return *x; }
        else { return x; }
    }
    

    This will compile even when instantiated with T = int, despite *x not being defined for int types. Without the "constexpr" the code would be invalid.

    Edit:

    Rust generics have the same benefits, but they actually get treated like language elements instead of compiler elements.

    I don't get what you mean by that. Do you have an example/clarification?



  • @masonwheeler said in C++ Stockholm Syndrome:

    That would be tricky, as in order to do that, your code needs an externally visible definition of the compiler's internal AST format. Is any C++ compiler going to want to do that?

    As mentioned, I haven't been following the modules proposals too closely (I read one of the early versions, but got Fortran-Module PTSD).

    AFAIK the idea is somewhat related to precompiled headers but with more explicit interfaces (so that internal stuff can be hidden more, and I think the goal is also to prevent macros from leaking everywhere). From what I know you "compile" a header (set of headers?) into a module. I don't think that the format of the module is specified, rather that's left to the various implementers/compilers. The modules are for sure not required to be portable between compilers.

    Maybe somebody else here has more up-to-date and detailed information on the module stuff?

    The classical example of C++ template metaprogramming is the generation of arbitrary Fibonacci numbers at compile time. The first Google result for "C++ template fibonacci number" gives this example, which is rather typical:

    TBF ... if you just want a compile-time fibonacci number in C++14 and later:

    constexpr unsigned long fib( unsigned long x )
    {
    	if( 0 == x ) return 0;
    
    	auto prev = 0ul, cur = 1ul;
    	for( auto i = 1ul; i < x; ++i )
    	{
    		auto next = prev + cur;
    		prev = cur;
    		cur = next;
    	}
    	
    	return cur;
    }
    
    // Usable in compile-time only contexts:
    using Number = std::integral_constant< unsigned long, fib(27) >;
    
    // But also in run-time only ones
    auto f( unsigned long x ) { return fib(x); }
    

    Edit:

    Not only does this make the result look really weird, it also makes the compilation take massively longer than it should, because using naive recursion runs in exponential time (ie O(2^N), which is far worse than O(N^2)).

    Most compilers will use memoization, so each Fibonacci<N> is only evaluated once. That lowers the complexity quite significantly.

    Edit2:
    I'll have to read up on the IntegerLiteralExpression. If that does what I think it does, it would actually be one aspect that I'm missing in C++ -- currently there is no way to distinguish an expression that can be evaluated at compile time from one that is purely runtime. So, the expression 5 is just an int the same way integerVariableJustReadFromTheUser is.


  • Impossible Mission - B

    @cvi said in C++ Stockholm Syndrome:

    AFAIK the idea is somewhat related to precompiled headers but with more explicit interfaces (so that internal stuff can be hidden more, and I think the goal is also to prevent macros from leaking everywhere). From what I know you "compile" a header (set of headers?) into a module. I don't think that the format of the module is specified, rather that's left to the various implementers/compilers. The modules are for sure not required to be portable between compilers.

    Yeah, that's about how I figured it would have to work.

    TBF ... if you just want a compile-time fibonacci number in C++14 and later:

    constexpr unsigned long fib( unsigned long x )
    {
    	if( 0 == x ) return 0;
    
    	auto prev = 0ul, cur = 1ul;
    	for( auto i = 1ul; i < x; ++i )
    	{
    		auto next = prev + cur;
    		prev = cur;
    		cur = next;
    	}
    	
    	return cur;
    }
    
    // Usable in compile-time only contexts:
    using Number = std::integral_constant< unsigned long, fib(27) >;
    
    // But also in run-time only ones
    auto f( unsigned long x ) { return fib(x); }
    

    So the new C++ standard has a way to call C++ code from Template code under certain conditions? If so, that's a serious improvement.

    Most compilers will use memoization, so each Fibonacci<N> is only evaluated once. That lowers the complexity quite significantly.

    Huh. TIL.

    Edit2:
    I'll have to read up on the IntegerLiteralExpression. If that does what I think it does, it would actually be one aspect that I'm missing in C++ -- currently there is no way to distinguish an expression that can be evaluated at compile time from one that is purely runtime. So, the expression 5 is just an int the same way integerVariableJustReadFromTheUser is.

    IntegerLiteralExpression is the class for an AST node that holds an integer literal. It consists of a Value property of type long (aka int64) that holds the actual integer literal, an IsLong property of type bool (false for int, true for long), and a bunch of plumbing related to the way the compiler interacts with ASTs. You can find the source code to it, and the rest of the AST node types, here. Like most node types, it's defined as a partial class, divided between the Ast folder (manually written information such as constructors) and the Ast/Impl folder (auto-generated code extrapolated from the AST model.)

    The entire AST definition is right there for anyone to see, which is basically the only way this style of metaprogramming is possible. About 90% of what you need to know is in the AST model file.



  • @masonwheeler said in C++ Stockholm Syndrome:

    So the new C++ standard has a way to call C++ code from Template code under certain conditions? If so, that's a serious improvement.

    Yeah ... essentially, you can call functions that are constexpr from template code (and other compile-time contexts). Originally, in C++11 constexpr was quite restrictive (single return statement only IIRC). C++14 relaxed that significantly -- you can pretty much make classes constexpr by declaring the relevant member functions constexpr.

    Example
    struct Int2
    {
    	int x, y;
    
    	constexpr Int2( int aX, int aY ) : x(aX), y(aY) {}
    	constexpr Int2 operator* (int f) {
    		return Int2{ x * f, y * f };
    	}
    };
    
    constexpr int dot( Int2 a, Int2 b ) {
    	return a.x*b.x + a.y*b.y;
    }
    
    #include <type_traits>
    using Number = std::integral_constant< int, dot( Int2{1,2}*5, Int2{3,4} ) >;
    

    There's another somewhat interesting feature in the new "relaxed" constexpr: you can have branches that lead to non-constexpr code in the constexpr function. It's callable in a constexpr context if you don't hit those branches; if you do, you'll get a compile time error. For example, throwing exceptions isn't allowed in constexpr code, but the following is valid:

    constexpr auto f( int x ) { if( x< 0 ) throw SomeError{}; return x*5; }
    
    using A = std::integral_constant< int, f(55) >; // OK
    using B = std::integral_constant< int, f(-3) >; // Compile time error
    

    In C++-time all of this is still a fairly new feature. So, a lot of standard functions that could be constexpr aren't (and some that should be, can't, because of backward compatibility -- almost all of math.h for example). Also, at this point, you can make so much of the inlined code constexpr that it almost seems like constexpr should have been the default. :-/

    The entire AST definition is right there for anyone to see, which is basically the only way this style of metaprogramming is possible. About 90% of what you need to know is in the AST model file.

    Again, I hope C++ goes in this direction eventually. The meta-classes proposal has some baby-steps w.r.t. that, which is one of the reasons why I'm excited by that (but it's still a long way off).



  • ...



  • @cvi said in C++ Stockholm Syndrome:

    Also, at this point, you can make so much of the inlined code constexpr that it almost seems like constexpr should have been the default.

    Aye - in new languages, the defaults should always be constant expression/evaluation, immutable, read only, etc. unless explicitly marked otherwise. The problem is that a lot of new languages either bake immutability into the type (I'm not a fan of Boo's immutable strings, for instance) or they otherwise don't have any form of const-correctness like C and C++ do.


  • Discourse touched me in a no-no place

    @lb_ said in C++ Stockholm Syndrome:

    immutable strings

    As long as they've also got language access to a fast buffer type for when building strings from many pieces, actual immutable strings aren't a great problem. (They've certainly got library access; C# has such a thing in its basic standard library and they're targeting the same VM model.) It'd be nice if compilers were better about detecting opportunities to keep things in builder buffers longer so that immutability only kicks in when you start looking for it, but I guess that's not a common optimisation. It'd still help a lot of not-very-well-written code, and that's a large fraction of what's out there.


  • Banned

    @masonwheeler said in C++ Stockholm Syndrome:

    @ben_lubar said in C++ Stockholm Syndrome:

    @masonwheeler said in C++ Stockholm Syndrome:

    @topspin said in C++ Stockholm Syndrome:

    Got a specific example where this is horrible so we can make sure to talk about the same thing?

    You run into it all the time in GUI programming. Stuff that looks like:

    MyWidget.Frob();
    try:
       DoStuffWith(MyWidget);
    finally:
       MyWidget.UnFrob();
    

    Where Frob represents temporarily placing it in a specific state that should be reversed once you're done with whatever you're doing.

    That has nothing to do with resource acquisition or destruction, but you still want it to be exception-safe so that if a recoverable exception gets thrown, it doesn't leave your UI in an inconsistent state.

    std::frob frobbed(MyWidget);
    DoStuffWith(MyWidget);
    // compiler calls frobbed.~frob() when scope ends.
    

    Exactly. When all you have is a hammer (RAII), everything starts to look like a nail (a class with its own constructor and destructor). Add that into the example and it looks a lot less simple and clean.

    When all you have is this tool that can do everything you want and doesn't look fugly, I don't see how that's a problem.

    // Runs the stored callable when the scope ends, like a finally block.
    template<typename F>
    class finally
    {
    public:
        explicit finally(F f): f(f) {}
        ~finally() { f(); }
    private:
        F f;
    };
    
    void exampleUsage()
    {
        myWidget.frob();
        auto f = finally{[&]{ myWidget.unfrob(); }};
    
        DoStuffWith(myWidget);
    }
    

    This emulates 100% of the functionality of finally and works in every case. Yes, it's kinda hackish, and you get finally before try - but it's a totally viable solution, and no worse than every other piece of code written in C++.


  • Banned

    @lb_ said in C++ Stockholm Syndrome:

    @gąska Ah, that's good to hear. I'm especially happy Rust is getting coroutines. Are there any notable items on my wishlist Rust doesn't satisfy off the top of your head? I'm not concerned about the borrow checker, I can learn a new way to think and write code if it means being closer to my ideal language.

    Looking at your list, the only missing thing is exceptions (Rust uses error-returning via Result instead, and they made it very powerful; almost as easy and effortless as exceptions, but you cannot ignore them, and you cannot not catch them). Also, while it does have powerful generics and metaprogramming, there's no feature parity between it and C++ - there are some pieces still missing, there are some pieces that will never get through, but there are also some things you can do in Rust that you can't do in C++. Also, traits work for both static and dynamic polymorphism by default, without any additional code needed. Not commenting on "clean syntax" since it's very subjective (for example, I find Rust syntax much cleaner than Scala).

    @pie_flavor said in C++ Stockholm Syndrome:

    @pie_flavor said in C++ Stockholm Syndrome:

    @lb_ Well, Rust doesn't have anything like that, but new features are added every so often and I wouldn't be surprised to see it eventually. Right now conditional compilation is limited basically to crate features and a few global flags like operating system or pointer width.

    Actually, never mind, I think you can do exactly that.

    On nightly. That's a kinda very important detail.


  • Banned

    Random thought, but on topic: @masonwheeler criticizes C++ templates for being Turing-complete, but praises the Turing-complete language Boo for a metaprogramming model where the compiler can call arbitrary Boo code to generate code.


  • Impossible Mission - B

    @gąska I criticize C++ templates for being Turing-complete by accident. It was never designed that way; template metaprogramming was a discovery of something that wasn't meant to exist. So is it any surprise that it's such a mess to work with?


  • Banned

    @masonwheeler I still don't see how being Turing complete, whether by accident or not, is a problem. They are a mess, but they would be a mess even without Turing completeness (though they would be a different kind of mess).

    Fun fact: CSS is Turing complete by accident, too.


  • BINNED

    @cvi said in C++ Stockholm Syndrome:

    could be constexpr aren't (and some that should be, can't, because of backward compatibility -- almost all of math.h for example).

    I don't understand why that is. Adding a constexpr qualifier doesn't change the ABI or observable behavior, I think?

    Actually, just the other day I discovered that gcc and clang do constant folding of expressions involving sqrt and log and the like. There was this ten line function full of complicated constant expressions for what boiled down to a simple quadratic.
    topspin: "Why would you do that, that's got to be slow. The optimizer is pretty powerful but I doubt it can do... oh, wow, look at this".

    I read up on it and there were some comments about how this is non-conforming, but I'm not sure if that means the library can't guarantee it by adding a constexpr annotation or if it isn't allowed to do the constant folding at all. I think (if those comments were correct) it must be the former, since the latter should at least fall under the as-if rule.
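
    To illustrate the distinction (a minimal sketch, not tied to any particular compiler version):

    #include <cmath>
    
    // std::sqrt isn't declared constexpr by the standard, so this would not compile
    // if uncommented (though some compilers accept it as a non-conforming extension):
    // constexpr double r = std::sqrt(2.0);
    
    // This is fine, and gcc/clang at -O2 will typically fold the whole expression
    // to a single constant anyway, which is allowed under the as-if rule.
    double quadratic_root()
    {
        return (-3.0 + std::sqrt(3.0 * 3.0 - 4.0 * 1.0 * 2.0)) / (2.0 * 1.0);
    }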


  • Considered Harmful

    @lb_ said in C++ Stockholm Syndrome:

    @pie_flavor said in C++ Stockholm Syndrome:

    C++ really blurs the line between compile-time code and runtime code.

    And what about the awesome-looking Boo that @masonwheeler posted? Is that line more or less blurry?

    In reality nobody cares, we learn what can be a constant expression and what can only be evaluated at runtime based on language syntax and keywords, and if we really need the former rather than the latter there are easy ways to enforce that or else get a compile error.

    Metaprogramming is different. You're still manipulating data; there's nothing that couldn't be written without metaprogramming, metaprogramming just makes it easier. But language constructs like if constexpr are not fancy macros. They are specific compiler instructions, with performance as their only reason for inclusion, which means that the job of optimizing the code is being placed in your hands.

    @pie_flavor said in C++ Stockholm Syndrome:

    All of this is the compiler's job.

    The compiler's job is not to make my code more readable and more maintainable. That's the language's job. You're right though that optimization is the compiler's job, but why not make that job easier when the language makes it so convenient to do so?

    if constexpr does not make your code more readable and more maintainable than if. You don't care about making the compiler's job 'easier', it's not a human. If a compile-time expression is in the if statement, it will be optimized. There is no reason whatsoever for there to need to be an additional programming construct to tell the compiler to optimize it.


  • Considered Harmful

    @cvi said in C++ Stockholm Syndrome:

    @pie_flavor said in C++ Stockholm Syndrome:

    Rust aggressively optimizes in release mode, so it doesn't need an if constexpr construct at all - an if statement with a compile-time true condition gets removed [...]

    There's an important difference between if constexpr and if. The if constexpr version discards the branch that isn't taken earlier, because it may contain invalid code. Example:

    template< typename T > int f( T x )
    {
        if constexpr( std::is_pointer_v<T> ) { return *x; }
        else { return x; }
    }
    

    This will compile even when instantiated with T = int, despite *x not being defined for int types. Without the "constexpr" the code would be invalid.

    That goes too far over the line of confusing a build script with a program for my tastes. I'd rather do specialization, no matter how clean the resulting code wouldn't be.

    Edit:

    Rust generics have the same benefits, but they actually get treated like language elements instead of compiler elements.

    I don't get what you mean by that. Do you have an example/clarification?

    When you have a C++ template, it is essentially part of a build script. It's something that isn't C++ but compiles to C++ while building. And as such you can perform all sorts of crazy shit inside a template that wouldn't compile if this weren't the pattern. For example, a template function that calls an arbitrary function on an object, and doesn't error because you only ever call the function with types that have that function. In Rust, you're still compiling a new copy of the function for each type it's used with, but that's an implementation detail. It's still treated as existing written code, not as code in a template that will get pasted in at a particular point. You can still do crazy stuff like that, but that requires macros.
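
    A minimal sketch of what I mean (frob() is a made-up member function):

    // The template body calls frob() on whatever it's given; nothing checks that the
    // member exists until the template is instantiated with a concrete type.
    template<typename T>
    void frob_twice(T& thing)
    {
        thing.frob();
        thing.frob();
    }
    
    struct Frobbable { void frob() {} };
    struct Plain {};
    
    int main()
    {
        Frobbable f;
        frob_twice(f);    // fine: Frobbable has frob()
    
        Plain p;
        // frob_twice(p); // only this instantiation would fail to compile
        (void)p;
    }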


  • Banned

    @topspin said in C++ Stockholm Syndrome:

    I read up on it and there were some comments about how this is non-conforming

    C++ doesn't require a conforming implementation to implement everything exactly as it says in the spec. It just requires that the resulting program's behavior is equivalent to what the spec prescribes. A subtle but important difference.



  • @topspin said in C++ Stockholm Syndrome:

    I don't understand why that is. Adding a constexpr qualifier doesn't change the ABI or observable behavior, I think?

    A lot of the functions from math.h (and consequently cmath) set errno (see here), preventing them from being constexpr.

    It's a rather unfortunate consequence of their original definitions (that admittedly pre-date the C++11 constexpr by quite some time, so it's hard to blame the original specification for this problem).

    It is a rather common question/complaint about the current state of affairs, and I'm pretty sure I've seen at least one call/proposal for a modern math library, and there are a few attempts at coding such a library as well. (Apparently there are some people who rely on checking errno in their code; personally, 99+% of the code that I've seen doesn't do this. A lot of people don't even know that this is a thing, and I'm counting quite a few experienced C++ numerics people into that group.)
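
    For anyone who hasn't run into it, a small sketch of the errno behaviour (whether errno actually gets set depends on math_errhandling):

    #include <cerrno>
    #include <cmath>
    #include <cstdio>
    
    int main()
    {
        errno = 0;
        double r = std::sqrt(-1.0); // domain error: returns NaN...
        // ...and, if math_errhandling includes MATH_ERRNO, also sets errno to EDOM.
        if (errno == EDOM)
            std::printf("sqrt reported EDOM via errno (result: %f)\n", r);
        else
            std::printf("no errno reported (result: %f)\n", r);
    }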

    I think that there is some sort of consensus that something needs to be done. Less so what. Options include: returning NaN (my personal choice), or an optional<>, or an error_code out parameter, or throwing an exception (that will get you yelled at by the anti-exception people for sure)? All of them (that's going to be a fun API to maintain)?

    Not holding my breath for a resolution on that particular issue. :-/

    I read up on it and there were some comments about how this is non-conforming

    Eh. Yeah. Floating point stuff, conformance and optimization are a rather messy topic. First, there's the question of -ffast-math (or equivalents) or not. If yes, there's going to be a ton of non-conforming stuff going on, but on the other hand, it's mostly stuff that you do want to happen anyway. If no, there is probably still a bunch of non-conforming stuff going on, because apparently conforming to the various standards that control floating point math is a pain, and includes a bunch of stuff that nobody sane wants anyway, at least not on modern architectures.



  • @pie_flavor said in C++ Stockholm Syndrome:

    But language constructs like if constexpr are not fancy macros. They are specific compiler instructions, with performance as their only reason for inclusion, which means that the job of optimizing the code is being placed in your hands.

    No, if constexpr is not for performance. It is for conditional compilation at the language level instead of the preprocessor level. What you use that for is not limited.

    @pie_flavor said in C++ Stockholm Syndrome:

    if constexpr does not make your code more readable and more maintainable than if. You don't care about making the compiler's job 'easier', it's not a human. If a compile-time expression is in the if statement, it will be optimized. There is no reason whatsoever for there to need to be an additional programming construct to tell the compiler to optimize it.

    You can have code that does not compile in an if constexpr if the condition is false. You cannot do that with an if no matter what optimizations your compiler employs. if constexpr is not about optimization.

    @pie_flavor said in C++ Stockholm Syndrome:

    When you have a C++ template, it is essentially part of a build script. It's something that isn't C++ but compiles to C++ while building.

    That's a really outdated way of thinking about templates. A template is a function that, instead of returning a value, returns another function or a type. Multiple named return items are supported. Basically, templates make functions and types almost first-class citizens, which is something normally only scripting languages can pull off.
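
    A small sketch of that view, with made-up names -- one template "call" yields a type, a constant, and a function, all usable at compile time:

    #include <cstddef>
    
    template<typename T, std::size_t N>
    struct ArrayTraits
    {
        using element_type = T;                 // a "returned" type
        static constexpr std::size_t count = N; // a "returned" value
    
        static constexpr std::size_t bytes()    // a "returned" function
        {
            return sizeof(T) * N;
        }
    };
    
    static_assert(ArrayTraits<int, 4>::count == 4, "named results usable at compile time");
    static_assert(ArrayTraits<int, 4>::bytes() == 4 * sizeof(int), "");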


  • BINNED

    @cvi Ah yes, errno. Also rounding modes.
    My first (wrong) thought was that's irrelevant since if I write std::sqrt(2) the compiler can prove it won't have to set errno and also do correct rounding to full precision (or would the different modes still produce different outcomes?). But of course that doesn't help in cases where you do get a domain error, even if that's 100% stupid to do in a constant.
    It is kind of similar to the exception throwing path not being allowed in a constexpr, just that you can't handle it in the same way.



  • @pie_flavor said in C++ Stockholm Syndrome:

    I'd rather do specialization, no matter how clean the resulting code wouldn't be.

    Fair enough, I suppose. 🤷♂

    Personally, I've written a lot of code where if constexpr would have made things a lot cleaner/shorter and easier to both write and maintain. (I've actually not gotten to use it that much; not all of my target compilers support it yet.)

    It's something that isn't C++ but compiles to C++ while building.

    That's a fair way of describing templates, I guess. Again, personally, I don't mind blurring the lines between templates and the rest of C++ (if one wants to make such a distinction); templates and the generic programming style that they enable are one of the key reasons why I use C++ in the first place.

    (It's also not quite true: template code is checked for a number of errors before being instantiated; things that don't depend on template parameters must generally be correct even if the function isn't ever instantiated. However, compilers, particularly MSVC IIRC, have struggled with this in the past.)



  • @topspin said in C++ Stockholm Syndrome:

    Also rounding modes.

    Yeah. Forgot about those. I think that's the other problem: there is some amount of global state that changes the behaviour of the methods (potentially) at runtime (rounding state being one of them, and error reporting another).

    That's another messy part of the standard.


  • Considered Harmful

    @lb_ said in C++ Stockholm Syndrome:

    @pie_flavor said in C++ Stockholm Syndrome:

    But language constructs like if constexpr are not fancy macros. They are specific compiler instructions, with performance as their only reason for inclusion, which means that the job of optimizing the code is being placed in your hands.

    No, if constexpr is not for performance. It is for conditional compilation at the language level instead of the preprocessor level. What you use that for is not limited.

    @pie_flavor said in C++ Stockholm Syndrome:

    if constexpr does not make your code more readable and more maintainable than if. You don't care about making the compiler's job 'easier', it's not a human. If a compile-time expression is in the if statement, it will be optimized. There is no reason whatsoever for there to need to be an additional programming construct to tell the compiler to optimize it.

    You can have code that does not compile in an if constexpr if the condition is false. You cannot do that with an if no matter what optimizations your compiler employs. if constexpr is not about optimization.

    My point is that they make compilation an ever-present concept in code. It shouldn't be. Conditional compilation in Rust can be done, but it can only be done with #[cfg] and is only for things that are platform- or feature-specific. C++ has taken a concept that is useful when used sparingly, and made it a core component of the language.

    @pie_flavor said in C++ Stockholm Syndrome:

    When you have a C++ template, it is essentially part of a build script. It's something that isn't C++ but compiles to C++ while building.

    That's a really outdated way of thinking about templates. A template is a function that, instead of returning a value, returns another function or a type. Multiple named return items are supported. Basically, templates make functions and types almost first-class citizens, which is something normally only scripting languages can pull off.

    Templates are only functions if you use a very very abstract definition of 'function'. It doesn't make functions and types first-class citizens - this is all still resolved at compile time, and it's not like they can be stored in variables (on the other hand, in Rust functions actually are first-class citizens).



  • @masonwheeler said in C++ Stockholm Syndrome:

    @ben_lubar said in C++ Stockholm Syndrome:

    @masonwheeler said in C++ Stockholm Syndrome:

    @topspin said in C++ Stockholm Syndrome:

    Got a specific example where this is horrible so we can make sure to talk about the same thing?

    You run into it all the time in GUI programming. Stuff that looks like:

    MyWidget.Frob();
    try:
       DoStuffWith(MyWidget);
    finally:
       MyWidget.UnFrob();
    

    Where Frob represents temporarily placing it in a specific state that should be reversed once you're done with whatever you're doing.

    That has nothing to do with resource acquisition or destruction, but you still want it to be exception-safe so that if a recoverable exception gets thrown, it doesn't leave your UI in an inconsistent state.

    std::frob frobbed(MyWidget);
    DoStuffWith(MyWidget);
    // compiler calls frobbed.~frob() when scope ends.
    

    Exactly. When all you have is a hammer (RAII), everything starts to look like a nail (a class with its own constructor and destructor). Add that into the example and it looks a lot less simple and clean.

    Actually the above looks much cleaner to me. I would probably tighten the scope:

    {
        std::frob frobbed(MyWidget);
        DoStuffWith(MyWidget);
    }
    

    This supports SRP. As soon as I saw this pattern twice ["You run into it all the time"], DRY would kick in, and I would most likely introduce a template [which could also help with ISP, LSP, OCP and DIP]


  • Garbage Person

    @thecpuwizard You could even design it like:

    {
        auto frobbed = MyWidget.frob();
        DoStuffWith(MyWidget);
    }
    

    if frobbing is something that should always be undone.
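
    Roughly like this (hypothetical Widget; relies on C++17's guaranteed copy elision to return the non-copyable guard):

    class Widget
    {
    public:
        // frob() hands back an RAII guard, so unfrob() can't be forgotten.
        class FrobGuard
        {
        public:
            explicit FrobGuard(Widget& w) : widget(w) { widget.do_frob(); }
            ~FrobGuard() { widget.do_unfrob(); }
            FrobGuard(const FrobGuard&) = delete;
            FrobGuard& operator=(const FrobGuard&) = delete;
        private:
            Widget& widget;
        };
    
        FrobGuard frob() { return FrobGuard(*this); }
    
    private:
        void do_frob()   { /* put the widget into the frobbed state */ }
        void do_unfrob() { /* restore the previous state */ }
    };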



  • @greybeard said in C++ Stockholm Syndrome:

    @thecpuwizard You could even design it like:

    {
        auto frobbed = MyWidget.frob();
        DoStuffWith(MyWidget);
    }
    

    if frobbing is something that should always be undone.

    99% certain that would break OCP [presuming that frob() and unfrob() had already been written], also it would likely impact SRP. Of course, one would need more details (of a real situation) to make an accurate analysis.



  • @pie_flavor said in C++ Stockholm Syndrome:

    C++ has taken a concept that is useful when used sparingly, and made it a core component of the language.

    I don't think it's quite like that. This kind of thing pops up when talking about templates, but the point of templates is that they're a work-saving measure. You generally only resort to them after you have found a certain pattern repeating regularly and figure that the only thing changing is the type involved. By that point you should know most of what you want the template to do, and these tools enable you to deal with minor differences. But the template is written once, and then can be used again and again without worrying about all this.

    C++ is kind of focused on creating abstractions and relationships that describe the problem you want to solve, and provides a great many tools to do that. That makes the language complex, and puts the burden on the person writing the abstraction, but using the abstraction is then easy. A lot of the complexity comes from making these things easy to use. So you can have one 'expert' supporting a bunch of less experienced programmers and helping them be more productive. In time the less experienced ones can learn and support others.

    In other languages I've seen, you're at the mercy of language writers wanting to support whatever it is you want to do, and if they don't you have fewer tools than they do to create libraries and the like. In C++, the standard library is itself written in standards-conforming C++, and you can write libraries that are just as versatile as what the implementation could offer (the oft-cited example is boost, which tends to be a testing ground for new language features). If you're doing the same thing that 80% of companies are doing, that trade-off may not be worth it, but if you're in the 20% working in niche markets, that's irreplaceable.


  • Discourse touched me in a no-no place

    @gąska said in C++ Stockholm Syndrome:

    I still don't see how being Turing complete, whether by accident or not, is a problem.

    It's a problem because C++ compilers aren't exactly the world's nicest or most efficient scripting engines. (This is not a comment on the code that they produce or the efficiency of other parts of the compilation pipeline.) Systems designed from the beginning for generating code using programmable transforms can be a lot more efficient.


  • Discourse touched me in a no-no place

    @cvi said in C++ Stockholm Syndrome:

    Options include: returning NaN (my personal choice), or an optional<>, or an error_code out parameter, or throwing an exception (that will get you yelled at by the anti-exception people for sure)?

    That's one of those nasty questions that causes great heartache over not very much. I'm of the opinion that the normal option should be either NaN (or Inf if that makes better mathematical sense in context) or throwing an exception. I'd like to be able to select which of those two I get, but personally would be in the NaN camp most of the time as it is a quite reasonable poison value (and much of my code targets environments which are strictly noexcept for other reasons).


  • Banned

    @dkf said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    I still don't see how being Turing complete, whether by accident or not, is a problem.

    It's a problem because C++ compilers aren't exactly the world's nicest or most efficient scripting engines. (This is not a comment on the code that they produce or the efficiency of other parts of the compilation pipeline.) Systems designed from the beginning for generating code using programmable transforms can be a lot more efficient.

    I still don't see how Turing completeness has anything to do with it.


  • Discourse touched me in a no-no place

    @gąska said in C++ Stockholm Syndrome:

    I still don't see how Turing completeness has anything to do with it.

    It means they're a full programming language. In a system not designed to be good at that sort of thing.


  • Java Dev

    @dkf said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    I still don't see how Turing completeness has anything to do with it.

    It means they're a full programming language. In a system not designed to be good at that sort of thing.

    A non-Turing-complete template system would potentially be just as bad a headache in practice. Consider regex, which is often hated and not Turing complete.


  • Discourse touched me in a no-no place

    @pleegwat said in C++ Stockholm Syndrome:

    Consider regex, which is often hated and not Turing complete.

    Except for those crazy mofos who decided that adding features to regexps to make them Turing complete was a good idea…


  • Banned

    @dkf my point is, awful languages are awful regardless of being Turing complete.



  • @pie_flavor said in C++ Stockholm Syndrome:

    My point is that they make compilation an ever-present concept in code.

    That's what scripting languages are though. You often never know if code even has valid syntax until you trigger the conditions for that particular code path to be invoked. Accessing non-existent members? Good luck debugging that. C++ makes it an error unless you explicitly say a condition must be met first.


  • Considered Harmful

    @lb_ I never brought up scripting languages, and you only did on a completely unrelated topic. C++ is not a scripting language, scripting languages suck, scripting languages often get compiled anyway when run so you'll hit a syntax error long before the code hits that point, and you're talking about the effects of the concept by comparing it to a completely different concept with the same effects, whereas I was talking about the concept and only the concept with the effects simply as an example of what I meant.



  • @pie_flavor My point is C++ takes something that is awful in other languages and makes it sane and actually great while still being just as powerful as those other languages (at compile time, at least). The whole point of if constexpr is to make it easier to both read and write generic code with special cases. You can do your specialization all day long but it will never be as readable and it will always involve more typing.

