Google C++ Style Guide



  • @dkf said:

    It would also be nice if I was paid a million bucks an hour in this job. Not happening though.

    Damn. Guess I can't retire tonight...



  • @caffiend said:

    If you're forced to have raw grads on your team, you DON'T want them working in C++, because they were taught from a 10 year old textbook, by someone who'd never written a line of commercially viable code in his life. It's equally unviable if you need to have someone with 10 years of commercial experience who isn't also a passionate hobbyist working on your team.

    Fresh grads are quite ok. They can still be taught new tricks if you keep an eye on them and explain the proper ways where needed. It is the old dogs that will keep producing obnoxious code.

    @caffiend said:

    I'm genuinely surprised to hear that you've got cross-platform code that can be practically deployed across all the platforms you mentioned.

    We managed to get the hang of it eventually, but yes, it was quite an effort. It would be much easier now that Qt has been ported to all of them, though.

    Writing portable C++ is a pain, but then again it gives you enough rope to ~~hang yourself~~ tie all the loose ends together. Besides, conditional compilation may be a mess, but it lets us build all those customized versions. We've built over 40, and some of them are regularly getting new releases.

    Yes, many applications should be written in something higher level than C++. Even higher level than Java, which wastes too much memory (lack of user-defined value types is a big part of the reason) and is a bit too dumb for my taste. But C++ is not that niche. For generic libraries and frameworks it is still the best we have. And yes, I do hope Rust can provide some healthy competition.


  • Discourse touched me in a no-no place

    @Bulb said:

    For generic libraries and frameworks it is still the best we have.

    There's a few problems left with implementations, especially when it comes to getting things like compiler and linker configurations right in some of the gnarlier use cases. What's worse, the quality of error messages still leaves a bit to be desired. I've had some very strange failures out of systems written in C++ when trying to get them to build, link and be dynamically loaded (into a C program, for raisins).

    I'm guessing that stopping changing the language for a while, and instead focusing on stamping out these not-the-language-but-still-very-annoying issues, would be the best approach to expanding adoption. I'm going to guess that won't happen though; the committee is there and probably feels it needs to change things anyway.



  • @caffiend said:

    because they were taught from a 10 year old textbook, by someone who'd never written a line of commercially viable code in his life.

    Sadly, this is the real problem with C++: it is usually not taught. Instead, C with Classes gets taught :(
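
    To make "C with Classes" concrete, here's a minimal sketch of the difference (class names are made up):

        #include <cstring>
        #include <string>

        // "C with Classes", as usually taught: manual new[]/delete[],
        // no copy control.
        class CBuffer {
        public:
            explicit CBuffer(const char* s) {
                data_ = new char[std::strlen(s) + 1];
                std::strcpy(data_, s);
            }
            ~CBuffer() { delete[] data_; }
            // Bug waiting to happen: the implicit copy constructor copies
            // the pointer, so two objects delete the same allocation.
        private:
            char* data_;
        };

        // Idiomatic C++: std::string owns the memory; copying, moving and
        // destruction are all correct without writing any of it here.
        class ModernBuffer {
        public:
            explicit ModernBuffer(std::string s) : data_(std::move(s)) {}
        private:
            std::string data_;
        };

        int main() {
            CBuffer a("hello");
            // CBuffer b = a;        // double delete at end of scope
            ModernBuffer c("hello"); // safe to copy and move
        }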

    @dkf said:

    There's a few problems left with implementations, especially when it comes to getting things like compiler and linker configurations right in some of the gnarlier use cases. What's worse, the quality of error messages still leaves a bit to be desired. I've had some very strange failures out of systems written in C++ when trying to get them to build, link and be dynamically loaded (into a C program, for raisins).

    Yeah, linkage in C++ sucks and it's my number one complaint with the language. I hate getting linker errors about unresolved externals and having no idea which library I forgot to link or if one that I did link is linked incorrectly or in the wrong order.
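
    For anyone who hasn't hit the ordering problem, it looks something like this (a sketch assuming a classic single-pass linker like GNU ld; library names are made up):

        // bar.cpp -> libbar.a    defines:  int helper() { return 42; }
        // foo.cpp -> libfoo.a    defines:  int foo() { return helper(); }

        // main.cpp
        int foo();                   // lives in libfoo.a
        int main() { return foo(); }

        // g++ main.o -lfoo -lbar   -> works: foo() pulls in helper() later
        // g++ main.o -lbar -lfoo   -> "undefined reference to helper()":
        //     libbar.a was scanned before anything needed a symbol from it,
        //     and a single-pass linker never goes back.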


  • Discourse touched me in a no-no place

    @LB_ said:

    I hate getting linker errors about unresolved externals and having no idea which library I forgot to link or if one that I did link is linked incorrectly or in the wrong order.

    It's when the language-aware linker decides to leave out the C++ standard library (seriously?!) or the compiler's own intrinsics library (:facepalm: :headdesk:) that you start to wonder if you're going insane. And it turns out that it leaves those out because it is assuming that it's already put them in, but actually hasn't because main() was a C program. AAAAAAAAAAAAuuuuuuugh! The errors just make no sense at all as they all talk about things that you make no mention of in your actual program code anywhere. You end up thinking that you're going mad and that the compiler is just taking the piss for shits and giggles.

    Though when the compiler picks one version of C++ and the linker picks a different (earlier) version, that's pretty special too. 😡

    Seriously, stop changing the language. Get shit bedded in right and give people a chance to switch, because that sure doesn't happen by magic. Yes, it means that the language designers will have to go off and get real jobs instead. That's no bad thing; maybe they can learn to paint fences or something else socially useful.



  • @dkf said:

    Seriously, stop changing the language.

    How can we fix this problem without changing the language? Modules are going to fix the linking nightmare.



  • @dkf said:

    It's when the language-aware linker decides to leave out the C++ standard library (seriously?!) or the compiler's own intrinsics library (:facepalm: :headdesk:) that you start to wonder if you're going insane. And it turns out that it leaves those out because it is assuming that it's already put them in, but actually hasn't because main() was a C program. AAAAAAAAAAAAuuuuuuugh!

    When mixing C and C++ code and libraries, you should always use the C++ linker to link exes and DLLs, for the reasons outlined above.
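
    A minimal illustration of why (assuming a GCC-style toolchain; file names are made up):

        // engine.cpp -- C++ internals behind a C-callable entry point.
        #include <vector>

        extern "C" int sum_squares(int n) {
            std::vector<int> v;              // quietly drags in the C++ runtime
            for (int i = 1; i <= n; ++i) v.push_back(i * i);
            int total = 0;
            for (int x : v) total += x;
            return total;
        }

        // main.c -- a plain C translation unit:
        //     int sum_squares(int n);
        //     int main(void) { return sum_squares(3); }
        //
        // gcc main.o engine.o   -> undefined references to std::... symbols,
        //                          because the C driver doesn't add the C++
        //                          standard library or its startup pieces.
        // g++ main.o engine.o   -> links cleanly; the C++ driver knows what
        //                          the C++ objects need.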



  • @LB_ said:

    Modules are going to fix the linking nightmare.

    Which will we get first? Modules, or concepts?

    J/K, the answer is always "not concepts".


  • Discourse touched me in a no-no place

    @LB_ said:

    I hate getting linker errors about unresolved externals and having no idea which library I forgot to link

    You can use the lib tool to figure that out, at least.



  • I did laugh, but in all honesty, I believe modules are so much more important than concepts. Concepts are a nice thing to have, whereas modules are a basic feature of nearly all other languages...


  • kills Dumbledore

    @dkf said:

    maybe they can learn to paint ~~fences~~ bikesheds or something else socially useful

    FTFY


  • Discourse touched me in a no-no place

    @tar said:

    When mixing C and C++ code and libraries, you should always use the C++ linker to link exes and DLLs, for the reasons outlined above.

    Which to me reads like “you should never use C++ to create dynamic libraries”, because there's no way to guarantee which linker gets used at the top level.


  • Winner of the 2016 Presidential Election

    @dkf said:

    “you should never use C++ to create dynamic libraries”

    Actually, that's not bad advice, considering C++ doesn't even have a standardized, stable ABI.


  • Discourse touched me in a no-no place

    Which would be OK if the makers of software like LLVM knew the first thing about producing a complete C API to their code in the first place. Dingo fuckers.



  • @dkf said:

    “you should never use C++ to create dynamic libraries”

    Don't you mean static? Shared libraries and dynamic libraries are the output of the linker and already have everything they need linked in; you don't use them as linker input again. Or is this different on *nix?



  • @LB_ said:

    @dkf said:
    “you should never use C++ to create dynamic libraries”

    Don't you mean static? Shared libraries and dynamic libraries are the output of the linker and already have everything they need linked in; you don't use them as linker input again. Or is this different on *nix?

    No, @dkf meant dynamic libraries. If you use a C++ linker to link a dynamic library, you've just decided that all future software using your DL will need to use the C++ linker (and not any old C++ linker, but one that's ABI compatible with the linker used for your DL) to link with your DL.

    A workaround is to write a wrapper DL that wraps the API of your C++ DL and exposes it through extern "C" { ... } wrapped functions.
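
    A minimal sketch of that wrapper pattern (all names hypothetical):

        // widget_c_api.h -- the only header consumers see; compiles as plain C.
        #ifdef __cplusplus
        extern "C" {
        #endif

        typedef struct widget widget;   /* opaque handle */

        widget* widget_create(int size);
        int     widget_process(widget* w, const char* input);
        void    widget_destroy(widget* w);

        #ifdef __cplusplus
        }
        #endif

        // widget_c_api.cpp -- C++ inside, but nothing C++ crosses the ABI.
        #include <string>

        class Widget {                          // the C++ class being wrapped
        public:
            explicit Widget(int size) : size_(size) {}
            int Process(const std::string& in) { return size_ + (int)in.size(); }
        private:
            int size_;
        };

        extern "C" widget* widget_create(int size) {
            return reinterpret_cast<widget*>(new Widget(size));
        }
        extern "C" int widget_process(widget* w, const char* input) {
            return reinterpret_cast<Widget*>(w)->Process(input);
        }
        extern "C" void widget_destroy(widget* w) {
            delete reinterpret_cast<Widget*>(w);
        }
        // (Real code would also catch exceptions here; letting one escape
        // through the C ABI is undefined behaviour.)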



  • @OffByOne said:

    No, @dkf meant dynamic libraries. If you use a C++ linker to link a dynamic library, you've just decided that all future software using your DL will need to use the C++ linker (and not any old C++ linker, but one that's ABI compatible with the linker used for your DL) to link with your DL.

    A workaround is to write a wrapper DL that wraps the API of your C++ DL and exposes it through extern "C" { ... } wrapped functions.

    +1 to that.

    Especially on Windows. Wrap it in a C API. Yes, it's hours of scutwork, looks ugly, and makes people cringe, but anything else is a recipe for disaster.

    To list a few issues I've had to deal with...
    Some linkers assume the size of entries in the vtable based on the export table or header file (for the pedants: I don't know which it used, but it got it wrong). Had to be fixed with manual pointer arithmetic and a lot of really fragile manual calls to the underlying memory allocator.

    Some idiot overloaded the new operator and used a different memory allocator, full of "placement new" and manual calls to malloc(), so now delete is broken (usually caused by someone using copy-pasted Boost source code from 2003).

    Some asshole put an unclosed #pragma pack in their library's header file (MS has done this on more than one occasion).

    That whole virtual destructor thing I mentioned earlier becomes really important when all you get is base-class pointers out of a dynamic library (see the sketch after this post).

    Someone used a compiler with an "almost" compatible "name mangling" scheme (yes, that's what it's called), which differs only in case from the latest version of the same compiler.

    (Windows only) That whole argument over how big char_t is, and library developers being selfish and redefining it in their library's header file to make sure "their one works".

    (Windows only) The fact that the length of integer datatypes is ambiguous, and the very existence of the long long datatype.

    This fu*king thing, which "cleverly" defeats the compiler's most valiant efforts to save idiots from themselves, somehow gets copy-pasted enough that I've encountered it when debugging issues with the C++ APIs provided by at least 3 Fortune 500 companies.

    </rant> Sorry, it's not often I get to vent like this.

    [Edit] The version of horrible_cast in that SO article is less destructive than those I've seen in the wild (since they're all values being copied around). The ones I'm bitching about use pointers / references to corrupt your beautiful stack frame, and waste lots of non-billable hours late into the night during the defects liability period, leading to failed relationships, alcoholism, feelings of hopelessness and suicidal thoughts.
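
    As promised, the virtual destructor sketch (a made-up plugin interface, but the failure mode is real):

        #include <cstdio>

        // Interface handed across the DLL boundary.
        class Plugin {
        public:
            virtual void Run() = 0;
            virtual ~Plugin() {}   // delete this line and the client leaks
        };

        class FilePlugin : public Plugin {
        public:
            FilePlugin() : f_(std::fopen("plugin.log", "w")) {}
            ~FilePlugin() override { if (f_) std::fclose(f_); }
            void Run() override {}
        private:
            std::FILE* f_;
        };

        // Factory exported from the library; clients only ever see Plugin*.
        Plugin* CreatePlugin() { return new FilePlugin; }

        int main() {
            Plugin* p = CreatePlugin();
            p->Run();
            delete p;   // with a non-virtual ~Plugin this is undefined
                        // behaviour and typically never closes the file
        }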


  • Discourse touched me in a no-no place

    @OffByOne said:

    A workaround is to write a wrapper DL that wraps the API of your C++ DL and exposes it through extern "C" { ... } wrapped functions.

    I've done that. That part works. Except that on some platforms, the linker decides to leave out some of the libraries from the required library list (no idea why) leaving the result unable to dlopen()/LoadLibrary(). Diagnosing these things is also unreasonably difficult.



  • @OffByOne said:

    If you use a C++ linker to link a dynamic library, you've just decided that all future software using your DL will need to use the C++ linker (and not any old C++ linker, but one that's ABI compatible with the linker used for your DL) to link with your DL.

    That's not news to me, I've always been taught to use the exact same version of the exact same compiler.

    @caffiend said:

    That whole virtual destructor thing I mentioned earlier becomes really important when all you get is base-class pointers out of a dynamic library.

    I always found that a bit of a weird design pattern; I'm not sure what problem it's supposed to solve. It's like PImpl but with virtual function overhead instead of pointer overhead (which might be the same, I'm not sure).

    @caffiend said:

    (Windows only) The fact that the length of integer datatypes is ambiguous, and the very existence of the long long datatype.

    Not sure how this is Windows-only - it's the way the C and C++ standards define the types. Are people just somewhat more consistent on *nix? I know the world of embedded programming doesn't afraid of 16-bit int, or even 24-bit int. (See the sketch at the end of this post.)

    @dkf said:

    Diagnosing these things is also unreasonably difficult.

    This.
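
    The sketch mentioned above: fixed-width aliases take the ambiguity out of wire formats (assuming C++11 and 8-bit bytes):

        #include <climits>
        #include <cstdint>

        static_assert(CHAR_BIT == 8, "this sketch assumes 8-bit bytes");

        // int/long/long long only have guaranteed *minimum* ranges; long is
        // 32-bit on Win64 but 64-bit on most 64-bit Unix ABIs. The aliases
        // below are exact on every platform that provides them.
        static_assert(sizeof(std::int32_t) == 4, "always exactly 32 bits");
        static_assert(sizeof(std::int64_t) == 8, "always exactly 64 bits");

        struct WireHeader {          // layout no longer depends on the
            std::uint16_t version;   // compiler's idea of int or long
            std::uint32_t length;
            std::uint64_t timestamp;
        };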



  • @LB_ said:

    Not sure how this is Windows-only - it's the way the C and C++ standards define the types. Are people just somewhat more consistent on *nix? I know the world of embedded programming doesn't afraid of 16-bit int, or even 24-bit int.

    Our embedded devs are a bit kinder (or crueler, whichever you think): they usually typedef their weird-length integers with names like UINT7 (yes, it's a thing) or UINT24. The best one is VUINT16 (which we sometimes get in structs meant for on-wire serialisation). VUINT stands for Volatile Unsigned Integer, which in the C world means the compiler needs to treat it specially so that data changed under interrupts is available in some real-time sort of way (sketched below). Why TF is that variable in a serialised data structure? Usually it means they made a mistake, but sometimes they say "it's not important which one you get, just put it in the database, go out and grab a chicken burger and be happy"

    Bloody hell, hardware team, get a grip! You can buy a 1.2 GHz SoC, with an MMU, running a decent Linux distro, for about the same price as the stone-age microcontrollers you use, where the "operating system" doesn't consist of a while(1) loop filled with a 28,000-line switch statement and no functional decomposition.
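
    For the record, here's what volatile actually buys you (a minimal sketch; assume g_adc_reading is written from an interrupt handler):

        #include <cstdint>

        volatile std::uint16_t g_adc_reading;   // updated by an ISR

        std::uint16_t wait_for_sample() {
            while (g_adc_reading == 0) {
                // volatile forces a fresh load every iteration; without it
                // the compiler may hoist the read out of the loop and spin
                // forever on a stale register value.
            }
            return g_adc_reading;
        }

        // None of that says anything about bytes sitting in a serialised
        // struct, which is why a VUINT16 field in a wire format is almost
        // certainly a copy-paste mistake.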



  • @powerlord said:

    Wasn't that the point of the D programming language or am I misremembering?

    I expected d-lang to be my new favorite language, then I learned that the "htod" utility that was necessary to make the use of C libraries practical was Windows-only.



  • @dkf said:

    Which to me reads like “you should never use C++ to create dynamic libraries”, because there's no way to guarantee which linker gets used at the top level.

    Is that not a true statement?



  • True enough for me; every time I had to use a DLL written in C++, I had to look for a binary compiled with the same compiler I was using.



  • @dkf said:

    Which to me reads like “you should never use C++ to create dynamic libraries”, because there's no way to guarantee which linker gets used at the top level.

    Well, I suppose my next rule is: always compile native code from source, so that you can guarantee that somebody (i.e. you) used the correct (i.e. C++) linker and that everything is linked correctly.



  • @asdf said:

    C++ doesn't even have a standardized, stable ABI.

    I thought that either C++11 or C++14 at least made some baby steps in this direction?



  • @LB_ said:

    That's not news to me, I've always been taught to use the exact same version of the exact same compiler.

    +800,000



  • @LB_ said:

    That's not news to me, I've always been taught to use the exact same version of the exact same compiler.

    With the exact same options. (I love VS's property pages for settings...)



  • @dkf said:

    Which to me reads like “you should never use C++ to create dynamic libraries”, because there's no way to guarantee which linker gets used at the top level.

    Actually, a C++ linker is only needed when linking static C++ libraries. Dynamic libraries know their dependencies, so a C linker will link against them fine.

    You will still have problems if the ABIs don't match, which, together with the combinatorial explosion of ABI and runtime variants on Windows, renders making dynamic libraries mostly pointless, but that is a slightly different issue.



  • @dcon said:

    VS's property

    Do you mean “property sheets” (.vsprops files)? They removed that feature in VS2010 when they switched C++ to MSBuild like .NET. Theoretically you can do the same thing there, because MSBuild is a generic format and supports includes, but Visual Studio, at least 2010, won't do anything to help you with them.

    Anyway, I myself switched to generating projects with CMake, because that way I can also generate Xcode projects and makefiles/ninja files for Android and MinGW from one consistent source.


  • Discourse touched me in a no-no place

    @blakeyrat said:

    Is that not a true statement?

    Well, I'm talking about creating a DLL with an ABI (as used by others) that only exposes functions that are extern "C" but where it uses the C++ standard library internally, as that has some pretty nice stuff in it (it uses some other C++-only libraries too; if it didn't, I wouldn't use C++ despite the extra coding work this creates). The problem was that somehow the linker was deciding that the C++ standard library it needed to link against wasn't required, or maybe that a different (older) version was better. Fuck if I know what happened. It wasn't even on my development platform where I have good tools for diagnosing all this stuff.

    C++ needs time not changing so that all these sorts of toolchain issues can be nailed. There's a lot of code out there in the wild, and much of it is not yet ported to C++11 or 14.

    @tar said:

    always compile native code from source

    Yet another naïve statement. Yay for you.

    @Bulb said:

    You will still have problems if the ABIs don't match

    That shouldn't happen if the dynamic libraries include the version info required, but to be fair it's also a problem when you're evolving dynamic libraries written in other languages. The right question isn't whether the problem can be prevented entirely, but rather what tools you have to work with to mitigate the challenges.



  • @dkf said:

    That shouldn't happen if the dynamic libraries include the version info required

    Even if the dynamic libraries included the version info required (they don't), you'd still have a problem if you have two libraries that were built with different compilers or different settings. The particular problems will be different for static and dynamic libraries and for DLL platforms (Windows) versus ELF platforms (most of the rest), but there will be some in any combination.

    For static libraries, every symbol will be there just once, so if the libraries expect it to work differently, at least one of them will not work. For shared libraries on an ELF system, the dynamic linker merges symbols, so the symptoms will be similar. On the other hand, on a DLL system each DLL may have its own copy of the symbol, so each of the libraries will work in isolation, but passing non-trivial data to and from them won't.

    The problems are much more prevalent on Windows because:

    • Windows has static and dynamic versions of the runtime, times debug and release versions, and the choice must be made at compile time, so you have 4 variants even with a single compiler version and without further dependencies.
    • Windows doesn't have any DLL versioning with support for backward compatibility, and each compiler version comes with a new runtime, so combining libraries built with different versions is not possible.

    On ELF platforms, since the choice between using a static or dynamic dependency is deferred to link time and since shared objects have versions, the explosion of variants does not occur, and linking against a library built with an older version is possible to an extent. When it is not, it usually fails early, because the linker notices the request for multiple incompatible versions of the same shared object (whereas in the DLL case they appear as entirely different libraries). That does not mean the problem doesn't exist, but it is more tractable.
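
    A sketch of the kind of mismatch this causes (the classic NDEBUG variant; sizes assume an 8-byte size_t):

        // message.h -- shared by two libraries built with different settings.
        #include <cstddef>

        struct Message {
        #ifndef NDEBUG
            std::size_t sequence;    // debug-only bookkeeping field
        #endif
            std::size_t length;
            char payload[256];
        };

        inline std::size_t message_size() { return sizeof(Message); }

        // libfoo.so, built with -DNDEBUG:  sizeof(Message) == 264
        // libbar.so, built without it:     sizeof(Message) == 272
        //
        // On an ELF system the dynamic linker keeps exactly one
        // message_size() for both libraries, so whichever one "loses" now
        // reads and writes Message with the wrong layout: an ODR violation
        // that nothing in the default toolchain diagnoses. On a DLL system
        // each DLL keeps its own copy and instead corrupts data passed
        // between them.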


  • Discourse touched me in a no-no place

    @Bulb said:

    Windows has static and dynamic versions of the runtime, times debug and release versions, and the choice must be made at compile time, so you have 4 variants even with a single compiler version and without further dependencies.

    There used to also be threaded/non-threaded versions of things too. 8 versions, all at once. :facepalm:



  • @Bulb said:

    Do you mean “property sheets” (.vsprops files)?

    Yes

    @Bulb said:

    They removed that feature in VS2010 when they switched C++ to MSBuild like .NET.

    No - they changed them to .props files. I'm using them in my current vc10/12/14 projects. (In fact, I'm using the same .props files in all 3.)



  • @dkf said:

    Yet another naïve statement. Yay for you.

    Why is it naïve? 15 years of C++ experience, and I've never had any of these problems you seem to be running into.
    Or is somebody literally handing you a DLL and making it your job to deal with it? Because that'd suck. Then again, every commercial C++ library I've ever used had terms for accessing the source, so, again, stop :doing_it_wrong: I guess...



  • @Bulb said:

    I am pretty sure raw pointers are available.

    Zero-overhead doesn't mean access to pointers per se. Among other things, it means that you can determine the layout of structures in memory. So creating a container, for example, is as memory-efficient as managing raw pointers to dynamically allocated memory, but safe and consistent.
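
    A concrete example of what that buys you (a minimal sketch; upload_vertices is hypothetical):

        #include <cassert>
        #include <vector>

        struct Point { float x, y, z; };   // 12 bytes, no hidden header

        int main() {
            std::vector<Point> pts(1000);

            // Elements sit in one contiguous block, exactly sizeof(Point)
            // apart - the same layout as a hand-managed malloc'd array:
            assert(reinterpret_cast<char*>(&pts[1]) -
                   reinterpret_cast<char*>(&pts[0]) == sizeof(Point));

            // ...so the buffer can be handed straight to a C API or a GPU:
            // upload_vertices(pts.data(), pts.size() * sizeof(Point));
        }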

    @Bulb said:

    It does give you enough rope to shoot yourself in the foot.

    Which is an impressive amount: It's not easy making a gun out of rope.

    @caffiend said:

    I'm genuinely surprised to hear that you've got cross-platform code that can be practically deployed across all the platforms you mentioned. Why you'd want to process non-trivial amounts of data using a phone is also beyond me, but how am I to know your domain.

    I've worked for game companies. If you want to share a single codebase for your 3D game across mobile devices, you're stuck with C++.

    @LB_ said:

    Yeah, linkage in C++ sucks and it's my number one complaint with the language.

    In fairness, that's an issue with the linker implementation. It has nothing to do with the language itself. If the linker writers were willing to follow a different approach, it wouldn't matter what order you put the libraries in. It would also fix the issue where, when you have two libraries that depend on each other, you have to repeat one and interleave them.

    Not sure if this would have a significant impact on build speed.

    @dkf said:

    Yes, it means that the language designers will have to go off and get real jobs instead.

    C++ doesn't have "language designers". It has a committee that accepts proposals, evaluates them, and decides what goes in and what doesn't. So anyone willing to write a paper can be a C++ designer. And you can join the committee for $1200/year, I think.

    They usually have day jobs at the companies most invested in the language, which pay for their own committee members. Which is why nothing is ever deprecated: the companies that would have to rewrite ten-year-old code if some broken thing was deprecated get to vote not to do that.



  • @Kian said:

    Zero-overhead doesn't mean access to pointers per se. Among other things, it means that you can determine the layout of structures in memory. So creating a container, for example, is as memory-efficient as managing raw pointers to dynamically allocated memory, but safe and consistent.

    Oh, true. I was a bit slow when I wrote that.

    Well, D does have value types (structs), and they have basically the same layout as in C/C++, and you can have tightly packed arrays of them. However, the class types are forced to be handled by reference and allocated by the garbage collector. Basically, structs and classes work exactly as in C#, except structs can have destructors and do proper RAII with them.

    Now, I do not think that is a good approach, because in practice even classes that use inheritance and virtual dispatch can often be instantiated on the stack or composed by value. Which is what Rust does.

    Generally, D has some things done better than C++ and some worse (often those seem inspired by C#), but in either case it is not sufficiently different to warrant the switch. IMO the first language that has enough new stuff that it might be worth switching to is Rust (and even though 1.0 was released about half a year ago, some important features are still waiting to be polished, specified and added).


  • Discourse touched me in a no-no place

    @Kian said:

    C++ doesn't have "language designers". It has a committee that accepts proposals, evaluates them, and decides what goes in and what doesn't. So anyone willing to write a paper can be a C++ designer. And you can join the committee for $1200/year, I think.

    They usually have day jobs at the companies most invested in the language, which pay for their own committee members. Which is why nothing is ever deprecated: the companies that would have to rewrite ten-year-old code if some broken thing was deprecated get to vote not to do that.

    FWIW, I know exactly how this sort of thing works: I've done standardisation work in the past. Not of programming languages, to be fair, but I don't think that it will be very different. The real costs are all the travel that you need to do; face-to-face meetings are a hell of a lot more productive than online ones, but they do take over your life a bit. (I'm glad I stopped before I started trying to make friends with the staff of the business-class lounge in Schiphol…)

    Deprecation and removal of things that you got wrong are very necessary, though they're also painful for people implementing stuff. Except when you remove something that nobody sane likes. (Have trigraphs been slaughtered yet? Nobody uses a system that needs them any more. Or if they do, they'll be happy to be forced to stop and join the ASCII revolution!)
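
    (For younger readers, a sketch of what a trigraph does, assuming a pre-C++17 compiler with trigraphs explicitly enabled, e.g. g++ -std=c++14 -trigraphs:)

        #include <cstdio>

        int main() {
            // "??!" is rewritten to "|" before tokenisation, so this is 1 | 2:
            int flags = 1 ??! 2;
            std::printf("%d\n", flags);   // prints 3
        }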



  • @dkf said:

    Have trigraphs been slaughtered yet?

    They're still around, but compilers forbid them by default.



  • @Kian said:

    In fairness, that's an issue with the linker implementation. It has nothing to do with the language itself. If the linker writers were willing to follow a different approach, it wouldn't matter what order you put the libraries in. It would also fix the issue where, when you have two libraries that depend on each other, you have to repeat one and interleave them.

    True, but if the language required sane handling of this in the first place, things would be much better.



  • @Kian said:

    Which is an impressive amount: It's not easy making a gun out of rope.



  • @Kian said:

    I've worked for game companies. If you want to share a single codebase for your 3D game across mobile devices, you're stuck with C++.

    Oh so very much this.

    Although I suppose there's always C...





  • @Kian said:

    I've worked for game companies. If you want to share a single codebase for your 3D game across mobile devices, you're stuck with C++.

    Are there any gaming devices that don't support C#? Maybe the Wii-U? Maybe?


  • Java Dev

    He did say mobile, so I'd assume that's mostly Android and iOS?



  • @PleegWat said:

    He did say mobile, so I'd assume that's mostly android and ios?

    Unity does C#. Plus, Xamarin.



  • Both: Unity -- what Rhywden said. And I know Unity supports whatever Nintendo calls their latest Gameboy model. And the Sony Vita, if that matters.

