C++ Stockholm Syndrome


  • Banned

    @topspin said in C++ Stockholm Syndrome:

    @gąska Still, while that works in spirit, Ben's "like a billion RAM sticks" is literally correct as of now.

    So was the "640kB" quote back when it was made. If future proofing is easier than not future proofing, why not to? Any metadata you can store in your pointer, you can also store separately. Computers have enough memory for that, unlike in the 80s and 90s when these tricks were really necessary (and became huge PITA when available RAM grew.

    It wasn't that long ago when hard disks were measured in megabytes.


  • BINNED

    @gąska That wasn't my point, I was just :pendant:ing.



  • @lb_ said in C++ Stockholm Syndrome:

    I have most definitely investigated Rust and that's a whole mess that I'm letting simmer for now before making a final judgement, but the prospects are not good. So since I'm stuck with C++ maybe I just have Stockholm syndrome, but here are some requirements for any replacement language:

    So how does Rust stack up:

    • enforced static typing (if typoInVairableName = 1 is not an error, get lost)

    Check.

    • scope-based memory and resource management like RAII (gc is off by default)

    Check.

    • const-correctness and immutable by default (e.g. mutable keyword)

    Check.

    • exception handling (zero-cost, of course)

    Partially. There are panics, which are exceptions and are zero-cost, but they are for reporting bugs only; failures that are expected to be handled are returned as a Result (a variant type), with tools to make that comfortable, but not completely zero-cost.

    The approach does have its advantages. While exceptions are very convenient for quick and dirty abort-if-anything-went-wrong error handling, they get very hard to hunt down when you need to make things stable later. Results are always there and insist on being handled, so while they make it a bit harder to prototype, they improve the final quality.
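
    A minimal sketch of that split (the function names are made up for illustration):

    ```rust
    use std::num::ParseIntError;

    // Expected failure: encoded in the return type, so the caller has to deal with it.
    fn parse_port(s: &str) -> Result<u16, ParseIntError> {
        s.trim().parse::<u16>()
    }

    // `?` propagates the error upward instead of handling it on the spot.
    fn parse_endpoint(host: &str, port: &str) -> Result<(String, u16), ParseIntError> {
        let port = parse_port(port)?;
        Ok((host.to_string(), port))
    }

    fn main() {
        match parse_endpoint("localhost", "8080") {
            Ok((host, port)) => println!("{}:{}", host, port),
            Err(e) => eprintln!("bad port: {}", e),
        }
        // A panic, by contrast, is for bugs: out-of-bounds indexing, a violated
        // invariant, an `unwrap()` on a value that "can't" fail.
    }
    ```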

    • ability to annotate strength of exception guarantees (nice to have at language-level, though I guess documentation is an okay fallback)

    No. But you don't have that in C++ either, do you?

    You don't really care

    • as much zero-cost abstraction as possible (if it can be done at compile time instead of at runtime, it should be)

    Check.

    • as much stuff checked by the compiler at compile time as possible (if it can be a compile error instead of a runtime error, it should be)

    Check.

    • sane and consistent compiling/linking regardless of platform (dear god none of this shared-vs-static nonsense)

    Check. The defaults work out of the box on all platforms and you don't need to care. Of course the platform warts won't just go away, so in special cases you may still need to deal with them, but those are special cases.

    • powerful generics and metaprogramming (I'm talking passing around and returning types as variables and first class citizens)

    Rust generics can do most of the things C++ templates can (except value parameters, which are planned, but there are more important things), plus there are powerful AST-based macros.
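
    For a rough feel of the generics (the function here is invented; the AST-based macros - macro_rules! and procedural macros - are a separate mechanism not shown):

    ```rust
    use std::fmt::Display;
    use std::ops::Add;

    // Generic over any type that can be copied, added and printed. Unlike a C++
    // template, the bounds are checked once at the definition, not per instantiation.
    fn sum_and_show<T: Add<Output = T> + Display + Copy>(xs: &[T], zero: T) -> String {
        let total = xs.iter().copied().fold(zero, |acc, x| acc + x);
        format!("sum = {}", total)
    }

    fn main() {
        println!("{}", sum_and_show(&[1, 2, 3], 0));        // integers
        println!("{}", sum_and_show(&[1.5f64, 2.5], 0.0));  // floats
    }
    ```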

    • modern clean syntax (duh)

    Check.

    • good standard library (duh)

    Still too young. But the dependency manager and central repository are a huge improvement over C++.

    Um, and C++ does not exactly have one either.

    • UTF8 strings (duh)

    Check.

    So why are you saying the Rust prospects are not good?


  • Considered Harmful

    @gąska said in C++ Stockholm Syndrome:

    @topspin okay, 1995. Doesn't matter to me, both dates were before I learned what RAM is and that computers aren't all the same. The point is, any amount of memory will eventually get filled. It might take a while longer to fill 16 exabytes than it took 4 gigabytes, but it WILL happen, and our track record as an industry suggests a lot of programs running today will still be running then.

    There will come a day when Chrome uses a terabyte of RAM to run, and people will take it for granted.


  • Considered Harmful

    @bulb From what I can tell, it's because he's internalized all of C++'s quirks and ways of doing things, and focuses more on the fact that Rust can't do certain things in as few lines of code as C++ can than the fact that Rust's way is cleaner or that you generally go about problems entirely differently in Rust.



  • @gąska said in C++ Stockholm Syndrome:

    @topspin said in C++ Stockholm Syndrome:

    @gąska Still, while that works in spirit, Ben's "like a billion RAM sticks" is literally correct as of now.

    So was the "640kB" quote back when it was made. If future proofing is easier than not future proofing, why not to? Any metadata you can store in your pointer, you can also store separately. Computers have enough memory for that, unlike in the 80s and 90s when these tricks were really necessary (and became huge PITA when available RAM grew.

    It wasn't that long ago when hard disks were measured in megabytes.

    So was 64K when the 8-bit micro was developed, and earlier 4K...

    Oldest hard disk [rack mount, 14 inch removable platter] held 1.6MB. In 1974, the drive and controller cost $7,900 plus $350 installation and $74 per month for maintenance; additional drives were $5,100 plus $260 installation and $64 per month for maintenance; a disk cartridge was $99.

    The biggest RAM I know of in an existing machine is 1.375 petabytes (but that information is 2 years old). However, more important is virtual address space, which can far exceed physical - so the limit of 64 bits may be closer than many think...


  • Discourse touched me in a no-no place

    @gąska said in C++ Stockholm Syndrome:

    people said the same about 32 bits in 2000

    I had applications that ran out of addressable space on (very expensive) workstations with 4GB of memory before that date. By 2000, I was porting to 64-bit machines, which had been around long enough then to be mostly production ready.


  • Discourse touched me in a no-no place

    @bulb said in C++ Stockholm Syndrome:

    but not completely zero-cost

    Actually, you might find that they end up being zero cost once the compiler and optimiser have finished having their evil way with the code.


  • Discourse touched me in a no-no place

    @thecpuwizard said in C++ Stockholm Syndrome:

    Biggest existent RAM I know of in a machine is 1.375 petabytes (but that information is 2 years old).

    I wonder how they manage to access that much space in a sensible amount of time? 😈



  • @dkf said in C++ Stockholm Syndrome:

    @thecpuwizard said in C++ Stockholm Syndrome:

    Biggest existent RAM I know of in a machine is 1.375 petabytes (but that information is 2 years old).

    I wonder how they manage to access that much space in a sensible amount of time? 😈

    VERY Carefully :) :) :)



  • @pie_flavor said in C++ Stockholm Syndrome:

    @bulb From what I can tell, it's because he's internalized all of C++'s quirks and ways of doing things, and focuses more on the fact that Rust can't do certain things in as few lines of code as C++ can than the fact that Rust's way is cleaner or that you generally go about problems entirely differently in Rust.

    I am usually the resident C++ lawyer and I certainly can take great advantage of the C++ quirks. But I still think Rust is much better.


  • BINNED

    @pie_flavor said in C++ Stockholm Syndrome:

    There will come a day when Chrome uses a terabyte of RAM to run, and people will take it for granted.

    That day was in 2017.


  • Banned

    @topspin A terabyte is 1000 gigabytes, not 100.


  • kills Dumbledore

    @gąska said in C++ Stockholm Syndrome:

    So was the "640kB" quote back when it was made

    No it wasn't, because Bill Gates never said that



  • @jaloopa said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    So was the "640kB" quote back when it was made

    No it wasn't because Bill Gates never said that

    Interesting semantic argument: is the phrase "when the quote was made" actually dependent on whether the alleged quotation was actually spoken, or if it was made out of thin air....


  • Banned

    @jaloopa said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    So was the "640kB" quote back when it was made

    No it wasn't because Bill Gates never said that

    But if he had said it, it would be true.


  • Banned

    @thecpuwizard said in C++ Stockholm Syndrome:

    @jaloopa said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    So was the "640kB" quote back when it was made

    No it wasn't because Bill Gates never said that

    Interesting semantic argument: is the phrase "when the quote was made" actually dependent on whether the alleged quotation was actually spoken, or if it was made out of thin air....

    I meant the made up date it was allegedly said for the first time, not the date it was made up on.



  • @dkf said in C++ Stockholm Syndrome:

    I had applications that ran out of addressable space on (very expensive) workstations with 4GB of memory before that date.

    Only having 32-bit addresses to work with probably also influenced decisions when designing methods. For example, using memory mapping for large files isn't really that viable with only a 32-bit address space.

    By 2000, I was porting to 64-bit machines,

    From what I'm told, 64-bit workstations (SGI and similar) have been around for a long time. Apparently.


  • Discourse touched me in a no-no place

    @cvi said in C++ Stockholm Syndrome:

    From what I'm told, 64-bit workstations (SGI and similar) have been around for a long time. Apparently.

    Yeah, but actually kitting them out with more than 4GB of memory was incredibly expensive. Since my problem was that the application I was working on was incredibly memory hungry even with horrible memory compression tricks, there wasn't a great deal I could do other than port to an actual supercomputer. The guys at the supercomputer centre were quite upset that we didn't want to use lots of CPU cores; it was their really huge address space (for the time) that was interesting.

    That code would probably still bring modern systems to their knees. It's one of the few things I know of that can really use terabytes of memory without difficulty (due to the underlying problem being actually somewhere about EXP2SPACE).



  • @cvi said in C++ Stockholm Syndrome:

    @dkf said in C++ Stockholm Syndrome:

    I had applications that ran out of addressable space on (very expensive) workstations with 4GB of memory before that date.

    Only having 32-bit addresses to work with probably also influenced decisions when designing methods. For example, using memory mapping for large files isn't really that viable with only a 32-bit address space.

    By 2000, I was porting to 64-bit machines,

    From what I'm told, 64-bit workstations (SGI and similar) have been around for a long time. Apparently.

    The AMD Opteron came out in 2003, and while it was certainly not the first 64-bit processor, it was the CPU that largely brought x64 into practical use.

    One of the first 64-bit CPUs that ended up in workstations was the 21064, announced in February 1992.


  • Discourse touched me in a no-no place

    @thecpuwizard said in C++ Stockholm Syndrome:

    One of the first 64-bit CPUs that ended up in workstations was the 21064, announced in February 1992.

    I've used those. They had lots of memory and were slow as heck. :smiling_face_with_open_mouth_closed_eyes:



  • @bulb said in C++ Stockholm Syndrome:

    The Results are always there and insist on being handled, so while they make it a bit harder to prototype, they improve the final quality.

    I've always wondered how this works - do the error types have as much variety as C++ and Java? I'm mainly concerned about problems raised by this quote:

    Because we only have one bit of information (the operation failed or not), G cannot handle the error; typically it doesn’t know what is the error. So the only thing that it can do is return false. In general, this approach leads to a situation in which if something bad happens in the system, everybody returns false, and nobody knows what happen. This can have negative consequences on your software. If some small part doesn't work, even if the error can be ignored, it can make your whole software not work. You need to be very careful to avoid these situations.

    @bulb said in C++ Stockholm Syndrome:

    So why are you saying the Rust prospect is no good?

    https://what.thedailywtf.com/topic/20945/why-he-s-dropping-rust (this is the main one I remember scaring me off from Rust)
    https://what.thedailywtf.com/topic/22463/rust-s-fatal-flaw (though this seems to just be cosmetic, it surprises me that not much care was put into naming things)

    @pie_flavor said in C++ Stockholm Syndrome:

    From what I can tell, it's because he's internalized all of C++'s quirks and ways of doing things, and focuses more on the fact that Rust can't do certain things in as few lines of code as C++ can than the fact that Rust's way is cleaner or that you generally go about problems entirely differently in Rust.

    Nope, see the above linked topics.


    A Snake's Tale does give me some hope for Rust yet, at least.


  • BINNED

    @lb_ That part about "you can't easily port GUI libraries to Rust because they use class hierarchies and Rust doesn't support that" seems kind of scary.

    @Gąska Is that still true?



  • @dkf said in C++ Stockholm Syndrome:

    The guys at the supercomputer centre were quite upset that we didn't want to use lots of CPU cores; it was their really huge address space (for the time) that was interesting.

    Yeah. To some degree that's still true. Last time I applied for some time at one of the local centres, the question was how many core-hours I'd be using in a month. I wasn't sure, but my estimate was around maybe 200-300 (I mainly wanted to get at their GPUs with 16GB of memory for doing some scaling experiments). The amount was rounded up to the minimum of 5000 hours per month.


  • ♿ (Parody)

    @dkf said in C++ Stockholm Syndrome:

    That code would probably still bring modern systems to their knees. It's one of the few things I know of that can really use terabytes of memory without difficulty (due to the underlying problem being actually somewhere about EXP2SPACE).

    Soooo...what did it do? You've got me really curious now.


  • Banned

    @lb_ said in C++ Stockholm Syndrome:

    @bulb said in C++ Stockholm Syndrome:

    The Results are always there and insist on being handled, so while they make it a bit harder to prototype, they improve the final quality.

    I've always wondered how this works - do the error types have as much variety as C++ and Java? I'm mainly concerned about problems raised by this quote:

    Because we only have one bit of information (the operation failed or not), G cannot handle the error; typically it doesn’t know what is the error. So the only thing that it can do is return false. In general, this approach leads to a situation in which if something bad happens in the system, everybody returns false, and nobody knows what happen. This can have negative consequences on your software. If some small part doesn't work, even if the error can be ignored, it can make your whole software not work. You need to be very careful to avoid these situations.

    In Rust, an error can be an arbitrary type, though usually it's an arbitrary type implementing the Error trait. It can contain as much (or as little) info as you want, and as many auxiliary methods as you want. I haven't written any Rust in ages, but from what I've heard, there are some widely used crates that let you easily create custom error types with rich functionality like automatically chaining/wrapping errors of arbitrary types inside yours, which makes the ? operator work nicely everywhere, letting you forget about errors in 90% of places while still having them all statically checked, so there's no way it randomly blows up. All the benefits of checked exceptions, without the boilerplate of checked exceptions.
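
    A hand-rolled sketch of that pattern (the widely used crates - thiserror, for example - mostly just generate this boilerplate; the names here are invented):

    ```rust
    use std::{error::Error, fmt, fs, num::ParseIntError};

    // One error type for the module, wrapping the underlying causes.
    #[derive(Debug)]
    enum ConfigError {
        Io(std::io::Error),
        BadNumber(ParseIntError),
    }

    impl fmt::Display for ConfigError {
        fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
            match self {
                ConfigError::Io(e) => write!(f, "could not read config: {}", e),
                ConfigError::BadNumber(e) => write!(f, "bad number in config: {}", e),
            }
        }
    }

    impl Error for ConfigError {}

    // The From impls are what let `?` convert the wrapped errors automatically.
    impl From<std::io::Error> for ConfigError {
        fn from(e: std::io::Error) -> Self { ConfigError::Io(e) }
    }
    impl From<ParseIntError> for ConfigError {
        fn from(e: ParseIntError) -> Self { ConfigError::BadNumber(e) }
    }

    fn read_port(path: &str) -> Result<u16, ConfigError> {
        let text = fs::read_to_string(path)?; // io::Error -> ConfigError via From
        let port = text.trim().parse()?;      // ParseIntError -> ConfigError via From
        Ok(port)
    }

    fn main() {
        match read_port("port.txt") {
            Ok(p) => println!("port {}", p),
            Err(e) => eprintln!("{}", e),
        }
    }
    ```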

    https://what.thedailywtf.com/topic/20945/why-he-s-dropping-rust (this is the main one I remember scaring me off from Rust)

    That's from 2016. Rust is a very different language nowadays (though the code you wrote in 2016 still compiles and works). I don't have time to go through that article right now, though, so I can't say which parts exactly don't apply anymore.

    https://what.thedailywtf.com/topic/22463/rust-s-fatal-flaw (though this seems to just be cosmetic, it surprises me that not much care was put into naming things)

    You realize it's an April Fool's joke?


  • Banned

    @topspin said in C++ Stockholm Syndrome:

    @lb_ That part about "you can't easily port GUI libraries to Rust because they use class hierarchies and Rust doesn't support that" seems kind of scary.

    @Gąska Is that still true?

    Yes, it's very much true. Until we figure out how to make widgets without inheritance, there will be no good GUI library for Rust. And no, IMGUI doesn't count.



  • @gąska said in C++ Stockholm Syndrome:

    @ben_lubar people said the same about 32 bits in 2000.

    I'm not saying we'll never need more than 64 bits for address space. I'm just saying that if you need that much address space with currently-existing hardware, your motherboard must have an insane number of RAM slots and your computer is probably non-trivially affected by the speed of light.



  • @dkf said in C++ Stockholm Syndrome:

    That code would probably still bring modern systems to their knees. It's one of the few things I know of that can really use terabytes of memory without difficulty (due to the underlying problem being actually somewhere about EXP2SPACE).

    I'm reminded of a quantum chemistry computation that in optimized form is something like O(N^7). In unoptimized form it's O(N!) or worse, where N is the number of basis functions (usually ~3-10 per electron).

    And that's only the second-order term in an infinite series. We can go to a linearized version of the 3rd-order term, but it's...painful, even parallelized as much as possible.

    These will take hundreds of cluster cores 6+ months of CPU time (per core, not aggregate numbers) to run to completion, and that's for comparatively small molecules (N ~ 10000 or so).

    Edit: found a slide comparing the methods:

    [image: slide comparing the scaling of the methods]


  • Banned

    @ben_lubar said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    @ben_lubar people said the same about 32 bits in 2000.

    I'm not saying we'll never need more than 64 bits for address space. I'm just saying that if you need that much address space with currently-existing hardware, your motherboard must have an insane number of RAM slots and your computer is probably non-trivially affected by the speed of light.

    Fun fact: signal propagation is already a factor in the design of consumer-grade GPUs.

    Also, I know it's not a problem now and won't be a problem for at least another decade, but eventually we might get there, and all the in-place pointer metadata fuckery will bite us hard then, just like the last time it happened. And every bit of metadata packed into a pointer drastically reduces the time until it happens.


  • BINNED

    @gąska I understand that people use inheritance way too often when they shouldn't (the "composition over inheritance" argument), but why not allow it at all? That seems like a distinctly missing feature.



  • @gąska said in C++ Stockholm Syndrome:

    @topspin said in C++ Stockholm Syndrome:

    @lb_ That part about "you can't easily port GUI libraries to Rust because they use class hierarchies and Rust doesn't support that" seems kind of scary.

    @Gąska Is that still true?

    Yes, it's very much true. Until we figure out how to make widgets without inheritance, there will be no good GUI library for Rust. And no, IMGUI doesn't count.

    What? You have interfaces, right? You can make widgets that way. Just make sure there aren't any methods that return this and you can do it with composition.
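
    Roughly this kind of thing, as a sketch (the Widget trait and all names are made up):

    ```rust
    // A hypothetical widget interface.
    trait Widget {
        fn draw(&self);
        fn min_width(&self) -> u32;
    }

    struct Label {
        text: String,
    }

    impl Widget for Label {
        fn draw(&self) { println!("[{}]", self.text); }
        fn min_width(&self) -> u32 { self.text.len() as u32 }
    }

    // "Inheritance" by composition: the decorated widget owns a Label
    // and explicitly forwards the calls it doesn't change.
    struct BorderedLabel {
        inner: Label,
        border: u32,
    }

    impl Widget for BorderedLabel {
        fn draw(&self) {
            println!("+--------+");
            self.inner.draw(); // reuse the inner behaviour
            println!("+--------+");
        }
        fn min_width(&self) -> u32 {
            self.inner.min_width() + 2 * self.border
        }
    }

    fn main() {
        // Trait objects give the usual bag of heterogeneous widgets.
        let widgets: Vec<Box<dyn Widget>> = vec![
            Box::new(Label { text: "OK".into() }),
            Box::new(BorderedLabel { inner: Label { text: "Cancel".into() }, border: 1 }),
        ];
        for w in &widgets {
            w.draw();
        }
    }
    ```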


  • Discourse touched me in a no-no place

    @boomzilla said in C++ Stockholm Syndrome:

    Soooo...what did it do?

    Temporal logic model checker for simulated asynchronous hardware. It would attempt to take the (very small) simulation on all possible execution paths and see if interesting properties like “keep working” would hold; the vast amount of memory was used to work out whether the system state was in a previously seen state, which was important for comprehending the valuation of the formal logic. It worked… provided the base model was incredibly small. Scaling up was horrible, even with systems that were guaranteed finite (we had the hardware generation path worked out, which forced everything to be finite) so we started looking into using category theory to do major induced state space reductions, rather like what the very advanced compilers do now in a more limited way. This was also the same project where I wrote a graphical programming language and IDE for it.

    But we ran out of money and I transferred internally to a completely different area of computing. ;)


  • Banned

    @ben_lubar said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    @topspin said in C++ Stockholm Syndrome:

    @lb_ That part about "you can't easily port GUI libraries to Rust because they use class hierarchies and Rust doesn't support that" seems kind of scary.

    @Gąska Is that still true?

    Yes, it's very much true. Until we figure out how to make widgets without inheritance, there will be no good GUI library for Rust. And no, IMGUI doesn't count.

    What? You have interfaces, right? You can make widgets that way. Just make sure there aren't any methods that return this and you can do it with composition.

    But you can't make anything as robust as WPF or JavaFX or Qt. They are heavily based on implementation inheritance and selective overrides, and there's no easy way to achieve that without classes, short of sacrificing all ergonomics.
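
    For illustration, the closest trait-based approximation (hypothetical names): default methods give selective overrides of behaviour, but there is no inherited state and no super-call to fall back on.

    ```rust
    trait Control {
        fn label(&self) -> &str;

        // Default behaviour; an implementor may override it selectively.
        fn draw(&self) {
            println!("<{}>", self.label());
        }
    }

    struct Button { text: String }
    struct Checkbox { text: String, checked: bool }

    impl Control for Button {
        fn label(&self) -> &str { &self.text }
        // draw() comes from the default.
    }

    impl Control for Checkbox {
        fn label(&self) -> &str { &self.text }

        // Selective override. Note there is no `super.draw()` to delegate to,
        // and no inherited fields: each struct re-declares `text` itself.
        fn draw(&self) {
            let mark = if self.checked { "x" } else { " " };
            println!("[{}] {}", mark, self.label());
        }
    }

    fn main() {
        Button { text: "OK".into() }.draw();
        Checkbox { text: "Remember me".into(), checked: true }.draw();
    }
    ```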


  • Discourse touched me in a no-no place

    @benjamin-hall said in C++ Stockholm Syndrome:

    In unoptimized form it's O(N!) or worse

    EXPSPACE is similar to O(N!); it's a complexity class where the space required for a problem is exponential in the size of the input and where the time required is therefore bounded on the low side by that and may be quite a bit higher.

    Double EXP SPACE has space requirements proportional to e^e^N and is just awful. Problems in this class are so thoroughly not-scalable that they'll never really be the domain of ordinary programmers; finding ways to reduce the problem size is the main way forward, and that's into the scary part of CS-meets-Math.
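
    To get a feel for that growth (back-of-the-envelope, rounding liberally): e^5 ≈ 148, so e^e^5 ≈ e^148 ≈ 10^64, and at N = 10 it is already around 10^9566. Even single-digit inputs are far beyond any machine that could ever be built.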



  • @gąska said in C++ Stockholm Syndrome:

    selective overrides

    Does Rust not have the ability to replace a method that would be added by an embedded object?

    Anyway, speaking of UI library nightmares:


  • Banned

    @ben_lubar said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    selective overrides

    Does Rust not have the ability to replace a method that would be added by an embedded object?

    Rust doesn't have the ability to add methods via embedded objects.



  • @gąska said in C++ Stockholm Syndrome:

    You realize it's an April Fool's joke?

    No, oops. Thanks, I really didn't realize that. I hardly pay attention to dates >_<


  • Banned

    @topspin said in C++ Stockholm Syndrome:

    @gąska I understand that people use inheritance way too often when they shouldn't (the "composition over inheritance" argument), but why not allow it all? That seems like a distinctly missing feature.

    1. It doesn't play nice with Hindley-Milner type inference.
    2. It duplicates the functionality of traits, and at the same time is completely incompatible with them - if Rust had both, it would cause a huge split in the ecosystem between class-based libraries and trait-based libraries. So it's best to have one or the other - and traits are superior in everything except GUI.

  • Impossible Mission - B

    @gąska said in C++ Stockholm Syndrome:

    Yes, it's very much true. Until we figure out how to make widgets ~~without~~ get proper support for inheritance, there will be no good GUI library for Rust. And no, IMGUI doesn't count.

    FTFY. Language designers need to get over this whole hipster "we don't really need OO" phase. It doesn't work in Go and it doesn't work in Rust. There's a reason OOP has spent the last couple decades taking over the world in ways no other paradigm has: it's what actually works.


  • Impossible Mission - B

    @gąska said in C++ Stockholm Syndrome:

    1. It doesn't play nice with Hindley-Milner type inference.

    That's news to me, and to everyone else who's worked on an OO compiler that uses type inference... 😕

    2. It duplicates the functionality of traits,

    Maybe

    and at the same time is completely incompatible with them

    This statement positively reeks of a lack of imagination. It shouldn't be all that difficult to get the two to play nicely together, unless I'm completely misunderstanding what traits are...

    and traits are superior in everything except GUI.

    ...which is why trait-oriented programming has spent the last couple decades taking over the world, amirite?



  • @gąska said in C++ Stockholm Syndrome:

    So it's best to have one or the other - and traits are superior in everything except GUI.

    Superior for everything except the one thing that matters.


  • Banned

    @masonwheeler said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    Yes, it's very much true. Until we figure out how to make widgets ~~without~~ get proper support for inheritance, there will be no good GUI library for Rust. And no, IMGUI doesn't count.

    FTFY. Language designers need to get over this whole hipster "we don't really need OO" phase.

    Rust is very OO. Just differently. Java programmers need to get over this whole "there is no other way to do things than what I learned when I was 19" phase.

    There's a reason OOP has spent the last couple decades taking over the world in ways no other paradigm has: ~~it's what actually works.~~ there were hundreds of billions of dollars pumped into languages that just happened to have class inheritance, similar to how JavaScript and PHP took over frontend and backend respectively despite being absolutely not suited for it, or for anything else to be frank.

    FTFY. You underestimate the effect of chaotic randomness on technology.


  • Banned

    @blakeyrat said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    So it's best to have one or the other - and traits are superior in everything except GUI.

    Superior for everything except the one thing that matters.

    We've been over this.


  • Banned

    @masonwheeler said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    1. It doesn't play nice with Hindley-Milner type inference.

    That's news to me, and to everyone else who's worked on an OO compiler that uses type inference... 😕

    Why do you think I said "Hindley-Milner type inference", and not just "type inference"?

    @masonwheeler said in C++ Stockholm Syndrome:

    and at the same time is completely incompatible with them

    This statement positively reeks of a lack of imagination. It shouldn't be all that difficult to get the two to play nicely together, unless I'm completely misunderstanding what traits are...

    Traits have some properties that classes cannot have, and vice versa - which makes it impossible to make any sort of wrapper for a trait to behave like a class, or for a class to behave like a trait. If you had both, every time you wanted to use polymorphism, you'd have to decide whether to go with classes or with traits - and stick to it for the next 20 years (or until the next rewrite from scratch, or until your company goes bankrupt).

    @masonwheeler said in C++ Stockholm Syndrome:

    ...which is why trait-oriented programming has spent the last couple decades taking over the world, amirite?

    No you're not.


  • Impossible Mission - B

    @gąska said in C++ Stockholm Syndrome:

    Why do you think I said "Hindley-Milner type inference", and not just "type inference"?

    And what about it? Looking over this series of articles on the H-M formalisms I don't see anything that contradicts standard OO principles or would be obviously incompatible with them.

    Traits have some properties that classes cannot have, and vice versa

    ...such as?

    which makes it impossible to make any sort of wrapper for a trait to behave like a class, or for a class to behave like a trait.

    Not what I meant. Trying to make a class be a trait, yeah, obviously that's a dumb idea. Trying to make a class have a trait, on the other hand, what's wrong with that?



  • @dkf said in C++ Stockholm Syndrome:

    Double EXP SPACE has space requirements proportional to e^e^N and is just awful.

    Sounds fun, where do I sign up?

    And here people were poking fun at me for proposing a solution that just needs to reduce memory requirements by a factor of 1000 to become viable for the problem sizes that I want to target.


  • BINNED

    @gąska said in C++ Stockholm Syndrome:

    1. It doesn't play nice with Hindley-Milner type inference.

    I know what type inference is, NFC what that specific type is. (yes, yes, google... :kneeling_warthog: )

    2. It duplicates the functionality of traits, and at the same time is completely incompatible with them - if Rust had both, it would cause a huge split in the ecosystem between class-based libraries and trait-based libraries. So it's best to have one or the other - and traits are superior in everything except GUI.

    I completely do not follow.
    If traits duplicate that functionality, why can't you use traits to achieve the same thing? And in what way is a GUI example like "Widget <- TextBox <- RichTextBox" so special that GUIs are the only thing that doesn't work? The same kind of subtyping is used in a lot of places.

    Can you explain this in a simpler (or more concrete) way?


  • Banned

    @masonwheeler said in C++ Stockholm Syndrome:

    @gąska said in C++ Stockholm Syndrome:

    Why do you think I said "Hindley-Milner type inference", and not just "type inference"?

    And what about it? Looking over this series of articles on the H-M formalisms I don't see anything that contradicts standard OO principles or would be obviously incompatible with them.

    There was some discussion in one Rust RFC or another where someone explained why exactly class inheritance makes it harder to infer types, but CBA to look it up right now. Feel free to do it yourself if you want to. I think it involved trait method resolution too?

    Traits have some properties that classes cannot have, and vice versa

    ...such as?

    Traits can be implemented outside the type definition. Base classes can have fields in addition to methods. Just for starters.
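
    A quick illustration of the first point (the trait is invented; the type it's implemented for is std's Vec<u8>, whose definition obviously isn't ours to touch):

    ```rust
    // A local trait...
    trait BriefSummary {
        fn summary(&self) -> String;
    }

    // ...implemented for a type defined elsewhere (std's Vec<u8>),
    // without touching or even seeing that type's definition.
    impl BriefSummary for Vec<u8> {
        fn summary(&self) -> String {
            format!("{} bytes, first = {:?}", self.len(), self.first())
        }
    }

    fn main() {
        let data = vec![1u8, 2, 3];
        println!("{}", data.summary());
    }
    ```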

    • which makes it impossible to make any sort of wrapper for trait to behave like a class, and for class to behave like a trait.

    Not what I meant. Trying to make a class be a trait, yeah, obviously that's a dumb idea. Trying to make a class have a trait, on the other hand, what's wrong with that?

    Okay, maybe not completely incompatible, but incompatible enough (especially in the other direction) that it would result in a split ecosystem.


  • Impossible Mission - B

    @gąska said in C++ Stockholm Syndrome:

    Traits can be implemented outside type definition. Base classes can have fields in addition to methods. Just for starters.

    This is true, but I don't see how it makes the two incompatible, just different.

    Okay, maybe not completely incompatible, but incompatible enough (especially in the other way) that it would result in split ecosystem.

    What other way?

