𝄯 Sharp Regrets: Top 10 Worst C# Features



  • @Bulb said:

    No, you broke it pretty badly. It takes a lot of SFINAE magic to do it correctly.

    I'm curious. Not saying there aren't probably some gotchas in there, but how exactly is it broken (except for missing parens when using the ! operator)? I can only imagine some confusion if you abuse implicit conversions, which in ambiguous cases might require you to specify which types you want the two arguments compared as.



  • @mrguyorama said:

    I would have to disagree about for(;;) being less intuitive than foreach(). For someone just learning programming concepts, understanding that the alias you create in the foreach() declaration IS the object you want, instead of explicitly pulling it out using an index was certainly harder for me to comprehend.

    Unlearn indexing. Indexing is evil.

    For some reason I really hated indexing from the start. I always preferred iterating with the pointer. It is better and it is more flexible, because iterators work for all kinds of collections, but there are no indices for maps and sets and linked lists and other data structures that are not random access.

    @mrguyorama said:

    Also, does foreach() work on an object with no iterator?

    An object with an indexing operator should also have an iterator. But actually yes, it does. You can use iota (`Range`, or whatever it is called in C#) to generate an iterator over the numbers 0..n-1 and index the object.

    On the other hand, for objects that have an iterator and no index (and there are many sensible cases of that), foreach is much easier.
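    A minimal C# sketch of the iota trick above, assuming the collection is a plain array (`Enumerable.Range` plays the role of iota):

```csharp
using System;
using System.Linq;

class RangeIndexing
{
    static void Main()
    {
        var items = new[] { "a", "b", "c" };

        // Enumerable.Range generates the numbers 0..n-1 lazily;
        // foreach then uses each one to index the object.
        foreach (var i in Enumerable.Range(0, items.Length))
        {
            Console.WriteLine($"{i}: {items[i]}");
        }
    }
}
```

    This keeps the foreach shape even when you genuinely need the index.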



  • My professor in one of our C classes once tried to help us better understand C type declarations by creating one with seventeen parts. I would hate to see that in actual production code. C can be utterly brutal sometimes.

    LINQ seems useful, if a bit hard for someone with a "a program is one single simple instruction at a time" mindset, but definitely superior to a loop.

    I believe Java 8's new collection system (streams) might be similar?

    Mostly this topic has taught me that I need to expedite my learning of C#: very, very advanced C#.



  • @mrguyorama said:

    LINQ seems useful, if a bit hard for someone with a "a program is one single simple instruction at a time" mindset,

    That hasn't been a correct way to think about programming for many, many decades. Part of the design process of C++'s Standard Template Library was to remove the concept of indexes and replace it with iterators, and that was done, what, early-90s? Old enough that I learned it in college.

    Heck, SQL has never been part of that mindset, and it's older than most other languages that are still around. LINQ is mostly just the adoption of SQL ideas in a new environment.
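    For anyone who hasn't seen it, a small sketch of what that SQL adoption looks like in C#'s query syntax (the data here is made up for illustration):

```csharp
using System;
using System.Linq;

class LinqSqlIdeas
{
    static void Main()
    {
        var words = new[] { "select", "from", "where", "order", "by" };

        // SQL-flavoured query syntax: filter, sort, project, no index in sight.
        var query = from w in words
                    where w.Length > 4
                    orderby w
                    select w.ToUpper();

        Console.WriteLine(string.Join(", ", query)); // ORDER, SELECT, WHERE
    }
}
```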



  • @Kian said:

    Not saying there aren't probably some gotchas in there, but how exactly is it broken

    Unrestricted templates like this are likely to make many expressions ambiguous. Because the compiler only matches the signatures of the templates to decide which overloads are viable, the unrestricted overloads will be viable for many types that either shouldn't be comparable at all, or that have their own correct definitions using SFINAE-restricted templates where the precedence rules won't resolve the conflict.

    It is possible to do it, but you need to define a trait and SFINAE-restrict the templates to the types that opt in by defining a specialization of the trait.



  • @blakeyrat said:

    That hasn't been a correct way to think about programming for many, many decades.

    This is the one thing holding me back from being a better programmer. I have a hard time stepping out of the "how does this work at the instruction level" mindset to work with more powerful features. Although I was able to pick up object-oriented concepts very quickly, soo...

    When you teach the silly little kiddies in a COS 100 class to do basic "hello world" shit, you teach indexes, and worse, nobody ever taught us that indexes were bad, not even in 400 level classes.

    Anyone care to elaborate why they are so evil?



  • @blakeyrat said:

    LINQ is mostly just the adoption of SQL ideas in a new environment.

    I just wonder why LINQ uses a completely different order than either SQL or the “list comprehensions” of Python or Haskell in its SQL-like syntactic sugar. Made worse by the fact that it uses the same keywords as SQL, so one would expect it to be similar. And bam, it isn't.



  • @mrguyorama said:

    Anyone care to elaborate why they are so evil?

    Because you are more productive if you think in the high level concepts. The functional programming techniques are difficult to wrap your head around if you come with that mindset, but they get you to your destination much faster.



  • So not exactly that indexes themselves are bad, just that there often exists a better way to do it.



  • @Bulb said:

    But comparing for(;;) to while() is the only comparison that makes sense. Because for(;;) is an alternate syntax for while(). For many cases a superior one, if for no reason then because it makes it less likely you forget the step expression.

    This is all true, but as I pointed out earlier, a less complicated syntax is possible that would cover the overwhelming majority of cases where you are doing an indefinite loop for something other than an iterable collection - not that there would be many of those cases in a language like C#, but it comes up in C and C++ a lot, and C# followed Java's lead in blindly copying C's core syntax even when it wasn't really necessary to do so.

    The real question, I guess, is why did Dennis Ritchie come up with that for() syntax in the first place? My understanding is that he was basing the C syntax on BCPL's, which in turn was based on Algol 60's (I know I skipped B, but that was more of an intermediate development stage of C than anything). Algol 60 had an overly elaborate FOR syntax (though nowhere near the wackiness in Algol 68), which BCPL mostly kept, and Ritchie actually pared it down a lot to get the C syntax. However, he still wanted the explicit initializer/test/iterator format so he could iterate through pointers directly rather than by an index.

    Pointer iteration isn't an issue in either Java or C#, but they kept that syntax anyway out of inertia and familiarity. Didn't someone just say that "we've always done it that way" is a bad reason to do things? I may not agree with @blakeyrat on many things, but he's dead right about that.

    In any case, the issue is mostly moot, as the foreach syntax is usable for almost all cases today.



  • @mrguyorama said:

    So not exactly that indexes themselves are bad, just that there often exists a better way to do it.

    evil, n.:

    It means such and such is something you should avoid most of the time, but not something you should avoid all the time. For example, you will end up using these “evil” things whenever they are “the least evil of the evil alternatives.”

    (C++ FAQ)


  • ♿ (Parody)

    @Bulb said:

    Parsing is the process done by the compiler, not people.

    What do you suppose happens when you read code?



  • @mrguyorama said:

    When you teach the silly little kiddies in a COS 100 class to do basic "hello world" shit, you teach indexes, and worse, nobody ever taught us that indexes were bad, not even in 400 level classes.

    I learned absolutely nothing useful about computer programming in computer science courses at university.

    @mrguyorama said:

    Anyone care to elaborate why they are so evil?

    They're not evil; they're just indirection, and utterly unnecessary indirection at that.

    With a for(;;) loop, you have this index i. Why is i there? Is that the object you're working on? No, you still need to pull the object you're working on out of the list, i just tells you which one to pull. Well, ok I guess. Then you have this range check. What's the lower bound of i? Well, 99.999% of the time it's zero. What's the condition? Well, 99.999% of the time, it's i should be less than the length of the list.

    Well that's irritating. Since we use for(;;) loops the same way 99.999% of the time, why don't we make a version where:

    1. You don't need to expressly say you want every index in the container
    2. You don't have to expressly look up the object you actually want to perform the operation on, it's just right there
    3. You can do all the "IEnumerable magic". (In C#, this means your loop can make use of IEnumerable providers that yield, which can potentially greatly increase performance. There's a lot of other "IEnumerable magic", but yield is probably the bit most relevant to simple loops.)

    Now that's not to say the for(;;) loop shouldn't exist. It does, and it's handy if you just want a quick and dirty look at every pixel in an Image, for example. But for normal business-type applications, it should be completely unnecessary.
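    A sketch of point 3 above: `yield` lets a foreach consume a lazily produced (here even infinite) sequence, so only the items actually requested ever get computed. The method name is invented for illustration:

```csharp
using System;
using System.Collections.Generic;

class YieldDemo
{
    // An iterator method: yield return hands items to the caller one at a
    // time, so this "infinite" sequence is perfectly safe to enumerate.
    static IEnumerable<int> Squares()
    {
        for (int i = 0; ; i++)
            yield return i * i;
    }

    static void Main()
    {
        foreach (var sq in Squares())
        {
            if (sq > 20) break;   // only 0, 1, 4, 9, 16 are ever computed
            Console.WriteLine(sq);
        }
    }
}
```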



  • @Bulb said:

    I just wonder why LINQ uses a completely different order than either SQL or the “list comprehensions” of Python or Haskell in its SQL-like syntactic sugar.

    Meh. LINQ is expressly a new syntax; they never promised otherwise. It's always been "SQL-like", not "SQL". Also, I doubt many C# developers know the conventions of Python or Haskell.

    I think it's safe to say that syntax is mostly unused nowadays, anyway.


  • Banned

    @xaade said:

    I don't

    But you just said... ah, forget it.

    @xaade said:

    ++A modifies its own argument.

    @xaade said:

    Or any other loop really, where you put the pointer at the first index and let post-increment iterate for you.

    How often do you use pointers in loops outside of C?

    @mrguyorama said:

    When you teach the silly little kiddies in a COS 100 class to do basic "hello world" shit, you teach indexes, and worse, nobody ever taught us that indexes were bad, not even in 400 level classes.

    It's because teachers are shitty programmers. If they weren't, they wouldn't be teachers.

    @mrguyorama said:

    Anyone care to elaborate why they are so evil?

    Slow when indices are checked for being out of range, unsafe when not.

    @mrguyorama said:

    So not exactly that indexes themselves are bad, just that there often exists a better way to do it.

    In programming, that's the definition of bad.



  • @blakeyrat said:

    I learned absolutely nothing useful about computer programming in computer science courses at university.

    QFT. Though it may depend on the university. I have no trouble believing that some can do it right, especially after watching some of MIT's old Lisp videos. Those are wonderful, because they teach from the right direction.



  • That is a great explanation. LINQ looks pretty useful.

    In regard to the COS courses at uni thing, I never understood why people say that. Could I have managed my current job without my schooling? Yeah, but I would have way less experience and would have taken way way longer to figure things out. Although this may have something to do with my COS program being very well done, with professors who know what they are doing.



  • @mrguyorama said:

    That is a great explanation. LINQ looks pretty useful.

    I should also add that if you adopt LINQ (and you should), you really won't ever use foreach() either. LINQ operations can replace that in 99.9% of cases. (Well, 100%, but there are some cases where the foreach() version would be significantly more readable.)
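    For example (illustrative data), a foreach and its LINQ replacement side by side:

```csharp
using System;
using System.Linq;

class ForeachToLinq
{
    static void Main()
    {
        var prices = new[] { 9.99m, 4.50m, 12.00m };

        // foreach version:
        decimal total = 0;
        foreach (var p in prices)
            if (p > 5) total += p;

        // Equivalent LINQ pipeline: the explicit loop disappears entirely.
        decimal linqTotal = prices.Where(p => p > 5).Sum();

        Console.WriteLine(total == linqTotal); // True
    }
}
```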



  • @ScholRLEA said:

    This is all true, but as I pointed out earlier, a less complicated syntax is possible that would cover the overwhelming majority of cases where you are doing an indefinite loop for something other than an iterable collection - not that there would be many of those cases in a language like C#, but it comes up in C and C++ a lot, and C# followed Java's lead in blindly copying C's core syntax even when it wasn't really necessary to do so.

    The BASIC/Pascal-style `for i := 1 to n` is well covered by foreach:

    `foreach (var i in Range(1, n))`
    

    (just not sure where C# has Range, because I haven't used it in a while; I know where to find it in Python)

    So the for(;;) is only for the cases where you need the full power of while, but can organize it in this slightly more structured form.

    @boomzilla said:

    What do you suppose happens when you read code?

    Some kind of comprehension that does not really follow the structure of the grammar.

    @blakeyrat said:

    Meh. LINQ is expressly a new syntax; they never promised otherwise.

    Yes. But a syntax that follows pre-existing conventions is easier to adopt, and they chose not to follow them.

    @blakeyrat said:

    I think it's safe to say that syntax is mostly unused nowadays, anyway.

    And I suspect it is not used because of its strange order of forms.

    I did read somewhere that the authors thought the SQL order is illogical and changed it. But then I checked and the mathematical set notation, list comprehensions and SQL all use the same order, so it is very well established:

    Math: { what : what ∈ collection, condition }
    Python: (what for what in collection if condition)
    SQL: select what from collection where conditions

    Only LINQ differs.
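    In C# terms (illustrative example), the difference is that LINQ leads with the source rather than the projection:

```csharp
// SQL puts the projection first:  SELECT Name FROM people WHERE Age > 30
// LINQ puts the source first: the `from` clause leads, `select` comes last.
using System;
using System.Linq;

class ClauseOrder
{
    static void Main()
    {
        var people = new[] { ("Ann", 42), ("Bob", 25), ("Cy", 61) };

        var names = from p in people      // source first...
                    where p.Item2 > 30    // ...then the condition...
                    select p.Item1;       // ...projection last

        Console.WriteLine(string.Join(",", names)); // Ann,Cy
    }
}
```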



  • The fact is, the main value a degree gets you is that automated filters are less likely to ignore you. Mine at least also got me some exposure to different things, but apart from this one crazy German guy (happiest man in the universe. Not even kidding!), the lecturers were all horribly lazy and disorganized.

    I learned more trying to build a solar system than I did in class. I learned far more reading the C# language spec than my university had to teach. And this isn't terribly uncommon, from what I can tell. Universities are pretty much only capable of teaching old things, because that's what you can build a curriculum for. And unless they start you with CL, they'll teach you how to be bad at Java or C.



  • @Gaska said:

    It's because teachers are shitty programmers

    ehh, in this case, it's because the students all suck and take 16 hours to write "hello world" level programs in java. It gets better after the 100 level classes.

    @Gaska said:

    Slow when indices are checked for being out of range, unsafe when not.

    So you're saying that there is nothing going through a list of items on the processor? In the end, there is still a pointer somewhere getting iterated and checked for bounds, unless every collection is set up like a linked list.

    @Gaska said:

    In programming, that's definition of bad

    In the context of business applications and professional systems, maybe not so much in personal projects

    @blakeyrat said:

    I should also add that if you adopt LINQ (and you should),

    I will adopt LINQ as soon as Microsoft adds it to Java /s (and I understand it well enough of course)

    Also of note is that no matter my personal preference, my company's code base uses foreach(), so not going to break the mold for stupid reasons



  • At my university, we had a class that every single COS major must take that basically sets up a situation where you build a massive code base and actually learn how to write code in a business environment. It was wonderful and helped me realize that I can't possibly understand every line of code in the (27K) code files our program has.

    Our class on programming language design had significant emphasis on "Here is the Java specification, don't just read it, understand it, or I fail you".

    And Algorithm Design was a great class to teach you to think about problems in an efficient way.

    About the only thing I didn't learn is Swing (and .net, but that's because that professor left) and I fixed that with a quick pet project.



    Yeah, where I was, it seemed that none of the professors talked to each other at all. We'd have a semester-long class with only two projects, randomly in Prolog. Some random Unix class that had us do a bit of Ruby. We had a class where the lecturer tried to explain C# to us, badly, for half the semester. And projects were downright rare.

    Tip for anyone planning to attend the University of Auckland's CS courses: You'd better be willing to learn everything outside class. Otherwise all you get is the ability to not be filtered out by employers as quickly.



  • @Bulb said:

    It is post-increment that is redundant. It is the odd man out here.

    Do you know the meaning of the word ‘redundant’?



  • Linked for you. Despite their obvious age, these are really great, and can teach any programmer a lot - regardless of the languages they use - if they have the patience to let them. Yes, they are using Scheme as the language they are teaching in, and are not even using most of the language at that; yes, they really lay the Lisp evangelism on with a trowel; yes, they are thirty years old, and it shows in the technology they are using; yes, they go into detail on subjects that on the surface seem to be either esoteric or self-evident to the overwhelming majority of programmers.

    None of that matters.

    They needed to use a Lisp because it lets you poke into obscure corners and explore features that other languages just hand to you without ever questioning the fundamental assumptions beneath them. They chose Scheme specifically because it is so minimal that they can walk you through building the whole thing up from the bottom without the language getting in the way. They go poking around in those areas because understanding them well makes you a better programmer, and because it lets them explain a lot of things that people otherwise have a lot of trouble grasping.

    Also, the videos demonstrate that it is possible to teach this material in an engaging way. Courses based on Scheme tend to get a bad rap, partly because they don't seem to connect to modern programming in a useful way, and partly because they seem boring. The first misses the point (that understanding the principles first is a Good Thing), but the second is mostly the fault of craptastic professors who shouldn't be teaching in the first place. The enthusiasm Abelson and Sussman have for their material in these videos is anything but boring.



    Exactly. They go about things from an abstract level, not from a 'well first you need a number, but actually that number is some bits somewhere that you need to point to, so make sure you allocate enough memory and-' level. They teach you what you need to build a program.



  • Aww man, I LIKE the deep down nitty gritty level. Our COS major also requires a course on boolean/digital logic. Also an awesome course. It encouraged me to design my own processor.

    @ScholRLEA said:

    yes, they really lay the Lisp evangelism on with a trowel

    I actually have the book the course uses. The first ten pages are "This is why lisp is the best thing that ever happened to computers and you should feel ashamed for ever having used anything that isn't scheme" Both funny and kind of weird



  • @mrguyorama said:

    In regard to the COS courses at uni thing, I never understood why people say that. Could I have managed my current job without my schooling? Yeah, but I would have way less experience and would have taken way way longer to figure things out. Although this may have something to do with my COS program being very well done, with professors who know what they are doing.

    I had many courses that had their worth, but they were not the ones that taught the programming languages.

    I consider the “Introduction to Complexity and NP-Completeness” and the following “Computability” pretty important, as I do the “Data Structures” and probably the course where we learned the basics of grammars, the name of which I don't remember. And from the “Programming” courses, the parts where we learned algorithms and the introduction to non-procedural programming (= functional and logic (= Prolog)). And then of course the big team “Project” where we were just let loose to produce something reasonably large and complete.

    On the other hand the basic courses in Pascal (still back then) and C/C++ just taught poor obsolete practices and I already knew most of the things better anyway.

    @mrguyorama said:

    At my university, we had a class that every single COS major must take that basically sets up a situation where you build a massive code base and actually learn how to write code in a business environment. It was wonderful and helped me realize that I can't possibly understand every line of code in the (27K) code files our program has.

    That's great. That's what every university should include. Unfortunately, only a few do (we had it in the “Project”, though it was kind of poorly supervised).



  • @Buddy said:

    Do you know the meaning of the word ‘redundant’?

    Yes, I do. Post-increment is redundant, because you can always simply move it to the next expression:

    `whatever(i++)` → `whatever(i); ++i`

    and it will be more readable and not prone to sequence-point issues, and the rewrite does not have any big downsides.
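    A quick C# sketch of that rewrite (the function name is invented), showing the two forms are equivalent:

```csharp
using System;

class PostIncrementRewrite
{
    static void Whatever(int x) => Console.WriteLine(x);

    static void Main()
    {
        int i = 0;
        Whatever(i++);   // passes 0, then i becomes 1

        int j = 0;
        Whatever(j);     // same effect, spelled out...
        ++j;             // ...with the increment on its own line

        Console.WriteLine(i == j); // True
    }
}
```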



  • @Bulb said:

    but they were not the ones that taught the programming languages.

    A programming language is merely a tool. It has very little to do with programming. Programming languages should only be taught as part of COS out of necessity. We only teach Java (one class) and C (a combined learn c/unix class) and then the programming languages class teaches you to understand the concept of a programming language so you can pick it up in a weekend. Everything else is designed to be about COS, not learning programming. Needless to say, most of those who can't hack it quit by the third COS class and switch to Information Technology (or "COS for dummies" as my IT friend says)



  • Ours did the big filtering in the second semester. You had to take 105, which was about data structures. That halved the total population, and quartered the female population straight away. Then 205 or whatever the next year got rid of the stragglers, so only the truly focused/insane were left.



  • @mrguyorama said:

    the programming languages class teaches you to understand the concept of a programming language so you can pick it up in a weekend

    Then one of your professors wants to do everything in Standard ML or FRIL or something you have no hope of understanding right away...

    A language is more than just syntax, it's a way of representing ideas (semantics). Different languages support different ways of doing this and they take time to appreciate.

    @mrguyorama said:

    A programming language ... has very little to do with programming.

    Having to learn all the little details of a language may be thought of as a necessary evil, but to say that language has very little to do with communication... that's fucking incorrect.



  • @Yamikuronue said:

    I feel like all the valid criticisms would go away if you defined ++ to return void rather than return a value at all, in both prefix and postfix positions.

    Nemerle does this. (Although, I'll admit I'm not sure it even has a prefix ++ operator, seeing as it'd be identical in behaviour to the postfix one...)

    Nemerle also uses value: type notation, which now I'm used to it, I think I actually prefer.

    Oh yeah, also in Nemerle, module is a keyword meaning static class. (If it looks like I'm trying to imply Nemerle is a better C# than C#, then, yeah...)

    Bonus side rant: back in C++/C# land, ++x increments x and then uses the value in an expression. x++ uses the value of x in the expression and then increments it. x++: use, then increment; ++x: increment, then use. I think Lippert just learned the wrong mnemonics...
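    The difference in one runnable C# snippet:

```csharp
using System;

class IncrementOrder
{
    static void Main()
    {
        int x = 5;
        Console.WriteLine(++x); // 6: increment first, then use the value
        int y = 5;
        Console.WriteLine(y++); // 5: use the old value, then increment
        Console.WriteLine(y);   // 6: the increment did happen
    }
}
```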



  • @dkf said:

    Some languages have a unit type, with exactly one value of that type (also called unit usually). Since there's only one value, it doesn't need to be communicated around as it cannot actually be carrying any information. It works like void but without the suck!

    In Nemerle, the singular value of the void type is spelled (); you can thank the ML family for that...



  • @blakeyrat said:

    that syntax is mostly unused nowadays

    Don't know about actual code in the wild (my company has never really embraced LINQ), but it seems all StackOverflow answers I hit use it. Which pisses me off, because I never really bothered to learn it.

    Extension methods just make so much more sense once you embrace the concept of lambdas being (ab)used as property selectors, which should take like 5 minutes. Then it's just chaining methods. Plus you can easily check what you get at each step in order.
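    A small illustration (the type and data are made up) of lambdas as property selectors chained through extension methods, readable one step at a time:

```csharp
using System;
using System.Linq;

class MethodChaining
{
    record Person(string Name, int Age);

    static void Main()
    {
        var people = new[] { new Person("Ann", 42), new Person("Bob", 25) };

        // Each lambda is just a selector: "given a person, pick this bit".
        // The chained extension methods read top to bottom.
        var names = people
            .Where(p => p.Age > 30)     // keep the over-30s
            .OrderBy(p => p.Name)       // sort by the Name property
            .Select(p => p.Name);       // project down to the names

        Console.WriteLine(string.Join(",", names)); // Ann
    }
}
```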



  • @xaade said:

    I don't like empty statements, except that they're useful for goto, the most baneful thing in modern programming.

    Nemerle doesn't have goto, but if you invoke a block label like it was a function (inside that labelled block), it will exit the block. This sounds nuts, but it's actually fairly easy to get used to.



  • @Bort said:

    Standard ML or FRIL

    We covered the core concepts of the different classes of programming languages, so we at least wouldn't be totally lost on our first encounter

    @Bort said:

    that's fucking incorrect

    Yes

    What I meant was that teaching programming languages should be a very very small part of any COS degree

    numStupidThingsIveSaid++;

    Error Uncaught Exception: NumberOverflowException



  • The Right way to do it is ++numStupidThingsIveSaid;



    In my mind, it makes sense as "take this variable, and increment it", and it holds a special place in my mental model. I also think using it in any way other than a solitary statement completely on its own is bad practice. I like code to try to only do one thing at a time. If a smart compiler is going to reduce it however it wants anyway, why not write it in the most obvious way possible?



  • Actually add this to things in which C# is lacking - no nice way of doing the "x => x.SomeProperty" selectors. Most of the framework just does strings, and while expression trees are a step forward, you occasionally run into type issues (Expression<Func<SomeType, object>> does not generally give out a MemberExpression, you need generics), and there's nothing guaranteeing the expression is actually a property selector, and not some magic that happens to match the type.
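    One common workaround, sketched here with invented names: make the property type a generic parameter so the compiler hands you a MemberExpression rather than one wrapped in a Convert node (which is what `Expression<Func<T, object>>` produces for value-type properties). This is only a sketch, and it still can't guarantee at compile time that the lambda is a plain property access; the runtime check is the best it can do:

```csharp
using System;
using System.Linq.Expressions;

class SelectorInspection
{
    // TProp avoids the Expression<Func<T, object>> boxing problem:
    // value-type properties arrive as a bare MemberExpression here.
    static string PropertyName<T, TProp>(Expression<Func<T, TProp>> selector)
    {
        if (selector.Body is MemberExpression member)
            return member.Member.Name;
        throw new ArgumentException("Not a simple property selector", nameof(selector));
    }

    record Person(string Name, int Age);

    static void Main()
    {
        Console.WriteLine(PropertyName((Person p) => p.Name)); // Name
        Console.WriteLine(PropertyName((Person p) => p.Age));  // Age
    }
}
```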



  • @hungrier said:

    Reserved words? No, the C way would be to add yet another meaning to `static`, an already used keyword.

    Filed under: C++C



  • @mrguyorama said:

    If a smart compiler is going to reduce it however it wants anyway, why not write it in the most obvious way possible?

    I've been writing a lot of C++ lately. Working with iterators kinda trains you to use ++i, so much so that seeing i++ in my old C# code looks wrong (it's not wrong, just looks weird).



  • @Maciejasjmj said:

    but it seems all StackOverflow answers I hit use it.

    I only see it on older questions that were answered back when it was most current. But hey, whatever, StackOverflow sucks anyway.



  • @Bulb said:

    I just wonder why LINQ uses completely different order than either SQL or the “list comprehensions” of Python or Haskell in it's SQL-like syntactic sugar.

    The order Microsoft uses makes IntelliSense work better.



  • @mrguyorama said:

    So you're saying that there is nothing going through a list of items on the processor? In the end of it, there is still a pointer somewhere getting iterated and checked for bounds, unless every collection is set up like a linked list

    Counter example:

    using System;
    using System.IO;
    
    class Program
    {
        static void Main()
        {
            // File.ReadLines streams the file lazily: there is no list
            // and no index, and the whole file is never in memory at once.
            foreach (string line in File.ReadLines("c:\\file.txt"))
            {
                Console.WriteLine("-- {0}", line);
            }
        }
    }
    


  • @Bulb said:

    `whatever(i++)` → `whatever(i); ++i`

    OK... this is the argument that officially kills my affection* for post-increment**.

    I'll even try to write my for(;;) loops as for(;;++i)

    The for loop with an i index lives on for all us math types who live off sequences and sums that are functions of i to begin with.

    *affection: n, that sense of familiarity from doing things the way we've always done them. See also "habit"

    **Aaah. The horror! I had managed to forget the swaths of our codebase where our Master Complicator™ uses whatever(i,j,k,m,n,p,q,r,s++) ...



  • @dkf said:

    However, if we stick to just Eric's top ten: array covariance is fundamentally broken. It's broken in Java too. Some design decisions are just plain wrong.

    There's a reason that the Java community will tell you to prefer lists to arrays. Hell, it's even the name of one of the items in Effective Java, Second Edition (Item 25 specifically).

    The example given there is contrived, but:

    // Fails at runtime!
    Object[] objectArray = new Long[1];
    objectArray[0] = "I don't fit in"; // Throws ArrayStoreException
    
    // Won't compile
    List<Object> ol = new ArrayList<Long>(); // Incompatible types
    ol.add("I don't fit in");
    

    ...because collections are invariant.
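    For completeness, the same trap reproduces in C# (hypothetical snippet, mirroring the Java example above):

```csharp
using System;

class CovarianceTrap
{
    static void Main()
    {
        // Compiles fine, because arrays are covariant, but fails at runtime:
        object[] objectArray = new string[1];
        try
        {
            objectArray[0] = 42;            // boxed int into a string[]
        }
        catch (ArrayTypeMismatchException)
        {
            Console.WriteLine("caught at runtime");
        }

        // The generic collection refuses to compile at all:
        // List<object> ol = new List<string>();   // CS0029: cannot convert
    }
}
```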



  • Ha. If you were REALLY learning how to write code in a business environment, they'd have a pre-existing codebase written using a version of the language 10 years behind current, which you aren't allowed to upgrade, and aren't allowed to bring any additional dependencies into.

    And are then told to add a feature to the codebase which can only be done if you start re-writing features of newer revisions of the language from scratch.



  • @rvleshrac said:

    Ha. If you were REALLY learning how to write code in a business environment, they'd have a pre-existing codebase written using a version of the language 10 years behind current, which you aren't allowed to upgrade, and aren't allowed to bring any additional dependencies into.

    And it would depend on some proprietary UI control set that is out of date and you don't have a license for so the app won't run. (Dealing with this right now).



  • @ScholRLEA said:

    it lets every Lisp coder come up with their own private dialect that no one else can read

    This is simultaneously Lisp's greatest strength and biggest weakness.

