C# - compiler WTF



  • WTF???!



  • It looks like enums in C# must derive from integral types to remain compatible with C++ and other legacy languages supported by the managed runtime.  What's the WTF here?





  • [quote user="fluffy777"]It looks like enums in C# must derive from integral types to remain compatible with C++ and other legacy languages supported by the managed runtime.  What's the WTF here?



    [/quote]

    OK, "byte" is just a language convenience.  It's mapped to "System.Byte" (see tooltip).  The WTF is that you're required to use the language-specific form when creating an enum that inherits from any of the primitives.
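    A minimal sketch of the behavior being described (type names are from the thread; the commented-out line reflects the error the compilers of that era gave, and newer compilers may behave differently):

```csharp
using System;

// Compiles: the C# keyword form is accepted as an enum base.
enum Foo1 : byte { A, B }

// The BCL spelling was rejected by the compiler at the time of this thread:
// enum Foo2 : System.Byte { A, B }  // error: "Type byte, sbyte, short, ... expected"

class Program
{
    static void Main()
    {
        // Either way, the underlying type really is System.Byte.
        Console.WriteLine(Enum.GetUnderlyingType(typeof(Foo1)));
    }
}
```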



  • Looks to me like it's being case-sensitive? foo2 : Byte compared to foo1 : byte. Which is no WTF. Am I missing something? I don't really know about C#.



  • Yes, it looks like a case sensitivity issue to me too, unless I'm missing something obvious. In which case, it's not a C# WTF, but a user WTF.
     



  • You can't extend enums; that will give you a compile-time error one way or the other. Enums are basically fancy wrappings around an integer type. Look up Value Types in MSDN.

    What you're looking for is

    public enum foo1 : byte
    {
      fooMember1
    }



  • Anyway, if it doesn't make sense it doesn't make sense.  I'm trying to use BCL types instead of language-specific types in code and came across this one snag.  Not a big deal, I can certainly use the language-specific way, just made me scratch my head and go "WTF?".  Hope someone enjoyed it :)



  • I know what you're saying, but I'm pretty sure it is correct.  I believe that 'byte', 'int', etc are LANGUAGE primitives (i.e. C# standard), whereas 'System.Byte', 'System.Int32', etc, are FRAMEWORK implementations (i.e. .NET Framework) of those primitives.  So it does make sense that a LANGUAGE enumeration could only be typed as an integral LANGUAGE primitive, even if the C# standard type (i.e. 'byte') is internally implemented as the .NET Framework type (i.e. 'System.Byte').

    Basically:

    'int', 'byte', etc are GUARANTEED to exist in ANY implementation of the C# standard

    'enum' can only be based on an integral C# primitive ('int', 'byte', etc)

    'System.Int32', 'System.Byte', etc, only exist in the .NET FRAMEWORK



  • I'm sure I posted here ... hmm. Anyway, this post by Luke says what I tried to say more clearly.



  • Luke, I was prepared to refute any explanation with "they're the same!", but, what you said does make sense.  It still makes me sad I have to use ushort instead of UInt16.



  • are there any performance differences between the language primitives and the framework implementations?



  • [quote user="Tann San"]are there any performance differences between the language primitives and the framework implementations?[/quote]

    No - it's a direct map.
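    One quick way to see why there's no difference (a check added here for illustration, not from the original thread): the keyword and the framework type are literally the same type at runtime, so the compiler emits identical code for both spellings.

```csharp
using System;

class Program
{
    static void Main()
    {
        // "byte" is a compile-time alias for System.Byte; both spellings
        // name the exact same runtime type, so performance is identical.
        Console.WriteLine(typeof(byte) == typeof(System.Byte));
        Console.WriteLine(typeof(int) == typeof(Int32));
    }
}
```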



  • [quote user="luke727"]

    I know what you're saying, but I'm pretty sure it is correct.  I believe that 'byte', 'int', etc are LANGUAGE primitives (i.e. C# standard), whereas 'System.Byte', 'System.Int32', etc, are FRAMEWORK implementations (i.e. .NET Framework) of those primitives.  So it does make sense that a LANGUAGE enumeration could only be typed as an integral LANGUAGE primitive, even if the C# standard type (i.e. 'byte') is internally implemented as the .NET Framework type (i.e. 'System.Byte').

    Basically:

    'int', 'byte', etc are GUARANTEED to exist in ANY implementation of the C# standard

    'enum' can only be based on an integral C# primitive ('int', 'byte', etc)

    'System.Int32', 'System.Byte', etc, only exist in the .NET FRAMEWORK

    [/quote]

    This is most certainly not true; the C# spec defines both the system-provided types (System.*) and their predefined shorthand versions, and it explicitly defines what each shorthand aliases to. I mean, without System, you wouldn't have much of a language at all, would you? See the spec, sections 8.2.1 and 11.1.4: http://msdn2.microsoft.com/en-us/netframework/aa569283.aspx



  • The System.* structs and the keywords both get converted to their internal integral value (int8, int16, etc...) during compilation. byte (int8) is never viewed by the Framework as being System.Byte, but the System.Byte metadata is used during boxing operations.

    Just a theory... but apparently the compiler does a different form of aliasing during enum definition, which first identifies System.Byte as its class metadata, not as the alias.  Since the enum value expects to be stored as a native type, not an object, it dislikes the System struct.  Since there is only one conversion for the keyword (byte), there is no confusion, and it gets correctly identified as the native type.

     



  • Just a quick check, but Byte is defined as sealed, which means you cannot use it as a base class.   I guess the question would be why it is sealed.  Especially when it would be nice to add missing methods (parse, try parse, etc).  The only explanation I could find is that sealed classes can get runtime performance enhancements and optimizations.   Here's a link describing what sealed means: http://www.csharphelp.com/archives/archive158.html As for checking the class definition, just use the old object browser.
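    As an aside (an illustration added here, not from the thread): Byte is a struct, and all value types are implicitly sealed, which is easy to verify with reflection:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Value types (structs) can never be base classes; the runtime
        // marks them sealed, which is why you can't derive from Byte.
        Console.WriteLine(typeof(byte).IsValueType);
        Console.WriteLine(typeof(byte).IsSealed);
    }
}
```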

     

     



  • [quote user="e.thermal"]

    Just a quick check, but Byte is defined as sealed, which means you cannot use it as a base class.   I guess the question would be why it is sealed.  Especially when it would be nice to add missing methods (parse, try parse, etc).  The only explanation I could find is that sealed classes can get runtime performance enhancements and optimizations.   Here's a link describing what sealed means: http://www.csharphelp.com/archives/archive158.html As for checking the class definition, just use the old object browser.

     [/quote]

    I imagine that the reason they are sealed is the same reason that certain, similar Java classes (such as String) are final - to prohibit overriding that would allow you to break the class contract, such as breaking immutability (is that even a word?).

    Allowing classes like this to be sub-classed would open a myriad of security holes to get around the security model.



  • [quote user="e.thermal"]

    Especially when it would be nice to add missing methods (parse, try parse, etc).

    [/quote]

    In C# 3 you can extend a sealed class, but only with static methods. (I think there were more restrictions - you'll have to look for the article.) It looks like a good solution - it's basically just syntactic sugar for integrating String with the "static class MyStringUtils" that everyone needs to write at one time or another.



  • Ehh... whatever: http://blogs.msdn.com/ericgu/archive/2005/09/14/466510.aspx

    These are static functions in a StringExtension class, taking a string as their first parameter. In reality, you can call them as (string s).WhateverMyMethod()

    Looks nice :)
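    A sketch of the extension-method syntax being described ("Reversed" is a hypothetical helper invented for this example, not a BCL method):

```csharp
using System;

// C# 3 extension methods: a static method whose first parameter is marked
// with "this" can be called with instance syntax, even on a sealed class.
static class StringExtensions
{
    public static string Reversed(this string s)
    {
        char[] chars = s.ToCharArray();
        Array.Reverse(chars);
        return new string(chars);
    }
}

class Program
{
    static void Main()
    {
        // Looks like an instance method on the sealed String class.
        Console.WriteLine("abc".Reversed());
    }
}
```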



  • c# is case sensitive. So the Byte will not work.

    also, each CLR compliant language must map its data types to an underlying CLR type. In this case c#'s "byte" maps to the CLR-compliant System.Byte. You can't use just "Byte" because of the naming collision. You can either use "byte", which is the c# name, or System.Byte, which would work in ANY CLR compliant language.

    No WTF here, move along.



  • [quote user="unklegwar"]c# is case sensitive. So the Byte will not work.

    also, each CLR compliant language must map its data types to an underlying CLR type. In this case c#'s "byte" maps to the CLR-compliant System.Byte. You can't use just "Byte" because of the naming collision. You can either use "byte", which is the c# name, or System.Byte, which would work in ANY CLR compliant language.

    No WTF here, move along.[/quote]

    Perhaps I am misunderstanding the intent or direction of your post, but if it is in regard to the original post, this is completely incorrect.  There is never, ever, going to be a naming collision between Byte and byte.  The compiler either recognizes it, or it doesn't.  Assuming you have imported System, the compiler will recognize Byte perfectly.  The OP's error is a compiler issue, not a CLR problem. The issue is that you cannot use the structs to define an enum's underlying type (regardless of whether you spell out the full type name or not), but have to specify the keyword instead.

    I don't think it's a WTF either, but for an entirely different reason.  As to why it behaves this way, I only have the theory I mentioned.

     

     

