Should I use int or Int32

野性不改 2020-11-22 11:52

In C#, int and Int32 are the same thing, but I've read a number of times that int is preferred over Int32 with no reason given. Is there a reason, and should I care?

30 Answers
  • 2020-11-22 12:32

    Though they are (mostly) identical (see below for the one [bug] difference), you definitely should care and you should use Int32.

    • The name for a 16-bit integer is Int16. For a 64-bit integer it's Int64, and for a 32-bit integer the intuitive choice is: int or Int32?

    • The question of the size of a variable of type Int16, Int32, or Int64 answers itself, but the question of the size of a variable of type int is a perfectly valid one, and questions, no matter how trivial, are distracting, lead to confusion, waste time, hinder discussion, etc. (the fact this question exists proves the point).

    • Using Int32 makes it explicit that the developer is conscious of their choice of type. How big is an int again? Oh yeah, 32. The likelihood that the size of the type will actually be considered is greater when the size is included in the name. Using Int32 also promotes knowledge of the other choices. When people aren't forced to at least recognize there are alternatives, it becomes far too easy for int to become "THE integer type".

    • The class within the framework intended to interact with 32-bit integers is named Int32. Once again: which is more intuitive, less confusing, and free of an (unnecessary) translation (not a translation in the system, but in the mind of the developer)? int lMax = Int32.MaxValue or Int32 lMax = Int32.MaxValue?

    • int isn't a keyword in all .NET languages.

    • Although there are arguments why it's not likely to ever change, int may not always be an Int32.

    The drawbacks are two extra characters to type and the [bug] below.

    This won't compile:

    public enum MyEnum : Int32
    {
        AEnum = 0
    }
    

    But this will:

    public enum MyEnum : int
    {
        AEnum = 0
    }
    
  • 2020-11-22 12:32

    A while back I was working on a project with Microsoft when we had a visit from someone on the Microsoft .NET CLR product team. This person coded examples and when he defined his variables he used “Int32” vs. “int” and “String” vs. “string”.

    I remembered seeing this style in other example code from Microsoft. So, I did some research and found that everyone says there is no difference between "Int32" and "int" except for syntax coloring. In fact, I found a lot of material suggesting you use "Int32" to make your code more readable. So, I adopted the style.

    The other day I did find a difference! The compiler doesn't allow you to declare an enum's underlying type as "Int32", but it does when you use "int". Don't ask me why, because I don't know yet.

    This doesn't compile:

    public enum MyEnum : Int32
    {
        AEnum = 0
    }
    

    This works:

    public enum MyEnum : int
    {
        AEnum = 0
    }
    

    Taken from: Int32 notation vs. int

  • 2020-11-22 12:36

    The two are indeed synonymous; int will look a little more familiar, and Int32 makes the 32-bitness more explicit to those reading your code. I would be inclined to use int where I just need 'an integer', and Int32 where the size is important (cryptographic code, structures), so future maintainers will know it's safe to enlarge an int if appropriate, but should take care when changing Int32s in the same way.

    The resulting code will be identical: the difference is purely one of readability or code appearance.
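
    A minimal sketch of that split, assuming a made-up FileHeader struct whose names are purely illustrative: int for "just an integer" loop counters, Int32 where the 32-bit width is part of the contract.

    using System;
    using System.Runtime.InteropServices;

    [StructLayout(LayoutKind.Sequential)]
    struct FileHeader           // hypothetical on-disk header: field widths matter here
    {
        public Int32 Magic;     // must stay exactly 32 bits
        public Int32 Length;    // must stay exactly 32 bits
    }

    class Demo
    {
        static void Main()
        {
            // "Just an integer": int is fine for a plain loop counter.
            for (int i = 0; i < 12; i++)
                Console.WriteLine(i);

            // The header occupies exactly 8 bytes (two 32-bit fields).
            Console.WriteLine(Marshal.SizeOf(typeof(FileHeader)));
        }
    }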

  • 2020-11-22 12:37

    I use int in the event that Microsoft changes the default implementation for an integer to some newfangled version (let's call it Int32b).

    Microsoft can then change the int alias to Int32b, and I don't have to change any of my code to take advantage of their new (and hopefully improved) integer implementation.

    The same goes for any of the type keywords.

  • 2020-11-22 12:37

    Some compilers have different sizes for int on different platforms (this is not specific to C#).

    Some coding standards (e.g. MISRA C) require that all types used are explicitly sized (i.e. Int32 and not int).

    It is also good practice to use prefixes for variables of different types (e.g. b for an 8-bit byte, w for a 16-bit word, and l for a 32-bit long word => Int32 lMyVariable).

    You should care because it makes your code more portable and more maintainable.

    Portability may not be applicable to C# if you are always going to use C# and the C# specification will never change in this regard.

    Maintainability, IMHO, will always be applicable, because the person maintaining your code may not be aware of this particular C# specification and may miss a bug where the int occasionally grows beyond 2147483647.

    In a simple for-loop that counts, for example, the months of the year, you won't care, but when you use the variable in a context where it could possibly overflow, you should care.

    You should also care if you are going to do bit-wise operations on it.
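
    A small sketch of that overflow point, using nothing beyond standard C# checked/unchecked contexts: int arithmetic wraps silently past Int32.MaxValue unless you opt into a checked context.

    using System;

    class OverflowDemo
    {
        static void Main()
        {
            int counter = Int32.MaxValue;   // 2147483647

            // Default (unchecked) arithmetic wraps around silently.
            Console.WriteLine(unchecked(counter + 1));   // -2147483648

            // A checked context turns the same operation into an exception.
            try
            {
                Console.WriteLine(checked(counter + 1));
            }
            catch (OverflowException)
            {
                Console.WriteLine("overflowed past Int32.MaxValue");
            }
        }
    }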

  • 2020-11-22 12:38

    int is the same as System.Int32, and when compiled it turns into the same thing in CIL.
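
    A trivial way to see this for yourself (just a sketch): the keyword and the BCL type are literally the same type, not two types with an implicit conversion between them.

    using System;

    class AliasDemo
    {
        static void Main()
        {
            Console.WriteLine(typeof(int) == typeof(Int32));   // True
            Console.WriteLine(typeof(int).FullName);           // System.Int32
        }
    }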

    We use int by convention in C# since C# wants to look like C and C++ (and Java) and that is what we use there...

    BTW, I do end up using System.Int32 when declaring imports of various Windows API functions. I am not sure if this is a defined convention or not, but it reminds me that I am going to an external DLL...
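
    A sketch of what that habit looks like, using a real user32.dll function as the example (the declaration details here are illustrative; int would compile to an identical signature, so spelling out System.Int32 is purely a visual reminder of the native boundary):

    using System;
    using System.Runtime.InteropServices;

    static class NativeMethods
    {
        // Returns the length, in characters, of the given window's title bar text.
        [DllImport("user32.dll", SetLastError = true)]
        public static extern System.Int32 GetWindowTextLength(IntPtr hWnd);
    }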
