In my C# app, I would like to know whether it is really important to use short for smaller numbers, int for bigger ones, and so on. Does the memory consumption really matter?
This is entirely relative to the amount of memory you can afford to waste. If you aren't sure, it probably doesn't matter.
Int32 will be fine for almost anything. Exceptions include cases where Int16 (aka short) would probably be better to represent some of the internals.

At one point I needed an int cast, and Visual Studio's first suggestion was to verify my value was less than infinity. I couldn't find a good type for that without using the pre-defined constants, so I used ulong, since that was the closest I could come in .NET 2.0 :)
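For what it's worth, the guard that anecdote describes is straightforward with the predefined constants. A minimal sketch, assuming the value arrives as a double (the names here are mine, not from the original code):

```csharp
using System;

class CastGuard
{
    static void Main()
    {
        // Hypothetical incoming value that must be range-checked
        // before being cast down to int.
        double value = 1234.0;

        // int's own constants bound the check; no wider type is
        // needed just to hold the limits.
        if (value >= int.MinValue && value <= int.MaxValue)
        {
            int n = (int)value;
            Console.WriteLine(n);
        }
        else
        {
            Console.WriteLine("value is out of int range");
        }
    }
}
```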
Only you can be the judge of whether the memory consumption really matters to you. In most situations it won't make any discernible difference.

In general, I would recommend using int/Int32 where you can get away with it. If you really need to use short, long, byte, uint etc. in a particular situation, then do so.
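To put numbers on "get away with it", here is a small sketch of the sizes involved. The width of a single variable is noise; only bulk storage, such as large arrays, makes the difference visible:

```csharp
using System;

class SizeDemo
{
    static void Main()
    {
        // Per-value sizes of the common integral types, in bytes.
        Console.WriteLine(sizeof(byte));   // 1
        Console.WriteLine(sizeof(short));  // 2
        Console.WriteLine(sizeof(int));    // 4
        Console.WriteLine(sizeof(long));   // 8

        // The difference only adds up in bulk: ten million shorts
        // take about 20 MB, the same count of ints about 40 MB.
        short[] compact = new short[10_000_000];
        int[] wide = new int[10_000_000];
        Console.WriteLine(compact.Length * sizeof(short)); // 20000000
        Console.WriteLine(wide.Length * sizeof(int));      // 40000000
    }
}
```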
There are only a few cases where I really bother choosing a size; every other case is int all the way.
Edit: About x86/x64

In C#, an int (System.Int32) is always 32 bits, on x86 and x64 alike. What actually changes with the architecture is the size of native types: a pointer (IntPtr) is 32 bits on x86 and 64 bits on x64, and a native C/C++ long can differ between platforms as well.

So if you interop with native code and assume sizes, moving from one architecture to another can lead to problems. For example, say a 32-bit API exports a pointer-sized value. You cast it to an integer and everything is fine; then you move to x64 and all hell breaks loose.

Native sizes are defined by the architecture, so when you change architecture you need to be aware that such assumptions can lead to potential problems.
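A minimal sketch of that trap, assuming a hypothetical native library somenative.dll that exports a pointer-sized handle (both the DLL name and the function are made up for illustration):

```csharp
using System;
using System.Runtime.InteropServices;

class HandleDemo
{
    // Hypothetical native export returning a pointer-sized handle.
    [DllImport("somenative.dll")]
    static extern IntPtr GetNativeHandle();

    static void Main()
    {
        // Wrong: fine on x86, but on x64 this cast throws an
        // OverflowException once the handle no longer fits in 32 bits.
        // int handle = (int)GetNativeHandle();

        // Right on both architectures: IntPtr tracks the pointer size
        // of the current process.
        IntPtr handle = GetNativeHandle();
        Console.WriteLine(IntPtr.Size); // 4 on x86, 8 on x64
    }
}
```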
Memory consumption based on the type of integers you are storing is probably not an issue in a desktop or web app. In a game or a mobile device app, it may be more of an issue.

However, the real reason to differentiate between the types is the kind of numbers you need to store. If you have really big numbers, or need high precision, you may need to use long to store them.
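For example, int arithmetic silently wraps past 2,147,483,647, while long comfortably holds the result; a short sketch:

```csharp
using System;

class RangeDemo
{
    static void Main()
    {
        int a = 2_000_000_000;   // near int.MaxValue (2,147,483,647)

        // 32-bit arithmetic wraps around silently by default:
        int wrapped = a * 2;
        Console.WriteLine(wrapped);   // -294967296

        // Promote to long *before* the multiply to get the real answer:
        long correct = (long)a * 2;
        Console.WriteLine(correct);   // 4000000000

        // Or opt in to overflow detection with a checked expression:
        try
        {
            Console.WriteLine(checked(a * 2));
        }
        catch (OverflowException)
        {
            Console.WriteLine("int overflow caught");
        }
    }
}
```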