During a conversation on IRC, someone pointed out the following:
decimal.Parse(\"1.0000\").ToString() // 1.0000
decimal.Parse(\"1.00\").ToString() // 1.00
How is the number of significant figures decided during mathematical operations?
This is specified in the ECMA-334 C# 4 specification, section 11.1.7, p. 112:
A decimal is represented as an integer scaled by a power of ten. For decimals with an absolute value less than 1.0m, the value is exact to at least the 28th decimal place. For decimals with an absolute value greater than or equal to 1.0m, the value is exact to at least 28 digits.
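As a quick illustration, here is a minimal sketch of how the scale propagates through arithmetic (the outputs are what .NET's System.Decimal produces for these values: addition keeps the larger of the two scales, multiplication adds them, and division fills the available precision):

using System;

class ScaleDemo
{
    static void Main()
    {
        Console.WriteLine(1.0m + 1.00m);   // 2.00    -- addition keeps the larger scale
        Console.WriteLine(1.00m * 1.000m); // 1.00000 -- multiplication adds the scales
        Console.WriteLine(1m / 3m);        // 0.3333333333333333333333333333 -- division uses the available precision
    }
}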
Does the number of significant figures get retained during serialization?
Yes, it does: the value and its precision do not change during serialization, as the following example shows.
using System;
using System.Diagnostics;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class Foo
{
    public decimal Value;
}

class Program
{
    static void Main(string[] args)
    {
        decimal d1 = decimal.Parse("1.0000");
        decimal d2 = decimal.Parse("1.00");
        Debug.Assert(d1 == d2); // equal in value even though their scales differ

        var foo1 = new Foo { Value = d1 };
        var foo2 = new Foo { Value = d2 };

        IFormatter formatter = new BinaryFormatter();
        Stream stream = new FileStream("data.bin", FileMode.Create, FileAccess.Write, FileShare.None);
        formatter.Serialize(stream, d1);
        stream.Close();

        formatter = new BinaryFormatter();
        stream = new FileStream("data.bin", FileMode.Open, FileAccess.Read, FileShare.Read);
        decimal deserializedD1 = (decimal)formatter.Deserialize(stream);
        stream.Close();

        Debug.Assert(d1 == deserializedD1);
        Console.WriteLine(d1);             // 1.0000
        Console.WriteLine(d2);             // 1.00
        Console.WriteLine(deserializedD1); // 1.0000 -- the scale survived the round trip
        Console.Read();
    }
}
Does the current culture affect the way this is handled?
The current culture affects only how a decimal is parsed from a string: it determines, for example, whether '.' or ',' is treated as the decimal separator, and which currency symbol is accepted if you allow one (e.g. "£123.4500"). Culture does not change how the value is stored internally, and it does not affect its precision.
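A minimal sketch of culture-aware parsing (the specific cultures and the currency example are illustrative; NumberStyles.Currency is only needed when a currency symbol is present):

using System;
using System.Globalization;

class CultureDemo
{
    static void Main()
    {
        // The same value written with two different culture-specific decimal separators.
        decimal fromInvariant = decimal.Parse("1.0000", CultureInfo.InvariantCulture);
        decimal fromGerman = decimal.Parse("1,0000", CultureInfo.GetCultureInfo("de-DE"));

        // A currency symbol is accepted only when you opt in via NumberStyles.Currency.
        decimal price = decimal.Parse("£123.4500", NumberStyles.Currency, CultureInfo.GetCultureInfo("en-GB"));

        Console.WriteLine(fromInvariant == fromGerman); // True -- same value, same scale
        Console.WriteLine(price);                       // 123.4500 (the separator shown depends on the current culture)
    }
}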
Internally, a decimal holds a 96-bit integer mantissa, a scaling factor (a power of ten from 0 to 28) and a sign bit, so there is no room to store anything else.
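You can observe that layout with decimal.GetBits, which returns the four 32-bit words of the 128-bit value; a small sketch (the scale sits in bits 16-23 of the flags word, per the documented layout):

using System;

class BitsDemo
{
    static void Main()
    {
        // Low, mid and high 32 bits of the 96-bit integer, plus a flags word holding scale and sign.
        int[] bits100 = decimal.GetBits(1.00m);
        int[] bits10000 = decimal.GetBits(1.0000m);

        int scale100 = (bits100[3] >> 16) & 0xFF;
        int scale10000 = (bits10000[3] >> 16) & 0xFF;

        Console.WriteLine($"1.00   -> integer {bits100[0]}, scale {scale100}");     // integer 100, scale 2
        Console.WriteLine($"1.0000 -> integer {bits10000[0]}, scale {scale10000}"); // integer 10000, scale 4
    }
}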