During a conversation on IRC, someone pointed out the following:
decimal.Parse("1.0000").ToString() // 1.0000
decimal.Parse("1.00").ToString() // 1.00
A decimal consists of a 96-bit integer and a scaling factor (the number of digits after the decimal point), which ranges from 0 to 28. Thus "1.0000" is stored as the integer 10000 with a scale of 4, while "1.00" is stored as 100 with a scale of 2, and ToString reproduces that scale.
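This is easy to see with decimal.GetBits, which exposes the 96-bit integer and the scale directly. A minimal sketch (class and method names are just illustrative):

using System;
using System.Globalization;

class ScaleDemo
{
    static void Main()
    {
        // decimal.GetBits returns four ints: the 96-bit integer in the first
        // three, and the scale (bits 16-23) plus the sign (bit 31) in the fourth.
        PrintScale(decimal.Parse("1.0000", CultureInfo.InvariantCulture));
        PrintScale(decimal.Parse("1.00", CultureInfo.InvariantCulture));
    }

    static void PrintScale(decimal d)
    {
        int[] bits = decimal.GetBits(d);
        int scale = (bits[3] >> 16) & 0xFF;
        Console.WriteLine($"{d} -> integer {bits[0]}, scale {scale}");
        // 1.0000 -> integer 10000, scale 4
        // 1.00   -> integer 100,   scale 2
    }
}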
In addition to the post above, I would personally add a side note: whenever you persist or manipulate floating-point/decimal/double values, always consider the culture you are running under, or the culture you are going to save in. Code written like the snippet above is the first, yet decisive, step toward a complete mess and a culture-dependent architecture. Use Decimal.Parse(String, IFormatProvider). In my opinion, the Parse/From/To methods that lack a culture parameter should be removed from the library altogether, to force developers to think about this very important aspect.
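To illustrate why the IFormatProvider overload matters, the same string can parse to completely different values under different cultures. A small sketch:

using System;
using System.Globalization;

class CultureParseDemo
{
    static void Main()
    {
        string s = "1.045";

        // Invariant culture: '.' is the decimal separator.
        decimal invariant = decimal.Parse(s, CultureInfo.InvariantCulture);
        Console.WriteLine(invariant); // 1.045

        // German culture: '.' is a thousands separator and ',' is the decimal
        // separator, so the very same string parses as one thousand and forty-five.
        decimal german = decimal.Parse(s, new CultureInfo("de-DE"));
        Console.WriteLine(german); // 1045
    }
}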
How is the number of significant figures decided during mathematical operations?
This is specified in the ECMA-334 C# 4 specification, §11.1.7, p. 112:
A decimal is represented as an integer scaled by a power of ten. For decimals with an absolute value less than 1.0m, the value is exact to at least the 28th decimal place. For decimals with an absolute value greater than or equal to 1.0m, the value is exact to at least 28 digits.
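In practice (as observed with the .NET implementation), the scale of a result follows simple rules: addition and subtraction keep the larger of the operand scales, multiplication adds them (up to the maximum of 28), and a non-terminating division is rounded to the available precision. A small sketch:

using System;

class ScaleArithmeticDemo
{
    static void Main()
    {
        decimal a = 1.0m;   // scale 1
        decimal b = 1.00m;  // scale 2

        // Addition and subtraction keep the larger of the two scales.
        Console.WriteLine(a + b); // 2.00

        // Multiplication adds the scales of the operands.
        Console.WriteLine(a * b); // 1.000

        // A non-terminating division is rounded to the available precision.
        Console.WriteLine(1m / 3m); // 0.3333333333333333333333333333
    }
}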
Does the number of significant figures get retained during serialization?
Yes it does; with serialization the value and its precision do not change:
using System;
using System.Diagnostics;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class Foo
{
    public decimal Value;
}

class Program
{
    static void Main(string[] args)
    {
        decimal d1 = decimal.Parse("1.0000");
        decimal d2 = decimal.Parse("1.00");
        Debug.Assert(d1 == d2); // numerically equal, but stored with different scales

        var foo1 = new Foo() { Value = d1 };
        var foo2 = new Foo() { Value = d2 };

        // Serialize foo1 to disk.
        IFormatter formatter = new BinaryFormatter();
        Stream stream = new FileStream("data.bin", FileMode.Create, FileAccess.Write, FileShare.None);
        formatter.Serialize(stream, foo1);
        stream.Close();

        // Deserialize it again; the scale (trailing zeros) survives the round trip.
        formatter = new BinaryFormatter();
        stream = new FileStream("data.bin", FileMode.Open, FileAccess.Read, FileShare.Read);
        var deserializedFoo1 = (Foo)formatter.Deserialize(stream);
        stream.Close();

        Debug.Assert(d1 == deserializedFoo1.Value);

        Console.WriteLine(d1);                     // 1.0000
        Console.WriteLine(d2);                     // 1.00
        Console.WriteLine(deserializedFoo1.Value); // 1.0000
        Console.Read();
    }
}
Does the current culture affect the way this is handled?
The current culture affects only how a decimal is parsed from a string: for example, it determines whether '.' or ',' is treated as the decimal separator, and which currency symbol is accepted if you allow one, e.g. "£123.4500". Culture does not change how the value is stored internally, and it does not affect its precision.
Internally, a decimal holds only a mantissa, an exponent (the scale) and a sign, so there is no room for anything else.
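A minimal sketch of that point: two strings parsed under different cultures end up with exactly the same internal representation, and only the ToString output differs:

using System;
using System.Globalization;

class CultureStorageDemo
{
    static void Main()
    {
        // The same value parsed under two different cultures...
        decimal fromInvariant = decimal.Parse("1.00", CultureInfo.InvariantCulture);
        decimal fromGerman    = decimal.Parse("1,00", new CultureInfo("de-DE"));

        // ...is stored identically: same value, same scale/sign word.
        Console.WriteLine(fromInvariant == fromGerman); // True
        Console.WriteLine(decimal.GetBits(fromInvariant)[3] ==
                          decimal.GetBits(fromGerman)[3]); // True

        // Culture only changes the textual representation.
        Console.WriteLine(fromInvariant.ToString(CultureInfo.InvariantCulture)); // 1.00
        Console.WriteLine(fromInvariant.ToString(new CultureInfo("de-DE")));     // 1,00
    }
}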