Why can't you assign a number with a decimal point to the `decimal` type directly, without using a type suffix? Isn't this kind of number already considered a `decimal`?
The answer to your question comes down to two important points:
All numeric literals with a decimal point are inferred to be of type `double` by the C# compiler, so `3433.20` is a `double` by default.
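
As a minimal sketch (the variable names are just for illustration), a plain fractional literal is a `double`, while the `m` suffix produces a `decimal` literal:

```csharp
using System;

double price = 3433.20;       // fine: a literal with a decimal point is a double by default
// decimal total = 3433.20;   // compile error CS0664: a double literal cannot be implicitly
                              // converted to decimal; use the 'm' suffix instead
decimal total = 3433.20m;     // fine: the m suffix makes the literal a decimal

Console.WriteLine(price);     // 3433.2
Console.WriteLine(total);     // 3433.20 (decimal keeps the trailing zero)
```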
`double` values do not implicitly convert to `decimal`: although `decimal` is more precise than `double`, it covers a much smaller range, so overflow is possible when converting from `double` to `decimal`, and the conversion must therefore be an explicit cast.
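
To illustrate, here is a rough sketch (the value `1e30` is just an example): the conversion must be written as an explicit cast, and the runtime throws an `OverflowException` when the value does not fit in a `decimal`:

```csharp
using System;

double big = 1e30;            // comfortably inside double's range
// decimal d = big;           // compile error: no implicit conversion from double to decimal

try
{
    decimal d = (decimal)big; // explicit cast compiles...
    Console.WriteLine(d);
}
catch (OverflowException)
{
    // ...but fails at run time, because 1e30 exceeds decimal.MaxValue (about 7.9e28)
    Console.WriteLine("1e30 does not fit in a decimal.");
}
```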
- `double`'s range: about ±5.0 × 10^−324 to ±1.7 × 10^308, with 15 or 16 significant figures.
- `decimal`'s range: about ±1.0 × 10^−28 to ±7.9 × 10^28, with 28 or 29 significant figures.
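
You can check these trade-offs directly in code; a short sketch using the standard `MaxValue` constants:

```csharp
using System;

Console.WriteLine(double.MaxValue);   // 1.7976931348623157E+308
Console.WriteLine(decimal.MaxValue);  // 79228162514264337593543950335  (about 7.9 × 10^28)

// Precision, on the other hand, favours decimal:
Console.WriteLine(0.1 + 0.2);         // 0.30000000000000004 on modern .NET (binary rounding error)
Console.WriteLine(0.1m + 0.2m);       // 0.3 (these values are represented exactly in decimal)
```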