Does anyone know why integer division in C# returns an integer rather than a floating-point value? What is the idea behind it? (Is it just a legacy of C/C++?)
In C#, dividing two integers performs integer division: the fractional part is discarded and the result is an int. This is the same behavior inherited from C/C++. To get a floating-point result, at least one of the operands must be a floating-point type, e.g. by casting one of them to float or double.
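A minimal example showing the truncation and the usual fix (casting one operand):

```csharp
using System;

class Program
{
    static void Main()
    {
        int a = 7, b = 2;

        // Both operands are int, so integer division is performed:
        Console.WriteLine(a / b);        // prints 3 (fraction discarded)

        // Cast one operand to float (or double) to get real division:
        Console.WriteLine((float)a / b); // prints 3.5
    }
}
```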
As a little trick to see what you are getting, you can use var; the compiler (or your IDE, when you hover over the variable) will tell you the inferred type:

int a = 1; int b = 2; var result = a / b;

Here the compiler infers that result is of type int.
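You can also confirm the inferred type at runtime with GetType(), a small sketch of the same trick:

```csharp
using System;

class Program
{
    static void Main()
    {
        int a = 1, b = 2;
        var result = a / b;                  // compiler infers int

        Console.WriteLine(result);           // prints 0 (integer division)
        Console.WriteLine(result.GetType()); // prints System.Int32
    }
}
```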