In C#, what's the best way to get the 1st digit in an int? The method I came up with is to turn the int into a string, find the 1st char of the string, then turn it back to an int.
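Roughly, in code, something like this (just a sketch of the idea; the helper name is illustrative):

static int FirstDigitViaString( int i )
{
    // Turn the int into a string, take the first character, and parse it back to an int.
    // (Casting to long before Math.Abs avoids the overflow for int.MinValue.)
    string s = Math.Abs( (long)i ).ToString();
    return int.Parse( s.Substring( 0, 1 ) );
}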
I just stumbled upon this old question and felt inclined to propose another suggestion, since none of the other answers so far returns the correct result for all possible input values, and it can still be made faster:
public static int GetFirstDigit( int i )
{
    if( i < 0 && ( i = -i ) < 0 ) return 2;
    return ( i < 100 )
        ? ( i < 1 ) ? 0 : ( i < 10 ) ? i : i / 10
        : ( i < 1000000 )
            ? ( i < 10000 )
                ? ( i < 1000 ) ? i / 100 : i / 1000
                : ( i < 100000 ) ? i / 10000 : i / 100000
            : ( i < 100000000 )
                ? ( i < 10000000 ) ? i / 1000000 : i / 10000000
                : ( i < 1000000000 ) ? i / 100000000 : i / 1000000000;
}
This works for all signed integer values, including -2147483648, which is the smallest signed integer and doesn't have a positive counterpart: Math.Abs( -2147483648 ) throws a System.OverflowException, while - -2147483648 simply computes to -2147483648 again.
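To make that concrete, a small snippet (assuming the default unchecked arithmetic context) illustrates the behavior:

int i = int.MinValue;                         // -2147483648
Console.WriteLine( -i );                      // prints -2147483648 again: the negation wraps around
Console.WriteLine( GetFirstDigit( i ) );      // prints 2, the first digit of 2147483648
// Console.WriteLine( Math.Abs( i ) );        // would throw a System.OverflowException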
The implementation can be seen as a combination of the advantages of the two fastest implementations so far: it uses a binary search and avoids superfluous divisions. A quick benchmark, feeding it the index of a loop with 100,000,000 iterations, shows that it is twice as fast as the currently fastest implementation. It finishes after 2,829,581 ticks.
For comparison, I also measured a corrected variant of the currently fastest implementation, which took 5,664,627 ticks.
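The numbers are of course machine dependent; a harness along these lines (a sketch, not the exact code used) is all that is needed to reproduce this kind of measurement:

var sw = System.Diagnostics.Stopwatch.StartNew();
int sum = 0;
for( int n = 0; n < 100000000; n++ )
    sum += GetFirstDigit( n );                // feed the loop index to the function under test
sw.Stop();
// Printing the sum keeps the compiler/JIT from optimizing the loop away.
Console.WriteLine( "{0} ticks ({1})", sw.ElapsedTicks, sum );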
public static int GetFirstDigitX( int i )
{
    if( i < 0 && ( i = -i ) < 0 ) return 2;
    if( i >= 100000000 ) i /= 100000000;
    if( i >= 10000 ) i /= 10000;
    if( i >= 100 ) i /= 100;
    if( i >= 10 ) i /= 10;
    return i;
}
The accepted answer with the same correction needed 16,561,929 ticks for this test on my computer.
public static int GetFirstDigitY( int i )
{
    if( i < 0 && ( i = -i ) < 0 ) return 2;
    while( i >= 10 )
        i /= 10;
    return i;
}
Simple functions like these can easily be proven correct, since iterating over all possible integer values takes no more than a few seconds on current hardware. This means it is less important to implement them in an exceptionally readable fashion, as there will simply never be a need to fix a bug inside them later on.
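As a sketch, such an exhaustive check could simply compare two of the implementations above against each other over the whole int range:

static void VerifyAll()
{
    // Compare the binary-search variant against the simple division loop for every possible int.
    for( long v = int.MinValue; v <= int.MaxValue; v++ )
    {
        int i = (int)v;
        if( GetFirstDigit( i ) != GetFirstDigitY( i ) )
            Console.WriteLine( "Mismatch at {0}", i );
    }
    Console.WriteLine( "Verification finished." );
}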