What would be the complexity of converting a string to its equivalent number or vice versa? Does it change depending on programming language?
On the face of it, one needs to look at every digit, so the conversion seems tied to the length of the input.
C# and C/C++ strings don't carry any special information about the (possible) numerical value they represent. Therefore the conversion has to work through the digits one by one, in either direction.
However, the number of digits is limited, so we get just O(1): the conversion time is bounded by a constant (essentially by the cost of converting the largest value). For a 32-bit int, converting the number to a string has to produce at most 10 decimal digits (and possibly a sign).
Conversion from a string is O(1) as well, because while parsing it, only a bounded number of characters has to be considered (10+1 in the case of a 32-bit int).
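To make the bounded-loop argument concrete, here is a minimal Python sketch of the kind of digit-by-digit parse those routines perform (the name parse_int32 and the exact error handling are mine, not taken from any library):

```python
# A hand-rolled sketch (mine, not the actual library routine) of the
# digit-by-digit parse described above: for a valid 32-bit value the loop
# runs at most 10 times, plus one step for an optional sign.
def parse_int32(s: str) -> int:
    i, sign = 0, 1
    if i < len(s) and s[i] in "+-":
        sign = -1 if s[i] == "-" else 1
        i += 1
    if i == len(s) or not ("0" <= s[i] <= "9"):
        raise ValueError("not an integer: " + repr(s))
    value = 0
    while i < len(s) and "0" <= s[i] <= "9":
        value = value * 10 + (ord(s[i]) - ord("0"))   # one step per digit
        i += 1
    if i != len(s):
        raise ValueError("trailing characters: " + repr(s))
    value *= sign
    if not -2**31 <= value <= 2**31 - 1:
        raise ValueError("out of 32-bit range: " + repr(s))
    return value
```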
Strictly speaking, we cannot use O-notation for the case of int-to-string conversion, since the maximal value of an int is bounded. Anyway, the time needed for the conversion (in both directions) is limited by a constant.
As @Charles suggests, other languages (e.g. Python) can use arbitrary-precision numbers. For such numbers, parsing takes O(number of digits), which is O(string length) and O(log(number)) for the two conversions, respectively. With arbitrary-precision numbers one cannot do better, since every digit must be considered in both directions. For conversions to and from limited-precision numbers, the same O(1) reasoning applies. However, I didn't profile the parsing in Python myself, so a less efficient algorithm may well be used there.
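A quick, unscientific way to see the growth is to time the built-in conversions for inputs of increasing length (the digit counts below are arbitrary and stay under the default limit that recent CPython versions place on int/str conversions):

```python
# Rough timing sketch (my own, not a real benchmark): with arbitrary-precision
# ints, both directions of the conversion get slower as the number of digits grows.
import timeit

for digits in (500, 1000, 2000, 4000):   # kept below CPython's default 4300-digit limit
    s = "7" * digits
    n = int(s)
    to_int = timeit.timeit(lambda: int(s), number=1000)
    to_str = timeit.timeit(lambda: str(n), number=1000)
    print(f"{digits:>5} digits: int(s) {to_int:.4f}s  str(n) {to_str:.4f}s")
```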
EDIT: following @Steve's suggestion, I checked that parsing in C/C++ and C# skips leading whitespace, so the time for the string->int conversion is actually O(input length). If the string is known to be trimmed, the conversion is again O(1).
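That observation is about C/C++ and C#, but Python's int() skips surrounding whitespace as well, so the same worst case is easy to demonstrate there:

```python
# int() also accepts surrounding whitespace, so in the worst case every
# character of the input has to be examined even though the numeric part is tiny.
padded = " " * 1_000_000 + "42"
print(int(padded))   # prints 42, but all million spaces were still scanned
```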