I'd like to know if there is an easy way of determining the maximum number of characters required to print a decimal int.

I know <limits.h> contains limits such as INT_MAX.
If you assume CHAR_BIT is 8 (required on POSIX, so a safe assumption for any code targeting POSIX systems, as well as any other mainstream system like Windows), a cheap safe formula is 3*sizeof(int)+2. If not, you can make it 3*sizeof(int)*CHAR_BIT/8+2, or there's a slightly simpler version.
In case you're interested in the reason this works: sizeof(int) is essentially a logarithm of INT_MAX (roughly log base 2^CHAR_BIT), and conversion between logarithms of different bases (e.g. to base 10) is just multiplication. In particular, 3 is an integer upper bound on log base 10 of 256. The +2 accounts for a possible sign and the null termination.
After accepted answer (2+ years)

The following fraction 10/33 exactly meets the needs for unpadded int8_t, int16_t, int32_t and int128_t. Only 1 char extra for int64_t. Exact or 1 over for all integer sizes up to int362_t. Beyond that it may be more than 1 over.
#include <limits.h>
#include <stdio.h>
#include <stdlib.h>

#define MAX_CHAR_LEN_DECIMAL_INTEGER(type) (10*sizeof(type)*CHAR_BIT/33 + 2)
#define MAX_CHAR_SIZE_DECIMAL_INTEGER(type) (10*sizeof(type)*CHAR_BIT/33 + 3)

int get_int(void) {
  // + 1 for the '\n' retained by fgets()
  char draft[MAX_CHAR_SIZE_DECIMAL_INTEGER(long) + 1]; //**
  if (fgets(draft, sizeof draft, stdin) == NULL)
    return 0; // EOF or read error
  return (int) strtol(draft, NULL, 10);
}
** fgets() typically works best with an additional char for the terminating '\n'.
Similar to @R.. but with a better fraction.
Recommend using generous (2x) buffers when reading user input. Sometimes a user adds spaces, leading zeros, etc.
char draft[2*(MAX_CHAR_SIZE_DECIMAL_INTEGER(long) + 1)];
fgets(draft, sizeof draft, stdin);