> I tried to build a function that converts a binary number stored in a string into a decimal number stored in a `long long`. I'm thinking that my code should work.

It doesn't, because `'1' != 1`, as mentioned in the comments by @churill: `'1' == 49`, the ASCII code of the character. If you are on Linux, type `man ascii` in a terminal to see the ASCII table.
Try this, it is essentially the same code. I just used `stringNumber` directly instead of taking a `const char*` to it, and I subtracted `'0'` from the character at the current index. `'0' == 48`, so subtracting it yields the actual integer value `1` or `0`:
```cpp
long long result = 0;
int subtrahend = 1;  // distance from the end of the string
int potency = 0;     // current power of two
auto sz = stringNumber.size();
for (std::size_t i = 0; i < sz; i++) {
    // convert the digit character to 0/1, then scale by the power of two
    result += (stringNumber[sz - subtrahend] - '0') * std::pow(2, potency);
    subtrahend++;
    potency++;
    std::cout << result << std::endl;
}
```
Moreover, use the methods provided by `std::string`, like `.size()`, instead of calling `strlen()` on every iteration. `.size()` is constant time, while `strlen()` has to walk the whole string each call.
In a production environment, I would highly recommend using `std::bitset` instead of rolling your own solution:

```cpp
std::string stringNumber = "1111";
std::bitset<64> bits(stringNumber);
unsigned long long result = bits.to_ullong();
```