Question:
public static int getIntegerFromBitArray(BitArray bitArray)
{
    var result = new int[1];
    bitArray.CopyTo(result, 0);
    return result[0];
}
// Input A) 01110
// Output A) 14
// Input B) 0011
// Output B) 12 <=== ????? WHY!!! :)
Can someone please explain why my second return value is 12 instead of 3? Thank you.
Answer 1:
Basically it's considering the bits in the opposite order to the way you were expecting. You haven't shown how you're mapping your input binary to a BitArray, but the result is treating it as 1100 rather than 0011.
The documentation isn't clear, admittedly, but it does work the way I'd expect it to: bitArray[0] represents the least significant bit, just as bit 0 usually does when discussing binary (so bit 0 is 0/1, bit 1 is 0/2, bit 2 is 0/4, bit 3 is 0/8, etc.). For example:
using System;
using System.Collections;

class Program
{
    static void Main(string[] args)
    {
        BitArray bits = new BitArray(8);
        bits[0] = false;
        bits[1] = true;
        int[] array = new int[1];
        bits.CopyTo(array, 0);
        Console.WriteLine(array[0]); // Prints 2
    }
}
Answer 2:
You need to reverse the bit order to get the right results: read in the opposite direction, 1100 is 12.
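One way to make "0011" come out as 3 is to fill the BitArray so that index 0 holds the least significant bit, i.e. read the input string right to left. This is a minimal sketch assuming the input arrives as a string; BitStringToBitArray is a hypothetical helper name, not part of the question's code:

```csharp
using System;
using System.Collections;

class Demo
{
    // Hypothetical helper: maps the string so that its rightmost character
    // becomes bitArray[0], the least significant bit (so "0011" -> 3).
    static BitArray BitStringToBitArray(string bits)
    {
        var bitArray = new BitArray(bits.Length);
        for (int i = 0; i < bits.Length; i++)
        {
            // bits[bits.Length - 1 - i] is the character i places from the right
            bitArray[i] = bits[bits.Length - 1 - i] == '1';
        }
        return bitArray;
    }

    // Same conversion as in the question: CopyTo packs the bits into an int,
    // with bitArray[0] as the least significant bit.
    static int GetIntegerFromBitArray(BitArray bitArray)
    {
        var result = new int[1];
        bitArray.CopyTo(result, 0);
        return result[0];
    }

    static void Main()
    {
        Console.WriteLine(GetIntegerFromBitArray(BitStringToBitArray("01110"))); // 14
        Console.WriteLine(GetIntegerFromBitArray(BitStringToBitArray("0011")));  // 3
    }
}
```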
Source: https://stackoverflow.com/questions/9697623/bitarray-to-integer-issue