I've been studying C# and ran across some familiar ground from my old work in C++. I never understood the reason for bitwise operators in a real application. I've never used them.
Except for combining flags, bit logic isn't necessarily something you need in your UI code, but it is still tremendously important. For example, I maintain a binary serialization library that needs to deal with all sorts of complex bit-packing strategies (variable-length base-128 integer encoding, for example). This is one of the implementations (actually, this is a slower/safer version - there are other variants for dealing with buffered data, but they are harder to follow):
public static bool TryDecodeUInt32(Stream source, out uint value)
{
    if (source == null) throw new ArgumentNullException("source");
    int b = source.ReadByte();
    if (b < 0)
    {
        // end of stream before any data was read
        value = 0;
        return false;
    }
    if ((b & 0x80) == 0)
    {
        // single-byte value: high bit clear, so the 7 payload bits are the whole number
        value = (uint)b;
        return true;
    }
    // multi-byte value: keep the low 7 bits, then read further 7-bit groups
    int shift = 7;
    value = (uint)(b & 0x7F);
    bool keepGoing;
    int i = 0;
    do
    {
        b = source.ReadByte();
        if (b < 0) throw new EndOfStreamException();
        i++;
        // the high bit (0x80) signals that another byte follows
        keepGoing = (b & 0x80) != 0;
        // mask off the 7 payload bits and merge them into position
        value |= ((uint)(b & 0x7F)) << shift;
        shift += 7;
    } while (keepGoing && i < 4);
    if (keepGoing && i == 4)
    {
        // a 32-bit value never needs more than 5 bytes of 7 bits each
        throw new OverflowException();
    }
    return true;
}
We have:
- a test of the most significant bit, used as a "more bytes follow" flag: (b & 0x80) != 0
- masking off the 7 payload bits: b & 0x7F
- shifting each 7-bit group into position: << shift
- combining the groups into the final value with OR: value |=
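The "combining flags" case mentioned at the start is the everyday version of the same OR/AND idea. A minimal sketch - the FileAccessFlags enum here is made up purely for illustration:

using System;

// Hypothetical enum, for illustration only.
[Flags]
public enum FileAccessFlags
{
    None   = 0,
    Read   = 1 << 0,  // 0b0001
    Write  = 1 << 1,  // 0b0010
    Delete = 1 << 2,  // 0b0100
    Share  = 1 << 3   // 0b1000
}

public static class FlagsDemo
{
    public static void Main()
    {
        // combine flags with bitwise OR
        FileAccessFlags access = FileAccessFlags.Read | FileAccessFlags.Write;

        // test a flag with bitwise AND
        bool canWrite = (access & FileAccessFlags.Write) == FileAccessFlags.Write;

        // remove a flag by AND-ing with the complement
        access &= ~FileAccessFlags.Write;

        Console.WriteLine(canWrite); // True
        Console.WriteLine(access);   // Read
    }
}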
The TryDecodeUInt32 above is real code, used in a real (and much-used) protocol. In general, bit operations are used a lot in any kind of encoding layer.
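For contrast, here is a rough sketch of the write side of the same base-128 scheme: emit 7 payload bits per byte and set the high bit on every byte except the last to say "more to come". The name EncodeUInt32 and its shape are mine, not the library's actual API:

using System;
using System.IO;

public static class VarintWriter
{
    // Sketch only: writes a uint as a base-128 varint, returns the byte count.
    public static int EncodeUInt32(Stream destination, uint value)
    {
        if (destination == null) throw new ArgumentNullException("destination");
        int count = 0;
        while (value >= 0x80)
        {
            // low 7 bits, with the continuation bit (0x80) set
            destination.WriteByte((byte)((value & 0x7F) | 0x80));
            value >>= 7;
            count++;
        }
        // final byte: continuation bit clear
        destination.WriteByte((byte)value);
        return count + 1;
    }
}

For example, 300 comes out as the two bytes 0xAC 0x02, which the TryDecodeUInt32 above reads back as 300.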
It is also hugely important in graphics programming, for example, and in lots of other areas.
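As one hedged illustration of the graphics case (assuming a 32-bit ARGB pixel layout; the helper names are my own), packing and unpacking colour channels is pure masking and shifting:

using System;

public static class PixelDemo
{
    // Pack four 8-bit channels into one 32-bit ARGB pixel.
    public static uint PackArgb(byte a, byte r, byte g, byte b)
    {
        return ((uint)a << 24) | ((uint)r << 16) | ((uint)g << 8) | b;
    }

    // Pull the four channels back out with shifts and masks.
    public static void UnpackArgb(uint pixel, out byte a, out byte r, out byte g, out byte b)
    {
        a = (byte)((pixel >> 24) & 0xFF);
        r = (byte)((pixel >> 16) & 0xFF);
        g = (byte)((pixel >> 8) & 0xFF);
        b = (byte)(pixel & 0xFF);
    }

    public static void Main()
    {
        uint pixel = PackArgb(0xFF, 0x33, 0x66, 0x99);
        Console.WriteLine(pixel.ToString("X8")); // FF336699
    }
}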
There are also some micro-optimisations (maths-intensive work, etc.) that can be done with bit operations.
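Two classic examples of that kind of micro-optimisation, sketched here with made-up helper names (and with the caveat that modern compilers often apply these for you when the operands are constants):

using System;

public static class BitTricks
{
    // A power of two has exactly one bit set, so clearing the lowest
    // set bit (x & (x - 1)) leaves zero.
    public static bool IsPowerOfTwo(uint x)
    {
        return x != 0 && (x & (x - 1)) == 0;
    }

    // When the divisor is a known power of two, x % divisor can be
    // replaced by a simple mask.
    public static uint FastMod(uint x, uint powerOfTwoDivisor)
    {
        return x & (powerOfTwoDivisor - 1);
    }

    public static void Main()
    {
        Console.WriteLine(IsPowerOfTwo(64));   // True
        Console.WriteLine(IsPowerOfTwo(96));   // False
        Console.WriteLine(FastMod(1234, 256)); // 210, same as 1234 % 256
    }
}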