Question
Suppose you have an enum that is used for bit flags, e.g.,
[Flags]
internal enum _flagsEnum : byte
{
    None    = 0,             // 00000000
    Option1 = 1,             // 00000001
    Option2 = 1 << 1,        // 00000010
    Option3 = 1 << 2,        // 00000100
    Option4 = 1 << 3,        // 00001000
    Option5 = 1 << 4,        // 00010000
    Option6 = 1 << 5,        // 00100000
    Option7 = 1 << 6,        // 01000000
    Option8 = 1 << 7,        // 10000000
    All     = Byte.MaxValue, // 11111111
}
_flagsEnum myFlagsEnum = _flagsEnum.None;
Is it faster to do..
bool hasFlag = myFlagsEnum.HasFlag(_flagsEnum.Option1);
or to do..
bool hasFlag = (myFlagsEnum & _flagsEnum.Option1) != 0;
If there's a performance difference between checking multiple flags, then take that into account as well.
Normally I'd check the reference source, but in this case Enum.HasFlag just goes to an extern InternalHasFlags, so I have no idea what it's doing.
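For reference, the two checks side by side (a minimal sketch; note the bitwise form needs parentheses, because != binds more tightly than & in C#):

```csharp
using System;

[Flags]
internal enum _flagsEnum : byte
{
    None    = 0,
    Option1 = 1,
    Option2 = 1 << 1,
}

internal static class FlagCheckDemo
{
    internal static void Main()
    {
        _flagsEnum myFlagsEnum = _flagsEnum.Option1 | _flagsEnum.Option2;

        // Library call: performs a runtime type check on the argument.
        bool viaHasFlag = myFlagsEnum.HasFlag(_flagsEnum.Option1);

        // Plain bitwise test: the literal 0 implicitly converts to the enum type.
        bool viaBitwise = (myFlagsEnum & _flagsEnum.Option1) != 0;

        // For a single flag the two forms agree.
        Console.WriteLine(viaHasFlag == viaBitwise);
    }
}
```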
Answer 1:
There is a performance cost to using HasFlag, because the implementation verifies that the enum value that you pass is of the same type as the flag. With that difference out of the way, the implementation is highly optimized to avoid promoting shorter types, such as byte, to int:
switch (pMTThis->GetNumInstanceFieldBytes()) {
    case 1:
        cmp = ((*(UINT8*)pThis & *(UINT8*)pFlags) == *(UINT8*)pFlags);
        break;
    case 2:
        cmp = ((*(UINT16*)pThis & *(UINT16*)pFlags) == *(UINT16*)pFlags);
        break;
    case 4:
        cmp = ((*(UINT32*)pThis & *(UINT32*)pFlags) == *(UINT32*)pFlags);
        break;
    case 8:
        cmp = ((*(UINT64*)pThis & *(UINT64*)pFlags) == *(UINT64*)pFlags);
        break;
    default:
        // should not reach here.
        UNREACHABLE_MSG("Incorrect Enum Type size!");
        break;
}
The source of ReflectionEnum::InternalHasFlag can be found here.
Although the cost is relatively high, it is unlikely to matter except in the most extreme situations. I would recommend keeping HasFlag, unless your profiler points to this call as the largest bottleneck in your program.
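One subtlety when checking multiple flags at once, worth keeping in mind alongside the performance question (an illustrative sketch, not part of the original answer): the two forms are not equivalent for a combined mask. HasFlag requires all bits of the mask to be set, matching the `== *pFlags` comparison in the native code above, while `(value & mask) != 0` is true if any bit is set:

```csharp
using System;

[Flags]
internal enum _flagsEnum : byte
{
    None    = 0,
    Option1 = 1,
    Option2 = 1 << 1,
}

internal static class MultiFlagDemo
{
    internal static void Main()
    {
        _flagsEnum value = _flagsEnum.Option1;                      // only Option1 set
        _flagsEnum mask  = _flagsEnum.Option1 | _flagsEnum.Option2; // two-bit mask

        // HasFlag is an "all bits" test: false here, since Option2 is missing.
        Console.WriteLine(value.HasFlag(mask));

        // The != 0 form is an "any bit" test: true, since Option1 is set.
        Console.WriteLine((value & mask) != 0);

        // The bitwise equivalent of HasFlag is comparing against the full mask.
        Console.WriteLine((value & mask) == mask);
    }
}
```

So when translating a multi-flag HasFlag call into bitwise operators for speed, use `(value & mask) == mask` rather than `(value & mask) != 0` if you mean "all of these flags".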
Source: https://stackoverflow.com/questions/39493944/c-sharp-enum-hasflag-vs-bitwise-and-operator-check