I am writing algorithms that work on series of numeric data, where sometimes a value in the series needs to be null. However, because this application is performance critical, I've ruled out `Nullable<T>`.
Well, if you've ruled out `Nullable<T>`, you are left with domain values - i.e. a magic number that you treat as null. While this isn't ideal, it isn't uncommon either - for example, a lot of the main framework code treats `DateTime.MinValue` the same as null. This at least moves the damage far away from common values...
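As a rough sketch of that convention (the `TimestampSeries` helper and its `IsNull` name are mine, purely for illustration - the only part taken from the framework is the `DateTime.MinValue` habit):

```
using System;

static class TimestampSeries
{
    // Convention: DateTime.MinValue stands in for "no value",
    // mirroring how parts of the BCL treat it.
    public static DateTime Null => DateTime.MinValue;

    public static bool IsNull(DateTime value) => value == DateTime.MinValue;
}
```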
Edit, to highlight that this applies only where there is no NaN:
So where there is no NaN, maybe use `.MinValue` - but just remember the evils that happen if you accidentally use that same value while meaning the actual number... Obviously for unsigned data you'll need `.MaxValue` (avoid zero!!!).
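A minimal sketch of those sentinel choices, assuming primitive element types (the `Sentinels` class and its member names are illustrative, not anything standard):

```
static class Sentinels
{
    // Floating point: NaN is the natural "null" and cannot collide with real data.
    public static bool IsNull(double v) => double.IsNaN(v);

    // Signed integers: no NaN, so fall back to MinValue by convention.
    public const int NullInt32 = int.MinValue;
    public static bool IsNull(int v) => v == NullInt32;

    // Unsigned integers: MinValue is 0, which is far too common - use MaxValue.
    public const uint NullUInt32 = uint.MaxValue;
    public static bool IsNull(uint v) => v == NullUInt32;
}
```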
Personally, I'd try to use `Nullable<T>`, as expressing my intent more safely... there may be ways to optimise your `Nullable<T>` code, perhaps. And also - by the time you've checked for the magic number in all the places you need to, perhaps it won't be much faster than `Nullable<T>`?
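For comparison, a sketch of what the `Nullable<T>` version looks like (the `Sum` example is mine; `GetValueOrDefault()` skips the re-check that `.Value` performs and may be slightly cheaper, but measure rather than assume):

```
static class NullableDemo
{
    // Sums only the present values in a series of double?.
    public static double Sum(double?[] series)
    {
        double total = 0;
        foreach (double? point in series)
        {
            if (point.HasValue)
            {
                // Safe: HasValue was just checked, so no exception path here.
                total += point.GetValueOrDefault();
            }
        }
        return total;
    }
}
```

This is the same null check the magic number would need at every use site anyway - just with the compiler making it hard to forget.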