We noticed that lots of bugs in our software developed in C# (or Java) cause a NullReferenceException.
Is there a reason why "null" has even been included in the language?
Null is an essential requirement of any OO language. Any object variable that hasn't been assigned an object reference has to be null.
Like many things in object-oriented programming, it all goes back to ALGOL. Tony Hoare just called it his "billion-dollar mistake." If anything, that's an understatement.
Here is a really interesting thesis on how to make nullability not the default in Java. The parallels to C# are obvious.
Anders Hejlsberg, "C# father", just spoke about that point in his Computerworld interview:
For example, in the type system we do not have separation between value and reference types and nullability of types. This may sound a little wonky or a little technical, but in C# reference types can be null, such as strings, but value types cannot be null. It sure would be nice to have had non-nullable reference types, so you could declare that ‘this string can never be null, and I want you, compiler, to check that I can never hit a null pointer here’.
50% of the bugs that people run into today, coding with C# in our platform, and the same is true of Java for that matter, are probably null reference exceptions. If we had had a stronger type system that would allow you to say that ‘this parameter may never be null, and you, compiler, please check that at every call, by doing static analysis of the code’, then we could have stamped out classes of bugs.
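For illustration only, here is a minimal sketch of the kind of check he is asking for, written with the nullable reference type annotations that much later versions of C# eventually added (the method names here are made up):

#nullable enable

// 's' is declared non-nullable, so the body needs no null check.
static int Length(string s)
{
    return s.Length;
}

static void Caller(string? maybeNull)
{
    // The compiler warns here: a possibly-null argument is being passed
    // where a non-nullable 'string' is expected.
    Length(maybeNull);
}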
Cyrus Najmabadi, a former software design engineer on the C# team (now working at Google), discusses this subject on his blog (1st, 2nd, 3rd, 4th). It seems that the biggest hindrance to the adoption of non-nullable types is that the notation would disturb programmers’ habits and existing code bases. Something like 70% of the references in C# programs are likely to end up as non-nullable ones.
If you really want non-nullable reference types in C#, you should try Spec#, a C# extension that allows the use of "!" as a non-null annotation.
static string AcceptNotNullObject(object! s)
{
    return s.ToString();
}
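With that annotation, a call such as AcceptNotNullObject(null) is rejected at compile time instead of failing with an exception at run time.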
Null does not cause NullPointerExceptions...
Programmers cause NullPointerExceptions.
Without nulls, we are back to using an arbitrary sentinel value to signal that the return value of a function or method was invalid. You still have to check for the returned -1 (or whatever); removing nulls will not magically cure laziness, it will merely obfuscate it.
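String.IndexOf already works this way: it returns the sentinel -1 for "not found", and the caller still has to remember to check for it:

int index = "hello".IndexOf('z');
if (index == -1)
{
    // Handle the "not found" case, just as we would have handled null.
}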
There are situations in which null is a nice way to signify that a reference has not been initialized. This is important in some scenarios.
For instance:
MyResource resource = null; // initialized to null so the finally block can test it
try
{
    resource = new MyResource();
    //
    // Do some work
    //
}
finally
{
    if (resource != null)
        resource.Close();
}
In most cases this is accomplished with a using statement, but the pattern is still widely used.
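For comparison, a rough sketch of the using form, assuming MyResource implements IDisposable:

using (MyResource resource = new MyResource())
{
    //
    // Do some work; Dispose is called automatically, even if an exception is thrown.
    //
}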
With regard to your NullReferenceException: such errors are often easy to reduce by implementing a coding standard where all parameters are checked for validity. Depending on the nature of the project, I find that in most cases it's enough to check parameters on exposed members. If the parameters are not within the expected range, an ArgumentException of some kind is thrown, or an error result is returned, depending on the error handling pattern in use.
The parameter checking does not in itself remove bugs, but any bugs that occur are easier to locate and correct during the testing phase.
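As a sketch, such a guard clause on an exposed member might look like this (Process and Customer are hypothetical names):

public void Process(Customer customer)
{
    if (customer == null)
        throw new ArgumentNullException(nameof(customer));

    // From this point on, the method can rely on customer not being null.
}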
As a note, Anders Hejlsberg has mentioned the lack of non-null enforcement as one of the biggest mistakes in the C# 1.0 specification and that including it now is "difficult".
If you still think that a statically enforced non-null reference is of great importance, you could check out the Spec# language. It is an extension of C# in which non-null references are part of the language. This ensures that a reference marked as non-null can never be assigned a null reference.
Nullity is a natural consequence of reference types. If you have a reference, it has to refer to some object - or be null. If you were to prohibit nullity, you would always have to make sure that every variable was initialized with some non-null expression - and even then you'd have issues if variables were read during the initialization phase.
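A minimal sketch of that initialization issue: a reference-type field exists before the constructor assigns it, and until then the only value it can hold is null.

using System;

class Example
{
    private string name;

    public Example()
    {
        // The field is readable before it is assigned; a reference type's
        // default value is null.
        Console.WriteLine(name == null); // prints True
        name = "initialized";
    }
}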
How would you propose removing the concept of nullity?