Why does null exist in .NET?

醉酒成梦 · 2020-12-24 06:04

Why can values be null in .NET? Is this better than having a guarantee that everything has a value and nothing can be null?


11 Answers
  • 2020-12-24 06:16

    This reminds me of an episode of James Burke's "Connections" series where monks were transcribing Arabic to Latin and first encountered a zero digit. Roman arithmetic did not have a representation for zero, but Arabic/Aramaic arithmetic did. "Why do we have to write a letter to indicate nothing?" argued the Catholic monks. "If it is nothing, we should write nothing!"

    Fortunately for modern society, they lost the argument and learned to write zero digits in their maths. ;>

    Null simply represents an absence of an object. There are programming languages which do not have "null" per se, but most of them do still have something to represent the absence of a legitimate object. If you throw away "null" and replace it with something called "EmptyObject" or "NullNode", it's still a null just with a different name.

    If you remove the ability for a programming language to represent a variable or field that does not reference a legitimate object, that is, you require that every variable and field always contain a true and valid object instance, then you make some very useful and efficient data structures awkward and inefficient, such as building a linked list. Instead of using a null to indicate the end of the linked list, the programmer is forced to invent "fake" object instances to serve as list terminals that do nothing but indicate "there's nothing here".

    Delving into existentialism here, but: If you can represent the presence of something, then isn't there a fundamental need to be able to represent the absence of it as well?
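    The linked-list point above can be sketched in Java, whose references behave like the CLR's here (the Node and length names are illustrative, not any .NET API):

    ```java
    // A minimal singly linked list that uses null as its terminator (a sketch;
    // Node and length are illustrative names, not from any real library).
    class Node {
        final int value;
        final Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }

        // Walk the chain; null cleanly marks "there's nothing here".
        static int length(Node head) {
            int n = 0;
            for (Node cur = head; cur != null; cur = cur.next) n++;
            return n;
        }
    }

    public class NullTerminated {
        public static void main(String[] args) {
            Node list = new Node(1, new Node(2, new Node(3, null)));
            System.out.println(Node.length(list)); // prints 3
        }
    }
    ```

    Without null, the terminator would have to be a "fake" sentinel Node carrying a dummy value, invented for every element type.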

  • 2020-12-24 06:17

    Hysterical raisins.

    It's a hangover from C-level languages where you live on explicit pointer manipulation. Modern declarative languages (Mercury, Haskell, OCaml, etc.) get by quite happily without nulls. There, every value has to be explicitly constructed. The 'null' idea is handled through 'option' types, which have two values, 'no' (corresponding to null) and 'yes(x)' (corresponding to non-null with value x). You have to unpack each option value to decide what to do, hence: no null pointer reference errors.

    Not having nulls in a language saves you so much grief, it really is a shame the idea still persists in high-level languages.
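    Java's Optional gives a rough feel for the 'no'/'yes(x)' shape described above (findUser is a made-up example; unlike Mercury or Haskell, Java doesn't force the unpacking, but the pattern is the same):

    ```java
    import java.util.Optional;

    public class OptionDemo {
        // Absence is an ordinary value (the 'no' case), not a null reference.
        static Optional<String> findUser(int id) {
            return id == 1 ? Optional.of("alice") : Optional.empty();
        }

        public static void main(String[] args) {
            // You must unpack before use; there is nothing to dereference.
            String greeting = findUser(1)
                    .map(name -> "hello, " + name)   // the 'yes(x)' branch
                    .orElse("no such user");         // the 'no' branch
            System.out.println(greeting); // prints "hello, alice"
        }
    }
    ```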

  • 2020-12-24 06:18

    We've got Tony Hoare, an early pioneer who worked on ALGOL, to thank for that. He rather regrets it:

    I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.

    A billion is a low-ball number, I think.


    UPDATE: C# 8 and .NET Core have a decent solution for this problem: check out nullable reference types, which make ordinary references non-nullable by default and have the compiler enforce it.

  • 2020-12-24 06:18

    I'm not familiar with alternatives either, but I don't see a difference between Object.Empty and null, other than that null lets you know something is wrong when your code tries to access the object, whereas Object.Empty allows processing to continue. Sometimes you want one behavior, and sometimes you want the other. Distinguishing null from Empty is a useful tool for that.

  • 2020-12-24 06:24

    As appealing as a world without null is, it does present a lot of difficulty for many existing patterns and constructs. For example, consider the following constructs, which would need major changes if null did not exist:

    1. Creating an array of reference types, à la new object[42]. In the existing CLR world the array is filled with null, which would be illegal. Array semantics would need to change quite a bit here
    2. It makes default(T) useful only when T is a value type. Using it on reference types or unconstrained generics wouldn't be allowed
    3. Fields in a struct which are a reference type would need to be disallowed. A value type can be 0-initialized today in the CLR, which conveniently fills fields of reference types with null. That wouldn't be possible in a non-null world, hence fields whose types are reference types in structs would need to be disallowed
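    Point 1 can be observed on the JVM too, whose array semantics match the CLR's here:

    ```java
    import java.util.Arrays;

    public class ArrayDefaults {
        public static void main(String[] args) {
            // A freshly allocated array of a reference type is filled with
            // null -- there is simply no other value to put in the slots.
            String[] names = new String[3];
            System.out.println(names[0] == null); // prints true

            // A null-free language would have to change these semantics,
            // e.g. by demanding an explicit fill value at allocation time.
            String[] filled = new String[3];
            Arrays.fill(filled, "");
            System.out.println(filled[0].isEmpty()); // prints true
        }
    }
    ```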

    None of the above problems are unsolvable, but they do result in changes that really challenge how developers tend to think about coding. Personally I wish C# and .NET had been designed without null, but unfortunately they weren't, and I imagine problems like the above had a bit to do with it.

  • 2020-12-24 06:26

    I speculate that null exists in .NET because C# followed in the C++/Java footsteps (and has only started to branch out in more recent versions), and VB/J++ (which became VB.NET/J#) already had the notion of "nothing" values -- that is, .NET has null because of what was, not because of what it could have been.

    In some languages there is no notion of null -- null can be completely replaced with a type like Maybe -- there is Something (the object) or Nothing (but this is not null! There is no way to get the "Nothing" out of a Maybe!)

    In Scala with Option:

    val opt = Some("foo") // or perhaps, None
    opt match {
       case Some(x) => x.toString() // x not null here, but only by code-contract, e.g. Some(null) would allow it.
       case _ => "nothing :(" // opt was None
    }
    

    This is done by language design in Haskell (null not possible ... at all!) and by library support and careful usage such as in Scala, as shown above. (Scala supports null -- arguably for Java/C# interop -- but it is possible to write Scala code without using this fact unless null is allowed to "leak" in.)

    Edit: See Scala: Option Pattern, Scala: Option Cheat Sheet and SO: Use Maybe Type in Haskell. Most of the time talking about Maybe in Haskell brings up the topic of Monads. I won't claim to understand them, but here is a link along with usage of Maybe.
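    The monadic usage of Maybe amounts to chaining computations so that an absent value short-circuits the rest; here is a sketch with Java's Optional (parseInt is an illustrative helper, not any .NET method):

    ```java
    import java.util.Optional;

    public class MaybeChain {
        // Parsing that can fail yields "Nothing", never null.
        static Optional<Integer> parseInt(String s) {
            try {
                return Optional.of(Integer.parseInt(s));
            } catch (NumberFormatException e) {
                return Optional.empty();
            }
        }

        public static void main(String[] args) {
            // flatMap chains the steps; an empty result anywhere
            // short-circuits the whole expression.
            Optional<Integer> sum = parseInt("2")
                    .flatMap(a -> parseInt("40").map(b -> a + b));
            System.out.println(sum); // prints Optional[42]

            Optional<Integer> bad = parseInt("2")
                    .flatMap(a -> parseInt("oops").map(b -> a + b));
            System.out.println(bad); // prints Optional.empty
        }
    }
    ```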

    Happy coding.
