Where did variable = null as “object destroying” come from?

Submitted by 左心房为你撑大大i on 2019-12-05 09:28:16

Question


Working on a number of legacy systems written in various versions of .NET, across many different companies, I keep finding examples of the following pattern:

public void FooBar()
{
    object foo = null;
    object bar = null;

    try
    {
       foo = new object();
       bar = new object();

       // Code which throws exception.
    }
    finally
    {
       // Destroying objects
       foo = null;
       bar = null;
    }

}

To anybody who knows how memory management works in .NET, this kind of code is painfully unnecessary; the garbage collector does not need you to manually assign null to know that the old object can be collected, nor does assigning null instruct the GC to collect the object immediately.

This pattern is just noise, making it harder to understand what the code is trying to achieve.

Why, then, do I keep finding this pattern? Is there a school that teaches this practice? Is there a language in which assigning null values to locally scoped variables is required to correctly manage memory? Is there some additional value in explicitly assigning null that I haven't perceived?


Answer 1:


It's FUD cargo cult programming (thanks to Daniel Earwicker) by developers who are used to "freeing" resources, bad GC implementations and bad APIs.

Some GCs didn't cope well with circular references. To get rid of them, you had to break the cycle "somewhere". Where? Well, if in doubt, then everywhere. Do that for a year and it's moved into your fingertips.

Also, setting the field to null gives you the sense of "doing something", because as developers we always fear forgetting something.

Lastly, we have APIs which must be closed explicitly because there is no real language support to say "close this when I'm done with it" and let the computer figure it out, just like with GC. So you have APIs where you have to call cleanup code and APIs where you don't. This sucks and encourages patterns like the above.
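
For what it's worth, in C# the closest thing to "close this when I'm done with it" is the IDisposable/using pattern. The sketch below is only an illustration; the FileThing class and its members are invented, not part of any real API. It contrasts the manual try/finally cleanup style with using, and shows that nulling the local adds nothing either way.

// Hypothetical disposable resource, invented for this example.
sealed class FileThing : System.IDisposable
{
    public void Write(string text) { /* pretend this writes somewhere */ }
    public void Dispose() { /* release the underlying handle here */ }
}

static void ManualCleanup()
{
    FileThing thing = null;
    try
    {
        thing = new FileThing();
        thing.Write("hello");
    }
    finally
    {
        // The caller has to remember to call the cleanup API...
        if (thing != null) thing.Dispose();
        // ...and nulling the local afterwards adds nothing on top of Dispose().
        thing = null;
    }
}

static void WithUsing()
{
    // The compiler generates the try/finally and the Dispose() call.
    using (var thing = new FileThing())
    {
        thing.Write("hello");
    }
}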




Answer 2:


It is possible that it came from VB, which used a reference-counting strategy for memory management and object lifetime. Setting a reference to Nothing (equivalent to null) would decrement the reference count; once that count reached zero, the object was destroyed synchronously. The count was decremented automatically upon leaving the scope of a method, so even in VB this technique was mostly useless. However, there were special situations where you would want to greedily destroy an object, as illustrated by the following code.

Public Sub Main()
  Dim big As Variant
  Set big = GetReallyBigObject()
  Call big.DoSomething
  ' Drop the reference now: the ref count hits zero and the object is
  ' destroyed immediately, before the expensive work below runs.
  Set big = Nothing
  Call TimeConsumingOperation
  Call ConsumeMoreMemory
End Sub

In the above code, the object referenced by big would have lingered until the end of the method without the Set big = Nothing call. That may be undesirable if the rest of the method is a time-consuming operation or generates more memory pressure.




Answer 3:


It comes from C/C++, where explicitly setting your pointers to NULL was the norm (to eliminate dangling pointers).

After calling free():

#include <stdlib.h>

#define A_CONST 64   // arbitrary size, just for illustration

void example(void)
{
    char *dp = malloc ( A_CONST );

    // Once we free dp, it becomes a dangling pointer because it still
    // points to freed memory.
    free ( dp );

    // Set dp to NULL so it is no longer dangling.
    dp = NULL;
}

Classic VB developers also did the same thing when writing their COM components to prevent memory leaks.




Answer 4:


It is more common in languages with deterministic garbage collection and without RAII, such as the old Visual Basic. Even there it is mostly unnecessary, although breaking cyclic references (which reference counting cannot collect) did sometimes require it. So possibly it really stems from bad C++ programmers who use dumb pointers all over the place. In C++, it does make sense to set dumb pointers to 0 after deleting them to prevent double deletion.




Answer 5:


I've seen this a lot in VBScript code (classic ASP) and I think it comes from there.




Answer 6:


I think it used to be a common misunderstanding among former C/C++ developers. They knew that the GC would free their memory, but they didn't really understand when and how. Just clean it and carry on :)




Answer 7:


I suspect that this pattern comes from translating C++ code to C# without pausing to understand the differences between C# finalization and C++ finalization. In C++ I often null things out in the destructor, either for debugging purposes (so that you can see in the debugger that the reference is no longer valid) or, rarely, because I want a smart object to be released. (If that's the meaning I'd rather call Release on it and make the meaning of the code crystal-clear to the maintainers.) As you note, this is pretty much senseless in C#.

You see this pattern in VB/VBScript all the time too, for different reasons. I mused a bit about what might cause that here:

http://blogs.msdn.com/b/ericlippert/archive/2004/04/28/122259.aspx




Answer 8:


Maybe the convention of assigning null originated from the fact that, had foo been an instance variable instead of a local variable, you would need to remove the reference before the GC could collect the object. Someone slept through the first sentence and started nulling all their variables; the crowd followed.
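
A minimal C# sketch of that distinction (the class, field, and method names are invented for illustration): clearing a long-lived instance field really can make an object eligible for collection earlier, whereas clearing a local that is about to go out of scope buys nothing.

class Cache
{
    // The Cache instance itself may live for a long time.
    private byte[] _bigBuffer = new byte[64 * 1024 * 1024];

    public void DoneWithBuffer()
    {
        // Clearing the field removes the only reference held by this
        // long-lived object, so the buffer becomes collectible now.
        _bigBuffer = null;
    }
}

static void LocalExample()
{
    var temp = new byte[64 * 1024 * 1024];
    // ... use temp ...
    temp = null; // pointless: the local goes out of scope right here anyway
}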




Answer 9:


It comes from C/C++, where doing a free()/delete on an already released pointer could result in a crash, while releasing a NULL pointer simply did nothing.

This means that this construct (C++) will cause problems:

class myclass {}; // minimal stand-in type so the example compiles

void foo(bool some_condition)
{
  myclass *mc = new myclass(); // let's assume you really need new here
  if (some_condition)
  {
    delete mc;
  }
  delete mc; // double delete if the branch above ran: undefined behaviour
}

while this will work:

void foo(bool some_condition)
{
  myclass *mc = new myclass(); // let's assume you really need new here
  if (some_condition)
  {
    delete mc;
    mc = NULL;
  }
  delete mc; // deleting a null pointer is a no-op, so this is safe
}

Conclusion: it's totally unnecessary in C#, Java and just about any other garbage-collected language.




Answer 10:


Consider a slight modification:

public void FooBar() 
{ 
    object foo = null; 
    object bar = null; 

    try 
    { 
       foo = new object(); 
       bar = new object(); 

       // Code which throws exception. 
    } 
    finally 
    { 
       // Destroying objects 
       foo = null; 
       bar = null; 
    } 
    vavoom(foo,bar);
} 

The author(s) may have wanted to ensure that the great Vavoom (*) did not get pointers to malformed objects if an exception was previously thrown and caught. Paranoia, resulting in defensive coding, is not necessarily a bad thing in this business.

(*) If you know who he is, you know.




Answer 11:


VB developers had to dispose of all of their objects to try to mitigate the chance of a memory leak. I can imagine this is where it came from, as VB devs migrated over to .NET / C#.




Answer 12:


I can see it coming from either a misunderstanding of how garbage collection works, or an effort to force the GC to kick in immediately, perhaps because the objects foo and bar are quite large.
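
For reference, assigning null does not make the GC kick in; if someone really wanted an immediate collection they would have to ask for one explicitly, as in this sketch (and even that is almost always discouraged outside of very specific scenarios):

static void AfterLargeObjects()
{
    object foo = new byte[64 * 1024 * 1024];
    object bar = new byte[64 * 1024 * 1024];

    // ... use foo and bar ...

    foo = null; // on their own, these assignments do not trigger a collection
    bar = null;

    // An explicit, forced collection looks like this.
    System.GC.Collect();
    System.GC.WaitForPendingFinalizers();
}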




Answer 13:


I've seen this in some Java code before. It was used on a static variable to signal that the object should be destroyed.

It probably didn't originate from Java though, as using it for anything other than a static variable would also not make sense in Java.




Answer 14:


It comes from C++ code, especially smart pointers. In that case it's roughly equivalent to a .Dispose() in C#.

It's not good practice, at most a developer's instinct. There is no real value in assigning null in C#, except maybe helping the GC break a circular reference.



Source: https://stackoverflow.com/questions/3132258/where-did-variable-null-as-object-destroying-come-from
