Coming from C/C++ a long time ago, I still have a habit of making sure all resources are cleaned up correctly. I always ensure Dispose is called on anything that implements IDisposable, and I implement IDisposable on my own classes when they hold such resources.
Some of these resources, like handles, are a limited resource for the entire system, so if your application doesn't release them, other applications or even the OS may suffer. Have a look at Mark Russinovich's latest article in his Pushing the Limits of Windows series for examples.
Yes, I've maxed out the number of Oracle cursors when looping over a connection object, for example, because I forgot to close the command's reader. That happened after only 100 loops on a single connection, and I needed to support possibly hundreds of connections doing the same thing at the same time.
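For illustration, here is a minimal sketch of that kind of loop written against the generic ADO.NET interfaces (the real code used an Oracle provider, and the query is a placeholder). The command and the reader are created on every iteration, so both have to be disposed inside the loop; otherwise each open reader keeps its server-side cursor alive until the garbage collector eventually gets to it:

    using System.Data;

    static class CursorSafeLoop
    {
        static void ProcessBatches(IDbConnection connection)
        {
            for (int i = 0; i < 100; i++)
            {
                // Without these using blocks, every iteration leaks a command,
                // an open reader, and the Oracle-side cursor the reader holds.
                using (IDbCommand command = connection.CreateCommand())
                {
                    command.CommandText = "SELECT id, payload FROM some_table"; // placeholder query
                    using (IDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // ... process the row ...
                        }
                    } // reader disposed here, releasing the cursor immediately
                } // command disposed here
            }
        }
    }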
Your fellow developers should be taught to use the using() { ... } syntax if they can't be bothered to close up unmanaged resources themselves. It is good practice anyway, and you should use it too, since you yourself might forget to put your Dispose() calls in a finally {} clause so as to truly clean up in the event of an unhandled exception being thrown.
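A using block is essentially a compiler-generated try/finally around Dispose(). A minimal sketch, assuming command is an existing IDbCommand:

    using System.Data;

    static class UsingEquivalence
    {
        static void ReadRows(IDbCommand command)
        {
            // The using form: Dispose is guaranteed even if the body throws.
            using (IDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // ... process the row ...
                }
            }

            // Roughly what the compiler generates for the block above:
            IDataReader reader2 = command.ExecuteReader();
            try
            {
                while (reader2.Read())
                {
                    // ... process the row ...
                }
            }
            finally
            {
                if (reader2 != null)
                    reader2.Dispose();
            }
        }
    }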
If you can't win their hearts, change their minds: create tests that break their code by maxing out the resources they aren't cleaning up, then show that the "fix" is simple and easy and makes their code far more scalable. Or just show it to your boss and tell them this will let him/her sell the product as a new version with more scalability built in :) Hopefully your fellow developers will then be instructed to do this all the time, and you'll be held in higher regard too.
Not disposing database-related IDisposable objects is a reliable and efficient way to generate OutOfMemoryExceptions in production environments.
DataSet implements IDisposable, but I've read that it is not necessary to call Dispose, because the objects a DataSet needs to dispose are only created at design time (by the Visual Studio designer). I've never seen OOM from un-disposed DataSets (just OOM from enormous DataSets).
Besides the obvious cases (already mentioned) of resources running out, another benefit of IDisposable is that, since it guarantees Dispose() is called when a using block exits, you can use it for all kinds of things, even things that aren't just "perform an operation with an OS resource". In this way, it's like a poor man's substitute for Ruby blocks, or for one small use case of Lisp macros.
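For example, here is a hypothetical TimedScope type (not from the original answer) that borrows the using block purely as a "run this on exit" hook; no OS resource is involved:

    using System;
    using System.Diagnostics;

    // A scope guard: Dispose is used as the "block exit" callback.
    sealed class TimedScope : IDisposable
    {
        private readonly string _name;
        private readonly Stopwatch _watch = Stopwatch.StartNew();

        public TimedScope(string name)
        {
            _name = name;
        }

        public void Dispose()
        {
            _watch.Stop();
            Console.WriteLine($"{_name} took {_watch.ElapsedMilliseconds} ms");
        }
    }

    static class TimedScopeUsage
    {
        static void Main()
        {
            using (new TimedScope("report generation"))
            {
                // ... the work being timed ...
            } // Dispose runs here, printing the elapsed time
        }
    }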
Yes, I also ran into an issue with Connection objects to an Oracle database not getting disposed.
Mike Atlas's issue above is bad, but at least it was clear about what was going wrong. The issue we ran into was that, from time to time under heavy load, the site would start throwing errors when we tried to open a connection, but by the time we looked at the system it had all cleared up (because the garbage collector had collected the objects and freed up the connection pool). It was very difficult to reproduce, until I was looking through the code and noticed that a connection was not being closed in the event of an error; changing this to a using statement fixed the whole issue.
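A sketch of the bug and the fix, written against the generic ADO.NET interfaces rather than the actual Oracle provider; CreateConnection stands in for whatever connection factory the real code used, and the SQL is a placeholder:

    using System;
    using System.Data;

    static class ConnectionLeakExample
    {
        // Before: the connection is only closed on the happy path. If the
        // command throws, the connection isn't returned to the pool until
        // the garbage collector finalizes it, so under heavy load the pool
        // runs dry and new Open() calls start failing.
        static void SaveBroken(Func<IDbConnection> createConnection)
        {
            IDbConnection connection = createConnection();
            connection.Open();
            IDbCommand command = connection.CreateCommand();
            command.CommandText = "UPDATE orders SET status = 'shipped'"; // placeholder
            command.ExecuteNonQuery();
            connection.Close(); // never reached when an exception is thrown above
        }

        // After: using guarantees Dispose (which closes the connection and
        // returns it to the pool) on both the success and the error path.
        static void SaveFixed(Func<IDbConnection> createConnection)
        {
            using (IDbConnection connection = createConnection())
            using (IDbCommand command = connection.CreateCommand())
            {
                connection.Open();
                command.CommandText = "UPDATE orders SET status = 'shipped'"; // placeholder
                command.ExecuteNonQuery();
            } // Dispose runs here even if Open or ExecuteNonQuery throws
        }
    }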
The short answer is that if an object takes the effort to implement IDisposable, it's there for a reason, so ALWAYS dispose of it when you are done, ideally with a using statement. Don't get clever or tricky, disposing sometimes but not other times when you don't think you need to, blah blah blah. Just do what works every time.
The shorter, and more satisfying, answer is that you are right, and your coworkers are morons who don't know what they are doing.
Yes it matters. When an object implements IDisposable it is explicitly stating that it is holding resources that need to be released when the object is no longer needed.
Most will still clean up their resources when the object is finalized, but finalization is not deterministic and can't be relied on for resource management.
Simply wrapping the variable declarations in a using(...) block makes it easy to dispose of them properly.
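For completeness, here is a sketch of the standard dispose pattern these last points describe: the finalizer is only a non-deterministic safety net, and the deterministic path is Dispose, ideally via using. NativeBufferHolder and its unmanaged buffer are hypothetical:

    using System;
    using System.Runtime.InteropServices;

    sealed class NativeBufferHolder : IDisposable
    {
        private IntPtr _buffer = Marshal.AllocHGlobal(1024); // stand-in unmanaged resource

        public void Dispose()
        {
            ReleaseBuffer();
            GC.SuppressFinalize(this); // already cleaned up; skip the finalizer
        }

        // Safety net only: runs at some unspecified time after the object
        // becomes unreachable, which is exactly why it can't be relied on.
        ~NativeBufferHolder()
        {
            ReleaseBuffer();
        }

        private void ReleaseBuffer()
        {
            if (_buffer != IntPtr.Zero)
            {
                Marshal.FreeHGlobal(_buffer);
                _buffer = IntPtr.Zero;
            }
        }
    }

    static class NativeBufferHolderUsage
    {
        static void Main()
        {
            using (var holder = new NativeBufferHolder())
            {
                // ... use the buffer ...
            } // Dispose runs here, not whenever the GC gets around to finalizing
        }
    }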