I know it's a good habit to use close to close a file in Python once it's no longer needed. I have tried opening a large number of files without closing them (in the same Python process) and saw no errors or exceptions. Is explicitly closing files actually important?
It's a good idea to close files explicitly. Failing to do so isn't the sort of mistake that raises errors and exceptions: instead it can corrupt files, silently fail to write what you intended, and so on.
The most common Python interpreter, CPython (which you're probably using), does, however, try to handle file closing for you in case you forget. If you open a file and it later gets garbage collected, which in CPython generally happens as soon as no references to it remain, CPython will close the file.
So for example, if you have a function like
def write_something(fname):
    f = open(fname, 'w')
    f.write("Hi there!\n")
then Python will generally close the file shortly after the function returns, once the local f has gone away and the file object's reference count drops to zero.
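In CPython you can watch this happen, because reference counting closes the file as soon as the last reference disappears. A quick sketch (the filename here is made up, and other interpreters such as PyPy may delay the close until a later GC run):

import os

path = "demo.txt"        # hypothetical scratch file

f = open(path, "w")
f.write("Hi there!\n")   # sits in f's buffer; nothing is forced to disk yet
del f                    # last reference gone: CPython closes (and flushes) the file now

with open(path) as g:    # the data is visible because the close flushed the buffer
    print(g.read())      # -> Hi there!

os.remove(path)          # clean up the scratch file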
That's not that bad for simple situations, but consider this:
def do_many_things(fname):
    # Some stuff here
    f = open(fname, 'w')
    f.write("Hi there!\n")
    # All sorts of other stuff here
    # Calls to some other functions
    # More stuff
    return something
Now you've opened the file, but it could be a long time before it's closed. On some OSes, that might mean other processes can't open it in the meantime. If the other stuff raises an error, your message might never actually be written to the file. If you're writing a lot of data, some of it might be written and some might not; if you're editing a file in place, you might cause all sorts of problems. And so on.
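The standard fix is a with statement (or, in older code, try/finally), which closes the file deterministically the moment the block is left, whether normally or via an exception. A sketch of the function above, reworked:

def do_many_things(fname):
    # Some stuff here
    with open(fname, 'w') as f:
        f.write("Hi there!\n")
    # The file is already closed and flushed here, even if the write
    # raised an exception, so the rest of the work can't hold it open.
    # All sorts of other stuff here
    # Calls to some other functions
    return "something"   # stand-in for whatever the function computes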
An interesting question to consider, however, is whether, on OSes where a file can be open for reading by multiple processes at once, there's any significant risk to opening a file for reading and never closing it.
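Even there, one concrete limit applies: every open file consumes a file descriptor, a finite per-process resource. A sketch of exhausting them on a Unix-like system (the exact count depends on your ulimit; the error is typically errno 24, "Too many open files"):

import os
import tempfile

# Create a scratch file we can open for reading many times over.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as tmp:
    tmp.write("Hi there!\n")
    path = tmp.name

handles = []
try:
    while True:
        handles.append(open(path))   # open for reading, never closed
except OSError as e:
    print(f"gave up after {len(handles)} open handles: {e}")
finally:
    for h in handles:
        h.close()
    os.remove(path)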