About closing a file in Python

小鲜肉 2021-02-08 18:57

I know it is a good habit to use close() to close a file in Python when it is no longer needed. I have tried opening a large number of files without closing them (in the same Python process).

5 Answers
  •  一向 (OP)
     2021-02-08 19:09

    Python will, in general, garbage collect objects that are no longer in use and no longer being referenced. This means it's entirely possible that open file objects which qualify for collection will get cleaned up, and probably closed, along the way. However, you should not rely on this; instead use:

    with open(...) as f:
    

    Example (Also best practice):

    with open("file.txt", "r") as f:
        # do something with f
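
    For reference, the with statement above behaves roughly like closing the file by hand in a finally clause — a minimal sketch (the file name is just illustrative):

    f = open("file.txt", "r")
    try:
        data = f.read()   # use the file
    finally:
        f.close()         # runs even if an exception occurred above

    The with form is preferred because it is shorter and you cannot forget the close() call.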
    

    NB: If you don't close files and leave open file descriptors around on your system, you will eventually hit the per-process resource limit on file descriptors (see ulimit -n) and start to see OS errors about "too many open files". (Assuming Linux here, but other OSes have similar limits.)
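
    If you want to check that limit from inside Python, the standard-library resource module (Unix only) can report it — a small sketch, assuming a Unix-like system:

    import resource

    # (soft, hard) limit on open file descriptors; the soft value is what `ulimit -n` shows
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print("soft limit: %d, hard limit: %d" % (soft, hard))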

    Important: It's also good practice to close any open files you've written to, so that the data you wrote is properly flushed to disk. This helps ensure data integrity, so files don't unexpectedly end up truncated or corrupted if the application crashes.

    It's worth noting that the point above explains a classic source of confusion: you write to a file, read it back, and discover it appears empty; yet after you close your Python program (which flushes and closes the file on exit) and open the file in a text editor, you realize it's not empty after all.
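
    A small sketch of that situation (the file name is just illustrative; exact behaviour depends on buffering):

    f = open("out.txt", "w")
    f.write("important data")        # data sits in an internal buffer

    print(open("out.txt").read())    # may print an empty string: nothing flushed yet

    f.close()                        # flush happens here (or at interpreter exit)
    print(open("out.txt").read())    # now prints "important data"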

    Demo: A good example of the kind of resource limits and errors you might hit if you don't ensure you close open file(s):

    $ python
    Python 2.7.6 (default, Mar 22 2014, 22:59:56) 
    [GCC 4.8.2] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> xs = [open("/dev/null", "r") for _ in xrange(100000)]
    Traceback (most recent call last):
      File "", line 1, in 
    IOError: [Errno 24] Too many open files: '/dev/null'
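
    For what it's worth, the same experiment on Python 3 would fail in roughly the same way, just with xrange replaced by range and the error reported as OSError (IOError is an alias of OSError there):

    >>> xs = [open("/dev/null", "r") for _ in range(100000)]
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "<stdin>", line 1, in <listcomp>
    OSError: [Errno 24] Too many open files: '/dev/null'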
    
