python - ensure script is activated only once

Submitted by 隐身守侯 on 2019-12-06 01:39:35

Question


I'm writing a Python 2.7 script.
In short, this script runs every night on Linux and starts several processes.

I'd like to ensure this script is not run multiple times in parallel (basically trying to mimic the Singleton pattern, but at the application level).

Code Example

def main():
    # before doing anything, I'd like to know whether this
    # script was activated and alive. 
    # if so, error out

    # do something

if __name__ == "__main__":
    main()

Suggestion

The naive solution would be to create some kind of lock file that acts as a mutex.
The first thing we do is check whether this file exists. If so, another instance of the script already created it and we should error out. When the script is done, we remove the file.
I'm assuming this solution would work, as long as the operations on the file system are atomic.

Implementation

import os, sys

lock_mutex_path = ".lock_script"

def lock_mutex():
    if os.path.exists(lock_mutex_path):
        print "Error: script was already activated."
        sys.exit(-1)
    else:
        # create the lock file
        open(lock_mutex_path, 'w').close()

def unlock_mutex():
    assert os.path.exists(lock_mutex_path)
    os.remove(lock_mutex_path)

def main():
    lock_mutex()
    try:
        pass  # do something
    finally:
        # always release the lock, even if the work above raised
        unlock_mutex()

if __name__ == "__main__":
    main()

Problem

How to ensure lock_mutex() and unlock_mutex() are atomic?


Answer 1:


I use supervisor (http://supervisord.org/) to run stuff under Linux. It runs Django, Celeryd and so on and ensures that they get restarted in case they finish unexpectedly.

But it's also possible to set the options so a command isn't started or restarted automatically when it finishes: autostart=false, autorestart=false, startsecs=0. I use that for these cron jobs.

In cron I put the command "supervisorctl start myscript", which does nothing if myscript is already running under supervisor, and otherwise starts it.
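As a rough sketch of what that might look like (the program name myscript and the script path are assumptions for illustration, not from the original setup):

; supervisor program definition, e.g. in /etc/supervisor/conf.d/myscript.conf
[program:myscript]
command=/usr/bin/python /opt/scripts/nightly_job.py
; never start it automatically and never restart it when it exits
autostart=false
autorestart=false
startsecs=0

# crontab entry: ask supervisor to start it every night at 02:00
0 2 * * * supervisorctl start myscript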

Works perfectly, regardless of the language that the script is written in.




Answer 2:


Since you're using Linux, you can make use of flock:

import os
import fcntl
import time

def main():
  # acquire the prog lock
  if not prog_lock_acq('singleton.lock'):
    print("another instance is running")
    exit(1)

  print("program is running-press Ctrl+C to stop")
  while True:
    time.sleep(10)

def prog_lock_acq(lpath):
  fd = None
  try:
    fd = os.open(lpath, os.O_CREAT)
    # try to take an exclusive lock without blocking; raises if another
    # process already holds it
    fcntl.flock(fd, fcntl.LOCK_NB | fcntl.LOCK_EX)
    return True
  except (OSError, IOError):
    if fd is not None:
      os.close(fd)
    return False

if __name__ == '__main__':
  main()

It doesn't matter that we left the file open after exiting prog_lock_acq, because when the process exits the OS will close it automatically. Also, if you leave out the LOCK_NB option, the flock call will block until the currently running process quits. Depending on your use case, that might be useful.
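For illustration, a minimal sketch of that blocking variant, assuming the same lock-file approach; wait_for_lock is a made-up helper name:

def wait_for_lock(lpath):
  # blocks here until any other holder releases the lock or exits
  fd = os.open(lpath, os.O_CREAT)
  fcntl.flock(fd, fcntl.LOCK_EX)
  return fd  # keep the fd around so the lock stays held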

Note that we're not deleting the file on exit. It doesn't matter. Existence of the file doesn't indicate a live process—the lock does. So even if you kill your process with kill -9, the lock is still released.

There is, however, a caveat: if you unlink the lock file while the process is running, the next instance of the process will create a new file with no lock on it and run just fine, which violates our singleton design. You might be able to do something clever with a directory to prevent unlinking, but I'm not sure how robust that would be.



Source: https://stackoverflow.com/questions/19752168/python-ensure-script-is-activated-only-once
