Sharing a lock between gunicorn workers

南方客 2020-12-06 05:40

Is there a good way to share a multiprocessing Lock between gunicorn workers? I am trying to write a JSON API with Flask. Some of the API calls will interact with a Python class that manages a running process, and when I run more than one worker I need to ensure that only one worker interacts with that class at a time.

3 Answers
  • 2020-12-06 06:15

    Following peterw's answer, the workers can share the lock resource.

    However, it is better to use a try/finally block to ensure that the lock is always released.

    # dummy.py
    from multiprocessing import Lock
    import time

    lock = Lock()

    def start():
        lock.acquire()

        try:
            # TODO do work
            for i in range(10):
                print("did work %s" % i)
                time.sleep(1)
        finally:
            lock.release()
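
    Note that multiprocessing.Lock can also be used as a context manager, which gives the same acquire/release guarantee with less code; a minimal equivalent sketch:

    # dummy.py -- same behaviour, using the lock as a context manager
    from multiprocessing import Lock
    import time

    lock = Lock()

    def start():
        with lock:  # acquired here, released even if the body raises
            for i in range(10):
                print("did work %s" % i)
                time.sleep(1)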
    
  • 2020-12-06 06:19

    Late addition:
    If, for some reason, using preload_app is not feasible, then you need to use a named lock. This ensures that all processes use the same lock object; calling mp.Lock() independently in each process creates a different object per process, negating any value.

    I saw this package but have not used it yet. It supplies a named lock scoped to one machine, meaning that all processes on the same machine will use the same lock, but this solution is not appropriate across machine boundaries.
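
    For illustration only (this is not that package, and the helper name and lock-file path below are made up), the same machine-wide named-lock idea can be sketched with just the standard library, by taking an exclusive lock on a file that every process opens by the same name (POSIX only):

    # namedlock.py -- illustrative sketch of a machine-wide named lock
    import fcntl
    from contextlib import contextmanager

    @contextmanager
    def named_lock(name):
        # Every process that opens the same path shares the same underlying lock.
        path = "/tmp/%s.lock" % name
        with open(path, "w") as f:
            fcntl.flock(f, fcntl.LOCK_EX)   # blocks until no other holder remains
            try:
                yield
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)

    # usage inside a request handler:
    # with named_lock("convert-job"):
    #     ...do the exclusive work...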

  • 2020-12-06 06:27

    I tried something, and it seems to work. I put preload_app = True in my gunicorn.conf and now the lock seems to be shared. I am still looking into exactly what's happening here but for now this is good enough, YMMV.
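
    For reference, a rough sketch of that setup (the module and route names are made up for illustration, not taken from the post): with preload_app = True, gunicorn imports the application once in the master process and then forks, so a Lock created at module level is inherited by every worker.

    # gunicorn.conf.py
    preload_app = True   # load the app in the master before forking workers
    workers = 4
    bind = "0.0.0.0:8000"

    # app.py
    from multiprocessing import Lock
    from flask import Flask, jsonify

    app = Flask(__name__)
    lock = Lock()  # created before the fork, so all workers share this one object

    @app.route("/work")
    def work():
        if not lock.acquire(block=False):
            # another worker is already doing the exclusive work
            return jsonify(status="busy"), 409
        try:
            # ...do the exclusive work here...
            return jsonify(status="ok")
        finally:
            lock.release()

    Start it with gunicorn -c gunicorn.conf.py app:app.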
