Python, WSGI, multiprocessing and shared data

Backend · Unresolved · 3 answers · 518 views
粉色の甜心 · asked 2021-01-30 22:43

I am a bit confused about the multiprocessing feature of mod_wsgi and about the general design of WSGI applications that would be executed on WSGI servers with multiprocessing ability.

3 Answers
  •  小鲜肉
     2021-01-30 23:33

    From the mod_wsgi documentation on processes and threading:

    When Apache is run in a mode whereby there are multiple child processes, each child process will contain sub interpreters for each WSGI application.
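    For reference, the kind of setup being discussed would look roughly like the following in mod_wsgi daemon mode (the process group name and paths here are invented for illustration):

        # Hypothetical mod_wsgi daemon-mode configuration: 5 processes with
        # 1 thread each, i.e. 5 separate Python interpreters and no shared memory.
        WSGIDaemonProcess counterapp processes=5 threads=1
        WSGIProcessGroup counterapp
        WSGIScriptAlias / /var/www/counterapp/app.wsgi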

    This means that in your configuration (5 processes with 1 thread each) there will be 5 interpreters and no shared data. Your counter object will be unique to each interpreter. You would need to either build some custom solution to count sessions (one common process you can communicate with, some kind of persistence-based solution, etc.) or, and this is definitely my recommendation, use a prebuilt solution (Google Analytics and Chartbeat are fantastic options).
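    To make that concrete, here is a minimal sketch (mine, not from the docs) of a WSGI app that keeps both a per-process global counter and a crude shared counter backed by SQLite, as one example of a persistence-based solution; the database path and table layout are assumptions for illustration:

        import os
        import sqlite3

        # Module-level global: every mod_wsgi daemon process gets its own copy,
        # so with processes=5 there are five independent counters.
        local_count = 0

        # Hypothetical location for the shared store; pick a path that suits
        # your deployment (and that the Apache user can write to).
        DB_PATH = "/tmp/session_counter.db"

        def _init_db():
            # Safe to run from every process at startup: both statements are
            # idempotent, so concurrent initialisation does no harm.
            with sqlite3.connect(DB_PATH) as conn:
                conn.execute("CREATE TABLE IF NOT EXISTS counter"
                             " (id INTEGER PRIMARY KEY, n INTEGER NOT NULL)")
                conn.execute("INSERT OR IGNORE INTO counter (id, n) VALUES (1, 0)")

        def _bump_shared_counter():
            # SQLite serialises writers, so increments coming from different
            # processes are not lost; the SELECT runs in the same transaction.
            with sqlite3.connect(DB_PATH, timeout=5.0) as conn:
                conn.execute("UPDATE counter SET n = n + 1 WHERE id = 1")
                return conn.execute(
                    "SELECT n FROM counter WHERE id = 1").fetchone()[0]

        _init_db()

        def application(environ, start_response):
            global local_count
            local_count += 1                    # visible only in this process
            shared = _bump_shared_counter()     # visible across all processes

            body = ("pid=%d local=%d shared=%d\n"
                    % (os.getpid(), local_count, shared)).encode("utf-8")
            start_response("200 OK", [("Content-Type", "text/plain"),
                                      ("Content-Length", str(len(body)))])
            return [body]

    Hitting the app repeatedly should show the local value drifting apart between PIDs while the shared value keeps increasing, which is exactly the behaviour described above.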

    I tend to think of using globals to share data as a big form of global abuse. It has been a source of bugs and a portability problem in most of the environments where I've done parallel processing. What if your application suddenly had to run on multiple virtual machines? That would break your code no matter what sharing model the threads and processes use.
