I have a class function (let's call it "alpha.py") that uses multiprocessing (processes=2) to fork a process and is part of a Python package that I wrote. In a separate Python script, "beta.py", I import that package and call the function, but on Windows I get a RuntimeError telling me to add freeze_support().
Actually, freeze_support() is not needed here. You get a RuntimeError because you create and start your new processes at the top level of your beta module.
When a new process is created using multiprocessing on Windows, a new Python interpreter is started in that process, and it tries to import the module that contains the target function to be executed. That is your beta module. When it imports it, all of your top-level statements are executed, which creates and starts a new process again, and then, recursively, another process from that process, and so on.
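For illustration, a beta.py that runs into this might look roughly like the sketch below (the worker function and the exact pool usage are placeholders, since the original code is not shown):

    # beta.py -- problematic sketch; names are hypothetical
    import multiprocessing

    def work(x):
        # stand-in for the real function from the alpha package
        return x * 2

    # Top-level process creation: on Windows every child interpreter
    # re-imports this module and hits these lines again, so it tries to
    # spawn its own children and multiprocessing raises a RuntimeError.
    pool = multiprocessing.Pool(processes=2)
    print(pool.map(work, [1, 2, 3, 4]))
    pool.close()
    pool.join()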
This is most likely not what you want, so new processes should be initialized and started only once, when you run beta.py directly with a subprocess.
An if __name__ == '__main__': guard should be placed in beta.py, and the initialization and start code for your new processes moved into that section. After that, when beta.py is imported rather than run directly, no new process will be started and you will not see any side effects.
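A minimal sketch of the fixed beta.py, again with placeholder names, would then be:

    # beta.py -- fixed sketch; names are hypothetical
    import multiprocessing

    def work(x):
        # stand-in for the real function from the alpha package
        return x * 2

    if __name__ == '__main__':
        # Runs only when beta.py is executed directly, not when the child
        # interpreter imports this module to locate the target function.
        pool = multiprocessing.Pool(processes=2)
        print(pool.map(work, [1, 2, 3, 4]))
        pool.close()
        pool.join()

With the guard in place, importing beta (including the re-import done by each spawned child on Windows) no longer creates processes, so the pool is built only once.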