python -> multiprocessing module

别跟我提以往 2021-02-04 18:07

Here's what I am trying to accomplish -

  1. I have about a million files which I need to parse & append the parsed content to a single file.
  2. Since a single process would take far too long, I am using the multiprocessing module to parse the files in parallel; the output file (data_file) is opened in my script's global namespace, and I am not sure whether the worker processes can safely write to it.
2 Answers
  •  孤独总比滥情好
    2021-02-04 18:57

    The docs for multiprocessing indicate several methods of sharing state between processes:

    http://docs.python.org/dev/library/multiprocessing.html#sharing-state-between-processes
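
    For concreteness, here is a minimal sketch of one of those methods, a shared-memory counter (multiprocessing.Value) that several processes increment under its lock; the page also covers Queue, Pipe, and Manager objects:

        import multiprocessing as mp

        def count_one(counter):
            # Value objects created with the default lock=True carry a lock.
            with counter.get_lock():
                counter.value += 1

        if __name__ == '__main__':
            counter = mp.Value('i', 0)  # 'i' = C signed int, initial value 0
            procs = [mp.Process(target=count_one, args=(counter,))
                     for _ in range(4)]
            for p in procs:
                p.start()
            for p in procs:
                p.join()
            print(counter.value)  # prints 4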

    I'm fairly sure each worker process gets a fresh interpreter, into which the target function and its args are then loaded. In that case, the global namespace from your script would be bound to your worker function, so data_file would be visible there. However, I am not sure what happens to the underlying file descriptor when it is copied across processes. Have you tried passing the file object as one of the args?

    An alternative is to pass another Queue that will hold the results from the workers. The workers put their results on this queue, and the main process gets them and writes them to the file.
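
    Roughly, that pattern might look like the sketch below, where parse() is a hypothetical stand-in for your real parsing logic and combined.out for your output file: workers pull paths from a task queue, push parsed text to a result queue, and only the main process writes the file.

        import multiprocessing as mp

        def parse(path):
            # Hypothetical parser; replace with your real parsing logic.
            with open(path) as f:
                return f.read()

        def worker(task_queue, result_queue):
            # Pull file paths until the None sentinel appears, then stop.
            for path in iter(task_queue.get, None):
                result_queue.put(parse(path))
            result_queue.put(None)  # tell the main process this worker is done

        if __name__ == '__main__':
            paths = ['a.txt', 'b.txt']  # in your case, ~a million paths
            n_workers = 4

            task_queue = mp.Queue()
            result_queue = mp.Queue()
            workers = [mp.Process(target=worker,
                                  args=(task_queue, result_queue))
                       for _ in range(n_workers)]
            for w in workers:
                w.start()

            for path in paths:
                task_queue.put(path)
            for _ in workers:
                task_queue.put(None)  # one sentinel per worker

            # Only the main process touches the output file.
            finished = 0
            with open('combined.out', 'w') as out:
                while finished < n_workers:
                    result = result_queue.get()
                    if result is None:
                        finished += 1
                    else:
                        out.write(result)

            for w in workers:
                w.join()

    Since only the main process ever touches the output file, no locking around the writes is needed, and the parsed content from all workers still ends up in a single file.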
