How to parallelize this nested loop in Python that calls Abaqus

Asked by 自作多情 on 2019-12-14 03:18:12

Question


I have the nested loops below. How can I parallelize the outer loop so that it is split into 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?

    for r in range(4):
        for k in range( r*nAnalysis//4, (r+1)*nAnalysis//4 ):

            # - Write Abaqus INP file - #
            writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])

            # - Delete LCK file to Enable Another Analysis - #
            delFile(aPath[k]+"/"+inpFiles[k]+".lck")

            # - Run Analysis - #
            runABQfile(inpFiles[k],aPath[k])
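
In other words, the pattern I am after is "start four workers, then block until all of them have finished". A minimal sketch of that pattern with plain multiprocessing.Process and join(), assuming the loop body above is moved into a module-level helper (run_chunk is just an illustrative name):

    from multiprocessing import Process

    def run_chunk(r):
        # hypothetical helper: runs the write/delete/run steps for chunk r
        for k in range( r*nAnalysis//4, (r+1)*nAnalysis//4 ):
            writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])
            delFile(aPath[k]+"/"+inpFiles[k]+".lck")
            runABQfile(inpFiles[k],aPath[k])

    workers = [Process(target=run_chunk, args=(r,)) for r in range(4)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()   # the rest of the script continues only after all four finish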

I tried using multiprocessing.Pool as follows, but it never gets into the function:

    def parRunABQfiles(nA,nP,r,ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_):
        import os
        from time import sleep
        from auxFunctions import writeABQfile, runABQfile
        print("I am Here")
        for k in range( r*nA//nP, (r+1)*nA//nP ):
            # - Write Abaqus INP file - #
            writeABQfile(ppos,prop0,prop1,totalTime2Run_,recIntervals_,inpFiles_,i,lineNumbers_,aPath_)
            # - Delete LCK file to Enable Another Analysis - #
            delFile(aPath_+"/"+inpFiles_[k]+".lck")
            # - Run Analysis - #
            runABQfile(inpFiles_,aPath_)
            # - Make Sure Analysis is not Bypassed - #
            while os.path.isfile(aPath_+"/"+inpFiles_[k]+".lck"):
                sleep(0.1)
        return k

    results = zip(*pool.map(parRunABQfiles, range(0, 4, 1)))
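
For reference, pool.map passes exactly one argument to the worker function, so a function with this many parameters is usually wrapped first, for example with functools.partial. A minimal sketch of that wiring, assuming the bound values exist in the parent process under the names shown:

    from functools import partial
    from multiprocessing import Pool

    # bind everything except r, which pool.map supplies from range(4)
    worker = partial(parRunABQfiles, nAnalysis, 4,
                     ppos=ppos, prop0=prop0, prop1=prop1,
                     totalTime2Run_=totalTime, recIntervals_=recInt,
                     inpFiles_=inpFiles, i=i, lineNumbers_=lineNum, aPath_=aPath)

    pool = Pool(processes=4)
    results = pool.map(worker, range(4))   # blocks until all four chunks return
    pool.close()
    pool.join()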

runABQfile is just a subprocess.call to an sh script that runs Abaqus:

    def runABQfile(inpFile,path):
        import subprocess

        prcStr1 = 'sbatch '+path+'/runJob.sh'

        process = subprocess.call(prcStr1, stdin=None, stdout=None, stderr=None, shell=True)

        return
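
Note that sbatch normally returns as soon as the job is queued, so subprocess.call comes back long before the Abaqus analysis finishes, which is presumably why the .lck polling loop above exists. A minimal sketch of a variant that at least checks the submission's exit code, so a failed sbatch call is not silently ignored:

    import subprocess

    def runABQfile(inpFile, path):
        # sbatch exits once the job is queued; a non-zero exit code means the
        # submission itself failed, not that the analysis failed
        cmd = 'sbatch ' + path + '/runJob.sh'
        rc = subprocess.call(cmd, shell=True)
        if rc != 0:
            raise RuntimeError('sbatch failed for %s (exit code %d)' % (inpFile, rc))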

No errors show up, so I am not sure why it never gets into the function. I know it doesn't because writeABQfile never writes the input file. The question again is:

How can I parallelize the outer loop so that it is split into 4 simultaneous runs, and wait for all 4 runs to complete before moving on with the rest of the script?


Answer 1:


Use the concurrent.futures module if multiprocessing is what you want.

    from concurrent.futures import ProcessPoolExecutor

    def each(r):
        for k in range( r*nAnalysis//4, (r+1)*nAnalysis//4 ):
            writeABQfile(ppos,props,totalTime[k],recInt[k],inpFiles[k],i,lineNum[k],aPath[k])
            delFile(aPath[k]+"/"+inpFiles[k]+".lck")
            runABQfile(inpFiles[k],aPath[k])

    with ProcessPoolExecutor(max_workers=4) as executor:
        output = executor.map(each, range(4)) # returns an iterable

If you just want to "do" stuff rather than "produce" results, check out the as_completed function from the same module. There are direct examples in the docs.
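
For completeness, a minimal sketch of that as_completed pattern, reusing the same each function as above; calling result() also re-raises any exception thrown inside a worker, which helps when a run seems to fail silently:

    from concurrent.futures import ProcessPoolExecutor, as_completed

    with ProcessPoolExecutor(max_workers=4) as executor:
        futures = [executor.submit(each, r) for r in range(4)]
        for future in as_completed(futures):
            future.result()   # re-raises any exception raised inside the worker
    # the rest of the script only runs once all four chunks have finished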



Source: https://stackoverflow.com/questions/37169336/how-to-parallelize-this-nested-loop-in-python-that-calls-abaqus
