Python: dynamically write huge files while avoiding 100% CPU usage

Submitted by 十年热恋 on 2019-12-24 16:33:48

Question


I am parsing a huge CSV (approx. 2 GB) with the help of this great stuff. Now I have to generate a separate file for each column, with the column name as the file name. So I wrote this code to write the dynamic files:

def write_CSV_dynamically(self, header, reader):
  """
  :header - the CSV's first row, as a comma-separated string
  :reader - the CSV's remaining rows, each a comma-separated string
  """
  try:
    headerlist = header.split(',')  # column names
    filename = "{}_" + "{}.csv".format(self.dtstamp)
    # one open (append-mode) file handle per column, keyed by file name
    filesdct = {filename.format(k.strip()): open(filename.format(k.strip()), 'a')
                for k in headerlist}
    for row in reader:
      # pair each column name with the corresponding value in this row
      for key, data in zip(headerlist, row.split(',')):
        filesdct[filename.format(key.strip())].write(str(data) + "\n")
    for v in filesdct.values():
      v.close()
  except Exception as e:
    print(e)

Right now it takes around 50 seconds to write these huge files, at 100% CPU. Since there are other heavy processes running on my server, I want my program to use only 10% to 20% of the CPU while writing these files, even if that takes 10-15 minutes. How can I change my code so that it limits itself to 10-20% CPU usage?


Answer 1:


There are a number of ways to achieve this:

  • Nice the process - plain and simple; see the first sketch after this list.

  • cpulimit - just pass your script and a CPU usage limit as parameters:

    cpulimit -P /path/to/your/script -l 20

  • Python's resource module - set limits from inside the script. Bear in mind that it works with absolute CPU time (seconds of CPU consumed), not a percentage; see the second sketch below.
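
For the first option, a minimal sketch: on Unix, a process can lower its own scheduling priority with os.nice (the increment of 10 here is just an example value):

import os

# Lower this process's scheduling priority. os.nice() adds the given
# increment to the current niceness (0 by default, 19 = lowest priority
# on Linux), so other processes on the box get the CPU first.
os.nice(10)

Note that niceness does not cap usage at a fixed percentage; it only makes the process yield the CPU when other work is waiting, which matches the "other heavy things running on my server" situation.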
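
For the resource approach, a minimal sketch (the 600-second limit and the handler are illustrative assumptions, not values from the question): RLIMIT_CPU bounds the total CPU seconds the process may consume, and when the soft limit is exceeded the kernel sends SIGXCPU, which the script can catch to shut down cleanly:

import resource
import signal
import sys

def on_cpu_limit(signum, frame):
    # Called when the soft RLIMIT_CPU limit is exceeded (SIGXCPU).
    sys.exit("CPU time limit reached; stopping.")

signal.signal(signal.SIGXCPU, on_cpu_limit)

# Cap total CPU time at 600 seconds (an example value; this is absolute
# CPU time consumed, not a percentage of the CPU).
_, hard = resource.getrlimit(resource.RLIMIT_CPU)
resource.setrlimit(resource.RLIMIT_CPU, (600, hard))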



Source: https://stackoverflow.com/questions/35547734/python-write-dynamically-huge-files-avoiding-100-cpu-usage
