Workaround to R memory leak with XML package

Asked 2020-12-11 19:34 by 灰色年华 · 3 answers · 974 views

I am trying to run a simple program that extracts tables from HTML code. However, there seems to be a memory leak in readHTMLTable in the XML package. Is there any way I can work around this?

3 Answers
  • 2020-12-11 19:45

    As of XML 3.98-1.4 and R 3.1 on Win7, this problem can be solved by calling free() on the parsed document, although that does not work with readHTMLTable() directly. The following code runs without the memory footprint growing:

    library(XML)
    a = readLines("http://en.wikipedia.org/wiki/2014_FIFA_World_Cup")
    while(TRUE){
       # re-parse the page on each iteration; without free() below,
       # this loop would grow the R process indefinitely
       b = xmlParse(paste(a, collapse = ""))
       #do something with b
       free(b)   # release the C-level document before the next iteration
    }
    

    The xml2 package has similar issues; there the memory can be released by removing the document object with rm() and then calling gc().
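    A minimal sketch of that xml2 pattern (assuming the xml2 package is installed; the HTML string here is just a made-up example):

    ```r
    library(xml2)

    html <- "<html><body><table><tr><td>1</td></tr></table></body></html>"
    doc  <- read_html(html)   # doc is an external pointer into libxml2

    # ... extract whatever is needed into plain R objects ...

    rm(doc)          # drop the R-side reference to the external pointer
    invisible(gc())  # run the collector so the C memory can be finalized
    ```

    The key point is that gc() can only release the libxml2 memory once no R object still references the document.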

  • 2020-12-11 19:58

    Same problem here: even doing nothing more than reading in the document with doc <- xmlParse(...); root <- xmlRoot(doc), the memory allocated to doc is never released to the OS (as monitored in the Windows Task Manager).

    A crazy idea worth trying is to use system("Rscript ...") to perform the XML parsing in a separate R session, saving the parsed result to a file that the main R session then reads back in. Hacky, but it at least ensures that whatever memory is gobbled up by the XML parsing is released when the Rscript session terminates and never affects the main process!
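    A rough sketch of that idea (the URL and the use of readHTMLTable() in the child script are illustrative assumptions; note that only plain R objects such as data frames survive the file round trip, so all XML work must finish inside the child):

    ```r
    url <- "http://en.wikipedia.org/wiki/2014_FIFA_World_Cup"
    out <- tempfile(fileext = ".rds")

    # Write a throwaway child script that does all the XML work and
    # serialises only ordinary R objects (data frames), not XML pointers.
    script <- tempfile(fileext = ".R")
    writeLines(c(
      "library(XML)",
      sprintf('tables <- readHTMLTable("%s")', url),
      sprintf('saveRDS(tables, "%s")', out)
    ), script)

    # All libxml2 memory lives and dies with this child process.
    system2("Rscript", script)

    tables <- readRDS(out)  # plain data frames back in the main session
    ```

    The design choice here is that the operating system reclaims everything the child allocated, regardless of whether the XML package ever frees it.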

  • 2020-12-11 20:05

    I had a lot of problems with memory leaks in the XML package too (under both Windows and Linux), but the way I eventually solved it was to remove the object at the end of each processing step, i.e. add rm(b) and gc() at the end of each iteration. Let me know if this works for you too.
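    A minimal sketch of that per-iteration cleanup (the vector of URLs is hypothetical):

    ```r
    library(XML)

    urls <- c("http://example.com/page1.html",
              "http://example.com/page2.html")  # hypothetical pages

    for (u in urls) {
      b <- readHTMLTable(u)
      # ... process the tables in b ...
      rm(b)           # drop the reference at the end of the iteration
      invisible(gc()) # force a collection so C-level memory can be reclaimed
    }
    ```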
