How can I get R to use more CPU usage?

Asked 2021-01-19 01:27

I noticed that R doesn't use all of my CPU, and I want to increase that tremendously (upwards of 100%). I don't want it to just parallelize a few functions; I want R to use as much of my CPU as possible.

2 Answers
  • 2021-01-19 01:57

    If you are trying to run 4 different LPs in parallel, here's how to do it with snowfall.

    library(snowfall)
    sfInit(parallel=TRUE, cpus=4)
    sfSource("code.R") # if you have your function in a separate file
    sfExport(list=c("variable1","variable2",
                "functionname1")) # export your variables and function to the cluster
    results <- sfClusterApplyLB(parameters, functionname) # runs the function on the cluster workers
    sfStop() # shut the cluster down when done
    

    E.g. the function passed to sfClusterApplyLB could contain your LP.

    Otherwise, see the comments on your question.
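
    As an alternative to snowfall, here is a minimal sketch using base R's parallel package. The function name solve_lp and the parameter list are hypothetical placeholders; substitute your own LP-solving function.

    library(parallel)

    # Hypothetical worker: solve one LP for a given parameter set.
    solve_lp <- function(params) {
      # ... call your LP solver here, e.g. lpSolve::lp() ...
      sum(params)  # placeholder result
    }

    param_list <- list(1:3, 4:6, 7:9, 10:12)

    cl <- makeCluster(4)                  # one worker per LP
    clusterExport(cl, "solve_lp")         # ship the function to the workers
    results <- parLapply(cl, param_list, solve_lp)
    stopCluster(cl)

    parLapply balances less aggressively than sfClusterApplyLB; for LPs of very uneven size, parLapplyLB is the closer equivalent.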

  • 2021-01-19 02:12

    Posting this as an answer because there's not enough space in a comment.
    This does not address your question directly, but rather performance in general.


    By default, R links against reference BLAS/LAPACK libraries that are slow and single-threaded. Faster alternatives are OpenBLAS and ATLAS. These, however, can be a pain to install.
    Personally, I eventually got it working using this guide.
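
    To see whether a switch to an optimized BLAS took effect, you can check what R is linked against and time a BLAS-bound operation. This is a rough sketch; exact sessionInfo() output varies by platform and R version.

    # In R >= 3.4, sessionInfo() reports the BLAS/LAPACK libraries in use.
    sessionInfo()

    # Large matrix multiplication is BLAS-bound, so a multithreaded
    # optimized BLAS should cut this time substantially.
    n <- 2000
    m <- matrix(rnorm(n * n), n, n)
    system.time(m %*% m)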

    I ended up using Revolution R Open (RRO) + MKL, which includes both improved BLAS libraries and multi-CPU support. It is an alternative R distribution that is claimed to be up to 20x faster than regular R (I cannot confirm that figure, but it is a lot faster).

    Furthermore, you could check the CRAN High-Performance Computing task view to see if there are any packages that provide a faster lp function.

    There are also packages for using multiple CPUs.
    This answer by Gavin, as well as @user3293236's answer above, shows several packages that enable multi-CPU usage.
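
    One common combination from those packages is foreach with the doParallel backend (both on CRAN). A minimal sketch, assuming both packages are installed:

    library(foreach)
    library(doParallel)

    n_cores <- parallel::detectCores()
    registerDoParallel(cores = n_cores)  # register a parallel backend

    # Each iteration can run on its own worker; results come back as a list.
    results <- foreach(i = 1:8) %dopar% {
      i^2  # replace with real work, e.g. one LP per iteration
    }

    stopImplicitCluster()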
