Jupyter Lab freezes the computer when out of RAM - how to prevent it?

佛祖请我去吃肉 2021-02-04 08:11

I have recently started using Jupyter Lab, and my problem is that I work with quite large datasets (usually the dataset itself is approximately 1/4 of my computer's RAM). After a few transformations, saved as new Python objects, I run out of memory and the whole computer freezes. How can I prevent this?

7 Answers
  • 2021-02-04 08:52

    By far the most robust solution to this problem would be to use Docker containers. You can specify how much memory to allocate to Jupyter, and if the container runs out of memory it's simply not a big deal: the process inside the container gets killed while the rest of your machine stays responsive (just remember to save frequently, but that goes without saying).

    This blog post will get you most of the way there. There are also some decent instructions for setting up Jupyter Lab from one of the freely available, officially maintained Jupyter images here:

    https://medium.com/fundbox-engineering/overview-d3759e83969c

    You can then modify the docker run command described in the tutorial to cap the container's memory (e.g. at 3 GB):

    docker run --memory 3g <other docker run args from tutorial here>
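
    For a fuller picture, here is a sketch of what the complete command might look like. The jupyter/scipy-notebook image, the port mapping, and the mount path are assumptions based on the standard Jupyter Docker Stacks, not details taken from the tutorial:

    # A minimal sketch, assuming the jupyter/scipy-notebook image from the
    # official Jupyter Docker Stacks; the port, mount path, and 3 GB cap
    # are illustrative choices, not requirements.
    docker run -it --rm \
      --memory 3g \
      --memory-swap 3g \
      -p 8888:8888 \
      -v "$PWD":/home/jovyan/work \
      jupyter/scipy-notebook
    # --memory sets the hard RAM cap: when a process in the container
    # exceeds it, the cgroup OOM killer kills that process, not your
    # whole machine. Setting --memory-swap equal to --memory denies the
    # container any swap, so it fails fast instead of thrashing.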
    

    For the syntax of Docker's memory options, see this question:

    What unit does the docker run "--memory" option expect?
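
    In short (a summary of Docker's documented behavior, not taken from the linked question): the value is a positive integer with an optional unit suffix b, k, m, or g; with no suffix it is read as bytes.

    # Unit suffixes for --memory (default unit is bytes):
    docker run --memory 512m <image>   # 512 MB
    docker run --memory 3g <image>     # 3 GB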
