Jupyter Lab freezes the computer when out of RAM - how to prevent it?

佛祖请我去吃肉 2021-02-04 08:11

I have recently started using Jupyter Lab, and my problem is that I work with quite large datasets (usually the dataset itself is approximately 1/4 of my computer's RAM). After a few trans…

7 Answers
  •  独厮守ぢ
    2021-02-04 08:48

    I am going to summarize the answers from a related question. You can limit the memory usage of your program. In the code below, the memory-hungry work is done in the function ram_intense_foo(); before calling it, you need to call memory_limit(95), which caps the process at 95% of the currently available memory.

    import resource
    import sys
    import numpy as np

    def memory_limit(percent_of_free):
        # Cap this process's address space at percent_of_free % of the
        # memory currently available. Linux-only: RLIMIT_AS and
        # /proc/meminfo do not exist on Windows or macOS.
        soft, hard = resource.getrlimit(resource.RLIMIT_AS)
        # get_memory() returns kB; setrlimit expects an integer byte count,
        # hence the conversion and the integer division.
        resource.setrlimit(resource.RLIMIT_AS,
                           (get_memory() * 1024 * percent_of_free // 100, hard))

    def get_memory():
        # Return the available system memory in kB, read from /proc/meminfo.
        free_memory = 0
        with open('/proc/meminfo', 'r') as mem:
            for line in mem:
                sline = line.split()
                if sline[0] == 'MemAvailable:':
                    free_memory = int(sline[1])
                    break
        return free_memory

    def ram_intense_foo(a, b):
        # Allocate a large random matrix and compute its Gram matrix.
        A = np.random.rand(a, b)
        return A.T @ A

    if __name__ == '__main__':
        memory_limit(95)
        try:
            temp = ram_intense_foo(4000, 10000)
            print(temp.shape)
        except MemoryError:
            sys.stderr.write('\n\nERROR: Memory Exception\n')
            sys.exit(1)
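    In Jupyter specifically, you can apply this cap to every kernel instead of pasting the snippet into each notebook, because IPython executes every file in its startup directory when a kernel starts. What follows is a minimal sketch, assuming a Linux machine (RLIMIT_AS and /proc/meminfo are Linux-specific) and the default IPython profile; the file name 00-memory-limit.py and the 90% fraction are arbitrary choices, not anything prescribed by Jupyter.

    # ~/.ipython/profile_default/startup/00-memory-limit.py
    # IPython runs every file in this directory at kernel startup,
    # so each notebook inherits the limit automatically.
    import resource

    def _available_kb():
        # Parse MemAvailable (reported in kB) out of /proc/meminfo.
        with open('/proc/meminfo') as mem:
            for line in mem:
                if line.startswith('MemAvailable:'):
                    return int(line.split()[1])
        raise RuntimeError('MemAvailable not found in /proc/meminfo')

    _soft, _hard = resource.getrlimit(resource.RLIMIT_AS)
    # Cap the kernel's address space at 90% of the memory available
    # right now; setrlimit expects an integer byte count.
    resource.setrlimit(resource.RLIMIT_AS, (_available_kb() * 1024 * 90 // 100, _hard))

    With the cap in place, an oversized allocation raises MemoryError in the offending cell while the kernel and the rest of the machine stay responsive, instead of the OS swapping until everything freezes.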
