I am using Jupyter Notebook with Python 3 on Windows 10. My computer has 8 GB of RAM, and at least 4 GB of it is free.
But when I want to make a NumPy ndarray with size
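For context on why an allocation like this can fail despite free RAM, here is a rough back-of-the-envelope check (the `ndarray_nbytes` helper is hypothetical, written for illustration, not a NumPy function): an ndarray needs the product of its shape times the item size in bytes.

```python
from math import prod

def ndarray_nbytes(shape, itemsize=8):
    # Hypothetical helper: bytes an ndarray of this shape would occupy.
    # itemsize=8 matches float64, NumPy's default dtype.
    return prod(shape) * itemsize

# A seemingly modest 20000 x 20000 float64 array already needs 3.2 GB,
# which is close to the 4 GB of free RAM mentioned above.
print(ndarray_nbytes((20000, 20000)))  # 3200000000
```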
For Jupyter you need to consider two processes: the notebook server and the kernel.
max_buffer_size is a Tornado web server setting: the maximum amount of incoming data to buffer, which defaults to 100 MB (104857600 bytes). (https://www.tornadoweb.org/en/stable/httpserver.html)
Based on this PR, that value appears to have been increased to 500 MB in Notebook.
To my knowledge, the Tornado HTTP server does not let you cap its memory use; it runs as an ordinary Python 3 process.
For the kernel, you should look at the command defined in the kernel spec.
An option to try would be this one
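To locate that kernel spec, `jupyter kernelspec list` prints each installed kernel's directory; the kernel.json inside it holds the launch command. A typical Python 3 spec looks roughly like this (paths and names vary by install):

```json
{
  "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
  "display_name": "Python 3",
  "language": "python"
}
```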
Jupyter Notebook has a default memory limit. You can increase it by following these steps:
1) Generate a config file using the command:
jupyter notebook --generate-config
2) Open the jupyter_notebook_config.py file inside the '.jupyter' folder and edit the following property: c.NotebookApp.max_buffer_size = your desired value
Remember to remove the '#' before the property. Alternatively, you can simply run the notebook with the command below:
jupyter notebook --NotebookApp.max_buffer_size=your_value
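For example, to raise the limit to roughly 1 GB (an illustrative value, not a recommendation), the edited config line would look like this:

```python
# ~/.jupyter/jupyter_notebook_config.py
# Note the leading "c." — the config file exposes a config object named c.
c.NotebookApp.max_buffer_size = 1000000000  # ~1 GB, illustrative value
```

or equivalently on the command line: `jupyter notebook --NotebookApp.max_buffer_size=1000000000`.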