Does Numpy automatically detect and use GPU?

情话喂你 2021-02-05 03:27

I have a few basic questions about using Numpy with a GPU (NVIDIA GTX 1080 Ti). I'm new to GPUs and would like to make sure I'm properly using the GPU to accelerate Numpy/Python.

4 Answers
  • 2021-02-05 03:48

    Does Numpy/Python automatically detect the presence of a GPU and utilize it to speed up matrix computation (e.g. numpy.multiply, numpy.linalg.inv, ... etc)?

    No.

    Or do I have to code in a specific way to exploit the GPU for fast computation?

    Yes. Look into Numba, CuPy, Theano, PyTorch, or PyCUDA; each offers a different paradigm for accelerating Python with a GPU.
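
    For example, here is a minimal Numba sketch, assuming the numba package, the CUDA toolkit, and an NVIDIA driver are installed; the function name gpu_add and the array sizes are placeholders:

        import numpy as np
        from numba import vectorize

        # Compile an elementwise function for the GPU; Numba handles the
        # host-to-device copies and the kernel launch.
        @vectorize(['float32(float32, float32)'], target='cuda')
        def gpu_add(a, b):
            return a + b

        x = np.ones(1_000_000, dtype=np.float32)
        y = np.full(1_000_000, 2.0, dtype=np.float32)
        print(gpu_add(x, y)[:5])  # [3. 3. 3. 3. 3.]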

  • 2021-02-05 03:56

    No, but you can use CuPy, which has an interface very similar to NumPy's: https://cupy.chainer.org/
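
    A minimal sketch of what that looks like, assuming cupy is installed against a matching CUDA toolkit (the array shape here is arbitrary):

        import cupy as cp

        a = cp.random.rand(1000, 1000)   # allocated and filled on the GPU
        b = cp.linalg.inv(a)             # NumPy-like API, executed on the GPU
        print(cp.asnumpy(b)[0, :3])      # copy the result back to host memory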

  • 2021-02-05 03:57

    JAX uses XLA to compile NumPy code to run on GPUs/TPUs: https://github.com/google/jax
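
    A minimal sketch, assuming jax and a GPU-enabled jaxlib build are installed (the function f is purely illustrative):

        import jax.numpy as jnp
        from jax import jit

        @jit  # XLA-compiled; runs on a GPU/TPU when one is available
        def f(x):
            return jnp.linalg.inv(x @ x.T + jnp.eye(x.shape[0]))

        x = jnp.ones((500, 500))
        print(f(x)[0, 0])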

  • 2021-02-05 03:57

    No, Numpy does not use the GPU. But you can use CuPy, whose syntax is largely compatible with NumPy's. So, to use the GPU, you just need to replace the following line of your code

        import numpy as np
    

    with

        import cupy as np
    

    That's all; go ahead and run your code. One thing worth mentioning is that to install CuPy you first need to install CUDA. Since the objective of your question is to make your computations faster by using the GPU, I would also suggest exploring PyTorch. With PyTorch you can do almost everything you can do with NumPy, and much more, and the learning curve is quite smooth if you are already familiar with NumPy. You can find more details on replacing NumPy with PyTorch here: https://www.youtube.com/watch?v=p3iYN-2XL8w
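
    A minimal PyTorch sketch of the same idea, assuming a CUDA-enabled torch build (the shapes are arbitrary):

        import torch

        device = 'cuda' if torch.cuda.is_available() else 'cpu'
        a = torch.rand(1000, 1000, device=device)  # created directly on the GPU if present
        b = torch.linalg.inv(a @ a.T + torch.eye(1000, device=device))
        print(b[0, :3].cpu().numpy())              # move back to host for NumPy interop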
