GPU based algorithm on AWS Lambda

小鲜肉 2021-02-19 03:52

I have a function that performs some mathematical operations and needs a 16 GB GPU system. However, this function will not be triggered all the time, and the rest of the time my system will not be in use.

3 Answers
  • 2021-02-19 03:55

    You can't specify the runtime environment for AWS Lambda functions, so no, you can't require the presence of a GPU (in fact the physical machines AWS chooses to put into its Lambda pool will almost certainly not have one).

    Your best bet would be to run the GPU-requiring function as a Batch job on a compute cluster configured to use p-type instances. The guide here might be helpful.

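As a rough illustration of the Batch approach, here is a sketch (via boto3) of a managed compute environment restricted to p-family instances. The names (`gpu-math-env`) and the choice of `p3.2xlarge` (one V100 with 16 GB of GPU memory) are assumptions for illustration; IAM roles and networking details are omitted.

```python
# Sketch only: builds the request body for
# boto3.client("batch").create_compute_environment(**env).
# Environment name and instance type are illustrative assumptions.
def build_gpu_compute_environment(name="gpu-math-env",
                                  subnets=None, security_groups=None):
    return {
        "computeEnvironmentName": name,
        "type": "MANAGED",
        "state": "ENABLED",
        "computeResources": {
            "type": "EC2",
            "minvCpus": 0,                    # scale to zero between jobs
            "maxvCpus": 8,
            "instanceTypes": ["p3.2xlarge"],  # 1x V100, 16 GB GPU memory
            "subnets": subnets or [],
            "securityGroupIds": security_groups or [],
        },
    }

env = build_gpu_compute_environment()
print(env["computeResources"]["instanceTypes"])
```

Setting `minvCpus` to 0 matches the "rarely triggered" workload in the question: Batch scales the cluster down to nothing when no jobs are queued, so you only pay for the GPU instance while a job runs.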
  • 2021-02-19 03:57

Currently, Lambda doesn't offer GPUs.

However, if you only need to run inference, CPU execution works fine on AWS Lambda; here is an article that goes into more detail:

    https://aws.amazon.com/blogs/machine-learning/how-to-deploy-deep-learning-models-with-aws-lambda-and-tensorflow/

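The pattern the linked article relies on can be sketched as follows: load the model once at module scope (cold start) and reuse it across invocations. The model here is a stand-in stub; in a real function you would load a TensorFlow SavedModel bundled with the deployment package or fetched from S3, and the `features` input shape is an assumption.

```python
# Minimal Lambda handler sketch: model loaded once per container.
def load_model():
    # Placeholder: a real function might call tf.saved_model.load(...)
    return lambda x: sum(x) / len(x)

MODEL = load_model()  # runs at cold start, not on every invocation

def handler(event, context):
    features = event["features"]          # assumed input format
    prediction = MODEL(features)
    return {"prediction": prediction}

print(handler({"features": [1.0, 2.0, 3.0]}, None))  # -> {'prediction': 2.0}
```

Keeping the model at module scope means subsequent warm invocations skip the (slow) load step, which is what makes CPU inference on Lambda practical for intermittent traffic.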
  • 2021-02-19 04:15

Batch is a good solution for certain types of workloads. Another option is GPUs on ECS, which can be used to run frequent GPU-accelerated tasks.

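For the ECS route, the task definition reserves a GPU through `resourceRequirements`; the task then has to be placed on a GPU-enabled EC2 container instance (Fargate does not provide GPUs). A sketch of the container fragment, with a made-up task name and image URI:

```python
# Hypothetical container fragment of an ECS task definition that
# reserves one GPU. Name and image URI are illustrative assumptions.
container = {
    "name": "gpu-math-task",
    "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/gpu-math:latest",
    "memory": 16384,
    "resourceRequirements": [
        {"type": "GPU", "value": "1"},  # reserve one physical GPU
    ],
}
print(container["resourceRequirements"][0]["type"])
```

ECS pins the container to the reserved GPU, so unlike Batch this suits tasks that run often enough to justify keeping a GPU instance in the cluster.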