I need some advice.
I trained an image classifier using TensorFlow and want to deploy it to AWS Lambda using the Serverless Framework. The project directory includes the model and some Python files.
The best way to do it would be to use the Serverless Framework as outlined in this article. It packages your dependencies inside a Docker image that mimics Amazon's Linux environment, and it automatically uploads your code to S3 as the code repository for your Lambda, which raises the package size limit. The linked article is an extremely helpful guide and describes the same approach developers use to run TensorFlow and other large libraries on AWS.
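For reference, a minimal serverless.yml sketch of that setup might look like the following; the service name, bucket name, and handler are placeholders, not taken from the question or the article:

```yaml
# serverless.yml -- minimal sketch, names are hypothetical
service: image-classifier

provider:
  name: aws
  runtime: python3.8
  # Serverless uploads the zipped package to an S3 deployment bucket
  deploymentBucket:
    name: my-classifier-artifacts   # placeholder bucket name

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    # build dependencies inside a Docker image that mimics the Lambda environment
    dockerizePip: true

functions:
  classify:
    handler: handler.predict        # placeholder module/function
    memorySize: 3008
    timeout: 30
```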
If you're still running into the 250 MB size limit, you can follow this article, which uses the same serverless-python-requirements plugin as the previous one but adds the option `slim: true`. This compresses your packages more aggressively by removing unnecessary files (compiled artifacts, tests, metadata), which decreases the package size both before and after unzipping.
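In the same serverless.yml as above, that option would look roughly like this (a sketch, assuming the serverless-python-requirements plugin is already configured):

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    # strip shared-library symbols and drop tests, *.pyc, __pycache__, dist-info, etc.
    slim: true
```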
I know I am answering this very late; just putting it here for reference for other people. I did the following things -
If this does not work, there are some additional things that can be done, like removing .pyc files etc., as mentioned here.
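For example, the serverless-python-requirements plugin lets you add extra slim patterns on top of the defaults; the patterns below are illustrative, not taken from the linked article:

```yaml
custom:
  pythonRequirements:
    slim: true
    # keep the default slim patterns and append extra ones
    slimPatternsAppendDefaults: true
    slimPatterns:
      - '**/*.pyc'
      - '**/tests/**'
```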
You could maybe use the ephemeral disk capacity (/tmp), which has a limit of 512 MB, but in your case memory will still be an issue.
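A rough sketch of that approach is to download the model from S3 into /tmp at cold start instead of bundling it in the package; the bucket, key, and model format below are assumptions:

```python
# Sketch: load a large model from S3 into Lambda's /tmp at cold start.
import os
import boto3
import tensorflow as tf

MODEL_BUCKET = "my-model-bucket"          # hypothetical bucket
MODEL_KEY = "image-classifier/model.h5"   # hypothetical key
LOCAL_MODEL_PATH = "/tmp/model.h5"        # ephemeral storage, 512 MB limit

_model = None  # cached across warm invocations


def _load_model():
    global _model
    if _model is None:
        if not os.path.exists(LOCAL_MODEL_PATH):
            boto3.client("s3").download_file(MODEL_BUCKET, MODEL_KEY, LOCAL_MODEL_PATH)
        _model = tf.keras.models.load_model(LOCAL_MODEL_PATH)
    return _model


def predict(event, context):
    model = _load_model()
    # ... decode the image from the event and call model.predict(...) here
    return {"statusCode": 200}
```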
The best choice may be to use AWS Batch; if the Serverless Framework does not manage it, you can still keep a Lambda function to trigger your Batch job.
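A minimal sketch of such a trigger using boto3's Batch client; the job queue and job definition names are hypothetical:

```python
# Sketch: Lambda function that submits an AWS Batch job for the heavy inference work.
import boto3

batch = boto3.client("batch")


def handler(event, context):
    response = batch.submit_job(
        jobName="image-classification",
        jobQueue="classifier-queue",        # placeholder job queue
        jobDefinition="classifier-job:1",   # placeholder job definition
        containerOverrides={
            "environment": [
                {"name": "INPUT_S3_URI", "value": event.get("input_uri", "")}
            ]
        },
    )
    return {"jobId": response["jobId"]}
```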