AWS Lambda deployment: XGBoost package for Python 3.6
AWS Lambda has a 50 MB limit on direct zip uploads and a ~262 MB limit on the total unzipped size when deploying from an AWS S3 bucket. This script builds a deployment package that contains the xgboost library with all of its dependencies, plus joblib.
- The final unzipped size of the package is about 254 MB, which leaves roughly 8 MB for extra files and packages.
- In most cases it is impossible to fit your models into the remaining ~8 MB (262 - 254), so the code instead loads the models from an S3 bucket with the boto3 library (a sketch of the idea follows this list).
- Put the model-loading call outside the lambda handler, at module level, so the model is loaded once per container instead of on every invocation.
- The code downloads the model into the Lambda /tmp/ directory, which is limited to 512 MB.
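The last three notes can be combined into a short sketch at the top of `lambda_function.py`. This is only an illustration: the bucket name, key, and the `features` event field are placeholders, and it assumes the model was saved with joblib (e.g. a scikit-learn-style XGBoost estimator).

```python
import os

import boto3
import joblib

# Placeholders -- replace with your own bucket, key, and file name.
MODEL_BUCKET = "my-model-bucket"
MODEL_KEY = "models/model.joblib"
LOCAL_MODEL_PATH = "/tmp/model.joblib"  # /tmp is the only writable directory in Lambda

# Module-level code runs once per container, so the model is downloaded and
# deserialized on the cold start and then reused by warm invocations.
if not os.path.exists(LOCAL_MODEL_PATH):
    boto3.client("s3").download_file(MODEL_BUCKET, MODEL_KEY, LOCAL_MODEL_PATH)

model = joblib.load(LOCAL_MODEL_PATH)


def lambda_handler(event, context):
    # The handler only runs the prediction; the model is already in memory.
    features = event["features"]
    prediction = model.predict([features])
    return {"prediction": prediction.tolist()}
```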
- Clone this repo to your local folder.
- Run this command from the repo folder (it builds the package inside an Amazon Linux container, so the compiled binaries match the Lambda environment):

  ```bash
  docker run -v $(pwd):/outputs -it amazonlinux:2016.09 /bin/bash /outputs/build.sh
  ```

- The build generates a `lambda-package` folder and the zip file `lambda-package.zip`.
- Edit `lambda_function.py` to adapt it to your models (the model-loading sketch above is a starting point).
- Add the edited `lambda_function.py` to `lambda-package.zip`.
- Upload `lambda-package.zip` to an AWS S3 bucket.
- In the AWS Lambda console, set the handler path (e.g. `lambda_function.lambda_handler` if your handler function is named `lambda_handler`), select Python 3.6 as the Runtime, and increase the memory in the basic settings.
- Create a test event (a sample is sketched after these steps).
- Save.
- Test.
- Repeat until it works :)
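For the "create a test event" step, the event body only has to match whatever shape your edited handler expects. Here is a hypothetical local smoke test for the handler sketched above; the `features` key and values are examples only, and importing the module triggers the S3 download, so your local AWS credentials must be able to read the model bucket.

```python
# Run this locally (with xgboost, joblib, and boto3 installed in your
# environment) before adding lambda_function.py to the zip.
from lambda_function import lambda_handler

# Example event only -- adjust the field name and the number of features
# to whatever your model was trained on.
test_event = {"features": [5.1, 3.5, 1.4, 0.2]}

print(lambda_handler(test_event, None))

# Paste the same JSON body, {"features": [5.1, 3.5, 1.4, 0.2]}, into the
# AWS Lambda console when creating the test event.
```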
Thanks to Ryan Brown for the original example of packaging sklearn and numpy for Lambda; a version of it modified for Python 3.6 by Mark Campanelli was used here.
Thanks to Jing Xie and Ken Mcdonnell for helping with deploying models and debugging.