Serving and deploying PyTorch models for free with FastAPI and AWS Lambda — Part 1

  • the model is built with PyTorch (machine translation)
  • the model is served through a REST API built with FastAPI
  • the application is then compressed and stored in an AWS S3 bucket
  • finally, the API is deployed as a lambda function on AWS Lambda

Setup Git and Github repository

  1. Create a new repository
  2. Add .gitignore (Python template)
  3. Add README.md
  4. Clone on your local machine
  5. Don’t forget to commit changes after every step! ⚠️

Setup virtual environment

pip install virtualenv
virtualenv venv
# if you get the error "command not found", try this instead:
# python -m virtualenv venv
source venv/bin/activate

Install the required dependencies:

# requirements.txt
atomicwrites==1.4.0
attrs==20.3.0
certifi==2020.12.5
chardet==3.0.4
click==7.1.2
colorama==0.4.4
fastapi==0.62.0
h11==0.11.0
idna==2.10
iniconfig==1.1.1
mangum==0.10.0
packaging==20.8
pluggy==0.13.1
py==1.10.0
pydantic==1.7.3
pyparsing==2.4.7
pytest==6.2.0
requests==2.25.0
starlette==0.13.6
toml==0.10.2
typing-extensions==3.7.4.3
urllib3==1.26.2
uvicorn==0.13.1

Then install them:

pip install -r requirements.txt

Build and test your REST API with FastAPI and Pytest

.
└── app
    ├── tests
    │   ├── __init__.py
    │   └── test_main.py
    ├── __init__.py
    ├── main.py
    └── api
        ├── __init__.py
        └── v1
            ├── __init__.py
            └── api.py

Run the tests with:

python -m pytest

Create automation with Github Actions

mkdir -p .github/workflows
touch .github/workflows/main.yml
  • Continuous Integration (CI)
  • Continuous Deployment (CD)
name: CI/CD Pipeline

on:
  push:
    branches: [ main ]

jobs:
  continuous-integration:
    ....

  continuous-deployment:
    .....

Configure AWS Lambda and set up CI/CD

  1. Sign up to AWS and create a Lambda function
  2. Select: author from scratch
  3. Select: runtime: python 3.7
  4. Select: choose or create an execution role
  5. Select: create function
  6. Create an S3 bucket in AWS console management. ⚠️ it should be in the same region as your Lambda function
  7. In the runtime settings, change the handler from lambda_function.lambda_handler (the default) to main.handler
  8. Update your Github Actions config file (main.yml) to deploy your function to Lambda (don’t forget to replace YOUR_LAMBDA_FUNCTION_NAME and YOUR_S3_BUCKET with your own names)
  9. Add the following secrets to your Github repository (Settings > Secrets) so the workflow can authenticate with AWS:
  • AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY: in the AWS Management Console, click on your account (top right) > My Security Credentials > Access Keys > Create New Access Key
  • AWS_DEFAULT_REGION: the AWS Region of your Lambda function (and S3 bucket, as they need to be the same). Use the lower-case identifier, e.g. eu-west-3 or us-west-2.
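The deployment job in main.yml might look like the sketch below. This is one possible approach (zip the app, upload to S3, point Lambda at it via the AWS CLI); the step names and the `@v2`/`@v1` action versions are assumptions, and YOUR_S3_BUCKET / YOUR_LAMBDA_FUNCTION_NAME are placeholders:

```yaml
  continuous-deployment:
    runs-on: ubuntu-latest
    needs: [continuous-integration]
    if: github.ref == 'refs/heads/main'
    steps:
      - uses: actions/checkout@v2
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ secrets.AWS_DEFAULT_REGION }}
      - name: Install dependencies and zip the app
        run: |
          pip install -r requirements.txt -t .
          zip -r deployment.zip . -x '*.git*'
      - name: Upload to S3 and update Lambda
        run: |
          aws s3 cp deployment.zip s3://YOUR_S3_BUCKET/deployment.zip
          aws lambda update-function-code \
            --function-name YOUR_LAMBDA_FUNCTION_NAME \
            --s3-bucket YOUR_S3_BUCKET \
            --s3-key deployment.zip
```

Installing the dependencies into the project root with `-t .` before zipping is what lets Lambda import them without a layer.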

Create an API Gateway

  1. Go to API Gateway (in AWS Console)
  2. Click on Build on the card that says REST API but doesn’t have Private on it
  3. Click New API and enter a name for this API, then Create API
  4. Actions > Create Method > ANY > Use Lambda Proxy integration > Save
  5. Add a proxy: Actions > Create Resource > Configure as proxy > Create Resource
  6. Actions > Deploy API > [New Stage] > “dev” (or “staging” or whatever you want)

PyTorch Deep Learning Model

pip install torch torchtext spacy
pip freeze > requirements.txt

Build, train, and save the model

Serve the model

  1. create a new POST endpoint
  2. create utilities: process data (string to encoded tensor), load pre-trained model, feedforward, process output (encoded tensor to string)

To be continued…

Achraf AIT SIDI HAMMOU

French ML undergrad — writing about GraphQL security @ Escape.tech — freelance developer