DEV Community

Zoo Codes

Getting Started with Fast-Api 🏎️ and Docker🐳

New month, New blog post...

After the overwhelming response to my last post on Getting Started with Flask and Docker, I decided to write another one.

This time I'm going to show you how to get started with Fast-Api and Docker. Our demo project will be a simple API that allows the user to create and save notes to a Postgres database.

Fast-Api is a modern, fast (high-performance) web framework for building APIs with Python 3.7+ based on standard Python type hints.

It is a relatively new framework that is gaining popularity in the Python community (over 50k stars on GitHub). It is built on top of Starlette (a lightweight ASGI framework/toolkit, ideal for building high-performance asyncio services) and Pydantic. It is fast, easy to use and easy to learn, making it a great choice for building APIs. It also has built-in support for OpenAPI and Swagger.

As the name implies, Fast-Api is fast: it is one of the fastest Python frameworks available. Served with Uvicorn as the ASGI server, it can handle on the order of 10,000 requests per second in benchmarks.

Audience and Objectives 🗣️

This tutorial is for anyone, from beginner to intermediate developer, who wants to get started with Fast-Api and Docker.

By the end of this tutorial, you will be able to:

  • Create a Fast-Api project.
  • Run the project locally.
  • Connect to a Postgres database to perform CRUD operations.
  • Dockerize the project.
  • Commit the code and Push to GitHub.
  • Use GitHub Actions as our CI/CD pipeline to test the application and build the Docker images.
  • Interact with the API via the browser or third-party tools like Postman, Insomnia, etc.
  • Optionally create and connect a Vue frontend to the API.

Prerequisites 🧑‍💻

The main prerequisite for this tutorial is a basic understanding of Python and Docker.

Have the following installed on your machine:

  • Python 3.7+ installed on your system.
  • Git installed on your system.
  • Docker running on your system.
  • PostgreSQL installed on your system (I recommend using Docker to run the database).
  • A terminal or command-line interface (CLI) of your choice.
  • Node.js and npm installed on your system (if you want to create a Vue frontend); nvm is recommended.

Project Structure 📘

The project structure will be as follows:



Fast-Api-example
├── .github
│   └── workflows
│       ├── docker-image.yml
│       └── python-app.yml
├── src
│   ├── app
│   │   ├── __init__.py
│   │   ├── api
│   │   │   ├── __init__.py
│   │   │   ├── crud.py
│   │   │   ├── models.py
│   │   │   ├── notes.py
│   │   │   └── ping.py
│   │   ├── db.py
│   │   └── main.py
│   ├── Dockerfile
│   └── .env
├── .gitignore
├── README.md
├── docker-compose.yml
└── requirements.txt




Simple architecture diagram:

Fast-Api-Architecture

Getting Started

These instructions will work on most unix/Linux systems. If you are using Windows, you can use WSL or Git Bash.

This tutorial is broken down into the sections that follow.

Create a Fast-Api Project 🆕

Create a new folder and open it in your terminal.



mkdir Fast-Api-example
cd Fast-Api-example



Create a virtual environment and activate it.



python3 -m venv venv
source venv/bin/activate



Install the required dependencies (the last three packages are used by the database code we add later).



pip install fastapi uvicorn python-dotenv psycopg2-binary sqlalchemy databases[postgresql] pytz


Then save the installed packages to a file called requirements.txt.


pip freeze > requirements.txt

Create a new folder called src and, inside it, a file called main.py with the following code.



from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}



The code is a simple hello-world example. It imports the FastAPI class, creates an instance of it, and defines a route that returns a simple JSON object.

Run the project locally.



uvicorn src.main:app --reload --workers 1 --host 0.0.0.0 --port 8002




We are using Uvicorn as the ASGI server. It is a lightning-fast ASGI server implementation, using uvloop and httptools.

We are also using the --reload flag to enable hot reloading. This means that whenever we make a change to our code, the server will automatically restart. The --workers 1 flag is used to specify the number of worker processes. The --host and --port flags are used to specify the host and port to run the server on.

Open your browser and navigate to http://localhost:8002/. You should see the following:

Fast-Api Hello World
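You can also hit the endpoint from the command line while the server is running:

```shell
# Query the root endpoint of the running server
curl http://localhost:8002/
# {"Hello":"World"}
```

FastAPI also auto-generates interactive API documentation, available at http://localhost:8002/docs while the server is running.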

Connect to a Postgres Database

Inside the src folder, create a new file called db.py and add the following code.



# src/db.py

import os

from sqlalchemy import (Column, Integer, String, Table, create_engine, MetaData)
from dotenv import load_dotenv
from databases import Database
from datetime import datetime as dt
from pytz import timezone as tz

load_dotenv()
# Database url if none is passed the default one is used
DATABASE_URL = os.getenv("DATABASE_URL", "postgresql://hello_fastapi:hello_fastapi@localhost/hello_fastapi_dev")

# SQLAlchemy
engine = create_engine(DATABASE_URL)
metadata = MetaData()
notes = Table(
    "notes",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("title", String(50)),
    Column("description", String(50)),
    Column("completed",String(8), default="False"),
    Column("created_date", String(50), default=dt.now(tz("Africa/Nairobi")).strftime("%Y-%m-%d %H:%M"))
)
# Databases query builder

database = Database(DATABASE_URL)





In the code above, we are using SQLAlchemy as our ORM (Object-Relational Mapper) and Databases as our async query builder.

Let's go through what the code does:

  • We are importing the required libraries.
  • from sqlalchemy we are importing the Column, Integer, String, Table, create_engine and MetaData classes. These are important classes that we will be using to create our database schema and perform CRUD operations.
  • We are also using the dotenv library to load environment variables from a .env file. This is good security practice, especially in a production environment. If no DATABASE_URL is set (for example, when the .env file is not found), a default database URL is used, which is also handy for testing.
  • We are creating a table called notes with the following columns: id, title, description, completed and created_date. The id column is the primary key. The created_date column defaults to the current date and time (note that this default is evaluated once, when the module is imported).
  • We are also creating a database instance using the Database class from the databases library. This instance will be used to perform CRUD operations.
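For reference, here is what a minimal .env file might look like. The variable name matches what db.py reads; the credentials simply mirror the fallback URL and are placeholders, not recommendations:

```shell
# .env  (keep this file out of version control)
DATABASE_URL=postgresql://hello_fastapi:hello_fastapi@localhost/hello_fastapi_dev
```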

Create a CRUD API

Api-viz

Inside the src folder, create a new folder called api and inside it create a new file called models.py and add the following code.



# src/api/models.py

from pydantic import BaseModel, Field
from datetime import datetime as dt
from pytz import timezone as tz


class NoteSchema(BaseModel):
    # Field(...) marks the field as required and adds length validation
    title: str = Field(..., min_length=3, max_length=50)
    description: str = Field(..., min_length=3, max_length=50)
    completed: str = "False"
    created_date: str = dt.now(tz("Africa/Nairobi")).strftime("%Y-%m-%d %H:%M")


class NoteDB(NoteSchema):
    id: int




In the code above, we are creating two classes. The NoteSchema class validates the data that is sent to the API. The NoteDB class extends it with an id field and is used for the data returned from the database; the id is not sent by the client because it is generated by the database.

Here is a list of the fields that we are using:

  • title - The title of the note. It is a required field and it must be between 3 and 50 characters.
  • description - The description of the note. It is a required field and it must be between 3 and 50 characters.
  • completed - The status of the note, stored as the string True or False. It defaults to False.
  • created_date - The date and time when the note was created, in the format YYYY-MM-DD HH:MM. It defaults to the current time.

Using Pydantic, we can add further validation to any of these fields. For example, we could add a regex to the title or description fields to ensure they only contain letters and numbers, or constrain completed to the two allowed values.
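As a small sketch of that idea, here is the kind of pattern we might hand to Pydantic (via Field(..., regex=...) in Pydantic v1), checked with the standard library's re module so it can run on its own. The pattern and helper name are illustrative, not part of the project:

```python
import re

# Hypothetical pattern we might pass to Field(..., regex=...) to restrict
# titles to letters, digits and spaces, 3 to 50 characters long.
TITLE_PATTERN = r"^[A-Za-z0-9 ]{3,50}$"

def is_valid_title(title: str) -> bool:
    """Mirror of the constraint Pydantic would enforce on NoteSchema.title."""
    return re.fullmatch(TITLE_PATTERN, title) is not None

print(is_valid_title("Shopping list"))  # True
print(is_valid_title("hi"))             # False: shorter than 3 characters
```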

Inside the src/api folder, create a new file called crud.py and add the following code.




# src/api/crud.py

from app.api.models import NoteSchema
from app.db import notes, database
from datetime import datetime as dt


async def post(payload: NoteSchema):
    created_date = dt.now().strftime("%Y-%m-%d %H:%M")
    query = notes.insert().values(
        title=payload.title,
        description=payload.description,
        completed=payload.completed,
        created_date=created_date,
    )
    return await database.execute(query=query)


async def get(id: int):
    query = notes.select().where(id == notes.c.id)
    return await database.fetch_one(query=query)


async def get_all():
    query = notes.select()
    return await database.fetch_all(query=query)


async def put(id: int, payload: NoteSchema):
    created_date = dt.now().strftime("%Y-%m-%d %H:%M")
    query = (
        notes.update()
        .where(id == notes.c.id)
        .values(
            title=payload.title,
            description=payload.description,
            completed=payload.completed,
            created_date=created_date,
        )
        .returning(notes.c.id)
    )
    return await database.execute(query=query)


async def delete(id: int):
    query = notes.delete().where(id == notes.c.id)
    return await database.execute(query=query)




In the code above, we are creating five functions that will be used to perform CRUD operations. The post function is used to create a new note. The get function is used to get a note by its id. The get_all function is used to get all the notes. The put function is used to update a note. The delete function is used to delete a note.

Unlike the normal way of defining Python functions with the def keyword alone, we define these functions with async def. This taps into a core feature of FastAPI: it is asynchronous, so while one request is waiting on the database, the application can handle other requests at the same time.
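A minimal, self-contained sketch (plain asyncio, independent of FastAPI and our database) of why this matters: while one coroutine awaits a slow operation, others can run.

```python
import asyncio
import time

async def fake_db_query(note_id: int) -> str:
    # Simulate a slow database round-trip without blocking the event loop.
    await asyncio.sleep(0.1)
    return f"note {note_id}"

async def main() -> None:
    start = time.perf_counter()
    # Three "queries" run concurrently, so total time is ~0.1s, not ~0.3s.
    results = await asyncio.gather(*(fake_db_query(i) for i in range(3)))
    elapsed = time.perf_counter() - start
    print(results)           # ['note 0', 'note 1', 'note 2']
    print(elapsed < 0.3)     # True: the awaits overlapped

asyncio.run(main())
```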

Routing and API Endpoints

Inside the src/api folder, create a new file called notes.py and add the following code.



# src/api/notes.py

from app.api import crud
from app.api.models import NoteDB, NoteSchema
from fastapi import APIRouter, HTTPException, Path
from typing import List 
from datetime import datetime as dt

router = APIRouter()


@router.post("/", response_model=NoteDB, status_code=201)
async def create_note(payload: NoteSchema):
    note_id = await crud.post(payload)
    created_date = dt.now().strftime("%Y-%m-%d %H:%M")

    response_object = {
        "id": note_id,
        "title": payload.title,
        "description": payload.description,
        "completed": payload.completed,
        "created_date": created_date,
    }
    return response_object

@router.get("/{id}/", response_model=NoteDB)
async def read_note(id: int = Path(..., gt=0)):
    note = await crud.get(id)
    if not note:
        raise HTTPException(status_code=404, detail="Note not found")
    return note

@router.get("/", response_model=List[NoteDB])
async def read_all_notes():
    return await crud.get_all()

@router.put("/{id}/", response_model=NoteDB)
async def update_note(payload: NoteSchema, id: int = Path(..., gt=0)):  # gt=0 ensures the id is greater than 0
    note = await crud.get(id)
    if not note:
        raise HTTPException(status_code=404, detail="Note not found")
    note_id = await crud.put(id, payload)
    response_object = {
        "id": note_id,
        "title": payload.title,
        "description": payload.description,
        "completed": payload.completed,
    }
    return response_object

#DELETE route
@router.delete("/{id}/", response_model=NoteDB)
async def delete_note(id: int = Path(..., gt=0)):
    note = await crud.get(id)
    if not note:
        raise HTTPException(status_code=404, detail="Note not found")
    await crud.delete(id)

    return note




In the code above, we are creating five route handlers. The create_note function handles the POST request, read_note and read_all_notes handle the GET requests (for a single note and for all notes, respectively), update_note handles the PUT request, and delete_note handles the DELETE request.

The create_note function takes a payload of type NoteSchema and returns a response of type NoteDB. The read_note function takes an id of type int (constrained to be positive) and returns a NoteDB, while read_all_notes returns a List[NoteDB]. The update_note function takes both a payload and an id, and delete_note takes just an id; both return the affected note as a NoteDB.

Each handler is wired to its route by the matching decorator: @router.post for create_note, @router.get for read_note and read_all_notes, @router.put for update_note, and @router.delete for delete_note.

Main File

Inside the src folder, create a new file called main.py and add the following code.



# src/main.py

from fastapi import FastAPI
from starlette.middleware.cors import CORSMiddleware

from app.api import notes, ping
from app.db import engine, metadata, database

metadata.create_all(engine)

app = FastAPI()

origins = [
    "http://localhost",
    "http://localhost:8080",
    "http://localhost:5173",
    "*",  # "*" allows all origins; fine for a demo, too permissive for production
]

app.add_middleware(
    CORSMiddleware,
    allow_origins=origins,
    allow_credentials=True,
    allow_methods=["DELETE", "GET", "POST", "PUT"],
    allow_headers=["*"],
)

@app.on_event("startup")
async def startup():
    await database.connect()


@app.on_event("shutdown")
async def shutdown():
    await database.disconnect()

app.include_router(ping.router)
app.include_router(notes.router, prefix="/notes", tags=["notes"])





In the code above, we are creating a new FastAPI application and adding CORS middleware so the frontend can make cross-origin requests to the backend.

On startup, the app connects to the database; on shutdown, it disconnects. We then register the two routers. The notes router is included with the prefix /notes, so all of its routes are served under /notes, while the ping router is included without a prefix.

One Database to Rule Them All 👑

In this section, we will create the PostgreSQL database that stores the notes, run it with Docker, and use Docker Compose to run the database and the application together.

Docker File

Inside the src folder, create a new file called Dockerfile and add the following code.



FROM python:3.9.1-alpine

WORKDIR /app

COPY requirements.txt .

RUN pip install -r requirements.txt

COPY . .

CMD ["uvicorn", "src.main:app", "--reload", "--workers", "1", "--host", "0.0.0.0", "--port", "8002"]




In the Dockerfile above, we set the working directory to /app, copy in requirements.txt, install the dependencies, copy the rest of the source code, and set the command that runs when the container starts.

Feel free to change the port number in the command to any port number you want.
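To sanity-check the image on its own, you can build and run it directly. The image name here is arbitrary; note that the Dockerfile lives in src, so depending on where your requirements.txt sits you may need to adjust the build context:

```shell
# Build the image from the src folder and run it, publishing port 8002
docker build -t fast-api-example ./src
docker run --rm -p 8002:8002 fast-api-example
```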

Docker Compose File 📂

Inside the root folder, create a new file called docker-compose.yml and add the following code.



version: "3.9"

services:
  db:
    image: postgres:13.2
    container_name: postgres
    restart: always
    environment:
      - POSTGRES_USER=hello_fastapi
      - POSTGRES_PASSWORD=hello_fastapi
      - POSTGRES_DB=hello_fastapi_dev
    ports:
      - "5432:5432"
    volumes:
      - ./data:/var/lib/postgresql/data

  app:
    build: .
    container_name: fastapi
    restart: always
    ports:
      - "8002:8002"
    depends_on:
      - db
    volumes:
      - .:/app
    command: uvicorn src.main:app --reload --workers 1 --host 0.0.0.0 --port 8002

networks:
  default:
    name: hello_fastapi




In the code above, we define two services. The db service runs the database and the app service runs the application. The app service depends on the db service, which means the app container will not start until the db container is running.
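With the compose file in place, the whole stack can be brought up in one step (a sketch; on newer Docker versions the CLI is `docker compose` rather than `docker-compose`):

```shell
docker-compose up -d --build   # build the images and start both containers in the background
docker-compose logs -f app     # follow the API container's logs
docker-compose down            # stop and remove the containers when done
```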

Running the Application 🏃

programmer-viz

To run the application locally, run the following command.



uvicorn src.main:app --reload --workers 1 --host 0.0.0.0 --port 8002



The application should now be running on port 8002. You can test it by making requests to the endpoints via the browser, with Postman or Insomnia, or from the frontend.

Screenshot of the application running locally.

Application Running Locally

Testing the Api with Thunder Client 🧪

Personally, I use Visual Studio Code as my editor with the Thunder Client extension installed, which lets me make requests to the endpoints from within the editor. You can install the extension and do the same. Examples are shown below.

Thunder-Client

Thunder-Client-get

Thunder-Client put

Thunder-Client post

Thunder-Client delete

CI/CD using GitHub Actions 🐙

In this section, we will set up CI/CD using GitHub Actions to test the application. This is important as it allows us to be sure that any proposed changes don't break the API.

We will also implement a workflow to test and build the Docker images for the application and the database, so we can be sure both are working as expected.

Test Workflow

Inside the .github/workflows folder, create a new file called python-app.yml and add the following code.



name: Python Application Test

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:

  build:

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.7, 3.8, 3.9]

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
            python -m pip install --upgrade pip
            pip install -r requirements.txt
      - name: Lint with flake8
        run: |
            pip install flake8
            # stop the build if there are Python syntax errors or undefined names
            flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
            # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
            flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
      - name: Setup PostgreSQL
        uses: Harmon758/postgresql-action@v1.0.0
        with:
          # Version of PostgreSQL to use
          postgresql version: 12.1-alpine
          # POSTGRES_DB - name for the default database that is created
          postgresql db: hello_fastapi_dev
          # POSTGRES_USER - create the specified user with superuser power
          postgresql user: hello_fastapi
          # POSTGRES_PASSWORD - superuser password
          postgresql password: hello_fastapi
      - name: Test with pytest
        run: |
          pip install pytest
          pytest .



In the workflow above, named Python Application Test, the job runs on every push and pull request targeting the main branch. The build job runs on the latest Ubuntu across multiple Python versions: it checks out the code, sets up Python, installs the dependencies, lints the code with flake8, spins up PostgreSQL, and finally runs the tests with pytest.
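You can run the same lint and test steps locally before pushing, so failures show up before CI does (assuming the virtual environment from earlier is active):

```shell
pip install flake8 pytest
# Fail fast on syntax errors and undefined names, as the workflow does
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
pytest .
```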

Build Workflow

Inside the .github/workflows folder, create a new file called docker-image.yml and add the following code.



# .github/workflows/docker-image.yml

# This is a basic workflow to help you get started with Actions
name: Docker Compose Actions Workflow

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:

  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3
    - name: Build the Docker image
      run: docker-compose build --no-cache --force-rm 
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Build the stack
        run: docker-compose up -d




In the workflow above, named Docker Compose Actions Workflow, the jobs run on every push and pull request targeting the main branch. The build job runs on the latest Ubuntu, checks out the code, and builds the Docker images for the application and the database; the test job brings up the whole stack with docker-compose.

Running the Workflows

To run the workflows, push the code to the main branch. The workflows will run automatically. You can check the status of the workflows by going to the Actions tab on GitHub.

GitHub Actions

Conclusion

In this article, we built a simple CRUD API using FastAPI, connected it to a PostgreSQL database, wrote a docker-compose file to run the application locally, and set up CI/CD using GitHub Actions.

Find the code for this article on GitHub: KenMwaura1 / Fast-Api-example. It is a simple asynchronous API implemented with the Fast-Api framework, using Postgres as the database and SQLAlchemy as the ORM, with GitHub Actions as the CI/CD pipeline. The repository README includes step-by-step installation instructions.
Till next time, happy coding.

Next time gif


Top comments (2)

NikiRoss

There are issues between the specified project structure in this post and the actual project structure in the final example on GitHub. It caused me a number of issues when trying to run; as someone not very familiar with Python, I was unable to debug without checking out the code repo. Just an FYI in case anyone else runs into the same issue, but I still very much appreciate the tutorial :)

Zoo Codes

Thank you for your feedback. I had to make changes in order to keep the article short.