API and PSQL integration example

In this section we will integrate the Python code with the PostgreSQL database. Each request and response will be stored in the database.

The whole project lives in the api_full_example folder:

├── api_full_example
│   ├── app.py
│   ├── db.py
│   ├── docker-compose.yml
│   └── __init__.py

The database will be created in the docker container exactly as in the previous section.

The connection and session objects will be created in the db.py file.

The endpoint logic and the application object will be created in the app.py file.

Launching the API

To launch the API, we will use the uvicorn command:

uvicorn api_full_example.app:app

The output in the terminal should look like:

...
...
...

INFO:     Started server process [27205]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
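Once the server is up, the endpoint can be exercised with any HTTP client; the two parameters travel in the query string. A small sketch of how such a request URL is assembled (the host and port mirror the uvicorn output above; the server must be running for an actual call):

```python
from urllib.parse import urlencode

# Address reported by uvicorn on startup
base = "http://127.0.0.1:8000/root"

# Query parameters expected by the /root endpoint
params = {"number": 2, "n": 0.5}

url = f"{base}?{urlencode(params)}"
print(url)  # http://127.0.0.1:8000/root?number=2&n=0.5
```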

Upon launching the API, all the code in the db.py script is run:

  • Ensuring that the database ROOT_DB exists

  • Ensuring that the tables requests and responses exist

  • Creating the tables if they do not exist

  • Creating the connection object

The docker-compose.yml file for database creation:

!cat api_full_example/docker-compose.yml
version: '3.1'

services:

  db:
    image: postgres:14.1
    restart: always
    environment:
      POSTGRES_PASSWORD: password
      POSTGRES_USER: user 
    ports:
      - "5431:5432"
    volumes:
      - ./data/db:/var/lib/postgresql/data
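The compose file publishes the container's PostgreSQL port 5432 on host port 5431, which is why the connection strings in db.py point at localhost:5431. A sketch of how the connection URL follows from the compose values above:

```python
# Values taken from docker-compose.yml
user = "user"
password = "password"
host = "localhost"  # the container's port is published on the host
port = 5431         # host port mapped to the container's 5432
db = "postgres"     # default maintenance database

# The SQLAlchemy connection URL used in db.py
url = f"postgresql://{user}:{password}@{host}:{port}/{db}"
print(url)  # postgresql://user:password@localhost:5431/postgres
```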

The full code of the db.py script is:

!cat api_full_example/db.py
# ORM functions for the database 
from sqlalchemy.orm import sessionmaker, declarative_base

# Creating the engine for the database
from sqlalchemy import create_engine

# Model creation 
from sqlalchemy import Column, Integer, Float, DateTime
from sqlalchemy.sql.schema import ForeignKey

# Data wrangling
import pandas as pd

# Dates and times
import datetime

# Connecting to the PSQL database
init_engine = create_engine('postgresql://user:password@localhost:5431/postgres')

# Ensuring that the database ROOT_DB is created in the PSQL server 
conn = init_engine.connect()

# When initiated, the connection object is in a state of an active transaction. We need to commit it in order to make the changes permanent.
conn.execute("commit")

# Giving the database a name 
db_name = 'root_db'

# Getting all the database names 
databases = pd.read_sql("SELECT datname FROM pg_database;", init_engine)["datname"].values.tolist()

if db_name in databases:
    print(f'Database {db_name} already exists')
else:
    print('Database does not exist; Creating it')
    # Creating the database
    conn.execute(f"CREATE DATABASE {db_name}")
    conn.execute("commit")
conn.close()

# Creating the engine to the main database
engine = create_engine(f'postgresql://user:password@localhost:5431/{db_name}')

# Initiating the Base class
Base = declarative_base()

# Defining the models - Request and Response
class Request(Base):
    # Table name in database
    __tablename__ = 'requests'
    
    # If any changes are made to the columns, allow the database to know about it
    __table_args__ = {'extend_existing': True} 

    id = Column(Integer, primary_key=True)
    number = Column(Float)
    n = Column(Float)
    request_datetime = Column(DateTime)

    def __init__(self, number, n):
        self.number = number
        self.n = n
        self.request_datetime = datetime.datetime.now()

class Response(Base):
    # Table name in database
    __tablename__ = 'responses'
    
    # If any changes are made to the columns, allow the database to know about it
    __table_args__ = {'extend_existing': True} 

    id = Column(Integer, primary_key=True)
    # Each response has to have an associated request. This association is defined by the foreign key.
    # Hence, every response will have an associated request ID column value.
    request_id = Column(Integer, ForeignKey('requests.id'))
    root = Column(Float)
    response_code = Column(Integer)
    response_datetime = Column(DateTime)

    def __init__(self, request_id, root, response_code):
        self.request_id = request_id
        self.root = root
        self.response_code = response_code
        self.response_datetime = datetime.datetime.now()

# Creating the models in the database if they do not exist
tables = pd.read_sql("SELECT tablename FROM pg_catalog.pg_tables;", engine)["tablename"].values.tolist()
if 'requests' not in tables or 'responses' not in tables:
    print('Tables do not exist; Creating them')
    Base.metadata.create_all(engine)

# Creating the session
Session = sessionmaker(bind=engine)
session = Session()
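The Request model and the session workflow above are plain SQLAlchemy, so the logging flow can be tried out in isolation. A minimal sketch using an in-memory SQLite engine as a stand-in for PostgreSQL (the real script binds the session to the PostgreSQL engine):

```python
import datetime

from sqlalchemy import create_engine, Column, Integer, Float, DateTime
from sqlalchemy.orm import sessionmaker, declarative_base

Base = declarative_base()

# Same shape as the Request model in db.py
class Request(Base):
    __tablename__ = 'requests'

    id = Column(Integer, primary_key=True)
    number = Column(Float)
    n = Column(Float)
    request_datetime = Column(DateTime)

    def __init__(self, number, n):
        self.number = number
        self.n = n
        self.request_datetime = datetime.datetime.now()

# An in-memory SQLite engine stands in for PostgreSQL
engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# Logging a request, exactly like the endpoint does
session.add(Request(number=2, n=0.5))
session.commit()

stored = session.query(Request).one()
print(stored.id, stored.number, stored.n)  # 1 2.0 0.5
```

Committing assigns the auto-incremented primary key, which is why app.py can read request.id right after session.commit() and pass it to the response as the foreign key.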

The app.py script imports the session along with the Request and Response database models from the db.py file and creates the API application object.

Contents of the app.py file:

!cat api_full_example/app.py
# Importing the fastAPI library
from fastapi import FastAPI

# Importing the session, requests and responses models
from db import session, Request, Response

# Creating an instance of the FastAPI class
app = FastAPI()

# Creating an endpoint with the GET method
@app.get("/root")
def root_of_number(number: float, n: float):
    """
    The function returns the n-th root of the number.

    Parameters
    ----------
    number : float
        The number to find the n-th root of.
    n : float
        The n-th root to find.

    Returns
    -------
    float
        The n-th root of the number.
    """
    # Creating a new request object
    request = Request(number, n)

    # Adding the request to the session
    session.add(request)

    # Committing to database
    session.commit()

    # Calculating the root
    try:
        root = number ** n

        # Creating the response object
        response = Response(request.id, root, 200)

        # Adding the response to the session
        session.add(response)

        # Committing to database
        session.commit()

        # Returning the response to the user 
        return {"root": number ** n} 
    except Exception as e:
        # Creating the response object
        response = Response(request.id, None, 500)

        # Adding the response to the session
        session.add(response)

        # Committing to database
        session.commit()

        # Returning the response to the user 
        return {"error": str(e)}

Querying and inspecting the flow of the API

Let's send a request to the API.

NOTE: the API and the database need to be running on the machine where this notebook is being executed.

# Import the API querying lib
import requests

# Defining some numbers and roots 
numbers = [1, 2, 3, 4, 5]
roots = [0.5, 0.4, 0.3, 0.2, 0.1]

# Zipping for the loop
numbers_and_roots = zip(numbers, roots)

for number_and_root in numbers_and_roots:
    # Unpacking the numbers and roots
    number, root = number_and_root

    # Sending the request
    response = requests.get(f'http://localhost:8000/root?number={number}&n={root}')

    # Printing the calculated root
    print(f"Root {root} of {number}: {response.json()['root']}")
Root 0.5 of 1: 1.0
Root 0.4 of 2: 1.3195079107728942
Root 0.3 of 3: 1.3903891703159093
Root 0.2 of 4: 1.3195079107728942
Root 0.1 of 5: 1.174618943088019

After the request is sent to the API server, it is immediately logged in the database. Each request gets a unique ID and a timestamp of when it was created.

Then the API server runs the code defined in the root_of_number view in app.py.

Whether or not the request is processed successfully, the response is logged to the database.

The contents of the requests and responses tables:

import pandas as pd 
from api_full_example.db import engine 

# Listing last 5 requests
pd.read_sql("select * from requests", engine).tail(5)
Database root_db already exists
    id  number    n           request_datetime
15  16     1.0  0.5 2022-01-16 14:59:09.398754
16  17     2.0  0.4 2022-01-16 14:59:09.417805
17  18     3.0  0.3 2022-01-16 14:59:09.427194
18  19     4.0  0.2 2022-01-16 14:59:09.435537
19  20     5.0  0.1 2022-01-16 14:59:09.443541
# Listing last 5 responses
pd.read_sql("select * from responses", engine).tail(5)
    id  request_id      root  response_code          response_datetime
15  16          16  1.000000            200 2022-01-16 14:59:09.410899
16  17          17  1.319508            200 2022-01-16 14:59:09.421567
17  18          18  1.390389            200 2022-01-16 14:59:09.430304
18  19          19  1.319508            200 2022-01-16 14:59:09.438796
19  20          20  1.174619            200 2022-01-16 14:59:09.446394

This covers the most basic need when creating an API: tracking the requests and responses in a database. You can add more fields about the request, such as the user agent, IP address, or country of origin, as well as about the response: the content type, content length, in-depth error messages, etc.
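For instance, request metadata could be captured by adding columns to the Request model. The sketch below is illustrative only: the column names are hypothetical, and the values would come from FastAPI's incoming request object (request.headers and request.client), not from the project code above.

```python
import datetime

from sqlalchemy import create_engine, inspect, Column, Integer, Float, String, DateTime
from sqlalchemy.orm import declarative_base

Base = declarative_base()

# Request model extended with hypothetical metadata columns
class Request(Base):
    __tablename__ = 'requests'

    id = Column(Integer, primary_key=True)
    number = Column(Float)
    n = Column(Float)
    user_agent = Column(String)  # e.g. request.headers.get("user-agent")
    ip_address = Column(String)  # e.g. request.client.host
    request_datetime = Column(DateTime, default=datetime.datetime.now)

# In-memory SQLite stands in for PostgreSQL here
engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

# Listing the columns of the extended table
print([c['name'] for c in inspect(engine).get_columns('requests')])
```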

What's next?

We have created an API that receives a request, calculates a result, sends the response back to the client and logs some information to the database.

In the next chapter of the book we will containerize the API and deploy it using Docker. Additionally, we will talk about Gunicorn and Nginx, which complete the full puzzle of deploying the API.

Contributions

If you have enjoyed the book so far and feel like donating, feel free to do so. A one-time donation can be made via Stripe.

Additionally, if you want me to add another chapter or to expand an existing one, please create an issue on GitHub.