
FastAPI Docker Best Practices

Stanley Ulili
Updated on June 11, 2025

FastAPI has changed how developers build Python web APIs with its speed and simplicity, but getting your app ready for production takes more than just putting it in a container.

You need to think about performance, security, scalability, and maintainability once real users start hitting your app hard. A basic Docker setup won't cut it when your startup idea takes off or your boss asks you to handle Black Friday traffic.

This guide will show you exactly how to deploy FastAPI with Docker the right way. You'll learn the practices that separate hobby projects from production systems that actually work.

1. Start with a solid Dockerfile foundation

Before you worry about fancy optimizations, ensure you get the basics right. A good Dockerfile is the foundation of everything else you'll build on top of it.

The FastAPI documentation shows you a simple approach that works great for most applications. Let's start there and understand why each line matters.

Here's the basic structure you should use:

Dockerfile
FROM python:3.13-slim

WORKDIR /code

COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

COPY ./app /code/app

CMD ["fastapi", "run", "app/main.py", "--port", "80"]

While this may look straightforward, there’s an important optimization happening here. Copying the requirements file and installing dependencies first allows Docker to cache that layer.

Docker builds your image in layers and caches each step. When you change your Python code, Docker doesn't need to reinstall all your dependencies - it just reuses the cached layer. This saves you tons of time during development.

Your project structure should look like this:

.
├── app/
│   ├── __init__.py
│   └── main.py
├── Dockerfile
└── requirements.txt

Always use the exec form for your CMD instruction. Write CMD ["fastapi", "run", "app/main.py", "--port", "80"] instead of CMD fastapi run app/main.py --port 80. The shell form wraps your server in /bin/sh, which can swallow the SIGTERM Docker sends on docker stop; the exec form runs the server as PID 1, so it receives the signal and shuts down gracefully.
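To make that concrete, here's a minimal sketch of shutdown cleanup in FastAPI's lifespan handler; it only runs if the server actually receives the SIGTERM:

app/main.py
from contextlib import asynccontextmanager

from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    yield
    # Runs during graceful shutdown - close database pools, flush
    # buffers, finish in-flight work. With the shell form CMD, the
    # SIGTERM may never reach the server and this never executes.
    print("Shutting down cleanly")

app = FastAPI(lifespan=lifespan)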

If you're running behind a proxy like Nginx or Traefik, add the --proxy-headers flag:

Dockerfile
CMD ["fastapi", "run", "app/main.py", "--proxy-headers", "--port", "80"]

This tells FastAPI to trust the headers from your proxy so it knows when requests are coming through HTTPS.

2. Handle environment configuration properly

Hardcoding settings in your FastAPI app is a recipe for disaster. You'll end up with different versions of your code for development and production, or worse, you'll accidentally commit API keys to Git.

Environment variables solve this problem. FastAPI works great with Pydantic settings that automatically load from environment variables.

Set up your configuration like this:

app/config.py
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    app_name: str = "My FastAPI App"
    debug: bool = False
    database_url: str
    secret_key: str

    model_config = SettingsConfigDict(env_file=".env")

settings = Settings()

Create a .env file for your local development:

.env
DATABASE_URL=sqlite:///./test.db
SECRET_KEY=your-secret-key-here
DEBUG=True

Never commit your .env file to Git. Add it to your .gitignore:

.gitignore
.env
.env.*

Instead, create a .env.example file that shows other developers what variables they need:

.env.example
DATABASE_URL=sqlite:///./test.db
SECRET_KEY=change-this-in-production
DEBUG=False

In your Docker Compose file, you can load environment variables easily:

docker-compose.yml
services:
  api:
    build: .
    ports:
      - "8000:80"
    env_file:
      - .env
    environment:
      - DEBUG=False

This approach keeps your secrets safe and lets you use the same code everywhere. Change your environment variables, not your code, when you deploy to different environments.
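For per-request access, the FastAPI documentation recommends exposing settings through a cached dependency instead of importing the module-level object everywhere. A minimal sketch (the /info route is just for illustration):

app/main.py
from functools import lru_cache

from fastapi import Depends, FastAPI

from app.config import Settings

@lru_cache
def get_settings() -> Settings:
    # Cached so the environment and .env file are only read once per process
    return Settings()

app = FastAPI()

@app.get("/info")
async def info(settings: Settings = Depends(get_settings)):
    return {"app_name": settings.app_name, "debug": settings.debug}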

3. Set up proper health checks

You need to know when your FastAPI app is healthy, especially when Docker or Kubernetes is managing your containers. A simple health check endpoint can save you hours of debugging.

Add this to your FastAPI app:

app/main.py
from fastapi import FastAPI, HTTPException

app = FastAPI()

@app.get("/health")
async def health_check():
    return {"status": "healthy"}

@app.get("/health/db")
async def health_check_db():
    try:
        # Check that the database is reachable. `database` stands in for
        # your actual async client - swap in your real connection object.
        await database.fetch_one("SELECT 1")
        return {"status": "healthy", "database": "connected"}
    except Exception:
        raise HTTPException(status_code=503, detail="Database connection failed")

Configure Docker to use your health check:

docker-compose.yml
services:
  api:
    build: .
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:80/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 40s
    restart: unless-stopped

The health check runs every 30 seconds. If it fails 3 times in a row, Docker marks the container as unhealthy, and you can configure your orchestrator to restart unhealthy containers automatically. One caveat: the python:3.13-slim base image doesn't ship with curl, so either install it in your Dockerfile or point the test at Python's urllib instead.

For more complex apps, check your database, Redis, or other critical services in your health check. But keep it simple - health checks should be fast and reliable.
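If you do wire up a combined readiness probe, a sketch might look like this - database and redis_client are stand-ins for whatever async connection objects your app actually holds:

app/main.py
# Continuing app/main.py from above; `database` and `redis_client`
# are assumed to be async clients created at startup.
@app.get("/health/ready")
async def readiness_check():
    checks = {}
    try:
        await database.fetch_one("SELECT 1")
        checks["database"] = "ok"
    except Exception:
        checks["database"] = "failed"
    try:
        await redis_client.ping()
        checks["redis"] = "ok"
    except Exception:
        checks["redis"] = "failed"
    if "failed" in checks.values():
        raise HTTPException(status_code=503, detail=checks)
    return {"status": "ready", "checks": checks}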

4. Optimize for Docker layer caching

Docker’s layer caching can greatly speed up your builds, but it depends on how your Dockerfile is structured. Even small adjustments can be the difference between a quick 30-second build and waiting several minutes.

Here's the key principle: put things that change often at the bottom of your Dockerfile. Put things that rarely change at the top.

Your requirements.txt file doesn't change very often. Your Python code changes all the time. So install requirements first, then copy your code:

Dockerfile
FROM python:3.13-slim

WORKDIR /code

# Copy and install requirements first
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

# Copy app code last (changes frequently)
COPY ./app /code/app

CMD ["fastapi", "run", "app/main.py", "--port", "80"]

When you change your Python code and rebuild, Docker reuses the layer with your installed packages. This saves huge amounts of time during development.

You can also use a .dockerignore file to avoid copying unnecessary files:

.dockerignore
.git
.pytest_cache
__pycache__
*.pyc
.env
README.md
.gitignore

This keeps your build context small and your builds fast.

For even better caching, you can separate your development and production dependencies:

Dockerfile
FROM python:3.13-slim

WORKDIR /code

# Build argument that controls whether dev dependencies get installed
ARG ENVIRONMENT=production

# Install production dependencies
COPY ./requirements.txt /code/requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

# Install dev dependencies only in development
COPY ./requirements-dev.txt /code/requirements-dev.txt
RUN if [ "$ENVIRONMENT" = "development" ] ; then pip install --no-cache-dir -r /code/requirements-dev.txt ; fi

COPY ./app /code/app

CMD ["fastapi", "run", "app/main.py", "--port", "80"]

5. Handle database connections and migrations

Your FastAPI app probably needs a database. In containerized environments, you need to handle database connections carefully and make sure your migrations run at the right time.

First, make sure your database is ready before your app starts. Use depends_on with health checks in Docker Compose:

docker-compose.yml
services:
  api:
    build: .
    depends_on:
      db:
        condition: service_healthy
    environment:
      - DATABASE_URL=postgresql://user:password@db/myapp

  db:
    image: postgres:15
    environment:
      POSTGRES_DB: myapp
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user"]
      interval: 5s
      timeout: 5s
      retries: 5

For database migrations, you have two good options. You can run them in a separate container before your app starts:

docker-compose.yml
services:
  migrate:
    build: .
    command: alembic upgrade head
    depends_on:
      db:
        condition: service_healthy
    environment:
      - DATABASE_URL=postgresql://user:password@db/myapp

  api:
    build: .
    depends_on:
      db:
        condition: service_healthy
      migrate:
        condition: service_completed_successfully

Or you can run them when your app starts, using FastAPI's lifespan handler (the modern replacement for the deprecated on_event hooks):

app/main.py
from contextlib import asynccontextmanager

from fastapi import FastAPI
from alembic.config import Config
from alembic import command

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Run migrations before the app starts serving requests
    alembic_cfg = Config("alembic.ini")
    command.upgrade(alembic_cfg, "head")
    yield

app = FastAPI(lifespan=lifespan)

The separate container approach is safer for production because it prevents multiple app instances from running migrations at the same time.
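On the connection side, remember that pools are sized per container: four replicas with a pool of five hold up to twenty base connections against your database. Here's a sketch assuming SQLAlchemy's async engine (app/db.py is a hypothetical module, and the URL needs an async driver such as postgresql+asyncpg://):

app/db.py
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

from app.config import settings

# Pool limits apply per container - multiply by your replica count
# when budgeting connections on the database side
engine = create_async_engine(
    settings.database_url,
    pool_size=5,         # steady-state connections
    max_overflow=10,     # extra connections for short bursts
    pool_pre_ping=True,  # detect stale connections before handing them out
)
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)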

6. Configure proper logging

Good logging is essential for production apps. You need to see what's happening, especially when things go wrong.

FastAPI uses Python's logging module. Configure it to output structured logs that are easy to parse:

app/main.py
import logging
import sys
import time

from fastapi import FastAPI, Request

# Configure logging to write JSON-style lines to stdout so Docker captures them
logging.basicConfig(
    level=logging.INFO,
    format='{"timestamp": "%(asctime)s", "level": "%(levelname)s", "message": "%(message)s", "module": "%(module)s"}',
    stream=sys.stdout
)

logger = logging.getLogger(__name__)

app = FastAPI()

@app.middleware("http")
async def log_requests(request: Request, call_next):
    start_time = time.time()

    response = await call_next(request)

    process_time = time.time() - start_time
    logger.info("Request completed", extra={
        "method": request.method,
        "url": str(request.url),
        "status_code": response.status_code,
        "process_time": round(process_time, 4)
    })

    return response
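One caveat with the format string above: the fields passed via extra never appear in the output, because the format string doesn't reference them. If you want fully structured logs, swap in a JSON formatter - a sketch using the python-json-logger package (pip install python-json-logger):

app/main.py
import logging
import sys

from pythonjsonlogger import jsonlogger

# JsonFormatter serializes the whole log record, including `extra` fields
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(jsonlogger.JsonFormatter("%(asctime)s %(levelname)s %(name)s %(message)s"))

root = logging.getLogger()
root.setLevel(logging.INFO)
root.handlers = [handler]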

Configure Docker to handle log rotation so your logs don't fill up your disk:

docker-compose.yml
services:
  api:
    build: .
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

In production, consider sending your logs to a centralized logging service like CloudWatch, Datadog, or the ELK stack.

7. Prepare for production scaling

When your app gets popular, you'll need to handle more traffic. Plan for this from the beginning.

For simple deployments, you can run multiple worker processes in a single container:

Dockerfile
CMD ["fastapi", "run", "app/main.py", "--port", "80", "--workers", "4"]

But for serious production deployments, you're better off running one process per container and scaling at the container level:

docker-compose.yml
services:
  api:
    build: .
    deploy:
      replicas: 4
    depends_on:
      - db
      - redis
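You can also scale from the command line without editing the file: docker compose up -d --scale api=4.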

Use a reverse proxy like Nginx to handle static files and SSL termination:

docker-compose.yml
services:
  nginx:
    image: nginx:alpine
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - ./static:/usr/share/nginx/html/static
    depends_on:
      - api

  api:
    build: .
    expose:
      - "80"

Configure your Nginx to proxy requests to your FastAPI containers:

nginx.conf
upstream fastapi_backend {
    # Docker's embedded DNS resolves the service name to every replica;
    # nginx round-robins across the addresses it sees at startup, so
    # reload nginx after scaling up
    server api:80;
}

server {
    listen 80;

    location /static/ {
        alias /usr/share/nginx/html/static/;
        expires 1y;
    }

    location / {
        proxy_pass http://fastapi_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}

This setup lets Nginx handle static files efficiently while your FastAPI containers focus on processing API requests.

Final thoughts

Building a production-ready FastAPI application with Docker isn't just about writing a Dockerfile. You need to consider caching, configuration, health checks, logging, and scaling from the outset.

Start with the basics - a solid Dockerfile, proper environment configuration, and basic health checks. Then add complexity as needed.

For deeper learning, check out the FastAPI Docker documentation for official guidance, Docker's best practices for containerization tips, and the Docker Compose documentation for orchestration details.
