
How to Use Background Tasks in FastAPI

Stanley Ulili
Updated on June 18, 2025

FastAPI's background tasks enable you to run time-consuming operations without requiring users to wait for API responses. You can process files, send emails, generate reports, or handle database maintenance in the background, all while your API remains fast and responsive.

When you use traditional synchronous processing, users must wait for slow operations to complete before receiving a response. With FastAPI's background tasks, your application returns responses immediately while continuing to work behind the scenes. This approach works exceptionally well for web applications where users expect quick responses, even when complex operations are running.

This guide shows you how to implement background tasks in FastAPI. You'll learn basic setup, advanced patterns, monitoring strategies, and production deployment tips.

Let's dive in!

Prerequisites

Before you get started, make sure Python 3.8+ is installed on your system. If you haven’t installed FastAPI and Uvicorn yet, install them using pip.

Step 1 — Setting up a FastAPI project

Before we start using background tasks, let’s first set up a basic FastAPI project. In this step, you'll create a simple app to make sure everything is working correctly before we add background functionality.

Start by creating a new directory for your project:

 
mkdir fastapi-background-tasks && cd fastapi-background-tasks

Now, create a virtual environment and activate it:

 
python3 -m venv venv
 
source venv/bin/activate

Install the required dependencies:

 
pip install fastapi uvicorn python-multipart aiofiles

You'll also use Python's built-in asyncio and datetime modules for the examples. You don't need any additional libraries for the core functionality.

Create a main.py file with a basic FastAPI app:

main.py
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
def read_root():
    return {"message": "FastAPI Background Tasks API", "status": "running"}

This creates a minimal FastAPI application with a single health check endpoint.

To verify your FastAPI installation works, run:

 
uvicorn main:app --reload

You should see output like this:

Output
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Started reloader process [71293] using StatReload
INFO:     Started server process [71297]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     127.0.0.1:55271 - "GET / HTTP/1.1" 200 OK
INFO:     127.0.0.1:55271 - "GET /favicon.ico HTTP/1.1" 404 Not Found

If you see Application startup complete., you're ready to continue.

Now open your browser and go to http://127.0.0.1:8000. You should see a response like this:

FastAPI root endpoint response

With your FastAPI app up and running, you're ready to start adding background tasks in the next step.

Step 2 — Creating your first background task

FastAPI provides a BackgroundTasks class that runs functions after the response has been sent to the client. You'll create a simple task tracking system for a project management application.

Update your main.py with this code:

main.py
from fastapi import FastAPI, BackgroundTasks
from datetime import datetime
from pydantic import BaseModel

app = FastAPI()


class TaskCreate(BaseModel):
    task_name: str
    assigned_to: str


def track_activity(project_id: int, activity: str, user: str):
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    log_entry = f"[{timestamp}] Project {project_id}: {activity} by {user}\n"

    with open("activities.log", "a") as f:
        f.write(log_entry)
    print(f"Tracked: {activity}")


@app.post("/projects/{project_id}/tasks")
def create_task(
    project_id: int, task_data: TaskCreate, background_tasks: BackgroundTasks
):
    task_id = f"task_{project_id}_{len(task_data.task_name)}"

    background_tasks.add_task(
        track_activity,
        project_id,
        f"created '{task_data.task_name}'",
        task_data.assigned_to,
    )

    return {"task_id": task_id, "status": "created"}

This code adds the BackgroundTasks parameter to the endpoint, defines a logging function that writes to a file, and uses background_tasks.add_task() to schedule the logging function to run after the response is returned.

To test it, run the following command:

 
curl -X POST "http://127.0.0.1:8000/projects/101/tasks" \
     -H "Content-Type: application/json" \
     -d '{"task_name": "Setup database", "assigned_to": "Alice"}'

You’ll get a response like:

Output
{"task_id":"task_101_14","status":"created"}                      

Alternatively, you can test it in Postman:

Screenshot of the response in Postman

In the terminal where your FastAPI server is running, you’ll see:

Output
Tracked: created 'Setup database'

Now, check the contents of the log file:

 
cat activities.log
Output
[2025-06-18 12:39:32] Project 101: created 'Setup database' by Alice
[2025-06-18 12:40:30] Project 101: created 'Setup database' by Alice

This confirms that the API returns instantly while the logging runs in the background, making your application more responsive without losing important task details.
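Under the hood, BackgroundTasks is essentially an ordered list of callables that run, in the order they were added, once the response has been sent. The stripped-down sketch below (a toy illustration, not FastAPI's actual implementation) shows the idea:

```python
class MiniBackgroundTasks:
    """Toy imitation of FastAPI's BackgroundTasks: tasks run in the order added."""

    def __init__(self):
        self.tasks = []

    def add_task(self, func, *args, **kwargs):
        # Tasks are only recorded here, not executed yet
        self.tasks.append((func, args, kwargs))

    def run_all(self):
        # In FastAPI, this happens after the response is sent
        for func, args, kwargs in self.tasks:
            func(*args, **kwargs)


log = []
tasks = MiniBackgroundTasks()
tasks.add_task(log.append, "first")
tasks.add_task(log.append, "second")
# ... the response would be returned to the client here ...
tasks.run_all()
print(log)  # ['first', 'second']
```

This is also why the print from track_activity appears in the server terminal only after curl has already received its JSON response.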

Step 3 — Handling complex data in background tasks

Now that you have a basic background task working, let's explore how to process more complex data structures and perform computational work. You'll build a customer feedback system that analyzes reviews for sentiment and saves detailed results.

Background tasks can handle multiple parameters and complex operations without affecting response times. This is perfect for tasks like data analysis, report generation, or any processing that involves calculations.

Add these functions to your main.py:

main.py
from fastapi import FastAPI, BackgroundTasks
from datetime import datetime
from pydantic import BaseModel
import json

app = FastAPI()


class TaskCreate(BaseModel):
    task_name: str
    assigned_to: str


class ReviewSubmission(BaseModel):
    customer_id: int
    rating: int
    review_text: str


def track_activity(project_id: int, activity: str, user: str):
    ...


def analyze_review(customer_id: int, product_id: int, rating: int, review_text: str):
    import time

    time.sleep(2)  # Simulate processing time

    # Perform sentiment analysis
    positive_words = ["great", "excellent", "amazing", "love", "perfect"]
    sentiment_score = sum(1 for word in positive_words if word in review_text.lower())

    analysis_result = {
        "customer_id": customer_id,
        "product_id": product_id,
        "rating": rating,
        "sentiment_score": sentiment_score,
        "processed_at": datetime.now().isoformat(),
    }

    with open("review_analysis.json", "a") as f:
        f.write(json.dumps(analysis_result) + "\n")
    print(f"Analyzed review: {rating}/5 stars, sentiment: {sentiment_score}")


@app.post("/projects/{project_id}/tasks")
def create_task(
    project_id: int, task_data: TaskCreate, background_tasks: BackgroundTasks
):
    ...


@app.post("/products/{product_id}/reviews")
def submit_review(
    product_id: int,
    review_data: ReviewSubmission,
    background_tasks: BackgroundTasks,
):
    review_id = f"review_{product_id}_{review_data.customer_id}"

    background_tasks.add_task(
        analyze_review,
        review_data.customer_id,
        product_id,
        review_data.rating,
        review_data.review_text,
    )

    return {"review_id": review_id, "status": "submitted"}

This enhanced version shows how to pass complex data structures to background tasks and perform detailed analysis. The review gets processed after the user receives confirmation, keeping the API responsive.

Test the new review system:

 
curl -X POST "http://127.0.0.1:8000/products/789/reviews" \
     -H "Content-Type: application/json" \
     -d '{
       "customer_id": 456,
       "rating": 5,
       "review_text": "This product is amazing and excellent quality!"
     }'

You'll get an immediate response:

Output
{"review_id":"review_789_456","status":"submitted"}                  

You can also test this in Postman by creating a POST request to http://127.0.0.1:8000/products/789/reviews with the JSON body:

Screenshot of the Postman UI

After about 2 seconds, you'll see this in your terminal:

Output
Analyzed review: 5/5 stars, sentiment: 2

Check the detailed analysis results:

 
cat review_analysis.json
Output
"customer_id": 456, "product_id": 789, "rating": 5, "sentiment_score": 2, "processed_at": "2025-06-18T13:03:21.928977"}

This example demonstrates how background tasks can handle complex data processing, including sentiment analysis and structured data storage, all while maintaining fast API response times.
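The sentiment score of 2 in the output comes from the two positive keywords ("amazing" and "excellent") in the review text. You can check the scoring logic from analyze_review in isolation:

```python
# Same keyword list and counting logic used in analyze_review
positive_words = ["great", "excellent", "amazing", "love", "perfect"]


def score(text: str) -> int:
    # Count how many positive keywords appear in the lowercased text
    return sum(1 for word in positive_words if word in text.lower())


print(score("This product is amazing and excellent quality!"))  # 2
print(score("Not what I expected"))  # 0
```

Note that this is a deliberately simple substring check for demo purposes; a real sentiment pipeline would use a proper NLP library instead.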

Step 4 — Processing file uploads with background tasks

File processing is perfect for background tasks because file operations can take a long time and users shouldn't have to wait for uploads to be processed. You'll create a document processing system that handles file uploads and analyzes them without blocking the API response.

This step shows you how to accept file uploads, save them immediately, and then process them in the background while giving users instant confirmation.

Add these file processing functions to your main.py:

main.py
from fastapi import FastAPI, BackgroundTasks, UploadFile, File
from datetime import datetime
from pydantic import BaseModel
from pathlib import Path
import json

app = FastAPI()

# Create upload directory
UPLOAD_DIR = Path("uploads")
UPLOAD_DIR.mkdir(exist_ok=True)


class TaskCreate(BaseModel):
    task_name: str
    assigned_to: str


class ReviewSubmission(BaseModel):
    customer_id: int
    rating: int
    review_text: str


def track_activity(project_id: int, activity: str, user: str):
    ...


def analyze_review(customer_id: int, product_id: int, rating: int, review_text: str):
    ...


def process_document(filename: str, file_size: int, uploaded_by: str):
    import time

    time.sleep(3)  # Simulate document processing time

    # Simulate document analysis
    file_path = UPLOAD_DIR / filename
    word_count = file_size // 5  # Rough estimate for demo

    processing_result = {
        "filename": filename,
        "file_size": file_size,
        "uploaded_by": uploaded_by,
        "word_count": word_count,
        "status": "processed",
        "processed_at": datetime.now().isoformat(),
    }

    with open("document_analysis.json", "a") as f:
        f.write(json.dumps(processing_result) + "\n")
    print(f"Processed document: {filename} ({file_size} bytes)")


@app.post("/projects/{project_id}/tasks")
def create_task(
    project_id: int, task_data: TaskCreate, background_tasks: BackgroundTasks
):
    ...


@app.post("/products/{product_id}/reviews")
def submit_review(
    product_id: int, review_data: ReviewSubmission, background_tasks: BackgroundTasks
):
    ...


@app.post("/documents/upload")
async def upload_document(
    background_tasks: BackgroundTasks,
    file: UploadFile = File(...),
    uploaded_by: str = "user",
):
    # Save file immediately
    file_path = UPLOAD_DIR / file.filename
    content = await file.read()
    with open(file_path, "wb") as f:
        f.write(content)
    file_size = len(content)

    # Process document in background
    background_tasks.add_task(process_document, file.filename, file_size, uploaded_by)

    return {
        "filename": file.filename,
        "file_size": file_size,
        "status": "uploaded",
        "message": "File uploaded successfully and is being processed",
    }

This code shows how to handle file uploads where the file gets saved immediately, but the processing happens in the background. Users get instant confirmation without waiting for analysis to complete.

Create a test file and upload it:

 
echo "This is a sample business document with important information." > test_document.txt

Now test the file upload:

 
curl -X POST "http://127.0.0.1:8000/documents/upload" \
     -F "file=@test_document.txt" \
     -F "uploaded_by=john_doe"

You'll get an immediate response:

Output
{"filename":"test_document.txt","file_size":63,"status":"uploaded","message":"File uploaded successfully and is being processed"}

You can also test this in Postman by creating a POST request to http://127.0.0.1:8000/documents/upload, setting the body type to "form-data", and adding:

- Key: file (File type), Value: your test file
- Key: uploaded_by (Text type), Value: john_doe

Screenshot of Postman file upload test

After about 3 seconds, you'll see this in your terminal:

Output
Processed document: test_document.txt (63 bytes)

Check the processing results:

 
cat document_analysis.json
Output
{"filename": "test_document.txt", "file_size": 63, "uploaded_by": "user", "word_count": 12, "status": "processed", "processed_at": "2025-06-18T14:03:25.699705"}

Verify the file was saved:

 
ls uploads/
Output
test_document.txt

This example demonstrates how file uploads can be handled efficiently with background processing.

The file gets saved immediately, so users know their upload succeeded, while the time-consuming analysis happens in the background without making them wait.
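One caveat worth noting: the endpoint above writes file.filename to disk as-is. In production you should sanitize client-supplied filenames before building a path from them, since a name like ../../etc/passwd could otherwise escape the upload directory. A minimal stdlib sketch (the safe_upload_path helper is a hypothetical addition, not part of the tutorial code):

```python
from pathlib import Path

UPLOAD_DIR = Path("uploads")


def safe_upload_path(filename: str) -> Path:
    # Keep only the final path component to block traversal like "../../etc/passwd"
    name = Path(filename).name
    if not name or name.startswith("."):
        raise ValueError(f"Invalid filename: {filename!r}")
    return UPLOAD_DIR / name


print(safe_upload_path("../../etc/passwd").as_posix())  # uploads/passwd
print(safe_upload_path("report.txt").as_posix())  # uploads/report.txt
```

In upload_document, you would call this helper instead of using UPLOAD_DIR / file.filename directly.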

Final thoughts

FastAPI’s BackgroundTasks help you run slow or resource-heavy operations without delaying the user’s response.

Tasks like logging, data analysis, and file processing can be handled in the background while the API continues to serve requests quickly. This improves performance, keeps the codebase clean, and ensures a smoother experience for users.

To learn more, visit the FastAPI Background Tasks documentation.

Licensed under CC-BY-NC-SA

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
