
Celery + Flower

intermediate

Distributed task queue with monitoring UI.

Overview

Celery is an open-source distributed task queue for Python applications that enables asynchronous processing of time-consuming operations. Originally created in 2009 by Ask Solem, Celery has become the de facto standard for handling background work in Python, from simple email sending to complex data processing pipelines. It uses a message broker such as Redis to distribute tasks across multiple worker processes, making it essential for building scalable web applications that need to perform heavy computations without blocking user requests.

This stack combines Celery with Redis as the message broker and result backend, plus Flower as the real-time monitoring interface. Redis serves two purposes here: queuing pending tasks for workers to consume and storing task results for retrieval. Flower provides a web-based dashboard that visualizes task execution, worker status, queue lengths, and failure rates in real time. Together they form a complete asynchronous task processing system with full observability.

Python developers building web applications with Django, Flask, or FastAPI will find this stack invaluable when they need to offload CPU-intensive operations, schedule periodic tasks, or handle multi-step workflows. Data engineering teams processing large datasets, e-commerce platforms handling order processing, and content management systems generating thumbnails or processing uploads all benefit from this distributed task architecture. The Flower monitoring interface makes the stack particularly suitable for production environments where task visibility and debugging capabilities are crucial.
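
The worker service in this recipe starts with celery -A tasks worker, so it expects a tasks.py module that defines the Celery app. A minimal sketch, assuming the broker and result-backend URLs used in the compose file below; the add task is an illustrative placeholder:

tasks.py

# tasks.py -- minimal Celery application for this recipe. The broker and
# backend URLs match the environment variables set in the compose file;
# the add() task is an illustrative placeholder.
import os

from celery import Celery

app = Celery(
    "tasks",
    broker=os.environ.get("CELERY_BROKER_URL", "redis://redis:6379/0"),
    backend=os.environ.get("CELERY_RESULT_BACKEND", "redis://redis:6379/0"),
)

@app.task
def add(x, y):
    # Runs on a worker process, not in the caller's request cycle.
    return x + y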

Key Features

  • Redis-backed task queue with sub-millisecond message delivery and automatic task routing to available workers
  • Flower web UI providing real-time task monitoring, execution graphs, and worker performance metrics at port 5555
  • Celery worker auto-scaling support through Docker Compose scaling commands for handling variable workloads
  • Task result persistence in Redis with configurable TTL for retrieving completed task outcomes
  • Built-in task retry mechanisms with exponential backoff and custom retry strategies for handling transient failures (see the retry sketch after this list)
  • Celery beat integration support for periodic and scheduled task execution using crontab-like syntax
  • Worker prefork pool management allowing multiple concurrent tasks per worker instance
  • Task routing and priority queues enabling different processing lanes for urgent vs background tasks
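
Retry behavior is declared on the task decorator. A minimal sketch continuing the tasks.py from the Overview; the fetch_url task and its HTTP call are illustrative placeholders, not part of the recipe:

tasks.py

# Continuation of tasks.py from the Overview sketch; `app` is the
# Celery instance defined there.
import requests

@app.task(
    autoretry_for=(requests.RequestException,),  # retry transient HTTP errors
    retry_backoff=True,       # exponential backoff: 1s, 2s, 4s, ...
    retry_backoff_max=300,    # cap the delay at five minutes
    retry_jitter=True,        # randomize delays to avoid thundering herds
    max_retries=5,            # give up after five attempts
)
def fetch_url(url):
    return requests.get(url, timeout=10).text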

Common Use Cases

  • Django/Flask web applications offloading image resizing, PDF generation, or email sending to background workers
  • E-commerce platforms processing order fulfillment, inventory updates, and payment processing asynchronously
  • Data analytics pipelines running ETL jobs, report generation, and machine learning model training tasks
  • Content management systems handling video transcoding, thumbnail generation, and bulk content operations
  • API services performing external service integrations, webhook processing, and third-party data synchronization
  • Financial applications executing batch payment processing, risk calculations, and compliance reporting
  • IoT data processing platforms handling sensor data aggregation, anomaly detection, and alert generation

Prerequisites

  • Python application with Celery tasks defined using the @app.task decorator in a tasks.py file
  • Docker and Docker Compose installed with minimum 512MB RAM available for Redis operations
  • Basic understanding of Python async concepts and message queue architectures
  • Port 5555 available for Flower web interface access and monitoring
  • Dockerfile in project root that installs Celery and your application dependencies
  • Knowledge of Celery configuration options for broker URLs, task serialization, and result backends (a configuration sketch follows this list)
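
Serialization and result-TTL options are set on the app configuration. A minimal sketch continuing the tasks.py from the Overview; the values shown are illustrative, not recipe defaults:

tasks.py

# Continuation of tasks.py; `app` is the Celery instance from the
# Overview sketch. Values are illustrative.
app.conf.update(
    task_serializer="json",    # encoding for task messages on the queue
    result_serializer="json",  # encoding for results in the backend
    accept_content=["json"],   # reject pickle payloads from producers
    result_expires=3600,       # TTL in seconds for results kept in Redis
)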

For development & testing. Review security settings, change default credentials, and test thoroughly before production use.

docker-compose.yml

docker-compose.yml
services:
  redis:
    image: redis:alpine
    container_name: celery-redis
    networks:
      - celery

  celery-worker:
    build: .
    command: celery -A tasks worker --loglevel=info
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
      CELERY_RESULT_BACKEND: redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - celery

  flower:
    image: mher/flower:latest
    container_name: flower
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
    ports:
      - "5555:5555"
    depends_on:
      - redis
    networks:
      - celery

networks:
  celery:
    driver: bridge

.env Template

.env
# Requires your Python app with tasks.py

Usage Notes

  1. Docs: https://docs.celeryq.dev/
  2. Flower monitoring UI at http://localhost:5555
  3. Requires your Python app with Celery tasks defined
  4. Create tasks.py with functions decorated via @app.task; enqueue them with .delay() as shown in the sketch after this list
  5. Scale workers: docker compose up -d --scale celery-worker=3
  6. Monitor task status, retries, and worker health in Flower
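
Enqueuing a task and reading its result back from the Redis backend looks like this; a minimal sketch, assuming the placeholder add task from the Overview example:

python

# Run from your application code (e.g. a Django or Flask view).
from tasks import add

result = add.delay(2, 3)        # pushes the task to Redis; returns immediately
print(result.status)            # PENDING until a worker picks it up
print(result.get(timeout=10))   # blocks on the Redis result backend; prints 5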

Individual Services (3 services)

Copy individual services to mix and match with your existing compose files.

redis
redis:
  image: redis:alpine
  container_name: celery-redis
  networks:
    - celery
celery-worker
celery-worker:
  build: .
  command: celery -A tasks worker --loglevel=info
  environment:
    CELERY_BROKER_URL: redis://redis:6379/0
    CELERY_RESULT_BACKEND: redis://redis:6379/0
  depends_on:
    - redis
  networks:
    - celery
flower
flower:
  image: mher/flower:latest
  container_name: flower
  environment:
    CELERY_BROKER_URL: redis://redis:6379/0
  ports:
    - "5555:5555"
  depends_on:
    - redis
  networks:
    - celery

Quick Start

terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  redis:
    image: redis:alpine
    container_name: celery-redis
    networks:
      - celery

  celery-worker:
    build: .
    command: celery -A tasks worker --loglevel=info
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
      CELERY_RESULT_BACKEND: redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - celery

  flower:
    image: mher/flower:latest
    container_name: flower
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
    ports:
      - "5555:5555"
    depends_on:
      - redis
    networks:
      - celery

networks:
  celery:
    driver: bridge
EOF

# 2. Create the .env file
cat > .env << 'EOF'
# Requires your Python app with tasks.py
EOF

# 3. Start the services
docker compose up -d

# 4. View logs
docker compose logs -f

One-Liner

Run this command to download and set up the recipe in one step:

terminal
curl -fsSL https://docker.recipes/api/recipes/celery-flower/run | bash

Troubleshooting

  • Workers not connecting to Redis broker: Verify CELERY_BROKER_URL environment variable matches redis://redis:6379/0 format and Redis container is running
  • Tasks stuck in PENDING state: Check worker logs for import errors in tasks.py and ensure task decorators are properly applied
  • Flower showing no workers: Confirm celery-worker container started successfully and broker URL in Flower matches worker configuration
  • Memory errors in Redis: Increase Docker memory limits or implement Redis maxmemory policies with LRU eviction for result cleanup
  • Tasks timing out: Adjust Celery's task_time_limit and task_soft_time_limit settings (formerly CELERYD_TASK_TIME_LIMIT and CELERYD_TASK_SOFT_TIME_LIMIT) in your Celery configuration; see the sketch after this list
  • Flower UI not accessible: Check port 5555 mapping in docker-compose and verify no firewall blocking access to localhost:5555
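
Time limits can be set app-wide or per task; the soft limit raises SoftTimeLimitExceeded inside the task so it can clean up before the hard limit kills the worker child process. A minimal sketch continuing the tasks.py from the Overview; the limits and the long_job task are illustrative:

tasks.py

# Continuation of tasks.py; `app` is the Celery instance from the
# Overview sketch. Limits and the long_job task are illustrative.
import time

from celery.exceptions import SoftTimeLimitExceeded

app.conf.task_soft_time_limit = 60   # app-wide default, in seconds
app.conf.task_time_limit = 90        # hard limit: worker child is killed

@app.task(soft_time_limit=300, time_limit=360)  # per-task override
def long_job():
    try:
        time.sleep(600)  # stand-in for long-running work
    except SoftTimeLimitExceeded:
        pass  # release resources before the hard limit fires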

