Celery + Flower
Distributed task queue with monitoring UI.
Overview
Celery is an open-source distributed task queue system for Python applications that enables asynchronous processing of time-consuming operations. Originally developed in 2009 by Ask Solem, Celery has become the de facto standard for handling background tasks in Python applications, from simple email sending to complex data processing pipelines. It uses a message broker like Redis to distribute tasks across multiple worker processes, making it essential for building scalable web applications that need to perform heavy computations without blocking user requests.
This stack combines Celery with Redis as the message broker and result backend, plus Flower as the real-time monitoring interface. Redis serves dual purposes here: queuing pending tasks for workers to consume and storing task results for retrieval. Flower provides a web-based dashboard that visualizes task execution, worker status, queue lengths, and failure rates in real-time. This combination creates a complete asynchronous task processing system with full observability.
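A minimal tasks.py is all the glue this stack needs; the sketch below is a hypothetical example (the add task is illustrative) that reads the same CELERY_BROKER_URL and CELERY_RESULT_BACKEND variables the compose file below passes to the worker:

tasks.py
import os

from celery import Celery

app = Celery(
    "tasks",
    broker=os.environ.get("CELERY_BROKER_URL", "redis://localhost:6379/0"),
    backend=os.environ.get("CELERY_RESULT_BACKEND", "redis://localhost:6379/0"),
)

@app.task
def add(x, y):
    # Placeholder work; swap in email sending, thumbnailing, ETL, etc.
    return x + y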
Python developers building web applications with Django, Flask, or FastAPI will find this stack invaluable when they need to offload CPU-intensive operations, schedule periodic tasks, or handle workflows that involve multiple steps. Data engineering teams processing large datasets, e-commerce platforms handling order processing, and content management systems generating thumbnails or processing uploads all benefit from this distributed task architecture. The Flower monitoring interface makes this particularly suitable for production environments where task visibility and debugging capabilities are crucial.
Key Features
- Redis-backed task queue with sub-millisecond message delivery and automatic task routing to available workers
- Flower web UI providing real-time task monitoring, execution graphs, and worker performance metrics on port 5555
- Celery worker auto-scaling support through Docker Compose scaling commands for handling variable workloads
- Task result persistence in Redis with configurable TTL for retrieving completed task outcomes
- Built-in task retry mechanisms with exponential backoff and custom retry strategies for handling transient failures (retries, scheduling, and routing are sketched after this list)
- Celery beat integration support for periodic and scheduled task execution using crontab-like syntax
- Worker prefork pool management allowing multiple concurrent tasks per worker instance
- Task routing and priority queues enabling different processing lanes for urgent vs background tasks
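The retry, scheduling, and routing features above map onto a handful of standard Celery options; a sketch continuing the hypothetical tasks.py (task and queue names are illustrative):

tasks.py (continued)
from celery.schedules import crontab

# Retry with exponential backoff on transient failures.
@app.task(
    autoretry_for=(ConnectionError,),  # retry only transient errors
    retry_backoff=True,                # 1s, 2s, 4s, ... between attempts
    retry_backoff_max=600,             # cap the delay at 10 minutes
    retry_kwargs={"max_retries": 5},
)
def sync_external_service():
    ...

# Celery beat: crontab-like periodic scheduling. Run a separate
# `celery -A tasks beat` process alongside the workers for this to fire.
app.conf.beat_schedule = {
    "nightly-sync": {
        "task": "tasks.sync_external_service",
        "schedule": crontab(hour=2, minute=30),
    },
}

# Routing: send urgent work to a dedicated queue; a worker consumes it
# with `celery -A tasks worker -Q priority,celery`.
app.conf.task_routes = {
    "tasks.sync_external_service": {"queue": "priority"},
}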
Common Use Cases
- Django/Flask web applications offloading image resizing, PDF generation, or email sending to background workers
- E-commerce platforms processing order fulfillment, inventory updates, and payment processing asynchronously
- Data analytics pipelines running ETL jobs, report generation, and machine learning model training tasks
- Content management systems handling video transcoding, thumbnail generation, and bulk content operations
- API services performing external service integrations, webhook processing, and third-party data synchronization
- Financial applications executing batch payment processing, risk calculations, and compliance reporting
- IoT data processing platforms handling sensor data aggregation, anomaly detection, and alert generation
Prerequisites
- Python application with Celery tasks defined using @app.task decorator in a tasks.py file
- Docker and Docker Compose installed with minimum 512MB RAM available for Redis operations
- Basic understanding of Python async concepts and message queue architectures
- Port 5555 available for Flower web interface access and monitoring
- Dockerfile in project root that installs Celery and your application dependencies
- Knowledge of Celery configuration options for broker URLs, task serialization, and result backends (a short configuration sketch follows this list)
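For the last point, the configuration surface this recipe depends on is small; a sketch with illustrative values:

tasks.py (continued)
app.conf.update(
    task_serializer="json",    # how task arguments cross the broker
    result_serializer="json",
    accept_content=["json"],   # refuse pickled payloads off the wire
    result_expires=3600,       # TTL in seconds for results kept in Redis
)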
For development & testing. Review security settings, change default credentials, and test thoroughly before production use.
docker-compose.yml
services:
  redis:
    image: redis:alpine
    container_name: celery-redis
    networks:
      - celery

  celery-worker:
    build: .
    command: celery -A tasks worker --loglevel=info
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
      CELERY_RESULT_BACKEND: redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - celery

  flower:
    image: mher/flower:latest
    container_name: flower
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
    ports:
      - "5555:5555"
    depends_on:
      - redis
    networks:
      - celery

networks:
  celery:
    driver: bridge

.env Template
.env
# Requires your Python app with tasks.py

Usage Notes
- Docs: https://docs.celeryq.dev/
- Flower monitoring UI at http://localhost:5555
- Requires your Python app with Celery tasks defined
- Create tasks.py: @app.task def my_task(): ...
- Scale workers: docker compose up -d --scale celery-worker=3
- Monitor task status, retries, and worker health in Flower (a REST API probe is sketched below)
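Flower also exposes a REST API alongside the dashboard; assuming the default unauthenticated setup from this compose file, a quick worker health check might look like:

check_flower.py
# Hypothetical status probe against Flower's REST API; assumes the stack
# is up and Flower is reachable on localhost:5555.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:5555/api/workers") as resp:
    workers = json.load(resp)

# Rough heuristic: a worker that reports stats is treated as online.
for name, info in workers.items():
    print(name, "online" if info else "offline")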
Individual Services (3 services)
Copy individual services to mix and match with your existing compose files.
redis
redis:
image: redis:alpine
container_name: celery-redis
networks:
- celery
celery-worker
celery-worker:
build: .
command: celery -A tasks worker --loglevel=info
environment:
CELERY_BROKER_URL: redis://redis:6379/0
CELERY_RESULT_BACKEND: redis://redis:6379/0
depends_on:
- redis
networks:
- celery
flower
flower:
image: mher/flower:latest
container_name: flower
environment:
CELERY_BROKER_URL: redis://redis:6379/0
ports:
- "5555:5555"
depends_on:
- redis
networks:
- celery
Quick Start
terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  redis:
    image: redis:alpine
    container_name: celery-redis
    networks:
      - celery

  celery-worker:
    build: .
    command: celery -A tasks worker --loglevel=info
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
      CELERY_RESULT_BACKEND: redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - celery

  flower:
    image: mher/flower:latest
    container_name: flower
    environment:
      CELERY_BROKER_URL: redis://redis:6379/0
    ports:
      - "5555:5555"
    depends_on:
      - redis
    networks:
      - celery

networks:
  celery:
    driver: bridge
EOF

# 2. Create the .env file
cat > .env << 'EOF'
# Requires your Python app with tasks.py
EOF

# 3. Start the services
docker compose up -d

# 4. View logs
docker compose logs -f
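To confirm tasks flow end to end, enqueue one and wait for its result; a sketch assuming the tasks.py from the overview. Since this compose file does not publish the Redis port to the host, run it inside the worker container (e.g. docker compose exec celery-worker python check.py, assuming check.py is baked into your image):

check.py
from tasks import add

result = add.delay(2, 3)        # enqueue without blocking
print(result.id, result.state)  # PENDING until a worker picks it up
print(result.get(timeout=10))   # blocks on the Redis result backend -> 5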
One-Liner
Run this command to download and set up the recipe in one step:
terminal
curl -fsSL https://docker.recipes/api/recipes/celery-flower/run | bash

Troubleshooting
- Workers not connecting to Redis broker: Verify CELERY_BROKER_URL environment variable matches redis://redis:6379/0 format and Redis container is running
- Tasks stuck in PENDING state: Check worker logs for import errors in tasks.py and ensure task decorators are properly applied
- Flower showing no workers: Confirm celery-worker container started successfully and broker URL in Flower matches worker configuration
- Memory errors in Redis: Increase Docker memory limits or implement Redis maxmemory policies with LRU eviction for result cleanup
- Tasks timing out: Adjust Celery's task_time_limit and task_soft_time_limit settings (the legacy CELERYD_TASK_TIME_LIMIT / CELERYD_TASK_SOFT_TIME_LIMIT names are deprecated); a sketch follows this list
- Flower UI not accessible: Check port 5555 mapping in docker-compose and verify no firewall blocking access to localhost:5555
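For the time-limit issue above, a sketch of the modern settings plus the soft-limit hook (the sleep stands in for real long-running work):

tasks.py (continued)
import time

from celery.exceptions import SoftTimeLimitExceeded

app.conf.task_time_limit = 300       # hard kill after 5 minutes
app.conf.task_soft_time_limit = 240  # SoftTimeLimitExceeded at 4 minutes

@app.task
def long_job():
    try:
        time.sleep(600)              # stand-in for expensive work
    except SoftTimeLimitExceeded:
        pass                         # clean up partial output here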