Redash Query & Visualization
Connect to any data source, query, visualize, and share your data.
Overview
Redash is an open-source business intelligence and data visualization platform that enables organizations to connect to multiple data sources, write SQL queries, create visualizations, and build interactive dashboards. Originally developed by Arik Fraimovich and later acquired by Databricks, Redash democratizes data access by providing a web-based interface that allows both technical and non-technical users to explore data through SQL queries and share insights through compelling visualizations. Its strength lies in supporting dozens of data sources including PostgreSQL, MySQL, BigQuery, Redshift, Elasticsearch, and APIs, making it a versatile choice for organizations with diverse data infrastructure.
This Docker stack combines Redash with PostgreSQL as the metadata database, Redis for caching and job queue management, and a multi-service architecture that includes separate containers for the web application, scheduler, and worker processes. PostgreSQL stores Redash's internal data including user accounts, query definitions, dashboard configurations, and cached query results, while Redis handles the distributed task queue system that powers Redash's background query execution and scheduled refresh capabilities. The scheduler service manages automated query runs and dashboard refreshes, while worker containers execute queries against your connected data sources in parallel, preventing long-running queries from blocking the web interface.
This configuration is ideal for data teams, business analysts, and organizations seeking to establish a centralized analytics platform without the complexity and cost of enterprise BI solutions like Tableau or Looker. The stack provides production-grade capabilities including user management, query scheduling, alert systems, and API access, making it suitable for teams ranging from startups building their first data culture to enterprises consolidating multiple reporting tools into a unified platform.
Key Features
- Multi-data source connectivity supporting 40+ databases and APIs including PostgreSQL, MySQL, BigQuery, Redshift, Elasticsearch, and MongoDB
- Distributed query execution with Redis-backed job queues allowing parallel processing of multiple queries without blocking the web interface
- Automated query scheduling with configurable refresh intervals for keeping dashboards current with live data
- Interactive visualization library with chart types including line graphs, bar charts, pivot tables, cohort analysis, and geographic maps
- User permission system with organization-level access control, query sharing, and dashboard publication workflows
- Query result caching in PostgreSQL reducing load on source databases and improving dashboard loading times
- Alert system with threshold-based notifications via email, Slack, webhooks, and other integrations when query results meet specified conditions
- REST API for programmatic access to queries, dashboards, and data sources enabling custom integrations and automated workflows
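As a sketch of the last point, listing saved queries over the REST API looks roughly like this. The API key is created per-user in the Redash UI, and the placeholder values below are assumptions, not part of this recipe:

```shell
# Hypothetical key; create a real one under your user profile in the Redash UI.
REDASH_API_KEY="your_api_key_here"
REDASH_URL="http://localhost:5000"

# Redash authenticates API clients with an "Authorization: Key <key>" header;
# /api/queries returns a paginated JSON list of saved queries.
curl -s -H "Authorization: Key ${REDASH_API_KEY}" \
  "${REDASH_URL}/api/queries?page=1&page_size=25" | head -n 20  # truncate for readability
```

The same header works for dashboards (`/api/dashboards`) and data sources (`/api/data_sources`).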
Common Use Cases
- Startup analytics platform for tracking KPIs across multiple SaaS tools and databases without expensive BI software licensing
- E-commerce business intelligence combining sales data from PostgreSQL, marketing metrics from APIs, and inventory data from MySQL
- DevOps monitoring dashboards connecting to Elasticsearch logs, application databases, and infrastructure APIs for operational visibility
- Financial reporting system aggregating data from accounting software APIs, payment processors, and internal transaction databases
- Customer success analytics combining CRM data, product usage metrics, and support ticket information for churn prediction
- Marketing campaign analysis integrating Google Analytics, social media APIs, email marketing platforms, and conversion tracking databases
- Executive reporting platform providing scheduled dashboard delivery and automated alerts for key business metrics across departments
Prerequisites
- Minimum 2GB RAM for the complete stack (512MB for Redash services, 1GB for PostgreSQL, 256MB for Redis, 256MB for system overhead)
- Available port 5000 on the host for external access to the Redash web interface
- Basic SQL knowledge for writing queries against your data sources, as Redash is query-centric rather than drag-and-drop
- Administrative access to target data sources including database credentials, API keys, and network connectivity for Redash to connect
- Understanding of environment variable configuration for database credentials, secret keys, and Redis connection strings
- Docker Compose 2.0+ with support for health check conditions and multi-service dependencies
For development & testing. Review security settings, change default credentials, and test thoroughly before production use.
docker-compose.yml
services:
  redash:
    image: redash/redash:latest
    ports:
      - "5000:5000"
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_started
    networks:
      - redash-net
    restart: unless-stopped

  redash-scheduler:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "celery"
      WORKERS_COUNT: 1
    command: scheduler
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  redash-worker:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2
    command: worker
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - redash-net
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    networks:
      - redash-net
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:

networks:
  redash-net:
    driver: bridge
.env Template
.env
# Redash Secrets
# Generate each value with: openssl rand -hex 32
# (Docker Compose reads .env literally; $(...) is not expanded here)
COOKIE_SECRET=replace_with_64_char_hex_string
SECRET_KEY=replace_with_64_char_hex_string

# PostgreSQL
POSTGRES_USER=redash
POSTGRES_PASSWORD=secure_postgres_password
POSTGRES_DB=redash
Usage Notes
- Redash at http://localhost:5000
- Create database: docker compose run --rm redash create_db
- Connect to any SQL database or API
- Build dashboards from saved queries
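One caveat on the .env template: Docker Compose reads .env values literally and does not run command substitution, so `$(openssl rand -hex 32)` cannot be used inside the file itself. A sketch that generates the secrets in the shell first and writes their literal values (the Postgres password below is a placeholder to change):

```shell
# Generate the secrets now, then write their literal values into .env.
COOKIE_SECRET=$(openssl rand -hex 32)
SECRET_KEY=$(openssl rand -hex 32)

cat > .env <<EOF
# Redash Secrets
COOKIE_SECRET=${COOKIE_SECRET}
SECRET_KEY=${SECRET_KEY}

# PostgreSQL
POSTGRES_USER=redash
POSTGRES_PASSWORD=secure_postgres_password
POSTGRES_DB=redash
EOF
```

Because the heredoc delimiter is unquoted, the `${...}` references expand immediately and the resulting .env contains plain values that Compose can read.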
Individual Services (5 services)
Copy individual services to mix and match with your existing compose files.
redash
redash:
  image: redash/redash:latest
  ports:
    - "5000:5000"
  environment:
    PYTHONUNBUFFERED: 0
    REDASH_LOG_LEVEL: INFO
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
  depends_on:
    postgres:
      condition: service_healthy
    redis:
      condition: service_started
  networks:
    - redash-net
  restart: unless-stopped
redash-scheduler
redash-scheduler:
  image: redash/redash:latest
  environment:
    PYTHONUNBUFFERED: 0
    REDASH_LOG_LEVEL: INFO
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
    QUEUES: celery
    WORKERS_COUNT: 1
  command: scheduler
  depends_on:
    - redash
  networks:
    - redash-net
  restart: unless-stopped
redash-worker
redash-worker:
  image: redash/redash:latest
  environment:
    PYTHONUNBUFFERED: 0
    REDASH_LOG_LEVEL: INFO
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
    QUEUES: queries,scheduled_queries,celery
    WORKERS_COUNT: 2
  command: worker
  depends_on:
    - redash
  networks:
    - redash-net
  restart: unless-stopped
postgres
postgres:
  image: postgres:16-alpine
  environment:
    POSTGRES_USER: ${POSTGRES_USER}
    POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    POSTGRES_DB: ${POSTGRES_DB}
  volumes:
    - postgres_data:/var/lib/postgresql/data
  healthcheck:
    test:
      - CMD-SHELL
      - pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}
    interval: 10s
    timeout: 5s
    retries: 5
  networks:
    - redash-net
  restart: unless-stopped
redis
redis:
  image: redis:7-alpine
  volumes:
    - redis_data:/data
  networks:
    - redash-net
  restart: unless-stopped
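The single redash-worker above drains all three queues. If heavy scheduled refreshes starve interactive queries, one common pattern (a sketch, not part of the recipe; the service names are illustrative and the environment mirrors redash-worker) is to dedicate a worker per queue class:

```yaml
# Illustrative split: one worker for interactive queries, one for
# scheduled refreshes and maintenance tasks. Each service needs the
# same REDASH_* environment as redash-worker above.
redash-adhoc-worker:
  image: redash/redash:latest
  command: worker
  environment:
    QUEUES: queries                    # interactive queries only
    WORKERS_COUNT: 2
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
  networks:
    - redash-net
  restart: unless-stopped

redash-scheduled-worker:
  image: redash/redash:latest
  command: worker
  environment:
    QUEUES: scheduled_queries,celery   # background refreshes + maintenance
    WORKERS_COUNT: 2
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
  networks:
    - redash-net
  restart: unless-stopped
```

With this split, a backlog of scheduled refreshes queues up behind the scheduled worker without delaying queries run interactively from the editor.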
Quick Start
terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  redash:
    image: redash/redash:latest
    ports:
      - "5000:5000"
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_started
    networks:
      - redash-net
    restart: unless-stopped

  redash-scheduler:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "celery"
      WORKERS_COUNT: 1
    command: scheduler
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  redash-worker:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2
    command: worker
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - redash-net
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    networks:
      - redash-net
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:

networks:
  redash-net:
    driver: bridge
EOF

# 2. Create the .env file (the heredoc delimiter is unquoted so the
#    openssl substitutions expand now and literal secrets are written)
cat > .env << EOF
# Redash Secrets
COOKIE_SECRET=$(openssl rand -hex 32)
SECRET_KEY=$(openssl rand -hex 32)

# PostgreSQL
POSTGRES_USER=redash
POSTGRES_PASSWORD=secure_postgres_password
POSTGRES_DB=redash
EOF

# 3. Initialize the Redash metadata database (first run only)
docker compose run --rm redash create_db

# 4. Start the services
docker compose up -d

# 5. View logs
docker compose logs -f
One-Liner
Run this command to download and set up the recipe in one step:
terminal
curl -fsSL https://docker.recipes/api/recipes/redash-analytics/run | bash
Troubleshooting
- Redash web interface shows 'Internal Server Error': Check PostgreSQL connection by verifying REDASH_DATABASE_URL environment variable and ensure postgres container is healthy with pg_isready
- Queries never complete or show perpetual 'Running' status: Restart redash-worker container and check Redis connectivity, as worker processes handle query execution through Redis job queues
- Dashboard refresh fails with 'Scheduled query failed': Check redash-scheduler container logs and verify data source connectivity from within the Docker network
- High memory usage in PostgreSQL container: Tune shared_buffers and work_mem settings, or increase query result cache TTL to reduce frequent large query re-execution
- Redis connection errors in Redash logs: Verify REDASH_REDIS_URL format matches 'redis://redis:6379/0' and ensure redis container is accessible on redash-net network
- Permission denied when connecting to external data sources: Ensure data source allows connections from Docker container IP ranges and verify firewall rules for outbound database connections
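For the Redis and database connection errors above, a quick local sanity check of the connection-string shapes (a sketch; substitute the values from your .env) can rule out simple typos before digging into container logs:

```shell
# Expected shapes: redis://<host>:<port>/<db> and
# postgresql://<user>:<password>@<host>:<port>/<dbname>
REDASH_REDIS_URL="redis://redis:6379/0"
REDASH_DATABASE_URL="postgresql://redash:secret@postgres:5432/redash"

case "$REDASH_REDIS_URL" in
  redis://*:[0-9]*/[0-9]*) echo "redis URL format looks OK" ;;
  *) echo "unexpected REDASH_REDIS_URL: $REDASH_REDIS_URL" >&2 ;;
esac

case "$REDASH_DATABASE_URL" in
  postgresql://*:*@*:[0-9]*/*) echo "database URL format looks OK" ;;
  *) echo "unexpected REDASH_DATABASE_URL: $REDASH_DATABASE_URL" >&2 ;;
esac
```

Note that the hostnames (`redis`, `postgres`) are Compose service names, so they only resolve from containers attached to redash-net, not from the host.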