docker.recipes

Redash Query & Visualization

intermediate

Connect to any data source, query, visualize, and share your data.

Overview

Redash is an open-source business intelligence and data visualization platform that enables organizations to connect to multiple data sources, write SQL queries, create visualizations, and build interactive dashboards. Originally developed by Arik Fraimovich and later acquired by Databricks, Redash democratizes data access by providing a web-based interface that allows both technical and non-technical users to explore data through SQL queries and share insights through compelling visualizations. Its strength lies in supporting dozens of data sources including PostgreSQL, MySQL, BigQuery, Redshift, Elasticsearch, and APIs, making it a versatile choice for organizations with diverse data infrastructure.

This Docker stack combines Redash with PostgreSQL as the metadata database, Redis for caching and job queue management, and a multi-service architecture that includes separate containers for the web application, scheduler, and worker processes. PostgreSQL stores Redash's internal data including user accounts, query definitions, dashboard configurations, and cached query results, while Redis handles the distributed task queue system that powers Redash's background query execution and scheduled refresh capabilities. The scheduler service manages automated query runs and dashboard refreshes, while worker containers execute queries against your connected data sources in parallel, preventing long-running queries from blocking the web interface.

This configuration is ideal for data teams, business analysts, and organizations seeking to establish a centralized analytics platform without the complexity and cost of enterprise BI solutions like Tableau or Looker. The stack provides production-grade capabilities including user management, query scheduling, alert systems, and API access, making it suitable for teams ranging from startups building their first data culture to enterprises consolidating multiple reporting tools into a unified platform.

Key Features

  • Multi-data source connectivity supporting 40+ databases and APIs including PostgreSQL, MySQL, BigQuery, Redshift, Elasticsearch, and MongoDB
  • Distributed query execution with Redis-backed job queues allowing parallel processing of multiple queries without blocking the web interface
  • Automated query scheduling with configurable refresh intervals for keeping dashboards current with live data
  • Interactive visualization library with chart types including line graphs, bar charts, pivot tables, cohort analysis, and geographic maps
  • User permission system with organization-level access control, query sharing, and dashboard publication workflows
  • Query result caching in PostgreSQL reducing load on source databases and improving dashboard loading times
  • Alert system with threshold-based notifications via email, Slack, webhooks, and other integrations when query results meet specified conditions
  • REST API for programmatic access to queries, dashboards, and data sources enabling custom integrations and automated workflows
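The REST API mentioned above can be exercised with plain curl; a minimal sketch, where REDASH_URL and REDASH_API_KEY are placeholders you must fill in (a user API key is shown on the Redash profile page):

```shell
# List saved queries via the Redash REST API.
# REDASH_URL and REDASH_API_KEY are placeholders -- substitute your own;
# Redash expects the user API key in an "Authorization: Key ..." header.
REDASH_URL="http://localhost:5000"
REDASH_API_KEY="your_user_api_key"

curl -s -H "Authorization: Key ${REDASH_API_KEY}" \
  "${REDASH_URL}/api/queries"
```

The same pattern works for `/api/dashboards` and `/api/data_sources`, which is enough to script exports or drive dashboards from CI.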

Common Use Cases

  • Startup analytics platform for tracking KPIs across multiple SaaS tools and databases without expensive BI software licensing
  • E-commerce business intelligence combining sales data from PostgreSQL, marketing metrics from APIs, and inventory data from MySQL
  • DevOps monitoring dashboards connecting to Elasticsearch logs, application databases, and infrastructure APIs for operational visibility
  • Financial reporting system aggregating data from accounting software APIs, payment processors, and internal transaction databases
  • Customer success analytics combining CRM data, product usage metrics, and support ticket information for churn prediction
  • Marketing campaign analysis integrating Google Analytics, social media APIs, email marketing platforms, and conversion tracking databases
  • Executive reporting platform providing scheduled dashboard delivery and automated alerts for key business metrics across departments

Prerequisites

  • Minimum 2GB RAM for the complete stack (512MB for Redash services, 1GB for PostgreSQL, 256MB for Redis, 256MB for system overhead)
  • Available port 5000 for Redash web interface access from external clients
  • Basic SQL knowledge for writing queries against your data sources, as Redash is query-centric rather than drag-and-drop
  • Administrative access to target data sources including database credentials, API keys, and network connectivity for Redash to connect
  • Understanding of environment variable configuration for database credentials, secret keys, and Redis connection strings
  • Docker Compose 2.0+ with support for health check conditions and multi-service dependencies

For development & testing. Review security settings, change default credentials, and test thoroughly before production use.

docker-compose.yml

docker-compose.yml
services:
  redash:
    image: redash/redash:latest
    ports:
      - "5000:5000"
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_started
    networks:
      - redash-net
    restart: unless-stopped

  redash-scheduler:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "celery"
      WORKERS_COUNT: 1
    command: scheduler
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  redash-worker:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2
    command: worker
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - redash-net
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    networks:
      - redash-net
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:

networks:
  redash-net:
    driver: bridge

.env Template

.env
# Redash Secrets -- generate real values with: openssl rand -hex 32
# (Docker Compose reads .env literally; $(command) substitution does NOT run here)
COOKIE_SECRET=replace_with_64_char_hex_value
SECRET_KEY=replace_with_64_char_hex_value

# PostgreSQL
POSTGRES_USER=redash
POSTGRES_PASSWORD=secure_postgres_password
POSTGRES_DB=redash
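Docker Compose reads `.env` literally and performs no command substitution, so the secret values must be generated beforehand; a minimal sketch, assuming `openssl` is installed:

```shell
# Generate two random 64-character hex secrets for Redash.
COOKIE_SECRET=$(openssl rand -hex 32)
SECRET_KEY=$(openssl rand -hex 32)

# Append them to .env (run once; edit .env afterwards if needed)
printf 'COOKIE_SECRET=%s\nSECRET_KEY=%s\n' "$COOKIE_SECRET" "$SECRET_KEY" >> .env
```

Any sufficiently random strings work; `openssl rand -hex 32` is just a convenient source.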

Usage Notes

  1. Initialize the metadata database (first run only): docker compose run --rm redash create_db
  2. Open Redash at http://localhost:5000 and create the admin account
  3. Connect your SQL databases and APIs as data sources
  4. Build dashboards from saved queries

Individual Services (5 services)

Copy individual services to mix and match with your existing compose files.

redash
redash:
  image: redash/redash:latest
  ports:
    - "5000:5000"
  environment:
    PYTHONUNBUFFERED: 0
    REDASH_LOG_LEVEL: INFO
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
  depends_on:
    postgres:
      condition: service_healthy
    redis:
      condition: service_started
  networks:
    - redash-net
  restart: unless-stopped
redash-scheduler
redash-scheduler:
  image: redash/redash:latest
  environment:
    PYTHONUNBUFFERED: 0
    REDASH_LOG_LEVEL: INFO
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
    QUEUES: celery
    WORKERS_COUNT: 1
  command: scheduler
  depends_on:
    - redash
  networks:
    - redash-net
  restart: unless-stopped
redash-worker
redash-worker:
  image: redash/redash:latest
  environment:
    PYTHONUNBUFFERED: 0
    REDASH_LOG_LEVEL: INFO
    REDASH_REDIS_URL: redis://redis:6379/0
    REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
    REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
    REDASH_SECRET_KEY: ${SECRET_KEY}
    QUEUES: queries,scheduled_queries,celery
    WORKERS_COUNT: 2
  command: worker
  depends_on:
    - redash
  networks:
    - redash-net
  restart: unless-stopped
postgres
postgres:
  image: postgres:16-alpine
  environment:
    POSTGRES_USER: ${POSTGRES_USER}
    POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    POSTGRES_DB: ${POSTGRES_DB}
  volumes:
    - postgres_data:/var/lib/postgresql/data
  healthcheck:
    test:
      - CMD-SHELL
      - pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}
    interval: 10s
    timeout: 5s
    retries: 5
  networks:
    - redash-net
  restart: unless-stopped
redis
redis:
  image: redis:7-alpine
  volumes:
    - redis_data:/data
  networks:
    - redash-net
  restart: unless-stopped

Quick Start

terminal
# 1. Create the compose file (quoted 'EOF' keeps ${VAR} literal for Compose)
cat > docker-compose.yml << 'EOF'
services:
  redash:
    image: redash/redash:latest
    ports:
      - "5000:5000"
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
    depends_on:
      postgres:
        condition: service_healthy
      redis:
        condition: service_started
    networks:
      - redash-net
    restart: unless-stopped

  redash-scheduler:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "celery"
      WORKERS_COUNT: 1
    command: scheduler
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  redash-worker:
    image: redash/redash:latest
    environment:
      PYTHONUNBUFFERED: 0
      REDASH_LOG_LEVEL: INFO
      REDASH_REDIS_URL: redis://redis:6379/0
      REDASH_DATABASE_URL: postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@postgres:5432/${POSTGRES_DB}
      REDASH_COOKIE_SECRET: ${COOKIE_SECRET}
      REDASH_SECRET_KEY: ${SECRET_KEY}
      QUEUES: "queries,scheduled_queries,celery"
      WORKERS_COUNT: 2
    command: worker
    depends_on:
      - redash
    networks:
      - redash-net
    restart: unless-stopped

  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 10s
      timeout: 5s
      retries: 5
    networks:
      - redash-net
    restart: unless-stopped

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    networks:
      - redash-net
    restart: unless-stopped

volumes:
  postgres_data:
  redis_data:

networks:
  redash-net:
    driver: bridge
EOF

# 2. Create the .env file (unquoted EOF so the secrets are generated now)
cat > .env << EOF
# Redash Secrets
COOKIE_SECRET=$(openssl rand -hex 32)
SECRET_KEY=$(openssl rand -hex 32)

# PostgreSQL
POSTGRES_USER=redash
POSTGRES_PASSWORD=secure_postgres_password
POSTGRES_DB=redash
EOF

# 3. Start the services
docker compose up -d

# 4. Initialize the Redash metadata database (first run only)
docker compose run --rm redash create_db

# 5. View logs
docker compose logs -f

One-Liner

Run this command to download and set up the recipe in one step:

terminal
1curl -fsSL https://docker.recipes/api/recipes/redash-analytics/run | bash

Troubleshooting

  • Redash web interface shows 'Internal Server Error': Check PostgreSQL connection by verifying REDASH_DATABASE_URL environment variable and ensure postgres container is healthy with pg_isready
  • Queries never complete or show perpetual 'Running' status: Restart redash-worker container and check Redis connectivity, as worker processes handle query execution through Redis job queues
  • Dashboard refresh fails with 'Scheduled query failed': Check redash-scheduler container logs and verify data source connectivity from within the Docker network
  • High memory usage in PostgreSQL container: Tune shared_buffers and work_mem settings, or increase query result cache TTL to reduce frequent large query re-execution
  • Redis connection errors in Redash logs: Verify REDASH_REDIS_URL format matches 'redis://redis:6379/0' and ensure redis container is accessible on redash-net network
  • Permission denied when connecting to external data sources: Ensure data source allows connections from Docker container IP ranges and verify firewall rules for outbound database connections
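For most of the issues above, a first pass of diagnostics can be run from the compose directory; a sketch (the `-U redash` user matches the .env template default):

```shell
# Which containers are up, restarting, or exited?
docker compose ps

# Worker errors -- the usual cause of queries stuck in 'Running'
docker compose logs --tail=50 redash-worker

# Metadata database health (user matches the .env default)
docker compose exec postgres pg_isready -U redash

# Queue backend health -- should print PONG
docker compose exec redis redis-cli ping
```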

