docker.recipes

Dify

intermediate

Open-source LLM app development platform for building AI-native applications

Overview

Dify is an open-source LLM app development platform that enables developers and non-technical users to build AI-native applications through visual workflow editors and no-code interfaces. Created by LangGenius, Dify abstracts the complexity of working with Large Language Models by providing pre-built components for common AI patterns like chatbots, content generation, and document analysis. The platform supports multiple LLM providers, including OpenAI, Anthropic, and open-source models, making it a vendor-agnostic solution for AI application development.

This Docker stack combines Dify's API backend and web frontend with a data infrastructure designed for AI workloads. PostgreSQL serves as the primary database for application metadata, user management, and workflow configurations, while Redis handles session management, task queues, and communication between the API and worker processes. Weaviate acts as the vector database, storing document embeddings and enabling semantic search for Retrieval-Augmented Generation (RAG) applications. The worker service processes background tasks such as document indexing, model inference, and workflow execution.

This configuration suits AI product teams, startups building AI features, and enterprises that want to democratize AI development across their organization. The visual workflow builder lets product managers and domain experts create sophisticated AI applications without coding, while developers can extend functionality through APIs and custom integrations. The multi-service architecture scales from prototype to production, with each component handling a specific part of the AI application lifecycle.

Key Features

  • Visual workflow editor with drag-and-drop nodes for building complex AI applications without coding
  • Multi-LLM provider support with unified APIs for OpenAI, Anthropic, Azure OpenAI, and open-source models
  • Built-in RAG pipeline with document upload, chunking, embedding generation, and semantic search via Weaviate
  • Conversation memory management using Redis for maintaining context across multi-turn interactions
  • Background worker processing for heavy AI tasks like document indexing and batch inference operations
  • PostgreSQL-backed application versioning with rollback capabilities for workflow iterations
  • Real-time streaming responses with WebSocket support for chat-based applications
  • Built-in prompt engineering tools with variable injection and template management

Common Use Cases

  • Customer support chatbots with company knowledge base integration and handoff to human agents
  • Content generation workflows for marketing teams creating blog posts, social media, and email campaigns
  • Document analysis and summarization systems for legal, financial, and research organizations
  • Internal knowledge management platforms with natural language querying of company documentation
  • E-commerce product recommendation engines combining user behavior with semantic product matching
  • Educational content creation tools for generating quizzes, explanations, and personalized learning paths
  • Code review and documentation assistants for development teams with repository-specific context

Prerequisites

  • Minimum 6GB RAM (2GB for Weaviate, 1GB for PostgreSQL, 1GB for Dify services, 2GB for system overhead)
  • Docker Engine 20.10+ and Docker Compose 2.0+ with support for depends_on conditions (see the quick check after this list)
  • At least one LLM provider API key (OpenAI, Anthropic, etc.) configured after deployment
  • Ports 3000 and 5001 available for web interface and API access respectively
  • Basic understanding of AI concepts like embeddings, vector search, and prompt engineering
  • 50GB+ disk space for document storage, vector indices, and database growth over time
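
Before deploying, the following commands give a quick check of the Docker Engine/Compose versions and the memory and disk headroom listed above. This is a minimal sketch for a Linux host; adjust the memory/disk commands for macOS or Windows:

terminal
# Docker Engine and Compose versions
docker version --format 'Engine: {{.Server.Version}}'
docker compose version

# Available RAM and disk space in the deployment directory
free -h
df -h .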

For development & testing. Review security settings, change default credentials, and test thoroughly before production use.

docker-compose.yml

docker-compose.yml
services:
  api:
    image: langgenius/dify-api:latest
    container_name: dify-api
    restart: unless-stopped
    ports:
      - "${API_PORT:-5001}:5001"
    environment:
      - MODE=api
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - CONSOLE_WEB_URL=http://localhost:${WEB_PORT:-3000}
      - INIT_PASSWORD=${INIT_PASSWORD:-password}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  worker:
    image: langgenius/dify-api:latest
    container_name: dify-worker
    restart: unless-stopped
    environment:
      - MODE=worker
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  web:
    image: langgenius/dify-web:latest
    container_name: dify-web
    restart: unless-stopped
    ports:
      - "${WEB_PORT:-3000}:3000"
    environment:
      - CONSOLE_API_URL=http://api:5001
      - APP_API_URL=http://api:5001

  db:
    image: postgres:15-alpine
    container_name: dify-db
    restart: unless-stopped
    environment:
      - POSTGRES_USER=${DB_USERNAME:-postgres}
      - POSTGRES_PASSWORD=${DB_PASSWORD:-difyai123456}
      - POSTGRES_DB=dify
    volumes:
      - ./volumes/db/data:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine
    container_name: dify-redis
    restart: unless-stopped
    volumes:
      - ./volumes/redis/data:/data

  weaviate:
    image: semitechnologies/weaviate:latest
    container_name: dify-weaviate
    restart: unless-stopped
    environment:
      - QUERY_DEFAULTS_LIMIT=25
      - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true
      - PERSISTENCE_DATA_PATH=/var/lib/weaviate
      - DEFAULT_VECTORIZER_MODULE=none
    volumes:
      - ./volumes/weaviate:/var/lib/weaviate
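
The stack above uses the short depends_on form, which only orders container startup. If you want api and worker to wait until PostgreSQL and Redis are actually ready (the depends_on conditions mentioned in Prerequisites), a minimal sketch looks like the fragment below. The healthcheck commands assume the stock postgres:15-alpine and redis:7-alpine images, and the mapping-style depends_on replaces the list form in both api and worker:

docker-compose.yml (fragment, merged under services:)
  db:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${DB_USERNAME:-postgres} -d dify"]
      interval: 10s
      timeout: 5s
      retries: 5

  redis:
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 5s
      retries: 5

  api:  # repeat the same depends_on block for the worker service
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
      weaviate:
        condition: service_started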

.env Template

.env
# Dify Configuration
WEB_PORT=3000
API_PORT=5001

# Security (generate with: openssl rand -base64 42)
SECRET_KEY=your-secret-key-change-this

# Initial admin password
INIT_PASSWORD=password

# Database
DB_USERNAME=postgres
DB_PASSWORD=difyai123456

# Optional: OpenAI API Key for built-in models
# OPENAI_API_KEY=sk-your-key
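
Replace the placeholder SECRET_KEY before the first start. One way to do this from a Linux shell, using the openssl command suggested in the comment above (the sed -i syntax shown is GNU sed; on macOS use sed -i '' instead):

terminal
# Print a strong random secret
openssl rand -base64 42

# Or write one straight into .env
sed -i "s|^SECRET_KEY=.*|SECRET_KEY=$(openssl rand -base64 42)|" .env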

Usage Notes

  1. Web console at http://localhost:3000
  2. API endpoint at http://localhost:5001 (see the example request after this list)
  3. First login: admin@example.com / password (set via INIT_PASSWORD)
  4. Add LLM providers under Settings > Model Providers
  5. Build apps with the visual workflow editor
  6. Create RAG knowledge bases by uploading documents
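
Once an app is published, it can be called over the API endpoint from note 2. The request below is a sketch of a blocking chat call following Dify's app API; the app-xxxxxxxx key is a placeholder for the API key created for your app in the console (menu naming may differ by Dify version):

terminal
curl -X POST "http://localhost:5001/v1/chat-messages" \
  -H "Authorization: Bearer app-xxxxxxxx" \
  -H "Content-Type: application/json" \
  -d '{"inputs": {}, "query": "Hello, Dify!", "response_mode": "blocking", "user": "demo-user"}'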

Individual Services (6 services)

Copy individual services to mix and match with your existing compose files.
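
If you would rather keep the Dify services in their own file than paste them into an existing one, Compose can merge multiple files at runtime (later files extend or override earlier ones). The dify-services.yml name below is just an example:

terminal
docker compose -f docker-compose.yml -f dify-services.yml up -d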

api
api:
  image: langgenius/dify-api:latest
  container_name: dify-api
  restart: unless-stopped
  ports:
    - ${API_PORT:-5001}:5001
  environment:
    - MODE=api
    - LOG_LEVEL=INFO
    - SECRET_KEY=${SECRET_KEY}
    - CONSOLE_WEB_URL=http://localhost:${WEB_PORT:-3000}
    - INIT_PASSWORD=${INIT_PASSWORD:-password}
    - DB_USERNAME=${DB_USERNAME:-postgres}
    - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
    - DB_HOST=db
    - DB_PORT=5432
    - DB_DATABASE=dify
    - REDIS_HOST=redis
    - REDIS_PORT=6379
    - VECTOR_STORE=weaviate
    - WEAVIATE_ENDPOINT=http://weaviate:8080
  volumes:
    - ./volumes/app/storage:/app/api/storage
  depends_on:
    - db
    - redis
    - weaviate
worker
worker:
  image: langgenius/dify-api:latest
  container_name: dify-worker
  restart: unless-stopped
  environment:
    - MODE=worker
    - LOG_LEVEL=INFO
    - SECRET_KEY=${SECRET_KEY}
    - DB_USERNAME=${DB_USERNAME:-postgres}
    - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
    - DB_HOST=db
    - DB_PORT=5432
    - DB_DATABASE=dify
    - REDIS_HOST=redis
    - REDIS_PORT=6379
    - VECTOR_STORE=weaviate
    - WEAVIATE_ENDPOINT=http://weaviate:8080
  volumes:
    - ./volumes/app/storage:/app/api/storage
  depends_on:
    - db
    - redis
    - weaviate
web
web:
  image: langgenius/dify-web:latest
  container_name: dify-web
  restart: unless-stopped
  ports:
    - ${WEB_PORT:-3000}:3000
  environment:
    - CONSOLE_API_URL=http://api:5001
    - APP_API_URL=http://api:5001
db
db:
  image: postgres:15-alpine
  container_name: dify-db
  restart: unless-stopped
  environment:
    - POSTGRES_USER=${DB_USERNAME:-postgres}
    - POSTGRES_PASSWORD=${DB_PASSWORD:-difyai123456}
    - POSTGRES_DB=dify
  volumes:
    - ./volumes/db/data:/var/lib/postgresql/data
redis
redis:
  image: redis:7-alpine
  container_name: dify-redis
  restart: unless-stopped
  volumes:
    - ./volumes/redis/data:/data
weaviate
weaviate:
  image: semitechnologies/weaviate:latest
  container_name: dify-weaviate
  restart: unless-stopped
  environment:
    - QUERY_DEFAULTS_LIMIT=25
    - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true
    - PERSISTENCE_DATA_PATH=/var/lib/weaviate
    - DEFAULT_VECTORIZER_MODULE=none
  volumes:
    - ./volumes/weaviate:/var/lib/weaviate

Quick Start

terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  api:
    image: langgenius/dify-api:latest
    container_name: dify-api
    restart: unless-stopped
    ports:
      - "${API_PORT:-5001}:5001"
    environment:
      - MODE=api
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - CONSOLE_WEB_URL=http://localhost:${WEB_PORT:-3000}
      - INIT_PASSWORD=${INIT_PASSWORD:-password}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  worker:
    image: langgenius/dify-api:latest
    container_name: dify-worker
    restart: unless-stopped
    environment:
      - MODE=worker
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  web:
    image: langgenius/dify-web:latest
    container_name: dify-web
    restart: unless-stopped
    ports:
      - "${WEB_PORT:-3000}:3000"
    environment:
      - CONSOLE_API_URL=http://api:5001
      - APP_API_URL=http://api:5001

  db:
    image: postgres:15-alpine
    container_name: dify-db
    restart: unless-stopped
    environment:
      - POSTGRES_USER=${DB_USERNAME:-postgres}
      - POSTGRES_PASSWORD=${DB_PASSWORD:-difyai123456}
      - POSTGRES_DB=dify
    volumes:
      - ./volumes/db/data:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine
    container_name: dify-redis
    restart: unless-stopped
    volumes:
      - ./volumes/redis/data:/data

  weaviate:
    image: semitechnologies/weaviate:latest
    container_name: dify-weaviate
    restart: unless-stopped
    environment:
      - QUERY_DEFAULTS_LIMIT=25
      - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true
      - PERSISTENCE_DATA_PATH=/var/lib/weaviate
      - DEFAULT_VECTORIZER_MODULE=none
    volumes:
      - ./volumes/weaviate:/var/lib/weaviate
EOF

# 2. Create the .env file
cat > .env << 'EOF'
# Dify Configuration
WEB_PORT=3000
API_PORT=5001

# Security (generate with: openssl rand -base64 42)
SECRET_KEY=your-secret-key-change-this

# Initial admin password
INIT_PASSWORD=password

# Database
DB_USERNAME=postgres
DB_PASSWORD=difyai123456

# Optional: OpenAI API Key for built-in models
# OPENAI_API_KEY=sk-your-key
EOF

# 3. Start the services
docker compose up -d

# 4. View logs
docker compose logs -f
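
Once the containers are up, a quick sanity check: all six services should report a running state, and the API log should settle after its startup migrations:

terminal
docker compose ps
docker compose logs --tail=50 api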

One-Liner

Run this command to download and set up the recipe in one step:

terminal
curl -fsSL https://docker.recipes/api/recipes/dify/run | bash
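
If you prefer to inspect the script before executing it, download it from the same URL first (dify-setup.sh is an arbitrary local filename):

terminal
curl -fsSL https://docker.recipes/api/recipes/dify/run -o dify-setup.sh
less dify-setup.sh
bash dify-setup.sh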

Troubleshooting

  • Weaviate startup fails with 'permission denied' error: Ensure ./volumes/weaviate directory has proper ownership with 'sudo chown -R 999:999 ./volumes/weaviate'
  • Dify API returns 500 errors on knowledge base operations: Check Weaviate connectivity and verify WEAVIATE_ENDPOINT environment variable points to http://weaviate:8080
  • Document upload fails with timeout errors: Increase worker memory allocation and check that both api and worker containers can access shared storage volume
  • PostgreSQL connection refused during startup: Verify DB_PASSWORD matches between api/worker services and postgres POSTGRES_PASSWORD, then restart dependent services
  • Redis memory usage grows continuously: Configure a Redis maxmemory policy by adding 'command: redis-server --maxmemory 512mb --maxmemory-policy allkeys-lru' to the redis service (see the snippet after this list)
  • Web interface shows 'API connection failed': Verify CONSOLE_API_URL in web service matches the internal API service name and port (http://api:5001)
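
For the Redis memory note above, the bounded configuration looks like this when merged into the compose file. The 512mb limit is an assumption to size for your workload; and since this Redis also backs Dify's task queue, volatile-lru is a safer eviction policy if you notice queued jobs disappearing:

redis (fragment)
  redis:
    image: redis:7-alpine
    container_name: dify-redis
    restart: unless-stopped
    # cap memory and evict least-recently-used keys when the cap is hit
    command: redis-server --maxmemory 512mb --maxmemory-policy allkeys-lru
    volumes:
      - ./volumes/redis/data:/data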


Download Recipe Kit

Get all files in a ready-to-deploy package

Includes docker-compose.yml, .env template, README, and license

Components

dify-api, dify-web, postgres, redis, weaviate

Tags

#ai #llm #rag #chatbot #workflow #no-code

Category

AI & Machine Learning