Dify
Open-source LLM app development platform for building AI-native applications
Overview
Dify is an open-source LLM app development platform that enables developers and non-technical users to build AI-native applications through visual workflow editors and no-code interfaces. Created by LangGenius, Dify abstracts the complexity of working with Large Language Models by providing pre-built components for common AI patterns like chatbots, content generation, and document analysis. The platform supports multiple LLM providers including OpenAI, Anthropic, and open-source models, making it a vendor-agnostic solution for AI application development.
This Docker stack combines Dify's API backend and web frontend with a robust data infrastructure designed specifically for AI workloads. PostgreSQL serves as the primary database for application metadata, user management, and workflow configurations, while Redis handles session management, task queues, and real-time communication between the API and worker processes. Weaviate acts as the vector database, storing document embeddings and enabling semantic search capabilities for Retrieval-Augmented Generation (RAG) applications. The worker service processes background tasks like document indexing, model inference, and workflow execution.
This configuration is ideal for AI product teams, startups building AI features, and enterprises wanting to democratize AI development across their organization. The visual workflow builder allows product managers and domain experts to create sophisticated AI applications without coding, while developers can extend functionality through APIs and custom integrations. The multi-service architecture ensures scalability from prototype to production, with each component handling specific aspects of the AI application lifecycle.
Key Features
- Visual workflow editor with drag-and-drop nodes for building complex AI applications without coding
- Multi-LLM provider support with unified APIs for OpenAI, Anthropic, Azure OpenAI, and open-source models
- Built-in RAG pipeline with document upload, chunking, embedding generation, and semantic search via Weaviate
- Conversation memory management using Redis for maintaining context across multi-turn interactions
- Background worker processing for heavy AI tasks like document indexing and batch inference operations
- PostgreSQL-backed application versioning with rollback capabilities for workflow iterations
- Real-time streaming responses with WebSocket support for chat-based applications (see the API call sketch after this list)
- Built-in prompt engineering tools with variable injection and template management
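The streaming behavior listed above can be exercised directly against the API service once an app has been created in the console. The sketch below is a minimal example, assuming you have copied the app's API key from its API Access page; the endpoint and request fields follow Dify's published service API, and the key value is a placeholder.
terminal
# Minimal sketch: call a Dify app's chat API with streaming enabled.
# API_KEY is a hypothetical placeholder for the app key from the console.
API_KEY="app-your-app-api-key"

curl -N -X POST "http://localhost:5001/v1/chat-messages" \
  -H "Authorization: Bearer ${API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{
        "inputs": {},
        "query": "Summarize our refund policy.",
        "response_mode": "streaming",
        "user": "demo-user"
      }'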
Common Use Cases
- Customer support chatbots with company knowledge base integration and handoff to human agents
- Content generation workflows for marketing teams creating blog posts, social media, and email campaigns
- Document analysis and summarization systems for legal, financial, and research organizations
- Internal knowledge management platforms with natural language querying of company documentation
- E-commerce product recommendation engines combining user behavior with semantic product matching
- Educational content creation tools for generating quizzes, explanations, and personalized learning paths
- Code review and documentation assistants for development teams with repository-specific context
Prerequisites
- Minimum 6GB RAM (2GB for Weaviate, 1GB for PostgreSQL, 1GB for Dify services, 2GB for system overhead)
- Docker Engine 20.10+ and Docker Compose 2.0+ with support for depends_on conditions
- At least one LLM provider API key (OpenAI, Anthropic, etc.) configured after deployment
- Ports 3000 and 5001 available for web interface and API access respectively
- Basic understanding of AI concepts like embeddings, vector search, and prompt engineering
- 50GB+ disk space for document storage, vector indices, and database growth over time (the preflight check after this list covers these requirements)
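A quick way to confirm the requirements above before deploying is a short preflight check. This is a hedged sketch for Linux hosts; `free` and `ss` behave differently or are absent on macOS, so adjust accordingly.
terminal
# Preflight check (Linux; adjust for macOS). Verifies Docker and Compose
# versions, available memory and disk, and that the default ports are free.
docker --version
docker compose version

free -h          # want roughly 6GB of RAM available for this stack
df -h .          # want 50GB+ free in the deployment directory

# Ports 3000 and 5001 should not be in use by anything else
ss -ltn | grep -E ':(3000|5001)\b' || echo "Ports 3000 and 5001 look free"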
For development & testing only. Review security settings, change default credentials, and test thoroughly before production use.
docker-compose.yml
services:
  api:
    image: langgenius/dify-api:latest
    container_name: dify-api
    restart: unless-stopped
    ports:
      - "${API_PORT:-5001}:5001"
    environment:
      - MODE=api
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - CONSOLE_WEB_URL=http://localhost:${WEB_PORT:-3000}
      - INIT_PASSWORD=${INIT_PASSWORD:-password}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  worker:
    image: langgenius/dify-api:latest
    container_name: dify-worker
    restart: unless-stopped
    environment:
      - MODE=worker
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  web:
    image: langgenius/dify-web:latest
    container_name: dify-web
    restart: unless-stopped
    ports:
      - "${WEB_PORT:-3000}:3000"
    environment:
      - CONSOLE_API_URL=http://api:5001
      - APP_API_URL=http://api:5001

  db:
    image: postgres:15-alpine
    container_name: dify-db
    restart: unless-stopped
    environment:
      - POSTGRES_USER=${DB_USERNAME:-postgres}
      - POSTGRES_PASSWORD=${DB_PASSWORD:-difyai123456}
      - POSTGRES_DB=dify
    volumes:
      - ./volumes/db/data:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine
    container_name: dify-redis
    restart: unless-stopped
    volumes:
      - ./volumes/redis/data:/data

  weaviate:
    image: semitechnologies/weaviate:latest
    container_name: dify-weaviate
    restart: unless-stopped
    environment:
      - QUERY_DEFAULTS_LIMIT=25
      - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true
      - PERSISTENCE_DATA_PATH=/var/lib/weaviate
      - DEFAULT_VECTORIZER_MODULE=none
    volumes:
      - ./volumes/weaviate:/var/lib/weaviate
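The prerequisites mention Compose support for depends_on conditions, but the recipe above uses the short depends_on form, so the api and worker can start before PostgreSQL and Redis are ready to accept connections. The sketch below adds healthchecks and gated startup via an override file. It assumes Compose v2 merges the long-form depends_on over the short form; if your Compose version rejects the mix, fold these keys into docker-compose.yml instead.
terminal
# Optional hardening sketch: gate api/worker startup on healthy db and redis.
cat > docker-compose.override.yml << 'EOF'
services:
  db:
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${DB_USERNAME:-postgres}"]
      interval: 10s
      timeout: 5s
      retries: 5
  redis:
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 5s
      retries: 5
  api:
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
      weaviate:
        condition: service_started
  worker:
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
      weaviate:
        condition: service_started
EOF
docker compose up -d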
.env Template
.env
# Dify Configuration
WEB_PORT=3000
API_PORT=5001

# Security (generate with: openssl rand -base64 42)
SECRET_KEY=your-secret-key-change-this

# Initial admin password
INIT_PASSWORD=password

# Database
DB_USERNAME=postgres
DB_PASSWORD=difyai123456

# Optional: OpenAI API Key for built-in models
# OPENAI_API_KEY=sk-your-key
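The template ships a placeholder SECRET_KEY. The snippet below generates a strong value with the openssl command from the comment above and writes it into .env; it assumes GNU sed (on macOS use sed -i '' instead).
terminal
# Generate a strong SECRET_KEY and write it into .env.
NEW_KEY="$(openssl rand -base64 42)"
sed -i "s|^SECRET_KEY=.*|SECRET_KEY=${NEW_KEY}|" .env
grep '^SECRET_KEY=' .env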
Usage Notes
- Web console at http://localhost:3000
- API endpoint at http://localhost:5001
- First login: admin@example.com / password (from INIT_PASSWORD)
- Add LLM providers in Settings > Model Providers
- Build apps with the visual workflow editor
- RAG knowledge base with document upload
Individual Services (6 services)
Copy individual services to mix and match with your existing compose files.
api
api:
image: langgenius/dify-api:latest
container_name: dify-api
restart: unless-stopped
ports:
- ${API_PORT:-5001}:5001
environment:
- MODE=api
- LOG_LEVEL=INFO
- SECRET_KEY=${SECRET_KEY}
- CONSOLE_WEB_URL=http://localhost:${WEB_PORT:-3000}
- INIT_PASSWORD=${INIT_PASSWORD:-password}
- DB_USERNAME=${DB_USERNAME:-postgres}
- DB_PASSWORD=${DB_PASSWORD:-difyai123456}
- DB_HOST=db
- DB_PORT=5432
- DB_DATABASE=dify
- REDIS_HOST=redis
- REDIS_PORT=6379
- VECTOR_STORE=weaviate
- WEAVIATE_ENDPOINT=http://weaviate:8080
volumes:
- ./volumes/app/storage:/app/api/storage
depends_on:
- db
- redis
- weaviate
worker
worker:
image: langgenius/dify-api:latest
container_name: dify-worker
restart: unless-stopped
environment:
- MODE=worker
- LOG_LEVEL=INFO
- SECRET_KEY=${SECRET_KEY}
- DB_USERNAME=${DB_USERNAME:-postgres}
- DB_PASSWORD=${DB_PASSWORD:-difyai123456}
- DB_HOST=db
- DB_PORT=5432
- DB_DATABASE=dify
- REDIS_HOST=redis
- REDIS_PORT=6379
- VECTOR_STORE=weaviate
- WEAVIATE_ENDPOINT=http://weaviate:8080
volumes:
- ./volumes/app/storage:/app/api/storage
depends_on:
- db
- redis
- weaviate
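Heavy jobs such as document indexing run in the worker, so it is the first service to scale when knowledge-base imports queue up. A hedged note: the fixed container_name above prevents scaling, so remove that line from the worker service before adding replicas.
terminal
# Sketch: run extra workers for heavy document indexing (requires removing
# container_name from the worker service first).
docker compose up -d --scale worker=3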
web
web:
image: langgenius/dify-web:latest
container_name: dify-web
restart: unless-stopped
ports:
- ${WEB_PORT:-3000}:3000
environment:
- CONSOLE_API_URL=http://api:5001
- APP_API_URL=http://api:5001
db
db:
image: postgres:15-alpine
container_name: dify-db
restart: unless-stopped
environment:
- POSTGRES_USER=${DB_USERNAME:-postgres}
- POSTGRES_PASSWORD=${DB_PASSWORD:-difyai123456}
- POSTGRES_DB=dify
volumes:
- ./volumes/db/data:/var/lib/postgresql/data
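The db service holds all application metadata, users, and workflow configurations, so it is worth backing up routinely. A standard pg_dump sketch, assuming the default postgres user and dify database names from this recipe:
terminal
# Logical backup of the Dify metadata database
docker compose exec -T db pg_dump -U postgres dify > dify-backup.sql

# Restore into a fresh database
docker compose exec -T db psql -U postgres dify < dify-backup.sql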
redis
redis:
image: redis:7-alpine
container_name: dify-redis
restart: unless-stopped
volumes:
- ./volumes/redis/data:/data
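The Troubleshooting section below suggests capping Redis memory via a command override. The sketch here applies the same limits at runtime for a quick test (not persisted across container restarts); note that allkeys-lru can evict queued background tasks under memory pressure, so size the limit generously and prefer the permanent command override for production.
terminal
# Runtime sketch of the Redis memory cap suggested in Troubleshooting.
docker compose exec redis redis-cli config set maxmemory 512mb
docker compose exec redis redis-cli config set maxmemory-policy allkeys-lru
docker compose exec redis redis-cli config get maxmemory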
weaviate
weaviate:
image: semitechnologies/weaviate:latest
container_name: dify-weaviate
restart: unless-stopped
environment:
- QUERY_DEFAULTS_LIMIT=25
- AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true
- PERSISTENCE_DATA_PATH=/var/lib/weaviate
- DEFAULT_VECTORIZER_MODULE=none
volumes:
- ./volumes/weaviate:/var/lib/weaviate
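The recipe enables anonymous access to Weaviate, which is acceptable on an isolated Docker network but worth locking down before exposing anything. The sketch below uses Weaviate's standard API-key authentication variables; the Dify side is an assumption (recent Dify releases read the key from a WEAVIATE_API_KEY environment variable on the api and worker services), so check the configuration reference for your version.
terminal
# Hardening sketch: require an API key on Weaviate instead of anonymous access.
WEAVIATE_KEY="$(openssl rand -hex 32)"
echo "WEAVIATE_API_KEY=${WEAVIATE_KEY}" >> .env

# In the weaviate service, replace the anonymous-access setting with:
#   - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=false
#   - AUTHENTICATION_APIKEY_ENABLED=true
#   - AUTHENTICATION_APIKEY_ALLOWED_KEYS=${WEAVIATE_API_KEY}
#   - AUTHENTICATION_APIKEY_USERS=dify
#
# In the api and worker services, add (assumption -- verify for your version):
#   - WEAVIATE_API_KEY=${WEAVIATE_API_KEY}

docker compose up -d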
Quick Start
terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  api:
    image: langgenius/dify-api:latest
    container_name: dify-api
    restart: unless-stopped
    ports:
      - "${API_PORT:-5001}:5001"
    environment:
      - MODE=api
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - CONSOLE_WEB_URL=http://localhost:${WEB_PORT:-3000}
      - INIT_PASSWORD=${INIT_PASSWORD:-password}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  worker:
    image: langgenius/dify-api:latest
    container_name: dify-worker
    restart: unless-stopped
    environment:
      - MODE=worker
      - LOG_LEVEL=INFO
      - SECRET_KEY=${SECRET_KEY}
      - DB_USERNAME=${DB_USERNAME:-postgres}
      - DB_PASSWORD=${DB_PASSWORD:-difyai123456}
      - DB_HOST=db
      - DB_PORT=5432
      - DB_DATABASE=dify
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - VECTOR_STORE=weaviate
      - WEAVIATE_ENDPOINT=http://weaviate:8080
    volumes:
      - ./volumes/app/storage:/app/api/storage
    depends_on:
      - db
      - redis
      - weaviate

  web:
    image: langgenius/dify-web:latest
    container_name: dify-web
    restart: unless-stopped
    ports:
      - "${WEB_PORT:-3000}:3000"
    environment:
      - CONSOLE_API_URL=http://api:5001
      - APP_API_URL=http://api:5001

  db:
    image: postgres:15-alpine
    container_name: dify-db
    restart: unless-stopped
    environment:
      - POSTGRES_USER=${DB_USERNAME:-postgres}
      - POSTGRES_PASSWORD=${DB_PASSWORD:-difyai123456}
      - POSTGRES_DB=dify
    volumes:
      - ./volumes/db/data:/var/lib/postgresql/data

  redis:
    image: redis:7-alpine
    container_name: dify-redis
    restart: unless-stopped
    volumes:
      - ./volumes/redis/data:/data

  weaviate:
    image: semitechnologies/weaviate:latest
    container_name: dify-weaviate
    restart: unless-stopped
    environment:
      - QUERY_DEFAULTS_LIMIT=25
      - AUTHENTICATION_ANONYMOUS_ACCESS_ENABLED=true
      - PERSISTENCE_DATA_PATH=/var/lib/weaviate
      - DEFAULT_VECTORIZER_MODULE=none
    volumes:
      - ./volumes/weaviate:/var/lib/weaviate
EOF

# 2. Create the .env file
cat > .env << 'EOF'
# Dify Configuration
WEB_PORT=3000
API_PORT=5001

# Security (generate with: openssl rand -base64 42)
SECRET_KEY=your-secret-key-change-this

# Initial admin password
INIT_PASSWORD=password

# Database
DB_USERNAME=postgres
DB_PASSWORD=difyai123456

# Optional: OpenAI API Key for built-in models
# OPENAI_API_KEY=sk-your-key
EOF

# 3. Start the services
docker compose up -d

# 4. View logs
docker compose logs -f
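After the stack starts, a quick smoke test confirms both published ports answer. This is a hedged check: the exact response codes depend on the Dify version, and the point is only that the web console and API are listening.
terminal
# Post-start smoke test
docker compose ps

# Web console (expect an HTTP response, typically 200 or a redirect)
curl -sS -o /dev/null -w "web: %{http_code}\n" http://localhost:3000

# API (the root path may return 404; any HTTP response means the API is up)
curl -sS -o /dev/null -w "api: %{http_code}\n" http://localhost:5001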
One-Liner
Run this command to download and set up the recipe in one step:
terminal
curl -fsSL https://docker.recipes/api/recipes/dify/run | bash
Troubleshooting
- Weaviate startup fails with 'permission denied' error: Ensure ./volumes/weaviate directory has proper ownership with 'sudo chown -R 999:999 ./volumes/weaviate'
- Dify API returns 500 errors on knowledge base operations: Check Weaviate connectivity and verify WEAVIATE_ENDPOINT environment variable points to http://weaviate:8080
- Document upload fails with timeout errors: Increase worker memory allocation and check that both api and worker containers can access shared storage volume
- PostgreSQL connection refused during startup: Verify DB_PASSWORD matches between api/worker services and postgres POSTGRES_PASSWORD, then restart dependent services
- Redis memory usage grows continuously: Configure a Redis maxmemory policy by adding 'command: redis-server --maxmemory 512mb --maxmemory-policy allkeys-lru' to the redis service
- Web interface shows 'API connection failed': Verify CONSOLE_API_URL in the web service matches the internal API service name and port (http://api:5001); the diagnostic commands after this list help isolate the failing layer
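When any of the symptoms above appear, the commands below narrow down which layer is failing. The Weaviate readiness path is its standard /v1/.well-known/ready endpoint; the exec-based check assumes curl is available inside the dify-api image, which may not hold for every release.
terminal
# General status and recent logs
docker compose ps
docker compose logs --tail=100 api worker

# Database and Redis reachability from inside their containers
docker compose exec db pg_isready -U postgres
docker compose exec redis redis-cli ping

# Weaviate readiness, checked from the api container (assumes curl exists there)
docker compose exec api curl -s -o /dev/null -w "weaviate: %{http_code}\n" \
  http://weaviate:8080/v1/.well-known/ready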
Components
dify-api, dify-web, postgres, redis, weaviate
Tags
#ai #llm #rag #chatbot #workflow #no-code
Category
AI & Machine Learning