ELK Stack
Elasticsearch, Logstash, and Kibana for log management and analysis.
Overview
Elasticsearch is a distributed, RESTful search and analytics engine built on Apache Lucene that excels at full-text search, log analytics, and real-time data analysis. Originally developed by Shay Banon in 2010, it has become the cornerstone of modern observability stacks, offering near real-time indexing and powerful aggregation capabilities that make it ideal for processing massive volumes of structured and unstructured data.
The ELK Stack combines Elasticsearch's search engine with Logstash's data processing pipeline and Kibana's visualization frontend to create a comprehensive log management and analytics platform. Logstash acts as the data ingestion workhorse, parsing, transforming, and enriching log data from multiple sources before sending it to Elasticsearch for indexing. Kibana provides the visual layer, transforming raw search results into dashboards, charts, and alerts that make complex data insights accessible to both technical and business users.
This three-component architecture is particularly valuable for organizations dealing with distributed systems, microservices, or any environment where centralized logging and real-time monitoring are critical. DevOps teams, security analysts, and data engineers rely on this stack to troubleshoot application issues, detect security threats, monitor system performance, and derive business intelligence from operational data.
Key Features
- Full-text search with relevance scoring and fuzzy matching across structured and unstructured log data
- Near real-time indexing and search capabilities with sub-second query response times
- Logstash pipeline processing with 200+ input, filter, and output plugins for data transformation
- Distributed architecture with automatic sharding and replication for horizontal scaling
- Kibana's interactive dashboards with drill-down capabilities and time-based data exploration
- Advanced aggregations for statistical analysis, histograms, and geospatial data processing
- RESTful API access to Elasticsearch for custom integrations and programmatic queries (see the example after this list)
- Built-in alerting and machine learning anomaly detection for proactive monitoring
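As an illustration of the full-text search and REST API features above, the following request runs a fuzzy match query. The index name app-logs and the message field are placeholders rather than part of this recipe, and the call assumes security is disabled, as in the compose file below.
# Fuzzy full-text search against a hypothetical "app-logs" index
curl -s -X GET "http://localhost:9200/app-logs/_search?pretty" \
  -H 'Content-Type: application/json' \
  -d '{
    "query": {
      "match": {
        "message": { "query": "conection timeout", "fuzziness": "AUTO" }
      }
    }
  }'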
Common Use Cases
- Centralized application logging for microservices architectures and distributed systems
- Security information and event management (SIEM) for threat detection and compliance
- Application performance monitoring (APM) with error tracking and performance metrics
- E-commerce search functionality with autocomplete, faceted search, and product recommendations
- Infrastructure monitoring and capacity planning using system and network logs
- Business intelligence dashboards combining operational data with business metrics
- DevOps troubleshooting and root cause analysis for production incidents
Prerequisites
- Minimum 4GB RAM available (2GB for Elasticsearch, 1GB each for Logstash and Kibana)
- Docker and Docker Compose installed with sufficient disk space for log data retention
- Available ports 9200 (Elasticsearch), 5601 (Kibana), 5044 and 9600 (Logstash)
- Understanding of JSON document structure and basic query concepts for Elasticsearch
- Logstash pipeline configuration knowledge for custom data parsing and transformation (a minimal pipeline sketch follows this list)
- Basic familiarity with Lucene query syntax for advanced search operations
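The compose file below mounts ./logstash/pipeline into the Logstash container but does not include a pipeline definition. A minimal sketch of one, assuming a Beats input on 5044 and output to the bundled Elasticsearch (the file name logstash.conf is arbitrary):
mkdir -p logstash/pipeline
cat > logstash/pipeline/logstash.conf << 'EOF'
# Example pipeline: receive events from Beats and index them by day
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}
EOF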
For development & testing only. Review security settings, change default credentials, and test thoroughly before production use.
docker-compose.yml
docker-compose.yml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    volumes:
      - es_data:/usr/share/elasticsearch/data
    ports:
      - "9200:9200"
    networks:
      - elk

  logstash:
    image: docker.elastic.co/logstash/logstash:8.11.0
    container_name: logstash
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
    ports:
      - "5044:5044"
      - "9600:9600"
    depends_on:
      - elasticsearch
    networks:
      - elk

  kibana:
    image: docker.elastic.co/kibana/kibana:8.11.0
    container_name: kibana
    environment:
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
    networks:
      - elk

volumes:
  es_data:

networks:
  elk:
    driver: bridge
.env Template
.env
ES_JAVA_OPTS=-Xms512m -Xmx512m
Usage Notes
- Docs: https://www.elastic.co/guide/index.html
- Kibana at http://localhost:5601 - create index patterns under Stack Management
- Elasticsearch API at http://localhost:9200 - check health with /_cluster/health
- Logstash beats input on 5044 - configure Filebeat/Metricbeat to send here
- Increase ES_JAVA_OPTS for larger deployments (min 4GB recommended)
- Security disabled by default - enable xpack.security for production
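The health and connectivity checks mentioned in the notes above can be run from the host once the containers are up (assuming the default ports and security disabled, as in this recipe):
# Elasticsearch cluster health (a fresh single node normally reports green or yellow)
curl -s "http://localhost:9200/_cluster/health?pretty"

# Logstash monitoring API on 9600 - node and pipeline stats
curl -s "http://localhost:9600/_node/stats/pipelines?pretty"

# List indices so you can see what a Kibana index pattern should match
curl -s "http://localhost:9200/_cat/indices?v"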
Individual Services (3 services)
Copy individual services to mix and match with your existing compose files.
elasticsearch
elasticsearch:
  image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
  container_name: elasticsearch
  environment:
    - discovery.type=single-node
    - xpack.security.enabled=false
    - ES_JAVA_OPTS=-Xms512m -Xmx512m
  volumes:
    - es_data:/usr/share/elasticsearch/data
  ports:
    - "9200:9200"
  networks:
    - elk
logstash
logstash:
  image: docker.elastic.co/logstash/logstash:8.11.0
  container_name: logstash
  volumes:
    - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
  ports:
    - "5044:5044"
    - "9600:9600"
  depends_on:
    - elasticsearch
  networks:
    - elk
kibana
kibana:
  image: docker.elastic.co/kibana/kibana:8.11.0
  container_name: kibana
  environment:
    ELASTICSEARCH_HOSTS: http://elasticsearch:9200
  ports:
    - "5601:5601"
  depends_on:
    - elasticsearch
  networks:
    - elk
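Each snippet above references the shared elk network, and the elasticsearch service also expects the es_data named volume. When merging a snippet into an existing compose file, carry over the matching top-level declarations from the full file above:
volumes:
  es_data:

networks:
  elk:
    driver: bridge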
Quick Start
terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    volumes:
      - es_data:/usr/share/elasticsearch/data
    ports:
      - "9200:9200"
    networks:
      - elk

  logstash:
    image: docker.elastic.co/logstash/logstash:8.11.0
    container_name: logstash
    volumes:
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
    ports:
      - "5044:5044"
      - "9600:9600"
    depends_on:
      - elasticsearch
    networks:
      - elk

  kibana:
    image: docker.elastic.co/kibana/kibana:8.11.0
    container_name: kibana
    environment:
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
    networks:
      - elk

volumes:
  es_data:

networks:
  elk:
    driver: bridge
EOF

# 2. Create the .env file
cat > .env << 'EOF'
ES_JAVA_OPTS=-Xms512m -Xmx512m
EOF

# 3. Start the services
docker compose up -d

# 4. View logs
docker compose logs -f
One-Liner
Run this command to download and set up the recipe in one step:
terminal
curl -fsSL https://docker.recipes/api/recipes/elk-stack/run | bash
Troubleshooting
- Elasticsearch cluster health red/yellow: Check disk space and increase heap size in ES_JAVA_OPTS
- Kibana shows 'Elasticsearch cluster did not respond': Verify Elasticsearch is running and accessible on port 9200
- Logstash pipeline not processing data: Check pipeline configuration syntax and Elasticsearch connectivity
- Out of memory errors: Increase Docker container memory limits and JVM heap sizes for all components
- Slow query performance: Add proper field mappings and consider index optimization strategies
- Data not appearing in Kibana: Verify index patterns match your data indices and check time field configuration
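For the memory and cluster-health items above, a few quick checks (assuming the default container names from this recipe):
# Per-container CPU and memory usage
docker stats --no-stream elasticsearch logstash kibana

# Ask Elasticsearch why shards are unassigned when health is yellow/red
curl -s "http://localhost:9200/_cluster/allocation/explain?pretty"

# After raising the heap in the compose file's ES_JAVA_OPTS (e.g. -Xms1g -Xmx1g), recreate the service
docker compose up -d elasticsearch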
Components
elasticsearch, logstash, kibana
Tags
#elasticsearch #logstash #kibana #logging
Category
Monitoring & Observability