Lago
Open-source metering and usage-based billing.
Overview
Lago is an open-source metering and usage-based billing platform designed to help businesses implement sophisticated pricing models beyond traditional subscription fees. Developed to address the complexity of modern SaaS billing requirements, Lago enables companies to track customer usage across multiple dimensions and automatically generate accurate invoices based on consumption patterns, subscription tiers, or hybrid models combining both approaches.
This Docker configuration combines Lago's API and frontend components with PostgreSQL for reliable billing data storage and Redis for session management and background job processing. PostgreSQL's ACID compliance ensures billing accuracy and data integrity, critical requirements when handling financial transactions and customer invoicing. Redis accelerates the platform by caching frequently accessed pricing rules, managing user sessions, and queuing billing calculations that process large volumes of usage events.
This stack targets SaaS companies, API providers, and usage-based service businesses that need to move beyond simple monthly subscriptions. Development teams building platforms with pay-per-use models, tiered pricing, or complex billing rules will find Lago particularly valuable for replacing expensive third-party billing solutions with a self-hosted alternative that offers complete control over pricing logic and customer data.
Key Features
- Real-time usage metering with billable metrics tracking API calls, storage consumption, and custom events
- Advanced pricing models including graduated tiers, package deals, percentage-based fees, and volume discounts
- PostgreSQL-backed invoice generation with automated tax calculations and multi-currency support
- Redis-powered background processing for high-volume usage aggregation and billing calculations
- Webhook system for real-time notifications on invoice creation, payment events, and subscription changes
- Customer portal integration allowing end-users to view usage analytics and billing history
- Subscription lifecycle management with proration, upgrades, downgrades, and custom billing cycles
- Multi-tenant architecture supporting usage isolation and billing segmentation across customer organizations
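The real-time metering in the first feature above comes down to posting usage events to the Lago API. A minimal sketch, assuming the stack from this recipe is running on localhost:3000; the payload shape follows Lago's events API, and `LAGO_API_KEY`, the transaction ID, and the subscription ID are placeholders you would replace with your own values:

```shell
# Report one usage event against the "api_calls" billable metric.
# Field names follow Lago's events API; verify against the current docs.
cat > event.json <<'EOF'
{
  "event": {
    "transaction_id": "evt-0001",
    "external_subscription_id": "sub-0001",
    "code": "api_calls",
    "properties": { "requests": 1 }
  }
}
EOF

curl -s -X POST http://localhost:3000/api/v1/events \
  -H "Authorization: Bearer ${LAGO_API_KEY}" \
  -H "Content-Type: application/json" \
  -d @event.json || true   # don't abort if the stack isn't running yet
```

The `transaction_id` deduplicates retried events, so the same usage is never billed twice even if your client retries on timeout.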
Common Use Cases
- API-first companies billing customers based on request volume, bandwidth usage, or compute resources consumed
- Cloud storage providers implementing tiered pricing with different rates for storage, bandwidth, and API operations
- SaaS platforms combining monthly subscriptions with overage charges for premium features or usage limits
- Development teams building internal billing systems for multi-product companies with complex pricing structures
- Telecommunications and IoT service providers tracking device usage, data consumption, and service utilization
- Marketplace platforms charging transaction fees, listing fees, and usage-based commission structures
- Enterprise software vendors implementing seat-based licensing with additional usage-based feature charges
Prerequisites
- Minimum 2GB RAM recommended for PostgreSQL billing data processing and Redis caching operations
- Available host ports 3000 (Lago API) and 8080 (web interface); PostgreSQL (5432) and Redis (6379) stay on the internal Docker network in this configuration
- Understanding of billing concepts like metering, pricing plans, and invoice generation workflows
- Environment variables configured for database credentials and Lago's secret key authentication
- Knowledge of webhook integrations for connecting billing events to payment processors or accounting systems
- Familiarity with usage-based billing models and how to define billable metrics for your specific use case
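For the secret key mentioned above, a quick way to generate a strong value instead of the placeholder shipped in the .env template (assumes `openssl` is installed; the placeholder strings checked here match the template below):

```shell
# Generate a value for SECRET_KEY: 32 random bytes -> 64 hex characters.
SECRET_KEY=$(openssl rand -hex 32)
echo "SECRET_KEY=${SECRET_KEY}"

# Warn if placeholder credentials are still present in .env
# (-s suppresses the error if the file doesn't exist yet).
if grep -qsE 'changeme|your-secret-key' .env; then
  echo "WARNING: replace the placeholder credentials in .env" >&2
fi
```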
For development & testing only. Review security settings, change default credentials, and test thoroughly before production use.
docker-compose.yml

services:
  lago-api:
    image: getlago/api:latest
    container_name: lago-api
    restart: unless-stopped
    environment:
      DATABASE_URL: postgres://${DB_USER}:${DB_PASSWORD}@postgres:5432/${DB_NAME}
      REDIS_URL: redis://redis:6379
      SECRET_KEY_BASE: ${SECRET_KEY}
    ports:
      - "3000:3000"
    depends_on:
      - postgres
      - redis
    networks:
      - lago

  lago-front:
    image: getlago/front:latest
    container_name: lago-front
    environment:
      API_URL: http://lago-api:3000
    ports:
      - "8080:80"
    depends_on:
      - lago-api
    networks:
      - lago

  postgres:
    image: postgres:16-alpine
    container_name: lago-postgres
    environment:
      POSTGRES_DB: ${DB_NAME}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - lago

  redis:
    image: redis:alpine
    container_name: lago-redis
    networks:
      - lago

volumes:
  postgres_data:

networks:
  lago:
    driver: bridge

.env Template
.env

DB_NAME=lago
DB_USER=lago
DB_PASSWORD=changeme
SECRET_KEY=your-secret-key

Usage Notes
- Docs: https://docs.getlago.com/
- UI at http://localhost:8080, API at http://localhost:3000
- Create billable metrics for metering (API calls, storage, etc.)
- Define pricing plans: flat fees, graduated, package, percentage
- Supports usage-based, subscription, and hybrid billing models
- Webhook events for invoice, payment, and subscription lifecycle
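Creating the billable metrics mentioned in the notes above can also be scripted. A sketch of defining one metric through the API; the endpoint and payload follow Lago's billable-metrics API (check the docs for current field names), and `LAGO_API_KEY` is a placeholder for an API key from the admin UI:

```shell
# Define a count-aggregated metric for API call metering.
# Other aggregation types exist (e.g. sum, max) -- see Lago's docs.
cat > metric.json <<'EOF'
{
  "billable_metric": {
    "name": "API calls",
    "code": "api_calls",
    "aggregation_type": "count_agg"
  }
}
EOF

curl -s -X POST http://localhost:3000/api/v1/billable_metrics \
  -H "Authorization: Bearer ${LAGO_API_KEY}" \
  -H "Content-Type: application/json" \
  -d @metric.json || true   # tolerate an unreachable API during dry runs
```

The metric's `code` is what usage events reference later, so pick stable, machine-friendly codes up front.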
Individual Services (4 services)
Copy individual services to mix and match with your existing compose files.
lago-api
lago-api:
  image: getlago/api:latest
  container_name: lago-api
  restart: unless-stopped
  environment:
    DATABASE_URL: postgres://${DB_USER}:${DB_PASSWORD}@postgres:5432/${DB_NAME}
    REDIS_URL: redis://redis:6379
    SECRET_KEY_BASE: ${SECRET_KEY}
  ports:
    - "3000:3000"
  depends_on:
    - postgres
    - redis
  networks:
    - lago
lago-front
lago-front:
  image: getlago/front:latest
  container_name: lago-front
  environment:
    API_URL: http://lago-api:3000
  ports:
    - "8080:80"
  depends_on:
    - lago-api
  networks:
    - lago
postgres
postgres:
  image: postgres:16-alpine
  container_name: lago-postgres
  environment:
    POSTGRES_DB: ${DB_NAME}
    POSTGRES_USER: ${DB_USER}
    POSTGRES_PASSWORD: ${DB_PASSWORD}
  volumes:
    - postgres_data:/var/lib/postgresql/data
  networks:
    - lago
redis
redis:
  image: redis:alpine
  container_name: lago-redis
  networks:
    - lago
Quick Start
terminal
# 1. Create the compose file
cat > docker-compose.yml << 'EOF'
services:
  lago-api:
    image: getlago/api:latest
    container_name: lago-api
    restart: unless-stopped
    environment:
      DATABASE_URL: postgres://${DB_USER}:${DB_PASSWORD}@postgres:5432/${DB_NAME}
      REDIS_URL: redis://redis:6379
      SECRET_KEY_BASE: ${SECRET_KEY}
    ports:
      - "3000:3000"
    depends_on:
      - postgres
      - redis
    networks:
      - lago

  lago-front:
    image: getlago/front:latest
    container_name: lago-front
    environment:
      API_URL: http://lago-api:3000
    ports:
      - "8080:80"
    depends_on:
      - lago-api
    networks:
      - lago

  postgres:
    image: postgres:16-alpine
    container_name: lago-postgres
    environment:
      POSTGRES_DB: ${DB_NAME}
      POSTGRES_USER: ${DB_USER}
      POSTGRES_PASSWORD: ${DB_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - lago

  redis:
    image: redis:alpine
    container_name: lago-redis
    networks:
      - lago

volumes:
  postgres_data:

networks:
  lago:
    driver: bridge
EOF

# 2. Create the .env file
cat > .env << 'EOF'
DB_NAME=lago
DB_USER=lago
DB_PASSWORD=changeme
SECRET_KEY=your-secret-key
EOF

# 3. Start the services
docker compose up -d

# 4. View logs
docker compose logs -f

One-Liner
Run this command to download and set up the recipe in one step:
terminal
curl -fsSL https://docker.recipes/api/recipes/lago/run | bash

Troubleshooting
- Lago API failing to connect to PostgreSQL: Verify DATABASE_URL format and ensure PostgreSQL container is fully started before lago-api
- Usage events not processing correctly: Check Redis connectivity and ensure REDIS_URL matches the redis service network configuration
- Frontend showing 'API connection failed': Confirm lago-front can reach lago-api container and API_URL environment variable is correctly set
- Database migration errors on startup: Ensure PostgreSQL has sufficient disk space and proper permissions for the postgres_data volume
- Webhook delivery failures: Verify your external webhook endpoints are accessible from the Docker network and return proper HTTP status codes
- High memory usage during billing calculations: Monitor Redis memory consumption and consider increasing container limits during peak billing cycles
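For the first troubleshooting item, a quick shape check on DATABASE_URL before blaming the containers. A minimal sketch: the example value matches this recipe's .env defaults, and the pattern expects the `postgres://user:password@host:port/dbname` form used in the compose file:

```shell
# Validate the DATABASE_URL format used by lago-api.
DATABASE_URL="postgres://lago:changeme@postgres:5432/lago"   # example value

if echo "${DATABASE_URL}" | grep -qE '^postgres://[^:/@]+:[^@]+@[^:/@]+:[0-9]+/[^/]+$'; then
  echo "DATABASE_URL looks well-formed"
else
  echo "DATABASE_URL is malformed" >&2
fi
```

A well-formed URL can still fail if PostgreSQL isn't ready yet; `restart: unless-stopped` on lago-api lets it retry until the database accepts connections.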