A microservices-based expense management application designed as a comprehensive DevOps training lab. The solution demonstrates modern containerization, orchestration, and CI/CD practices.
This lab simulates a real-world microservices architecture built from multiple technology stacks:
- Frontend (React) - B2B web application with authentication and expense management
- Backend (Node.js/Express) - REST API service with JWT authentication and message publishing
- Processor (Python) - Integration service simulating third-party data processing
- Lake Publisher (C#) - Data lake integration service for analytics workflows
- Database (MongoDB) - Document-based data persistence
- Message Queue (RabbitMQ) - Asynchronous communication between services
Students will learn to:
- Containerization: Create Dockerfiles and .dockerignore files for each service
- Local Orchestration: Build docker-compose configuration for local development
- Kubernetes Deployment: Design Helm charts for multi-environment deployments
- CI/CD Pipeline: Implement GitHub Actions for automated testing, building, and deployment
- Microservices Patterns: Understand service communication, data consistency, and integration patterns
Install the following on your system:
- Node.js (v16+) and npm
- Python (v3.8+) and pip
- .NET 9.0 SDK or later
- MongoDB (running on default port 27017)
- RabbitMQ (running on default port 5672)
To run each service locally:
cd packages/backend
npm install
npm start
Runs on: http://localhost:3000
cd packages/frontend
npm install
npm start
Runs on: http://localhost:3030
cd packages/processor
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
python processor.py
cd packages/lakepublisher
dotnet restore
dotnet run
Frontend:
- User authentication with username/password login
- Create, view, edit, and delete expenses with exported status
- File attachment upload/download
- Modal-based expense details with inline editing
- Export status tracking and display
- Responsive design
Backend:
- JWT-based authentication with configurable users
- RESTful API with full CRUD operations
- File upload and secure download endpoints
- RabbitMQ message publishing for all operations
- MongoDB integration with Mongoose ODM
- Exported field management for data lake integration (a quick smoke test follows this list)
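Once the backend is running, the API can be smoke-tested from the command line. The routes, field names, and credentials below are hypothetical placeholders, so check the backend README for the actual ones:

```bash
# Hypothetical smoke test; routes, payloads, and credentials are placeholders
curl -X POST http://localhost:3000/api/login \
  -H "Content-Type: application/json" \
  -d '{"username":"demo","password":"demo"}'

# Reuse the token from the login response for authenticated CRUD calls
curl http://localhost:3000/api/expenses \
  -H "Authorization: Bearer <token>"
```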
Processor:
- Consumes RabbitMQ messages from the backend
- Saves all expense events to JSON files
- Configurable output directory
- Graceful error handling (the core consume loop is sketched below)
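The consume loop at the heart of such a processor looks roughly like this sketch, which assumes the pika client and an invented queue name; the actual queue, payload shape, and file layout are defined by the backend and processor code:

```python
# Minimal consumer sketch using the pika client. The queue name, payload
# shape, and output layout are illustrative assumptions.
import json
import os

import pika

os.makedirs("output", exist_ok=True)

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="expenses", durable=True)  # hypothetical queue name

def on_message(ch, method, properties, body):
    event = json.loads(body)
    # Persist every expense event to its own JSON file
    path = os.path.join("output", f"event-{method.delivery_tag}.json")
    with open(path, "w") as f:
        json.dump(event, f, indent=2)
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="expenses", on_message_callback=on_message)
channel.start_consuming()
```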
Lake Publisher:
- Token-based API authentication for secure access
- Exports approved expenses to Apache Parquet format
- Hierarchical date-based folder structure (yyyy/mm/dd)
- Updates exported status to prevent duplicates
- Configurable filtering and output paths (the folder layout is sketched below)
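A minimal sketch of the date-based layout, assuming an invented output root; the real service additionally writes Apache Parquet and updates the exported flag through the backend API:

```csharp
// Hypothetical sketch of the hierarchical yyyy/MM/dd output layout only;
// the real service also writes Parquet files and marks expenses exported
// to prevent duplicates.
using System;
using System.IO;

class LakePathDemo
{
    static void Main()
    {
        string outputRoot = "lake";  // assumed configuration value
        DateTime now = DateTime.UtcNow;

        // Build lake/<yyyy>/<MM>/<dd>/ and ensure it exists
        string dir = Path.Combine(
            outputRoot, now.ToString("yyyy"), now.ToString("MM"), now.ToString("dd"));
        Directory.CreateDirectory(dir);

        string file = Path.Combine(dir, $"expenses-{now:yyyyMMddHHmmss}.parquet");
        Console.WriteLine(file);  // e.g. lake/2025/01/31/expenses-20250131093000.parquet
    }
}
```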
Each component uses environment variables:
- Backend: packages/backend/.env (an illustrative sample follows this list)
- Processor: packages/processor/.env
- Lake Publisher: packages/lakepublisher/.env
- Frontend: runtime configuration via public/config.js
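As an illustration only, a backend .env might look like the following; the variable names are invented for this sketch, and the authoritative list lives in each package's README:

```ini
# Hypothetical packages/backend/.env; variable names are invented for
# illustration, so check the package README for the real ones
PORT=3000
MONGO_URL=mongodb://localhost:27017/expenses
RABBITMQ_URL=amqp://localhost:5672
JWT_SECRET=change-me
```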
Containerization:
- Create Dockerfiles for each service (frontend, backend, processor, lakepublisher)
- Configure .dockerignore files to optimize build contexts
- Build and test individual container images (a starter Dockerfile is sketched after this list)
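As a starting point, a backend Dockerfile might look like the sketch below; the base image, port, and start command are assumptions to verify against the actual package:

```dockerfile
# Hypothetical starter Dockerfile for the Node.js backend
FROM node:18-alpine
WORKDIR /app

# Copy manifests first so the dependency layer is cached across code changes
COPY package*.json ./
RUN npm ci --omit=dev

COPY . .
EXPOSE 3000
CMD ["npm", "start"]
```

A matching .dockerignore (node_modules, .git, .env) keeps the build context small and the image free of secrets.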
Local Orchestration:
- Design docker-compose.yml with all services, MongoDB, and RabbitMQ
- Configure service networking and environment variables
- Implement health checks and dependency management (a minimal compose sketch follows)
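A minimal compose sketch, with invented service names and environment variables, might wire things together like this:

```yaml
# Illustrative docker-compose.yml; service names and variables are assumptions
services:
  mongodb:
    image: mongo:6
    ports: ["27017:27017"]
    volumes: ["mongo-data:/data/db"]
  rabbitmq:
    image: rabbitmq:3-management
    ports: ["5672:5672", "15672:15672"]
    healthcheck:
      test: ["CMD", "rabbitmq-diagnostics", "-q", "ping"]
      interval: 10s
      retries: 5
  backend:
    build: ./packages/backend
    ports: ["3000:3000"]
    environment:
      MONGO_URL: mongodb://mongodb:27017/expenses  # hypothetical variable name
      RABBITMQ_URL: amqp://rabbitmq:5672           # hypothetical variable name
    depends_on:
      rabbitmq:
        condition: service_healthy
volumes:
  mongo-data:
```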
Kubernetes Deployment:
- Create Helm charts for multi-environment deployment
- Configure ConfigMaps, Secrets, and persistent volumes
- Implement service discovery and load balancing
- Set up ingress controllers and SSL termination (an illustrative values excerpt follows)
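An illustrative values.yaml excerpt (keys and registry path are assumptions) shows how per-environment settings could be templated:

```yaml
# Illustrative values.yaml excerpt; keys are chart-specific assumptions
backend:
  replicaCount: 2
  image:
    repository: ghcr.io/your-org/expenses-backend  # hypothetical registry path
    tag: "1.0.0"
ingress:
  enabled: true
  host: expenses.example.com
```

Environment-specific overrides can then be layered on with, for example, helm upgrade --install expenses ./chart -f values-staging.yaml.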
CI/CD Pipeline:
- Build GitHub Actions workflows for automated testing
- Implement multi-stage builds and security scanning
- Configure automated deployment to staging and production
- Set up monitoring and alerting (a minimal workflow sketch follows)
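A minimal build-and-publish workflow, with assumed paths and image names, might start like this sketch:

```yaml
# Illustrative .github/workflows/ci.yml; image names and paths are assumptions
name: ci
on:
  push:
    branches: [main]
jobs:
  backend:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v6
        with:
          context: packages/backend
          push: true
          tags: ghcr.io/${{ github.repository }}/backend:${{ github.sha }}
```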
A typical development workflow:
- Start MongoDB and RabbitMQ services
- Start backend API server
- Start frontend development server
- Start message processor (simulates third-party integration)
- Run lake publisher (simulates data lake integration)
- Access application at http://localhost:3030
Data flow:
Frontend → Backend API → MongoDB
                ↓
            RabbitMQ → Processor → JSON Files
Backend API → Lake Publisher → Parquet Files
Project structure:
packages/
├── backend/ # Node.js API server
├── frontend/ # React application
├── processor/ # Python message consumer
└── lakepublisher/ # C# data lake publisher
Real-world scenario simulation:
- Frontend + Backend: Simulates a B2B expense management platform
- Python Processor: Simulates integration with external audit systems (it saves to the filesystem here, but could just as well send to third-party APIs)
- C# Lake Publisher: Simulates data lake integration for analytics and reporting workflows
Expected deliverables:
- Dockerfiles for each service with optimized layers
- .dockerignore files to minimize build contexts
- docker-compose.yml for local development environment
- Helm charts for Kubernetes deployment across environments
- GitHub Actions workflows for CI/CD automation
- Documentation explaining architectural decisions and deployment strategies
For detailed setup and configuration of individual services, see README files in each package directory.
This repository is structured as a progressive learning path through different branches, each building upon the previous to demonstrate complete DevOps transformation:
- main - Starting point with pure application code (what developers deliver to DevOps teams)
- phase1/containers - Basic containerization with Dockerfiles, docker-compose for local development, and a GitHub Actions workflow for automated container builds and registry publishing
- phase2/kubernetes - Kubernetes deployment manifests and basic orchestration setup
- phase3/helm - Helm charts for templated Kubernetes deployments with environment management
- phase4/devsecops - Security integration with SAST, container scanning, and infrastructure security validation
- phase5/gitops - Complete GitOps implementation with Terraform infrastructure as code and automated deployments
- phase6/secrets-management - SOPS encryption with AWS KMS for secure secrets management without long-lived credentials
- phase7/versioning - Semantic versioning with GitVersion for automated release management, container tagging, and dependency management with Renovate; the complete production-ready solution
Recommended Learning Sequence: Start with main (raw application code) and progress through each phase to see how DevOps practices transform a basic application into a production-ready system, culminating in the complete enterprise solution on the phase7/versioning branch.
Disclaimer: This project is designed for educational purposes to demonstrate DevOps automation workflows and best practices. While we strive for accuracy, we do not guarantee that every component will function flawlessly, as bugs may be present in the code examples.