A Golang-based SSE (Server-Sent Events) virtualization layer that maintains persistent client connections while invoking serverless functions only on demand. This architecture reduces costs and improves user experience by separating connection management from function execution.
- Persistent SSE Connections: Maintains long-lived connections with automatic heartbeat
- On-Demand Function Invocation: Functions only run when needed, optimizing costs
- Redis State Management: Reliable state persistence and connection tracking
- Real-time Monitoring: Built-in health checks and metrics
- Function Registry: Dynamic function registration and health monitoring
- High Performance: Built with Go for optimal performance and concurrency
Quick start:

# Start the entire stack
docker-compose up -d
# View logs
docker-compose logs -f
# Stop the stack
docker-compose down

Manual setup:

- Clone and set up:
git clone https://github.com/rosaboyle/sse-virtualization-layer.git
cd sse-virtualization-layer
cp .env.example .env

- Start Redis:
docker run -d --name redis -p 6379:6379 redis:7-alpine

- Install dependencies:
go mod tidy

- Run the application:
go run main.go

GET /sse/{clientId}
Establishes a persistent SSE connection for a client.
Parameters:
- clientId (path): Unique identifier for the client
- X-User-ID (header, optional): User identifier
- Query parameters become connection metadata
Example:
curl -N -H "Accept: text/event-stream" \
"http://localhost:8080/sse/client123?app=myapp&version=1.0"POST /admin/functions
POST /admin/functions
Register a new serverless function.
Request Body:
{
"name": "my-function",
"endpoint": "https://my-serverless-function.vercel.app/api/process",
"method": "POST",
"timeout": "30s",
"description": "My awesome serverless function",
"headers": {
"Authorization": "Bearer token"
}
}
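
For Go callers, the registration body maps naturally onto a struct. This is a hedged sketch: the field names simply mirror the JSON above and are not the service's internal types.

```go
package main

import (
	"bytes"
	"encoding/json"
	"log"
	"net/http"
)

// FunctionRegistration mirrors the request body shown for POST /admin/functions.
type FunctionRegistration struct {
	Name        string            `json:"name"`
	Endpoint    string            `json:"endpoint"`
	Method      string            `json:"method"`
	Timeout     string            `json:"timeout"`
	Description string            `json:"description,omitempty"`
	Headers     map[string]string `json:"headers,omitempty"`
}

func main() {
	reg := FunctionRegistration{
		Name:        "my-function",
		Endpoint:    "https://my-serverless-function.vercel.app/api/process",
		Method:      "POST",
		Timeout:     "30s",
		Description: "My awesome serverless function",
		Headers:     map[string]string{"Authorization": "Bearer token"},
	}

	body, err := json.Marshal(reg)
	if err != nil {
		log.Fatal(err)
	}

	// Register the function with the virtualization layer.
	resp, err := http.Post("http://localhost:8080/admin/functions", "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()
	log.Println("registration status:", resp.Status)
}
```
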
POST /invoke/{functionName}
Invoke a registered function. Results can be delivered to the client over SSE or returned in the HTTP response.
Request Body:
{
"payload": {
"message": "Hello, World!",
"data": [1, 2, 3]
},
"client_id": "client123",
"async": true,
"timeout": 60
}
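
A similar sketch for invoking a registered function from Go, assuming a function named my-function has already been registered. With async set to true and a client_id supplied, the result is expected to arrive on that client's SSE stream, so the HTTP response here is just an acknowledgement.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

// InvokeRequest mirrors the request body shown for POST /invoke/{functionName}.
type InvokeRequest struct {
	Payload  map[string]interface{} `json:"payload"`
	ClientID string                 `json:"client_id,omitempty"`
	Async    bool                   `json:"async"`
	Timeout  int                    `json:"timeout,omitempty"` // seconds
}

func main() {
	req := InvokeRequest{
		Payload:  map[string]interface{}{"message": "Hello, World!", "data": []int{1, 2, 3}},
		ClientID: "client123", // deliver the result to this client's SSE stream
		Async:    true,
		Timeout:  60,
	}

	body, err := json.Marshal(req)
	if err != nil {
		log.Fatal(err)
	}

	resp, err := http.Post("http://localhost:8080/invoke/my-function", "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// For async invocations the interesting result arrives over SSE;
	// this response only acknowledges that the invocation was accepted.
	ack, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(ack))
}
```
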
Get All Connections:
GET /admin/connections
Health Check:
GET /admin/health
Get All Functions:
GET /admin/functions
Set environment variables:
export PORT=8080
export REDIS_ADDR=localhost:6379
export REDIS_PASSWORD=""┌─────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ Clients │◄──►│ SSE Gateway │◄──►│ Connection Mgr │
│ │ │ │ │ │
│ - Browser │ │ - HTTP Handlers │ │ - State Mgmt │
│ - Mobile │ │ - SSE Streaming │ │ - Heartbeat │
│ - IoT │ │ - CORS Support │ │ - Cleanup │
└─────────────┘ └──────────────────┘ └─────────────────┘
│ │
▼ ▼
┌──────────────────┐ ┌─────────────────┐
│ Function Router │◄──►│ Redis │
│ │ │ │
│ - Load Balancing │ │ - Connections │
│ - Health Checks │ │ - Functions │
│ - Retry Logic │ │ - Metrics │
└──────────────────┘ └─────────────────┘
│
▼
┌──────────────────┐
│ Serverless Funcs │
│ │
│ - AWS Lambda │
│ - Vercel │
│ - Netlify │
│ - Custom APIs │
└──────────────────┘
Connect to the SSE stream from the browser:

const eventSource = new EventSource('http://localhost:8080/sse/client123');
eventSource.onopen = function(event) {
console.log('Connected to SSE stream');
};
eventSource.addEventListener('connected', function(event) {
const data = JSON.parse(event.data);
console.log('Connection established:', data);
});
eventSource.addEventListener('function_response', function(event) {
const response = JSON.parse(event.data);
console.log('Function result:', response);
});
eventSource.addEventListener('heartbeat', function(event) {
console.log('Heartbeat received');
});
eventSource.onerror = function(event) {
console.error('SSE error:', event);
};

Register a function:

curl -X POST http://localhost:8080/admin/functions \
-H "Content-Type: application/json" \
-d '{
"name": "text-processor",
"endpoint": "https://my-api.vercel.app/api/process",
"method": "POST",
"timeout": "30s",
"description": "Processes text input"
}'

Invoke the function:

curl -X POST http://localhost:8080/invoke/text-processor \
-H "Content-Type: application/json" \
-d '{
"payload": {
"text": "Hello, World!",
"operation": "uppercase"
},
"client_id": "client123",
"async": true
}'

Benefits:

- Cost Optimization: Serverless functions only run when needed
- Better UX: No connection drops or reconnection delays
- Scalability: Independent scaling of connections vs compute
- Reliability: Connection resilience separate from function health
- Real-time: Instant delivery of function results via SSE
- Flexibility: Support for any HTTP-based serverless function
The system provides built-in monitoring and metrics:
- Connection count and client breakdown
- Function health and availability
- Request/response metrics
- Redis connectivity status
- System uptime and performance
Access monitoring data via:
curl http://localhost:8080/admin/health
curl http://localhost:8080/admin/connections
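
For automated monitoring, a small Go poller can hit the same endpoint. This is a minimal sketch that assumes nothing about the response format and simply logs the status and body returned by /admin/health.

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 5 * time.Second}

	// Poll the health endpoint once a minute and log the raw response.
	for {
		resp, err := client.Get("http://localhost:8080/admin/health")
		if err != nil {
			log.Println("health check failed:", err)
		} else {
			body, _ := io.ReadAll(resp.Body)
			resp.Body.Close()
			fmt.Printf("health: %s %s\n", resp.Status, body)
		}
		time.Sleep(time.Minute)
	}
}
```
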
Build:
go build -o virtualization-manager main.go

Run tests:
go test ./...

Build Docker image:
docker build -t sse-virtualization-layer .

This is an open-source project. Contributions are welcome!
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Submit a pull request
MIT License - feel free to use this in your projects!