An intelligent Slack bot for Azure Red Hat OpenShift (ARO) HCP developers that leverages LangGraph and OpenAI to automate infrastructure tasks and provide DevOps assistance through natural language interactions.
```
┌─────────────┐      ┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│             │      │                 │      │                 │      │                 │
│    Slack    │─────▶│    Flask App    │─────▶│    LangGraph    │─────▶│    LLM Tools    │
│             │      │    (Webhook)    │      │    Workflow     │      │  (OpenAI GPT)   │
│  @mention   │      │                 │      │                 │      │                 │
└─────────────┘      └─────────────────┘      └─────────────────┘      └─────────────────┘
       ▲                                               │                        │
       │                                               ▼                        ▼
       │                                      ┌─────────────────┐      ┌─────────────────┐
       │                                      │                 │      │                 │
       └──────────────────────────────────────│    Enhanced     │◀─────│   Tool Calls    │
                                              │    Response     │      │   (Jenkins,     │
                                              │    Formatter    │      │  Infrastructure │
                                              │                 │      │      etc.)      │
                                              └─────────────────┘      └─────────────────┘
```
Flow Description:
- Slack Input: User mentions the bot with a query in Slack
- Webhook Processing: Flask app receives the Slack event via webhook
- LangGraph Orchestration: The query is processed through a LangGraph workflow (see the sketch after this list)
- LLM Analysis: OpenAI GPT-4o-mini analyzes the request and determines required actions
- Tool Execution: Relevant tools are called (e.g., Jenkins jobs for infrastructure)
- Response Formatting: Results are formatted using LLM for better readability
- Slack Response: Pretty-formatted response is returned to the user in Slack thread
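To make the orchestration step concrete, here is a minimal sketch of what such a graph can look like using LangGraph's prebuilt tool routing. The imports from `config.llm` and `tools.make_infra`, and the formatter prompt, are assumptions based on the project structure described later, not the project's exact code:

```python
from typing import Annotated

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import ToolNode, tools_condition
from typing_extensions import TypedDict

from config.llm import llm, llm_with_tools    # assumed exports
from tools.make_infra import make_infra_tool  # assumed export


class State(TypedDict):
    # Conversation history; add_messages appends instead of overwriting
    messages: Annotated[list, add_messages]


def chatbot(state: State) -> dict:
    # The LLM either answers directly or emits tool calls
    return {"messages": [llm_with_tools.invoke(state["messages"])]}


def formatter(state: State) -> dict:
    # Second LLM pass: turn raw tool output into a Slack-friendly reply
    ask = "Rewrite the result above as a concise, well-formatted Slack message."
    return {"messages": [llm.invoke(state["messages"] + [("user", ask)])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_node("tools", ToolNode(tools=[make_infra_tool]))
builder.add_node("formatter", formatter)
builder.add_edge(START, "chatbot")
# Route to "tools" when the LLM emitted tool calls, otherwise to END
builder.add_conditional_edges("chatbot", tools_condition)
builder.add_edge("tools", "formatter")
builder.add_edge("formatter", END)
graph = builder.compile()
```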
- 🔧 Infrastructure Automation: Trigger Jenkins jobs for ARO infrastructure provisioning
- 💬 Natural Language Interface: Interact using plain English through Slack
- 🧠 Intelligent Processing: Powered by OpenAI GPT-4o-mini for understanding context
- 🔄 Workflow Orchestration: Uses LangGraph for complex multi-step operations
- 📝 Smart Formatting: Automatically formats responses with links, status icons, and structured data
- 🔒 Event Deduplication: Prevents duplicate processing of retried Slack events (see the sketch after this list)
- 🧵 Thread Support: Maintains conversation context in Slack threads
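Deduplication matters because Slack retries event delivery when a response is slow, reusing the same event ID. A minimal in-memory sketch of the idea (the bot's actual mechanism may differ):

```python
import time

# event_id -> first-seen timestamp; Slack retries reuse the same event_id
_seen_events: dict[str, float] = {}
_TTL_SECONDS = 300  # forget events after 5 minutes


def is_duplicate(event_id: str) -> bool:
    """Return True if this Slack event was already processed recently."""
    now = time.time()
    # Drop expired entries so the cache does not grow without bound
    for eid in [e for e, ts in _seen_events.items() if now - ts > _TTL_SECONDS]:
        del _seen_events[eid]
    if event_id in _seen_events:
        return True
    _seen_events[event_id] = now
    return False
```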
`make_infra`: Creates personal dev infrastructure by triggering Jenkins jobs
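Triggering a parameterized Jenkins job boils down to an authenticated POST against the job's `buildWithParameters` REST endpoint. A hedged sketch using `requests`; the function name and the `USER` parameter are illustrative, not the tool's actual contract:

```python
import os

import requests


def trigger_make_infra(user: str) -> str:
    """Queue the configured Jenkins job for the given user."""
    url = (
        f"{os.environ['JENKINS_URL']}/job/"
        f"{os.environ['JENKINS_JOB_NAME']}/buildWithParameters"
    )
    resp = requests.post(
        url,
        auth=(os.environ["JENKINS_USER_ID"], os.environ["JENKINS_TOKEN"]),
        params={"USER": user},  # illustrative parameter name
        timeout=30,
    )
    resp.raise_for_status()
    # Jenkins returns the queued item's URL in the Location header
    queue_url = resp.headers.get("Location", "")
    return f"Infrastructure build triggered for {user}: {queue_url}"
```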
- Python 3.12+
- Slack App with appropriate permissions
- OpenAI API access
- Jenkins server with API access
- `uv` package manager (recommended)
- Clone the repository:

```bash
git clone <repository-url>
cd aro-agent
```

- Install dependencies using `uv`:

```bash
uv sync
```

- Set up environment variables by creating a `.env` file in the project root:
```bash
# Slack Configuration
SLACK_BOT_TOKEN=xoxb-your-bot-token
SLACK_SIGNING_SECRET=your-signing-secret

# OpenAI Configuration
OPENAI_API_KEY=your-openai-api-key

# Jenkins Configuration
JENKINS_URL=https://your-jenkins-server.com
JENKINS_USER_ID=your-jenkins-username
JENKINS_TOKEN=your-jenkins-token
JENKINS_JOB_NAME=make-infra-job
```
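At startup these values are typically pulled into the process environment; a minimal sketch assuming `python-dotenv` (a common pattern, not confirmed by the project files):

```python
import os

from dotenv import load_dotenv

load_dotenv()  # read .env from the project root into the process environment

SLACK_BOT_TOKEN = os.environ["SLACK_BOT_TOKEN"]  # raises KeyError if missing
JENKINS_URL = os.environ["JENKINS_URL"]
```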
Slack app setup:
- Create a new Slack app at api.slack.com
- Enable Event Subscriptions and set the Request URL to `https://your-domain.com/slack/events`
- Subscribe to `app_mention` events
- Install the app to your workspace
- Add the bot to channels where you want to use it
Required bot token scopes:
- `app_mentions:read`: Listen for mentions
- `channels:history`: Read channel messages
- `chat:write`: Send messages
- `users:read`: Get user information
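The signing secret exists so the webhook can reject forged requests. A sketch of a `/slack/events` handler using `slack_sdk`'s `SignatureVerifier`, including Slack's one-time URL verification challenge; the actual handler in `agent.py` may differ:

```python
import os

from flask import Flask, abort, request
from slack_sdk.signature import SignatureVerifier

app = Flask(__name__)
verifier = SignatureVerifier(signing_secret=os.environ["SLACK_SIGNING_SECRET"])


@app.route("/slack/events", methods=["POST"])
def slack_events():
    # Reject requests whose signature does not match the signing secret
    if not verifier.is_valid_request(request.get_data(), request.headers):
        abort(401)
    payload = request.get_json()
    # Slack sends a one-time challenge when you first set the Request URL
    if payload.get("type") == "url_verification":
        return payload["challenge"]
    return "", 200
```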
- Start the agent:

```bash
uv run python agent.py
```

- Mention the bot in Slack:

```
@aro-agent Create the dev infra for me
```

- Get formatted responses: The bot will process your request, execute the necessary tools, and provide a nicely formatted response with relevant links and status information.
Infrastructure Creation:

```
User: @aro-agent I need to set up dev infrastructure for user 'supatil'

Bot: ✅ Infrastructure Build Triggered Successfully

     Job ID: `uuid-here`
     User: supatil
     Job Name: make-infra-job

     🔗 Build Link: View Build #123

     Infrastructure build triggered successfully for user: supatil
```
```
aro-agent/
├── agent.py             # Main Flask application and Slack event handler
├── utils.py             # Utility functions and response formatting
├── config/
│   ├── graph.py         # LangGraph workflow configuration
│   └── llm.py           # OpenAI LLM setup and tool binding
├── tools/
│   └── make_infra.py    # Infrastructure automation tools
├── pyproject.toml       # Project dependencies and metadata
├── .env                 # Environment variables (create this)
└── README.md            # This file
```
- `agent.py`: Core Flask app that handles Slack webhooks and orchestrates responses
- `config/graph.py`: Defines the LangGraph workflow with chatbot → tools → formatter flow
- `config/llm.py`: Initializes the OpenAI LLM with tool bindings
- `tools/make_infra.py`: Jenkins integration for infrastructure provisioning
- `utils.py`: Response formatting and state management
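The formatting step in `utils.py` amounts to a second LLM pass over raw tool output; a hedged sketch of the idea (function name and prompt wording are assumptions):

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # same model the bot uses elsewhere


def format_response(raw_result: str) -> str:
    """Rewrite raw tool output as a concise, Slack-friendly message."""
    prompt = (
        "Reformat the following tool output for Slack. Use status icons, "
        "bold labels, and keep any URLs as links:\n\n" + raw_result
    )
    return llm.invoke(prompt).content
```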
- Create a new tool in the `tools/` directory:

```python
from langchain.tools import StructuredTool
from pydantic import BaseModel


class YourToolInput(BaseModel):
    param: str


def your_new_tool(param: str) -> str:
    """Your tool description"""
    # Tool implementation
    return "result"


your_tool = StructuredTool.from_function(
    func=your_new_tool,
    name="your_tool_name",
    args_schema=YourToolInput,
)
```

- Register the tool in `config/llm.py`:

```python
from tools.your_tool import your_tool

llm_with_tools = llm.bind_tools([make_infra_tool, your_tool])
```

- Add the tool to the tool node in `config/graph.py`:

```python
tool_node = ToolNode(tools=[make_infra_tool, your_tool])
```

Development commands:

```bash
# Activate the project virtual environment created by `uv sync`
source .venv/bin/activate

# Run with debug mode
python agent.py
```