ARO Agent 🤖

An intelligent Slack bot for Azure Red Hat OpenShift (ARO) HCP developers that leverages LangGraph and OpenAI to automate infrastructure tasks and provide DevOps assistance through natural language interactions.

🏗️ Architecture

┌─────────────┐    ┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│             │    │                 │    │                 │    │                 │
│    Slack    │───▶│   Flask App     │───▶│   LangGraph     │───▶│    LLM Tools    │
│             │    │  (Webhook)      │    │   Workflow      │    │  (OpenAI GPT)   │
│   @mention  │    │                 │    │                 │    │                 │
└─────────────┘    └─────────────────┘    └─────────────────┘    └─────────────────┘
       ▲                                           │                       │
       │                                           ▼                       ▼
       │                                  ┌─────────────────┐    ┌─────────────────┐
       │                                  │                 │    │                 │
       └──────────────────────────────────│  Enhanced       │◀───│   Tool Calls    │
                                          │  Response       │    │   (Jenkins,     │
                                          │  Formatter      │    │   Infrastructure│
                                          │                 │    │   etc.)         │
                                          └─────────────────┘    └─────────────────┘

Flow Description:

  1. Slack Input: User mentions the bot with a query in Slack
  2. Webhook Processing: Flask app receives the Slack event via webhook
  3. LangGraph Orchestration: Query is processed through a LangGraph workflow
  4. LLM Analysis: OpenAI GPT-4o-mini analyzes the request and determines required actions
  5. Tool Execution: Relevant tools are called (e.g., Jenkins jobs for infrastructure)
  6. Response Formatting: Results are formatted using LLM for better readability
  7. Slack Response: Pretty-formatted response is returned to the user in Slack thread
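Steps 1–2 of the flow can be sketched as a plain payload dispatcher. This is a minimal, hypothetical sketch (the actual handler lives in agent.py and may differ); it covers Slack's url_verification handshake and the app_mention events the bot subscribes to, with the LangGraph workflow stubbed out:

```python
def run_workflow(text: str, channel: str, thread_ts: str) -> None:
    """Placeholder for the LangGraph pipeline described above."""
    print(f"workflow: {text!r} in {channel} (thread {thread_ts})")

def handle_slack_payload(payload: dict) -> dict:
    """Minimal dispatcher for Slack Events API payloads.

    Returns the response body the webhook should send back to Slack.
    """
    # Slack verifies the endpoint once by sending a challenge to echo back.
    if payload.get("type") == "url_verification":
        return {"challenge": payload["challenge"]}

    # Normal events arrive wrapped in an event_callback envelope.
    if payload.get("type") == "event_callback":
        event = payload.get("event", {})
        if event.get("type") == "app_mention":
            # Hand the mention to the workflow; reply in the same thread.
            run_workflow(text=event.get("text", ""),
                         channel=event.get("channel"),
                         thread_ts=event.get("thread_ts") or event.get("ts"))
    # Slack only needs a fast 200 OK; real work happens after acknowledging.
    return {"ok": True}
```

In the real Flask app this function body would sit behind the `/slack/events` route configured below.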

✨ Features

  • 🔧 Infrastructure Automation: Trigger Jenkins jobs for ARO infrastructure provisioning
  • 💬 Natural Language Interface: Interact using plain English through Slack
  • 🧠 Intelligent Processing: Powered by OpenAI GPT-4o-mini for understanding context
  • 🔄 Workflow Orchestration: Uses LangGraph for complex multi-step operations
  • 📝 Smart Formatting: Automatically formats responses with links, status icons, and structured data
  • 🔒 Event Deduplication: Prevents duplicate processing of Slack events
  • 🧵 Thread Support: Maintains conversation context in Slack threads
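Slack redelivers events when a webhook is slow to acknowledge, which is why the event-deduplication feature matters. A stdlib-only sketch of one common approach (a TTL map keyed by Slack's event_id; this is an illustrative helper, not the project's actual implementation):

```python
import time
from typing import Dict, Optional

_SEEN: Dict[str, float] = {}   # event_id -> time first seen
TTL_SECONDS = 300              # Slack retries arrive within a few minutes

def is_duplicate(event_id: str, now: Optional[float] = None) -> bool:
    """Return True if this Slack event_id was already processed recently.

    The handler should drop any event_id it has seen inside the TTL window
    so retried deliveries don't trigger the workflow twice.
    """
    now = time.time() if now is None else now
    # Evict entries older than the TTL so the map stays small.
    for eid in [e for e, t in _SEEN.items() if now - t > TTL_SECONDS]:
        del _SEEN[eid]
    if event_id in _SEEN:
        return True
    _SEEN[event_id] = now
    return False
```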

🛠️ Tools Available

Infrastructure Tools

  • make_infra: Creates personal dev infrastructure by triggering Jenkins jobs
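The actual implementation of make_infra lives in tools/make_infra.py; as an illustration, triggering a parameterized Jenkins job boils down to one authenticated POST against Jenkins' standard buildWithParameters endpoint. A stdlib-only sketch, where the `USER` parameter name is an assumption (check the real job definition):

```python
import base64
import os
import urllib.request

def build_trigger_url(base: str, job: str, user: str) -> str:
    """Build the buildWithParameters URL for a parameterized Jenkins job."""
    return f"{base}/job/{job}/buildWithParameters?USER={user}"

def trigger_jenkins_build(user: str) -> int:
    """POST to Jenkins and return the HTTP status (201 Created on success).

    Reads the JENKINS_* variables documented in the Installation section.
    """
    url = build_trigger_url(os.environ["JENKINS_URL"],
                            os.environ["JENKINS_JOB_NAME"], user)
    # Jenkins API tokens are sent via HTTP basic auth.
    creds = f"{os.environ['JENKINS_USER_ID']}:{os.environ['JENKINS_TOKEN']}"
    req = urllib.request.Request(url, data=b"", method="POST")
    req.add_header("Authorization",
                   "Basic " + base64.b64encode(creds.encode()).decode())
    with urllib.request.urlopen(req) as resp:
        # Jenkins also returns a Location header pointing at the queue item.
        return resp.status
```

Depending on the Jenkins configuration, a CSRF crumb header may also be required.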

📋 Prerequisites

  • Python 3.12+
  • Slack App with appropriate permissions
  • OpenAI API access
  • Jenkins server with API access
  • UV package manager (recommended)

🚀 Installation

  1. Clone the repository:
git clone <repository-url>
cd aro-agent
  2. Install dependencies using UV:
uv sync
  3. Set up environment variables: Create a .env file in the project root:
# Slack Configuration
SLACK_BOT_TOKEN=xoxb-your-bot-token
SLACK_SIGNING_SECRET=your-signing-secret

# OpenAI Configuration  
OPENAI_API_KEY=your-openai-api-key

# Jenkins Configuration
JENKINS_URL=https://your-jenkins-server.com
JENKINS_USER_ID=your-jenkins-username
JENKINS_TOKEN=your-jenkins-token
JENKINS_JOB_NAME=make-infra-job
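At startup these variables need to be read into the process environment, typically with python-dotenv's load_dotenv(). For illustration, a stdlib-only stand-in that parses simple KEY=VALUE lines (a hypothetical helper, not the project's loader):

```python
import os

def load_dotenv_minimal(path: str = ".env") -> None:
    """Parse simple KEY=VALUE lines into os.environ.

    A stripped-down stand-in for python-dotenv's load_dotenv(): no quoting
    or variable expansion, and existing environment variables win.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines, comments, and anything without a "=".
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```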

⚙️ Configuration

Slack App Setup

  1. Create a new Slack app at api.slack.com
  2. Enable Event Subscriptions and set Request URL to: https://your-domain.com/slack/events
  3. Subscribe to app_mention events
  4. Install the app to your workspace
  5. Add the bot to channels where you want to use it
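Slack signs every delivery to the Request URL with the app's signing secret, and the webhook should verify that signature before processing. The project may rely on slack_sdk's SignatureVerifier; the documented v0 scheme itself is a short HMAC check, sketched here with the stdlib:

```python
import hashlib
import hmac

def verify_slack_signature(signing_secret: str, timestamp: str,
                           body: str, signature: str) -> bool:
    """Check Slack's X-Slack-Signature header against a recomputed HMAC.

    Per Slack's signing scheme: HMAC-SHA256 the string "v0:{timestamp}:{body}"
    with the signing secret and compare to the header value ("v0=<hexdigest>").
    """
    basestring = f"v0:{timestamp}:{body}"
    digest = hmac.new(signing_secret.encode(), basestring.encode(),
                      hashlib.sha256).hexdigest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(f"v0={digest}", signature)
```

A production handler should also reject requests whose X-Slack-Request-Timestamp is stale (e.g. older than five minutes) to block replays.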

Required Slack Permissions

  • app_mentions:read - Listen for mentions
  • channels:history - Read channel messages
  • chat:write - Send messages
  • users:read - Get user information

🎯 Usage

Basic Usage

  1. Start the agent:
uv run python agent.py
  2. Mention the bot in Slack:
@aro-agent Create the dev infra for me
  3. Get formatted responses: The bot will process your request, execute the necessary tools, and reply with a nicely formatted response including relevant links and status information.

Example Interactions

Infrastructure Creation:

User: @aro-agent I need to set up dev infrastructure for user 'supatil'

Bot: ✅ Infrastructure Build Triggered Successfully

Job ID: `uuid-here`
User: supatil
Job Name: make-infra-job
🔗 Build Link: View Build #123
Infrastructure build triggered successfully for user: supatil

📁 Project Structure

aro-agent/
├── agent.py              # Main Flask application and Slack event handler
├── utils.py              # Utility functions and response formatting
├── config/
│   ├── graph.py          # LangGraph workflow configuration
│   └── llm.py           # OpenAI LLM setup and tool binding
├── tools/
│   └── make_infra.py    # Infrastructure automation tools
├── pyproject.toml       # Project dependencies and metadata
├── .env                 # Environment variables (create this)
└── README.md           # This file

Key Components

  • agent.py: Core Flask app that handles Slack webhooks and orchestrates responses
  • config/graph.py: Defines the LangGraph workflow with chatbot → tools → formatter flow
  • config/llm.py: Initializes OpenAI LLM with tool bindings
  • tools/make_infra.py: Jenkins integration for infrastructure provisioning
  • utils.py: Response formatting and state management

🔧 Development

Adding New Tools

  1. Create a new tool in the tools/ directory:
from langchain.tools import StructuredTool

def your_new_tool(param: str) -> str:
    """Your tool description"""
    # Tool implementation
    return "result"

your_tool = StructuredTool.from_function(
    func=your_new_tool,
    name="your_tool_name"
)
  2. Register the tool in config/llm.py:
from tools.your_tool import your_tool
llm_with_tools = llm.bind_tools([make_infra_tool, your_tool])
  3. Add the tool to the tool node in config/graph.py:
tool_node = ToolNode(tools=[make_infra_tool, your_tool])

Running in Development

# Activate the UV-managed virtual environment
source .venv/bin/activate

# Run with debug mode
python agent.py
