
traceAI Logo

traceAI

OpenTelemetry-native instrumentation for AI applications
Standardized observability across LLMs, agents, and frameworks

License Python TypeScript OpenTelemetry

Documentation · Examples · Slack · PyPI Packages · npm Packages


🚀 What is traceAI?

traceAI provides drop-in OpenTelemetry instrumentation for popular AI frameworks and LLM providers. It automatically captures traces, spans, and attributes from your AI workflows—whether you're using OpenAI, Anthropic, LangChain, LlamaIndex, or 20+ other frameworks.

  • Zero-config tracing for OpenAI, Anthropic, LangChain, LlamaIndex, and more
  • OpenTelemetry-native — works with any OTel-compatible backend (Jaeger, Datadog, Future AGI, etc.)
  • Semantic conventions for LLM calls, agents, tools, and retrieval
  • Python + TypeScript support with consistent APIs


✨ Key Features

| Feature | Description |
| --- | --- |
| 🎯 Standardized Tracing | Maps AI workflows to consistent OpenTelemetry spans & attributes |
| 🔌 Zero-Config Setup | Drop-in instrumentation with minimal code changes |
| 🌐 Multi-Framework | 20+ integrations across Python & TypeScript |
| 📊 Vendor Agnostic | Works with any OpenTelemetry-compatible backend |
| 🔍 Rich Context | Captures prompts, completions, tokens, model params, tool calls, and more |
| Production Ready | Async support, streaming, error handling, and performance optimized |

🎯 Quickstart

Python Quickstart

1. Install

```bash
pip install traceai-openai
```

2. Instrument your application

```python
import os
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor
import openai

# Set up environment variables
os.environ["FI_API_KEY"] = "<your-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-secret-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-key>"

# Register tracer provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app"
)

# Instrument OpenAI
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use OpenAI as normal - tracing happens automatically!
response = openai.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}]
)
```

💡 Tip: Swap traceai-openai for any supported framework (e.g., traceai-langchain, traceai-anthropic)


TypeScript Quickstart

1. Install

```bash
npm install @traceai/openai @traceai/fi-core
```

2. Instrument your application

```typescript
import { register, ProjectType } from "@traceai/fi-core";
import { OpenAIInstrumentation } from "@traceai/openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import OpenAI from "openai";

// Register tracer provider
const tracerProvider = register({
  projectName: "my_ai_app",
  projectType: ProjectType.OBSERVE,
});

// Register OpenAI instrumentation (before creating client!)
registerInstrumentations({
  tracerProvider,
  instrumentations: [new OpenAIInstrumentation()],
});

// Use OpenAI as normal - tracing happens automatically!
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const response = await openai.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: "Hello!" }],
});
```

💡 Tip: Works with Anthropic, LangChain, Vercel AI SDK, and more TypeScript frameworks


📦 Supported Frameworks

Python

| Package | Description | Version |
| --- | --- | --- |
| traceAI-openai | traceAI Instrumentation for OpenAI. | PyPI |
| traceAI-anthropic | traceAI Instrumentation for Anthropic. | PyPI |
| traceAI-llamaindex | traceAI Instrumentation for LlamaIndex. | PyPI |
| traceAI-langchain | traceAI Instrumentation for LangChain. | PyPI |
| traceAI-mistralai | traceAI Instrumentation for MistralAI. | PyPI |
| traceAI-vertexai | traceAI Instrumentation for VertexAI. | PyPI |
| traceAI-crewai | traceAI Instrumentation for CrewAI. | PyPI |
| traceAI-haystack | traceAI Instrumentation for Haystack. | PyPI |
| traceAI-litellm | traceAI Instrumentation for liteLLM. | PyPI |
| traceAI-groq | traceAI Instrumentation for Groq. | PyPI |
| traceAI-autogen | traceAI Instrumentation for Autogen. | PyPI |
| traceAI-guardrails | traceAI Instrumentation for Guardrails. | PyPI |
| traceAI-openai-agents | traceAI Instrumentation for OpenAI Agents. | PyPI |
| traceAI-smolagents | traceAI Instrumentation for SmolAgents. | PyPI |
| traceAI-dspy | traceAI Instrumentation for DSPy. | PyPI |
| traceAI-bedrock | traceAI Instrumentation for AWS Bedrock. | PyPI |
| traceAI-instructor | traceAI Instrumentation for Instructor. | PyPI |
| traceAI-google-genai | traceAI Instrumentation for Google Generative AI. | PyPI |
| traceAI-google-adk | traceAI Instrumentation for Google ADK. | PyPI |
| traceAI-pipecat | traceAI Instrumentation for Pipecat. | PyPI |
| traceAI-portkey | traceAI Instrumentation for Portkey. | PyPI |
| traceAI-mcp | traceAI Instrumentation for Model Context Protocol. | PyPI |
| traceAI-livekit | traceAI Instrumentation for LiveKit (Real-time). | PyPI |
| traceAI-pinecone | traceAI Instrumentation for Pinecone vector database. | PyPI |
| traceAI-chromadb | traceAI Instrumentation for ChromaDB vector database. | PyPI |
| traceAI-qdrant | traceAI Instrumentation for Qdrant vector database. | PyPI |
| traceAI-weaviate | traceAI Instrumentation for Weaviate vector database. | PyPI |
| traceAI-milvus | traceAI Instrumentation for Milvus vector database. | PyPI |
| traceAI-lancedb | traceAI Instrumentation for LanceDB vector database. | PyPI |
| traceAI-mongodb | traceAI Instrumentation for MongoDB Atlas Vector Search. | PyPI |
| traceAI-pgvector | traceAI Instrumentation for pgvector PostgreSQL extension. | PyPI |
| traceAI-redis | traceAI Instrumentation for Redis Vector Search. | PyPI |

TypeScript

| Package | Description | Version |
| --- | --- | --- |
| @traceai/openai | traceAI Instrumentation for OpenAI. | npm |
| @traceai/anthropic | traceAI Instrumentation for Anthropic. | npm |
| @traceai/langchain | traceAI Instrumentation for LangChain. | npm |
| @traceai/llamaindex | traceAI Instrumentation for LlamaIndex. | npm |
| @traceai/bedrock | traceAI Instrumentation for AWS Bedrock. | npm |
| @traceai/vercel | traceAI Instrumentation for Vercel AI SDK. | npm |
| @traceai/mastra | traceAI Instrumentation for Mastra. | npm |
| @traceai/mcp | traceAI Instrumentation for Model Context Protocol. | npm |

🔧 Compatibility Matrix

| Category | Framework | Python | TypeScript |
| --- | --- | --- | --- |
| LLM Providers | OpenAI | ✅ | ✅ |
| | Anthropic | ✅ | ✅ |
| | AWS Bedrock | ✅ | ✅ |
| | Google Vertex AI | ✅ | - |
| | Google Generative AI | ✅ | - |
| | Mistral AI | ✅ | - |
| | Groq | ✅ | - |
| | LiteLLM | ✅ | - |
| Agent Frameworks | LangChain | ✅ | ✅ |
| | LlamaIndex | ✅ | ✅ |
| | CrewAI | ✅ | - |
| | AutoGen | ✅ | - |
| | OpenAI Agents | ✅ | - |
| | Smol Agents | ✅ | - |
| | Mastra | - | ✅ |
| Tools & Libraries | Haystack | ✅ | - |
| | DSPy | ✅ | - |
| | Guardrails AI | ✅ | - |
| | Instructor | ✅ | - |
| | Portkey | ✅ | - |
| | Pipecat | ✅ | - |
| | LiveKit | ✅ | - |
| | Vercel AI SDK | - | ✅ |
| Standards | Model Context Protocol (MCP) | ✅ | ✅ |
| Vector Databases | Pinecone | ✅ | - |
| | ChromaDB | ✅ | - |
| | Qdrant | ✅ | - |
| | Weaviate | ✅ | - |
| | Milvus | ✅ | - |
| | LanceDB | ✅ | - |
| | MongoDB Atlas Vector | ✅ | - |
| | pgvector | ✅ | - |
| | Redis Vector | ✅ | - |

Legend: ✅ Supported | - Not yet available


🏗️ Architecture

traceAI is built on top of OpenTelemetry and follows standard OTel instrumentation patterns. This means you get:

🔌 Full OpenTelemetry Compatibility

  • Works with any OTel-compatible backend
  • Standard OTLP exporters (HTTP/gRPC)
  • Compatible with existing OTel setups

⚙️ Bring Your Own Configuration

You can use traceAI with your own OpenTelemetry setup:

Python: Custom TracerProvider & Exporters

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceai_openai import OpenAIInstrumentor

# Set up your own tracer provider
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)

# Add custom exporters (example with Future AGI)
# HTTP endpoint
otlp_exporter = OTLPSpanExporter(
    endpoint="https://api.futureagi.com/tracer/v1/traces",
    headers={
        "X-API-KEY": "your-api-key",
        "X-SECRET-KEY": "your-secret-key"
    }
)
# Or use gRPC: OTLPSpanExporter(endpoint="grpc://grpc.futureagi.com:443", ...)
tracer_provider.add_span_processor(BatchSpanProcessor(otlp_exporter))

# Instrument with traceAI
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```
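As an alternative to hard-coding the endpoint and headers, the standard OTLP exporters also honor the OpenTelemetry environment variables `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT` and `OTEL_EXPORTER_OTLP_HEADERS`. A minimal sketch of that convention (the endpoint is the one from the example above; `parse_otlp_headers` is an illustrative helper showing how the comma-separated `key=value` header string maps to a headers dict, not a traceAI API):

```python
import os

# Standard OTel env vars, read by OTLPSpanExporter when no explicit
# endpoint/headers are passed to its constructor.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "https://api.futureagi.com/tracer/v1/traces"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "X-API-KEY=your-api-key,X-SECRET-KEY=your-secret-key"

def parse_otlp_headers(raw: str) -> dict:
    """Parse the comma-separated key=value format used by OTEL_EXPORTER_OTLP_HEADERS."""
    return dict(pair.split("=", 1) for pair in raw.split(",") if pair)

headers = parse_otlp_headers(os.environ["OTEL_EXPORTER_OTLP_HEADERS"])
print(headers["X-API-KEY"])  # your-api-key
```

This keeps credentials out of source code and lets the same build point at different backends per environment.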
TypeScript: Custom TracerProvider, Span Processors & Headers

```typescript
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@traceai/openai";

// Create custom tracer provider
const provider = new NodeTracerProvider({
  resource: new Resource({
    "service.name": "my-ai-service",
  }),
});

// Add custom OTLP exporter with headers (example with Future AGI)
// HTTP endpoint
const exporter = new OTLPTraceExporter({
  url: "https://api.futureagi.com/tracer/v1/traces",
  headers: {
    "X-API-KEY": process.env.FI_API_KEY!,
    "X-SECRET-KEY": process.env.FI_SECRET_KEY!,
  },
});
// Or use gRPC: new OTLPTraceExporter({ url: "grpc://grpc.futureagi.com:443", ... })

// Add span processor
provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

// Register traceAI instrumentation
registerInstrumentations({
  tracerProvider: provider,
  instrumentations: [new OpenAIInstrumentation()],
});
```

📊 What Gets Captured

traceAI automatically captures rich telemetry data:

  • Prompts & Completions: Full request/response content
  • Token Usage: Input, output, and total tokens
  • Model Parameters: Temperature, top_p, max_tokens, etc.
  • Tool Calls: Function/tool names, arguments, and results
  • Streaming: Individual chunks with delta tracking
  • Errors: Detailed error context and stack traces
  • Timing: Latency at each step of the AI workflow

All data follows OpenTelemetry Semantic Conventions for GenAI.
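To make that mapping concrete, here is a hedged sketch of the kind of attributes one chat-completion span might carry, using attribute names from the OpenTelemetry GenAI semantic conventions (the exact keys and values vary by convention version and instrumentation; the numbers are illustrative):

```python
# Representative span attributes for a single chat completion.
# Keys follow the OTel GenAI semantic conventions; values are illustrative.
span_attributes = {
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4.1",
    "gen_ai.request.temperature": 0.7,
    "gen_ai.usage.input_tokens": 12,
    "gen_ai.usage.output_tokens": 48,
}

# Total token usage is just the sum of input and output tokens.
total_tokens = (
    span_attributes["gen_ai.usage.input_tokens"]
    + span_attributes["gen_ai.usage.output_tokens"]
)
print(total_tokens)  # 60
```

Because these are ordinary span attributes, any OTel backend can filter, aggregate, or alert on them without traceAI-specific tooling.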


🗺️ Roadmap

See our ROADMAP.md for planned features and integrations:

Coming Soon

| Category | Integrations |
| --- | --- |
| LLM Providers | Ollama, Cohere, Together AI, HuggingFace |
| Agent Frameworks | LangGraph, Claude Agent SDK, Pydantic AI |
| Enterprise SDKs | Java (LangChain4j, Spring AI), Go |
| Platform Features | LLM Playground, Datasets & Experiments |

Recently Added

| Category | Integrations |
| --- | --- |
| Vector DBs | Pinecone, ChromaDB, Qdrant, Weaviate, Milvus, LanceDB, MongoDB Atlas, pgvector, Redis |

Technical Documentation

For implementation details and architecture, see the 📚 Knowledge Base listed under Resources below.


🤝 Contributing

We welcome contributions from the community!

📖 Read our Contributing Guide for detailed instructions on:

  • Setting up your development environment (Python & TypeScript)
  • Running tests and code quality checks
  • Submitting pull requests
  • Adding new framework integrations

Quick Start:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Found a bug? Please open an issue with:

  • Framework version
  • traceAI version
  • Minimal reproduction code
  • Expected vs actual behavior

Request a feature? Please open an issue with:

  • Use case or problem you're trying to solve
  • Proposed solution or feature description
  • Any relevant examples or mockups
  • Priority level (nice-to-have vs critical)

📚 Resources

| Resource | Description |
| --- | --- |
| 🌐 Website | Learn more about Future AGI |
| 📖 Documentation | Complete guides and API reference |
| 👨‍🍳 Cookbooks | Step-by-step implementation examples |
| 🗺️ Roadmap | Planned features and integrations |
| 📝 Changelog | All release notes and updates |
| 📚 Knowledge Base | Technical documentation and architecture |
| 🤝 Contributing Guide | How to contribute to traceAI |
| 💬 Slack | Join our community |
| 🐛 Issues | Report bugs or request features |

🌍 Connect With Us

Website · LinkedIn · Twitter · Reddit · Substack


Built with ❤️ by the Future AGI team

⭐ Star us on GitHub | 🐛 Report Bug | 💡 Request Feature