18 changes: 18 additions & 0 deletions sdk/guides/observability.mdx
---
title: Observability & Tracing
description: Enable OpenTelemetry tracing to monitor and debug your agent's execution with tools like Laminar, MLflow, Honeycomb, or any OTLP-compatible backend.

---

> A full setup example is available [here](#example-full-setup)!
- **[Laminar](https://laminar.sh/)** - AI-focused observability with browser session replay support
- **[MLflow](https://mlflow.org/)** - Open-source AI platform with tracing, evaluation, and LLM governance
- **[Honeycomb](https://www.honeycomb.io/)** - High-performance distributed tracing
- **Any OTLP-compatible backend** - Including Jaeger, Datadog, New Relic, and more

The SDK automatically traces:
- Agent execution steps

That's it! Run your agent code normally and traces will be sent to Laminar automatically.

For **self-hosted Laminar** deployments, you can also configure custom ports:

```bash icon="terminal" wrap
export LMNR_PROJECT_API_KEY="your-laminar-api-key"
export LMNR_HTTP_PORT=8000
export LMNR_GRPC_PORT=8001
```

### Using OpenTelemetry (OTLP) Backends

For OpenTelemetry (OTLP) compatible backends, set the following environment variables:
| Variable | Description | Example |
|----------|-------------|---------|
| `LMNR_PROJECT_API_KEY` | Laminar project API key | `your-laminar-api-key` |
| `LMNR_HTTP_PORT` | HTTP port for self-hosted Laminar | `8000` |
| `LMNR_GRPC_PORT` | gRPC port for self-hosted Laminar | `8001` |
| `OTEL_EXPORTER_OTLP_TRACES_ENDPOINT` | Full OTLP traces endpoint URL | `https://api.honeycomb.io:443/v1/traces` |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | Base OTLP endpoint (traces path appended) | `http://localhost:4317` |
| `OTEL_ENDPOINT` | Short form endpoint | `http://localhost:4317` |
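As an illustrative sketch (the collector choice here is an assumption, not from this guide): pointing the SDK at a local Jaeger instance, which accepts OTLP over gRPC on port 4317 by default, needs only the base endpoint and a protocol:

```bash icon="terminal" wrap
# Assumed local Jaeger collector; 4317 is Jaeger's default OTLP gRPC port.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="grpc"
```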

The SDK supports both HTTP and gRPC protocols:

- **`http/protobuf`** or **`otlp_http`** - HTTP with protobuf encoding (recommended for most backends)
- **`grpc`** or **`otlp_grpc`** - gRPC with protobuf encoding (use only if your backend supports gRPC)
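For instance, to pin the recommended encoding explicitly rather than rely on a default (a sketch; substitute `grpc` only if your backend supports it):

```bash icon="terminal" wrap
# Select HTTP with protobuf encoding explicitly.
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
```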

## Platform-Specific Configuration

### Laminar Setup

```bash icon="terminal" wrap
export LMNR_PROJECT_API_KEY="your-laminar-api-key"
```

**Self-Hosted Laminar**: If you are running a self-hosted Laminar instance, you can configure the HTTP and gRPC ports via environment variables:

```bash icon="terminal" wrap
export LMNR_PROJECT_API_KEY="your-laminar-api-key"
export LMNR_HTTP_PORT=8000
export LMNR_GRPC_PORT=8001
```

**Browser Session Replay**: When using Laminar with browser-use tools, session replays are automatically captured, allowing you to see exactly what the browser automation did.

### MLflow Setup

[MLflow](https://mlflow.org/) is an open-source AI platform that accepts OpenTelemetry traces out of the box, alongside evaluation and LLM governance capabilities.

1. Start your MLflow tracking server:

```bash icon="terminal" wrap
uvx mlflow server
```

<Note>
For other deployment options (pip, Docker Compose, etc.), see [Set Up MLflow Server](https://mlflow.org/docs/latest/genai/getting-started/connect-environment/).

</Note>

2. Configure the environment variables:

```bash icon="terminal" wrap
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
```

Navigate to the MLflow UI (e.g., `http://localhost:5000`), select the experiment, and open the **Traces** tab to view the recorded traces.

### Honeycomb Setup


**Solutions**:
- Tracing has minimal overhead when properly configured
- Disable tracing in development by unsetting environment variables
- Use asynchronous exporters (default in most OTLP configurations)
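Disabling tracing for a local run can be as simple as clearing the variables from the reference table (a sketch; unset whichever ones your shell actually exports):

```bash icon="terminal" wrap
# Clear the tracing-related variables so no exporter is configured.
unset LMNR_PROJECT_API_KEY
unset OTEL_EXPORTER_OTLP_TRACES_ENDPOINT OTEL_EXPORTER_OTLP_ENDPOINT OTEL_ENDPOINT
```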

## Example: Full Setup
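A minimal end-to-end configuration, sketched under stated assumptions: Laminar cloud as the backend, and `agent.py` as a hypothetical stand-in for your own entry point.

```bash icon="terminal" wrap
# Hypothetical quick start: "your-laminar-api-key" and agent.py are
# placeholders, not real values.
export LMNR_PROJECT_API_KEY="your-laminar-api-key"

# Then run your agent entry point as usual, e.g.:
# python agent.py
```

With the key set, traces are sent to Laminar automatically, as described above.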
## Next Steps

- **[Metrics Tracking](/sdk/guides/metrics)** - Monitor token usage and costs alongside traces
- **[LLM Registry](/sdk/guides/llm-registry)** - Track multiple LLMs used in your application
- **[Security](/sdk/guides/security)** - Add security validation to your traced agent executions