The AI Gateway provides comprehensive application monitoring through OpenTelemetry (OTel), enabling deep visibility into how the gateway processes requests, makes routing decisions, and performs under load. With built-in trace propagation and multiple exporter options, you can seamlessly integrate with your existing monitoring stack.

Getting Started

Why Monitor the AI Gateway?

AI Gateway application monitoring provides:

  • Routing decision visibility - See which providers were selected and why
  • Gateway performance tracking - Monitor request processing latency and throughput
  • System health insights - Track application health, errors, and resource usage

Built-in Instrumentation - The AI Gateway automatically instruments its own application performance with distributed tracing, metrics collection, and structured logging using OpenTelemetry.

For complete configuration options, see the Configuration Reference.

Telemetry Levels

The recommended telemetry level is info,ai_gateway=debug for development and info for production. The value is a log-filter directive: a default level optionally followed by per-target overrides, so info,ai_gateway=debug keeps dependencies at info while emitting debug detail from the gateway's own code.
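
For example, a production deployment would use the same telemetry block shown in the next section, but with the quieter level:

telemetry:
  level: "info" # Recommended for production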

Configuration Examples

Use case: Local development with console output for immediate debugging.

telemetry:
  level: "info,ai_gateway=debug" # Recommended for development
  otlp-endpoint: "http://localhost:4317"

Result: Pretty-printed logs to the console with full trace information. The otlp-endpoint points at the local OpenTelemetry Collector from the included Docker Compose stack (see below), so no hosted monitoring services are required.

Reference

Grafana Stack (Included)

Use the included Docker Compose setup for complete local observability:

# Start the full observability stack
cd infrastructure
docker-compose up -d

# Access services
open http://localhost:3010  # Grafana dashboard
open http://localhost:9090  # Prometheus metrics

Included services:

  • Grafana (port 3010) - Dashboards and visualization
  • Prometheus (port 9090) - Metrics storage and querying
  • Loki (port 3100) - Log aggregation and search
  • OpenTelemetry Collector (port 4317) - Telemetry ingestion

Pre-built Dashboard: The setup includes a comprehensive Grafana dashboard for AI Gateway monitoring that you can import into your own Grafana instance.
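
To show how these pieces fit together, the sketch below illustrates a typical collector pipeline: the gateway sends OTLP data to port 4317, and the collector fans metrics out to Prometheus and logs out to Loki. This is an illustrative example only; the exporter names, ports, and options in the configuration shipped under the infrastructure directory may differ.

receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317

exporters:
  prometheus:
    endpoint: 0.0.0.0:8889 # Scraped by the local Prometheus
  loki:
    endpoint: http://loki:3100/loki/api/v1/push

service:
  pipelines:
    metrics:
      receivers: [otlp]
      exporters: [prometheus]
    logs:
      receivers: [otlp]
      exporters: [loki]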

Custom Monitoring Stack

The AI Gateway can integrate with any OpenTelemetry-compatible monitoring solution. Point the otlp-endpoint setting in your configuration file at your existing OpenTelemetry Collector or compatible backend.
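
For example, a minimal configuration that ships telemetry to an existing collector looks like this (the hostname is a placeholder for your own collector's OTLP endpoint):

telemetry:
  level: "info"
  otlp-endpoint: "http://otel-collector.internal:4317" # Placeholder - use your collector's OTLP address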