Documentation

Everything you need to instrument, monitor, and debug your AI agents

Getting Started

1. Install the SDK

Choose your preferred language and install the SDK:

Python

pip install agenttrace-sdk

TypeScript/JavaScript

# Not yet published - install from source
npm install github:agenttrace/agenttrace#packages/sdk-typescript

2. Set Your API Key

Get your API key from the dashboard and set it as an environment variable:

export AGENTTRACE_API_KEY="your-api-key-here"

3. Instrument Your Agent

Add tracing to your agent with a simple decorator:

from agenttrace import trace_agent

@trace_agent(project="my-agent")
async def my_agent(query: str):
    # Your agent logic here
    response = await llm.generate(query)
    return response

Python SDK

Basic Usage

from agenttrace import AgentTrace

# Initialize the client
client = AgentTrace(
    api_key="your-api-key",
    project="my-agent"
)

# Create a trace
with client.trace("user-query") as trace:
    # Add steps
    trace.step("thinking", metadata={"input": "Hello"})

    # Add LLM calls
    trace.llm_call(
        model="gpt-4",
        prompt="Hello",
        response="Hi there!",
        tokens=15,
        cost=0.002
    )

    trace.step("responding", metadata={"output": "Hi there!"})
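The context-manager pattern above can be sketched with a minimal stand-in that collects steps and LLM calls while the block runs and hands them off when it exits. The `_Trace` and `_Client` classes here are illustrative only, not the SDK's real implementation.

```python
from contextlib import contextmanager

class _Trace:
    """Minimal stand-in: collects the events recorded inside one trace."""
    def __init__(self, name):
        self.name = name
        self.events = []

    def step(self, name, metadata=None):
        self.events.append({"type": "step", "name": name,
                            "metadata": metadata or {}})

    def llm_call(self, **fields):
        self.events.append({"type": "llm_call", **fields})

class _Client:
    """Minimal stand-in client: 'sends' finished traces to a local list."""
    def __init__(self, project):
        self.project = project
        self.sent = []

    @contextmanager
    def trace(self, name):
        t = _Trace(name)
        try:
            yield t
        finally:
            # On block exit the collected events would go to the backend.
            self.sent.append(t)

client = _Client(project="my-agent")
with client.trace("user-query") as trace:
    trace.step("thinking", metadata={"input": "Hello"})
    trace.llm_call(model="gpt-4", prompt="Hello", response="Hi there!",
                   tokens=15, cost=0.002)
```

Using `finally` means the trace is delivered even if the body raises, which is what lets errors inside the block still show up in the dashboard.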

Advanced Features

  • Automatic token counting
  • Cost calculation for major LLM providers
  • Async support
  • Error tracking and stack traces
  • Custom metadata and tags
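As an example of what cost calculation involves, a per-model price table turns token counts into a dollar figure. The prices below are illustrative placeholders, not AgentTrace's actual rates.

```python
# Illustrative price table: dollars per 1,000 tokens (placeholder values).
PRICES_PER_1K = {
    "gpt-4": {"input": 0.03, "output": 0.06},
    "gpt-3.5-turbo": {"input": 0.0015, "output": 0.002},
}

def estimate_cost(model, input_tokens, output_tokens):
    """Estimate the cost of one LLM call from its token counts."""
    p = PRICES_PER_1K[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1000
```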

Configuration

client = AgentTrace(
    api_key="your-api-key",
    project="my-agent",
    api_url="https://api.agenttrace.io",  # Optional: custom API URL
    enable_local_logging=True,             # Optional: log locally
    batch_size=10,                         # Optional: batch size
    flush_interval=5                       # Optional: flush interval (seconds)
)
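The `batch_size` and `flush_interval` options imply a client that buffers events and sends them either when the buffer fills or when the interval elapses. A minimal sketch of that behavior (synchronous, and not the SDK's actual code):

```python
import time

class BatchingSender:
    """Buffer events; flush when batch_size is hit or flush_interval elapses."""
    def __init__(self, batch_size=10, flush_interval=5.0, clock=time.monotonic):
        self.batch_size = batch_size
        self.flush_interval = flush_interval
        self.clock = clock
        self.buffer = []
        self.flushed = []  # batches that would have been sent over HTTP
        self.last_flush = clock()

    def add(self, event):
        self.buffer.append(event)
        if (len(self.buffer) >= self.batch_size or
                self.clock() - self.last_flush >= self.flush_interval):
            self.flush()

    def flush(self):
        if self.buffer:
            self.flushed.append(self.buffer)
            self.buffer = []
        self.last_flush = self.clock()

sender = BatchingSender(batch_size=3, flush_interval=5.0)
for i in range(7):
    sender.add({"step": i})
# Two full batches of 3 are flushed; the 7th event stays buffered.
```

Batching trades a little latency for far fewer HTTP requests, which is why a final `flush()` on shutdown matters: otherwise the tail of the buffer would be lost.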

TypeScript SDK

Basic Usage

import { AgentTrace } from '@agenttrace/sdk';

// Initialize the client
const client = new AgentTrace({
  apiKey: 'your-api-key',
  project: 'my-agent'
});

// Create a trace
const trace = await client.startTrace('user-query');

// Add steps
await trace.addStep('thinking', { input: 'Hello' });

// Add LLM calls
await trace.addLLMCall({
  model: 'gpt-4',
  prompt: 'Hello',
  response: 'Hi there!',
  tokens: 15,
  cost: 0.002
});

// End the trace
await trace.end();

Using Decorators

import { traceAgent } from '@agenttrace/sdk';

class MyAgent {
  @traceAgent({ project: 'my-agent' })
  async processQuery(query: string) {
    // Your agent logic here
    const response = await this.llm.generate(query);
    return response;
  }
}

API Reference

AgentTrace provides a RESTful API for all operations. Base URL: https://api.agenttrace.io

Authentication

All API requests require an API key in the Authorization header:

curl -H "Authorization: Bearer YOUR_API_KEY" \
  https://api.agenttrace.io/v1/traces

Endpoints

POST /v1/traces

Create a new trace

GET /v1/traces

List all traces for a project

GET /v1/traces/:id

Get a specific trace by ID

POST /v1/traces/:id/steps

Add a step to an existing trace
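Putting the endpoints together, creating a trace and then adding a step to it are two POST requests. The sketch below only assembles the requests without sending them; the JSON field names and the `abc123` trace ID are assumptions for illustration, not the documented schema.

```python
import json

API_URL = "https://api.agenttrace.io"

def build_request(method, path, api_key, body=None):
    """Assemble the components of an AgentTrace API request."""
    return {
        "method": method,
        "url": f"{API_URL}{path}",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(body) if body is not None else None,
    }

# Create a trace, then add a step to it (field names are illustrative).
create = build_request("POST", "/v1/traces", "your-api-key",
                       {"name": "user-query", "project": "my-agent"})
step = build_request("POST", "/v1/traces/abc123/steps", "your-api-key",
                     {"name": "thinking", "metadata": {"input": "Hello"}})
```

The same `Authorization: Bearer` header shown in the curl example above is attached to every request.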

Integrations

Integrations with popular AI frameworks are planned:

🦜LangChain
Planned
🚢CrewAI
Planned
🤖AutoGPT
Planned
🦙LlamaIndex
Planned

Need Help?

Join our community or reach out to our support team