Quickly start capturing telemetry data from your generative AI apps. After installation and configuration, follow the Axiom AI engineering workflow to create, measure, observe, and iterate on your capabilities.
This page explains how to set up instrumentation with Axiom AI SDK and helps you choose the right instrumentation approach for your needs.
Axiom offers the following approaches to capture generative AI telemetry:
| Instrumentation approach | Language support | Characteristics |
| --- | --- | --- |
| Axiom AI SDK | TypeScript | Quick setup. Minimal code changes. |
| Manual | Any | More involved setup. Full control over instrumentation. |
Instrumentation with Axiom AI SDK is the right choice for you if you have a TypeScript app and you want the SDK to capture and send traces with the correct semantic conventions.

Manual instrumentation is the right choice for you if you want to use your own tooling or if you use a language other than TypeScript. You need to instrument your app manually to emit traces compatible with Axiom’s AI engineering features.

Both approaches emit identical attributes. This means that all the telemetry analysis features work the same way.
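As a rough illustration of what "identical attributes" means in practice, a span produced by either approach carries OpenTelemetry GenAI semantic-convention attributes. The values below, and the exact set of attributes shown, are made up for this example; the attributes a real span carries depend on the model call being instrumented:

```typescript
// Example span attributes following the OpenTelemetry GenAI semantic
// conventions. Values here are illustrative placeholders.
const exampleSpanAttributes: Record<string, string | number> = {
  'gen_ai.operation.name': 'chat',       // kind of GenAI operation
  'gen_ai.request.model': 'gpt-4o-mini', // model requested by the app
  'gen_ai.usage.input_tokens': 412,      // prompt tokens consumed
  'gen_ai.usage.output_tokens': 88,      // completion tokens produced
};

console.log(Object.keys(exampleSpanAttributes).length); // 4
```

Because both approaches emit the same attribute names, queries and dashboards built on these fields work regardless of how the telemetry was produced.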

Prerequisites

  • An Axiom dataset where you send your telemetry data.
  • An Axiom API token with permission to ingest data into that dataset.
  • A TypeScript project running in a Node.js environment.

Install

Install Axiom AI SDK into your TypeScript project:
pnpm i axiom
The axiom package includes the axiom command-line interface (CLI) for managing your AI assets, which will be used in later stages of the Axiom AI engineering workflow.

Configure tracer

To send data to Axiom, configure a tracer. For example, use a dedicated instrumentation file and load it before the rest of your app. An example configuration for a Node.js environment:
  1. Install dependencies:
    pnpm i \
      dotenv \
      @opentelemetry/exporter-trace-otlp-http \
      @opentelemetry/resources \
      @opentelemetry/sdk-trace-node \
      @opentelemetry/semantic-conventions \
      @opentelemetry/api
    
  2. Create instrumentation file:
    /src/instrumentation.ts
    
    import 'dotenv/config'; // Make sure to load environment variables
    import { OTLPTraceExporter } from '@opentelemetry/exporter-trace-otlp-http';
    import { resourceFromAttributes } from '@opentelemetry/resources';
    import { NodeTracerProvider } from '@opentelemetry/sdk-trace-node';
    import { SimpleSpanProcessor } from '@opentelemetry/sdk-trace-node';
    import { ATTR_SERVICE_NAME } from '@opentelemetry/semantic-conventions';
    import { trace } from '@opentelemetry/api';
    import { initAxiomAI, RedactionPolicy } from 'axiom/ai';
    
    const tracer = trace.getTracer('my-tracer');
    
    // Configure the provider to export traces to your Axiom dataset
    const provider = new NodeTracerProvider({
      resource: resourceFromAttributes({
        [ATTR_SERVICE_NAME]: 'my-ai-app', // Replace with your service name
      },
      {
        // Use the latest schema version
        // Info: https://opentelemetry.io/docs/specs/semconv/
        schemaUrl: 'https://opentelemetry.io/schemas/1.37.0',
      }),
      spanProcessor: new SimpleSpanProcessor(
        new OTLPTraceExporter({
          url: `https://api.axiom.co/v1/traces`,
          headers: {
            Authorization: `Bearer ${process.env.AXIOM_TOKEN}`,
            'X-Axiom-Dataset': process.env.AXIOM_DATASET!,
          },
        })
      ),
    });
    
    // Register the provider
    provider.register();
    
    // Initialize Axiom AI SDK with the configured tracer
    initAxiomAI({ tracer, redactionPolicy: RedactionPolicy.AxiomDefault });
    
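To load the instrumentation file before the rest of your app, one option is Node's `--import` flag in your start script. The sketch below assumes your project compiles TypeScript to `dist/` and runs on Node.js 20.6 or later, where `--import` is available; adjust paths and script names to your setup:

```json
{
  "scripts": {
    "start": "node --import ./dist/instrumentation.js ./dist/index.js"
  }
}
```

Alternatively, import the instrumentation file as the very first statement of your entry module, before any module that creates spans.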
For more information on specifying redaction policies, see Redaction policies.
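Conceptually, a redaction policy decides which span attributes keep their raw values before export. The sketch below is not the SDK's implementation; the attribute names and the `redactPromptContent` helper are hypothetical, meant only to illustrate the idea:

```typescript
// Illustrative only: replace raw prompt/completion text in a set of span
// attributes while keeping non-sensitive metadata such as token counts.
function redactPromptContent(
  attributes: Record<string, string | number>
): Record<string, string | number> {
  const sensitive = ['gen_ai.prompt', 'gen_ai.completion'];
  const out: Record<string, string | number> = {};
  for (const [key, value] of Object.entries(attributes)) {
    out[key] = sensitive.includes(key) ? '[REDACTED]' : value;
  }
  return out;
}

const redacted = redactPromptContent({
  'gen_ai.prompt': 'What is the capital of France?',
  'gen_ai.usage.input_tokens': 9,
});
console.log(redacted['gen_ai.prompt']); // "[REDACTED]"
```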

Store environment variables

Store environment variables in a .env file in the root of your project:
.env
AXIOM_TOKEN="API_TOKEN"
AXIOM_DATASET="DATASET_NAME"
OPENAI_API_KEY=""
GEMINI_API_KEY=""
XAI_API_KEY=""
ANTHROPIC_API_KEY=""
  • Replace API_TOKEN with the Axiom API token you have generated. For added security, store the API token in an environment variable.
  • Replace DATASET_NAME with the name of the Axiom dataset where you send your data.
  • Enter the API keys for the LLMs you want to work with.
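To fail fast when a required variable is missing, you can validate the Axiom settings at startup. A minimal sketch; the `requireEnv` helper is not part of the SDK, and the placeholder assignments exist only to make the example self-contained:

```typescript
// Read a required environment variable, throwing a clear error if unset.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Placeholders so the example runs standalone; in a real app these come
// from your .env file via dotenv.
process.env.AXIOM_TOKEN ??= 'API_TOKEN';
process.env.AXIOM_DATASET ??= 'DATASET_NAME';

const token = requireEnv('AXIOM_TOKEN');
const dataset = requireEnv('AXIOM_DATASET');
console.log(`Sending traces to dataset: ${dataset}`);
```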

What’s next?

  • Explore the AI engineering workflow: Start building systematic AI capabilities beginning with Create.
  • Continue with Axiom AI SDK: Learn about instrumenting your AI model and tool calls in Observe.