Integrations
OpenAI

Setup

Step 1: Installing Packages

pip install opentelemetry-instrumentation-openai
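
If the OpenTelemetry SDK and OTLP exporter used in the next step are not already present in your environment, you may also need to install them (these are the standard OpenTelemetry packages, listed here for convenience):

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-grpc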

Step 2: Telemetry Configuration

Configure your application to collect and send telemetry data. Add the following lines to your Python file:

import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import (
    BatchSpanProcessor
)
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import (
    OTLPSpanExporter
)

OTEL_EXPORTER_OTLP_ENDPOINT = "collector.infrastack.ai"
OTEL_EXPORTER_OTLP_HEADERS = "infrastack-api-key=<API-KEY>"

os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = OTEL_EXPORTER_OTLP_ENDPOINT
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = OTEL_EXPORTER_OTLP_HEADERS

# Split the header string into a (key, value) pair; maxsplit=1 keeps any "="
# characters inside the API key intact.
otlp_exporter = OTLPSpanExporter(
    endpoint=OTEL_EXPORTER_OTLP_ENDPOINT,
    headers=(tuple(OTEL_EXPORTER_OTLP_HEADERS.split("=", 1)),)
)

# Batch spans and export them over OTLP, tagging them with a service name.
processor = BatchSpanProcessor(otlp_exporter)
provider = TracerProvider(resource=Resource.create({"service.name": "openai.traces"}))
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)  # optional: register the provider globally
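
If you want to verify locally that spans are being generated before they reach the collector, one option is to also attach the SDK's console exporter. This is an optional debugging aid, not part of the required setup:

from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Print each finished span to stdout in addition to exporting it over OTLP.
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))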

Now, add the instrumentation to the OpenAI API:

import openai

from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Instrument the OpenAI client library with the tracer provider configured above.
OpenAIInstrumentor().instrument(tracer_provider=provider)
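
Instrumentation can also be reversed at runtime (for example, in tests) using the standard uninstrument method provided by OpenTelemetry instrumentors:

# Remove the patches applied by instrument(); later OpenAI calls are no longer traced.
OpenAIInstrumentor().uninstrument()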

Step 3: Using the OpenAI API

All of the OpenAI APIs can be used without modification; the instrumentation immediately produces traces for these calls. For example:

import os

from openai import OpenAI


OPENAI_API_KEY = "<OPENAI-API-KEY>"
client = OpenAI(
    api_key=os.environ.get("OPENAI_API_KEY") or OPENAI_API_KEY
)

# Each call made through this client is traced automatically.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Answer in a friendly and personal manner."},
        {"role": "user", "content": "How are you today?"},
    ],
)
print(response.choices[0].message.content)
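
Other OpenAI endpoints are captured in the same way. As a quick illustration (the embedding model name here is only an example), an embeddings request made through the same client also produces a span:

# Embedding requests are traced just like chat completions.
embedding = client.embeddings.create(
    model="text-embedding-3-small",
    input="OpenTelemetry makes LLM calls observable.",
)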

Content Privacy

By default, this instrumentation logs prompts, completions, and embeddings to span attributes. This gives you clear visibility into how your LLM application is working and makes it easy to debug and evaluate the quality of its outputs.

However, you may want to disable this logging for privacy reasons, as they may contain highly sensitive data from your users. You may also simply want to reduce the size of your traces.

To disable logging, set the TRACELOOP_TRACE_CONTENT environment variable to false.

TRACELOOP_TRACE_CONTENT=false
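
You can export the variable in your shell, or set it from code early in your application, before any OpenAI calls are made. A minimal sketch of the in-code approach:

import os

# Disable capture of prompt, completion, and embedding content in span attributes.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"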