Integrations
LlamaIndex

Setup

Step 1: Installing Packages

Install the LlamaIndex instrumentation along with the OpenTelemetry SDK and the OTLP gRPC exporter, which the configuration in Step 2 imports:

pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-grpc opentelemetry-instrumentation-llamaindex

Step 2: Telemetry Configuration

Configure your application to collect and send telemetry data. Add the following lines to your Python file:

import os

from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

OTEL_EXPORTER_OTLP_ENDPOINT = "collector.infrastack.ai"
OTEL_EXPORTER_OTLP_HEADERS = "infrastack-api-key=<API-KEY>"

os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = OTEL_EXPORTER_OTLP_ENDPOINT
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = OTEL_EXPORTER_OTLP_HEADERS

# Export spans over OTLP/gRPC to the InfraStack collector, passing the API key
# as a header. split("=", 1) keeps values that contain "=" intact.
otlp_exporter = OTLPSpanExporter(
    endpoint=OTEL_EXPORTER_OTLP_ENDPOINT,
    headers=(tuple(OTEL_EXPORTER_OTLP_HEADERS.split("=", 1)),),
)

# Batch spans before export, and tag them with a service name so the traces
# are easy to find in the backend.
processor = BatchSpanProcessor(otlp_exporter)
provider = TracerProvider(resource=Resource.create({"service.name": "llama_index.traces"}))
provider.add_span_processor(processor)

# Register the provider globally so any code that requests a tracer uses it.
trace.set_tracer_provider(provider)

Now, add the instrumentation for LlamaIndex and, optionally, for OpenAI (a sketch for the latter follows the next block):

from opentelemetry.instrumentation.llamaindex import LlamaIndexInstrumentor
LlamaIndexInstrumentor().instrument(tracer_provider=provider)
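
If your application also calls OpenAI directly, the companion OpenAI instrumentation can be enabled the same way. This is a sketch under the assumption that you have installed opentelemetry-instrumentation-openai, a separate package from the one in Step 1:

from opentelemetry.instrumentation.openai import OpenAIInstrumentor

# Assumes opentelemetry-instrumentation-openai is installed.
OpenAIInstrumentor().instrument(tracer_provider=provider)

To confirm that spans are flowing, run a small query through LlamaIndex. A minimal smoke test, assuming llama-index is installed, OPENAI_API_KEY is set, and the llama_index.core module layout of recent releases:

from llama_index.core import Document, VectorStoreIndex

# Build a one-document index and query it; this should emit LlamaIndex
# (and, if instrumented, OpenAI) spans to the collector configured above.
index = VectorStoreIndex.from_documents([Document(text="InfraStack ingests OpenTelemetry traces.")])
response = index.as_query_engine().query("What does InfraStack ingest?")
print(response)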

Content Privacy

By default, this instrumentation logs prompts, completions, and embeddings to span attributes. This gives you clear visibility into how your LLM application behaves, and makes it easier to debug and evaluate the quality of its outputs.

However, you may want to disable this logging for privacy reasons, since prompts and completions may contain highly sensitive user data. You may also simply want to reduce the size of your traces.

To disable logging, set the TRACELOOP_TRACE_CONTENT environment variable to false.

TRACELOOP_TRACE_CONTENT=false
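
If you prefer to configure this from code, assign the variable before the instrumentation runs. A minimal sketch; the variable name comes from the underlying Traceloop-based instrumentation, and it must be set before any traced calls are made:

import os

# Disable capture of prompts, completions, and embeddings in span attributes.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"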