LangChain Integration
LangChain is a framework for developing applications powered by language models. LangGuard integrates with LangChain via OpenTelemetry to capture traces from your LangChain applications.
Overview
The LangChain integration enables LangGuard to:
- Capture chain execution traces from LangChain applications
- Monitor LLM calls, including model usage, token counts, and latency
- Track tool and retriever usage within chains and agents
- Apply governance policies to LangChain operations
Prerequisites
- A LangChain application (Python or JavaScript/TypeScript)
- OpenTelemetry SDK installed
- LangGuard API key (from Settings > API Keys)
- LangGuard admin role
Setup
Step 1: Install OpenTelemetry Dependencies
Python:
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp
JavaScript/TypeScript:
npm install @opentelemetry/api @opentelemetry/sdk-node @opentelemetry/exporter-trace-otlp-http
Step 2: Create an Integration in LangGuard
- Navigate to Integrations in the sidebar
- Click Add Integration
- Select AI Frameworks > LangChain
- Enter a Name (e.g., "Production LangChain App")
- LangGuard provides your OTLP endpoint and authentication details
- Click Save
Step 3: Configure Your Application
Set environment variables to send traces to LangGuard:
export OTEL_EXPORTER_OTLP_ENDPOINT="https://app.langguard.ai"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_API_KEY"
export OTEL_SERVICE_NAME="my-langchain-app"
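The `OTEL_EXPORTER_OTLP_HEADERS` value is a comma-separated list of `key=value` pairs, as defined by the OpenTelemetry exporter environment-variable specification. A minimal sketch of how such a value is parsed (`parse_otlp_headers` is an illustrative helper, not a LangGuard or OpenTelemetry API; real SDKs also URL-decode values, which is omitted here):

```python
# Illustrative parser for the OTEL_EXPORTER_OTLP_HEADERS format:
# comma-separated key=value pairs, e.g. "Authorization=Bearer KEY,X-Env=prod".
# parse_otlp_headers is a hypothetical helper, not part of any SDK.

def parse_otlp_headers(raw: str) -> dict:
    headers = {}
    for pair in raw.split(","):
        if not pair.strip():
            continue  # skip empty segments
        key, _, value = pair.partition("=")  # split on the first '=' only
        headers[key.strip()] = value.strip()
    return headers

print(parse_otlp_headers("Authorization=Bearer YOUR_API_KEY"))
# {'Authorization': 'Bearer YOUR_API_KEY'}
```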
Step 4: Enable Tracing
Python with LangChain:
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
# Set up OpenTelemetry
provider = TracerProvider()
processor = BatchSpanProcessor(OTLPSpanExporter())  # exporter reads the OTEL_EXPORTER_OTLP_* variables set in Step 3
provider.add_span_processor(processor)
trace.set_tracer_provider(provider)
# Your LangChain code runs as normal — traces are exported automatically
Step 5: Restart and Verify
- Restart your application
- Generate some LangChain activity
- Check the Trace Explorer in LangGuard for incoming traces
What Gets Captured
Chain Execution
| Field | Description |
|---|---|
| Chain Name | The chain or agent being executed |
| Input | Input to the chain |
| Output | Chain output |
| Duration | Total execution time |
| Status | Success or error |
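As an illustration, a captured chain-execution record with the fields above might look like the following. The field names and values here are hypothetical; the actual span schema LangGuard stores may differ:

```python
# Hypothetical chain-execution record mirroring the fields in the table above.
chain_span = {
    "chain_name": "qa_agent",          # the chain or agent being executed
    "input": "What is our refund policy?",
    "output": "Refunds are available within 30 days.",
    "duration_ms": 1843,               # total execution time
    "status": "success",               # "success" or "error"
}

# A simple sanity check over the record:
assert chain_span["status"] in ("success", "error")
print(f"{chain_span['chain_name']} finished in {chain_span['duration_ms']} ms")
```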
LLM Calls
- Model name and provider
- Input and output tokens
- Response latency
- Prompt and completion content
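Captured LLM-call data of this shape lends itself to per-model rollups, which is the kind of aggregation a monitoring view would show. A minimal sketch, assuming hypothetical record shapes (the model names and numbers are made up):

```python
# Hypothetical per-call records with the captured fields listed above.
llm_calls = [
    {"model": "gpt-4o", "input_tokens": 120, "output_tokens": 45, "latency_ms": 900},
    {"model": "gpt-4o", "input_tokens": 80, "output_tokens": 30, "latency_ms": 700},
    {"model": "claude-3-haiku", "input_tokens": 60, "output_tokens": 20, "latency_ms": 400},
]

def totals_by_model(calls):
    """Sum token usage and average latency per model."""
    acc = {}
    for c in calls:
        m = acc.setdefault(c["model"], {"tokens": 0, "latencies": []})
        m["tokens"] += c["input_tokens"] + c["output_tokens"]
        m["latencies"].append(c["latency_ms"])
    return {
        model: {
            "tokens": v["tokens"],
            "avg_latency_ms": sum(v["latencies"]) / len(v["latencies"]),
        }
        for model, v in acc.items()
    }

print(totals_by_model(llm_calls))
# {'gpt-4o': {'tokens': 275, 'avg_latency_ms': 800.0},
#  'claude-3-haiku': {'tokens': 80, 'avg_latency_ms': 400.0}}
```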
Tools and Retrievers
- Tool invocations and their results
- Retriever queries and retrieved documents
- Execution time per tool
Troubleshooting
No Traces Appearing
- Verify environment variables are set correctly
- Restart your application after setting variables
- Check network connectivity to the LangGuard endpoint
- Verify your API key is valid
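The first two checks above can be partly automated. A small sketch that flags unset exporter variables (the variable names follow the Step 3 setup; `missing_otel_vars` is an illustrative helper, not a LangGuard utility):

```python
import os

# Environment variables expected by the Step 3 configuration.
REQUIRED_VARS = [
    "OTEL_EXPORTER_OTLP_ENDPOINT",
    "OTEL_EXPORTER_OTLP_HEADERS",
    "OTEL_SERVICE_NAME",
]

def missing_otel_vars(env=None):
    """Return the required OTLP variables that are unset or empty."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED_VARS if not env.get(name)]

problems = missing_otel_vars()
if problems:
    print("Unset variables:", ", ".join(problems))
else:
    print("All required OTLP variables are set.")
```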
Missing Spans
- Ensure OpenTelemetry auto-instrumentation is properly configured
- Check that all LangChain components are traced
- Verify the BatchSpanProcessor is flushing correctly (for example, call provider.force_flush() or provider.shutdown() before the process exits)
Next Steps
- Integrations Overview — See all available integrations
- Trace Explorer — Analyze captured traces
- Policies — Apply governance rules to LangChain operations