Connecting Integrations
LangGuard connects to your AI platforms, frameworks, coding agents, and identity providers to aggregate and analyze your data. This guide covers how to connect supported platforms.
Overview
To connect an integration:
- Navigate to Integrations in the sidebar
- Click Add Integration
- Browse the categories to find your platform
- Click the platform card
- Enter your credentials
- Test the connection
- Save and start syncing
Supported Platforms
AI Gateways
| Platform | What You'll Need |
|---|---|
| OpenRouter | API Key |
| LiteLLM | Proxy URL, API Key |
| Cloudflare | Account ID, API Token |
AI Platforms
| Platform | What You'll Need |
|---|---|
| Databricks | Workspace URL, Access Token |
| Azure AI Foundry | Subscription ID, Client credentials |
| AWS Bedrock | AWS Access Key, Secret Key, Region |
AI Frameworks
| Platform | What You'll Need |
|---|---|
| MLflow | Tracking URI, API Token |
| LangChain | OpenTelemetry environment variables |
| CrewAI | OpenTelemetry environment variables |
| AWS AgentCore | AWS credentials, Region |
Coding Agents
| Platform | What You'll Need |
|---|---|
| Claude Code | OpenTelemetry environment variables |
| Cursor | GitHub plugin |
| OpenCode | OpenTelemetry environment variables |
Identity Platforms
| Platform | What You'll Need |
|---|---|
| Microsoft Entra ID | Tenant ID, Client ID, Client Secret |
| Google Workspace | Service account credentials |
See the Integrations Overview for the full list of supported and upcoming platforms.
Connecting AI Gateways
AI gateways proxy LLM traffic through a central point, making them a convenient integration option since all model calls are captured automatically.
OpenRouter
- Click Add Integration > AI Gateways > OpenRouter
- Enter your API Key (from your OpenRouter dashboard)
- Click Test Connection, then Save
LiteLLM
- Click Add Integration > AI Gateways > LiteLLM
- Enter:
- Proxy URL: Your LiteLLM proxy endpoint
- API Key: Your LiteLLM master key
- Click Test Connection, then Save
Cloudflare AI Gateway
- Click Add Integration > AI Gateways > Cloudflare
- Enter:
- Account ID: Your Cloudflare account ID
- API Token: A Cloudflare API token with AI Gateway read permissions
- Click Test Connection, then Save
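Before saving, you can confirm the API token itself is valid from your own machine using Cloudflare's token-verify endpoint. A minimal sketch with Python's standard library; the URL and response shape come from Cloudflare's documented `/user/tokens/verify` API, and the token value is a placeholder:

```python
import json
import urllib.request

VERIFY_URL = "https://api.cloudflare.com/client/v4/user/tokens/verify"

def build_verify_request(api_token: str) -> urllib.request.Request:
    """Build the GET request for Cloudflare's token-verify endpoint."""
    return urllib.request.Request(
        VERIFY_URL, headers={"Authorization": f"Bearer {api_token}"}
    )

def token_is_active(response_body: bytes) -> bool:
    """Interpret the verify response: success plus an 'active' token status."""
    body = json.loads(response_body)
    return bool(body.get("success")) and body["result"]["status"] == "active"

# Send with urllib.request.urlopen(build_verify_request("YOUR_API_TOKEN"));
# an active token returns {"success": true, "result": {"status": "active", ...}}.
```

If this check fails, fix the token's permissions in the Cloudflare dashboard before retrying the connection test in LangGuard.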
Connecting AI Platforms
Databricks
- Click Add Integration > AI Platforms > Databricks
- Enter:
- Name: A friendly name (e.g., "Production Databricks")
- Host URL: Your workspace URL (e.g., https://dbc-xxx.cloud.databricks.com)
- Access Token: Your personal access token (starts with dapi)
- Click Test Connection, then Save
To generate a personal access token:
- Log in to your Databricks workspace
- Click your profile > User Settings
- Go to Access Tokens > Generate New Token
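Once you have a token, you can sanity-check it against the workspace REST API before entering it in LangGuard. A small standard-library sketch; `/api/2.0/clusters/list` is a standard Databricks REST endpoint, and the host and token values are placeholders:

```python
import urllib.request

def databricks_request(host: str, token: str, path: str) -> urllib.request.Request:
    """Build an authenticated GET request against a Databricks REST endpoint."""
    return urllib.request.Request(
        host.rstrip("/") + path,
        headers={"Authorization": f"Bearer {token}"},
    )

# Sending this with urllib.request.urlopen() should return HTTP 200 if
# both the workspace URL and the token are valid; a 403 usually means
# the token is wrong or expired.
req = databricks_request(
    "https://dbc-xxx.cloud.databricks.com", "dapi-YOUR-TOKEN", "/api/2.0/clusters/list"
)
```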
For detailed configuration including Unity Catalog sync and Python bridge setup, see the Databricks Integration Guide.
Azure AI Foundry
- Click Add Integration > AI Platforms > Azure AI Foundry
- Enter:
- Subscription ID: Your Azure subscription ID
- Client ID: Azure AD application client ID
- Client Secret: Azure AD application client secret
- Tenant ID: Your Azure AD tenant ID
- Click Test Connection, then Save
AWS Bedrock
- Click Add Integration > AI Platforms > AWS Bedrock
- Enter:
- Access Key ID: Your AWS access key
- Secret Access Key: Your AWS secret key
- Region: The AWS region where Bedrock is enabled (e.g., us-east-1)
- Click Test Connection, then Save
Connecting AI Frameworks
AI framework integrations use OpenTelemetry to send traces directly to LangGuard.
MLflow
- Click Add Integration > AI Frameworks > MLflow
- Enter:
- Tracking URI: Your MLflow tracking server URL
- API Token: Authentication token (if required)
- Click Test Connection, then Save
LangChain / CrewAI / AWS AgentCore
These frameworks integrate via OpenTelemetry environment variables:
- Click Add Integration and select your framework
- LangGuard generates the OTLP endpoint and API key for your integration
- Set the environment variables in your application:
```shell
export OTEL_EXPORTER_OTLP_ENDPOINT="https://app.langguard.ai"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/json"
export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_API_KEY"
```
- Restart your application to begin sending traces
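If you would rather configure these in code than in the shell, the same variables can be set from Python before the instrumented framework is imported (a sketch using the placeholder endpoint and key values from above):

```python
import os

# OpenTelemetry exporters read these variables at initialization time,
# so they must be set before the instrumented framework is imported.
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://app.langguard.ai"
os.environ["OTEL_EXPORTER_OTLP_PROTOCOL"] = "http/json"
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = "Authorization=Bearer YOUR_API_KEY"

# Only now import and use your framework, e.g.:
# from langchain_openai import ChatOpenAI
```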
Connecting Coding Agents
Coding agent integrations send traces directly to LangGuard via OpenTelemetry, enabling real-time monitoring of agent activity, tool usage, and performance metrics.
- Claude Code — Set environment variables to send OTLP traces
- Cursor — Install the GitHub plugin for hook-based tracing
- OpenCode — Set environment variables to send OTLP traces
See the individual integration guides for step-by-step setup.
Connecting Identity Platforms
Identity platform integrations enrich LangGuard data with user and group information for governance and access tracking.
Microsoft Entra ID
- Click Add Integration > Identity Platforms > Microsoft Entra ID
- Enter:
- Tenant ID: Your Azure AD tenant ID
- Client ID: Application (client) ID
- Client Secret: Application client secret
- Click Test Connection, then Save
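These three values map directly onto the standard OAuth 2.0 client-credentials grant against the Microsoft identity platform, so you can verify them independently before the connection test. A sketch of the token request (the Graph `.default` scope is an assumption about what a directory integration would request):

```python
import urllib.parse
import urllib.request

def build_token_request(
    tenant_id: str, client_id: str, client_secret: str
) -> urllib.request.Request:
    """Build the OAuth 2.0 client-credentials token request for Entra ID."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")
```

A successful POST returns a JSON body containing an `access_token`; an `AADSTS` error code in the response usually points at a wrong tenant ID, client ID, or secret.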
Google Workspace
- Click Add Integration > Identity Platforms > Google Workspace
- Upload your service account credentials JSON file
- Enter the admin email for domain-wide delegation
- Click Test Connection, then Save
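Service account key files are JSON with a fixed set of fields, so it is easy to check the file is well-formed before uploading it. A small sketch (the field names are Google's standard key-file format; the helper itself is illustrative, not part of LangGuard):

```python
import json

# Fields every Google service account key file is expected to contain.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account_file(path: str) -> str:
    """Validate a service account key file and return its client_email."""
    with open(path) as f:
        creds = json.load(f)
    missing = REQUIRED_FIELDS - creds.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if creds["type"] != "service_account":
        raise ValueError(f"expected a service_account key, got {creds['type']!r}")
    return creds["client_email"]
```

The returned `client_email` is the identity the sync runs as; it must be authorized for domain-wide delegation alongside the admin email you enter above.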
After Connecting
Sync Settings
After adding an integration, you can configure sync behavior:
- Auto Sync: Enable/disable automatic synchronization
- Sync Interval: How often to sync (1 minute to 1 hour)
- Lookback Period: How far back to fetch historical data
Viewing Sync Status
Each integration shows:
- Status: Connected, Syncing, Error
- Last Sync: When data was last fetched
- Traces Synced: Total count of imported traces
Managing Integrations
- Edit: Update credentials or settings
- Disable: Pause syncing without deleting
- Delete: Remove integration entirely
Troubleshooting
Connection Test Failed
- Double-check your credentials
- Verify you have the correct permissions
- Check if the service is accessible
No Data Appearing
- Ensure your source platform has traces
- Check the time range in LangGuard
- Try triggering a manual sync
Rate Limit Errors
- Increase the sync interval (e.g., 30 minutes instead of 15) to reduce request volume
- Check whether the platform enforces its own API rate limits
See Integration Issues for more help.
Next Steps
- Explore Trace Explorer - Analyze your synced traces
- Set Up Policies - Enable governance rules