AWS Bedrock Integration
AWS Bedrock is a fully managed service for building generative AI applications with foundation models. LangGuard integrates with AWS Bedrock to discover models, monitor invocations, and track provisioned throughput.
Overview
The AWS Bedrock integration enables LangGuard to:
- Discover Bedrock models — Foundation models and custom models in your account
- Monitor model invocations — Track usage, latency, and costs
- Track provisioned throughput — Monitor reserved capacity
- Apply governance policies to Bedrock interactions
Prerequisites
- An AWS account with Bedrock enabled
- IAM user or role with Bedrock read permissions
- AWS Access Key ID and Secret Access Key
- LangGuard admin role
Setup
Step 1: Create an IAM User
- Navigate to the AWS IAM Console
- Click Users > Create user
- Name it "langguard-integration"
- Attach the following policy (or create a custom one):
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:ListFoundationModels",
        "bedrock:ListCustomModels",
        "bedrock:ListProvisionedModelThroughputs",
        "bedrock:GetFoundationModel",
        "bedrock:GetModelInvocationLoggingConfiguration",
        "logs:GetLogEvents",
        "logs:FilterLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```
- Create an Access Key and copy the Access Key ID and Secret Access Key
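Before moving on, it is worth confirming that the new key actually has the permissions above. A minimal boto3 sketch (assuming boto3 is installed and the key is exported via the standard `AWS_ACCESS_KEY_ID` / `AWS_SECRET_ACCESS_KEY` environment variables; the helper name is ours, not part of any API):

```python
def count_models_by_provider(model_summaries):
    """Group Bedrock ModelSummary dicts by their providerName field."""
    counts = {}
    for summary in model_summaries:
        provider = summary.get("providerName", "unknown")
        counts[provider] = counts.get(provider, 0) + 1
    return counts


def verify_credentials(region="us-east-1"):
    """Call ListFoundationModels with the new key (requires valid credentials).

    If this raises AccessDeniedException, revisit the IAM policy above.
    """
    import boto3  # imported here so the pure helper works without boto3

    bedrock = boto3.client("bedrock", region_name=region)
    summaries = bedrock.list_foundation_models()["modelSummaries"]
    return count_models_by_provider(summaries)
```

Calling `verify_credentials()` should return a non-empty provider/count mapping when the key and policy are set up correctly.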
Step 2: Add Integration in LangGuard
- Navigate to Integrations in the sidebar
- Click Add Integration
- Select AI Platforms > AWS Bedrock
- Enter:
- Name: A friendly name (e.g., "Production Bedrock US-East-1")
- Access Key ID: Your AWS access key
- Secret Access Key: Your AWS secret key
- Region: The AWS region where Bedrock is enabled (e.g., us-east-1)
- Click Test Connection
- Click Save
What Gets Captured
Models
LangGuard discovers all available Bedrock models:
| Field | Description |
|---|---|
| Model ID | The Bedrock model identifier |
| Provider | Model provider (Anthropic, Meta, Amazon, etc.) |
| Model Name | Human-readable model name |
| Status | Available, deprecated, etc. |
| Capabilities | Text generation, embedding, image, etc. |
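The table fields map closely onto the `ModelSummary` shape that boto3's `list_foundation_models` returns. A sketch of that mapping (field names follow the boto3 response; `model_row` and `discover_models` are illustrative names, not LangGuard APIs):

```python
def model_row(summary):
    """Map one Bedrock ModelSummary dict onto the table fields above."""
    return {
        "Model ID": summary.get("modelId"),
        "Provider": summary.get("providerName"),
        "Model Name": summary.get("modelName"),
        "Status": summary.get("modelLifecycle", {}).get("status"),
        "Capabilities": ", ".join(summary.get("outputModalities", [])),
    }


def discover_models(region="us-east-1"):
    """Fetch and map all model summaries (requires valid AWS credentials)."""
    import boto3

    client = boto3.client("bedrock", region_name=region)
    return [model_row(m) for m in client.list_foundation_models()["modelSummaries"]]
```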
Invocations
When model invocation logging is enabled in AWS:
| Field | Description |
|---|---|
| Model | The model invoked |
| Input/Output Tokens | Token counts |
| Latency | Response time |
| Status | Success or error |
| Region | AWS region of the invocation |
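When invocation logging is routed to CloudWatch Logs, each log event's message is a JSON record. The sketch below pulls recent events and extracts a subset of the fields above; the key names (`modelId`, `input.inputTokenCount`, `output.outputTokenCount`, `region`) follow Bedrock's invocation-log schema but should be treated as assumptions and checked against your actual log group:

```python
import json


def parse_invocation(event_message):
    """Extract model, token counts, and region from one invocation log line."""
    record = json.loads(event_message)
    return {
        "Model": record.get("modelId"),
        "Input Tokens": record.get("input", {}).get("inputTokenCount"),
        "Output Tokens": record.get("output", {}).get("outputTokenCount"),
        "Region": record.get("region"),
    }


def fetch_invocations(log_group, region="us-east-1"):
    """Read recent invocation events from CloudWatch Logs (needs credentials)."""
    import boto3

    logs = boto3.client("logs", region_name=region)
    events = logs.filter_log_events(logGroupName=log_group)["events"]
    return [parse_invocation(e["message"]) for e in events]
```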
Provisioned Throughput
- Reserved capacity allocations
- Utilization metrics
- Cost tracking for provisioned models
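Provisioned capacity can be inspected the same way, via boto3's `list_provisioned_model_throughputs`. A sketch (the key names follow the boto3 response shape for `provisionedModelSummaries`; verify them against your SDK version):

```python
def throughput_row(summary):
    """Map one ProvisionedModelSummary dict onto a capacity report row."""
    return {
        "Name": summary.get("provisionedModelName"),
        "Model": summary.get("modelArn"),
        "Model Units": summary.get("modelUnits"),
        "Status": summary.get("status"),
    }


def list_throughput(region="us-east-1"):
    """List provisioned throughput allocations (requires valid credentials)."""
    import boto3

    client = boto3.client("bedrock", region_name=region)
    resp = client.list_provisioned_model_throughputs()
    return [throughput_row(s) for s in resp.get("provisionedModelSummaries", [])]
```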
Multiple Regions
To monitor Bedrock across multiple AWS regions, create a separate integration for each region. Use descriptive names to distinguish them (e.g., "Bedrock US-East-1", "Bedrock EU-West-1").
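Since each integration is region-scoped, a per-region sweep amounts to one Bedrock client per region. A sketch (the naming scheme is just an example to match the descriptive-name advice above):

```python
def integration_name(region):
    """Build a descriptive per-region integration name, e.g. 'Bedrock us-east-1'."""
    return f"Bedrock {region}"


def discover_all_regions(regions):
    """Collect model IDs per region (requires valid AWS credentials)."""
    import boto3

    models = {}
    for region in regions:
        client = boto3.client("bedrock", region_name=region)
        summaries = client.list_foundation_models()["modelSummaries"]
        models[region] = [m["modelId"] for m in summaries]
    return models
```

For example, `discover_all_regions(["us-east-1", "eu-west-1"])` returns a mapping you can compare against what each LangGuard integration reports.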
Troubleshooting
Authentication Failed
- Verify the Access Key ID and Secret Access Key are correct
- Ensure the IAM user hasn't been deactivated
- Check that the access key hasn't been rotated or deleted
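The checks above can be narrowed down by the AWS error code you get back. A sketch that calls STS GetCallerIdentity and translates common failure codes (the exact codes vary slightly by service, so treat the mapping as a starting point, not an exhaustive list):

```python
def auth_hint(error_code):
    """Map common AWS auth error codes onto the checks listed above."""
    hints = {
        "InvalidClientTokenId": "Access Key ID is invalid, deactivated, or deleted",
        "UnrecognizedClientException": "Access Key ID is invalid, deactivated, or deleted",
        "SignatureDoesNotMatch": "Secret Access Key does not match the Access Key ID",
        "AccessDeniedException": "Key is valid but the IAM policy is missing a permission",
    }
    return hints.get(error_code, "Unrecognized error; check CloudTrail for details")


def check_credentials():
    """Call STS GetCallerIdentity and translate any failure (needs boto3)."""
    import boto3
    import botocore.exceptions

    try:
        return boto3.client("sts").get_caller_identity()["Arn"]
    except botocore.exceptions.ClientError as err:
        return auth_hint(err.response["Error"]["Code"])
```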
No Models Discovered
- Confirm Bedrock is enabled in the specified region
- Verify the IAM policy includes bedrock:ListFoundationModels
- Check that the region is correct
No Invocation Data
- Enable model invocation logging in the AWS Bedrock console
- Verify CloudWatch Logs permissions are granted to the IAM user
- Check that applications are actively calling Bedrock models
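The first check can be scripted: Bedrock exposes the logging configuration via boto3's `get_model_invocation_logging_configuration`. A sketch (the `loggingConfig` / `cloudWatchConfig` / `s3Config` key names follow the boto3 response shape; verify against your SDK version):

```python
def logging_enabled(logging_config):
    """True if the loggingConfig dict routes invocations to CloudWatch or S3."""
    cfg = logging_config or {}
    return bool(cfg.get("cloudWatchConfig") or cfg.get("s3Config"))


def check_invocation_logging(region="us-east-1"):
    """Report whether invocation logging is on in a region (needs credentials)."""
    import boto3

    client = boto3.client("bedrock", region_name=region)
    resp = client.get_model_invocation_logging_configuration()
    return logging_enabled(resp.get("loggingConfig"))
```

If this returns False, enable logging in the Bedrock console before expecting invocation data in LangGuard.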
Next Steps
- Integrations Overview — See all available integrations
- Discovery — View discovered models and resources
- Cost Estimates — Configure cost tracking for Bedrock models