# AWS Integration

Integrate BroxiAI with the Amazon Web Services (AWS) ecosystem.

Learn how to integrate BroxiAI workflows with AWS services for enhanced functionality, scalability, and data processing capabilities.

## AWS Integration Architecture

### Overview

BroxiAI integrates with AWS services to provide:

- Scalable cloud infrastructure
- Advanced data processing capabilities
- Enterprise-grade security and compliance
- Cost-effective resource management
- Global availability and performance
## Core AWS Services Integration

### Amazon Bedrock

Connect BroxiAI to AWS's fully managed foundation model service.

#### Configuration

```json
{
  "provider": "aws_bedrock",
  "region": "us-east-1",
  "access_key_id": "${AWS_ACCESS_KEY_ID}",
  "secret_access_key": "${AWS_SECRET_ACCESS_KEY}",
  "models": {
    "text": "anthropic.claude-3-sonnet-20240229-v1:0",
    "embeddings": "amazon.titan-embed-text-v1"
  }
}
```

A standalone invocation sketch using these settings follows the model list below.

#### Supported Models

- Anthropic Claude 3 (Haiku, Sonnet, Opus)
- Amazon Titan (Text, Embeddings)
- Meta Llama 2
- Cohere Command
- AI21 Labs Jurassic
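If you want to verify Bedrock access outside of BroxiAI, a minimal boto3 sketch like the following can invoke the same Claude 3 Sonnet model referenced in the configuration above (region and model ID are the example values and must match the models enabled in your account):

```python
# Minimal sketch: invoke a Claude 3 model on Amazon Bedrock with boto3.
# Assumes AWS credentials are available through the standard credential chain.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Summarize this document in one sentence."}],
    }),
)

print(json.loads(response["body"].read())["content"][0]["text"])
```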
### Amazon S3

Store and process documents, images, and other files.

#### Use Cases

- Document storage for RAG applications
- File upload/download workflows
- Batch processing of documents
- Data archival and backup

#### Integration Example

BroxiAI workflow component:

```json
{
  "component": "S3FileLoader",
  "config": {
    "bucket_name": "my-documents-bucket",
    "object_key": "documents/manual.pdf",
    "aws_access_key_id": "${AWS_ACCESS_KEY_ID}",
    "aws_secret_access_key": "${AWS_SECRET_ACCESS_KEY}",
    "region": "us-west-2"
  }
}
```
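Outside of a workflow, the same object can be fetched with boto3; this is a minimal sketch using the placeholder bucket and key from the example above:

```python
# Minimal sketch: download the object referenced by the S3FileLoader example
# so it can be inspected or pre-processed locally.
import boto3

s3 = boto3.client("s3", region_name="us-west-2")
s3.download_file(
    Bucket="my-documents-bucket",   # placeholder bucket from the example
    Key="documents/manual.pdf",     # placeholder object key
    Filename="manual.pdf",
)
```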
### Amazon RDS/Aurora

Connect to managed relational databases for data retrieval and storage.

#### Configuration

```json
{
  "database_type": "postgresql",
  "host": "mydb.cluster-xyz.us-east-1.rds.amazonaws.com",
  "port": 5432,
  "database": "production",
  "username": "${DB_USERNAME}",
  "password": "${DB_PASSWORD}",
  "ssl_mode": "require"
}
```

A standalone connection test using these settings is sketched after the list below.

#### SQL Agent Integration

- Natural language to SQL conversion
- Database schema analysis
- Automated query generation
- Results formatting and analysis
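As a quick connectivity check outside BroxiAI, a psycopg2 sketch like the following can exercise the same PostgreSQL settings (host and database names are the placeholders from the configuration above):

```python
# Minimal sketch: connect to the RDS/Aurora PostgreSQL endpoint from the
# configuration above and run a simple query over TLS.
import os

import psycopg2

conn = psycopg2.connect(
    host="mydb.cluster-xyz.us-east-1.rds.amazonaws.com",
    port=5432,
    dbname="production",
    user=os.environ["DB_USERNAME"],
    password=os.environ["DB_PASSWORD"],
    sslmode="require",
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT table_name FROM information_schema.tables LIMIT 5;")
    for row in cur.fetchall():
        print(row)
```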
### Amazon DynamoDB

Integrate with a NoSQL database for session management and metadata storage.

#### Use Cases

- User session storage
- Conversation history
- Metadata caching
- Real-time analytics

#### Configuration

```json
{
  "table_name": "broxi_sessions",
  "region": "us-east-1",
  "access_key_id": "${AWS_ACCESS_KEY_ID}",
  "secret_access_key": "${AWS_SECRET_ACCESS_KEY}",
  "partition_key": "session_id",
  "sort_key": "timestamp"
}
```
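The table layout above (partition key `session_id`, sort key `timestamp`) can be exercised directly with boto3; a minimal sketch:

```python
# Minimal sketch: write and read a session record in the broxi_sessions table
# using the key schema from the configuration above.
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("broxi_sessions")

table.put_item(Item={
    "session_id": "user-123",
    "timestamp": "2024-01-15T10:30:00Z",
    "last_message": "Hello from BroxiAI",
})

response = table.query(KeyConditionExpression=Key("session_id").eq("user-123"))
print(response["Items"])
```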
### Amazon SES

Send automated emails from BroxiAI workflows.

#### Email Component Setup

```json
{
  "component": "SESEmailSender",
  "config": {
    "region": "us-east-1",
    "from_email": "noreply@yourcompany.com",
    "access_key_id": "${AWS_ACCESS_KEY_ID}",
    "secret_access_key": "${AWS_SECRET_ACCESS_KEY}"
  }
}
```
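Outside of a workflow, a send from the same address can be tested with boto3 (sender and recipient addresses are placeholders, and the sender must be verified in SES):

```python
# Minimal sketch: send a test email through SES using the region and
# from address shown in the component configuration above.
import boto3

ses = boto3.client("ses", region_name="us-east-1")

ses.send_email(
    Source="noreply@yourcompany.com",
    Destination={"ToAddresses": ["user@example.com"]},
    Message={
        "Subject": {"Data": "BroxiAI workflow report"},
        "Body": {"Text": {"Data": "Your workflow completed successfully."}},
    },
)
```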
#### Use Cases

- Automated notifications
- Report generation and delivery
- Customer communication
- Alert systems
## Advanced AWS Integrations

### Amazon Kinesis

Process streaming data in real-time workflows.

#### Data Streams

```json
{
  "stream_name": "broxi-events",
  "region": "us-east-1",
  "shard_count": 2,
  "retention_period": 24
}
```

A producer sketch for this stream follows the list below.

#### Integration Benefits

- Real-time data processing
- Event-driven workflows
- Scalable data ingestion
- Analytics and monitoring
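A minimal boto3 producer for the `broxi-events` stream from the configuration above (stream name and partition key are placeholders):

```python
# Minimal sketch: push a workflow event onto the Kinesis stream from the
# configuration above so it can feed event-driven workflows.
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

kinesis.put_record(
    StreamName="broxi-events",
    PartitionKey="workflow-123",
    Data=json.dumps({"event": "workflow_completed", "duration_ms": 842}).encode("utf-8"),
)
```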
### AWS Lambda

Trigger BroxiAI workflows from serverless functions.

#### Lambda Function Example

```python
import json
import os

import requests


def lambda_handler(event, context):
    # Trigger a BroxiAI workflow through the REST API
    response = requests.post(
        "https://api.broxi.ai/v1/flows/your-flow-id/run",
        headers={
            "Authorization": f"Bearer {os.environ['BROXI_API_TOKEN']}",
            "Content-Type": "application/json",
        },
        json={
            "input": event["input_message"],
            "variables": event.get("variables", {}),
        },
    )
    return {
        "statusCode": 200,
        "body": json.dumps(response.json()),
    }
```
### Amazon SNS

Send notifications and alerts from workflows.

#### Topic Configuration

```json
{
  "topic_arn": "arn:aws:sns:us-east-1:123456789012:broxi-alerts",
  "region": "us-east-1",
  "access_key_id": "${AWS_ACCESS_KEY_ID}",
  "secret_access_key": "${AWS_SECRET_ACCESS_KEY}"
}
```
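Publishing to the topic above can be tested with a short boto3 sketch (the topic ARN is the placeholder value from the configuration):

```python
# Minimal sketch: publish an alert to the SNS topic from the configuration above.
import boto3

sns = boto3.client("sns", region_name="us-east-1")

sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:broxi-alerts",
    Subject="BroxiAI workflow alert",
    Message="Workflow 'document-qa' failed on its last run.",
)
```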
#### Notification Types

- SMS messages
- Email notifications
- Mobile push notifications
- HTTP/HTTPS endpoints
### Amazon CloudWatch

Monitor and log BroxiAI workflow performance.

#### Metrics Configuration

```json
{
  "namespace": "BroxiAI/Workflows",
  "metrics": [
    "ExecutionTime",
    "SuccessRate",
    "ErrorRate",
    "TokenUsage"
  ],
  "log_group": "/aws/broxi/workflows"
}
```
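Custom metrics in the `BroxiAI/Workflows` namespace can be published from your own glue code with boto3; a minimal sketch for the `ExecutionTime` metric listed above:

```python
# Minimal sketch: publish a custom ExecutionTime data point into the
# BroxiAI/Workflows namespace from the configuration above.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="BroxiAI/Workflows",
    MetricData=[{
        "MetricName": "ExecutionTime",
        "Value": 1.42,
        "Unit": "Seconds",
        "Dimensions": [{"Name": "FlowId", "Value": "document-qa"}],
    }],
)
```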
## Vector Database Integration

### Amazon OpenSearch Service

Use OpenSearch as a vector database for RAG applications.

#### Configuration

```json
{
  "endpoint": "https://search-mydomain.us-east-1.es.amazonaws.com",
  "region": "us-east-1",
  "index_name": "documents",
  "dimension": 1536,
  "similarity_metric": "cosine",
  "authentication": {
    "type": "aws_iam",
    "access_key_id": "${AWS_ACCESS_KEY_ID}",
    "secret_access_key": "${AWS_SECRET_ACCESS_KEY}"
  }
}
```

A query sketch against this index follows the feature list below.

#### Features

- Vector similarity search
- Hybrid search (text + vector)
- Real-time indexing
- Scalable performance
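A minimal `opensearch-py` sketch for a k-NN query against the `documents` index from the configuration above; it assumes a hypothetical `knn_vector` field named `embedding` holding the 1536-dimension embeddings:

```python
# Minimal sketch: sign requests with IAM credentials and run a k-NN query
# against the OpenSearch domain from the configuration above.
import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth

credentials = boto3.Session().get_credentials()
auth = AWSV4SignerAuth(credentials, "us-east-1", "es")

client = OpenSearch(
    hosts=[{"host": "search-mydomain.us-east-1.es.amazonaws.com", "port": 443}],
    http_auth=auth,
    use_ssl=True,
    connection_class=RequestsHttpConnection,
)

query_vector = [0.0] * 1536  # replace with a real query embedding

results = client.search(
    index="documents",
    body={"size": 5, "query": {"knn": {"embedding": {"vector": query_vector, "k": 5}}}},
)
print([hit["_source"] for hit in results["hits"]["hits"]])
```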
### Amazon MemoryDB for Redis

Fast vector search with in-memory performance.

#### Use Cases

- Real-time recommendations
- Session-based search
- Caching layer for embeddings
- Low-latency applications
## Security and Compliance

### IAM Roles and Policies

Configure secure access to AWS resources.

#### Minimal IAM Policy

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "s3:GetObject",
        "s3:PutObject",
        "ses:SendEmail"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/*",
        "arn:aws:s3:::my-bucket/*",
        "arn:aws:ses:*:*:identity/*"
      ]
    }
  ]
}
```
### VPC Integration

Deploy BroxiAI integrations within your VPC for enhanced security.

#### Network Configuration

- Private subnets for database access
- NAT gateways for internet access
- Security groups for access control
- VPC endpoints for AWS services

### Encryption

#### Data Protection

- S3 bucket encryption (AES-256 or KMS)
- RDS encryption at rest
- SES encryption in transit
- Parameter Store for secrets
## Cost Optimization

### Resource Management

#### Best Practices

- Use appropriate instance sizes
- Implement auto-scaling policies
- Monitor and optimize usage
- Leverage spot instances where applicable

#### Cost Monitoring

- Set up billing alerts
- Use AWS Cost Explorer
- Tag resources for cost allocation
- Regular cost optimization reviews

### AWS Free Tier

Take advantage of AWS Free Tier services:

- Amazon Bedrock: limited free usage
- Amazon S3: 5 GB of storage
- Amazon DynamoDB: 25 GB of storage
- AWS Lambda: 1M free requests per month
## Deployment Patterns

### Multi-Region Deployment

Deploy across multiple AWS regions for:

- High availability
- Disaster recovery
- Global performance
- Compliance requirements

#### Architecture Example

```
Primary Region (us-east-1):
  - Production workloads
  - Primary databases
  - Main S3 buckets

Secondary Region (us-west-2):
  - Disaster recovery
  - Read replicas
  - Backup storage
```

### Auto Scaling

Configure auto-scaling for variable workloads:

- Application Load Balancer
- Auto Scaling Groups
- CloudWatch metrics
- Scaling policies
## Monitoring and Logging

### CloudWatch Integration

#### Key Metrics

- API response times
- Error rates
- Resource utilization
- Cost metrics

#### Log Aggregation

- Centralized logging
- Log retention policies
- Search and analysis
- Alerting rules

### AWS X-Ray

Distributed tracing for complex workflows (a minimal instrumentation sketch follows the list):

- Request tracing
- Performance analysis
- Bottleneck identification
- Service map visualization
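A minimal X-Ray SDK sketch for tracing the BroxiAI API call made from the Lambda example earlier on this page (the flow ID and environment variable are the same placeholders):

```python
# Minimal sketch: trace a BroxiAI API call from Lambda with the AWS X-Ray SDK.
# patch_all() instruments supported libraries such as requests and boto3 so
# downstream calls show up as subsegments in the service map.
import os

import requests
from aws_xray_sdk.core import xray_recorder, patch_all

patch_all()


@xray_recorder.capture("call_broxi_workflow")
def call_workflow(message):
    return requests.post(
        "https://api.broxi.ai/v1/flows/your-flow-id/run",
        headers={"Authorization": f"Bearer {os.environ['BROXI_API_TOKEN']}"},
        json={"input": message},
    ).json()


def lambda_handler(event, context):
    # Lambda manages the parent segment when active tracing is enabled
    return call_workflow(event["input_message"])
```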
## Getting Started

### Prerequisites

- AWS account with appropriate permissions
- BroxiAI account and API access
- AWS CLI configured
- Basic understanding of AWS services

### Quick Start Guide

1. Set up AWS credentials in BroxiAI global variables
2. Create an S3 bucket for document storage
3. Configure Bedrock access for AI models
4. Build your first workflow using AWS components
5. Test and deploy to production
### Example Workflow

Here's a complete workflow that processes documents from S3:
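The flow diagram itself is not reproduced here. As a stand-in, the sketch below exercises such a workflow end to end from Python: it uploads a document to the bucket used by the S3FileLoader example, then triggers the flow through the REST endpoint shown in the Lambda example (bucket, object key, and flow ID are placeholders):

```python
# Minimal sketch: upload a document to S3, then run a BroxiAI flow that
# loads and processes it with S3FileLoader.
import os

import boto3
import requests

s3 = boto3.client("s3", region_name="us-west-2")
s3.upload_file("manual.pdf", "my-documents-bucket", "documents/manual.pdf")

response = requests.post(
    "https://api.broxi.ai/v1/flows/your-flow-id/run",
    headers={"Authorization": f"Bearer {os.environ['BROXI_API_TOKEN']}"},
    json={
        "input": "Summarize the uploaded manual.",
        "variables": {"object_key": "documents/manual.pdf"},
    },
)
print(response.json())
```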

## Support Resources

## Next Steps