🔀 Bedrock Proxy Endpoint
OpenAI API-compatible server for AWS Bedrock LLMs
Bedrock Proxy Endpoint lets you spin up your own OpenAI API-compatible server in front of AWS Bedrock LLM text inference. Existing OpenAI-compatible applications can talk to Bedrock without modification: the proxy handles the request and response format conversion in both directions.
Key Features
- OpenAI API compatibility layer for AWS Bedrock
- Support for multiple Bedrock models
- Format conversion between the OpenAI and Bedrock request/response schemas (see the sketch after this list)
- Easy integration with existing applications
- Customizable endpoint configuration
- Detailed error handling and logging
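As a rough illustration of what this conversion layer does, here is a minimal, hypothetical sketch that maps OpenAI-style chat messages onto the legacy Anthropic text-completion prompt format that `anthropic.claude-v2` expects on Bedrock. The function name and message shape are illustrative and are not the repo's actual code:

```typescript
// Hypothetical sketch of OpenAI -> Bedrock (Claude v2) prompt conversion.
// The real bedrock-proxy-endpoint implementation may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function toClaudePrompt(messages: ChatMessage[]): string {
  let prompt = "";
  for (const m of messages) {
    if (m.role === "system") prompt += `${m.content}\n`; // system text goes first
    else if (m.role === "user") prompt += `\n\nHuman: ${m.content}`;
    else prompt += `\n\nAssistant: ${m.content}`;
  }
  // Claude v2 generates its completion after this trailing marker
  return prompt + "\n\nAssistant:";
}
```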
Installation
- Clone the repository:
git clone https://github.com/jparkerweb/bedrock-proxy-endpoint.git
cd bedrock-proxy-endpoint
- Install dependencies:
npm ci
- Configure AWS credentials:
aws configure
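If the server relies on the AWS SDK's default credential provider chain (typical for Node Bedrock clients), environment variables such as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, a shared credentials file, or an attached IAM role should also work in place of `aws configure`.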
Usage
Start the server:
npm start
Make requests to your endpoint:
curl http://localhost:3000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "anthropic.claude-v2",
"messages": [{"role": "user", "content": "Hello!"}]
}'
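Because the proxy speaks the OpenAI wire format, existing OpenAI client libraries should work by pointing their base URL at it. A minimal sketch using the official `openai` Node package, assuming the proxy exposes the `/v1/chat/completions` route shown above and does not validate the API key:

```typescript
import OpenAI from "openai";

// Point the standard OpenAI client at the proxy instead of api.openai.com.
const client = new OpenAI({
  baseURL: "http://localhost:3000/v1",
  apiKey: "not-used", // assumption: the proxy authenticates with AWS, so a placeholder suffices
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "anthropic.claude-v2",
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```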
Configuration
Configure the endpoint by setting environment variables:
PORT=3000
AWS_REGION=us-east-1
DEFAULT_MODEL=anthropic.claude-v2
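Since these are ordinary environment variables, you can also set them inline for a one-off run (values here are illustrative):
PORT=8080 AWS_REGION=us-west-2 DEFAULT_MODEL=anthropic.claude-v2 npm start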