
Open Source AI Gateway
An open-source AI gateway that streamlines management of multiple large language model (LLM) providers behind a single endpoint, with built-in controls for efficiency, safety, and observability.
About Open Source AI Gateway
This open-source AI gateway enables seamless management of multiple LLM providers such as OpenAI, Anthropic, Gemini, Ollama, Mistral, and Cohere. It features comprehensive analytics, safety guardrails, rate limiting, caching, and administrative controls, supporting both HTTP and gRPC protocols for versatile deployment.
How to Use
Configure the Config.toml file with your API credentials and model preferences. Launch the Docker container with the configuration mounted. Send API requests via curl or your preferred client, specifying the desired LLM provider.
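The steps above might look like the following sketch. The Config.toml keys, Docker image name, port, and request header are illustrative assumptions, not the project's documented values — consult the project's README for the exact schema and image name.

```shell
# Config.toml — illustrative keys only; the real schema may differ.
# [providers.openai]
# api_key = "sk-..."
# model = "gpt-4o"

# Launch the container with the config mounted read-only
# (image name and port are assumptions).
docker run -d \
  -p 8080:8080 \
  -v "$(pwd)/Config.toml:/app/Config.toml:ro" \
  example/open-source-ai-gateway:latest

# Send a request via curl, selecting the desired LLM provider
# (endpoint path and header name are hypothetical).
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "x-llm-provider: openai" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
```

Because the gateway sits between clients and providers, switching providers is a matter of changing the provider selector in the request rather than rewriting client code.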
