Introduction
AI Proxy API Documentation
Welcome to the AI Proxy API documentation. This guide provides the information you need to integrate our APIs and proxy your LLM calls through Enkrypt. The proxy endpoints are compatible with the OpenAI SDKs and adhere to the OpenAI API specification.
Purpose
The AI Proxy API is designed to help you proxy LLM endpoints through Enkrypt, enabling you to configure input and output guardrails. By routing requests through the proxy, you can protect your language models from adversarial attacks and keep your AI deployments secure.
Offered APIs
Our API suite includes the following endpoints:
- chat-completions: This endpoint allows you to generate chat completions.
- completions: This endpoint allows you to generate text completions.
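To make the difference between the two endpoints concrete, the sketch below builds an OpenAI-spec request body for each using only the Python standard library. The endpoint paths and model names are assumptions based on the OpenAI API layout; confirm the exact values in the endpoint reference.

```python
import json

# Assumed paths, following the OpenAI API layout -- verify these against
# the endpoint reference for your deployment.
CHAT_COMPLETIONS_PATH = "/v1/chat/completions"
COMPLETIONS_PATH = "/v1/completions"

# chat-completions takes a list of role-tagged messages...
chat_body = json.dumps({
    "model": "gpt-4o",  # placeholder model name
    "messages": [{"role": "user", "content": "Summarize our refund policy."}],
})

# ...while completions takes a single prompt string.
completion_body = json.dumps({
    "model": "gpt-3.5-turbo-instruct",  # placeholder model name
    "prompt": "Summarize our refund policy.",
    "max_tokens": 128,
})
```

Use chat-completions for conversational, multi-turn exchanges and completions for single-prompt text generation.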
Obtaining an API Key
To get started with the AI Proxy API, you need to obtain an API key. Follow these steps:
- Log in: Access your account at app.enkryptai.com.
- Get API Key: Navigate to the API section to get your unique API key.
- Authentication: Use this API key in the authorization headers of your proxy API calls.
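The steps above boil down to sending your key in the authorization header of every request. A stdlib-only sketch, assuming a placeholder proxy URL, an OpenAI-style endpoint path, and Bearer-token auth (inferred from the OpenAI compatibility; confirm the expected header format in the endpoint reference):

```python
import json
import urllib.request

# Placeholders -- substitute the values from your Enkrypt account.
BASE_URL = "https://your-enkrypt-proxy.example.com"
API_KEY = "YOUR_ENKRYPT_API_KEY"

request = urllib.request.Request(
    url=BASE_URL + "/v1/chat/completions",  # assumed OpenAI-style path
    data=json.dumps({
        "model": "gpt-4o",  # placeholder model name
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode("utf-8"),
    headers={
        # Bearer auth is assumed from the OpenAI compatibility; verify
        # the expected header format in the endpoint reference.
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# To actually send the request (requires a live proxy):
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp))
```

Keep the key out of source control; loading it from an environment variable or a secrets manager is the usual choice.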
By following these steps, you can seamlessly integrate the AI Proxy API into your enterprise applications.
We are committed to helping you maintain the highest standards of safety in your language model deployments. Should you have any questions or require further assistance, our support team is readily available to help.
Let’s ensure safer AI together!