AI Proxy Quickstart

We can use the OpenAI SDK to call the EnkryptAI deployment we created. The AI proxy takes care of routing requests to the model saved in the deployment and applying the input and output guardrails configured for it.

Install required libraries

Shell
pip install requests python-dotenv tabulate pandas enkryptai-sdk openai
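The example below loads the Enkrypt API key from the environment with python-dotenv. A minimal `.env` file in the working directory might look like this (the key value is a placeholder you replace with your own):

```
# .env — read by load_dotenv(); keep this file out of version control
ENKRYPTAI_API_KEY=your-enkrypt-api-key
```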

Setup your deployment

Please create a sample policy, a sample model, and a sample deployment before proceeding with the steps below.

See AI Proxy Quickstart

Example Usage

# python3 -m pytest -s test_openai.py

import os
import pytest
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

ENKRYPT_API_KEY = os.getenv("ENKRYPTAI_API_KEY")
ENKRYPT_BASE_URL = "https://api.enkryptai.com"

client = OpenAI(
    # The OpenAI client requires an api_key to be set. The proxy itself
    # authenticates via the 'apikey' header below, so we pass the
    # Enkrypt key here to satisfy the client.
    api_key=ENKRYPT_API_KEY,
    base_url=f"{ENKRYPT_BASE_URL}/ai-proxy"
)

test_deployment_name = "test-deployment"

# Custom headers
custom_headers = {
    'apikey': ENKRYPT_API_KEY,
    'X-Enkrypt-Deployment': test_deployment_name
}

# Example of making a request with custom headers
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}],
    extra_headers=custom_headers
)

print("\n\nResponse from OpenAI API with custom headers: ", response)
print("\nResponse data type: ", type(response))

def test_openai_response():
    assert response is not None
    assert hasattr(response, "choices")
    assert len(response.choices) > 0
    print("\n\nOpenAI API response is: ", response.choices[0].message.content)
    assert hasattr(response, "enkrypt_policy_detections")
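Beyond asserting that `enkrypt_policy_detections` is present, you may want to see which guardrails actually flagged content. The helper below is a minimal sketch that assumes the detections can be read as a nested mapping of stage → check → result; the exact shape of `enkrypt_policy_detections` may differ in your deployment, so adapt the traversal as needed. The sample payload is hypothetical and for illustration only.

```python
def summarize_detections(detections):
    """Return the names of guardrail checks that flagged content.

    Assumes `detections` is a mapping like
    {"input_guardrails": {...}, "output_guardrails": {...}} where each
    inner value is truthy when the check fired. Adjust to match the
    actual structure returned by your deployment.
    """
    flagged = []
    for stage, checks in (detections or {}).items():
        for name, result in (checks or {}).items():
            if result:
                flagged.append(f"{stage}.{name}")
    return flagged


# Hypothetical sample payload, not an actual proxy response
sample = {
    "input_guardrails": {"injection_attack": False, "toxicity": True},
    "output_guardrails": {"pii": False},
}
print(summarize_detections(sample))  # ['input_guardrails.toxicity']
```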