AI Proxy Quickstart

You can use the OpenAI SDK with an EnkryptAI deployment. The AI Proxy routes requests to the model saved in the deployment and applies the input and output guardrails configured for that deployment.

First, create a sample policy, a sample model, and then a sample deployment to use with the OpenAI SDK.

Set up a sample policy

export ENKRYPTAI_API_KEY="YOUR_ENKRYPTAI_API_KEY"

curl --request POST \
  --url https://api.enkryptai.com/guardrails/add-policy \
  --header 'Content-Type: application/json' \
  --header "apikey: $ENKRYPTAI_API_KEY" \
  --data '{
  "name": "sample-policy",
  "description": "Sample policy for testing",
  "detectors": {
    "topic_detector": {
      "enabled": false,
      "topic": []
    },
    "nsfw": {
      "enabled": true
    },
    "toxicity": {
      "enabled": false
    },
    "pii": {
      "enabled": false,
      "entities": [
        "pii"
      ]
    },
    "injection_attack": {
      "enabled": true
    },
    "keyword_detector": {
      "enabled": false,
      "banned_keywords": []
    },
    "system_prompt": {
      "enabled": false
    },
    "copyright_ip": {
      "enabled": false
    },
    "policy_violation": {
      "enabled": true,
      "policy_text": "Do not allow any illegal or immoral activities.",
      "need_explanation": true
    },
    "bias": {
      "enabled": false
    }
  }
}'

Response:

JSON
{
  "message": "Policy details added successfully",
  "data": {...}
}
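
If you prefer Python over curl, the same request can be made with the requests library. The sketch below mirrors the curl call above; the endpoint and payload are copied from it, and only the ENKRYPTAI_API_KEY environment variable is assumed to be set.

Python
import os

import requests

ENKRYPT_BASE_URL = "https://api.enkryptai.com"

policy_payload = {
    "name": "sample-policy",
    "description": "Sample policy for testing",
    "detectors": {
        "topic_detector": {"enabled": False, "topic": []},
        "nsfw": {"enabled": True},
        "toxicity": {"enabled": False},
        "pii": {"enabled": False, "entities": ["pii"]},
        "injection_attack": {"enabled": True},
        "keyword_detector": {"enabled": False, "banned_keywords": []},
        "system_prompt": {"enabled": False},
        "copyright_ip": {"enabled": False},
        "policy_violation": {
            "enabled": True,
            "policy_text": "Do not allow any illegal or immoral activities.",
            "need_explanation": True,
        },
        "bias": {"enabled": False},
    },
}

# Same endpoint and headers as the curl example above
response = requests.post(
    f"{ENKRYPT_BASE_URL}/guardrails/add-policy",
    headers={"apikey": os.environ["ENKRYPTAI_API_KEY"]},
    json=policy_payload,
)
response.raise_for_status()
print(response.json()["message"])  # "Policy details added successfully"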

Set up a sample model

export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
export ENKRYPTAI_API_KEY="YOUR_ENKRYPTAI_API_KEY"

curl --request POST \
  --url https://api.enkryptai.com/models/add-model \
  --header 'Content-Type: application/json' \
  --header "apikey: $ENKRYPTAI_API_KEY" \
  --data '{
    "model_saved_name": "sample-model",
    "testing_for": "LLM",
    "model_name": "gpt-4o",
    "model_type": "text_2_text",
    "certifications": [
      "GDPR",
      "CCPA",
      "SOC 2 Type 2",
      "SOC 3",
      "CSA STAR Level 1"
    ],
    "model_config": {
      "model_provider": "openai",
      "model_version": "1",
      "hosting_type": "External",
      "model_source": "https://openai.com",
      "endpoint": {
        "scheme": "https",
        "host": "api.openai.com",
        "port": 443,
        "base_path": "/v1"
      },
      "paths": {
        "completions": "/completions",
        "chat": "/chat/completions"
      },
      "auth_data": {
        "header_name": "Authorization",
        "header_prefix": "Bearer",
        "space_after_prefix": true,
        "api_key": "'$OPENAI_API_KEY'"
      },
      "metadata": {
        "max_tokens": 500,
        "input_cost_1M_tokens": 2.5,
        "output_cost_1M_tokens": 10
      },
      "default_request_options": {
        "temperature": 1,
        "top_p": 1,
        "top_k": null
      }
    }
  }'

Response:

JSON
{
  "message": "Model details added successfully",
  "data": {...}
}
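
The endpoint, base_path, and paths fields together describe the upstream URL the proxy calls for this saved model. As a hedged illustration only (the exact composition logic is an assumption, and upstream_url is a hypothetical helper, not part of any EnkryptAI SDK), the chat path from the config above would resolve like this:

Python
endpoint = {"scheme": "https", "host": "api.openai.com", "port": 443, "base_path": "/v1"}
paths = {"completions": "/completions", "chat": "/chat/completions"}

def upstream_url(endpoint: dict, path: str) -> str:
    # Hypothetical helper: shows how the endpoint fields presumably combine
    # into the URL the proxy calls on the model provider's side.
    return f"{endpoint['scheme']}://{endpoint['host']}:{endpoint['port']}{endpoint['base_path']}{path}"

print(upstream_url(endpoint, paths["chat"]))
# https://api.openai.com:443/v1/chat/completions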

Set up a sample deployment

export ENKRYPTAI_API_KEY="YOUR_ENKRYPTAI_API_KEY"

curl --request POST \
  --url https://api.enkryptai.com/deployments/add-deployment \
  --header 'Content-Type: application/json' \
  --header "apikey: $ENKRYPTAI_API_KEY" \
  --data '{
  "name": "sample-deployment",
  "model_saved_name": "sample-model",
  "input_guardrails_policy": {
    "policy_name": "sample-policy",
    "enabled": true,
    "additional_config": {
      "pii_redaction": false
    },
    "block": [
      "nsfw",
      "injection_attack",
      "policy_violation"
    ]
  },
  "output_guardrails_policy": {
    "policy_name": "sample-policy",
    "enabled": true,
    "additional_config": {
      "hallucination": false,
      "adherence": false,
      "relevancy": false
    },
    "block": [
      "nsfw",
      "injection_attack",
      "policy_violation"
    ]
  }
}'

Response:

JSON
{
  "message": "Deployment details added successfully",
  "data": {...}
}
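
As with the policy, the same call can be made from Python. This sketch mirrors the curl payload above and reuses the policy and model names created in the previous steps.

Python
import os

import requests

ENKRYPT_BASE_URL = "https://api.enkryptai.com"
block_list = ["nsfw", "injection_attack", "policy_violation"]

deployment_payload = {
    "name": "sample-deployment",
    "model_saved_name": "sample-model",
    "input_guardrails_policy": {
        "policy_name": "sample-policy",
        "enabled": True,
        "additional_config": {"pii_redaction": False},
        "block": block_list,
    },
    "output_guardrails_policy": {
        "policy_name": "sample-policy",
        "enabled": True,
        "additional_config": {
            "hallucination": False,
            "adherence": False,
            "relevancy": False,
        },
        "block": block_list,
    },
}

response = requests.post(
    f"{ENKRYPT_BASE_URL}/deployments/add-deployment",
    headers={"apikey": os.environ["ENKRYPTAI_API_KEY"]},
    json=deployment_payload,
)
response.raise_for_status()
print(response.json()["message"])  # "Deployment details added successfully"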

AI Proxy Example Usage with OpenAI SDK

Python SDK
# python3 -m pytest -s test_openai.py

import os
import pytest
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

ENKRYPT_API_KEY = os.getenv("ENKRYPTAI_API_KEY")
ENKRYPT_BASE_URL = "https://api.enkryptai.com"

# The OpenAI client still expects an API key; if none is passed here it falls
# back to the OPENAI_API_KEY environment variable (loaded above via dotenv).
# Authentication to EnkryptAI is handled by the "apikey" header sent per request.
client = OpenAI(
    base_url=f"{ENKRYPT_BASE_URL}/ai-proxy"
)

test_deployment_name = "sample-deployment"  # the deployment created above

# Custom headers
custom_headers = {
    'apikey': ENKRYPT_API_KEY,
    'X-Enkrypt-Deployment': test_deployment_name
}

# Example of making a request with custom headers
response = client.chat.completions.create(
    model='gpt-4o',
    messages=[{'role': 'user', 'content': 'Hello!'}],
    extra_headers=custom_headers
)

print("\n\nResponse from OpenAI API with custom headers: ", response)
print("\nResponse data type: ", type(response))

def test_openai_response():
    assert response is not None
    assert hasattr(response, "choices")
    assert len(response.choices) > 0
    print("\n\nOpenAI API response is: ", response.choices[0].message.content)
    assert hasattr(response, "enkrypt_policy_detections")
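
Outside of pytest, a minimal usage sketch looks like the following. The enkrypt_policy_detections field is the guardrail verdict attached by the proxy (its exact shape depends on the detectors enabled in the policy), so it is read with getattr to keep the sketch safe if the field is absent. Passing a placeholder api_key is an assumption made here only to satisfy the OpenAI SDK; authentication to EnkryptAI happens via the apikey header.

Python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.enkryptai.com/ai-proxy",
    # The SDK requires some api_key value; EnkryptAI auth uses the header below.
    api_key=os.environ.get("OPENAI_API_KEY", "unused"),
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a limerick about API gateways."}],
    extra_headers={
        "apikey": os.environ["ENKRYPTAI_API_KEY"],
        "X-Enkrypt-Deployment": "sample-deployment",
    },
)

print(response.choices[0].message.content)
# Guardrail verdicts attached by the proxy for the configured policy
print(getattr(response, "enkrypt_policy_detections", None))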