Example request:

curl --request POST \
  --url https://api.enkryptai.com/redteam/v3/model-health \
  --header 'Content-Type: application/json' \
  --header 'apikey: <api-key>' \
  --data '{
    "endpoint_configuration": {
      "testing_for": "foundationModels",
      "model_name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
      "certifications": [],
      "model_config": {
        "model_provider": "together",
        "hosting_type": "External",
        "input_modalities": ["text"],
        "output_modalities": ["text"],
        "model_source": "https://together.ai",
        "rate_per_min": 20,
        "system_prompt": "",
        "endpoint": {
          "scheme": "https",
          "host": "api.together.xyz",
          "port": 443,
          "base_path": "/v1"
        },
        "paths": {
          "completions": "/completions",
          "chat": "/chat/completions"
        },
        "auth_data": {
          "header_name": "Authorization",
          "header_prefix": "Bearer",
          "space_after_prefix": true
        },
        "apikeys": ["xxxxx"],
        "metadata": {},
        "default_request_options": {}
      }
    }
  }'

Example response (Successful Response):

{
  "status": "healthy",
  "message": "Model health check completed successfully",
  "data": {
    "query": "What is the capital of India?",
    "response": "The capital of India is New Delhi."
  },
  "error": "Error while checking model health"
}
The request body is a single endpoint_configuration object; its model_config child accepts the fields described below.

model_config.model_provider: Provider of the model, which determines the request and response format. Allowed values: openai, together, huggingface, groq, azure_openai, anthropic, cohere, bedrock, gemini, ai21, fireworks, alibaba, portkey, deepseek, mistral, llama, openai_compatible, cohere_compatible, anthropic_compatible, custom, url, enkryptai, sierra, boxai, nutanix, xactly. Example: "together"

model_config.hosting_type: Hosting type of the model. Allowed values: External, Internal. Example: "External"

model_config.model_source: Source of the model. Example: "https://together.ai"
model_config.tools: Array of tools available to the model; each tool has a name and a description. Example:
[
  {
    "name": "web_search",
    "description": "The tool web search is used to search the web for information related to finance."
  }
]

model_config.input_modalities: Types of input that the model can process. Allowed values: text, image, audio. Example: ["text"]

model_config.output_modalities: Types of output that the model can generate. Allowed values: text. Example: ["text"]

model_config.rate_per_min: Request rate limit per minute. Below 100, async mode is not enabled; above 100, tests run in async mode; above 200, boosted async is used (all tests in parallel). Default: 20

model_config.system_prompt: System prompt. Example: ""
model_config.endpoint: Endpoint the model is served from, with the following child attributes:

model_config.endpoint.scheme: Scheme of the endpoint. Example: "https"

model_config.endpoint.host: Host of the endpoint. Example: "api.together.xyz"

model_config.endpoint.port: Port of the endpoint. Example: 443

model_config.endpoint.base_path: Base path of the endpoint. Example: "/v1"
model_config.auth_data: Authentication settings for the model endpoint, with the following child attributes:

model_config.auth_data.header_name: Name of the header that carries the credential. Example: "Authorization"

model_config.auth_data.header_prefix: Prefix placed before the credential. Example: "Bearer"

model_config.auth_data.space_after_prefix: Whether a space separates the prefix from the credential. Example: true

auth_data also accepts additional provider-specific credential fields; the reference examples include a token value ("token"), placeholder identifiers of the form "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", a Google Cloud service-account JSON string, and opaque secrets such as "xxxxxxxx".

If enabled, the authorization header or parameter can be overridden in the request by the value configured here. Example: false

BoxAI client ID for authentication. Example: "BOX_CLIENT_ID"

BoxAI client secret for authentication. Example: "BOX_CLIENT_SECRET"

BoxAI user ID for authentication. Example: "BOX_USER_ID"

BoxAI default file ID for operations. Example: "BOX_FILE_ID"

Nutanix user session ID for authentication. Example: "NUTANIX_USER_SESSION_ID"

Nutanix email ID for authentication. Example: "NUTANIX_EMAIL_ID"

Xactly username for authentication. Example: "XACTLY_USERNAME"

Xactly password for authentication. Example: "XACTLY_PASSWORD"

model_config.apikeys: API keys used to authenticate with the model endpoint. Example: ["TOGETHER_AI_API_KEY"]

Authentication method. Allowed values: apikey, jwt. Example: "apikey"
Token request configuration (used when the authentication method is jwt), with the following child attributes:

Method of the token request. Allowed values: POST, GET. Example: "POST"

URL of the token request. Example: "https://example.com/api/auth"

Array of custom headers to be sent with token requests; each header has a key and a value. Example:
[
  {
    "key": "content-type",
    "value": "application/x-www-form-urlencoded"
  },
  {
    "key": "x-custom-header",
    "value": "custom-value"
  }
]

The body of the token request. Example: "grant_type=client_credentials&client_id=xxxxxxxxxx&client_secret=xxxxxxxxxx"

The key of the token in the response, in jq format. Example: ".access_token"

A custom curl command can also be supplied; its {token} and {prompt} placeholders are filled in at request time. Example:
"curl --location 'https://example.com/api/chat' \\\n--header 'Accept: application/json, text/plain, */*' \\\n--header 'Accept-Language: en-US,en;q=0.9' \\\n--header 'Authorization: {token}' \\\n--header 'Connection: keep-alive' \\\n--header 'Content-Type: application/json' \\\n--data '{\"prompt\":\"{prompt}\"}'"
Array of custom headers to be sent with requests. Example:
[
  {
    "key": "X-Custom-Header",
    "value": "custom-value"
  }
]

Custom request payload: a flexible object that can contain any custom key-value pairs; the only requirement is that {prompt} is included somewhere in the payload. Example:
{
  "messages": [
    {
      "role": "developer",
      "content": "You are a helpful assistant."
    },
    { "role": "user", "content": "{prompt}" }
  ]
}

Content type of the custom response. Currently only JSON is supported. Allowed values: json. Example: "json"

Optional response format path, in jq format for the json type (e.g. '.choices[0].message.content'). Example: ".choices[0].message.content"
Default request options for the model, with child attributes whose example values are 2048, 2.5, and 10.
If the provider is Azure, the instance type. Example: "enkrypt2024"

If the provider is Azure, the API version. Example: "2024-02-01"

If the provider is Azure, the deployment ID. Example: "gpt3"

If the provider is Anthropic, the version. Example: ""

If the model is Llama2, its format. Allowed values: openai. Example: "openai"

If the model is Mistral, its format. Allowed values: openai, ollama. Example: "openai"
Vertex AI settings for Gemini models, with the following child attributes:

If running Gemini on Vertex, the regional API endpoint (hostname only). Example: ""

If running Gemini on Vertex, the project ID. Example: ""

If running Gemini on Vertex, the location ID. Example: ""
testing_for: Purpose of testing. Allowed values: foundationModels, chatbotsAndCopilots, agents, URL. Example: "foundationModels"

model_name: Name of the model. Required when testing_for is foundationModels. Example: "mistralai/Mixtral-8x7B-Instruct-v0.1"

certifications: List of certifications. Example: ["GDPR", "CCPA", "HIPAA", "SOC 2 Type II", "SOC 3"]
Full endpoint_configuration example:

{
  "testing_for": "foundationModels",
  "model_name": "mistralai/Mixtral-8x7B-Instruct-v0.1",
  "certifications": [],
  "model_config": {
    "model_provider": "together",
    "hosting_type": "External",
    "input_modalities": ["text"],
    "output_modalities": ["text"],
    "model_source": "https://together.ai",
    "rate_per_min": 20,
    "system_prompt": "",
    "endpoint": {
      "scheme": "https",
      "host": "api.together.xyz",
      "port": 443,
      "base_path": "/v1"
    },
    "paths": {
      "completions": "/completions",
      "chat": "/chat/completions"
    },
    "auth_data": {
      "header_name": "Authorization",
      "header_prefix": "Bearer",
      "space_after_prefix": true
    },
    "apikeys": ["xxxxx"],
    "metadata": {},
    "default_request_options": {}
  }
}
Successful Response

status: Health of the model. Allowed values: healthy, unhealthy

message: Status message. Example: "Model health check completed successfully"

data: The health-check query sent to the model and the model's response.

error: Error details from the health check, if any. Example: "Error while checking model health"