All Detectors Explained
Toxicity Detector
The detector provides a detailed analysis of the text’s toxicity, indicating the presence and levels of general toxicity, obscene language, insults, threats, and identity hate. It returns a list of dictionaries, each containing the following information (an illustrative example follows the list):
- score: The toxicity score of the text on a scale of 0 to 100, where higher scores indicate greater toxicity.
- type: The type of toxicity detected, such as “obscene”, “insult”, “threat”, or “identity_hate”.
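Below is a minimal sketch of how a consumer might work with output in the list-of-dictionaries shape described above. The `results` literal and the 50-point threshold are hypothetical values chosen for illustration, not output from any specific call.

```python
# Hypothetical detector output shaped like the list of dictionaries described above.
results = [
    {"score": 87, "type": "insult"},
    {"score": 12, "type": "threat"},
    {"score": 64, "type": "obscene"},
]

# Flag any toxicity type whose score crosses an application-chosen threshold.
THRESHOLD = 50  # illustrative; tune for your use case

flagged = [r for r in results if r["score"] >= THRESHOLD]
for r in flagged:
    print(f"{r['type']}: {r['score']}/100")
```

Because the detector reports each toxicity type separately, an application can apply different thresholds per type (for example, a lower tolerance for “threat” than for “obscene”) rather than relying on a single aggregate score.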