# ToxicityClassificator

A module for predicting the toxicity of messages in Russian and English.

## Usage example

```python
from toxicityclassifier import ToxicityClassificatorV1

classifier = ToxicityClassificatorV1()

text = "Some message to check"
print(classifier.predict(text))          # (0 or 1, probability)
print(classifier.get_probability(text))  # probability
print(classifier.classify(text))         # 0 or 1
```

## Weights

Threshold for classification: if the predicted probability is >= `weight`, the message is classified as 1 (toxic), otherwise 0.

```python
classifier.weight = 0.5
```
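
For example, lowering the threshold makes the classifier flag messages as toxic at lower probabilities. A minimal sketch using only the API shown above; the threshold value and sample message are illustrative:

```python
from toxicityclassifier import ToxicityClassificatorV1

classifier = ToxicityClassificatorV1()
classifier.weight = 0.3  # stricter threshold: flag anything with probability >= 0.3

# With the lower threshold, borderline messages are more likely to be classified as 1
print(classifier.classify("you are not very smart"))  # 0 or 1, depending on the predicted probability
```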


Threshold for language detection (English or Russian): if the proportion of Russian text is >= `language_weight`, the Russian model is used, otherwise the English one.

```python
classifier.language_weight = 0.5
```
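
A minimal sketch of adjusting the language threshold, assuming the behaviour described above; the threshold value and sample texts are illustrative:

```python
from toxicityclassifier import ToxicityClassificatorV1

classifier = ToxicityClassificatorV1()
classifier.language_weight = 0.7  # require at least 70% Russian text before using the Russian model

print(classifier.predict("ты ужасен"))          # mostly Russian  -> Russian model
print(classifier.predict("you are terrible"))   # mostly English  -> English model
```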