# thrad-distilbert-conversation-classifier
A DistilBERT-based model for conversation classification, fine-tuned with hard labels. For additional information about data preprocessing, training, or evaluation, refer to the project's public repository [here].
## Model Details
- Base Architecture: DistilBERT
- Task: Multi-class conversation intent classification
## Usage
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the fine-tuned classifier and its tokenizer from the Hugging Face Hub
model = AutoModelForSequenceClassification.from_pretrained("Thrad/thrad-distilbert-conversation-classifier")
tokenizer = AutoTokenizer.from_pretrained("Thrad/thrad-distilbert-conversation-classifier")

# Example inference
text = "Your conversation text here"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
outputs = model(**inputs)

# Convert raw logits to class probabilities
predictions = outputs.logits.softmax(dim=-1)
```
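To turn the probabilities into a class name, take the argmax and look it up in the model config. A minimal sketch continuing the example above, assuming the checkpoint's `id2label` mapping was populated during training (otherwise Transformers falls back to generic `LABEL_<i>` names):

```python
# Index of the highest-probability class for the single input above
predicted_id = int(predictions.argmax(dim=-1))

# id2label is assumed to be set in the model config; otherwise this returns "LABEL_<i>"
predicted_label = model.config.id2label[predicted_id]
confidence = predictions[0, predicted_id].item()
print(f"{predicted_label} ({confidence:.3f})")
```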