Use the Bert-Wiki model with the Transformers pipeline for zero-shot text classification
pazcuturi opened this issue · 1 comment
pazcuturi commented
Hi,
This work is amazing. I've been playing around with the demo as well, and thank you for making all of this publicly available.
I was wondering whether there's a way to integrate the Bert-Wiki model directly with the Transformers zero-shot classification pipeline (as shown here). I'm particularly interested in providing the class descriptions (along with the names) and using multi_label classification.
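For context, this is roughly the pipeline usage I mean (a minimal sketch; the facebook/bart-large-mnli checkpoint is just the default from the Transformers docs, not this repo's model):
from transformers import pipeline

# Standard zero-shot classification pipeline from the Transformers docs
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(
    "A new NLI model was released for topic classification.",
    candidate_labels=["machine learning", "sports", "politics"],
    multi_label=True,
)
print(result["labels"], result["scores"])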
Thanks!
pazcuturi commented
Never mind, just figured it out. Leaving this here for others:
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
# Map the NLI labels so the pipeline can locate the "entailment" logit
model = AutoModelForSequenceClassification.from_pretrained("CogComp/ZeroShotWiki", label2id={"entailment": 0, "neutral": 1, "contradiction": 2})
tokenizer = AutoTokenizer.from_pretrained("CogComp/ZeroShotWiki")
classifier = pipeline("zero-shot-classification", model=model, tokenizer=tokenizer)
results = classifier(sequence, candidate_topics, multi_label=True)
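If you also want the class descriptions to play a role (not just the names), one workaround that worked for me (not necessarily the intended usage; the labels and descriptions below are just made-up examples) is to pass the descriptions as the candidate labels and adjust the hypothesis_template:
# Hypothetical label-to-description mapping; replace with your own classes
label_descriptions = {
    "sports": "sports, i.e. athletic games and competitions",
    "politics": "politics, i.e. government, elections and public policy",
}
results = classifier(
    sequence,
    candidate_labels=list(label_descriptions.values()),
    hypothesis_template="This text is about {}.",
    multi_label=True,
)
The returned labels are then the description strings, which you can map back to the short class names with the dict.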