Use different tokenizer for claims to wikipedia articles
Closed this issue · 0 comments
j6mes commented
Wikipedia articles are pre-tokenized and only need splitting on whitespace, whereas claims need to be tokenized properly.
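A minimal sketch of the two code paths described above, using a simple regex tokenizer as a hypothetical stand-in for a full tokenizer (e.g. NLTK or spaCy), which is an assumption, not the project's actual implementation:

```python
import re

def tokenize_wiki(text: str) -> list[str]:
    # Wikipedia article text is pre-tokenized: tokens are already
    # separated by single spaces, so a plain split is enough.
    return text.split()

def tokenize_claim(text: str) -> list[str]:
    # Claims are raw text, so punctuation must be separated from words.
    # This regex tokenizer is only an illustrative stand-in.
    return re.findall(r"\w+|[^\w\s]", text)

# Pre-tokenized wiki text keeps punctuation as separate space-delimited tokens.
print(tokenize_wiki("The cat sat ."))   # ['The', 'cat', 'sat', '.']
# A raw claim has punctuation attached, which the regex tokenizer splits off.
print(tokenize_claim("The cat sat."))   # ['The', 'cat', 'sat', '.']
```

Routing the two input types through separate tokenizers keeps the wiki tokens exactly as published while still normalizing free-form claim text.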