sheffieldnlp/naacl2018-fever

Use a different tokenizer for claims than for Wikipedia articles

Closed this issue · 0 comments

j6mes commented

Wikipedia articles in the dump are pre-tokenized and just need splitting on whitespace. Claims are raw text and need to be tokenized properly.
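
A minimal sketch of the distinction, not the repository's actual implementation: whitespace splitting for the pre-tokenized Wikipedia text, and a full tokenizer (NLTK's `word_tokenize` is used here purely as an example, assuming the `punkt` data is installed) for raw claims.

```python
# Sketch only: illustrates treating pre-tokenized wiki text and raw claims differently.
from nltk.tokenize import word_tokenize


def tokenize_wiki_article(text: str) -> list[str]:
    # Wikipedia text in the dump is already tokenized; tokens are space-separated.
    return text.split()


def tokenize_claim(claim: str) -> list[str]:
    # Raw claims need real tokenization (punctuation, contractions, etc.).
    return word_tokenize(claim)


if __name__ == "__main__":
    # Pre-tokenized article text already has punctuation split off.
    print(tokenize_wiki_article("Colin Kaepernick became a free agent ."))
    # The claim is raw text, so the tokenizer must split the trailing period itself.
    print(tokenize_claim("Colin Kaepernick became a free agent."))
```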