Tokenizer can't be hashed when using datasets.map function
dumpmemory opened this issue · 2 comments
dumpmemory commented
The tokenizer can't be hashed when using the datasets.map function with num_proc > 1.
danyang-rainbow commented
Same problem here. Can anyone help solve this?
Sleepychord commented
https://stackoverflow.com/questions/55344376/how-to-import-protobuf-module
Seems like protobuf objects are not picklable. I will look into it in the next few days.