getalp/Flaubert

Import Error

rcontesti opened this issue · 5 comments

Hi, I have tried to load Flaubert as described:

import torch
from transformers import FlaubertModel, FlaubertTokenizer

Unfortunately it returns an ImportError:

  from transformers import FlaubertModel, FlaubertTokenizer
ImportError: cannot import name 'FlaubertModel'

Hi @rcontesti,

Please upgrade transformers with the command below.

pip install https://github.com/huggingface/transformers.git --upgrade

Yes, I tried it, but I'm getting:

ERROR: Cannot unpack file C:\Users\RUBENC~1\AppData\Local\Temp\pip-unpack-d48ipzuq\transformers.git (downloaded from C:\Users\RUBENC~1\AppData\Local\Temp\pip-req-build-vbov4_l8, content-type: text/html; charset=utf-8); cannot detect archive format ERROR: Cannot determine archive format of C:\Users\RUBENC~1\AppData\Local\Temp\pip-req-build-vbov4_l8
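That archive-format error usually means pip downloaded the repository's HTML page rather than a package: installing from a repository URL generally needs the git+ prefix. Something along these lines should work:

pip install --upgrade git+https://github.com/huggingface/transformers.git

or, if the release on PyPI already includes Flaubert, simply:

pip install --upgrade transformers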

I also upgraded with conda, but it's still not working. It seems, as you imply, that many models are still missing:

File "C:\Users\Ruben Contesti\AppData\Local\Continuum\Anaconda3\envs...\lib\site-packages\transformers\configuration_utils.py", line 145, in from_pretrained
raise EnvironmentError(msg)
OSError: Model name 'flaubert-base-uncased-squad' was not found in model name list (bert-base-uncased, bert-large-uncased, bert-base-cased, bert-large-cased, bert-base-multilingual-uncased, bert-base-multilingual-cased, bert-base-chinese, bert-base-german-cased, bert-large-uncased-whole-word-masking, bert-large-cased-whole-word-masking, bert-large-uncased-whole-word-masking-finetuned-squad, bert-large-cased-whole-word-masking-finetuned-squad, bert-base-cased-finetuned-mrpc, bert-base-german-dbmdz-cased, bert-base-german-dbmdz-uncased). We assumed 'flaubert-base-uncased-squad' was a path or url to a configuration file named config.json or a directory containing such a file but couldn't find any such file at this path or url.

If you obtained errors when upgrading, then I think you should ask on the official transformers repo. Once you have successfully upgraded, Flaubert should work. There are four configurations supported by FlaubertModel and FlaubertTokenizer in transformers, for example:
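Here is a minimal sketch of what should work once the upgrade has gone through. The identifier below assumes the base cased configuration; older transformers releases used shortcut names such as 'flaubert-base-cased', newer ones the 'flaubert/...' form, so pick whichever matches your version:

import torch
from transformers import FlaubertModel, FlaubertTokenizer

# One of the four configurations (small-cased, base-uncased, base-cased, large-cased);
# adjust the name to your transformers version.
modelname = 'flaubert/flaubert_base_cased'
flaubert, log = FlaubertModel.from_pretrained(modelname, output_loading_info=True)
flaubert_tokenizer = FlaubertTokenizer.from_pretrained(modelname)

sentence = "Le chat mange une pomme."
token_ids = torch.tensor([flaubert_tokenizer.encode(sentence)])
last_layer = flaubert(token_ids)[0]  # (batch size, sequence length, hidden size)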

> I also upgraded with conda, but it's still not working. It seems, as you imply, that many models are still missing.

We haven't trained Flaubert on a question answering dataset yet. Maybe you want to use this model? If so, you should follow the instructions provided on this page.
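For reference, loading such a community checkpoint would look roughly like this; the 'username/flaubert-base-uncased-squad' identifier below is a placeholder, so use the actual name given on that page:

from transformers import FlaubertForQuestionAnsweringSimple, FlaubertTokenizer

# Placeholder hub identifier; replace with the real 'user/model' name.
qa_name = "username/flaubert-base-uncased-squad"
qa_tokenizer = FlaubertTokenizer.from_pretrained(qa_name)
qa_model = FlaubertForQuestionAnsweringSimple.from_pretrained(qa_name)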

@rcontesti I'm going to close this issue now. Please feel free to re-open if needed.