ImportError: cannot import name 'glue_criterion_metrics' from 'transformers'
SenYan1999 opened this issue · 6 comments
I installed transformers from pip (version 2.8.0) and it reported this error:
ImportError: cannot import name 'glue_criterion_metrics' from 'transformers'
I have worked around it for now by defining the function manually in my code, but why does this error occur? Thanks!!!
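For reference, a minimal sketch of such a manual workaround. It assumes the missing name is the FreeLB fork's counterpart of the stock `glue_compute_metrics` that ships in the pip releases of transformers 2.x; if the fork's version computes something different, delegating like this would be wrong:

```python
# Hedged workaround sketch: alias the fork-specific name to the upstream
# GLUE metrics helper that the pip release of transformers 2.x exports.
from transformers import glue_compute_metrics

def glue_criterion_metrics(task_name, preds, labels):
    # glue_compute_metrics dispatches on the task name, e.g. "cola" is
    # scored with the Matthews correlation coefficient (MCC).
    return glue_compute_metrics(task_name, preds, labels)
```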
The version bundled in this repo is 2.3. Please install the transformers library from this repo. You can do that by executing `pip install --editable .` under `FreeLB/huggingface-transformers`.
Thank you very much. If I want to implement a BERT version of FreeLB, given that your transformers fork only implements the ALBERT version, should I just modify the source code of the Hugging Face BERT model by adding dp_masks to it? Do I need to modify the training process?
If you use run_glue_freelb.py, then I think you only have to modify the BERT models.
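For anyone attempting such a port, here is a hedged sketch of the kind of change involved. FreeLB reuses the same dropout mask across the adversarial ascent steps on a mini-batch, so the forward pass has to accept precomputed masks; this appears to be what the fork's `dp_masks` argument is for. The helper below is an illustrative sketch, not the repo's actual implementation, and the name `fixed_dropout` is made up:

```python
import torch

def fixed_dropout(x, p, mask=None, training=True):
    """Inverted dropout that can reuse a precomputed mask.

    If `mask` is None, a fresh mask is sampled; otherwise the given mask is
    applied again, so every call with the same mask drops the same neurons.
    """
    if not training or p == 0.0:
        return x, mask
    if mask is None:
        # Keep each element with probability 1 - p, then rescale by
        # 1 / (1 - p) so the expected activation is unchanged.
        mask = torch.bernoulli(torch.full_like(x, 1.0 - p)) / (1.0 - p)
    return x * mask, mask
```

In a BERT port, one would replace each `nn.Dropout` call that should stay fixed (e.g. in self-attention and on the pooled output) with something like this, collect the returned masks on the first forward pass of the mini-batch, and feed them back in on the remaining ascent steps.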
Thank you very much!
I did what you did in ALBERT, but the result on CoLA is always an MCC of 0. And I found that the logit at position 0 is always negative, while the logit at position 1 is always positive.
Such a weird problem... But when I change some code in run_glue_freelb.py to remove the dp_masks and use the normal BERT models, it works well.
Maybe something is wrong on my end... I only apply dp_mask at the self-attention and the pooled_output...
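If it helps with debugging: a quick way to check that the masks are actually being reused (rather than resampled, which would make each ascent step attack a different subnetwork) is a standalone test like the one below, built on the hypothetical `fixed_dropout` helper sketched above. If the printed sum changes across steps, the mask is being regenerated:

```python
import torch

torch.manual_seed(0)
pooled_output = torch.randn(4, 8)   # stand-in for BERT's pooled output
dp_masks = {}                       # masks captured on the first forward pass

for step in range(3):               # e.g. the K adversarial ascent steps
    dropped, mask = fixed_dropout(
        pooled_output, p=0.1, mask=dp_masks.get("pooled"), training=True
    )
    dp_masks["pooled"] = mask       # same mask is fed back on later steps
    print(step, dropped.sum().item())  # should be identical every step
```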
Can you get reasonable results without actually using the dropout masks?
Yes, I did. And it works well!