BioBERT for PyTorch
dpappas opened this issue
Hello Everyone
I tried to convert the TensorFlow checkpoint to PyTorch, as explained in this link:
https://github.com/huggingface/pytorch-pretrained-BERT#command-line-interface
I am using the following code:
```python
import os
from pytorch_pretrained_bert.convert_tf_checkpoint_to_pytorch import convert_tf_checkpoint_to_pytorch

# Convert the TF checkpoint only if the PyTorch weights do not exist yet.
if not os.path.exists('/my_path/pytorch_model.bin'):
    convert_tf_checkpoint_to_pytorch(
        '/my_path/biobert_model.ckpt',   # TF checkpoint prefix
        '/my_path/bert_config.json',     # BERT config file
        '/my_path/pytorch_model.bin'     # output path for PyTorch weights
    )
```
I got the following error:
```
Traceback (most recent call last):
  File "/.../pytorch_pacrr_and_posit_drmm/convert_bert_model_to_pytorch.py", line 9, in <module>
    '/home/dpappas/Downloads/F_BERT/Biobert/pubmed_pmc_470k/pytorch_model.bin'
  File "/usr/local/lib/python3.6/site-packages/pytorch_pretrained_bert/convert_tf_checkpoint_to_pytorch.py", line 69, in convert_tf_checkpoint_to_pytorch
    pointer = getattr(pointer, l[0])
AttributeError: 'Parameter' object has no attribute 'BERTAdam'
```
Could anyone help me?
Thank you in advance.
Hi, dpappas
Our checkpoint includes the optimizer's parameters, since BioBERT was trained with BERT's customized Adam optimizer (BERTAdam).
So you can simply exclude those variables when loading our checkpoint in PyTorch.
Here is my solution:
```python
# Skip the optimizer's variables (BERTAdam state, accumulators, global step).
excluded = ['BERTAdam', '_power', 'global_step']
init_vars = list(filter(lambda x: all(e not in x[0] for e in excluded), init_vars))
```
Paste this code right below the `init_vars = tf.train.list_variables(tf_path)` line in `convert_tf_checkpoint_to_pytorch.py`.
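In case it helps, here is a minimal standalone sketch (assuming a TensorFlow 1.x environment and the same `/my_path/biobert_model.ckpt` checkpoint prefix used above) that lists the checkpoint's variables and shows which ones this filter would drop, so you can verify the exclusion before patching the conversion script:

```python
import tensorflow as tf

tf_path = '/my_path/biobert_model.ckpt'   # checkpoint prefix, as in the snippet above
excluded = ['BERTAdam', '_power', 'global_step']

# (name, shape) pairs for every variable stored in the TF checkpoint.
init_vars = tf.train.list_variables(tf_path)

kept = [name for name, _ in init_vars if all(e not in name for e in excluded)]
skipped = [name for name, _ in init_vars if any(e in name for e in excluded)]

print('Total variables in checkpoint  :', len(init_vars))
print('Kept for the PyTorch conversion:', len(kept))
print('Skipped optimizer variables:')
for name in skipped:
    print('  ', name)
```

Everything that survives the filter should correspond to model weights (embeddings, encoder layers, pooler, and the pre-training heads), which is what `convert_tf_checkpoint_to_pytorch` expects to map onto the PyTorch model.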
Thank you!
In PyTorch for NER tasks.