tunib-ai/parallelformers

How can I parallelize the MegatronBertModel?

kajyuuen opened this issue · 1 comment

Thanks for sharing the great code.

Let me ask you one question.
The list of Fully Supported Models includes Megatron BERT, but the following code does not work:

from transformers import MegatronBertModel
from parallelformers import parallelize

# Load the Megatron BERT checkpoint, then try to split it across 2 GPUs
model = MegatronBertModel.from_pretrained('nvidia/megatron-bert-cased-345m')
parallelize(model, num_gpus=2, verbose='detail')

Also, I could not find MegatronBertModel in the policies.
How can I parallelize the MegatronBertModel?

We added MegatronBertModel.
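
For reference, supporting a new architecture in parallelformers generally means adding a policy class that tells the library which attention and MLP weights to slice across GPUs and which config attributes to rescale per rank. The sketch below is illustrative only, following the custom-policy pattern shown in the project README for other models; the MegatronBertLayer import path and the attribute paths (e.g. attention.self.query.weight) are assumptions based on the BERT-style module layout and may differ from the policy actually merged into the library.

from transformers.models.megatron_bert.modeling_megatron_bert import MegatronBertLayer

from parallelformers.policies.base import Layer, Policy
from parallelformers.utils.dist_utils import AllReduceLinear


class MegatronBertPolicy(Policy):
    @staticmethod
    def replace_arguments(config, world_size):
        # Shrink per-GPU head count and head size so each rank holds one slice.
        # Attribute paths here are assumed from the BERT-style layer layout.
        return {
            "attention.self.num_attention_heads": config.num_attention_heads // world_size,
            "attention.self.all_head_size": config.hidden_size // world_size,
        }

    @staticmethod
    def attn_qkv():
        # Column-parallel: query/key/value projections are split across GPUs.
        return [
            Layer(weight="attention.self.query.weight", bias="attention.self.query.bias"),
            Layer(weight="attention.self.key.weight", bias="attention.self.key.bias"),
            Layer(weight="attention.self.value.weight", bias="attention.self.value.bias"),
        ]

    @staticmethod
    def attn_out():
        # Row-parallel: the attention output projection needs an all-reduce.
        return [
            Layer(
                weight="attention.output.dense.weight",
                bias="attention.output.dense.bias",
                replace=AllReduceLinear,
            ),
        ]

    @staticmethod
    def mlp_in():
        # Column-parallel: first MLP projection.
        return [
            Layer(weight="intermediate.dense.weight", bias="intermediate.dense.bias"),
        ]

    @staticmethod
    def mlp_out():
        # Row-parallel: second MLP projection, followed by an all-reduce.
        return [
            Layer(
                weight="output.dense.weight",
                bias="output.dense.bias",
                replace=AllReduceLinear,
            ),
        ]

    @staticmethod
    def original_layer_class():
        # The transformer block type this policy applies to.
        return MegatronBertLayer

Once the library maps MegatronBertModel to a policy like this, the snippet from the original report should work unchanged.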