Pretrained text encoder
ethancohen123 opened this issue · 2 comments
ethancohen123 commented
Is it possible to use and train DALL-E with an external (frozen) text encoder (such as those available on Hugging Face)?
ethancohen123 commented
Does anyone have an idea about this? @lucidrains
kingnobro commented
Hi. If you want to use a pretrained language model, you are actually using that model's text embeddings.
- First, load a pretrained model such as CLIP or BERT and save the weights of its text embedding layer.
- Then, replace `text_emb` in the `DALLE.__init__` function. Instead of creating a new text embedding with `nn.Embedding`, use `torch.load` to load the pretrained weights saved in step 1.

Example: link
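A minimal sketch of the two steps above, assuming a DALLE-style model whose `__init__` normally creates `text_emb = nn.Embedding(...)`. The vocabulary size, embedding dimension, and the stand-in "pretrained" embedding are placeholders; in practice you would take the weights from the real pretrained model (e.g. CLIP's token embedding):

```python
import torch
import torch.nn as nn

VOCAB_SIZE, EMB_DIM = 49408, 512  # placeholder sizes for illustration

# Step 1: save the text embedding weights of a pretrained model.
# Here a fresh nn.Embedding stands in for the pretrained layer; with a
# real model you would save that model's token embedding state_dict.
pretrained_emb = nn.Embedding(VOCAB_SIZE, EMB_DIM)
torch.save(pretrained_emb.state_dict(), "text_emb.pt")

# Step 2: inside DALLE.__init__, instead of creating a new embedding,
# load the saved weights and freeze them to keep the encoder frozen.
text_emb = nn.Embedding(VOCAB_SIZE, EMB_DIM)
text_emb.load_state_dict(torch.load("text_emb.pt"))
text_emb.weight.requires_grad = False

# The embedding is used exactly as before on token ids.
tokens = torch.randint(0, VOCAB_SIZE, (1, 8))
out = text_emb(tokens)
print(out.shape)  # (batch, seq_len, emb_dim)
```

Freezing via `requires_grad = False` is optional; leaving it trainable would fine-tune the pretrained embeddings along with the rest of DALL-E.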