Freeze Transformer Weights
ivokun opened this issue · 1 comments
ivokun commented
Can we finetune the model with the Transformer layers' weights frozen (i.e., train only the embeddings and the softmax layer)?
StellaAthena commented
That is not currently supported by this codebase, but it is easy enough to do either with the HuggingFace library or by writing a small wrapper around this codebase.
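A minimal sketch of that kind of wrapper in plain PyTorch (the toy model and its attribute names here are hypothetical, but the same `requires_grad` pattern applies to any model, including HuggingFace ones): freeze the transformer blocks and leave the embeddings and the output (softmax) projection trainable.

```python
import torch
import torch.nn as nn

# Hypothetical toy LM: embeddings -> transformer blocks -> softmax head.
class ToyLM(nn.Module):
    def __init__(self, vocab_size=100, d_model=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, vocab_size)  # logits before softmax

    def forward(self, tokens):
        return self.head(self.transformer(self.embed(tokens)))

model = ToyLM()

# Freeze only the transformer blocks; embeddings and head keep gradients.
for param in model.transformer.parameters():
    param.requires_grad = False

# Hand the optimizer only the parameters that are still trainable.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```

With a HuggingFace causal LM you would do the same over `model.transformer.h` (GPT-2) or the equivalent block list, keeping in mind that GPT-2 ties its output head to the input embeddings.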