zhongyy/Face-Transformer

How do I solve 'size mismatch for patch_to_embedding.weight: copying a param with shape torch.Size([512, 432]) from checkpoint, the shape in current model is torch.Size([512, 192]).'?


When I load the pretrained checkpoint into the model, `load_state_dict` fails with:

size mismatch for patch_to_embedding.weight: copying a param with shape torch.Size([512, 432]) from checkpoint, the shape in current model is torch.Size([512, 192]).

How can I fix this?
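
The shapes hint at a patch-size mismatch between the downloaded weights and the model you instantiated: 432 = 3 × 12 × 12 (12 × 12 RGB patches, presumably the ViT-P12S8 variant) while 192 = 3 × 8 × 8 (8 × 8 patches, presumably ViT-P8S8). Below is a minimal, self-contained sketch, not the repo's actual model; `TinyViTStub` is a hypothetical stand-in for the `patch_to_embedding` linear layer named in the error. It shows how to read the patch size back out of a checkpoint and build the model so the shapes match.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the layer named in the error message:
# patch_to_embedding = nn.Linear(channels * patch_size**2, dim).
# The real model has many more layers; only the embedding matters here.
class TinyViTStub(nn.Module):
    def __init__(self, patch_size, channels=3, dim=512):
        super().__init__()
        self.patch_to_embedding = nn.Linear(channels * patch_size ** 2, dim)

# Simulate a checkpoint saved from a model built with patch_size=12 (3*12*12 = 432).
# In practice this would be torch.load("<your checkpoint>.pth", map_location="cpu").
ckpt = TinyViTStub(patch_size=12).state_dict()

# Recover the patch size the checkpoint expects from the weight shape.
dim, patch_dim = ckpt["patch_to_embedding.weight"].shape   # (512, 432)
patch_size = int(round((patch_dim / 3) ** 0.5))            # 432 -> 12, 192 -> 8
print(f"checkpoint was trained with patch_size={patch_size}")

# Build the current model with the same patch size before loading the weights.
model = TinyViTStub(patch_size=patch_size, dim=dim)
model.load_state_dict(ckpt)   # no size-mismatch error now
```

Equivalently, either construct the model with the `patch_size` the checkpoint was trained with, or download the checkpoint that matches the model variant you are constructing.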