DepthAnything/Depth-Anything-V2

RuntimeError: missing keys when loading state_dict for DepthAnythingV2

Opened this issue · 0 comments

8FDC commented

```
depth_anything.load_state_dict(torch.load(f'checkpoints/depth_anything_v2_{args.encoder}.pth', map_location='cpu'))
  File "C:\Users\admin\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\torch\nn\modules\module.py", line 2189, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for DepthAnythingV2:
	Missing key(s) in state_dict: "pretrained.blocks.12.norm1.weight", "pretrained.blocks.12.norm1.bias", "pretrained.blocks.12.attn.qkv.weight", "pretrained.blocks.12.attn.qkv.bias", "pretrained.blocks.12.attn.proj.weight", "pretrained.blocks.12.attn.proj.bias", "pretrained.blocks.12.ls1.gamma", "pretrained.blocks.12.norm2.weight", "pretrained.blocks.12.norm2.bias", "pretrained.blocks.12.mlp.fc1.weight", "pretrained.blocks.12.mlp.fc1.bias", "pretrained.blocks.12.mlp.fc2.weight", "pretrained.blocks.12.mlp.fc2.bias", "pretrained.blocks.12.ls2.gamma", "pretrained.blocks.13.norm1.weight", "pretrained.blocks.13.norm1.bias", "pretrained.blocks.13.attn.qkv.weight", "pretrained.blocks.13.attn.qkv.bias", "pretrained.blocks.13.attn.proj.weight", "pretrained.blocks.13.attn.proj.bias", "pretrained.blocks.13.ls1.gamma", "pretrained.blocks.13.norm2.weight", "pretrained.blocks.13.norm2.bias", "pretrained.blocks.13.mlp.fc1.weight", "pretrained.blocks.13.mlp.fc1.bias", ...
```