X-PLUG/mPLUG-Owl

I downloaded the weights from ModelScope and ran inference with the provided example code. It looks like some of the weights were not loaded — does this matter?
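For context, a minimal sketch (not the official example code) of how to see exactly which parameters failed to load. This assumes the model class follows the standard Hugging Face `PreTrainedModel.from_pretrained` API (which is what prints the warning below); the import path is illustrative and may differ in your checkout:

```python
# Hypothetical import path -- adjust to where the class lives in your clone.
from mplug_owl2.model.modeling_mplug_owl2 import MPLUGOwl2LlamaForCausalLM

model, loading_info = MPLUGOwl2LlamaForCausalLM.from_pretrained(
    "/data/oss_bucket_0/mplug_owl2",   # local checkpoint dir downloaded from ModelScope
    output_loading_info=True,          # also return missing / unexpected keys
)
print("missing keys:", loading_info["missing_keys"])        # weights absent from the checkpoint
print("unexpected keys:", loading_info["unexpected_keys"])  # checkpoint weights the model did not use
```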

LianghuiGuo opened this issue · 2 comments

Some weights of MPLUGOwl2LlamaForCausalLM were not initialized from the model checkpoint at /data/oss_bucket_0/mplug_owl2 and are newly initialized: ['model.visual_abstractor.encoder.layers.1.crossattention.attention.k_pos_embed', 'model.visual_abstractor.encoder.layers.4.crossattention.attention.q_pos_embed', 'model.visual_abstractor.encoder.layers.0.crossattention.attention.k_pos_embed', 'model.visual_abstractor.encoder.layers.4.crossattention.attention.k_pos_embed', 'model.visual_abstractor.encoder.layers.1.crossattention.attention.q_pos_embed', 'model.visual_abstractor.encoder.layers.3.crossattention.attention.q_pos_embed', 'model.visual_abstractor.encoder.layers.3.crossattention.attention.k_pos_embed', 'model.visual_abstractor.encoder.layers.0.crossattention.attention.q_pos_embed', 'model.visual_abstractor.encoder.layers.2.crossattention.attention.q_pos_embed', 'model.visual_abstractor.encoder.layers.5.crossattention.attention.q_pos_embed', 'model.visual_abstractor.encoder.layers.5.crossattention.attention.k_pos_embed', 'model.visual_abstractor.encoder.layers.2.crossattention.attention.k_pos_embed']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.

These parameters are fixed position embeddings: they are initialized deterministically (not randomly) and are not learned during training, so the warning does not affect the model's performance.
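To illustrate why such parameters are absent from the checkpoint, here is a minimal sketch of fixed 2D sin-cos position embeddings registered as non-trainable parameters. This is an assumption-laden illustration of the general pattern, not the exact code of mPLUG-Owl2's visual abstractor (names, shapes, and the query embedding construction are hypothetical):

```python
import numpy as np
import torch
import torch.nn as nn

def get_1d_sincos_pos_embed(dim, positions):
    """Standard 1D sin-cos position embedding; dim must be even."""
    omega = 1.0 / (10000 ** (np.arange(dim // 2, dtype=np.float64) / (dim / 2.0)))
    out = np.einsum("p,d->pd", positions.reshape(-1), omega)    # (num_pos, dim/2)
    return np.concatenate([np.sin(out), np.cos(out)], axis=1)   # (num_pos, dim)

def get_2d_sincos_pos_embed(dim, grid_size):
    """2D sin-cos embedding over a grid_size x grid_size grid of image patches."""
    grid_h, grid_w = np.meshgrid(np.arange(grid_size, dtype=np.float64),
                                 np.arange(grid_size, dtype=np.float64), indexing="ij")
    emb_h = get_1d_sincos_pos_embed(dim // 2, grid_h)
    emb_w = get_1d_sincos_pos_embed(dim // 2, grid_w)
    return np.concatenate([emb_h, emb_w], axis=1)                # (grid*grid, dim)

class CrossAttentionWithFixedPosEmbed(nn.Module):
    """Toy cross-attention stub showing non-trainable q/k position embeddings."""
    def __init__(self, hidden_size=1024, grid_size=24, num_queries=64):
        super().__init__()
        # Deterministic, non-trainable position embeddings: they are rebuilt
        # every time the module is constructed, so they are intentionally not
        # stored in the checkpoint -- which is exactly what triggers the
        # "newly initialized" warning at load time.
        k_pos = torch.from_numpy(get_2d_sincos_pos_embed(hidden_size, grid_size)).float()
        self.k_pos_embed = nn.Parameter(k_pos, requires_grad=False)
        # Query position embedding construction here is illustrative only.
        self.q_pos_embed = nn.Parameter(k_pos[:num_queries].clone(), requires_grad=False)
```

Because the values are recomputed deterministically in `__init__`, "newly initialized" here means "regenerated", not "random", so no retraining is needed.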

OK, thanks.