MTLab/onnx2caffe

Load caffe model failed.

Closed this issue · 7 comments

Hi, my test code is as follows:

#include <opencv2/dnn.hpp>

using namespace cv;
using namespace cv::dnn;

const String model_desc = "resnet18-pytorch2caffe.prototxt";
const String model_binary = "resnet18-pytorch2caffe.caffemodel";

int test_caffe()
{
    // Load the converted Caffe model (throws cv::Exception on failure)
    Net net = readNetFromCaffe(model_desc, model_binary);
    return 0;
}

The error message is as follows:

OpenCV Error: Assertion failed (pbBlob.raw_data_type() == caffe::FLOAT16) in cv::dnn::experimental_dnn_v3::`anonymous-namespace'::CaffeImporter::blobFromProto, file C:\build\master_winpack-build-win64-vc15\opencv\modules\dnn\src\caffe\caffe_importer.cpp, line 251

Part of the conversion log is as follows:

I1112 09:26:10.485890 38095 net.cpp:337] 113 does not need backward computation.
I1112 09:26:10.485898 38095 net.cpp:337] 112 does not need backward computation.
I1112 09:26:10.485908 38095 net.cpp:337] 111 does not need backward computation.
I1112 09:26:10.485914 38095 net.cpp:337] 111_bn does not need backward computation.
I1112 09:26:10.485924 38095 net.cpp:337] 110 does not need backward computation.
I1112 09:26:10.485929 38095 net.cpp:337] 109 does not need backward computation.
I1112 09:26:10.485936 38095 net.cpp:337] 108 does not need backward computation.
I1112 09:26:10.485944 38095 net.cpp:337] 108_bn does not need backward computation.
I1112 09:26:10.485950 38095 net.cpp:337] 107 does not need backward computation.
I1112 09:26:10.485955 38095 net.cpp:337] 106_106_0_split does not need backward computation.
I1112 09:26:10.485962 38095 net.cpp:337] 106 does not need backward computation.
I1112 09:26:10.485970 38095 net.cpp:337] 105 does not need backward computation.
I1112 09:26:10.485976 38095 net.cpp:337] 104 does not need backward computation.
I1112 09:26:10.485982 38095 net.cpp:337] 104_bn does not need backward computation.
I1112 09:26:10.485991 38095 net.cpp:337] 103 does not need backward computation.
I1112 09:26:10.485998 38095 net.cpp:337] 0 does not need backward computation.
I1112 09:26:10.486003 38095 net.cpp:379] This network produces output 171
I1112 09:26:10.486146 38095 net.cpp:402] Top memory (TEST) required for data: 50884608 diff: 50884608
I1112 09:26:10.486155 38095 net.cpp:405] Bottom memory (TEST) required for data: 50882560 diff: 50882560
I1112 09:26:10.486160 38095 net.cpp:408] Shared (in-place) memory (TEST) by data: 9934848 diff: 9934848
I1112 09:26:10.486166 38095 net.cpp:411] Parameters memory (TEST) required for data: 45795232 diff: 45795232
I1112 09:26:10.486171 38095 net.cpp:414] Parameters shared memory (TEST) by data: 0 diff: 0
I1112 09:26:10.486176 38095 net.cpp:420] Network initialization done.

I tried your code on Mac with OpenCV 3.4.3 and it works well. What is your build environment?

Windows 64 + VS2017 + OpenCV 3.4.
Can you test my converted Caffe model? Here is the link:
https://pan.baidu.com/s/16NWlmNvkOzf2JM_KFCwi2w

I'm sorry that I cannot find a Windows environment to test your code and model. When I tried to load your model on Mac, it printed this error:
F1114 15:25:39.219410 2443531136 blob.cpp:496] Check failed: count_ == proto.data_size() (9408 vs. 0) *** Check failure stack trace: ***
This kind of error usually happens when an old caffemodel is loaded with a newer version of Caffe. Which version of Caffe are you using?

Thanks a lot for your efforts. The Caffe model was converted by your project; maybe it was produced by an old version of Caffe, while OpenCV 3.4 targets the newer format, so I'll do more tests on Caffe later.

When this project converts a model, the generated caffemodel is run once in pycaffe to check the output difference between PyTorch and Caffe, so if the conversion succeeds, I believe the model is runnable. I suggest you convert the model in a Unix/Linux environment using the newest Caffe. You can build only the CPU version of Caffe, since conversion does not need a GPU, and that can easily be done in a virtual machine given that your operating system is Windows. I have tested converting your model on Mac and Ubuntu, and it worked well.
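The output-difference check described above can be sketched roughly as follows. This is a minimal numpy-only illustration, not this project's actual code: the arrays stand in for the real PyTorch and pycaffe forward-pass outputs, and the function name and tolerance are assumptions for the sketch.

```python
import numpy as np

def outputs_match(pytorch_out, caffe_out, tol=1e-4):
    """Compare two forward-pass outputs element-wise and report the
    largest absolute difference (a stand-in for the real check)."""
    pytorch_out = np.asarray(pytorch_out, dtype=np.float32)
    caffe_out = np.asarray(caffe_out, dtype=np.float32)
    max_diff = float(np.abs(pytorch_out - caffe_out).max())
    print(f"max abs diff: {max_diff:.6f}")
    return max_diff < tol

# Dummy outputs standing in for real network results
a = np.random.rand(1, 1000).astype(np.float32)
b = a + 1e-6  # nearly identical, as expected after a good conversion
print(outputs_match(a, b))  # True: difference is within tolerance
```

If the converted model is broken, the maximum difference reported here would be large rather than near zero, which is why a successful conversion run is a reasonable sign that the caffemodel itself is loadable and runnable.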

@YukiNagato, thank you for your constructive suggestion!