Inference Error
BillieJL opened this issue · 3 comments
I tried to run on video with this command:
python inference.py --model_path checkpoints/ --camera True
but it gives me the following error:
W tensorflow/stream_executor/gpu/redzone_allocator.cc:314] Internal: Invoking GPU asm compilation is supported on Cuda non-Windows platforms only
Relying on driver to perform ptx compilation.
Modify $PATH to customize ptxas location.
This message will be only logged once.
Any solution?
This warning is generated on Linux and Windows systems and has no effect on the results.
Well, I tried to run
python inference.py --model_path checkpoints/ --camera True
and the camera turned on successfully, but then this message appeared:
Incompatible shapes: [13300,2] vs. [4420,2] [Op:Mul] name: mul/
[ WARN:1] global C:\projects\opencv-python\opencv\modules\videoio\src\cap_msmf.cpp (674) SourceReaderCB::~SourceReaderCB terminating async callback
and then the camera turned off again.
Any solution?
The capture resolution should match the input image size used when generating the prior boxes (priors_box); otherwise the number of anchors generated for the frame won't match the model's predictions, which is what the "Incompatible shapes" error is reporting.
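A minimal sketch of a workaround, assuming the model expects a fixed input size (the 640x640 value below is a placeholder; use the size priors_box was built with) and that frames come from an OpenCV capture, is to resize each frame before running inference:

import cv2

INPUT_SIZE = (640, 640)  # hypothetical (width, height); use the size priors_box was built with

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Resize the captured frame so the number of generated priors matches
    # the model's prediction tensor (avoids the Incompatible shapes error).
    frame = cv2.resize(frame, INPUT_SIZE)
    # ... run the detection model on `frame` the same way inference.py does for a single image ...
    cv2.imshow('frame', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()

Alternatively, you can try setting the capture resolution itself with cap.set(cv2.CAP_PROP_FRAME_WIDTH, ...) and cap.set(cv2.CAP_PROP_FRAME_HEIGHT, ...), but not all webcams honor arbitrary resolutions, so resizing the frame is the more reliable option.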