VisionLabs/torch-opencv

detectMultiScale3 call from lua (cannot convert 'nil' to 'bool')

Closed this issue · 4 comments

face_rects, meta = face_cascade:detectMultiScale3{image=frame, scaleFactor=1.2, minNeighbors=5, flags=0, minSize={0,0}, maxSize={0,0}, outputRejectLevels=true}

.../jatin/torch/install/share/lua/5.1/cv/objdetect/init.lua:204: bad argument #8 to 'CascadeClassifier_detectMultiScale3' (cannot convert 'nil' to 'bool')
stack traceback:
[C]: in function 'CascadeClassifier_detectMultiScale3'
...atin17/torch/install/share/lua/5.1/cv/objdetect/init.lua:204: in function 'detectMultiScale3'
[string "face_rects, meta = face_cascade:detectMultiSc..."]:1: in main chunk
[C]: in function 'xpcall'
/home/jatin17/torch/install/share/lua/5.1/trepl/init.lua:679: in function 'repl'
...in17/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:204: in main chunk
[C]: at 0x00406670

I am trying to get confidence scores for detected faces through the detectMultiScale3 API, but it throws a (cannot convert 'nil' to 'bool') error. Even when I don't pass the outputRejectLevels parameter it throws the same error, although the implementation at https://github.com/VisionLabs/torch-opencv/blob/master/cv/objdetect/init.lua defaults it to false.

On closer inspection I found the bug. We never parse the outputRejectLevels argument, so it is always nil when it reaches the C binding:

-- BUG: outputRejectLevels is not among the values unpacked from argcheck...
local image, scaleFactor, minNeighbors, flags, minSize, maxSize = cv.argcheck(t, argRules)

    -- ...so it is nil here, which the C side cannot convert to bool
    local result = C.CascadeClassifier_detectMultiScale3(self.ptr, cv.wrap_tensor(image),
        scaleFactor, minNeighbors, flags, minSize, maxSize, outputRejectLevels)
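A minimal sketch of the fix, assuming argRules already declares a rule for outputRejectLevels with a false default (as the linked init.lua suggests): simply include it in the list of values unpacked from cv.argcheck.

```lua
-- Sketch: unpack outputRejectLevels along with the other arguments
-- so it is no longer nil when forwarded to the C binding.
local image, scaleFactor, minNeighbors, flags, minSize, maxSize, outputRejectLevels =
    cv.argcheck(t, argRules)

local result = C.CascadeClassifier_detectMultiScale3(self.ptr, cv.wrap_tensor(image),
    scaleFactor, minNeighbors, flags, minSize, maxSize, outputRejectLevels)
```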

Fixed, thank you very much!
For a long time I've been meaning to switch to

local result = C.CascadeClassifier_detectMultiScale3(self.ptr, cv.argcheck(t, argRules))

to avoid such bugs, but there's been no time to implement it :/
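That refactor would work because a Lua function call in the last argument position expands to all of its return values, so an argcheck that returned the parsed arguments in declaration order would forward them without any manual unpacking to forget. A stand-alone plain-Lua illustration of the idea (argcheck_sketch is hypothetical, not the real torch-opencv API):

```lua
-- Hypothetical argcheck: returns parsed values in rule-declaration order.
local function argcheck_sketch(t, rules)
    local out = {}
    for i, rule in ipairs(rules) do
        local v = t[rule.name]
        if v == nil then v = rule.default end
        out[i] = v
    end
    -- On Lua 5.1 (Torch's default), use the global unpack instead.
    return (table.unpack or unpack)(out)
end

local function detect(scaleFactor, minNeighbors, outputRejectLevels)
    print(scaleFactor, minNeighbors, outputRejectLevels)
end

local rules = {
    {name = "scaleFactor",        default = 1.1},
    {name = "minNeighbors",       default = 3},
    {name = "outputRejectLevels", default = false},
}

-- The call expands to all parsed values; there is no manual
-- destructuring step where an argument could be left out.
detect(argcheck_sketch({scaleFactor = 1.2}, rules))
-- prints: 1.2  3  false
```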

Awesome. Thanks.