runFullImpl: failed to generate timestamp token
mike22437120 opened this issue · 1 comment
mike22437120 commented
sakura6264 commented
Sorry, but I can't reproduce this error.
Sometimes, when the model can't recognize anything, it will skip 1s of the media and log "failed to generate timestamp token"; usually running the inference again makes it work.
So there may be something wrong with your input.
You could also try the official implementation; if it doesn't work there either, you could open an issue in that repository.
(I don't have a GPU powerful enough to run the large model, so I could only test with the medium one.)