intel/libyami

Decoder 'drain' functionality

Closed this issue · 18 comments

I'm trying to update from an older libyami, where I used the drain parameter of decodeGetOutput; setting it to true would get me a frame out for every frame in. Since that parameter has been removed, how do you get similar functionality?

Hi @jsorg71 ,
you may need to call decode with a zero-size input to drain the decoder.
Sample code is here:
https://github.com/intel/libyami-utils/blob/master/tests/vppinputdecode.cpp#L86
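The drain pattern from that sample looks roughly like the sketch below (not compiled here; `decoder` is assumed to be an `IVideoDecoder*` as in the libyami-utils sample):

```cpp
// Flush the decoder by feeding a zero-size input, then pull out
// whatever frames are still queued (pattern from vppinputdecode.cpp).
VideoDecodeBuffer buffer;
memset(&buffer, 0, sizeof(buffer));
buffer.data = NULL;   // zero-size input signals end of stream
buffer.size = 0;
decoder->decode(&buffer);

SharedPtr<VideoFrame> frame;
while ((frame = decoder->getOutput())) {
    // consume each drained frame here
}
```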

Hi @xhaihao , I think I tried that; I'll try again tomorrow to be sure. It looks like that is used for the EOF case, where you are about to delete or restart the decoder.

@jsorg71 , if you want one frame in, one frame out. You may need set enableLowLatency. But it has limitation. It only works for h264 and IPPPP pattern. It's a hack mode, we do not strictly follow the spec in this mode.
If possible, I suggest you change your encoder to zero latency mode. It maybe better.
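A minimal sketch of setting that flag, assuming (as the thread implies) that enableLowLatency is a field of the VideoConfigBuffer passed to start(); not compiled here, and the profile is a placeholder:

```cpp
VideoConfigBuffer config;
memset(&config, 0, sizeof(config));
config.profile = VAProfileH264Main;  // assumed profile; adjust to your stream
config.enableLowLatency = true;      // hack mode: h264 IPPPP streams only
decoder->start(&config);
```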

@xuguangxin thanks, I'll look into that. I only use I and P frames.
I don't have that option in libyami 1.2.
When I build 1.3 I get

In file included from vaapicontext.cpp:27:0:
../vaapi/vaapistreamable.h: In function ‘std::ostream& operator<<(std::ostream&, const VAEntrypoint&)’:
../vaapi/vaapistreamable.h:108:10: error: ‘VAEntrypointFEI’ was not declared in this scope
     case VAEntrypointFEI:
          ^~~~~~~~~~~~~~~
Makefile:603: recipe for target 'libyami_vaapi_la-vaapicontext.lo' failed

This was added by intel/libva@397a4a2; you can comment out the case or upgrade libva to a newer version.
It's hard to add an #ifdef for every libva version; it would make the code unreadable.
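If you would rather keep the case than comment it out, one alternative is to guard it with libva's version-check macro. A sketch only: this assumes VAEntrypointFEI arrived with VA-API 1.0 (libva 2.x), so check the exact version in your va.h, and the printed string is illustrative:

```cpp
// In vaapistreamable.h, guard the entry point that older libva lacks.
#if VA_CHECK_VERSION(1, 0, 0)
    case VAEntrypointFEI:
        os << "VAEntrypointFEI";
        break;
#endif
```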

@xuguangxin I got 1.3.0 building, thanks. I set 'enableLowLatency' before calling start, but I still don't see any difference.

BTW, it looks like the VAEntrypointFEI issue is fixed in HEAD of apache.

@jsorg71 , thanks for the information. It may be related to the bitstream; could you send your bitstream to me?
thanks

@xuguangxin Sure, I'll capture and provide I and P frames that work perfectly with libyami 0.3.1 but not with any version after.

I put the files in http://server1.xrdp.org/a8/libyami-issue-844/

surface_command_beef_0.bin
h264 frames with a 16-byte header between each frame. It contains:
8 frames at 1920x1080, I P P P ...
42 frames at 424x368, I P P P ...
7 frames at 1032x776, I P P P ...

surface_command_beef_1.bin
h264 frames with a 16-byte header between each frame. It contains:
2 frames at 1920x1080, I P

h264_beef_stepper_031.tar.gz
example program for libyami v0.3.1 that works as expected

h264_beef_stepper_apache.tar.gz
example program for current apache head that I can't seem to get working.

Of course, you will have to edit the Makefile to get them building on your system, but they are really simple.
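For reference, a capture in the layout described above (a 16-byte header between frames) could be split with a small parser like this. The thread does not say what the 16-byte header contains, so this sketch assumes, purely for illustration, that the first four bytes hold the frame length in little-endian; the real layout may differ:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>
#include <vector>

// One parsed frame: offset/length into the original capture buffer.
struct FrameSpan {
    size_t offset;
    size_t length;
};

// Split a capture like surface_command_beef_0.bin into frames.
// ASSUMPTION (not stated in the thread): the first 4 bytes of each
// 16-byte header hold the frame length, little-endian; the remaining
// 12 bytes are opaque. Adjust to the real header layout.
std::vector<FrameSpan> splitFrames(const uint8_t* data, size_t size) {
    std::vector<FrameSpan> frames;
    size_t pos = 0;
    while (pos + 16 <= size) {
        uint32_t len;
        memcpy(&len, data + pos, 4);  // assumed little-endian frame length
        pos += 16;                    // skip the whole 16-byte header
        if (pos + len > size)
            break;                    // truncated capture; stop cleanly
        frames.push_back({pos, len});
        pos += len;
    }
    return frames;
}
```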

Thanks @jsorg71, we will try it.

Hi @jsorg71
Although we still need internal tests and unit tests, could you help try
https://github.com/xuguangxin/libyami-utils/tree/fix_low_latency
https://github.com/xuguangxin/libyami/tree/fix_low_latency
and see if they meet your requirements?
You need to change two things in your application:

  1. add the low-latency flag, as you did.
  2. add VIDEO_DECODE_BUFFER_FLAG_FRAME_END when you know a frame is complete. This tells the decoder to stop searching for the next frame boundary and decode the current frame ASAP.
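Together, the two changes above might look like this in the application (a sketch against the libyami names used in this thread, not compiled here; `frameData`/`frameSize` are placeholders for one complete compressed frame):

```cpp
// Change 1: request low-latency mode before start().
config.enableLowLatency = true;
decoder->start(&config);

// Change 2: mark each complete input frame so the decoder stops
// searching for the next frame boundary and decodes it immediately.
VideoDecodeBuffer buffer;
memset(&buffer, 0, sizeof(buffer));
buffer.data = frameData;
buffer.size = frameSize;
buffer.flag |= VIDEO_DECODE_BUFFER_FLAG_FRAME_END;
decoder->decode(&buffer);

SharedPtr<VideoFrame> out = decoder->getOutput();  // one frame in, one out
```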

Another thing is interlaced streams: do you have an interlacing requirement for low latency?

thanks

@xuguangxin Thanks, that works!

glad to hear this.

What do you think about making a 1.3.1 release? I like this feature, but I don't like grabbing a develop branch. It looks like it has been about a year since the last release.

thanks guys

no problem, it's our job

released here: #864