Picking a new MaxNALUSize
database64128 opened this issue
After more usage, it turns out that the MaxNALUSize raised in #30 is still not enough for 2160p60 streaming:
```
unable to decode AVCC: NALU size (4298040) is too big, maximum is 4194304
```
Before opening another pull request to raise the limit again, I would like to understand more about the effects of this value on memory usage. Would it be ok to just double it?
Hello, that variable doesn't cause any memory to be allocated by itself; it's an upper bound on how much memory a single NALU is allowed to occupy, and it exists to prevent people from crafting packets that can crash software that depends on this library.
In the case of RTSP / WebRTC / SRT / UDP, only one NALU at a time is routed from the publisher to readers, so the maximum RAM occupied by a stream is MaxNALUSize * (maximum write queue size).
In the case of HLS, a sequence of NALUs is kept in memory, since HLS works by serving segments of NALUs; the maximum RAM occupied by a stream is therefore MaxNALUSize * segment duration * number of segments.
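To make the arithmetic concrete, here's a back-of-the-envelope sketch in Go. Every number except the 4 MiB limit from the error message is a made-up assumption, and I'm reading "segment duration" as the number of access units a segment holds (duration × frame rate):

```go
package main

import "fmt"

func main() {
	const (
		maxNALUSize    = 4 * 1024 * 1024 // 4 MiB, the current limit
		writeQueueSize = 512             // illustrative queue size, not the default
		fps            = 60              // illustrative frame rate
		segmentSeconds = 1               // illustrative segment duration
		segmentCount   = 7               // illustrative number of buffered segments
	)

	// Live protocols (RTSP / WebRTC / SRT / UDP): one NALU per queue slot.
	live := maxNALUSize * writeQueueSize
	fmt.Printf("live worst case: %d bytes (%.1f GiB)\n", live, float64(live)/(1<<30))

	// HLS: whole segments stay buffered until they expire.
	hls := maxNALUSize * fps * segmentSeconds * segmentCount
	fmt.Printf("HLS worst case: %d bytes (%.1f GiB)\n", hls, float64(hls)/(1<<30))
}
```

These are worst-case bounds, reached only if every NALU actually hits the limit; real streams sit far below them.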
BTW, I've renamed the variable to MaxAccessUnitSize so that the limit applies to the entire access unit. An access unit is the sequence of NALUs that refer to the same time instant.
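For illustration, here is a minimal sketch of what such a limit guards against when decoding AVCC (which prefixes each NALU with a big-endian length field, 4 bytes in the common case). The names and the 4 MiB value are placeholders, not the library's actual code:

```go
package main

import (
	"encoding/binary"
	"errors"
	"fmt"
)

// maxAccessUnitSize caps the total size of all NALUs in one access unit.
// The value mirrors the 4 MiB limit from the error message; the real
// library may use a different name and value.
const maxAccessUnitSize = 4 * 1024 * 1024

var errTooBig = errors.New("access unit is too big")

// decodeAVCC splits an AVCC-encoded access unit into its NALUs,
// rejecting inputs whose total NALU size exceeds maxAccessUnitSize.
func decodeAVCC(buf []byte) ([][]byte, error) {
	var nalus [][]byte
	total := 0
	for len(buf) > 0 {
		if len(buf) < 4 {
			return nil, errors.New("truncated length prefix")
		}
		size := int(binary.BigEndian.Uint32(buf[:4]))
		buf = buf[4:]
		if size > len(buf) {
			return nil, errors.New("truncated NALU")
		}
		total += size
		// Enforce the limit on the whole access unit, not per NALU.
		if total > maxAccessUnitSize {
			return nil, fmt.Errorf("%w: %d > %d", errTooBig, total, maxAccessUnitSize)
		}
		nalus = append(nalus, buf[:size])
		buf = buf[size:]
	}
	return nalus, nil
}

func main() {
	// Two tiny NALUs: lengths 2 and 1.
	au := []byte{0, 0, 0, 2, 0x67, 0x42, 0, 0, 0, 1, 0x68}
	nalus, err := decodeAVCC(au)
	fmt.Println(len(nalus), err) // 2 <nil>
}
```

The point of summing across the whole unit is that a crafted stream of many moderately sized NALUs can be just as dangerous as one huge NALU.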
Thanks for the detailed explanation. Do you plan to update the limit yourself? I'd be happy to test it afterwards.