Weird gray periods on Ubuntu 20.04
patriciogonzalezvivo opened this issue · 3 comments
linux.yaml
width: 1920
height: 1080
framerate: 30
bitrate: 5000000
packet_size: 8192
transmitter:
  source:
    - display
  sink:
    - udp
  device: :1
  address: 127.0.0.1
  port: 8080
  codec: hevc_nvenc
receiver:
  source:
    - udp
  sink:
    - display
  address: 127.0.0.1
  port: 8080
  codec: hevc
./tunnel tx ../../../examples/linux.yml
. CURRENT STATE
├── Dimensions: 1920x1080
├── Framerate: 30 FPS
├── Bitrate: 5000000 bps
└── Packet Len: 8192 Bytes
. TRANSMITTER
├── Source: DISPLAY
├── Sink: UDP
├── Device: :1
├── Address: 127.0.0.1
├── Port: 8080
└── Codec: hevc_nvenc
. [Render Meta]
├── Window Mode: HEADLESS
├── Frame Size: 1920x1080
├── Device Size: 960x540
├── Texture Count: 2
└──. [EGL Meta]
│ ├── APIs: OpenGL_ES OpenGL
│ ├── Version: 1.5
│ └── Vendor: NVIDIA
└──. [GL Meta]
├── Renderer: GeForce GTX 1650 with Max-Q Design/PCIe/SSE2
├── Version: OpenGL ES 3.2 NVIDIA 440.64
├── Vendor: NVIDIA Corporation
└── GLSL Ver.: OpenGL ES GLSL ES 3.20
[RESAMPLER] Performance Degradation:
Output pixel format is different than the input.
- Input: bgra -> Output: yuv420p
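The resampler warning above is expected with this pipeline: the display capture produces bgra frames, while the HEVC encoder consumes a planar YUV format, so every frame is converted in software. A rough back-of-the-envelope sketch (assuming the standard 4 bytes/pixel for bgra and 1.5 bytes/pixel for yuv420p) shows why this conversion is flagged as a performance cost:

```python
# Approximate per-frame sizes for a 1920x1080 frame in each pixel format.
# bgra packs 4 bytes per pixel; yuv420p stores a full-resolution luma plane
# plus two chroma planes subsampled 2x2, i.e. 1.5 bytes per pixel on average.
WIDTH, HEIGHT = 1920, 1080

bgra_bytes = WIDTH * HEIGHT * 4            # 8,294,400 bytes
yuv420p_bytes = WIDTH * HEIGHT * 3 // 2    # 3,110,400 bytes

print(f"bgra frame:    {bgra_bytes:,} bytes")
print(f"yuv420p frame: {yuv420p_bytes:,} bytes")

# At 30 FPS the converter reads roughly 249 MB/s of bgra input, all on the
# CPU, which is why the resampler reports a performance degradation.
print(f"input throughput: {bgra_bytes * 30 / 1e6:.1f} MB/s")
```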
./tunnel rx ../../../examples/linux.yml
. CURRENT STATE
├── Dimensions: 1920x1080
├── Framerate: 30 FPS
├── Bitrate: 5000000 bps
└── Packet Len: 8192 Bytes
. RECEIVER
├── Source: UDP
├── Sink: DISPLAY
├── Device: /dev/video0
├── Address: 127.0.0.1
├── Port: 8080
└── Codec: hevc
. [Render Meta]
├── Window Mode: WINDOWED
├── Frame Size: 1920x1080
├── Device Size: 960x540
├── Texture Count: 2
└──. [EGL Meta]
│ ├── APIs: OpenGL_ES OpenGL
│ ├── Version: 1.5
│ └── Vendor: NVIDIA
└──. [GL Meta]
├── Renderer: GeForce GTX 1650 with Max-Q Design/PCIe/SSE2
├── Version: OpenGL ES 3.2 NVIDIA 440.64
├── Vendor: NVIDIA Corporation
└── GLSL Ver.: OpenGL ES GLSL ES 3.20
This behaviour appears to happen only with the UDP transport. I'm still not sure what is causing it. Investigating...
After some kernel digging I think I found the problem. In nominal operation, the input frame is encoded by the codec into an AVPacket. These packets are broken into smaller chunks to fit inside UDP packets. When a keyframe is generated (which is much larger than the intermediate frames), the packetizer sends a burst of packets to the client. The problem was that the receiver's input buffer was too small to absorb this burst of data, resulting in an overflow.
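The numbers from linux.yaml make the burst plausible. This is a hedged sketch, not the project's actual fix: the frame sizes below are estimates (the 10x keyframe multiplier is an assumption), and enlarging the socket receive buffer via SO_RCVBUF is just one generic way to absorb such a burst.

```python
import math
import socket

BITRATE = 5_000_000      # bps, from linux.yaml
FRAMERATE = 30
PACKET_SIZE = 8192       # bytes per UDP chunk, from linux.yaml

# Average encoded frame size and how many UDP chunks it spans.
avg_frame = BITRATE // 8 // FRAMERATE            # ~20,833 bytes
avg_chunks = math.ceil(avg_frame / PACKET_SIZE)  # 3 packets

# An HEVC keyframe can easily be 10x the average frame (assumed multiplier),
# arriving as one back-to-back burst of datagrams.
keyframe = avg_frame * 10
burst_chunks = math.ceil(keyframe / PACKET_SIZE)
print(f"average frame: {avg_frame} B -> {avg_chunks} packets")
print(f"10x keyframe:  {keyframe} B -> {burst_chunks} packets")

# One generic way to absorb bursts: enlarge the kernel receive buffer.
# Note: Linux silently caps the request at net.core.rmem_max, and
# getsockopt reports double the effective value (bookkeeping overhead).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 4 * 1024 * 1024)
print("SO_RCVBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
sock.close()
```

So even an average frame spans multiple datagrams, and a keyframe burst is an order of magnitude larger; if the receiver drains its buffer slower than the burst arrives, datagrams are dropped and the decoder sees a corrupted keyframe, which matches the gray periods.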
Closing. Please reopen if the reported behavior persists.