Decode latency spikes randomly.
MarcoBarroca opened this issue · 14 comments
Unsure why this happens. I'm using OVR Space Calibrator with continuous calibration as usual, and suddenly everything stutters and decode latency spikes. I thought it could be my network, but according to the statistics view on the server, network latency and encoding are normal.
I'll add that I'm using HEVC.
I'll attach some pictures of the statistics screen taken from inside VR. The last picture is of normal conditions.
I also have a video of it happening, I'll try cutting the relevant parts and uploading it later if it'll help identify the issue.
Forgot to mention I increased the requested bitrate to 250 Mbps. Maybe that's the issue? Will do some tests with the default value later and report back.
I've noticed a bug recently with the streamer where a lot of games with mirror windows, for whatever reason, hang the client until the windows are minimized. Does it go away if you minimize the app? Or is it only with continuous calib?
It only stops if I wait it out or reset the connection by closing and reopening the client. Continuous calib on or off doesn't matter; it still spikes and stays like that for a seemingly random amount of time.
Hmmm, what's your PC GPU? FWIW I've been pushing 350 Mbps HEVC out of my 3060 with no issues on decode time, but the increasing slope makes me think it's something weirder.
I’m using a 4090. I’m on older drivers, I could try updating.
I was assuming it was something on the Vision Pro as this seems like an issue with decode.
Wait, what is it doing when the decode ramps up? like is it still drawing fine, is it headlocked (frame stuck to head), or is it timewarping (quad stuck in space, but you can walk around it)? Is there a lot of latency?
Edit: Actually I guess you said it stutters so it'd be latency + timewarped, trying to think of why else it'd happen. If you can find a reliable way to reproduce it let me know
Timewarping mostly. Video lags and tries to catch up. Screens take a while to follow the head. Some artifacts on the video as well.
I have a 4090 running Windows 11 with up-to-date drivers and OS. Going over 200 Mbps starts to get really laggy (video has to catch up), but I also have Resolution set to Extreme (width: 6080). If I set it to Medium I can push it a little higher, but definitely not 350. This varies greatly by game too. All of the above is for Alyx. If I play Legendary Tales I set it down to ~100, otherwise it's laggy.
I'm using OVR but not continuous calibration. Also, for bitrate I'm on Adaptive Mode with the minimum set to 20 and the maximum network latency set to 8 ms.
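For anyone skimming, here's a minimal sketch of what a latency-targeting adaptive mode like that roughly does, assuming a simple back-off controller; the type name, gains, and the 0.9 factor are my own illustration, not ALVR's actual implementation:

```swift
import Foundation

/// Toy latency-driven bitrate controller, illustrating roughly what an
/// "adaptive" mode with a minimum bitrate and a max network latency
/// target does. Names and numbers are illustrative, not ALVR's code.
struct AdaptiveBitrate {
    var bitrateMbps: Double = 100
    let minMbps: Double = 20          // "Minimum set to 20"
    let maxMbps: Double = 350
    let targetLatency: Double = 0.008 // "Maximum network latency set to 8ms"

    /// Feed one smoothed network-latency sample per statistics interval.
    mutating func update(networkLatency: Double) {
        if networkLatency > targetLatency {
            // Over budget: back off multiplicatively so spikes clear fast.
            bitrateMbps = max(minMbps, bitrateMbps * 0.9)
        } else {
            // Under budget: creep back up slowly.
            bitrateMbps = min(maxMbps, bitrateMbps + 2)
        }
    }
}
```

A controller like this will sag toward the 20 Mbps floor whenever the 8 ms latency target is exceeded, which matters later in this thread.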
So I guess first thing: Extreme is mostly pointless; the Vision Pro VR mode is limited to 1920*2 = 3840. I've got my fingers crossed that Apple will enable foveated rendering and higher res so we can get crisper images, but yeah. You'll also want to make sure fixed foveation is on.
But I've also been working on diagnosing a strange quirk I've noticed: if I have my Mbps above ~150 on Medium with HEVC, it gives me network spikes only when I duck down and the view changes significantly. I have yet to see the decoder go above 5 ms, but it's possible you accidentally found extreme conditions that trigger some kind of Apple bug instead of odd networking behavior.
OK, actually I tested Extreme on my 3060 with HEVC and it works, so maybe it's only H264 that fails. But also, the other things I found that improve smoothness are:
- Disable decoding latency limiter
- Disable optimize game render latency
HEVC seems strange because I can never get it to actually boost the encode Mbps; not sure it can go higher than around 200.
I think I’ve managed to replicate this consistently. Seems like it’s tied to bitrate as I suspected.
Here’s a video of me recording from the Vision Pro as it’s happening.
In the video you can see that the decode latency spikes happen when bitrate also spikes. It's also completely unusable above 300 Mbps and seems stable around 100 Mbps. Later in the video I show how it looks inside SteamVR when latency is super high.
I don't think it has to do with my network setup. I'm on WiFi 6, sitting right next to my router, and I have no issues streaming a 4K@120 Hz feed with Sunshine+Moonlight even while gaming, for instance.
When using adaptive bitrate the experience is similar: if the bitrate spikes, so does decode latency. 300 Mbps seems like a hard limit for some reason. Specifically, it seems to happen when decode latency spikes above encoder latency.
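For a rough sense of scale (my own back-of-the-envelope, assuming a ~90 Hz stream): at 300 Mbps the decoder is handed on average 300,000,000 / 90 / 8 ≈ 0.42 MB per frame, versus ≈ 0.14 MB at 100 Mbps, so a bitrate spike means the decoder suddenly gets much larger frames to chew through.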
I also figured out how to force network latency. If I limit the bitrate to very low values with Extreme resolution, I can produce network latency by moving my head too fast. Network latency spikes like this are different: they come in short bursts and stop as soon as my head stops.
Makes sense there would be a sweet spot on the decoding, I guess. We don't do any kind of monitoring on the decoder, so if it falls behind it'll cascade like that. Apple's decoder was already kind of weird for not being high latency at our Medium resolution in the first place; on my Quest Pro, decode would get to 100 ms at Medium/4288.
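To make that concrete, here's a minimal sketch of the kind of decoder-side monitoring being described, not ALVR's actual code; the class name, the EMA smoothing, and the 20 ms budget are all my own assumptions:

```swift
import Foundation
import QuartzCore // CACurrentMediaTime

/// Hypothetical decoder-side latency monitor (not ALVR's actual code).
/// Tracks submit -> output time per frame and flags a cascade when the
/// smoothed latency drifts past a budget, so the client could react
/// instead of letting the decode queue grow unboundedly.
final class DecodeLatencyMonitor {
    private var submitTimes: [Int64: CFTimeInterval] = [:] // frame index -> submit time
    private var smoothedLatency: CFTimeInterval = 0
    private let alpha = 0.1                 // EMA weight for new samples (assumed)
    private let budget: CFTimeInterval      // e.g. 0.020 s; an assumed number
    private let queue = DispatchQueue(label: "decode.latency.monitor")

    init(budgetSeconds: CFTimeInterval = 0.020) {
        self.budget = budgetSeconds
    }

    /// Call right before handing the frame to the hardware decoder.
    func frameSubmitted(_ index: Int64) {
        queue.sync { submitTimes[index] = CACurrentMediaTime() }
    }

    /// Call from the decoder's output callback. Returns true when the
    /// smoothed latency exceeds the budget, i.e. the decoder is falling
    /// behind and latency will cascade unless the caller reacts.
    func frameDecoded(_ index: Int64) -> Bool {
        queue.sync {
            guard let start = submitTimes.removeValue(forKey: index) else { return false }
            let sample = CACurrentMediaTime() - start
            smoothedLatency = alpha * sample + (1 - alpha) * smoothedLatency
            return smoothedLatency > budget
        }
    }
}
```

On a cascade, the caller could drop queued frames or ask the server to back off the bitrate; that's one possible reaction, not necessarily what ALVR should do.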
I have recently updated to the latest nightly release of the server and the TestFlight version of the client.
The problem is mostly solved. If I set the bitrate manually I still get lag spikes, but according to the statistics tab they're no longer related to decode. Setting the bitrate to Adaptive makes the lag spikes almost completely disappear.
One thing I realized is that the bitrate plummets if I move my head too fast. Is this expected behavior?
Since that question is unrelated to the original issue, I'll close this one.