Transcoded tracks cutting off early in Firefox
Some transcoded tracks are consistently cutting off in Firefox around 1370 KB in (the exact amount varies, but seemingly always between 1300 and 1400 KB; for reference, the full transcoded ogg is 5460 KB). Those figures are from the Firefox Network inspector tab. When playback reaches the cutoff point, Pots just jumps to the next track as if it had ended normally.
Printing debugging info server-side shows the whole file is getting transcoded properly. Issuing the same avconv command and redirecting output to an ogg file produces perfectly correct output. Curling the URL (`curl -o tmp.ogg http://HOSTNAME/song/TRACK_ID/m4a,ogg,mp3`) produces correct output. And the track downloads and plays fine in Chrome.
So, all signs point to some weirdness in Firefox's downloading/buffering of files sourced in an `<audio />` tag. Possibly a recent development; seeing it in Firefox 36.0 on Linux. Since the transcoded track doesn't have a Content-Length header, that may be a factor. Have not seen a repro with any non-transcoded track.
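For what it's worth, here's one way to confirm the missing Content-Length and the total byte count outside any browser, using Python's requests library (HOSTNAME/TRACK_ID are placeholders, same as in the curl command above):

```python
# Stream the transcoded track, print the response headers, and count the
# bytes actually received. HOSTNAME/TRACK_ID are placeholders.
import requests

url = "http://HOSTNAME/song/TRACK_ID/m4a,ogg,mp3"
total = 0
with requests.get(url, stream=True) as resp:
    print(resp.status_code, dict(resp.headers))  # expect no Content-Length here
    for chunk in resp.iter_content(chunk_size=65536):
        total += len(chunk)
print(f"received {total} bytes")
```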
Questions to answer:
- Does it happen on all transcoded tracks, or is there something special about this track/album?
- If we examine the HTML audio element, are there any error events being fired or other peculiarities? What load state does the element report after the download prematurely stops (observable when the final size of approx. 1370 KB shows up in the Network tab)?
I bet it's the gunicorn worker timeout. I previously worked around that by setting the timeout to 100 seconds, which was enough because Firefox would eagerly download the whole audio track regardless of length, and in practice 100 seconds was enough for transcoding to finish (the server's CPU being the limiting factor). Firefox's download strategy has probably since changed to conserve bandwidth by only downloading a limited amount ahead of the current playback position, which keeps the request open for roughly the length of the track. So now, if the track runs longer than 100 seconds plus that read-ahead window, the worker times out and the download cuts off right around there.
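For reference, a minimal sketch of where that knob lives, assuming Pots is launched under gunicorn with a config file (the file name, bind address, and worker count here are assumptions; only the timeout setting itself is the real gunicorn option):

```python
# gunicorn.conf.py -- hypothetical config file name.
# A sync worker that is silent for longer than `timeout` seconds gets killed
# and restarted, which would truncate an in-flight streamed transcode exactly
# like the behaviour described above.
bind = "0.0.0.0:8000"   # assumed bind address, not from this issue
workers = 2             # assumed worker count, not from this issue
timeout = 100           # the 100-second workaround mentioned above
# timeout = 0           # would disable the timeout entirely
```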
Maybe it's possible to influence the downloading strategy through HTML attributes or DOM APIs. Perhaps (though not for sure) adding `preload="auto"` will be enough to do it, per this doc (which also warns that the browser can totally ignore that attribute if it feels like it).
Worst case, rewrite the backend in Rust using green threads with no timeout :)
Ultimate solution (maybe; maybe this is overengineering?): use some sort of message queue to tell a pool of backend transcoding tasks, "transcode this track to Ogg and put it in a cache directory here." For subsequent requests, read from the cache directory if the file is already there, even if partial. Send the `Accept-Ranges: bytes` header, which (hopefully) prompts browsers to issue multiple requests with different byte ranges, which we fulfill from the cache directory, so no single request ever has to live for anywhere near 100 seconds to begin with. Some sort of cache should be in place anyway to prevent needless work.
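A rough sketch of what the cache-plus-ranges half could look like, assuming a Flask-style handler (Pots's actual framework, route, and cache layout are assumptions here; a separate worker pool would be responsible for filling the cache):

```python
# Sketch: serve byte ranges of an already-transcoded file from a cache dir.
# CACHE_DIR and the route are hypothetical; only the HTTP mechanics are real.
import os
import re

from flask import Flask, Response, abort, request

app = Flask(__name__)
CACHE_DIR = "/var/cache/pots/transcodes"  # hypothetical location


@app.route("/song/<track_id>/ogg")
def serve_transcoded(track_id):
    path = os.path.join(CACHE_DIR, f"{track_id}.ogg")
    if not os.path.exists(path):
        abort(404)  # real version would enqueue a transcode job instead
    size = os.path.getsize(path)

    range_header = request.headers.get("Range")
    if not range_header:
        # No Range header: send the whole file; Flask fills in Content-Length.
        with open(path, "rb") as f:
            data = f.read()
        return Response(data, mimetype="audio/ogg",
                        headers={"Accept-Ranges": "bytes"})

    # Parse a single "bytes=start-end" range; multipart ranges are not handled.
    m = re.match(r"bytes=(\d+)-(\d*)", range_header)
    if not m:
        abort(416)
    start = int(m.group(1))
    end = int(m.group(2)) if m.group(2) else size - 1
    end = min(end, size - 1)
    if start > end:
        abort(416)

    with open(path, "rb") as f:
        f.seek(start)
        data = f.read(end - start + 1)

    return Response(
        data, status=206, mimetype="audio/ogg",
        headers={
            "Accept-Ranges": "bytes",
            "Content-Range": f"bytes {start}-{end}/{size}",
        },
    )
```

Serving a cache file that is still being written complicates the total-size part of Content-Range, so the sketch pretends the cached file is already complete.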
`preload="auto"` did nothing.