Byron/google-apis-rs

Google Drive upload of larger files always fails with 503

aeheathc opened this issue · 1 comment

google-drive3 = "5.0.2"

I'm using upload_resumable with a delegate that retries with backoff on every recoverable error I've observed. This failure, however, doesn't look like something that retrying can solve.
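For reference, the call pattern is roughly the following. This is only a sketch: upload_backup, the file names, and the concrete connector type are illustrative, and the builder methods (create, delegate, upload_resumable) are as documented for the google-drive3 5.x generated API.

```rust
use google_drive3::{api::File, client::Delegate, hyper, hyper_rustls, DriveHub};

// Illustrative wrapper around the failing call. `dlg` is the custom delegate
// that retries recoverable errors with backoff.
async fn upload_backup(
    hub: &DriveHub<hyper_rustls::HttpsConnector<hyper::client::HttpConnector>>,
    dlg: &mut dyn Delegate,
    path: &str,
) -> Result<(), Box<dyn std::error::Error>> {
    let src = std::fs::File::open(path)?;
    let metadata = File {
        name: Some(path.to_string()),
        ..Default::default()
    };
    // upload_resumable streams the file in chunks; on a transient failure the
    // delegate decides whether (and when) to retry the current chunk.
    let (_response, _created) = hub
        .files()
        .create(metadata)
        .delegate(dlg)
        .upload_resumable(src, "application/octet-stream".parse()?)
        .await?;
    Ok(())
}
```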

Response { status: 503, version: HTTP/2.0, headers: {"content-type": "text/plain; charset=utf-8", "x-guploader-uploadid": "redacted", "content-length": "152", "date": "Tue, 08 Aug 2023 16:29:22 GMT", "server": "UploadServer", "alt-svc": "h3=\":443\"; ma=2592000,h3-29=\":443\"; ma=2592000"}, body: Body(Full(b"Invalid request. According to the Content-Range header, the upload offset is 41943040 byte(s), which exceeds already uploaded size of 35389440 byte(s).")) }

For each of these failed uploads, the elapsed time, the bytes uploaded, and the gap between the uploaded size and the offset all appear random. The elapsed time ranges anywhere from 5 minutes to an hour, the progress from a single chunk's worth up to 2.6GB, and the gap doesn't even seem to be a multiple of the chunk size. (In the response above, the rejected offset of 41943040 bytes is exactly 40 MiB, while the 35389440 bytes the server reports is 33.75 MiB, as if a chunk was only partially persisted.)
I don't know why the library is sending an offset that the server considers invalid, and I haven't found anything in the crate docs that I could use to influence it.
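For what it's worth, the underlying resumable-upload protocol does expose the server's committed offset, so a fixed chunk loop could resynchronize instead of blindly advancing: a zero-byte PUT with "Content-Range: bytes */<total>" makes the upload server answer 308 with a Range header describing what it has actually stored. A minimal sketch of that status probe, assuming the hyper 0.14 / hyper-rustls 0.24 stack that google-drive3 5.x is built on (the helper name and error handling are mine):

```rust
use hyper::{Body, Client, Method, Request};

// Ask the upload server how many bytes it has committed. `upload_url` is the
// session URI handed out when the resumable upload was started; `total_len`
// is the full file size in bytes. Returns Some(next_offset), or None if the
// upload already completed.
async fn committed_offset(
    upload_url: &str,
    total_len: u64,
) -> Result<Option<u64>, Box<dyn std::error::Error>> {
    let https = hyper_rustls::HttpsConnectorBuilder::new()
        .with_native_roots()
        .https_only()
        .enable_http1()
        .build();
    let client: Client<_, Body> = Client::builder().build(https);

    // A zero-byte PUT with "bytes */<total>" is the protocol's status query:
    // the server reports its state instead of accepting data.
    let req = Request::builder()
        .method(Method::PUT)
        .uri(upload_url)
        .header("Content-Range", format!("bytes */{total_len}"))
        .header("Content-Length", "0")
        .body(Body::empty())?;
    let resp = client.request(req).await?;

    match resp.status().as_u16() {
        // 308 "Resume Incomplete": the Range header holds the committed span.
        308 => match resp.headers().get("Range") {
            // The value looks like "bytes=0-35389439"; next offset is end + 1.
            Some(range) => {
                let end: u64 = range.to_str()?.rsplit('-').next().unwrap_or("").parse()?;
                Ok(Some(end + 1))
            }
            None => Ok(Some(0)), // 308 without Range: nothing committed yet.
        },
        200 | 201 => Ok(None), // Upload already finished.
        other => Err(format!("unexpected status {other}").into()),
    }
}
```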

Byron commented

I recommend copying the crate files locally and modifying the method in question until it works as expected. If the resulting patch is posted here, it could be integrated into the generator to fix this for everyone.
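For anyone trying that, Cargo's [patch] override is a convenient way to test such a local copy without touching any dependency declarations; assuming the crate source is copied into vendor/google-drive3 (the path is just an example):

```toml
# In the consuming project's Cargo.toml: every reference to google-drive3 in
# the dependency graph is redirected to the locally modified copy.
[patch.crates-io]
google-drive3 = { path = "vendor/google-drive3" }
```

Note that the local copy must keep a version that satisfies the original requirement (5.0.2 here) for the patch to apply.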