Pantry integration does not seem to work
Hi there,
I recently started using this and want to run it on multiple laptops, ideally with sync between them so my data and statistics don't get lost. I figured that was what Pantry was for, but it is not working for me.
The first issue is that the GET request to Pantry gets rejected (403) because of the name of the basket it is trying to access. Apparently xxx-en-mainSync is not allowed, but xxx-en-mainsync is. No clue why. I made a fork and changed the basket name to lower case, and at least it seemed to get a bit further.
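For reference, a minimal sketch of the kind of change I made in my fork: force the basket name to lower case before the request. The endpoint layout follows the public Pantry docs, but the pantry ID, basket naming and helper are only illustrative, not the extension's actual code.

```ts
// Hypothetical sketch: normalize the basket name to lower case before
// calling Pantry. "xxx-en-mainSync" was rejected with 403 while
// "xxx-en-mainsync" was accepted, so force lower case for now.
const PANTRY_BASE = "https://getpantry.cloud/apiv1/pantry";

async function getBasket(pantryId: string, basketName: string): Promise<unknown> {
  const safeName = basketName.toLowerCase();
  const res = await fetch(`${PANTRY_BASE}/${pantryId}/basket/${safeName}`);
  if (!res.ok) {
    throw new Error(`Pantry GET failed: ${res.status} ${res.statusText}`);
  }
  return res.json();
}
```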
Then I ran into issues with rate limiting. The first few calls would succeed, then Pantry would get overloaded and start responding with 423s. I did see something about rate limiting in the code, but I'm not sure what should be happening there.
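In case it helps with debugging, this is roughly the kind of backoff I would have expected around those responses; it is only an assumption about how one could handle it, not what the extension currently does, and the delays and attempt count are arbitrary.

```ts
// Hypothetical retry helper: back off and retry when Pantry answers with
// 423 (or 429). Waits 1s, 2s, 4s, ... between attempts.
async function fetchWithBackoff(url: string, init?: RequestInit, attempts = 5): Promise<Response> {
  for (let i = 0; i < attempts; i++) {
    const res = await fetch(url, init);
    if (res.status !== 423 && res.status !== 429) return res;
    await new Promise((resolve) => setTimeout(resolve, 1000 * 2 ** i));
  }
  throw new Error(`Pantry still rate limiting after ${attempts} attempts`);
}
```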
Anyway, for me it is not working. I am using Firefox, both on Windows and Linux, with the same result. Haven't tried other browsers yet, to be honest.
Thanks!
Yes... there are some problems with Pantry that have left it completely broken for several months.
Some of them are the rate limiting you mention. There are some workarounds in the code for that rate limit, but it seems Pantry has since set even tighter restrictions on access. Nothing official has been communicated, but the server completely shuts us out when we start to send/request "a lot" of data.
We have some ideas for possibly working around this rate limit at some point, but we can't tell you when that will get implemented. For the moment, please disable Pantry until this gets fixed.
Too many things to fix/improve, so little time... ;)
Thanks for the quick reply, that is appreciated at least. I fully understand this being a hobby project and not having time to fix stuff ;)
Can you elaborate a little on what kind of ideas you had? My first thought was to use fewer baskets, maybe even put everything in a single basket. The only issue there is that I don't know how large the dataset can grow on a large account, since I only have a fairly beginner account.
I'm asking because maybe I can try to hack some stuff together and open a PR for that.
For the data size, some can hit the 5 MB limit...
Sure ;)
The main idea is to forget all the code that splits the data into multiple baskets to dodge some of the rate limiting, and keep one single file instead. But then, yes, you hit the data size limitation.
The answer to our problems is here:
https://pieroxy.net/blog/pages/lz-string/guide.html
Nice and short: compress our JSON data into a lovely compressed string, and that is what we send to the Pantry servers.
I made some preliminary tests a while ago and we can get a huge compression gain with this lib. We could even think about using it in the future to get around the 5 MB limit of the local JSON data, not only the data we send to Pantry.
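To make the idea concrete, here is a small sketch of what the round trip could look like. The function names come from the lz-string docs, while the wrapper object and the Base64 choice are just assumptions (Pantry baskets store JSON, and Base64 is the safest form to push through an HTTP JSON body, even if compressToUTF16 would be denser).

```ts
import LZString from "lz-string";

// Hypothetical sketch: compress the JSON payload before sending it to Pantry
// and decompress it after reading it back.
function packForPantry(data: unknown): { compressed: string } {
  return { compressed: LZString.compressToBase64(JSON.stringify(data)) };
}

function unpackFromPantry(payload: { compressed: string }): unknown {
  const json = LZString.decompressFromBase64(payload.compressed);
  if (!json) throw new Error("Failed to decompress Pantry payload");
  return JSON.parse(json);
}
```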
All help is welcome; if you want to join us as an ogi dev, you only have to ask ;)
https://pieroxy.net/blog/pages/lz-string/index.html
Here you can find the main features of the lib and the technical details.
Very, very promising... :)
Ah yeah, I will take a look at it. Should be doable with compression indeed. Since most of the JSON data is going to be similar, compression can do a lot :D
I don't mind lending a hand, but I don't think I will be all too active, to be honest.
The decision is yours. You can simply make external contributions if you prefer; we appreciate any form of help. ;)
Yes, with the size reduction we achieve with compression, we can fit the maximum 5 MB that our data can currently grow to within Pantry's per-basket size limit, which I honestly do not remember right now, 1 MB or so, maybe?
We should have control over this: if the resulting size to be sent is greater than that limit, abort the Pantry upload and warn the user so that they can delete something before trying again.
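Something along these lines is what I mean; the 1 MB figure is only the guess above, not a confirmed Pantry limit, and the warning mechanism is left abstract.

```ts
// Hypothetical guard: estimate the payload size and abort the Pantry upload
// with a warning if it exceeds the (assumed) per-basket limit.
const ASSUMED_BASKET_LIMIT_BYTES = 1_000_000; // placeholder, not a confirmed value

function assertPayloadFits(payload: object): void {
  const bytes = new TextEncoder().encode(JSON.stringify(payload)).length;
  if (bytes > ASSUMED_BASKET_LIMIT_BYTES) {
    throw new Error(
      `Payload is ${bytes} bytes, over the assumed basket limit; ` +
        "delete some data before trying to sync again."
    );
  }
}
```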
Ah, I just read your comment about a warning. I will see if I can add something to the toast if we expect it to fail on size.
Edit: Never mind, there is already a message in the toast if the data gets too large, based on the response from Pantry, so I guess no work is needed there.
Fixed with #238