Gemini 1.5 Pro Error: "Something went wrong while trying to deserialize a response from the server."
Closed this issue · 15 comments
Description of the bug:
I get this error very often. I wait about 15 seconds and repeat the prompt until I get a normal response. The prompt is not in English. This wastes a lot of time, and I don't know how to fix it. Is there a way to handle these errors? The model returns them as plain text that I have to parse. So far I have received more than three different types of errors from Gemini 1.5 Pro.
Actual vs expected behavior:
No response
Any other information you'd like to share?
No response
BILLING from google.ai.generativelanguage.v1beta.GenerativeService.GenerateContent
Gemini requests are currently being billed to me, but I get an error from the library and no result.
Please tell me what to do. This library has a serialization error; I repeat the request over and over until I get a response without an error, and in this way insanely large amounts are charged. Is there a way not to be charged for these failed requests? If not, this library is not appropriate to use, because it generates billing without delivering any content.
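Until the SDK fix lands, one way to bound the wasted spend is to cap the retries instead of repeating the prompt by hand. Below is only a sketch of a generic retry helper (the `retryOnFailure` name, the defaults, and the 15-second delay are mine, not part of the SDK), and note that every retried request is still billed server-side, so this only limits the damage rather than avoiding the charges:

```kotlin
// Sketch of a bounded-retry helper (name and defaults are mine, not from the SDK).
// Each retried request is STILL billed; this only caps how many times a failing
// prompt is re-sent before giving up.
fun <T> retryOnFailure(
    maxAttempts: Int = 3,
    delayMs: Long = 15_000,                       // the ~15 s wait described above
    shouldRetry: (Throwable) -> Boolean = { true },
    block: () -> T
): T {
    var lastError: Throwable? = null
    repeat(maxAttempts) { attempt ->
        try {
            return block()                        // success: return immediately
        } catch (e: Throwable) {
            if (!shouldRetry(e)) throw e          // non-retryable: surface at once
            lastError = e
            if (attempt < maxAttempts - 1) Thread.sleep(delayMs)
        }
    }
    throw lastError ?: IllegalStateException("retryOnFailure: no attempts were made")
}
```

In the app this would wrap the `generateContent(...)` call, with `shouldRetry` matching only the deserialization error, so other failures still surface immediately.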
For example, here is the following error:
Hey @karloti could you share the prompt you are using that's triggering the deserialization issue for debugging?
I think the issue may be that you are subclassing FunctionDeclaration. This is not the intended use. Does this error also occur if you use the defineFunction methods?
Sounds like a possibility, since the trace points to that line
at com.google.ai.client.generativeai.common.shared.FunctionCall$$serializer.deserialize(Types.kt:62)
That being said, it should absolutely not fail deserialization. The model is not returning a value we were expecting.
Relatedly, if you happen to be able to pull the text of one of these responses that fails to parse, that would help debugging immensely. If you're in Android Studio, you can use the Network Inspector in App Inspection to accomplish this.
I can give you the prompt, but I don't want to post it here publicly. Maybe by email. Note that the prompt is in Bulgarian, which may be part of the problem. However, the same prompt works without any problem in AI Studio.
Actually, the response is more important. Feel free to remove any strings you like from it; the structure is what's important.
If you'd prefer to email it to me, you can email it to davidmotson@google.com
Alternatively, you could record the requests/responses sent to the server using https://developer.android.com/studio/debug/network-profiler
That would allow us to see what gets sent and what you get back. As @davidmotson said, structure matters rather than content, so you can edit out the values.
Temporarily stopped the Gemini API
I've also temporarily stopped the Gemini API for obvious reasons, so I can't give you a result from the Network Inspector, but I've emailed everything that was needed and described the problem in detail.
I hope you fix the error as soon as possible!
I have used a type-safe TextPart(val text: String).
No null strings are passed.
I also emailed the prompt to @davidmotson, which shows that everything is normal.
Here is the code snippet, which appears to be null-safe (prompt):
This bug will be fixed in v0.8.0
I have already tested the latest update, and the bug no longer occurs.
Thanks to David for fixing this.
@davidmotson https://drive.google.com/drive/folders/10S8eA3nXlkxUjqlK9EuIptzDMF7WQJf1?usp=sharing
Seems like this is not the same issue as the original one. Please create a new issue to track it separately. Thanks!
New version 0.8.0 has been published, including the fix to the original issue.
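For anyone landing here later: picking up the fix means bumping the client dependency to 0.8.0. A sketch of the Gradle (Kotlin DSL) change, assuming the standard Maven coordinates of the Google AI client SDK for Android:

```kotlin
dependencies {
    // Google AI client SDK for Android; 0.8.0 includes the deserialization fix
    implementation("com.google.ai.client.generativeai:generativeai:0.8.0")
}
```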