vertex-ai-gemini bad configuration error message
Closed this issue · 8 comments
Hi, I was eagerly awaiting the release of quarkus-langchain4j-vertex-ai-gemini, so props for the release!
Unfortunately, I get a configuration error; I added a log excerpt below. The key quarkus.langchain4j.vertexai.devopi.location is logged as missing during startup and later also mentioned in the error message as required. I eventually realized that the config key had to be quarkus.langchain4j.vertexai.gemini.devopi.location, but the error message did not help with that.
```
2024-05-09 10:21:59,013 WARN [io.qua.config] (main) Unrecognized configuration key "quarkus.langchain4j.vertexai.devopi.location" was provided; it will be ignored; verify that the dependency extension for this configuration is set or that you did not make a typo
2024-05-09 10:21:59,016 WARN [io.qua.config] (main) Unrecognized configuration key "quarkus.langchain4j.vertexai.devopi.projectId" was provided; it will be ignored; verify that the dependency extension for this configuration is set or that
...
Caused by: io.smallrye.config.ConfigValidationException: Configuration validation failed:
	SRCFG00014: The config property quarkus.langchain4j.vertexai.devopi.location is required but it could not be found in any config source
	at io.quarkiverse.langchain4j.vertexai.runtime.gemini.VertexAiGeminiRecorder.chatModel(VertexAiGeminiRecorder.java:26)
	at io.quarkus.deployment.steps.VertexAiGeminiProcessor$generateBeans1607297205.deploy_0(Unknown Source)
	at io.quarkus.deployment.steps.VertexAiGeminiProcessor$generateBeans1607297205.deploy(Unknown Source)
	... 8 more
```
My initial config that led to the error:
```yaml
langchain4j:
  devopi:
    chat-model:
      provider: vertexai-gemini
  vertexai:
    devopi:
      projectId: "<my-project>"
      location: "<region>"
    gemini:
      chat-model:
        model-id: "gemini-1.0-pro"
```
The config that worked after I later realized that the config path had to be quarkus.langchain4j.vertexai.gemini.devopi.location:
```yaml
langchain4j:
  devopi:
    chat-model:
      provider: vertexai-gemini
  vertexai:
    gemini:
      devopi:
        projectId: "<my-project>"
        location: "<region>"
        chat-model:
          model-id: "gemini-1.0-pro"
```
Hi,
Actually there are two different extensions: quarkus-langchain4j-vertex-ai-gemini, which is meant to be used with Google's latest Gemini models, and quarkus-langchain4j-vertex-ai, which is meant to be used with the older PaLM models.
The configuration for the former lives under quarkus.langchain4j.vertexai.gemini, while for the latter it is quarkus.langchain4j.vertexai.
Are you trying to use both of them at the same time?
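For reference, a minimal application.yaml sketch showing the two namespaces side by side (property spellings follow the ones used elsewhere in this issue; the angle-bracket values are placeholders):

```yaml
quarkus:
  langchain4j:
    vertexai:
      # quarkus-langchain4j-vertex-ai (older PaLM models)
      projectId: "<my-project>"
      location: "<region>"
      gemini:
        # quarkus-langchain4j-vertex-ai-gemini (Gemini models)
        projectId: "<my-project>"
        location: "<region>"
```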
Hi,
I'm trying to use only Gemini, and I only have a dependency on quarkus-langchain4j-vertex-ai-gemini in my pom. I also have a dependency on langchain4j-vertex-ai-gemini (without Quarkus) because that is what I was using before, but I don't think that is causing this issue.
> I also have a dependency on langchain4j-vertex-ai-gemini (without Quarkus) because that is what I was using before
You can safely remove this.
Also, if you only have this provider, then you don't need to set quarkus.langchain4j.chat-model.provider or quarkus.langchain4j.whatever.chat-model.provider.
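As a sketch, with only the Gemini extension on the classpath the named-model config can drop the provider line entirely (this assumes the named-model layout from the working config in this issue):

```yaml
quarkus:
  langchain4j:
    vertexai:
      gemini:
        devopi:
          projectId: "<my-project>"
          location: "<region>"
          chat-model:
            model-id: "gemini-1.0-pro"
```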
Question: in your configuration, why are you using devopi?
Quarkus is complaining about missing mymodel configuration because that is the named model you seem to be using in your code.
I replaced devopi with mymodel when I created the issue, but apparently I didn't do a good job there; sorry about the confusion. I edited my first message and it should now say "devopi" everywhere.
I also use quarkus-langchain4j-openai; that's why I configured a provider.
Any chance you can attach a sample project so I can have a look?
I believe the error is here:
https://github.com/quarkiverse/quarkus-langchain4j/blob/main/vertex-ai-gemini/runtime/src/main/java/io/quarkiverse/langchain4j/vertexai/runtime/gemini/VertexAiGeminiRecorder.java#L83
The config you are injecting has a different prefix from the error message:

- message (VertexAiGeminiRecorder.java#L83): `Problem("SRCFG00014: The config property quarkus.langchain4j.vertexai%s%s`
- config (LangChain4jVertexAiGeminiConfig.java#L19): `@ConfigMapping(prefix = "quarkus.langchain4j.vertexai.gemini") public interface LangChain4jVertexAiGeminiConfig`
  https://github.com/quarkiverse/quarkus-langchain4j/blob/main/vertex-ai-gemini/runtime/src/main/java/io/quarkiverse/langchain4j/vertexai/runtime/gemini/config/LangChain4jVertexAiGeminiConfig.java#L19
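To make the mismatch concrete, here is a small standalone sketch (all class and method names here are mine, not from the codebase) that builds the SRCFG00014 message with each of the two prefixes:

```java
// Standalone sketch of the prefix mismatch described above.
// The recorder's error text hardcodes "quarkus.langchain4j.vertexai",
// while the @ConfigMapping interface declares "quarkus.langchain4j.vertexai.gemini".
// PrefixMismatchDemo and missingKeyMessage are hypothetical names for illustration.
public class PrefixMismatchDemo {

    static final String MESSAGE_PREFIX = "quarkus.langchain4j.vertexai";        // prefix used in the error text
    static final String MAPPING_PREFIX = "quarkus.langchain4j.vertexai.gemini"; // prefix on the @ConfigMapping

    /** Builds an SRCFG00014-style message for an optional named model and a property. */
    static String missingKeyMessage(String prefix, String modelName, String property) {
        // mirrors the "%s%s" formatting seen in the recorder snippet above
        String name = modelName.isEmpty() ? "" : "." + modelName;
        return String.format(
                "SRCFG00014: The config property %s%s.%s is required but it could not be found in any config source",
                prefix, name, property);
    }

    public static void main(String[] args) {
        // What the user saw in the log (wrong prefix, ".gemini" missing):
        System.out.println(missingKeyMessage(MESSAGE_PREFIX, "devopi", "location"));
        // What the message arguably should say, matching the @ConfigMapping prefix:
        System.out.println(missingKeyMessage(MAPPING_PREFIX, "devopi", "location"));
    }
}
```

The first line printed matches the misleading message from the log above, while the second points at the key the user actually had to set.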
Hope that helps
That indeed looks wrong!
Would you like to contribute a fix, since you have already pinpointed the problem?