Any guide on how to use tools/gemma/run_gemma_xla.py?
Closed this issue · 4 comments
I found the run_gemma_xla.py file, but it seems the gemma
package is missing?
Also, what is hex-llm, as mentioned in the comments?
I think it is important to understand these utility scripts for better Gemma serving scenarios.
@nkovela1 should know more! gemma
is the standalone PyTorch implementation here: https://github.com/google/gemma_pytorch/. It should be clonable and then pip installable, though it is not on PyPI. And you are right! We should document this better.
Neel added some documentation to our scripts. Here are some links to Vertex AI notebooks:
https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/community/model_garden/model_garden_gemma_finetuning_on_vertex.ipynb
https://github.com/GoogleCloudPlatform/vertex-ai-samples/blob/main/notebooks/community/model_garden/model_garden_gemma_deployment_on_vertex.ipynb
Thanks! It looks much better.
I just wanted to demystify what hex-llm is and the technology underlying it :)
I think more docs are probably coming soon, besides the two I linked above, though I'm not directly involved so I don't know the precise timeline.