A custom LLM for LlamaIndex: a proxy-enabled version of Gemini!
```bash
pip install git+https://github.com/HawkClaws/proxy_gemini.git
```
```python
from proxy_gemini import ProxyGemini

# Replace the placeholders with your own Gemini API key and proxy URL.
llm = ProxyGemini(
    api_key="your api_key",
    proxy_url="http://hogehoge",
)

# Send a sample prompt through the proxy and print the generated text.
res = llm.complete("HOGE HUGA world")
print(res.text)
```
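Because `ProxyGemini` is exposed as a LlamaIndex custom LLM, it should also slot into LlamaIndex's higher-level APIs. The sketch below is only an illustration, not part of this repository's documented usage: it assumes `ProxyGemini` implements the LlamaIndex LLM interface, that `llama-index-core` and an embedding model are installed, and the `./docs` path is a hypothetical document folder.

```python
# Minimal sketch (assumptions noted above): use ProxyGemini as the global LLM
# behind a LlamaIndex query engine.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from proxy_gemini import ProxyGemini

# Register the proxy-enabled Gemini as the default LLM for LlamaIndex.
Settings.llm = ProxyGemini(
    api_key="your api_key",        # placeholder API key
    proxy_url="http://hogehoge",   # placeholder proxy URL
)

# Load documents from a hypothetical folder, build an index, and query it.
documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
print(query_engine.query("What do these documents cover?"))
```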