miurla/morphic

Groq Llama models

ZainGithub12 opened this issue · 1 comment

@miurla, whenever new Groq Llama models become available and stable for Morphic, please let free users use them, just like we can use GPT-4o mini. They're cost-efficient, right?

The current version of Morphic requires a model with tool-use capabilities, which is not available in Llama 3.2.

https://console.groq.com/docs/models
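For context, "tool-use capabilities" means the model must support function calling so Morphic can hand it a search tool and have it decide when to call it. Below is a minimal sketch of what that looks like with the Vercel AI SDK's Groq provider; the model id and the `webSearch` tool are placeholders for illustration, not Morphic's actual implementation.

```ts
// Sketch: tool calling with the AI SDK's Groq provider.
// Assumes `ai`, `@ai-sdk/groq`, and `zod` are installed and GROQ_API_KEY is set.
import { generateText, tool } from 'ai'
import { groq } from '@ai-sdk/groq'
import { z } from 'zod'

async function main() {
  const result = await generateText({
    // Example model id; it must be one Groq lists with tool-use support.
    model: groq('llama3-groq-70b-8192-tool-use-preview'),
    tools: {
      // Placeholder tool standing in for Morphic's real search tool.
      webSearch: tool({
        description: 'Search the web for the given query',
        parameters: z.object({ query: z.string() }),
        execute: async ({ query }) => ({ results: `stub results for "${query}"` }),
      }),
    },
    prompt: 'What is Morphic?',
  })

  console.log(result.text)
}

main()
```

A model without this capability will simply answer in plain text instead of emitting tool calls, which is why Llama 3.2 can't be dropped in as a replacement here.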