AlbertRapp/tidychatmodels

[Feature Request] Add support for Databricks Foundation Model Serving Endpoints


Hi Albert, awesome work on this package; I love the tidymodels flavor! Support for Databricks Foundation Models would be great. This would include the following serving endpoints (a rough sketch of what the interface could look like follows the list):

  • DBRX @ databricks-dbrx-instruct - the chat LLM from Databricks' Mosaic AI team (formerly MosaicML, renamed after the acquisition)
  • Mixtral 8x7B @ databricks-mixtral-8x7b-instruct
  • Meta Llama 3 @ databricks-meta-llama-3-70b-instruct
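
To make the request concrete, here is a rough sketch of how this could look from the user side, borrowing the existing tidychatmodels verbs. The "databricks" vendor string and the workspace_url argument are purely hypothetical and only illustrate the shape of the requested interface, since serving endpoints live under each workspace's URL:

```r
library(tidychatmodels)

# Hypothetical usage: "databricks" as a vendor, authenticated with a personal
# access token, pointed at the workspace that hosts the serving endpoints.
create_chat(
  "databricks",
  Sys.getenv("DATABRICKS_TOKEN"),
  workspace_url = Sys.getenv("DATABRICKS_HOST")  # e.g. https://<workspace>.cloud.databricks.com
) |>
  add_model("databricks-dbrx-instruct") |>
  add_message("Tell me a joke about data pipelines.") |>
  perform_chat() |>
  extract_chat()
```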

Here are some additional resources that might help you:

  • chattr repo: I requested a similar feature for chattr in #97, and their team merged support into main via #99
  • Databricks docs: Foundation Model APIs
  • Databricks docs: Query a Serving Endpoint
    • Using curl, I was able to hit /serving-endpoints/{name}/invocations successfully (an httr2 translation of that call is sketched after this list)
  • Databricks docs: Supported Models for Pay-per-Token. While I think the focus of tidychatmodels is specifically chat LLMs, Databricks also offers a few embedding model endpoints, and this page lists those models for reference
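
In case it helps, here is a minimal httr2 sketch of the same call I made with curl. The ask_databricks() helper is just illustrative; it assumes the workspace URL and a personal access token are available as DATABRICKS_HOST and DATABRICKS_TOKEN, and that the endpoint accepts the chat-style request body described in the Foundation Model API docs:

```r
library(httr2)

# Illustrative helper (not part of any package): POSTs a single user message to
# a Databricks serving endpoint and returns the parsed JSON response.
ask_databricks <- function(endpoint, prompt) {
  request(Sys.getenv("DATABRICKS_HOST")) |>
    req_url_path_append("serving-endpoints", endpoint, "invocations") |>
    req_auth_bearer_token(Sys.getenv("DATABRICKS_TOKEN")) |>
    req_body_json(list(
      messages = list(list(role = "user", content = prompt)),
      max_tokens = 256
    )) |>
    req_perform() |>
    resp_body_json()
}

resp <- ask_databricks("databricks-dbrx-instruct", "Say hello in one sentence.")
resp$choices[[1]]$message$content
```

The chat endpoints above return an OpenAI-style chat completion body (choices, message, content), so mapping the response into the existing tidychatmodels extraction step should hopefully be straightforward.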