Pinned Repositories
client-js
JS client library for the Mistral AI platform
client-python
Python client library for the Mistral AI platform
cookbook
FastChat-release
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
megablocks-public
mistral-common
mistral-evals
mistral-finetune
mistral-inference
Official inference library for Mistral models
vllm-release
A high-throughput and memory-efficient inference and serving engine for LLMs
Mistral AI's Repositories
mistralai/mistral-inference
Official inference library for Mistral models (see the usage sketch after this list)
mistralai/mistral-finetune
mistralai/cookbook
mistralai/megablocks-public
mistralai/mistral-common
mistralai/client-python
Python client library for the Mistral AI platform (see the usage sketch after this list)
mistralai/client-js
JS client library for the Mistral AI platform
mistralai/vllm-release
A high-throughput and memory-efficient inference and serving engine for LLMs (see the usage sketch after this list)
mistralai/mistral-evals
mistralai/FastChat-release
An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
mistralai/client-ts
TS client library for the Mistral AI platform
mistralai/platform-docs-public
mistralai/transformers-release
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
mistralai/TensorRT-LLM
TensorRT-LLM provides an easy-to-use Python API to define Large Language Models (LLMs) and build TensorRT engines containing state-of-the-art optimizations for efficient inference on NVIDIA GPUs. It also includes components for Python and C++ runtimes that execute those engines (see the usage sketch after this list).
mistralai/sagemaker-docs
Mistral AI documentation for SageMaker
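
For reference, a minimal usage sketch for mistralai/mistral-inference, loosely following the pattern in the repository's README. The module paths (mistral_inference.transformer, mistral_inference.generate), the tokenizer file name, and the models_path location are assumptions that may differ between versions; weights must already be downloaded locally.

    from mistral_common.protocol.instruct.messages import UserMessage
    from mistral_common.protocol.instruct.request import ChatCompletionRequest
    from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
    from mistral_inference.generate import generate
    from mistral_inference.transformer import Transformer

    models_path = "/path/to/mistral-7b-instruct"  # hypothetical local weights directory

    # Load the tokenizer and the model weights from disk.
    tokenizer = MistralTokenizer.from_file(f"{models_path}/tokenizer.model.v3")
    model = Transformer.from_folder(models_path)

    # Tokenize a chat request and generate a completion.
    request = ChatCompletionRequest(messages=[UserMessage(content="Explain mixture-of-experts briefly.")])
    tokens = tokenizer.encode_chat_completion(request).tokens
    out_tokens, _ = generate(
        [tokens],
        model,
        max_tokens=128,
        temperature=0.0,
        eos_id=tokenizer.instruct_tokenizer.tokenizer.eos_id,
    )

    print(tokenizer.instruct_tokenizer.tokenizer.decode(out_tokens[0]))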
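A minimal sketch for mistralai/client-python, assuming the current (v1) mistralai SDK and an API key in the MISTRAL_API_KEY environment variable; older releases exposed a MistralClient class with a slightly different interface. The model name is only an example.

    import os

    from mistralai import Mistral

    # Create a client authenticated with an API key from the environment.
    client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

    # Request a single chat completion from a hosted Mistral model.
    response = client.chat.complete(
        model="mistral-small-latest",
        messages=[{"role": "user", "content": "Name three French cheeses."}],
    )

    print(response.choices[0].message.content)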
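A minimal offline-inference sketch for mistralai/vllm-release, assuming the standard vllm package API; the model name is an example and its weights are fetched from the Hugging Face Hub on first use.

    from vllm import LLM, SamplingParams

    # Load a Mistral model into the vLLM engine (example model name).
    llm = LLM(model="mistralai/Mistral-7B-Instruct-v0.2")

    # Low-temperature sampling settings for short completions.
    params = SamplingParams(temperature=0.2, max_tokens=64)

    outputs = llm.generate(["The capital of France is"], params)
    for output in outputs:
        print(output.prompt, "->", output.outputs[0].text)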
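A sketch of the Python API mentioned in the mistralai/TensorRT-LLM description, assuming a recent TensorRT-LLM release that ships the high-level LLM convenience class; older releases instead require explicitly building an engine with the builder and runtime components, and the model name here is only an example.

    from tensorrt_llm import LLM, SamplingParams

    # Build (or load a cached) TensorRT engine for the model, then run inference on it.
    llm = LLM(model="mistralai/Mistral-7B-Instruct-v0.2")

    params = SamplingParams(temperature=0.2, max_tokens=64)
    for output in llm.generate(["The capital of France is"], params):
        print(output.outputs[0].text)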