Pinned Repositories
aij-comp
binary_code_through_sound
Graduation project
EasyLM
Large language models (LLMs) made easy. EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
falcon-jax
JAX implementation of the Falcon model
Gemma-EasyLM
Train GEMMA on TPU/GPU! (Codebase for training Gemma-Ko Series)
llama-2-jax-parallel
JAX implementation of the Llama 2 model
mmlu_ru
MMLU eval for RU/EN
movie_reviews_rating
phi-jax
Implementation of Microsoft's Phi model in pure functional JAX
SMPD-training
Reference implementation for training HF LLMs (e.g. LLaMA) on Kaggle TPU hardware, leveraging torch XLA + SPMD
defdet's Repositories
defdet/SMPD-training
Reference implementation for training HF LLMs (e.g. LLaMA) on Kaggle TPU hardware, leveraging torch XLA + SPMD
defdet/aij-comp
defdet/binary_code_through_sound
Graduation project
defdet/EasyLM
Large language models (LLMs) made easy. EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
defdet/falcon-jax
JAX implementation of the Falcon model
defdet/Gemma-EasyLM
Train GEMMA on TPU/GPU! (Codebase for training Gemma-Ko Series)
defdet/llama-2-jax-parallel
JAX implementation of the Llama 2 model
defdet/mmlu_ru
MMLU eval for RU/EN
defdet/movie_reviews_rating
defdet/phi-jax
Implementation of Microsoft's Phi model in pure functional JAX
defdet/Poly-modification-bolgov
defdet/qwen2-jax
defdet/rulm
Language modeling and instruction tuning for Russian
defdet/tpu-generate
defdet/transformers-fixed-llama
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
defdet/vk-spam-classification
defdet/xformers_hip_try
Hackable and optimized Transformers building blocks, supporting a composable construction.