maxxie114
CEO and head developer of https://www.qubemc.com/; Nukkit plugin developer. I code in Java and Python, with knowledge of Java, Python, C, PHP, and HTML/CSS.
@2B2TMCBE
maxxie114's Stars
axuew/deepnude_official-master
fishaudio/Bert-VITS2
vits2 backbone with multilingual-bert
233boy/v2ray
The best one-click V2Ray installation and management script
aiboboxx/clashfree
Free Clash nodes and subscription links, updated daily: free proxy nodes, Clash-based circumvention, Clash for Windows, Clash tutorials, and free public nodes
Jrohy/trojan
Multi-user Trojan management and deployment program with web-based administration
GoogleCloudPlatform/generative-ai
Sample code and notebooks for Generative AI on Google Cloud, with Gemini on Vertex AI
karpathy/ng-video-lecture
tloen/alpaca-lora
Instruct-tune LLaMA on consumer hardware
datawhalechina/office-automation
Python office automation
EleutherAI/gpt-neo
An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
tensorflow/text
Making text a first-class citizen in TensorFlow.
microsoft/tensorflow-directml-plugin
DirectML PluggableDevice plugin for TensorFlow 2
microsoft/tensorflow-directml
Fork of TensorFlow accelerated by DirectML
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
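As a rough illustration of the idea behind that repo (this is a NumPy sketch of the LoRA concept, not loralib's actual API): instead of updating a full weight matrix W, LoRA trains a low-rank update B @ A with rank r much smaller than W's dimensions, so far fewer parameters are trainable.

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 128, 4, 8  # illustrative sizes, not from the paper

W = rng.normal(size=(d_out, d_in))     # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))               # trainable, zero-initialized

def lora_forward(x):
    # Base path plus scaled low-rank adapter path; because B starts at
    # zero, the adapted model initially matches the pretrained one exactly.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.normal(size=(2, d_in))
assert np.allclose(lora_forward(x), x @ W.T)  # identical at initialization
print(A.size + B.size, "trainable params vs", W.size, "in the full matrix")
```

Here only r * (d_in + d_out) = 768 parameters are trained instead of 8192, which is the memory/compute saving the paper targets.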
ajhalthor/Transformer-Neural-Network
Transformer neural network components, coded piece by piece
google/flaxformer
gmontamat/poor-mans-transformers
Implement Transformers (and Deep Learning) from scratch in NumPy
eriklindernoren/ML-From-Scratch
Machine Learning From Scratch. Bare bones NumPy implementations of machine learning models and algorithms with a focus on accessibility. Aims to cover everything from linear regression to deep learning.
jsbaan/transformer-from-scratch
Well documented, unit tested, type checked and formatted implementation of a vanilla transformer - for educational purposes.
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
n2cholas/awesome-jax
A curated list of JAX resources (https://github.com/google/jax)
gordicaleksa/get-started-with-JAX
The purpose of this repo is to make it easy to get started with JAX, Flax, and Haiku. It contains my "Machine Learning with JAX" series of tutorials (YouTube videos and Jupyter Notebooks) as well as the content I found useful while learning about the JAX ecosystem.
google/flax
Flax is a neural network library for JAX that is designed for flexibility.
openlm-research/open_llama
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
lucidrains/PaLM-rlhf-pytorch
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
lucidrains/PaLM-pytorch
Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways
lucidrains/PaLM-jax
Implementation of the specific Transformer architecture from PaLM - Scaling Language Modeling with Pathways - in Jax (Equinox framework)
google-deepmind/dm-haiku
JAX-based neural network library
jax-ml/jax
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
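A tiny sketch of the composable transformations that description names (the function and values here are made up for illustration): `grad` differentiates, `vmap` vectorizes over a batch axis, and `jit` compiles the result.

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # Simple scalar loss: squared error of w * x against a target of 1.0.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)                        # differentiate w.r.t. w
batched = jax.vmap(grad_loss, in_axes=(None, 0))  # vectorize over a batch of x
fast = jax.jit(batched)                           # JIT-compile the pipeline

w = 2.0
xs = jnp.array([1.0, 2.0, 3.0])
print(fast(w, xs))  # per-example gradients: [2., 12., 30.]
```

The point is that the three transformations compose freely: `jit(vmap(grad(f)))` is an ordinary Python function, which is the programming model the repo's description is summarizing.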
lxe/simple-llm-finetuner
Simple UI for LLM Model Finetuning