mnaylor5's Stars
huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
CamDavidsonPilon/Probabilistic-Programming-and-Bayesian-Methods-for-Hackers
aka "Bayesian Methods for Hackers": An introduction to Bayesian methods + probabilistic programming with a computation/understanding-first, mathematics-second point of view. All in pure Python ;)
d2l-ai/d2l-en
Interactive deep learning book with multi-framework code, math, and discussions. Adopted at 500 universities from 70 countries including Stanford, MIT, Harvard, and Cambridge.
modularml/mojo
The Mojo Programming Language
fastai/fastbook
The fastai book, published as Jupyter Notebooks
microsoft/qlib
Qlib is an AI-oriented quantitative investment platform that aims to realize the potential, empower research, and create value using AI technologies in quantitative investment, from exploring ideas to implementing production systems. Qlib supports diverse machine learning modeling paradigms, including supervised learning, market dynamics modeling, and RL.
karpathy/micrograd
A tiny scalar-valued autograd engine and a neural net library on top of it with PyTorch-like API
openai/spinningup
An educational resource to help anyone learn deep reinforcement learning.
facebookresearch/xformers
Hackable and optimized Transformers building blocks, supporting a composable construction.
mosaicml/composer
Supercharge Your Model Training
pytorch/captum
Model interpretability and understanding for PyTorch
MIT-LCP/mimic-code
MIMIC Code Repository: Code shared by the research community for the MIMIC family of databases
huggingface/optimum
🚀 Accelerate training and inference of 🤗 Transformers and 🤗 Diffusers with easy-to-use hardware optimization tools
state-spaces/s4
Structured state space sequence models
allenai/longformer
Longformer: The Long-Document Transformer
jalammar/ecco
Explain, analyze, and visualize NLP language models. Ecco creates interactive visualizations directly in Jupyter notebooks explaining the behavior of Transformer-based language models (like GPT2, BERT, RoBERTA, T5, and T0).
idiap/fast-transformers
PyTorch library for fast transformer implementations
rio-labs/rio
Web apps in pure Python. No JavaScript, HTML, or CSS needed
DeepReinforcementLearning/DeepReinforcementLearningInAction
Code from the Deep Reinforcement Learning in Action book from Manning, Inc.
EmilyAlsentzer/clinicalBERT
Repository for Publicly Available Clinical BERT Embeddings
google-research/bigbird
Transformers for Longer Sequences
mlpen/Nystromformer
facebookresearch/mega
Sequence modeling with Mega.
lucidrains/Mega-pytorch
Implementation of Mega, the Single-head Attention with Multi-headed EMA architecture that currently holds SOTA on Long Range Arena
OpenBeta/open-tacos
Rock climbing route catalog (openbeta.io)
bvanaken/clinical-outcome-prediction
Code for the EACL 2021 Paper: Clinical Outcome Prediction from Admission Notes using Self-Supervised Knowledge Integration
OpenBeta/climbing-data
Open license climbing data
jvfe/pytrials
Python package that wraps around the ClinicalTrials.gov API
simonlevine/clinical-longformer
SeanNaren/CORD-19-ANN
ANN Search through the COVID CORD-19 Dataset using SBERT.