ChenK19's Stars
xai-org/grok-1
Grok open release
microsoft/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Lightning-AI/pytorch-lightning
Pretrain, finetune, and deploy AI models on multiple GPUs and TPUs with zero code changes.
microsoft/unilm
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Dao-AILab/flash-attention
Fast and memory-efficient exact attention
CompVis/latent-diffusion
High-Resolution Image Synthesis with Latent Diffusion Models
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
huggingface/accelerate
🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, with automatic mixed precision (including fp8) and easy-to-configure FSDP and DeepSpeed support
yl4579/StyleTTS2
StyleTTS 2: Towards Human-Level Text-to-Speech through Style Diffusion and Adversarial Training with Large Speech Language Models
vispy/vispy
Main repository for Vispy
schrodingercatss/tuning_playbook_zh_cn
A tactical playbook that systematically teaches you how to maximize the performance of deep learning models.
s3prl/s3prl
Self-Supervised Speech Pre-training and Representation Learning Toolkit
EleutherAI/pythia
The hub for EleutherAI's work on interpretability and learning dynamics
dstat-real/dstat
Versatile resource statistics tool (the real one, not the Red Hat clone)
fangwei123456/spikingjelly
SpikingJelly is an open-source deep learning framework for Spiking Neural Network (SNN) based on PyTorch.
braindecode/braindecode
Deep learning software to decode EEG, ECG or MEG signals
GFNOrg/gflownet
Generative Flow Networks
eeyhsong/EEG-Conformer
EEG Transformer 2.0: (i) a convolutional Transformer for EEG decoding; (ii) a novel visualization, Class Activation Topography.
kakaobrain/torchlars
A LARS implementation in PyTorch
EtienneCmb/visbrain
A multi-purpose GPU-accelerated open-source suite for brain data visualization
thuml/Large-Time-Series-Model
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
jonaskohler/stereoEEG2speech
Code for a seq2seq architecture with Bahdanau attention designed to map stereotactic EEG data from human brains to spectrograms, implemented in PyTorch Lightning.
Jason-Qiu/EEG_Language_Alignment
[EMNLP 2023] An Empirical Exploration of Cross-domain Alignment between Language and Electroencephalogram
lRomul/sensorium
NeurIPS | 1st place solution for Sensorium 2023 Competition
postech-ami/Sound2Scene
neuralinterfacinglab/SingleWordProductionDutch
Scripts to work with an intracranial EEG dataset of speech production.
kevmtan/electroCUDA
Robust electrophysiology tools with GPU acceleration
neurotechcenter/VERA
Versatile Electrode localization fRAmework
BruntonUWBio/ajile12-nwb-data
elliothsmith/IEDs
Code for detecting, measuring, and visualizing interictal epileptiform discharges.