leehyeonbeen's Stars
GuitarMechanics/ros_final_2023
Scripts for the final project
KHU-MASLAB/cNN-DP
A novel neural network for effective learning of highly impulsive/oscillatory dynamic systems by jointly utilizing low-order derivatives
openai/mujoco-py
MuJoCo is a physics engine for detailed, efficient rigid body simulations with contacts. mujoco-py allows using MuJoCo from Python 3.
pranz24/pytorch-soft-actor-critic
PyTorch implementation of Soft Actor-Critic
namsan96/SiMPL
AntixK/PyTorch-VAE
A Collection of Variational Autoencoders (VAE) in PyTorch.
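A minimal NumPy sketch of the reparameterization trick that the VAEs in this collection rely on; the array values here are illustrative, not taken from the repository.

```python
import numpy as np

# Reparameterization trick: instead of sampling z ~ N(mu, sigma^2) directly
# (which is not differentiable), sample eps ~ N(0, I) and shift/scale it.
rng = np.random.default_rng(0)

mu = np.array([0.5, -1.0])       # encoder output: mean (illustrative values)
log_var = np.array([0.0, 0.2])   # encoder output: log-variance

eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps   # differentiable w.r.t. mu and log_var

# KL divergence of N(mu, sigma^2) from N(0, I), the standard VAE regularizer:
kl = -0.5 * np.sum(1 + log_var - mu ** 2 - np.exp(log_var))
```

The same two lines (sampling and KL term) appear, in PyTorch form, in essentially every model in the collection.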
clvrai/spirl
Official implementation of "Accelerating Reinforcement Learning with Learned Skill Priors", Pertsch et al., CoRL 2020
syncdoth/RetNet
Huggingface compatible implementation of RetNet (Retentive Networks, https://arxiv.org/pdf/2307.08621.pdf) including parallel, recurrent, and chunkwise forward.
ggory15/ros_tutorials
zhouhaoyi/Informer2020
Code release for the paper "Informer", accepted at AAAI 2021.
thuml/Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
MAZiqing/FEDformer
cure-lab/LTSF-Linear
[AAAI-23 Oral] Official implementation of the paper "Are Transformers Effective for Time Series Forecasting?"
KHU-MASLAB/TimeSeriesSeq2Seq
Sequence-to-sequence model implementations including RNN, CNN, Attention, and Transformers using PyTorch
hyunwoongko/transformer
Transformer: PyTorch Implementation of "Attention Is All You Need"
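The core operation behind this implementation, scaled dot-product attention from "Attention Is All You Need", can be sketched in a few lines of NumPy; the shapes below are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the building block of the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the keys
    return weights @ V                               # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))   # 3 queries of dimension 4
K = rng.standard_normal((5, 4))   # 5 keys
V = rng.standard_normal((5, 4))   # 5 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The repository wraps this in multi-head form with learned projections, but the arithmetic per head is exactly this.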
KHU-MASLAB/RecurDynPython
RecurDyn automation using Python and ProcessNet
Spenhouet/tensorboard-aggregator
Aggregate multiple tensorboard runs to new summary or csv files
namhokim/cocoa_app
macOS applications
AuCson/PyTorch-Batch-Attention-Seq2seq
PyTorch implementation of batched bi-RNN encoder and attention-decoder.
microsoft/DeepSpeed
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
nlpodyssey/rwkv
RWKV (Receptance Weighted Key Value) is an RNN with Transformer-level performance
UdbhavPrasad072300/Transformer-Implementations
Library - Vanilla, ViT, DeiT, BERT, GPT
damnever/pigar
:coffee: A tool to generate requirements.txt for Python projects, and more than that. (IT IS NOT A PACKAGE MANAGEMENT TOOL)
BlinkDL/RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
UbuntuAsahi/ubuntu-asahi
Native Ubuntu installations for Apple silicon hardware
winfsp/sshfs-win
SSHFS For Windows
facebookresearch/segment-anything
The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
DedalusProject/dedalus
A flexible framework for solving PDEs with modern spectral methods.
Ceyron/machine-learning-and-simulation
All the handwritten notes 📝 and source code files 🖥️ used in my YouTube Videos on Machine Learning & Simulation (https://www.youtube.com/channel/UCh0P7KwJhuQ4vrzc3IRuw4Q)
sachabinder/Burgers_equation_simulation
Python script solving the 1D Burgers' equation using the FFT pseudo-spectral method.
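The pseudo-spectral idea behind this script can be sketched in pure NumPy: compute derivatives in Fourier space, form the nonlinear product in physical space, and step in time. The grid size, viscosity, and time step below are illustrative, not taken from the repository.

```python
import numpy as np

# One explicit Euler step of the viscous 1D Burgers' equation
#   u_t + u * u_x = nu * u_xx
# on a periodic domain, with derivatives evaluated spectrally.
N = 128                    # grid points (illustrative)
L = 2 * np.pi              # domain length
nu = 0.05                  # viscosity (illustrative)
dt = 1e-3                  # time step (illustrative)

x = np.linspace(0, L, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)      # wavenumbers

u = np.sin(x)                                    # initial condition
u_hat = np.fft.fft(u)
u_x = np.real(np.fft.ifft(1j * k * u_hat))       # spectral first derivative
u_xx = np.real(np.fft.ifft(-(k ** 2) * u_hat))   # spectral second derivative
u_new = u + dt * (-u * u_x + nu * u_xx)          # explicit Euler update
```

A production solver would add dealiasing and a higher-order time integrator, but the derivative-in-Fourier-space pattern is the essence of the method.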