Marytong6's Stars
ChihebTrabelsi/deep_complex_networks
Implementation accompanying the paper "Deep Complex Networks"
gpeyre/SinkhornAutoDiff
Toolbox to integrate optimal transport loss functions using automatic differentiation and Sinkhorn's algorithm
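The Sinkhorn algorithm behind this toolbox alternates two scaling updates on the kernel `K = exp(-C/eps)` until the transport plan's marginals match the target distributions. A minimal pure-Python sketch (the function name and dense-loop style are illustrative, not the repo's API):

```python
import math

def sinkhorn_plan(C, a, b, eps=0.1, iters=200):
    """Entropic optimal transport via Sinkhorn iterations (illustrative sketch).
    C: cost matrix (list of lists), a/b: source/target marginals summing to 1."""
    n, m = len(a), len(b)
    # Gibbs kernel from the cost matrix
    K = [[math.exp(-C[i][j] / eps) for j in range(m)] for i in range(n)]
    u, v = [1.0] * n, [1.0] * m
    for _ in range(iters):
        # alternately rescale rows and columns to match the marginals
        u = [a[i] / sum(K[i][j] * v[j] for j in range(m)) for i in range(n)]
        v = [b[j] / sum(K[i][j] * u[i] for i in range(n)) for j in range(m)]
    # transport plan P = diag(u) K diag(v)
    return [[u[i] * K[i][j] * v[j] for j in range(m)] for i in range(n)]
```

With a small `eps` the plan concentrates on the cheapest matching; the toolbox's point is that these iterations are differentiable, so the resulting loss can be backpropagated through.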
MoranCoder95/MDLR
thuml/Time-Series-Library
A Library for Advanced Deep Time Series Models.
mims-harvard/UniTS
A unified multi-task time series model.
wxie9/CARD
wang-fujin/PINN4SOH
A physics-informed neural network for battery SOH estimation
TL-UESTC/Domain-Adaptive-Remaining-Useful-Life-Prediction-with-Transformer
Pytorch implementation for Domain Adaptive Remaining Useful Life Prediction with Transformer
orobix/mdd-domain-adaptation
Simple reimplementation of Maximum Density Divergence for Unsupervised Domain Adaptation (https://arxiv.org/abs/2004.12615) in PyTorch Lightning
lijin118/ATM
Maximum Density Divergence for Domain Adaptation, TPAMI 2020, Code release, Cross-domain Adversarial Tight Match
ritikdhame/Electricity_Demand_and_Price_forecasting
Time series forecasting models for electricity demand and price prediction, including an XGBoost regressor, GRU (Gated Recurrent Unit), LSTM (Long Short-Term Memory), CNN (Convolutional Neural Network), CNN-LSTM, and LSTM-Attention, plus hybrid GRU-XGBoost and LSTM-Attention-XGBoost models
thistleknot/cnn_lstm
CNN-LSTM with attention for time series forecasting, using walk-forward moving windows and subset selection, with prediction intervals
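The walk-forward moving-window evaluation mentioned here slides a fixed training window forward in time, always testing on data that comes strictly after it. A minimal sketch of that splitting logic (function name and parameters are my own, not the repo's):

```python
def walk_forward_splits(n, train_size, test_size, step):
    """Yield (train_indices, test_indices) pairs that slide forward in time,
    so each test window lies strictly after its training window."""
    splits = []
    start = 0
    while start + train_size + test_size <= n:
        train = list(range(start, start + train_size))
        test = list(range(start + train_size, start + train_size + test_size))
        splits.append((train, test))
        start += step  # advance the window
    return splits
```

Unlike random cross-validation, this never lets the model train on observations from the future of its test set, which is essential for honest time series evaluation.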
lancopku/Prime
A simple module that consistently outperforms self-attention and Transformer models on major NMT datasets, achieving SoTA performance.
xmu-xiaoma666/External-Attention-pytorch
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, helpful for further understanding papers. ⭐⭐⭐
Zehui127/1d-swin
The implementation of 1d-swin, an efficient transformer for capturing hierarchical 1-dimensional long-range sequences
xyxdegithub/cwt_swinTransformer
A bearing fault diagnosis method based on wavelet time-frequency maps and the Swin Transformer
WangFeng18/Swin-Transformer
Implementation of the Swin Transformer in PyTorch
microsoft/Swin-Transformer
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
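The shifted-window idea shared by the Swin repos above (including the 1D variant) is simple at its core: attention is computed within fixed non-overlapping windows, and every other layer cyclically shifts the sequence so windows straddle the previous layer's boundaries. A toy 1D sketch of just the partitioning step (not taken from any of these codebases):

```python
def window_partition_1d(tokens, window):
    """Split a 1D token sequence into non-overlapping windows of equal size."""
    assert len(tokens) % window == 0, "sequence length must divide evenly"
    return [tokens[i:i + window] for i in range(0, len(tokens), window)]

def cyclic_shift(tokens, shift):
    """Roll the sequence so the next layer's windows cross old boundaries."""
    return tokens[shift:] + tokens[:shift]
```

Attention within each window costs O(window²) rather than O(n²) over the whole sequence; the shift (typically half the window size) is what lets information flow between neighboring windows across layers.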
berniwal/swin-transformer-pytorch
Implementation of the Swin Transformer in PyTorch.
Z-Sherkat/Hybrid-Model-Attention
philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
liuslnlp/CoupletAI
An automatic Chinese couplet generation system based on CNN + Bi-LSTM + Attention
Jongchan/attention-module
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
PatientEz/CNN-BiLSTM-Attention-Time-Series-Prediction_Keras
CNN+BiLSTM+Attention multivariate time series prediction implemented in Keras
valentinsulzer/kneepoint-review
Collaborative knee point review paper
alxndrTL/mamba.py
A simple and efficient Mamba implementation in pure PyTorch and MLX.
WenPengfei0823/PINN-Battery-Prognostics
State of Health (SoH) and Remaining Useful Life (RUL) prediction for Li-ion batteries based on Physics-Informed Neural Networks (PINN).
zshicode/MambaLithium
MambaLithium: Selective state space model for remaining-useful-life, state-of-health, and state-of-charge estimation of lithium-ion batteries
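The selective state space models used here (and in mamba.py above) reduce at their core to a gated linear recurrence: a hidden state is carried forward and mixed with each input, with the mixing parameters made input-dependent in the real architecture. A heavily simplified single-channel sketch of just the scan, with the per-step parameters passed in explicitly rather than computed from the input:

```python
def selective_scan(xs, A, Bs, Cs):
    """Toy 1-channel state-space recurrence (illustrative only):
        h_t = A * h_{t-1} + B_t * x_t
        y_t = C_t * h_t
    In Mamba-style models B_t and C_t are functions of x_t (the 'selective'
    part) and the scan runs in parallel; here it is a plain sequential loop."""
    h = 0.0
    ys = []
    for x, B, C in zip(xs, Bs, Cs):
        h = A * h + B * x   # decay old state, inject current input
        ys.append(C * h)    # read out the state
    return ys
```

The decay factor `A` controls how much history the state retains, which is what makes such models natural candidates for degradation signals like SoH and RUL.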
amirhosseinh77/Battery-Charging-DRL
DRL-based Fast Balance Charging Optimization for lithium-ion Batteries
ma921/SOBER
Fast Bayesian optimization, quadrature, and inference over arbitrary domains with GPU-parallel acceleration