1348598339's Stars
keras-team/keras
Deep Learning for humans
ultralytics/yolov5
YOLOv5 🚀 in PyTorch > ONNX > CoreML > TFLite
BlinkDL/RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
pytorch/tutorials
PyTorch tutorials.
thuml/Time-Series-Library
A Library for Advanced Deep Time Series Models.
timeseriesAI/tsai
State-of-the-art deep learning library for time series and sequences in PyTorch / fastai.
cmhungsteve/Awesome-Transformer-Attention
A comprehensive paper list on Vision Transformers/attention, including papers, code, and related websites.
pytorch/ignite
High-level library to help with training and evaluating neural networks in PyTorch flexibly and transparently.
Nixtla/nixtla
TimeGPT-1: a production-ready pre-trained time series foundation model for forecasting and anomaly detection. A generative pretrained transformer for time series, trained on over 100B data points, it accurately forecasts across domains such as retail, electricity, finance, and IoT with just a few lines of code 🚀.
ddz16/TSFpaper
A reading list of papers on Time Series Forecasting/Prediction (TSF) and Spatio-Temporal Forecasting/Prediction (STF), categorized mainly by model type.
thuml/Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
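The "decomposition" in Autoformer's title refers to repeatedly splitting a series into a slow-moving trend and a seasonal residual. A minimal NumPy sketch of that idea (the repo itself implements it as a PyTorch module with edge padding; names here are illustrative):

```python
import numpy as np

def series_decomp(x, kernel_size=25):
    """Split a 1-D series into trend (moving average) and seasonal (residual) parts.

    Mirrors the idea of Autoformer's decomposition block, not the repo's code.
    """
    pad = kernel_size // 2
    # Replicate-pad both ends so the moving average keeps the original length.
    padded = np.concatenate([np.repeat(x[0], pad), x, np.repeat(x[-1], pad)])
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")
    seasonal = x - trend
    return seasonal, trend
```

By construction `seasonal + trend` reconstructs the input exactly, which is what lets the model stack these blocks without losing information.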
yuqinie98/PatchTST
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers" (ICLR 2023), https://arxiv.org/abs/2211.14730
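The "64 words" in the title come from slicing the input series into overlapping patches that the transformer then treats as tokens. A hedged sketch of that patching step (parameter names are illustrative, not the repo's API):

```python
import numpy as np

def patchify(series, patch_len=16, stride=8):
    """Slice a 1-D series into overlapping patches ("words"), as in PatchTST.

    Returns an array of shape (num_patches, patch_len).
    """
    n = len(series)
    num_patches = (n - patch_len) // stride + 1
    return np.stack(
        [series[i * stride : i * stride + patch_len] for i in range(num_patches)]
    )
```

With a lookback of 336, `patch_len=16`, and `stride=8`, this yields 41 tokens instead of 336, which is what makes attention over long histories cheap.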
bojone/vae
A simple VAE and CVAE in Keras.
cerlymarco/tsmoothie
A Python library for vectorized time-series smoothing and outlier detection.
laiguokun/LSTNet
opencv/opencv_zoo
Model Zoo For OpenCV DNN and Benchmarks.
mblondel/soft-dtw
Python implementation of soft-DTW.
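Soft-DTW replaces the hard `min` in the DTW recursion with a smoothed soft-min, making the alignment cost differentiable. A from-scratch sketch of the recursion (this is the general technique of Cuturi & Blondel, not the mblondel/soft-dtw package's API):

```python
import numpy as np

def soft_dtw(x, y, gamma=1.0):
    """Soft-DTW discrepancy between two 1-D sequences.

    R[i, j] = cost(i, j) + softmin_gamma(R[i-1, j], R[i, j-1], R[i-1, j-1]),
    where softmin_gamma(a) = -gamma * log(sum(exp(-a / gamma))).
    """
    n, m = len(x), len(y)
    D = (np.asarray(x)[:, None] - np.asarray(y)[None, :]) ** 2  # pairwise costs
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            r = np.array([R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]])
            rmin = r.min()
            # Shift by rmin for numerical stability before the log-sum-exp.
            softmin = rmin - gamma * np.log(np.exp(-(r - rmin) / gamma).sum())
            R[i, j] = D[i - 1, j - 1] + softmin
    return R[n, m]
```

As `gamma` approaches 0 the soft-min approaches the hard min and the value converges to classical DTW; larger `gamma` gives a smoother (and slightly biased) loss that gradients flow through.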
thuml/Nonstationary_Transformers
Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415
ant-research/Pyraformer
GeoscienceAustralia/ginan
The Australian Government, through Positioning Australia (part of Geoscience Australia), funds the design, development, and operation of a Global Navigation Satellite System (GNSS) position-correction system: the Ginan service and toolkit. Applying the Ginan correction service can improve a GNSS device's positioning accuracy from metres to centimetres across Australia. The software in this repository (the Ginan toolkit) is used to create the service and is available now under an open-source licence, giving individuals and organisations no-cost access to the Ginan software and service as a public good.
DAMO-DI-ML/ICML2022-FEDformer
Source code of ICML'22 paper: FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting
plumprc/MTS-Mixers
MTS-Mixers: Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing
DAMO-DI-ML/KDD2023-DCdetector
google-research/soft-dtw-divergences
An implementation of soft-DTW divergences.
michaelgrund/GMT-plotting
Collection of GMT (Generic Mapping Tools) scripts, jupyter notebooks (using PyGMT) and files (including digitized map content, colormaps, grid files etc.)
weepon/feature_selection
Common feature selection methods.
plumprc/RTSF
Revisiting Long-term Time Series Forecasting: An Investigation on Linear Mapping
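The "linear mapping" this paper revisits is the baseline of fitting a single linear map from the lookback window to the forecast horizon. A minimal least-squares sketch under that reading (names and defaults are illustrative, not the repo's code):

```python
import numpy as np

def fit_linear_forecaster(series, lookback=96, horizon=24):
    """Fit one linear map W so that (last `lookback` points) @ W
    approximates the next `horizon` points, via least squares."""
    X, Y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i : i + lookback])
        Y.append(series[i + lookback : i + lookback + horizon])
    X, Y = np.asarray(X), np.asarray(Y)
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)  # shape (lookback, horizon)
    return W

def predict(W, window):
    """Forecast `horizon` steps from one lookback window."""
    return np.asarray(window) @ W
```

Despite having no nonlinearity, this baseline extrapolates trends exactly and is the kind of model the paper compares against far heavier transformer forecasters.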
zhuozhudd/PyTorch-Course-Note
seunghan96/pits
Jhryu30/AnomalyBERT4ESS