MartinaCadiz's Stars
Spandan-Madan/DeepLearningProject
An in-depth machine learning tutorial introducing readers to a complete machine learning pipeline from scratch.
cmhungsteve/Awesome-Transformer-Attention
A comprehensive paper list on Vision Transformer/Attention, including papers, code, and related websites
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
stellargraph/stellargraph
StellarGraph - Machine Learning on Graphs
thuml/Autoformer
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
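The decomposition half of Autoformer's title is easy to illustrate: a moving average extracts the slow trend and the residual is treated as the seasonal component. A minimal PyTorch sketch of that idea (names are illustrative, not the repo's API):

```python
import torch
import torch.nn as nn

class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        # AvgPool1d with stride 1; we pad manually so output length matches input.
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels)
        # Replicate the endpoints so the moving average is defined everywhere.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        end = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, end], dim=1)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend

x = torch.randn(4, 96, 7)                       # (batch, length, channels)
seasonal, trend = SeriesDecomposition(25)(x)
print(seasonal.shape, trend.shape)              # both torch.Size([4, 96, 7])
```

Autoformer applies decomposition blocks like this repeatedly inside its encoder and decoder, with the auto-correlation mechanism operating on the seasonal part.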
NX-AI/xlstm
Official repository of xLSTM (Extended Long Short-Term Memory).
ZhuiyiTechnology/roformer
Rotary Transformer
google/edward2
A simple probabilistic programming language.
lucidrains/rotary-embedding-torch
Implementation of Rotary Embeddings, from the Roformer paper, in Pytorch
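Rotary embeddings encode position by rotating each pair of query/key channels through a position-dependent angle, so attention scores end up depending only on relative offsets. A self-contained sketch of the mechanism (the function name is illustrative; this is not the library's API):

```python
import torch

def rotary_embed(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary position embeddings (RoPE) to queries or keys.

    x: (..., seq_len, dim) with dim even. Each consecutive channel pair is
    rotated by an angle proportional to its position in the sequence.
    """
    seq_len, dim = x.shape[-2], x.shape[-1]
    # Per-pair frequencies: theta_i = base^(-2i / dim), as in the Roformer paper.
    inv_freq = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * inv_freq
    cos, sin = angles.cos(), angles.sin()       # (seq_len, dim/2)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    # Standard 2-D rotation applied to each (x1, x2) channel pair.
    rotated = torch.stack([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)
    return rotated.flatten(-2)

q = torch.randn(8, 128, 64)                     # (heads, seq_len, head_dim)
print(rotary_embed(q).shape)                    # torch.Size([8, 128, 64])
```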
SimonKohl/probabilistic_unet
A U-Net combined with a variational auto-encoder that learns conditional distributions over semantic segmentations.
yandex-research/tab-ddpm
[ICML 2023] The official implementation of the paper "TabDDPM: Modelling Tabular Data with Diffusion Models"
axelbrando/Mixture-Density-Networks-for-distribution-and-uncertainty-estimation
A generic Mixture Density Network (MDN) implementation for distribution and uncertainty estimation using Keras (TensorFlow)
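The MDN idea: the network outputs the weights, means, and variances of a Gaussian mixture and is trained on the mixture's negative log-likelihood, so it predicts a full distribution rather than a point estimate. A minimal sketch, written in PyTorch rather than the repo's Keras (names are illustrative):

```python
import math
import torch
import torch.nn as nn

class MDNHead(nn.Module):
    """Mixture density head: maps features to a K-component Gaussian mixture."""
    def __init__(self, in_features: int, n_components: int = 5):
        super().__init__()
        self.pi = nn.Linear(in_features, n_components)         # mixture logits
        self.mu = nn.Linear(in_features, n_components)         # component means
        self.log_sigma = nn.Linear(in_features, n_components)  # log std-devs

    def forward(self, h):
        return self.pi(h), self.mu(h), self.log_sigma(h)

def mdn_nll(pi_logits, mu, log_sigma, y):
    """Negative log-likelihood of targets y under the predicted mixture."""
    sigma = log_sigma.exp()
    # log N(y | mu_k, sigma_k) per component, computed in a stable form.
    log_prob = (-0.5 * ((y.unsqueeze(-1) - mu) / sigma) ** 2
                - log_sigma - 0.5 * math.log(2 * math.pi))
    log_mix = torch.logsumexp(torch.log_softmax(pi_logits, dim=-1) + log_prob, dim=-1)
    return -log_mix.mean()

head = MDNHead(in_features=16, n_components=5)
h, y = torch.randn(32, 16), torch.randn(32)
loss = mdn_nll(*head(h), y)
loss.backward()
```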
ds4dm/ecole
Extensible Combinatorial Optimization Learning Environments
stefanknegt/Probabilistic-Unet-Pytorch
A Probabilistic U-Net for segmentation of ambiguous images implemented in PyTorch
alinlab/CSI
CSI: Novelty Detection via Contrastive Learning on Distributionally Shifted Instances (NeurIPS 2020)
team-approx-bayes/dl-with-bayes
Contains code for the NeurIPS 2019 paper "Practical Deep Learning with Bayesian Principles"
Mephisto405/Learning-Loss-for-Active-Learning
Reproducing the experimental results of LL4AL (Yoo et al., CVPR 2019)
EderSantana/gumbel
Gumbel-Softmax Variational Autoencoder with Keras
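The Gumbel-Softmax trick behind this VAE draws approximately one-hot samples from a categorical distribution while remaining differentiable, which lets gradients flow through discrete latent variables. A minimal sketch (PyTorch rather than the repo's Keras; illustrative, not the repo's code):

```python
import torch
import torch.nn.functional as F

def gumbel_softmax_sample(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Differentiable, approximately one-hot sample from a categorical.

    Adds Gumbel(0, 1) noise to the logits, then applies a temperature-scaled
    softmax; as temperature -> 0, samples approach exact one-hot vectors.
    """
    eps = 1e-20
    u = torch.rand_like(logits)
    gumbel = -torch.log(-torch.log(u + eps) + eps)  # inverse-CDF Gumbel(0, 1) noise
    return F.softmax((logits + gumbel) / temperature, dim=-1)

logits = torch.randn(4, 10)                     # batch of 4, 10 categories
z = gumbel_softmax_sample(logits, temperature=0.5)
print(z.sum(dim=-1))                            # each row sums to 1
```

PyTorch also ships this operation as torch.nn.functional.gumbel_softmax.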
coastalcph/hierarchical-transformers
Hierarchical Attention Transformers (HAT)
lhirschfeld/ChempropUncertaintyQuantification
Message Passing Neural Networks for Molecule Property Prediction
jayheo/UA
saif-mahmud/hierarchical-attention-HAR
[PAKDD 2021] Hierarchical Self-Attention-Based Autoencoder for Open-Set Human Activity Recognition
xiaoyuxin1002/UQ-PLM
Uncertainty Quantification with Pre-trained Language Models: An Empirical Analysis
amzn/sto-transformer
StijnVerdenius/Lat-PFN
Introduces LaT-PFN, a novel time series model that combines the PFN and JEPA frameworks for efficient zero-shot forecasting; its versatile latent space enables adaptable time granularity and superior predictive performance.
aritraghsh09/GaMPEN
ML framework to estimate Bayesian posteriors of galaxy morphological parameters
bsantraigi/HIER
Official Repo for Implementations of Models/Experiments in "Hierarchical Transformer for Task Oriented Dialog Systems" - NAACL 2021 (Long Paper)
felipeelorrieta/IAR_Model
Contains R and Python functions for fitting unequally spaced time series with the Irregular Autoregressive (IAR) model. These functions can generate observations from each process, compute its negative log-likelihood, fit the model to irregularly sampled data, and test the significance of the estimated parameters.
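For reference, a hedged sketch of what that negative log-likelihood looks like, assuming the IAR form from Elorrieta et al. (conditionally on the past, y_j is Gaussian with mean phi^dt * y_{j-1} and variance sigma^2 * (1 - phi^(2*dt))); the names are illustrative, not the repo's API:

```python
import numpy as np

def iar_nll(phi: float, y: np.ndarray, t: np.ndarray, sigma2: float = 1.0) -> float:
    """Negative log-likelihood of the IAR model (0 < phi < 1) at irregular times t."""
    dt = np.diff(t)
    mean = phi ** dt * y[:-1]
    var = sigma2 * (1.0 - phi ** (2.0 * dt))
    # The first observation follows the stationary N(0, sigma2) distribution.
    ll = -0.5 * (np.log(2 * np.pi * sigma2) + y[0] ** 2 / sigma2)
    ll += np.sum(-0.5 * (np.log(2 * np.pi * var) + (y[1:] - mean) ** 2 / var))
    return -ll

# Simulate an IAR process at irregular times and recover phi by grid search.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 100, size=200))
y = np.zeros_like(t)
y[0] = rng.normal()
for j in range(1, len(t)):
    d = t[j] - t[j - 1]
    y[j] = 0.9 ** d * y[j - 1] + np.sqrt(1 - 0.9 ** (2 * d)) * rng.normal()
grid = np.linspace(0.01, 0.99, 99)
phi_hat = grid[np.argmin([iar_nll(p, y, t) for p in grid])]
print(round(float(phi_hat), 2))                 # should be close to 0.9
```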
omersan/Practice
luvalenz/time-series-variability-tree