deep-learning-architectures
There are 21 repositories under the deep-learning-architectures topic.
mead-ml/mead-baseline
Deep-Learning Model Exploration and Development for NLP
gmh14/RobNets
[CVPR 2020] When NAS Meets Robustness: In Search of Robust Architectures against Adversarial Attacks
ashishsalunkhe/DeepSpamReview-Detection-of-Fake-Reviews-on-Online-Review-Platforms-using-DeepLearning-Architectures
DeepSpamReview: Detection of Fake Reviews on Online Review Platforms using Deep Learning Architectures. Summer Internship project at CoreView Systems.
Totoketchup/Adaptive-MultiSpeaker-Separation
Adaptive and Focusing Neural Layers for Multi-Speaker Separation Problem
basiralab/GSR-Net
Graph Super-Resolution Network using geometric deep learning.
mike-gimelfarb/cascade-correlation-neural-networks
A general framework for cascade correlation architectures in Python, with wrappers for Keras, TensorFlow, and scikit-learn
nutellamok/advrush
Official Code for AdvRush: Searching for Adversarially Robust Neural Architectures (ICCV '21)
raunakm90/AirWare
Deep learning architectures for in-air hand gesture recognition
SaashaJoshi/deep-learning-architectures
Deep Learning architectures in TensorFlow, Keras, and PyTorch.
vishal-keshav/xcelerator
Exploring RL ideas for deep neural network hyper-parameter search
yskim0/pytorch-cifar10
Implementing and training/testing popular model architectures on the CIFAR10 dataset.
HasibAlMuzdadid/Deep-Learning-Papers-Reading-Roadmap
Deep Learning papers reading roadmap for anyone who is eager to learn this amazing tech!
AkhithaBabu/Notes
Notes on ML and DL with Jupyter notebooks (Python)
jmaczan/deep-learning-pytorch
Deep Learning architectures implemented in PyTorch Lightning
MLD3/MLHC2018_SequenceTransformerNetworks
Code release for "Learning to Exploit Invariances in Clinical Time-Series Data Using Sequence Transformer Networks" (Oh, Wang, Wiens), MLHC 2018. https://arxiv.org/abs/1808.06725
MLD3/MLHC2019_Relaxed_Parameter_Sharing
Code release for "Relaxed Weight Sharing: Effectively Modeling Time-Varying Relationships in Clinical Time-Series" (Oh, Wang, Tang, Sjoding, Wiens), MLHC 2019. https://arxiv.org/abs/1906.02898
Morin3/hash-toolbox
Dheeraj2444/keras-examples
My experimentations with Keras
Ishan-Kotian/Tokenizer_NLP
Tokenization is a way of separating a piece of text into smaller units called tokens. Tokens can be words, characters, or subwords, so tokenization is broadly classified into three types: word, character, and subword (character n-gram) tokenization.
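The three tokenization types in the description above can be sketched in plain Python. This is a minimal illustration using only the standard library, not the repository's actual code; real NLP pipelines typically use a dedicated tokenizer library.

```python
def word_tokenize(text: str) -> list[str]:
    # Word tokenization: split on whitespace.
    return text.split()

def char_tokenize(text: str) -> list[str]:
    # Character tokenization: every character becomes a token.
    return list(text)

def ngram_tokenize(text: str, n: int = 3) -> list[str]:
    # Subword (character n-gram) tokenization: overlapping n-grams.
    return [text[i:i + n] for i in range(len(text) - n + 1)]

print(word_tokenize("deep learning"))  # ['deep', 'learning']
print(char_tokenize("deep"))           # ['d', 'e', 'e', 'p']
print(ngram_tokenize("deep", 3))       # ['dee', 'eep']
```

Subword schemes used in practice (e.g. BPE or WordPiece) are learned from a corpus rather than fixed-length n-grams, but the n-gram version shows the idea of units between characters and whole words.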
Morin3/storybook
Storybook is the industry standard workshop for building, documenting, and testing UI components in isolation