batch-normalization
There are 187 repositories under the batch-normalization topic.
pytorch-syncbn
Synchronized Multi-GPU Batch Normalization for PyTorch, based on https://github.com/tamakoji/pytorch-syncbn
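This repository ships its own synchronized-BN kernels; as a rough illustration of the same idea, the sketch below uses stock PyTorch's torch.nn.SyncBatchNorm (an assumption made for illustration, not this repository's API), which all-reduces batch statistics across GPUs under DistributedDataParallel.

```python
# Minimal sketch: converting ordinary BatchNorm layers to synchronized ones
# with PyTorch's built-in SyncBatchNorm (illustration only, not pytorch-syncbn's API).
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.BatchNorm2d(16),   # uses per-GPU batch statistics by default
    nn.ReLU(),
)

# Replace every BatchNorm* module with SyncBatchNorm; statistics are then
# reduced across all processes of the (default) process group.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
```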
revisiting-bn-init
Code for "Revisiting Batch Norm Initialization".
GAN_image_colorizing
Image colorization with generative adversarial networks on the CIFAR10 dataset.
GoogLeNet
Implementation of the GoogLeNet series of algorithms.
Deep-Learning-Specialization
Coursera - Deep Learning Specialization - deeplearning.ai
dropout-vs-batch-normalization
Dropout vs. batch normalization: effect on accuracy, training and inference times - code for the paper
BFMD-SN-U-net
The open source code for the paper "Block Attention and Switchable Normalization based Deep Learning Framework for Segmentation of Retinal Vessels"
pytorch_ConvUnitOptimization
This is the official repository of "Improving generalization of Batch Whitening by Convolutional Unit Optimization", ICCV 2021.
SimpleNN
A simple neural network library in C++
RealisticTTA
Official repository for the AAAI 2024 paper "Unraveling Batch Normalization for Realistic Test-Time Adaptation".
TheSchoolOfAI
Projects for The School of AI
Problem-of-BatchNorm
Playground repository highlighting a problem with BatchNorm layers, for a blog article
Keras_IEBN
Unofficial Keras implementation of the paper "Instance Enhancement Batch Normalization".
Kuzushiji-DropBlock
Japanese Handwritten Character Recognition using DropBlock Regularization
vanishing-gradients
Avoiding the vanishing gradients problem by adding random noise and batch normalization
keras-mode-normalization
Keras Implementation of Mode Normalization (Lucas Deecke, Iain Murray, Hakan Bilen, 2018)
dl_stereo_matching
A TensorFlow implementation of the models described in the paper "Efficient Deep Learning for Stereo Matching"
Networks
A library for building feed-forward neural networks, convolutional nets, linear regression, and logistic regression models.
awesome-convnets
A list of papers I used for my thesis about convolutional neural networks and batch normalization
SVHN-Deep-Neural-Network
An image classification neural network for the Street View House Numbers (SVHN) dataset.
Improving-Deep-Neural-Networks-Hyperparameter-tuning-Regularization-and-Optimization
The course Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization, the second course of the Deep Learning specialization. This repository contains all of the solved exercises. https://www.coursera.org/learn/neural-networks-deep-learning
functional-whitening
M. Vidal & A. M. Aguilera. Novel whitening approaches in functional settings. Stat, 12(1), e516.
bidaf-question-answering
Bi-Directional Attention Flow (BiDAF) question answering model enhanced by multi-layer convolutional neural network character embeddings.
bn-advex-zhang-fixup
Code for ResNet-Fixup experiments as part of "Batch Norm is a Cause of Adversarial Vulnerability" presented at http://deep-phenomena.org/
batch-norm
An implementation of the batch normalization technique for a feed-forward neural network.
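At its core (a minimal NumPy sketch with assumed shapes and names, not this repository's code), batch normalization standardizes each feature over the mini-batch and then applies a learnable scale and shift:

```python
# Minimal sketch of the batch normalization forward pass for a fully
# connected layer (illustration only).
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """x: (batch, features); gamma, beta: (features,) learnable scale/shift."""
    mean = x.mean(axis=0)                    # per-feature mini-batch mean
    var = x.var(axis=0)                      # per-feature mini-batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta              # scale and shift

x = np.random.randn(32, 64)                  # a mini-batch of 32 activations
y = batch_norm_forward(x, gamma=np.ones(64), beta=np.zeros(64))
```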
Densenet-Tensorflow
A DenseNet implementation in TensorFlow
Deeplearning-Models
Deep learning models in Python
LeukoDif
Denoising Diffusion Medical Model (DDMM) in PyTorch for generating Acute Lymphoblastic Leukemia datasets 🩺💜
American-Sign-Language-Detection-CNN
Unleashing the Power of CNNs for Precise American Sign Language Recognition.
BATCH-FILE
I'm great with batch files!
Deep-Learning-Introduction
A collection of deep learning exercises from an Intro to Deep Learning course. We use TensorFlow and Keras to build and train neural networks for structured data.
One-Piece-Image-Classifier
A quick image classifier trained with manually selected One Piece images.
Batch-Normalisation
This notebook shows you one way to add batch normalization to a neural network built in PyTorch.
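For reference, a minimal sketch of that idea (assumed layer sizes, not taken from the notebook): nn.BatchNorm1d placed between a linear layer and its activation.

```python
# Minimal sketch: adding batch normalization to a small feed-forward
# PyTorch network (layer sizes are assumptions for illustration).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalizes the 256 pre-activations per mini-batch
    nn.ReLU(),
    nn.Linear(256, 10),
)

model.train()                      # batch statistics are used in training mode
out = model(torch.randn(32, 784))  # a dummy mini-batch of 32 flattened images
model.eval()                       # running statistics are used at inference
```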
Coursera-Deep-Learning
A five-course specialization covering the foundations of Deep Learning, from building CNNs, RNNs & LSTMs to choosing model configurations and techniques such as Adam, Dropout, BatchNorm, Xavier/He initialization, and others.