Pinned Repositories
Advanced-Deep-Learning-with-Keras
Advanced Deep Learning with Keras, published by Packt
adversarial
Code and hyperparameters for the paper "Generative Adversarial Networks"
Anomaly-ReactionRL
Using RL for anomaly detection in NSL-KDD
Artificial-Intelligence
Autoencoder-Classification
Autoencoder_standfordcars
Convolutional Autoencoder with SetNet in PyTorch
Awesome-Deep-Neural-Network-Compression
Summary and code for deep neural network quantization
Awesome-Pruning
A curated list of neural network pruning resources.
B_net
branchynet
max2022's Repositories
max2022/Artificial-Intelligence
max2022/Advanced-Deep-Learning-with-Keras
Advanced Deep Learning with Keras, published by Packt
max2022/Anomaly-ReactionRL
Using RL for anomaly detection in NSL-KDD
max2022/Autoencoder-Classification
max2022/Awesome-Deep-Neural-Network-Compression
Summary and code for deep neural network quantization
max2022/classification_models
Classification models trained on ImageNet, implemented in Keras.
max2022/Compiler_decaf_2020_spring
max2022/cpu-energy-meter
A tool for measuring energy consumption of Intel CPUs
max2022/flops-counter.pytorch
FLOPs counter for convolutional networks in the PyTorch framework
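For context, this is the library commonly installed from PyPI as `ptflops`; a minimal usage sketch, assuming its documented `get_model_complexity_info` entry point (the ResNet-18 choice is just illustrative):

```python
# Sketch: count multiply-accumulates (MACs) and parameters of a torchvision
# ResNet-18. The `ptflops` package name and signature are assumptions taken
# from the repository's documentation.
import torchvision.models as models
from ptflops import get_model_complexity_info

model = models.resnet18()
macs, params = get_model_complexity_info(
    model, (3, 224, 224),            # input shape without the batch dimension
    as_strings=True,                 # human-readable "x.xx GMac" style output
    print_per_layer_stat=False)
print(f"Computational complexity: {macs}")
print(f"Number of parameters:     {params}")
```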
max2022/keras-surgeon
Pruning and other network surgery for trained Keras models.
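A minimal sketch of what such surgery looks like, assuming keras-surgeon's documented `delete_channels` operation; the model, layer name, and channel indices are illustrative, and depending on your setup the import may need to target standalone `keras` rather than `tensorflow.keras`:

```python
# Sketch: delete three output channels from a convolutional layer and let
# the surgeon rebuild every downstream layer to match the new shapes.
from tensorflow.keras.applications import VGG16
from kerassurgeon.operations import delete_channels

model = VGG16(weights=None)               # any trained Keras model works
layer = model.get_layer("block1_conv1")   # layer to operate on
pruned = delete_channels(model, layer, channels=[0, 4, 7])
pruned.summary()                          # downstream shapes are updated
```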
max2022/Keras_FLOP_Estimator
A function for estimating the floating-point operations (FLOPs) of deep learning models built with Keras.
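For intuition about what such an estimator adds up, here is the standard closed-form count for a single Conv2D layer (a back-of-the-envelope sketch, not this repository's code):

```python
def conv2d_flops(h_out, w_out, k_h, k_w, c_in, c_out):
    """Each of the h_out*w_out*c_out output elements needs k_h*k_w*c_in
    multiply-accumulates; counting a MAC as 2 FLOPs gives the total."""
    return 2 * h_out * w_out * c_out * k_h * k_w * c_in

# Example: a 3x3 conv mapping 64 -> 128 channels on a 56x56 feature map
print(conv2d_flops(56, 56, 3, 3, 64, 128))  # 462,422,016 ~= 0.46 GFLOPs
```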
max2022/knowledge-distillation-papers
Knowledge distillation papers
max2022/Knowledge_distillation_via_TF2.0
Code for recent knowledge distillation algorithms, with benchmark results, implemented via the TF2.0 low-level API
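For reference, the classic Hinton-style objective that most of these algorithms start from fits in a few lines of low-level TF2; a generic sketch, not this repository's exact code:

```python
import tensorflow as tf

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target cross-entropy at temperature T (equal to the KL divergence
    up to the constant teacher entropy), blended with the hard-label loss."""
    soft_targets = tf.nn.softmax(teacher_logits / T)
    log_probs = tf.nn.log_softmax(student_logits / T)
    soft_loss = -tf.reduce_mean(tf.reduce_sum(soft_targets * log_probs, axis=-1))
    soft_loss *= T ** 2  # keep gradient scale comparable across temperatures
    hard_loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=student_logits))
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```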
max2022/Lottery-Ticket-Hypothesis-in-Pytorch
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model/dataset.
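The paper's core loop is short enough to sketch with `torch.nn.utils.prune`; here `train_fn` (a user-supplied training loop) and `init_state` (a `state_dict` snapshot taken at initialization) are assumed helpers, not part of this repository:

```python
import torch
import torch.nn.utils.prune as prune

def imp_round(model, init_state, train_fn, amount=0.2):
    """One round of iterative magnitude pruning with weight rewinding:
    train, prune the smallest 20% of remaining weights per layer, then
    reset the surviving weights to their values at initialization."""
    train_fn(model)  # assumed user-supplied training loop
    for module in model.modules():
        if isinstance(module, (torch.nn.Linear, torch.nn.Conv2d)):
            prune.l1_unstructured(module, name="weight", amount=amount)
    # Rewind: pruning reparameterizes `weight` as weight_orig * weight_mask,
    # so restoring weight_orig rewinds survivors while pruned zeros stay fixed.
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name.endswith("weight_orig"):
                param.copy_(init_state[name.replace("weight_orig", "weight")])
    return model
```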
max2022/mae-scalable-vision-learners
A TensorFlow 2.x implementation of "Masked Autoencoders Are Scalable Vision Learners"
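The distinctive step in MAE is random patch masking before the encoder; a minimal TF2 sketch of that step, following the paper's shuffle/unshuffle trick rather than this repository's exact code:

```python
import tensorflow as tf

def random_masking(patches, mask_ratio=0.75):
    """patches: (batch, num_patches, dim) with a static num_patches.
    Keeps a random 25% of patches for the encoder and returns a binary
    mask (0 = visible, 1 = masked) in the original patch order."""
    b = tf.shape(patches)[0]
    n = patches.shape[1]                           # assumed static
    len_keep = int(n * (1 - mask_ratio))
    noise = tf.random.uniform([b, n])              # one random score per patch
    ids_shuffle = tf.argsort(noise, axis=1)        # random permutation per image
    ids_restore = tf.argsort(ids_shuffle, axis=1)  # its inverse
    ids_keep = ids_shuffle[:, :len_keep]
    visible = tf.gather(patches, ids_keep, axis=1, batch_dims=1)
    # Build the mask in shuffled order (kept patches first), then unshuffle.
    mask = tf.concat([tf.zeros([b, len_keep]), tf.ones([b, n - len_keep])], axis=1)
    mask = tf.gather(mask, ids_restore, axis=1, batch_dims=1)
    return visible, mask, ids_restore
```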
max2022/mdistiller
The official implementation of [CVPR2022] Decoupled Knowledge Distillation https://arxiv.org/abs/2203.08679 and [ICCV2023] DOT: A Distillation-Oriented Trainer https://openaccess.thecvf.com/content/ICCV2023/papers/Zhao_DOT_A_Distillation-Oriented_Trainer_ICCV_2023_paper.pdf
max2022/model-optimization
A toolkit for optimizing ML models for deployment with Keras and TensorFlow, including quantization and pruning.
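A minimal sketch of the toolkit's magnitude-pruning path (the model and schedule values are illustrative):

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

base = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dense(10),
])
# Wrap the model so low-magnitude weights are zeroed out during training,
# ramping sparsity from 0% to 50% over the first 1000 steps.
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    base,
    pruning_schedule=tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0, final_sparsity=0.5, begin_step=0, end_step=1000))
pruned.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"])
# Training must include the callback that advances the pruning schedule:
callbacks = [tfmot.sparsity.keras.UpdatePruningStep()]
```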
max2022/pyclustering
pyclustering is a Python/C++ data mining library.
max2022/pytorch-cifar100
Practice on CIFAR-100 (ResNet, DenseNet, VGG, GoogLeNet, InceptionV3, InceptionV4, Inception-ResNetV2, Xception, ResNet in ResNet, ResNeXt, ShuffleNet, ShuffleNetV2, MobileNet, MobileNetV2, SqueezeNet, NASNet, Residual Attention Network, SENet, WideResNet)
max2022/pytorch_resnet_cifar10
Proper implementation of ResNets for CIFAR-10/100 in PyTorch that matches the description in the original paper.
max2022/resnet-18-autoencoder
max2022/rpi-power-monitor
Power Monitor (for Raspberry Pi)
max2022/sage-smoke-detection
max2022/SpinalNet
SpinalNet: Deep Neural Network with Gradual Input
max2022/tensorflow-deep-learning
All course materials for the Zero to Mastery Deep Learning with TensorFlow course.
max2022/Tiny-ImageNet
Image classification on Tiny ImageNet
max2022/torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 20 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
max2022/TrojanNN
Trojan attack on neural networks
max2022/um34c
A small Node.js tool to read out and control the UM34C (or UM24C/UM25C) USB analyzer via Bluetooth
max2022/X-AI