early-exit
There are 13 repositories under the early-exit topic.
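Most of the repositories below apply the same core idea: attach auxiliary classifier heads to intermediate layers and stop inference at the first head that is confident enough. As a minimal, hedged sketch of that confidence-threshold mechanism (not the implementation of any specific repository listed here; the toy model, weights, and `early_exit_predict` helper are illustrative assumptions):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def early_exit_predict(x, backbone, heads, threshold=0.9):
    """Run backbone layers one at a time; after each, evaluate that
    layer's exit head. Return (predicted_class, exit_index) at the
    first head whose top softmax probability reaches `threshold`;
    otherwise fall through to the final head."""
    h = x
    for i, (layer, head) in enumerate(zip(backbone, heads)):
        h = np.tanh(layer @ h)      # intermediate representation
        p = softmax(head @ h)       # this exit's class probabilities
        if p.max() >= threshold or i == len(backbone) - 1:
            return int(p.argmax()), i

# Toy 2-layer "network" with 2 exits; weights are random placeholders.
rng = np.random.default_rng(0)
backbone = [rng.standard_normal((4, 4)) for _ in range(2)]
heads = [rng.standard_normal((3, 4)) for _ in range(2)]
x = rng.standard_normal(4)
cls, exited_at = early_exit_predict(x, backbone, heads, threshold=0.5)
```

Lowering `threshold` trades accuracy for latency: easy inputs leave at the first head, while hard inputs pay for the full network, which is the latency/throughput tension several of the repositories below (e.g. Apparate, LayerSkip) study in depth.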
IntelLabs/distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
facebookresearch/LayerSkip
Code for "LayerSkip: Enabling Early Exit Inference and Self-Speculative Decoding", ACL 2024
falcon-xu/early-exit-papers
A curated list of papers on early exiting (LLM, CV, NLP, etc.)
dywsjtu/apparate
Artifact for "Apparate: Rethinking Early Exits to Tame Latency-Throughput Tensions in ML Serving" [SOSP '24]
falcon-xu/LGViT
Official PyTorch implementation of "LGViT: Dynamic Early Exiting for Accelerating Vision Transformer" (ACM MM 2023)
fangvv/EdgeKE
Code for paper "EdgeKE: An On-Demand Deep Learning IoT System for Cognitive Big Data on Industrial Edge Devices"
Shikha-code36/early-exit-cnn
A deep learning framework implementing early-exit strategies in Convolutional Neural Networks (CNNs) using Deep Q-Learning (DQN). A DQN agent dynamically selects the optimal exit point for each input, improving computational efficiency for image classification on CIFAR-10.
hpclab/learning-exit-strategies-ensembles
Official repository of Busolin et al., "Learning Early Exit Strategies for Additive Ranking Ensembles", ACM SIGIR 2021.
chbtt/sha1-cracker
C implementation of a SHA-1 cracker with various optimizations
giulio-derasmo/Experimenting-with-modularity-in-deep-learning
This project experiments with modularity in deep learning by implementing an early-exit model and testing it using TensorFlow.
ywuwuwu/Early-Exit-Papers
A self-study collection of early-exit papers, with relevant code and documentation.
Sottix99/Leaf-Disease-Classification
This project focuses on the automatic classification of corn leaf diseases using deep neural networks. The dataset contains over 4,000 images in four classes: Common Rust, Gray Leaf Spot, Blight, and Healthy. Using convolutional neural networks and related techniques, the model achieves a classification accuracy of 91.5%.