
Pruning Neural Networks in Tensorflow 2 ✂️🕸


KayPruning

Making Neural Networks smaller and faster!

Getting started

It is recommended to use a virtual environment (conda).

Using conda: conda env create --name envname --file=env.yml

Using pip: pip install -r requirements.txt

Basic usage:

Check the forai.ipynb notebook.

Simple training & pruning

# Build a DataBunch for MNIST and the baseline model
db = DataBunch('mnist')
model = get_model('BaseModel')

# Train for one epoch and log the metrics
glogger.info('Training')
trainer = Trainer(model=model, db=db, epochs=1)
trainer.run()
glogger.info(trainer.metrics)

# Prune the trained model and log the resulting metrics
glogger.info('Pruning')
trainer.run_pruning()
glogger.info(trainer.print_metrics())
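
For reference, here is a minimal, self-contained sketch of what magnitude-based weight pruning boils down to in plain TF2/Keras. It is independent of this repo's Trainer/DataBunch API; the prune_layer_weights helper and the toy model below are illustrative assumptions, not part of the project:

import numpy as np
import tensorflow as tf

def prune_layer_weights(model, sparsity=0.5):
    """Zero out the smallest-magnitude kernel weights of Dense/Conv2D layers."""
    for layer in model.layers:
        if not isinstance(layer, (tf.keras.layers.Dense, tf.keras.layers.Conv2D)):
            continue
        kernel, *rest = layer.get_weights()
        # Magnitude below which the requested fraction of weights falls
        threshold = np.quantile(np.abs(kernel), sparsity)
        mask = (np.abs(kernel) >= threshold).astype(kernel.dtype)
        layer.set_weights([kernel * mask, *rest])
    return model

# Toy MNIST classifier: train it first, then zero out 50% of its weights
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
# ... model.fit(...) on MNIST ...
model = prune_layer_weights(model, sparsity=0.5)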

Hyper-parameters & configurations:

The configurations and hyper-parameters live in the configs package, where you can change and adjust them.
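
For illustration only, a configuration entry might look something like the following; the actual keys and values in the configs package may differ:

# Hypothetical example of the kind of settings kept in the configs package;
# the real names and values in this repo may differ.
TRAIN_CONFIG = {
    'dataset': 'mnist',
    'model': 'BaseModel',
    'epochs': 1,
    'batch_size': 128,
    'learning_rate': 1e-3,
    'pruning': {
        'method': 'magnitude',      # assumed pruning strategy
        'target_sparsity': 0.5,     # fraction of weights to remove
    },
}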

Resources:

What did I do and learn? 📚👨🏻‍💻

  • This was my first time learning about pruning neural networks!
  • In this project I decided to use TensorFlow 2.0 so I could learn it on an actual project.
  • I tried to follow TensorFlow best practices and write modular code that can be extended to support more types of pruning and more models.
  • I followed some of the FOR.ai/rl library's design ;)
  • I tried to implement everything in a customizable way and also used the Keras APIs to integrate it with the project.
  • I read and watched some useful resources that helped me during the project.
  • I really enjoyed working on this project! 😄 👌

Contributors:


Amr M. Kayid