# NAS

B.Tech Thesis/Project


Expanded Neural Architecture Search

We are following [1] as the baseline paper.
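The baseline of [1] searches architectures by hill climbing: repeatedly morph the incumbent network into several children, evaluate each briefly, and keep the best. A minimal, framework-agnostic sketch of that loop (the toy `morphisms` and `evaluate` below are hypothetical stand-ins for real network morphisms and short training runs, not the project's actual API):

```python
import random

def hill_climb(parent, morphisms, evaluate, steps=5, n_children=4, seed=0):
    """Greedy architecture search: each step applies random morphisms to
    the incumbent, scores every child, and keeps the best seen so far."""
    rng = random.Random(seed)
    best, best_score = parent, evaluate(parent)
    for _ in range(steps):
        children = [rng.choice(morphisms)(best) for _ in range(n_children)]
        for child in children:
            score = evaluate(child)
            if score > best_score:  # higher score = better model
                best, best_score = child, score
    return best, best_score

# Toy stand-ins: "architectures" are integers, the optimum is at 3.
morphisms = [lambda a: a + 1, lambda a: a - 1]
evaluate = lambda a: -abs(a - 3)
best, score = hill_climb(0, morphisms, evaluate)
```

In the real search, `evaluate` would train the morphed child for a few epochs and return validation accuracy, so that function-preserving morphisms start from the parent's performance rather than from scratch.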

# Expected Contributions

  • Learning-curve prediction.
  • Addition of a Maxpool node.
  • Addition of a Conv layer without padding.
  • Critique of the morphism scheme of [1].
  • Exploring other methods that may alter image dimensions.
  • Improving the EER (equal error rate).
  • Exploring activation functions.
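Several of the activation functions under consideration have simple closed forms. A NumPy sketch of a few of them (the AriA/AriA2 variants are omitted here); `beta` in Swish corresponds to the tunable "Swish_Beta" variant listed in the tasks below:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x). beta = 1 gives the SiLU;
    # a learnable beta is the "Swish_Beta" variant.
    return x * sigmoid(beta * x)

def leaky_relu(x, alpha=0.01):
    # Small negative slope instead of a hard zero for x < 0.
    return np.where(x >= 0, x, alpha * x)

def relu6(x):
    # ReLU capped at 6, as used in mobile-oriented architectures.
    return np.clip(x, 0.0, 6.0)
```

In the project, candidates like these would be exposed as parameters of the search graph so the hill-climbing loop can select among them.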

# Tasks

  • Plot the graph and the torchviz tensor graph.
  • Fix the function that finds two Conv nodes to connect.
  • Add padding as a parameter of the node functions (Conv, etc.).
  • Concept: add support for inserting a Maxpool node.
  • Modify compatCheck to obtain the @forward fn.
  • Run hill climbing.
  • Test the merge code.
  • Write merge.
  • Merge with Conv.
  • Implement Swish, AriA, AriA2, LeakyReLU, ReLU6, Swish_Beta.
  • Add Swish & Beta, AriA, etc. as parameters to NASGraph.
  • Add the final layers.
  • Run the training loop.
  • Dynamic final linear layers to accommodate changing convs.
  • Linear widen for the dynamic linear layers.
  • Find contribution sources, e.g. reference papers/implementations.
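An unpadded Conv or a Maxpool node shrinks the spatial size, which is why the final linear layers must be dynamic. The standard output-size formula can be chained across the layer stack to compute the `in_features` of the first linear layer (a sketch under assumed conventions; the `(kernel, stride, padding)` spec list is illustrative, not the project's actual node format):

```python
def out_size(size, kernel, stride=1, padding=0):
    # Standard formula for the conv/pool output extent along one dimension.
    return (size + 2 * padding - kernel) // stride + 1

def linear_in_features(image_size, layers, out_channels):
    """Chain out_size over (kernel, stride, padding) specs to find the
    flattened feature count entering the first linear layer."""
    size = image_size
    for kernel, stride, padding in layers:
        size = out_size(size, kernel, stride, padding)
    return out_channels * size * size

# Example: 32x32 input -> 3x3 conv (no padding) -> 2x2 maxpool (stride 2)
layers = [(3, 1, 0), (2, 2, 0)]
n = linear_in_features(32, layers, out_channels=16)  # 16 * 15 * 15 = 3600
```

Recomputing this after every morphism lets the linear head be rebuilt (or widened) to match the changed convolutional output.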

# References

[1] https://ieeexplore.ieee.org/abstract/document/8791709

If you use our code, please cite our paper: Verma, Mudit, Pradyumna Sinha, Karan Goyal, Apoorva Verma, and Seba Susan. "A Novel Framework for Neural Architecture Search in the Hill Climbing Domain." In 2019 IEEE Second International Conference on Artificial Intelligence and Knowledge Engineering (AIKE), pp. 1-8. IEEE, 2019.