
Graph

We are interested in the following questions:

  1. How do we implement traditional processing in a deep learning environment?
  2. How do we maintain >16-bit precision?
  3. How do we constrain the network to find exact, SAT-like solutions? (See the sketch after this list.)
  4. How do we map the minimal network to hardware such as ALUs, LUTs, and memories?
  5. How do we create a programming environment where the program/code/hardware is defined by test-vector generators?
  6. How do we automatically reduce a network to optimally fit existing hardware such as FPGAs?
  7. What are the qualitative differences between NN-based CPUs and traditional ALUs?
  8. What error rate can be achieved when exact solutions exist?
  9. How do we discover logical graph structure and merge it into an NN?
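
A minimal sketch of questions 3, 5, and 8 (framework choice, names, and network sizes are illustrative assumptions, not this repo's actual code): the test-vector generator is the whole specification, a tiny network is trained on it, and the weights are then snapped to integers to test whether an exact, SAT-like solution was found. Rounding may or may not land on an exact solution for any given training run; the final check reports it honestly.

```python
# Sketch: learn XOR from an exhaustive test-vector generator, then round the
# weights to integers and verify whether the network is still exact.
import torch
import torch.nn as nn

def xor_vectors():
    """Test-vector generator: the full truth table defines the 'program'."""
    x = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = torch.tensor([[0.], [1.], [1.], [0.]])
    return x, y

net = nn.Sequential(nn.Linear(2, 2), nn.ReLU(), nn.Linear(2, 1))
opt = torch.optim.Adam(net.parameters(), lr=0.05)
x, y = xor_vectors()

for _ in range(2000):
    opt.zero_grad()
    nn.functional.mse_loss(net(x), y).backward()
    opt.step()

# Snap weights and biases to the nearest integer: if an exact solution exists
# near the trained one, the rounded network becomes a piece of exact logic.
with torch.no_grad():
    for p in net.parameters():
        p.copy_(torch.round(p))
    exact = torch.all(torch.round(net(x)) == y)
print("exact after rounding:", bool(exact))
```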

NNs + SAT as a CAD tool for 1? kbits on FPGAs:

  1. Bit-level routing
  2. HiFi computation
  3. Code compilation

Demonstrations

  1. NN ALU (a toy sketch follows this list)
  2. NN memory
  3. NN CPU with program
  4. Live video demo on Zynq FPGA
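
A toy version of the NN ALU demonstration (all names and sizes here are assumptions for illustration): a small network learns a 1-bit ALU from exhaustive test vectors, with a one-hot opcode selecting AND, OR, or XOR. Because an exact solution exists, training should drive the error rate on the 12 test vectors to zero, and an exact network can in principle be mapped onto FPGA LUTs.

```python
# Sketch: train a 1-bit NN ALU (AND/OR/XOR selected by a one-hot opcode)
# from an exhaustive test-vector generator, then check exactness.
import itertools
import torch
import torch.nn as nn

OPS = [lambda a, b: a & b, lambda a, b: a | b, lambda a, b: a ^ b]

def alu_vectors():
    """Generate all (opcode, a, b) -> result vectors: 3 ops x 4 input pairs."""
    xs, ys = [], []
    for op, a, b in itertools.product(range(len(OPS)), (0, 1), (0, 1)):
        onehot = [1. if i == op else 0. for i in range(len(OPS))]
        xs.append(onehot + [float(a), float(b)])
        ys.append([float(OPS[op](a, b))])
    return torch.tensor(xs), torch.tensor(ys)

net = nn.Sequential(nn.Linear(5, 8), nn.ReLU(), nn.Linear(8, 1))
opt = torch.optim.Adam(net.parameters(), lr=0.02)
x, y = alu_vectors()

for _ in range(3000):
    opt.zero_grad()
    nn.functional.mse_loss(net(x), y).backward()
    opt.step()

# Threshold the output at 0.5 and compare against every test vector.
exact = torch.all((net(x) > 0.5).float() == y)
print("ALU exact on all 12 test vectors:", bool(exact))
```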