lightweight tensor library that contains its own auto-diff engine, like PyTorch

axgrad

My attempt to make something like TinyGrad or PyTorch: a framework like PyTorch & MicroGrad written fully in Python (I will add the C & C++ components for faster implementations, though). It's supposed to be a good and lightweight C- and Python-based deep learning framework, which it's not, as of now (still building).

Overview

It contains a NumPy-like framework that supports basic matrix operations: element-wise add/mul, matrix multiplication, and broadcasting. There's also a PyTorch-like auto-differentiation engine being built: axgrad (work in progress!)
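
For instance, here's a rough sketch of the NumPy-style ops. The constructor and operator support shown here are assumptions about the API, not a guaranteed interface; see axon.doc for the real one:

import axgrad

# hypothetical API sketch: tensor construction & basic ops, NumPy-style
a = axgrad.tensor([[1, 2], [3, 4]])
b = axgrad.tensor([[5, 6], [7, 8]])

c = a + b   # element-wise add
d = a * b   # element-wise mul
e = a @ b   # matrix multiplication (matmul)
f = a + axgrad.tensor([10, 20])   # broadcasting a row vector over each row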

Features

It has basic building blocks required to build a neural network:

  1. A basic tensor ops framework that can easily do matrix add/mul (element-wise), transpose, broadcasting, matmul, etc.
  2. A gradient engine that can compute and update gradients automatically, much like micrograd, but at the tensor level ~ autograd-like (work in progress!); see the sketch after this list.
  3. Optimizer & loss computation blocks to compute and optimize (work in progress!). I'll be adding more things in the future...
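
To make item 2 concrete, here's a minimal sketch of what a tensor-level autograd pass looks like. The names and flags below are illustrative; the engine's actual API may differ while it's in progress:

import axgrad

# hypothetical: tensors that track gradients, micrograd-style but tensor-level
x = axgrad.tensor([[1.0, 2.0], [3.0, 4.0]], requires_grad=True)
w = axgrad.tensor([[0.5], [0.5]], requires_grad=True)

y = x @ w         # forward pass records the computation graph
loss = y.sum()    # reduce to a scalar
loss.backward()   # walk the graph backwards, populating .grad

print(w.grad)     # d(loss)/d(w), computed automatically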

Usage

This shows basic usage of axgrad.engine & a few of axon's modules to perform tensor operations and build a sample neural network.

Anyway, refer to the documentation for a detailed usage guide:

  1. axon.doc: for NumPy-like usage
  2. axgrad.doc: for building a neural network from the axon library (incomplete for now)

Creating an MLP

To create a multi-layer perceptron in axgrad, you just need to follow the same steps as in PyTorch. Very basic: initialize two linear layers & a basic activation layer.

import axgrad
import axgrad.nn as nn

class MLP(nn.Module):
  def __init__(self, _in, _hid, _out, bias=False) -> None:
    super().__init__()
    self.layer1 = nn.Linear(_in, _hid, bias)   # input -> hidden projection
    self.gelu = nn.GELU()                      # non-linear activation
    self.layer2 = nn.Linear(_hid, _out, bias)  # hidden -> output projection

  def forward(self, x):
    out = self.layer1(x)
    out = self.gelu(out)
    out = self.layer2(out)
    return out

Refer to this Example for detailed info on making an MLP.

BTW, here are the outputs I got from my implementation, which ran for 6k iterations: implemented results
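
For context, the training loop behind those numbers would look roughly like this. The optimizer, loss, and data names below are placeholders/assumptions, not confirmed axgrad APIs; see the linked example for the actual code:

# hypothetical training loop sketch; nn.SGD / nn.MSELoss are assumed names
model = MLP(_in=2, _hid=16, _out=1)
optimizer = nn.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for step in range(6000):             # ~6k iterations
  pred = model.forward(x_train)      # x_train/y_train: your training tensors
  loss = loss_fn(pred, y_train)
  optimizer.zero_grad()              # clear old gradients
  loss.backward()                    # backprop through the graph
  optimizer.step()                   # update weights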

Contribution

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change. Please make sure to update tests as appropriate. Keep in mind it's still a work in progress.

License

None!