attention-golang

A simple attention mechanism implemented in Go.

Transformer Inference in Go from Scratch

Overview

I've built Transformer inference from scratch in Go, without relying on external dependencies. The inspiration? Well, let's just say minGPT had a hand in it 😉. Heads up: there may be a few bugs, since this was a weekend project without too much attention (no pun intended) to detail.

Features

  • Train a Transformer model in PyTorch on the Iris dataset.
  • Export the model weights.
  • Perform inference in Go without external dependencies.

Requirements

To run this project, you'll need:

  • Python 3.x
  • PyTorch and Scikit-learn
  • Go compiler

Installation

  1. Clone this repository:

    git clone https://github.com/your_username/transformer-inference-go.git
  2. Install the Python dependencies (only PyTorch and Scikit-learn):

    pip install -r requirements.txt

Usage

Training the Transformer Model

  1. Navigate to the project's main directory.

  2. Train the Transformer model using PyTorch. Modify the training script (transformer.py) to suit your dataset:

    python transformer.py

The model weights will be written to the weights.bin file.

Inference in Go

  1. Run the Go file:

    go run simple_attn.go

It will read the dataset, run predictions on the samples, and report the accuracy.
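The accuracy step boils down to taking an argmax over each sample's output logits and comparing it with the true label. A minimal sketch of that bookkeeping (names are illustrative, not the repo's actual functions):

```go
package main

import "fmt"

// argmax returns the index of the largest logit.
func argmax(logits []float64) int {
	best := 0
	for i, v := range logits {
		if v > logits[best] {
			best = i
		}
	}
	return best
}

// accuracy compares predicted class indices against true labels
// and returns the fraction that match.
func accuracy(logits [][]float64, labels []int) float64 {
	correct := 0
	for i, row := range logits {
		if argmax(row) == labels[i] {
			correct++
		}
	}
	return float64(correct) / float64(len(labels))
}

func main() {
	// Three samples, three Iris classes; the last prediction is wrong.
	logits := [][]float64{{2.0, 0.1, 0.3}, {0.2, 1.5, 0.1}, {0.9, 0.2, 0.4}}
	labels := []int{0, 1, 2}
	fmt.Println(accuracy(logits, labels)) // two of three correct
}
```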