# MLX_Models

Models and experiments from MLX Institute: a handwritten Transformer architecture in PyTorch and a CNN autoencoder with PCA.


## Architectures

My handwritten models and experiments from the intensive ML program at MLX Institute.

## Transformer Architecture: Intensive Week 4 at MLX Institute

A GPT for generating new stories: a Transformer architecture built around multi-head self-attention, trained on the TinyStories dataset with experiment tracking in Weights & Biases.
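To illustrate the core building block, here is a minimal sketch of a GPT-style Transformer block with multi-head self-attention in PyTorch. This is not the repo's code; the module names and dimensions are illustrative, and it uses `nn.MultiheadAttention` rather than a from-scratch attention implementation.

```python
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    """One GPT-style block: multi-head self-attention + feed-forward, with residuals."""
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.ln1 = nn.LayerNorm(d_model)
        self.ln2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # Causal mask: each token may only attend to itself and earlier positions
        T = x.size(1)
        mask = torch.triu(torch.ones(T, T, dtype=torch.bool), diagonal=1)
        h = self.ln1(x)
        a, _ = self.attn(h, h, h, attn_mask=mask)
        x = x + a                      # residual around attention
        x = x + self.ff(self.ln2(x))   # residual around feed-forward
        return x

block = TransformerBlock()
tokens = torch.randn(2, 10, 64)  # (batch, sequence, embedding)
out = block(tokens)
print(out.shape)  # torch.Size([2, 10, 64])
```

A full GPT stacks several such blocks between a token/position embedding layer and a linear head over the vocabulary.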

## CNNs and Autoencoders: Intensive Week 2 at MLX Institute

A CNN autoencoder script for visualising similarity between images from the MLX Institute TinyWorld simulation, including interactive 3D plots of Principal Component Analysis (PCA) to reveal clusters of similar encodings.
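The projection step can be sketched with plain NumPy: centre the autoencoder's bottleneck encodings and project them onto their top three principal components via SVD. This is an illustrative sketch, not the repo's script; the encoding dimensions are made up.

```python
import numpy as np

def pca_project(encodings, n_components=3):
    """Project high-dimensional encodings onto their top principal components."""
    centred = encodings - encodings.mean(axis=0)
    # Rows of Vt are the principal directions, ordered by explained variance
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    return centred @ Vt[:n_components].T

rng = np.random.default_rng(0)
encodings = rng.normal(size=(100, 32))  # e.g. 100 images, 32-dim bottleneck
points3d = pca_project(encodings)
print(points3d.shape)  # (100, 3)
```

The resulting `(n_images, 3)` array is what gets fed to an interactive 3D scatter plot, so nearby points correspond to images the autoencoder encodes similarly.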

## CNNs and MLP in NumPy: Module 1 Project 1 at MLX Institute

Simple implementations of a multi-layer perceptron using only NumPy, extended to convolutional neural networks.
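As a flavour of the NumPy-only approach, here is a minimal sketch of a two-layer perceptron forward pass. The layer sizes are illustrative assumptions, not the project's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer perceptron: linear -> ReLU -> linear (sizes are illustrative)
W1 = rng.normal(scale=0.1, size=(4, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1))
b2 = np.zeros(1)

def forward(x):
    h = np.maximum(x @ W1 + b1, 0.0)  # hidden layer with ReLU
    return h @ W2 + b2                # linear output layer

x = rng.normal(size=(5, 4))  # batch of 5 inputs with 4 features
y = forward(x)
print(y.shape)  # (5, 1)
```

Training then amounts to hand-deriving the gradients of a loss with respect to `W1`, `b1`, `W2`, `b2` and applying gradient descent, all in NumPy.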

The PCA and Transformer files can be run as Jupyter notebooks in VS Code with the Jupyter extension.