by Qin Yu, Dec 2018
(all code tested on a 24-core machine)
All Machine Learning from Scratch Examples:
- Linear Regression & Kernel-Ridge Regression (KRR) on Boston Housing Dataset
- Classification (Kernel Perceptron) on MNIST Handwritten Digits (#Multiprocessing)
- Manifesto
- Overview of Examples
- Algorithms and Tricks
- About The MNIST Database of Handwritten Digits (external link)
Julia 1.0 was released during JuliaCon in August 2018, two months before I started to look into machine learning. The last time, when I was working on regression, I had fun playing with Julia, felt its power, and decided to use it again for classifying MNIST handwritten digits. This example makes use of multiprocessing in Julia, so it would be great to run my code on a machine with more than 20 cores. A 12-core desktop like mine will also do: just come up with some other way of splitting the tasks, and we have an alternative parallelism.
- Basics:
- 2-class Kernel Perceptron
- Multi-class Kernel Perceptron (one-vs-rest)
- Choice of Epochs & Parameters
- Multi-class Kernel Perceptron (one-vs-one)
- Formal Examples:
- Multi-class Kernel Perceptron (one-vs-rest) with 5-fold Cross-validation
- Confusion Matrix
- Digits that are Hardest to Predict
- Multi-class Kernel Perceptron (one-vs-one) with 5-fold Cross-validation
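Before diving in, here is a minimal sketch of the multiprocessing idea mentioned above: the ten one-vs-rest binary classifiers are independent of each other, so each can be trained in its own worker process with `pmap` from Julia's `Distributed` standard library. The function `train_one_vs_rest` is a hypothetical placeholder for the real kernel-perceptron training routine, and the worker count is just an illustration to be adjusted to your core count.

```julia
using Distributed

addprocs(4)  # spawn 4 worker processes; set this to match your machine's cores

# @everywhere makes the function available on every worker process
@everywhere function train_one_vs_rest(digit)
    # placeholder for training the binary "digit vs rest" kernel perceptron;
    # here it simply returns the digit so the task distribution is visible
    return digit
end

# one independent binary classifier per digit 0..9, trained in parallel
results = pmap(train_one_vs_rest, 0:9)
```

On a machine with 10 or more free cores, all ten classifiers train at once; with fewer cores, `pmap` simply queues the remaining tasks, which is the "alternative parallelism" a 12-core desktop falls back on.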