/nograd

Gradient descent is cool and all, but what if we could delete it?


Learning to play perfect tic-tac-toe without gradient descent

Most of the world's learning is just DNA replication + mutation. Less than 1% of all biomass belongs to organisms that have any neurons at all, which means over 99% of life learns through DNA replication + mutation alone. Yet no modern ML technique looks anything like this. I think that should change.

This repo produces a perfect tic-tac-toe player in under 200 lines of code using DNA-like learning. There is no optimizer, no gradients, and no loss function. The result is more robust, conceptually simpler, and, I think, far more beautiful than conventional ML approaches to tic-tac-toe.
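To make the idea concrete, here is a minimal sketch of DNA-like learning (not the repo's actual code): a (1+1) evolutionary loop where the "genome" is a lookup table from board states to moves, a mutation rewrites one entry, and a child replaces its parent only if it scores at least as well against a random opponent. All names (`evolve`, `mutate`, `fitness`) are illustrative, not taken from the repo.

```python
import random

WINS = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
        (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]

def winner(b):
    for i, j, k in WINS:
        if b[i] != ' ' and b[i] == b[j] == b[k]:
            return b[i]
    return None

def legal(b):
    return [i for i, c in enumerate(b) if c == ' ']

def play(policy, rng):
    """One game: the policy plays X, a random player plays O.
    Returns 1.0 for a win, 0.5 for a draw, 0.0 for a loss."""
    b = [' '] * 9
    for turn in range(9):
        moves = legal(b)
        if turn % 2 == 0:
            m = policy.get(tuple(b))       # genome lookup
            if m not in moves:
                m = rng.choice(moves)      # unseen state: act randomly
        else:
            m = rng.choice(moves)
        b[m] = 'X' if turn % 2 == 0 else 'O'
        w = winner(b)
        if w == 'X':
            return 1.0
        if w == 'O':
            return 0.0
    return 0.5

def fitness(policy, rng, games=200):
    """Average score over many games vs. a random opponent."""
    return sum(play(policy, rng) for _ in range(games)) / games

def mutate(policy, rng):
    """Copy the genome and overwrite the move at one reachable
    X-to-move state, found by simulating a random game."""
    child = dict(policy)
    b, states = [' '] * 9, []
    for turn in range(9):
        moves = legal(b)
        if turn % 2 == 0:
            states.append((tuple(b), moves))
        b[rng.choice(moves)] = 'X' if turn % 2 == 0 else 'O'
        if winner(b):
            break
    state, moves = rng.choice(states)
    child[state] = rng.choice(moves)
    return child

def evolve(generations=300, seed=0):
    """(1+1) evolution: keep the child only if it is at least as fit."""
    rng = random.Random(seed)
    parent = {}
    best = fitness(parent, rng)
    for _ in range(generations):
        child = mutate(parent, rng)
        score = fitness(child, rng)
        if score >= best:
            parent, best = child, score
    return parent, best
```

Because a child is only accepted when its fitness matches or beats the parent's, the score never decreases over generations. The actual repo may differ in genome encoding, mutation operator, and evaluation, but the replicate-mutate-select skeleton is the whole learning algorithm: no optimizer, no gradients, no loss.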