A "gradient-descent"-like optimization technique that follows the smallest nearby error
Update: This technique actually uses the same idea as steepest-descent hill climbing!
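The idea described above can be sketched as a greedy local search: at each step, evaluate nearby candidate points and move to whichever has the smallest error. This is a minimal illustrative sketch, not the repository's actual implementation; the function names, step size, and 1-D setup are assumptions.

```python
def descend(error, x0, step=0.1, iters=100):
    """Greedy descent sketch: repeatedly evaluate nearby points and
    move to the one with the smallest error (hypothetical example)."""
    x = x0
    for _ in range(iters):
        # Candidate points: stay, step left, or step right
        candidates = [x - step, x, x + step]
        # Follow the smallest nearby error
        x = min(candidates, key=error)
    return x

# Example: minimize (x - 3)^2 starting from x = 0
best = descend(lambda x: (x - 3) ** 2, x0=0.0)
```

Because the update is greedy and local, this is the same idea as steepest-descent hill climbing: no gradient is computed, only nearby error values are compared.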