vadixidav/mli

Create abstraction for optimizers


Currently the NAG optimizer is written directly into the demo application deep-train. While it is only a handful of lines, it clutters the code and is easily abstracted. A trait should be created that encapsulates the minimum behavior required of an optimizer, and NAG should then become the first implementation, since it has been found to generalize well in training and testing while also converging relatively quickly when the learning rate is tuned appropriately. More optimizers, such as Adam, will be added later; each optimizer usually trades off generality, fiddling, and training speed. Adam is a perfect example of an optimizer with little fiddling and fast training, so it will likely be the second one added.
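A minimal sketch of what the trait could look like, assuming gradients and deltas are plain `f32` slices/vectors; the names `Optimizer`, `step`, and `Nag` are hypothetical here, not existing mli API. The NAG update follows the common reformulation where the look-ahead is folded into the returned delta:

```rust
/// Minimum behavior required of an optimizer: given the gradient of the loss
/// with respect to the weights, produce the delta to add to the weights.
pub trait Optimizer {
    fn step(&mut self, gradient: &[f32]) -> Vec<f32>;
}

/// Nesterov Accelerated Gradient: momentum that "looks ahead" along the
/// velocity before committing to an update.
pub struct Nag {
    learning_rate: f32,
    momentum: f32,
    velocity: Vec<f32>,
}

impl Nag {
    pub fn new(learning_rate: f32, momentum: f32, len: usize) -> Self {
        Nag {
            learning_rate,
            momentum,
            velocity: vec![0.0; len],
        }
    }
}

impl Optimizer for Nag {
    fn step(&mut self, gradient: &[f32]) -> Vec<f32> {
        // Copy hyperparameters out so the closure doesn't borrow `self`
        // while `velocity` is mutably borrowed.
        let momentum = self.momentum;
        let learning_rate = self.learning_rate;
        self.velocity
            .iter_mut()
            .zip(gradient)
            .map(|(v, &g)| {
                let prev = *v;
                // Standard momentum update on the velocity.
                *v = momentum * prev - learning_rate * g;
                // Nesterov correction: the delta uses the look-ahead velocity.
                -momentum * prev + (1.0 + momentum) * *v
            })
            .collect()
    }
}

fn main() {
    // Hypothetical usage: in deep-train the gradient would come from
    // backpropagation; here we just apply one step to dummy values.
    let mut opt = Nag::new(0.01, 0.9, 3);
    let delta = opt.step(&[0.5, -0.25, 1.0]);
    println!("weight deltas: {:?}", delta);
}
```

Returning the delta rather than mutating weights in place keeps the trait agnostic about how deep-train stores its parameters; under this sketch, Adam would just be another implementor carrying its own first- and second-moment state.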