accounting for deep learning initialization
nrakocz opened this issue · 4 comments
The random initialization of the Keras models can differ between trials, which sometimes has a large effect on results. I think setting a random seed is worth mentioning in the tutorial.
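For concreteness, here is a minimal sketch of the kind of seeding I mean, assuming a TensorFlow 2.x Keras setup; the `build_model` helper, layer sizes, and input shape are just placeholders:

```python
import random
import numpy as np
import tensorflow as tf

def build_model(lr, seed=0):
    # Fix the Python, NumPy, and TensorFlow seeds so that weight
    # initialization is the same across trials (assumption: TF 2.x Keras).
    random.seed(seed)
    np.random.seed(seed)
    tf.random.set_seed(seed)

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  loss='mse')
    return model
```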
Hi @rakoczUCLA!
You are absolutely right, that can have a large impact. Did you have a particular tutorial in mind? This one, https://parameter-sherpa.readthedocs.io/en/latest/algorithms/algorithms.html, would probably be a good place to mention the seed.
I would put it in the basic ones: "A guide to SHERPA" and "30 seconds from Keras to SHERPA".
Sounds good! We actually have an algorithm coming soon that will explicitly deal with this.
Update on this: there is now a Repeat algorithm:
sherpa/sherpa/algorithms/core.py, line 83 (commit 14bc6aa)
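For anyone landing here later, a rough usage sketch of wrapping a search algorithm in Repeat so each configuration is evaluated over several random initializations. The exact Repeat constructor arguments (`num_times`) and whether it is exported from `sherpa.algorithms` are assumptions on my part, so check the current docs; `build_model` is the seeded helper from the sketch above.

```python
import numpy as np
import sherpa
from sherpa.algorithms import RandomSearch, Repeat  # assumption: Repeat is exported here

# Dummy data just so the sketch runs end to end.
x, y = np.random.rand(200, 10), np.random.rand(200, 1)

parameters = [sherpa.Continuous(name='lr', range=[1e-4, 1e-2], scale='log')]

# Assumption: Repeat wraps an inner algorithm and re-suggests each
# configuration num_times times, so the objective can be averaged over
# several random initializations.
algorithm = Repeat(algorithm=RandomSearch(max_num_trials=10), num_times=3)

study = sherpa.Study(parameters=parameters, algorithm=algorithm,
                     lower_is_better=True, disable_dashboard=True)

for trial in study:
    model = build_model(lr=trial.parameters['lr'])  # seeded helper from above
    history = model.fit(x, y, validation_split=0.2, epochs=3, verbose=0)
    study.add_observation(trial, iteration=1,
                          objective=history.history['val_loss'][-1])
    study.finalize(trial)
```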