Same MSE for different crossValidate methods
Opened this issue · 2 comments
I switched from Python's scikit-learn NMF to your implementation because of the improved speed and convenient cross-validation. This also means I'm fairly new to R, so I'm sorry if my question is obvious.
I have a dataset with 500,000 features and 734 observations. I ran crossValidate() with the methods 'predict', 'robust', and 'impute' for 2:10 components, with ten repetitions each and the same random seed (everything else at default settings). The resulting MSEs were exactly identical across the three methods, which I believe shouldn't happen even with the same random seed. Any ideas why this might be? A rough sketch of what I ran is below.
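This is only a minimal sketch on a small random matrix rather than my real data, and I'm writing the argument names (k, reps, seed, method) from memory, so they may not match the actual crossValidate() interface exactly:

```r
# Minimal sketch of the run described above, on a small stand-in matrix.
# Argument names (k, reps, seed, method) and the result column names are
# assumptions; check ?crossValidate in the installed version for the exact API.
library(RcppML)  # assuming this is the package providing crossValidate()

set.seed(123)
A <- matrix(runif(1000 * 50), nrow = 1000)  # small stand-in for the real dataset

results <- lapply(c("predict", "robust", "impute"), function(m) {
  cv <- crossValidate(A, k = 2:10, reps = 10, seed = 123, method = m)
  cv$method <- m
  cv
})
results <- do.call(rbind, results)

# Compare per-method MSE curves (column name "value" is assumed), e.g.:
# aggregate(value ~ k + method, data = results, FUN = mean)
```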
Thank you for developing this amazing package!
I've encountered this problem too. Have you managed to solve it?
Sorry for the delay. A reproducible example would help; I can't reproduce this on my end. Thanks!