wandb/examples

The order of parameters in sweep config will influence the result

Closed this issue · 3 comments

I used the PyTorch sweep example https://colab.research.google.com/github/wandb/examples/blob/master/colabs/pytorch/Organizing_Hyperparameter_Sweeps_in_PyTorch_with_W%26B.ipynb# and found that if I change the order of the parameter values, the corresponding results will be different. For example, in this notebook, if the 'dropout' values in parameters_dict are changed from [0.3, 0.4, 0.5] to [0.4, 0.3, 0.5], the results don't match.

Are you performing a grid search or a random search? A grid search will simply walk through the parameter values you've listed, and the order it walks through them is determined by the order in which you specified them. I'm not sure I fully understand the question / issue — can you try describing it more clearly?
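To illustrate the point about ordering (a minimal sketch — the `parameters_dict` here just mirrors the structure used in the notebook, and a grid sweep is modeled with a plain Cartesian product rather than the actual wandb agent):

```python
import itertools

# Sketch of a sweep's grid-search parameter space, in the notebook's
# config-dict style. The order of "values" determines the walk order.
parameters_dict = {
    "dropout": {"values": [0.3, 0.4, 0.5]},
    "batch_size": {"values": [32, 64]},
}

names = list(parameters_dict)
value_lists = [parameters_dict[n]["values"] for n in names]

# A grid search visits the Cartesian product of the listed values,
# so reordering [0.3, 0.4, 0.5] -> [0.4, 0.3, 0.5] changes which
# hyperparameter combination each run index receives.
for combo in itertools.product(*value_lists):
    print(dict(zip(names, combo)))
```

So with a grid search, reordering the values reassigns combinations to runs; the set of combinations explored is the same either way.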

Are you expecting the "loss" values to be the same? Each model starts with a separate random seed, so even if the hyperparameters are the same, the results will be different.
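If you wanted runs with identical hyperparameters to produce identical losses, you'd have to fix the random seed yourself at the start of each run. A sketch of the idea using Python's `random` module (`noisy_loss` is a hypothetical stand-in for a training run; in the actual notebook you'd seed PyTorch with `torch.manual_seed` instead):

```python
import random

def noisy_loss(seed):
    # Fixing the seed makes the "random" part of a run repeatable:
    # the same seed always yields the same pseudo-random draws.
    rng = random.Random(seed)
    return 1.0 + rng.random()

# Same seed -> identical result; different seeds -> different results.
print(noisy_loss(42) == noisy_loss(42))
print(noisy_loss(42) == noisy_loss(7))
```

Without explicit seeding, two runs with the same hyperparameters will still diverge, which is why the losses in the sweep don't match across reorderings.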

Did you find a solution to this issue? I will close this, as it is not an "Examples" issue but rather a wandb/client issue. Feel free to open an issue at https://github.com/wandb/client