GraphPKU/NeuralCommonNeighbor

Suggestion on hyperparameter tuning for other datasets

Opened this issue · 1 comment

Hi,

I am trying to apply NCN/NCNC to other graphs. The README lists quite a lot of hyperparameters to tweak. Are there any general suggestions on where to start the hyperparameter tuning?

Thanks,

Hi,

We have uploaded hyperparamopt.py in the refactor branch. It uses Optuna to do the hyperparameter tuning. You can also check the parseargs function in NeighborOverlap.py for the meaning of each hyperparameter.
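
For reference, here is a minimal sketch of what Optuna-based tuning can look like. The `train_and_eval` function and the search space below are hypothetical placeholders, not the actual contents of hyperparamopt.py:

```python
# Minimal Optuna sketch; train_and_eval and the search space are hypothetical
# placeholders, not the actual hyperparamopt.py code.
import random

import optuna


def train_and_eval(lr, dp, mplayers, hiddim):
    # Hypothetical stub: train NCN with these hyperparameters and return the
    # validation score. Replace with the real training/evaluation loop.
    return random.random()


def objective(trial):
    # Sample a configuration; the names mirror common NCN arguments (assumed).
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)
    dp = trial.suggest_float("dp", 0.0, 0.8)
    mplayers = trial.suggest_int("mplayers", 1, 3)
    hiddim = trial.suggest_categorical("hiddim", [64, 128, 256])
    return train_and_eval(lr=lr, dp=dp, mplayers=mplayers, hiddim=hiddim)


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
print(study.best_params)
```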

We also used the following tricks:

  1. We first optimize the hyperparameters of NCN, then use them for NCNC with a few modifications.

  2. Reduce the search space during optimization. The choices of some parameters, such as mplayers, model, jk, and res, converge very quickly, so we fix them. Moreover, you may find that a large dp and a small lr lead to poor performance, so you can narrow their search ranges (see the sketch after this list).

  3. num_trial: we run the optimization until we get a satisfactory result. For ddi, we ran about 900 trials; for ppa, about 60.

  4. runs: the number of repeated runs in each trial. More runs lead to a more stable score, so we use 1 at the beginning and 3 later. For ddi, we use 10 because the score is very unstable.
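
Tricks 2–4 can be encoded directly in the objective. Below is a hedged sketch (reusing the hypothetical `train_and_eval` stub from above; the fixed values are illustrative) that fixes the fast-converging parameters, narrows the ranges for dp and lr, and averages several runs per trial:

```python
# Sketch of tricks 2-4: fixed/narrowed search space and averaged repeat runs.
# train_and_eval is the same hypothetical stub as in the earlier sketch.
import random

import optuna


def train_and_eval(lr, dp, mplayers, hiddim):
    return random.random()  # hypothetical stub


RUNS_PER_TRIAL = 3  # trick 4: 1 early on, 3 later (10 for ddi)


def objective(trial):
    # Trick 2: mplayers/model/jk/res converged quickly, so they are fixed
    # here (values are illustrative); dp and lr keep narrowed ranges.
    lr = trial.suggest_float("lr", 1e-3, 3e-2, log=True)
    dp = trial.suggest_float("dp", 0.0, 0.5)
    scores = [
        train_and_eval(lr=lr, dp=dp, mplayers=2, hiddim=256)
        for _ in range(RUNS_PER_TRIAL)
    ]
    return sum(scores) / len(scores)  # average for a more stable score


study = optuna.create_study(direction="maximize")
# Trick 3: keep optimizing until the result is satisfactory
# (~900 trials for ddi, ~60 for ppa).
study.optimize(objective, n_trials=60)
```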

Feel free to ask if you run into any problems.