mlr-org/mlrMBO

Error in generateDesign

dagola opened this issue · 13 comments

Today I got the following error during a mlr (version 2.14.0) benchmark experiment using mlrMBO (version 1.1.2) as a search strategy after 191 successful tuning iterations:

...
[Tune-x] 191: summary.statistics.maf.thresholds=0.0222; target.geno=0.978; target.maf=0.00789; ld.external=FALSE; clumping=TRUE; clumping.kb=4381; clumping.r2=0.108; pval.level=0.67; missing.handling=CENTER
[Tune-y] 191: aucpr.test.mean=0.0580633,mmce.test.mean=0.9691866,npv.test.mean=     NaN,fpr.test.mean=1.0000000,f1.test.mean=0.0597839,fnr.test.mean=0.0000000,ssr.test.mean=0.4999102,tp.test.mean=23.4000000,tn.test.mean=0.0000000,gpr.test.mean=0.1755285,lsr.test.mean=-1.0108806,acc.test.mean=0.0308134,wkappa.test.mean=0.0000000,ppv.test.mean=0.0308134,logloss.test.mean=1.0108806,ber.test.mean=0.5000000,tpr.test.mean=1.0000000,brier.test.mean=0.4034924,gmean.test.mean=0.0000000,fdr.test.mean=0.9691866,tnr.test.mean=0.0000000,qsr.test.mean=0.1930152,bac.test.mean=0.5000000,brier.scaled.test.mean=-0.7553917,fp.test.mean=736.0000000,fn.test.mean=0.0000000,kappa.test.mean=0.0000000,auc.test.mean=0.6099305; time: 29.4 min
Warning in train(learner, task, subset = train.i, weights = weights[train.i]) :
  Could not train learner classif.prsice.tuned: Error in generateDesign(control$infill.opt.focussearch.points, ps.local,  :
  REAL() can only be applied to a 'numeric', not a 'NULL'

Error : $ operator is invalid for atomic vectors

### [bt]: Job terminated with an exception [batchtools job.id=3]
### [bt]: Calculation finished!

This is a custom learner; however, I have applied it successfully to two other, similar tasks without any error.

There is some hierarchy in the parameter set; to be exact, it looks like this:

prsice_tuning_par_set <- ParamHelpers::makeParamSet(
  ParamHelpers::makeNumericParam(id = "summary.statistics.maf.thresholds", lower = 0, upper = 0.1),
  ParamHelpers::makeNumericParam(id = "target.geno", lower = 0.9, upper = 1),
  ParamHelpers::makeNumericParam(id = "target.maf", lower = 0, upper = 0.1),
  ParamHelpers::makeLogicalParam(id = "ld.external", requires = quote(clumping == TRUE)),
  ParamHelpers::makeNumericParam(id = "ld.geno", lower = 0.9, upper = 1, requires = quote(!is.na(ld.external) & clumping == TRUE & ld.external == TRUE)),
  ParamHelpers::makeNumericParam(id = "ld.maf", lower = 0, upper = 0.1, requires = quote(!is.na(ld.external) & clumping == TRUE & ld.external == TRUE)),
  ParamHelpers::makeLogicalParam(id = "clumping"),
  ParamHelpers::makeIntegerParam(id = "clumping.kb", lower = 125, upper = 5000, requires = quote(clumping == TRUE)),
  ParamHelpers::makeNumericParam(id = "clumping.r2", lower = 0.1, upper = 0.8, requires = quote(clumping == TRUE)),
  ParamHelpers::makeNumericParam(id = "pval.level", lower = 5e-8, upper = 1),
  ParamHelpers::makeDiscreteParam(id = "missing.handling", values = c("IMPUTE", "SET_ZERO", "CENTER"))
)
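
For context, dependent parameters whose `requires` condition is not met come back as `NA` in generated designs. Here is a minimal sketch using only ParamHelpers (parameter names borrowed from the set above; this illustrates where partially-`NA` columns enter the design, it does not reproduce the error):

```r
library(ParamHelpers)

# Minimal hierarchical parameter set mirroring the dependency structure above
ps.small <- makeParamSet(
  makeLogicalParam(id = "clumping"),
  makeLogicalParam(id = "ld.external", requires = quote(clumping == TRUE)),
  makeNumericParam(id = "ld.geno", lower = 0.9, upper = 1,
    requires = quote(clumping == TRUE & ld.external == TRUE))
)

# Rows where 'clumping' is FALSE get NA for the dependent parameters,
# so design columns can be partially (or, by chance, entirely) NA
des <- generateDesign(n = 8L, par.set = ps.small)
print(des)
```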

Unfortunately, I didn't use the option to save the mlrMBO state during this benchmark, because I applied the same learner to multiple tasks, which would have led to concurrent writes to the same file on disk. But this is another issue.

I'm not completely sure what caused this error, but I suspect the hierarchical parameter set.
For convenience, I parsed the log file to extract the optimisation path:

structure(list(summary.statistics.maf.thresholds = c(0.0293, 
0.0637, 0.0423, 0.0244, 0.0563, 0.0764, 0.0344, 0.052, 0.0377, 
0.079, 0.0457, 0.0315, 0.0897, 0.0957, 0.0575, 0.0408, 0.0339, 
0.0484, 0.0268, 0.0451, 0.067, 0.017, 0.00494, 0.0839, 0.0951, 
0.0217, 0.0875, 0.0617, 0.0924, 0.0863, 0.00259, 0.0149, 0.069, 
0.00206, 0.00753, 0.0997, 0.012, 0.0729, 0.0601, 0.0525, 0.0807, 
0.072, 0.0189, 0.0104, 0.0575, 0.0211, 0.0238, 0.0824, 0.0624, 
0.067, 0.0842, 0.0631, 0.0247, 0.0397, 0.0789, 0.0995, 0.00572, 
0.0806, 0.0485, 0.086, 0.0801, 0.0511, 0.0349, 0.0626, 0.0536, 
0.0834, 0.0277, 0.00236, 0.0217, 0.0239, 0.000668, 0.000302, 
0.07, 0.00069, 0.000473, 0.0186, 0.000435, 0.00118, 6.91e-05, 
0.00874, 0.000242, 0.0403, 7.1e-06, 0.0982, 0.0513, 0.000643, 
0.0163, 0.000196, 0.0251, 0.0789, 0.0204, 0.0064, 0.000178, 0.0654, 
0.00138, 0.0275, 0.00858, 0.00047, 0.000666, 0.0665, 0.0559, 
0.0331, 0.0486, 0.0809, 0.000114, 8.33e-05, 0.0171, 0.0353, 0.0264, 
0.00359, 0.0855, 0.00798, 0.00174, 0.0609, 0.00186, 0.00245, 
0.0374, 0.00126, 0.00204, 0.00109, 0.000916, 0.0936, 0.0151, 
0.000496, 0.0308, 0.00589, 0.0804, 0.0186, 0.0064, 0.0259, 0.000253, 
0.0257, 0.012, 0.00196, 0.0658, 0.00243, 0.00122, 0.00213, 0.00122, 
0.00118, 0.00377, 0.00134, 0.00146, 0.0346, 0.0244, 0.00114, 
0.00454, 0.00219, 0.00946, 0.0494, 0.000433, 0.00372, 0.00502, 
0.00682, 0.00257, 0.000118, 0.00549, 0.000355, 0.00396, 0.00166, 
0.000358, 0.00554, 0.00133, 0.0104, 0.00197, 0.00223, 0.00231, 
0.00636, 0.000861, 0.00594, 0.00607, 0.00528, 0.000479, 0.00148, 
0.032, 0.00435, 0.00025, 7.89e-05, 0.00147, 0.00977, 3.33e-05, 
0.00633, 0.00134, 0.00029, 0.00029, 0.000242, 0.00716, 0.000505, 
0.00285, 0.000761, 0.0222), target.geno = c(0.944, 0.958, 0.97, 
0.948, 0.938, 0.96, 0.932, 0.918, 0.976, 0.966, 0.955, 0.928, 
0.953, 0.971, 0.91, 0.963, 0.979, 0.931, 0.949, 0.925, 0.982, 
0.995, 0.9, 0.913, 0.939, 1, 0.951, 0.996, 0.903, 0.941, 0.989, 
0.983, 0.909, 0.922, 0.915, 0.927, 0.993, 0.974, 0.985, 0.92, 
0.905, 0.967, 0.987, 0.934, 0.95, 0.919, 0.926, 0.941, 0.936, 
0.901, 0.949, 0.98, 0.912, 0.982, 0.939, 0.984, 0.951, 0.926, 
0.924, 0.927, 0.95, 0.966, 0.938, 0.979, 0.985, 0.997, 0.926, 
0.965, 0.943, 0.969, 0.978, 0.982, 0.969, 0.916, 0.914, 0.914, 
0.978, 0.977, 0.985, 0.906, 0.982, 0.958, 0.988, 0.979, 0.971, 
0.929, 0.916, 0.966, 0.949, 0.982, 0.973, 0.987, 0.911, 0.986, 
0.996, 0.968, 0.949, 0.973, 0.936, 0.989, 0.986, 0.985, 0.954, 
0.981, 0.978, 0.965, 0.978, 0.976, 0.942, 0.946, 0.95, 0.97, 
0.928, 0.933, 0.91, 0.903, 0.989, 0.932, 0.956, 0.956, 0.913, 
0.929, 0.9, 0.963, 0.979, 0.996, 0.983, 0.998, 0.996, 0.997, 
0.924, 0.999, 0.962, 0.961, 0.915, 0.939, 0.979, 0.952, 0.986, 
0.913, 0.967, 0.926, 0.943, 0.928, 0.971, 0.933, 0.966, 0.96, 
0.958, 0.947, 0.952, 0.965, 0.996, 0.985, 0.952, 0.922, 0.938, 
0.99, 0.939, 0.994, 0.98, 0.965, 0.976, 0.967, 0.915, 0.937, 
0.988, 0.951, 0.978, 0.92, 0.99, 0.974, 0.98, 0.976, 0.994, 0.948, 
0.938, 0.931, 0.927, 0.976, 0.911, 0.936, 0.908, 0.979, 0.911, 
0.955, 0.945, 0.935, 0.924, 0.998, 0.978), target.maf = c(0.0891, 
0.0248, 0.058, 0.0616, 0.0307, 0.0763, 0.0149, 0.00391, 0.0349, 
0.0807, 0.0667, 0.0544, 0.022, 0.0708, 0.0486, 0.0442, 0.0944, 
0.00898, 0.0281, 0.0883, 0.000322, 0.0501, 0.0323, 0.0557, 0.096, 
0.0991, 0.0837, 0.0271, 0.04, 0.07, 0.0165, 0.0795, 0.0746, 0.0204, 
0.0861, 0.0466, 0.0432, 0.0112, 0.00597, 0.0612, 0.0135, 0.0923, 
0.0643, 0.0367, 0.0234, 0.0549, 0.00935, 0.0555, 0.0883, 0.094, 
0.0523, 0.0268, 0.0135, 0.0703, 0.085, 0.0971, 0.074, 0.067, 
0.0608, 0.0154, 0.00796, 0.0271, 0.0528, 0.0786, 0.00345, 0.00422, 
0.0927, 0.0663, 0.00306, 0.00122, 0.00081, 0.000837, 0.000995, 
0.0735, 0.000247, 0.00055, 0.0555, 0.0609, 0.0242, 0.000916, 
0.00223, 0.0666, 0.0719, 8.45e-05, 0.0235, 7.55e-05, 0.000877, 
0.0261, 0.0911, 0.000649, 0.000792, 0.027, 0.00152, 0.00082, 
0.000848, 0.0511, 0.0247, 0.073, 0.053, 0.000576, 0.011, 0.000878, 
0.000812, 0.000839, 0.0373, 0.0534, 0.034, 0.0456, 0.0318, 0.00414, 
0.0597, 0.0641, 0.000824, 0.0183, 0.0244, 0.00349, 0.0449, 0.000528, 
0.00739, 0.000828, 0.000534, 0.000813, 0.000757, 0.00158, 0.000813, 
0.00183, 0.00184, 0.00171, 0.0733, 0.0274, 0.000231, 0.000486, 
0.02, 0.000838, 0.048, 0.00156, 0.00162, 0.000891, 0.00162, 0.00136, 
0.0071, 0.000596, 0.0291, 0.0696, 0.0207, 0.00132, 0.000924, 
0.000629, 0.00181, 0.000901, 0.00086, 0.00289, 0.00105, 0.00232, 
0.0228, 0.00102, 0.0036, 0.00265, 0.00151, 0.00232, 0.000669, 
0.00744, 0.000651, 0.0631, 0.00423, 0.00393, 0.00186, 0.00381, 
0.00677, 0.00675, 0.00696, 0.00699, 0.0468, 0.0672, 0.000928, 
0.00362, 0.00231, 0.00283, 0.000761, 0.026, 0.00226, 0.00498, 
0.00062, 0.0371, 0.00691, 0.00887, 0.0855, 0.00705, 0.00714, 
0.00393, 0.00789), clumping = c(FALSE, FALSE, FALSE, FALSE, TRUE, 
FALSE, TRUE, FALSE, TRUE, TRUE, TRUE, FALSE, FALSE, TRUE, TRUE, 
FALSE, TRUE, TRUE, TRUE, TRUE, FALSE, FALSE, TRUE, TRUE, FALSE, 
FALSE, FALSE, TRUE, FALSE, TRUE, FALSE, TRUE, TRUE, TRUE, FALSE, 
FALSE, FALSE, TRUE, FALSE, TRUE, FALSE, FALSE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, FALSE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, FALSE, TRUE, FALSE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, FALSE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, FALSE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, 
TRUE, TRUE, TRUE, TRUE, TRUE, TRUE, TRUE), pval.level = c(0.378, 
0.567, 0.801, 0.193, 0.464, 0.76, 0.664, 0.683, 0.497, 0.584, 
0.511, 0.726, 0.739, 0.178, 0.329, 0.314, 0.875, 0.234, 0.107, 
0.524, 0.861, 0.961, 0.291, 0.0244, 0.132, 0.0123, 0.153, 0.217, 
0.827, 0.64, 0.432, 0.6, 0.0651, 0.402, 0.353, 0.952, 0.92, 0.615, 
0.261, 0.996, 0.416, 0.9, 0.779, 0.0859, 0.0988, 0.136, 0.475, 
0.0332, 0.639, 0.736, 0.066, 0.857, 0.148, 0.674, 0.245, 0.119, 
0.9, 0.92, 0.993, 0.469, 0.92, 0.11, 0.076, 0.612, 0.422, 0.287, 
0.657, 0.633, 0.977, 0.59, 1, 0.442, 0.998, 0.683, 0.492, 0.999, 
0.388, 0.15, 0.195, 0.983, 0.285, 1, 0.912, 0.973, 1, 1, 0.214, 
0.939, 1, 0.438, 0.585, 0.993, 0.373, 0.416, 0.146, 0.0597, 1, 
0.764, 0.127, 0.997, 1, 0.365, 0.146, 0.453, 0.0302, 0.435, 0.0351, 
1, 0.071, 1, 0.0679, 0.000541, 0.441, 1, 0.748, 0.0437, 0.807, 
0.0108, 0.00766, 0.133, 0.859, 0.457, 0.517, 0.59, 0.04, 0.934, 
0.945, 0.491, 0.0516, 0.0351, 0.0588, 0.924, 0.0659, 0.784, 0.0704, 
0.309, 0.626, 0.79, 0.816, 0.197, 0.0901, 0.761, 0.556, 0.915, 
0.0761, 0.463, 0.967, 0.194, 0.362, 0.355, 0.113, 0.283, 0.661, 
0.231, 0.298, 0.119, 0.88, 0.593, 0.163, 0.16, 0.722, 0.145, 
0.748, 0.0843, 0.572, 0.768, 0.798, 0.375, 0.48, 0.468, 0.737, 
0.166, 0.113, 0.0996, 0.153, 0.154, 0.644, 0.691, 0.756, 0.18, 
0.826, 0.911, 0.539, 0.718, 0.4, 0.67, 0.855, 0.459, 0.928, 0.991, 
0.67), missing.handling = structure(c(1L, 1L, 1L, 2L, 1L, 3L, 
3L, 3L, 1L, 2L, 1L, 1L, 3L, 2L, 2L, 2L, 2L, 3L, 2L, 1L, 1L, 2L, 
1L, 2L, 3L, 1L, 2L, 3L, 2L, 3L, 3L, 2L, 1L, 3L, 2L, 2L, 3L, 1L, 
2L, 3L, 3L, 3L, 3L, 1L, 2L, 3L, 1L, 3L, 3L, 3L, 3L, 1L, 1L, 1L, 
2L, 3L, 2L, 1L, 3L, 3L, 1L, 3L, 3L, 2L, 2L, 3L, 3L, 1L, 1L, 2L, 
2L, 3L, 2L, 3L, 3L, 2L, 3L, 1L, 3L, 1L, 1L, 3L, 2L, 1L, 1L, 2L, 
1L, 1L, 3L, 1L, 3L, 1L, 1L, 1L, 1L, 3L, 1L, 1L, 2L, 3L, 2L, 2L, 
2L, 3L, 3L, 3L, 3L, 1L, 3L, 3L, 3L, 3L, 3L, 1L, 2L, 3L, 2L, 3L, 
3L, 3L, 1L, 3L, 2L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 2L, 2L, 
2L, 3L, 3L, 2L, 1L, 3L, 2L, 3L, 3L, 2L, 3L, 3L, 3L, 2L, 2L, 1L, 
3L, 2L, 1L, 2L, 2L, 3L, 3L, 3L, 3L, 2L, 3L, 3L, 2L, 2L, 3L, 3L, 
1L, 3L, 2L, 3L, 3L, 3L, 2L, 2L, 2L, 3L, 3L, 3L, 2L, 2L, 3L, 3L, 
3L, 3L, 3L, 2L, 3L, 3L, 3L, 1L, 3L), .Label = c("SET_ZERO", "IMPUTE", 
"CENTER"), class = "factor"), ld.external = c(NA, NA, NA, NA, 
TRUE, NA, TRUE, NA, FALSE, TRUE, TRUE, NA, NA, TRUE, TRUE, NA, 
FALSE, TRUE, FALSE, TRUE, NA, NA, FALSE, FALSE, NA, NA, NA, FALSE, 
NA, TRUE, NA, TRUE, TRUE, TRUE, NA, NA, NA, FALSE, NA, TRUE, 
NA, NA, TRUE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, FALSE, 
FALSE, FALSE, TRUE, TRUE, FALSE, FALSE, TRUE, TRUE, TRUE, TRUE, 
TRUE, FALSE, FALSE, FALSE, FALSE, FALSE, TRUE, TRUE, FALSE, FALSE, 
FALSE, FALSE, NA, FALSE, TRUE, TRUE, TRUE, FALSE, FALSE, FALSE, 
FALSE, TRUE, FALSE, NA, TRUE, NA, FALSE, FALSE, FALSE, FALSE, 
FALSE, FALSE, FALSE, TRUE, FALSE, TRUE, TRUE, FALSE, TRUE, FALSE, 
TRUE, FALSE, NA, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, 
FALSE, FALSE, FALSE, TRUE, FALSE, NA, TRUE, FALSE, FALSE, FALSE, 
FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, 
TRUE, FALSE, FALSE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, 
FALSE, FALSE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, FALSE, 
FALSE, FALSE, TRUE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, 
FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, TRUE, 
FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, 
FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, 
FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FALSE), ld.geno = c(NA, 
NA, NA, NA, 0.921, NA, 0.916, NA, NA, 0.92, 0.989, NA, NA, 0.958, 
0.944, NA, NA, 0.971, NA, 0.962, NA, NA, NA, NA, NA, NA, NA, 
NA, NA, 0.911, NA, 0.932, 0.998, 0.991, NA, NA, NA, NA, NA, 0.996, 
NA, NA, 0.915, NA, NA, NA, 0.985, NA, NA, NA, NA, NA, 0.921, 
0.933, NA, NA, 0.976, 0.936, 0.991, 0.91, 0.934, NA, NA, NA, 
NA, NA, 0.948, 0.92, NA, NA, NA, NA, NA, NA, 0.943, 0.968, 0.999, 
NA, NA, NA, NA, 0.93, NA, NA, 0.966, NA, NA, NA, NA, NA, NA, 
NA, NA, 0.995, NA, 0.984, 0.97, NA, 0.98, NA, 0.973, NA, NA, 
NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 0.946, NA, NA, 0.975, 
NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 0.976, NA, NA, 
NA, NA, NA, 0.902, NA, NA, NA, NA, NA, NA, NA, 0.998, NA, NA, 
NA, NA, NA, 0.941, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 
NA, NA, NA, 0.968, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 
NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA), 
    ld.maf = c(NA, NA, NA, NA, 0.0461, NA, 0.0713, NA, NA, 0.0507, 
    0.0276, NA, NA, 0.0938, 0.08, NA, NA, 0.0535, NA, 0.0904, 
    NA, NA, NA, NA, NA, NA, NA, NA, NA, 0.02, NA, 0.0836, 0.0221, 
    0.0615, NA, NA, NA, NA, NA, 0.00602, NA, NA, 0.00265, NA, 
    NA, NA, 0.0218, NA, NA, NA, NA, NA, 0.096, 0.0368, NA, NA, 
    0.0269, 0.00927, 0.00619, 0.0181, 0.00948, NA, NA, NA, NA, 
    NA, 0.0276, 0.0589, NA, NA, NA, NA, NA, NA, 0.0221, 0.0501, 
    0.00971, NA, NA, NA, NA, 0.0179, NA, NA, 0.0598, NA, NA, 
    NA, NA, NA, NA, NA, NA, 0.0192, NA, 0.0868, 0.027, NA, 0.0924, 
    NA, 0.0843, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 
    0.0986, NA, NA, 0.0515, NA, NA, NA, NA, NA, NA, NA, NA, NA, 
    NA, NA, NA, 0.0613, NA, NA, NA, NA, NA, 0.0716, NA, NA, NA, 
    NA, NA, NA, NA, 0.0246, NA, NA, NA, NA, NA, 0.0783, NA, NA, 
    NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 0.0953, NA, 
    NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, 
    NA, NA, NA, NA, NA, NA, NA, NA, NA, NA), clumping.kb = c(NA, 
    NA, NA, NA, 3917, NA, 2263, NA, 2580, 4112, 1490, NA, NA, 
    863, 2348, NA, 2551, 4818, 1095, 2197, NA, NA, 1201, 1268, 
    NA, NA, NA, 1643, NA, 3706, NA, 4148, 1984, 439, NA, NA, 
    NA, 160, NA, 4588, NA, NA, 911, 4918, 304, 1097, 206, 2833, 
    153, 143, 3283, 2144, 284, 1926, 638, 4346, 2432, 1273, 1582, 
    3506, 1251, 1063, 2590, 212, 1599, 1737, 1345, 1246, 431, 
    4042, 4400, 3219, NA, 4392, 3052, 3700, 4608, 3479, 4628, 
    4596, 3192, 1950, 257, NA, 4105, NA, 3282, 1844, 2914, 2408, 
    4361, 1752, 4004, 4813, 2308, 4088, 1599, 208, 3756, 3827, 
    3843, 1890, NA, 2792, 456, 619, 1440, 173, 3358, 4137, 1230, 
    322, 3375, 3667, 4131, NA, 185, 3962, 3561, 224, 3333, 3917, 
    970, 504, 1124, 3284, 3327, 3310, 4901, 4264, 3259, 415, 
    570, 3721, 1336, 2467, 3245, 126, 2187, 378, 450, 3550, 131, 
    126, 2095, 2232, 1008, 2258, 4469, 1210, 3360, 742, 3748, 
    328, 2109, 4467, 1115, 3013, 3183, 1259, 2307, 561, 4896, 
    2131, 2262, 4268, 2247, 4503, 4671, 3934, 4500, 668, 1160, 
    2034, 1314, 3741, 4238, 4140, 2659, 4388, 3750, 2673, 3963, 
    3848, 2434, 3846, 4751, 4248, 2053, 3288, 4381), clumping.r2 = c(NA, 
    NA, NA, NA, 0.627, NA, 0.336, NA, 0.254, 0.413, 0.427, NA, 
    NA, 0.178, 0.161, NA, 0.533, 0.402, 0.263, 0.734, NA, NA, 
    0.785, 0.184, NA, NA, NA, 0.751, NA, 0.715, NA, 0.122, 0.78, 
    0.318, NA, NA, NA, 0.115, NA, 0.38, NA, NA, 0.143, 0.766, 
    0.278, 0.262, 0.115, 0.222, 0.59, 0.207, 0.32, 0.364, 0.119, 
    0.353, 0.101, 0.641, 0.111, 0.593, 0.393, 0.718, 0.356, 0.367, 
    0.262, 0.589, 0.746, 0.203, 0.642, 0.404, 0.372, 0.253, 0.112, 
    0.104, NA, 0.743, 0.151, 0.641, 0.111, 0.11, 0.105, 0.565, 
    0.101, 0.34, 0.1, NA, 0.198, NA, 0.321, 0.267, 0.12, 0.107, 
    0.233, 0.113, 0.104, 0.221, 0.775, 0.103, 0.173, 0.104, 0.269, 
    0.101, 0.439, 0.107, NA, 0.11, 0.509, 0.614, 0.142, 0.166, 
    0.465, 0.174, 0.178, 0.107, 0.104, 0.534, 0.103, NA, 0.107, 
    0.106, 0.102, 0.591, 0.104, 0.104, 0.629, 0.231, 0.116, 0.105, 
    0.104, 0.105, 0.331, 0.723, 0.726, 0.104, 0.231, 0.104, 0.738, 
    0.596, 0.105, 0.171, 0.656, 0.398, 0.535, 0.104, 0.214, 0.484, 
    0.609, 0.103, 0.212, 0.162, 0.253, 0.104, 0.674, 0.226, 0.489, 
    0.515, 0.105, 0.227, 0.211, 0.104, 0.62, 0.211, 0.676, 0.713, 
    0.143, 0.307, 0.104, 0.484, 0.102, 0.412, 0.326, 0.104, 0.241, 
    0.386, 0.123, 0.405, 0.106, 0.329, 0.116, 0.395, 0.127, 0.61, 
    0.105, 0.603, 0.124, 0.768, 0.696, 0.104, 0.399, 0.109, 0.116, 
    0.411, 0.108)), row.names = c(NA, -191L), class = "data.frame")

However, as far as I can see, there is no unvisited combination of dependent hyperparameters, as these should also be covered by the initial generateDesign, I guess?

Any thoughts on how to catch this error?

I am pretty sure we cannot help you if you cannot boil this down to a minimal reproducible example.

Ok, thanks. I'll try to create a minimal reproducible example.

Unfortunately, I didn't use the option to save the mlrMBO state during this benchmark, because I applied the same learner on multiple tasks which would lead to concurrent writes to the same file on disk. But this is another issue.

But you can set the save file individually in each MBO run so there should not be concurrent writes to the same file if configured correctly.
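
A hedged sketch of what that could look like (the helper function and the file-naming scheme are my own assumptions, not an mlrMBO API; `save.on.disk.at.time` and `save.file.path` are `makeMBOControl()` arguments):

```r
library(mlrMBO)

# Hypothetical helper: one state file per run, so parallel runs
# never write to the same file on disk
make_job_control <- function(job.id, state.dir = tempdir()) {
  makeMBOControl(
    save.on.disk.at.time = 30 * 60,  # persist state every 30 minutes
    save.file.path = file.path(state.dir,
      sprintf("mbo_state_job_%03i.RData", job.id))
  )
}
```

Each run could then be resumed from its own state file via `mboContinue()` if it crashes.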

I think I observed this error as well and I am pretty sure it is not easily reproducible. Often restarting on a saved intermediate state helps.

Thanks for your reply @jakob-r.

I think I observed this error as well and I am pretty sure it is not easily reproducible. Often restarting on a saved intermediate state helps.

Yes, I'm struggling with creating a minimal reproducible example. I'm not sure where the error is thrown because in ParamHelpers:::c_generateDesign REAL() is called three times:
REAL(s_low) and REAL(s_upp) should be safely handled by ParamHelpers::doBasicGenDesignChecks(par.set).
And REAL(s_rescol) does not look too bad either.

There is also the macro UNPACK_REAL_MATRIX using a REAL() call. Maybe this could be the source of the error since there seems to be explicit conversion of the newdes object from ParamHelpers::generateDesign? But I'm a bit lost in the code at this point.

But you can set the save file individually in each MBO run so there should not be concurrent writes to the same file if configured correctly.

How do I do that in combination with mlr::batchmark? At the moment I create an mlr::makeTuneControlMBO object from an mlrMBO::makeMBOControl object (where one could set save.file.path), and this is passed to mlr::makeTuneWrapper and is therefore used for every job.

I won't be able to help you with the ParamHelpers C code part.

Regarding the TuneWrapper, it is indeed complicated. You can modify the control object for each learner, but to set it per task and resampling iteration you would have to do it manually or hook some code into the batchtools algorithm, which is only possible in a really hacky way.

Well, how about a reproducible example instead of a minimal one?
And don't search too hard for the location of the error. We can do that.

Glad that someone else has also raised this issue. I have been running mlrMBO for 100-200 iterations smoothly. After running for a while, generating proposals and receiving updates, I eventually end up with this error, and from then on the data structure is corrupted:

<simpleError in generateDesign(control$infill.opt.focussearch.points, ps.local, randomLHS): REAL() can only be applied to a 'numeric', not a 'character'>

It would have been easier to isolate the problem if I got this error in the first iteration. But after 2-3 days, it is not so easy.

Well how about instead of a minimal example a reproducible example?

Yes, that's what I'm trying to do now. But it is very tricky, as this error seems to show up rather randomly. I haven't been able to reproduce it with small datasets so far, and as @asheetal said, an example that runs for several days is not very convenient either. Additionally, I'm working with genetic data right now, and you may know that sharing this kind of data is not easy from a regulatory standpoint.

If it's random, why don't you set a seed?

OK, the runtime is a problem there, I agree.

The point is, we have absolutely zero chance of debugging this and helping without a test case.

Sorry, I'm not able to reproduce this error even when setting a seed (how can that be...?). Maybe this issue can be closed then?

pat-s commented

sounds like it - sometimes strange things happen ;)