v_function: improve heuristics
Opened this issue · 2 comments
hartytp commented
The heuristics for v_function are currently not great.
Thoughts
- API: currently the V function is described by separate positive and negative gradients. IMHO that's non-ideal, since there's no easy way to constrain the fit to be symmetric. It might be better to use something like a single gradient plus an asymmetry factor (which can usually default to 1).
- Putting `x0` as `mean(x)` only works if the V function is decently well centred; I've found this to be pretty fragile. Similarly, putting the offset as the mean of `y` doesn't work that well in my experience. A better approach might be to first figure out which way up the V function is (maybe add a sign parameter so users can specify this, and then enforce that the gradients are strictly positive). Once we know that, we can easily generate good guesses for the other parameters.
- Maybe one way of guessing which way up it is: find the min/max points, then look at the max/min on either side of them. E.g. a downward-pointing V function should have higher points on either side of the minimum. This doesn't work if the minimum is off the end of the dataset in one direction, but I don't see an obviously better way (other than just trying the fit for both signs and seeing which gives lower residuals).
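The neighbour-comparison heuristic in the last bullet could be sketched roughly like this (an illustration of the idea, not the library's implementation; the function name and the +1/-1 sign convention are made up here). It sidesteps the "off the end of the dataset" case by just asking which extremum sits further from the edges:

```python
import numpy as np

def guess_v_sign(x, y):
    """Guess which way up a V function is.

    Returns +1 for a V opening upwards (interior minimum, higher
    points on either side of it) and -1 for one opening downwards
    (interior maximum). Hypothetical helper for illustration only.
    """
    order = np.argsort(x)
    y = np.asarray(y, dtype=float)[order]
    n = len(y)
    i_min, i_max = int(np.argmin(y)), int(np.argmax(y))
    # Distance of each extremum from the nearest edge of the dataset;
    # the tip of the V should be the more "interior" of the two. If the
    # tip is off the end of the data, both scores are small and the
    # guess degrades gracefully rather than failing outright.
    min_depth = min(i_min, n - 1 - i_min)
    max_depth = min(i_max, n - 1 - i_max)
    return +1 if min_depth >= max_depth else -1
```

This only uses the ordering of the samples, so it is insensitive to the `y` offset, but it will still struggle when the tip is entirely outside the scanned range.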
I'd be interested in input if people have better ideas.
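For concreteness, the gradient-plus-asymmetry parameterisation suggested in the first bullet might look something like the following (a sketch under assumed names; `x0`, `y0`, `k` and `a` are illustrative parameter names, not the existing API). Fixing `a = 1` constrains the fit to a symmetric V:

```python
import numpy as np

def v_function(x, x0, y0, k, a=1.0):
    """Hypothetical V-function model: tip at (x0, y0), left-arm
    gradient magnitude k, right-arm gradient magnitude a * k.
    a = 1 (the default) gives a symmetric V, so symmetry can be
    enforced by simply holding a fixed in the fit.
    """
    x = np.asarray(x, dtype=float)
    return y0 + np.where(x < x0, k * (x0 - x), a * k * (x - x0))
```

The two per-arm gradients can then be exposed as derived parameters (`k` and `a * k`) for backwards compatibility, as suggested below.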
pathfinder49 commented
Sounds good to me. Some thoughts:
- I imagine you could find the sign more robustly using y(x_min), y(x_max), max(y) and min(y).
- Maybe keep the two gradients as derived parameters for better backwards compatibility.
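One possible reading of the first suggestion above (a guess at the intent, not a confirmed design): use only the edge samples `y(x_min)`, `y(x_max)` and the overall extrema `max(y)`, `min(y)`, and ask which extremum pokes out further beyond the edge values. For a V with an interior minimum, the dataset maximum roughly coincides with one of the edges while the minimum lies below both:

```python
import numpy as np

def guess_v_sign_edges(x, y):
    """Guess the V sign from edge samples and global extrema only.

    Returns +1 for a V opening upwards, -1 for one opening
    downwards. Hypothetical helper, for illustration only.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    y_left = y[np.argmin(x)]   # y(x_min)
    y_right = y[np.argmax(x)]  # y(x_max)
    # How far the global minimum dips below the lower edge value,
    # versus how far the global maximum rises above the upper one.
    below = min(y_left, y_right) - y.min()
    above = y.max() - max(y_left, y_right)
    return +1 if below >= above else -1
```

Unlike an argmin/argmax index comparison, this is robust to noise away from the extrema, though it degrades towards a coin flip when the data are monotonic (tip off the end of the scan).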
hartytp commented
I imagine you could find the sign more robustly using y(x_min), y(x_max), max(y) and min(y).
Can you sketch out what you had in mind? I agree there should be a good solution here but don't quite see it.
Maybe keep the two gradients as derived parameters for better backwards compatibility
👍