Read poly approx.ipynb first; poly approx in higher dimensions.ipynb is currently unfinished.
So, frequently, on graphics Twitter, a discussion will come up about finding a "good" approximation to some function on some range.
Well, there are entire families of methods to cook up polynomial approximations to functions.
In this notebook, we focus on a technique that minimizes mean squared error (MSE).
It's not the only thing we could minimize, but it's one of the easiest, and we can do it in almost closed form for surprisingly arbitrary choices of functions to approximate.
Imagine you have a function $f(x)$ that you want to approximate on an interval $[x_0, x_1]$ by a polynomial $p(x) = \sum_{k=0}^{n} a_k x^k$. The mean squared error of the approximation is

$$E(a) = \int_{x_0}^{x_1} \big(f(x) - p(x)\big)^2 \, dx.$$

Now, if you write that whole thing out, you get

$$E(a) = \int_{x_0}^{x_1} f(x)^2 \, dx \;-\; 2 \sum_{k} a_k \int_{x_0}^{x_1} x^k f(x) \, dx \;+\; \sum_{j} \sum_{k} a_j a_k \int_{x_0}^{x_1} x^{j+k} \, dx.$$

That looks messy, but it turns out the whole thing can be written as a matrix expression that expresses the MSE of our approximation function in terms of the vector of coefficients $a = (a_0, \dots, a_n)$.
It kind of looks like

$$E(a) = a^\top M a - 2\, b^\top a + c,$$

where

$$M_{jk} = \int_{x_0}^{x_1} x^{j+k} \, dx, \qquad b_k = \int_{x_0}^{x_1} x^k f(x) \, dx, \qquad c = \int_{x_0}^{x_1} f(x)^2 \, dx.$$

We can minimize it by using the standard least-squares trick of setting the gradient to zero, and solving

$$M a = b.$$
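To make the recipe concrete, here is a minimal sketch in Python (not the original notebook's code): it builds $M$ in closed form from the monomial integrals, computes $b$ by numerical quadrature with scipy.integrate.quad (which is where the "almost" in "almost closed form" comes from), and solves $Ma = b$. The function name `mse_poly_fit` and the sin-on-$[0, \pi/2]$ example are just illustrative choices, not anything from the original text.

```python
import numpy as np
from scipy.integrate import quad

def mse_poly_fit(f, x0, x1, degree):
    """Coefficients a_0..a_degree of the polynomial minimizing
    the integral of (f(x) - p(x))^2 over [x0, x1]."""
    n = degree + 1
    # M[j, k] = integral of x^(j+k) dx over [x0, x1], in closed form.
    jk = np.arange(n)[:, None] + np.arange(n)[None, :]
    M = (x1 ** (jk + 1) - x0 ** (jk + 1)) / (jk + 1)
    # b[k] = integral of x^k * f(x) dx over [x0, x1], by numerical quadrature.
    b = np.array([quad(lambda x, k=k: x ** k * f(x), x0, x1)[0] for k in range(n)])
    # Minimizing a^T M a - 2 b^T a + c amounts to solving M a = b.
    return np.linalg.solve(M, b)

# Illustrative use: cubic MSE fit to sin on [0, pi/2].
a = mse_poly_fit(np.sin, 0.0, np.pi / 2, degree=3)
p = np.polynomial.Polynomial(a)            # coefficients in increasing-power order
xs = np.linspace(0.0, np.pi / 2, 101)
print(a)
print(np.abs(p(xs) - np.sin(xs)).max())    # worst-case error on a grid
```

Note that only $b$ depends on $f$; $M$ depends only on the interval and the degree, which is what makes the method work for such arbitrary choices of $f$.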
I've been told that this is a standard technique, but, having stumbled upon it by re-inventing the wheel, I don't know what the standard name for it is.
It's closely related to least-squares regression. Maybe it's just called "least-squares regression". Or maybe it's called something like "least-squares regression with (word for one of the tricks being used here)".
I believe some established work on this topic is contained in https://arxiv.org/abs/1208.6398 -- but I suspect most of the techniques presented here are far older. Any help on tracking down earlier references would be appreciated.