Now we'll look at a specific type of optimization problem and how Newton's approach can be used to solve it.
First, we generate 100 noisy data points from a sinusoid of the form

yi = A·sin(2πf·ti + φ) + wi,

where A is the amplitude between 1 and 10, f is the frequency between 5 and 10 Hz, and φ is the phase between 0 and 2π rad. The term wi is zero-mean Gaussian noise with a given variance.
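As a concrete illustration, here is a minimal sketch of this data generation in Python. The sampling grid (1 s at 100 samples), the noise standard deviation, and the random seed are all assumptions, since the source does not specify them:

```python
import numpy as np

rng = np.random.default_rng(0)            # assumed seed, for reproducibility

# True parameters drawn from the stated ranges
A_true   = rng.uniform(1, 10)             # amplitude in [1, 10]
f_true   = rng.uniform(5, 10)             # frequency in [5, 10] Hz
phi_true = rng.uniform(0, 2*np.pi)        # phase in [0, 2*pi] rad

n = 100
t = np.linspace(0, 1, n)                  # assumed sampling grid: 1 s, 100 points
sigma = 0.5                               # assumed noise standard deviation
y = A_true * np.sin(2*np.pi*f_true*t + phi_true) + rng.normal(0, sigma, n)
```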
After creating the 100 noisy data points, we can move on to the sinusoid-fitting process with Newton's method.
We wish to fit a sinusoid to the measurement data. The equation of the sinusoid is y = A·sin(2πft + φ), with appropriate choices of the parameters A, f, and φ.
To formulate the data-fitting problem, we construct the objective function

f(x) = ∑ (yi − A·sin(2πf·ti + φ))², for i = 1, 2, …, 100,

where the yi are the noisy measurement data points.
Let x = [A, f, φ]ᵀ represent the vector of decision variables. To apply Newton's method, we need to compute the gradient and the Hessian of f. Both are built from the residuals ri(x) = yi − A·sin(2πf·ti + φ), so we define a function that calculates the partial derivatives of r with respect to A, f, and φ; stacked row by row, these form the Jacobian J(x).
In this notation, the objective is f(x) = r(x)ᵀ r(x).
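A minimal sketch of these two functions, assuming the NumPy setup from the data-generation snippet above; the partial derivatives follow directly from differentiating ri with respect to A, f, and φ:

```python
def residual(x, t, y):
    """r_i(x) = y_i - A*sin(2*pi*f*t_i + phi) for x = [A, f, phi]."""
    A, f, phi = x
    return y - A * np.sin(2*np.pi*f*t + phi)

def jacobian(x, t, y):
    """J[i, j] = dr_i/dx_j, the Jacobian of the residual vector."""
    A, f, phi = x
    s = np.sin(2*np.pi*f*t + phi)
    c = np.cos(2*np.pi*f*t + phi)
    # dr/dA = -sin(.), dr/df = -2*pi*t*A*cos(.), dr/dphi = -A*cos(.)
    return np.column_stack([-s, -2*np.pi*t*A*c, -A*c])
```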
To apply Newton’s method, we need to compute the gradient of f. We obtain the Jacobian from the function we defined; the gradient of f can then be represented as ∇f(x) = 2 J(x)ᵀ r(x).
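As a quick sanity check of this expression, the gradient can be evaluated with the functions above; the evaluation point here is an arbitrary assumption:

```python
x0 = np.array([5.0, 7.0, np.pi])   # hypothetical evaluation point [A, f, phi]
grad_f = 2 * jacobian(x0, t, y).T @ residual(x0, t, y)
```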
Newton’s method applied to the nonlinear least-squares problem approximates the Hessian of f by 2 J(x)ᵀ J(x), neglecting the terms involving second derivatives of the residuals. This yields the Gauss-Newton iteration

x(k+1) = x(k) − [J(x(k))ᵀ J(x(k))]⁻¹ J(x(k))ᵀ r(x(k)).
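A minimal sketch of this iteration, reusing the residual and Jacobian functions above. The initial guess, iteration cap, and tolerance are assumptions, and no damping or line search is used; solving the normal equations is preferred over explicitly inverting J(x)ᵀ J(x):

```python
def gauss_newton(x0, t, y, max_iter=50, tol=1e-10):
    """Iterate x <- x - (J^T J)^{-1} J^T r, as derived above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x, t, y)
        J = jacobian(x, t, y)
        # Solve (J^T J) dx = J^T r rather than forming the inverse explicitly
        dx = np.linalg.solve(J.T @ J, J.T @ r)
        x = x - dx
        if np.linalg.norm(dx) < tol:
            break
    return x

x_hat = gauss_newton([5.0, 7.0, np.pi], t, y)   # hypothetical starting point
```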
When we fit the curve with the Gauss-Newton method, we get the graph below.
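For completeness, here is a sketch of how such a figure could be produced from the fitted parameters; the plotting details are assumptions, not taken from the source:

```python
import matplotlib.pyplot as plt

A_hat, f_hat, phi_hat = x_hat
t_fine = np.linspace(t.min(), t.max(), 1000)

plt.scatter(t, y, s=10, label="noisy measurements")
plt.plot(t_fine, A_hat * np.sin(2*np.pi*f_hat*t_fine + phi_hat),
         "r", label="Gauss-Newton fit")
plt.xlabel("t [s]")
plt.ylabel("y")
plt.legend()
plt.show()
```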