Exercise "Minimization"
(6 points)
Implement Newton's minimization method with back-tracking line-search, where the user provides the first and second derivatives (the gradient and the Hessian matrix) of the objective function.
The interface to the function could be like this,

void newton(
    double f(vector* x, vector* df, matrix* H),
        /* f: objective function; df: gradient; H: Hessian matrix */
    vector* xstart, /* starting point; becomes the latest approximation to the minimum on exit */
    double eps      /* accuracy goal: on exit |gradient| < eps */
);
Find the minimum of Rosenbrock's valley function,

f(x,y) = (1-x)² + 100(y-x²)².
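The analytic derivatives needed by the routine can be written out directly; the minimum is at (1,1), where f, df/dx, and df/dy all vanish. A plain-array sketch (the signature is an assumption for illustration):

```c
#include <math.h>

/* Value, gradient and Hessian of Rosenbrock's valley function. */
double rosenbrock(double x[2], double df[2], double H[2][2]){
    double a = 1 - x[0], b = x[1] - x[0]*x[0];
    df[0] = -2*a - 400*x[0]*b;            /* df/dx */
    df[1] = 200*b;                        /* df/dy */
    H[0][0] = 2 - 400*b + 800*x[0]*x[0];  /* d2f/dx2 */
    H[0][1] = H[1][0] = -400*x[0];        /* d2f/dxdy */
    H[1][1] = 200;                        /* d2f/dy2 */
    return a*a + 100*b*b;
}
```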
Find the minima of Himmelblau's function,

f(x,y) = (x²+y-11)² + (x+y²-7)².
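The derivatives of Himmelblau's function follow the same pattern; one of its four minima is at (3,2). A plain-array sketch (the signature is an assumption for illustration):

```c
#include <math.h>

/* Value, gradient and Hessian of Himmelblau's function. */
double himmelblau(double x[2], double df[2], double H[2][2]){
    double p = x[0]*x[0] + x[1] - 11;       /* x^2 + y - 11 */
    double q = x[0] + x[1]*x[1] - 7;        /* x + y^2 - 7  */
    df[0] = 4*x[0]*p + 2*q;                 /* df/dx */
    df[1] = 2*p + 4*x[1]*q;                 /* df/dy */
    H[0][0] = 4*p + 8*x[0]*x[0] + 2;
    H[0][1] = H[1][0] = 4*x[0] + 4*x[1];
    H[1][1] = 2 + 4*q + 8*x[1]*x[1];
    return p*p + q*q;
}
```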
Record the number of steps it takes for the algorithm to reach the minimum.
(3 points) Quasi-Newton method with Broyden's update.
Implement the quasi-Newton minimization method with Broyden's update. Start with the identity matrix as the initial inverse Hessian and then apply the updates. If the update diverges, reset the inverse Hessian matrix to the identity and continue. You can implement one (or both) of the two variants: i) the user supplies only the function, and the gradient is evaluated numerically; ii) the user supplies both the function and the gradient.
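The inverse-Hessian bookkeeping can be sketched as follows, here using the symmetric rank-1 variant of Broyden's update (one of several common choices; the 2-D restriction and plain-array layout are assumptions). After a successful update the matrix satisfies the secant condition B·z = s; when the denominator u·z vanishes, the update would diverge and B is reset to the identity:

```c
#include <math.h>

/* Symmetric rank-1 Broyden update of the inverse Hessian B, n = 2. */
void broyden_update(double B[2][2], const double s[2], const double z[2]){
    /* s = x_new - x_old,  z = gradient_new - gradient_old */
    double u[2] = { s[0] - (B[0][0]*z[0] + B[0][1]*z[1]),
                    s[1] - (B[1][0]*z[0] + B[1][1]*z[1]) };  /* u = s - B z */
    double uz = u[0]*z[0] + u[1]*z[1];
    if(fabs(uz) < 1e-12){ /* update diverges: reset B to the identity */
        B[0][0] = B[1][1] = 1; B[0][1] = B[1][0] = 0;
        return;
    }
    for(int i = 0; i < 2; i++)
        for(int j = 0; j < 2; j++)
            B[i][j] += u[i]*u[j]/uz; /* B <- B + u u^T / (u.z) */
}
```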
Find the minima of Rosenbrock's valley function and Himmelblau's function from part (A).
Compare the effectiveness (say, the number of steps to reach the minimum) of the Newton method and the quasi-Newton method.
Apply your implementation to the following non-linear least-squares fitting problem:
Suppose that an experiment measuring the activity of a radioactive substance as a function of time gives the following result,
# ti    yi    σi
0.23   4.64   0.42
1.29   3.37   0.37
2.35   3.00   0.34
3.41   2.55   0.31
4.47   2.29   0.29
5.53   1.66   0.27
6.59   1.58   0.26
7.65   1.69   0.25
8.71   1.37   0.24
9.77   1.45   0.24

or
double t[] = {0.23, 1.29, 2.35, 3.41, 4.47, 5.53, 6.59, 7.65, 8.71, 9.77};
double y[] = {4.64, 3.38, 3.01, 2.55, 2.29, 1.67, 1.59, 1.69, 1.38, 1.46};
double e[] = {0.42, 0.37, 0.34, 0.31, 0.29, 0.27, 0.26, 0.25, 0.24, 0.24};
int N = sizeof(t)/sizeof(t[0]);

where the time ti, activity yi, and the experimental uncertainty σi are given in some rescaled units.
Fit the function f(t) = A*exp(-t/T) + B, where A (initial amount), T (lifetime), and B (background noise) are the fitting parameters, to the data and determine the lifetime, T, of the substance.
The (master) function to minimize is

F(A,T,B) = ∑i (f(ti) - yi)² / σi².

That is, you have a three-dimensional minimization problem in the space of the parameters A, T, and B.
(1 point) Downhill Simplex Method
Implement the downhill simplex method.
Apply it to the minimization problems above (Rosenbrock's valley function, Himmelblau's function, and the non-linear least-squares fit).
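The reflect/expand/contract/reduce cycle of the downhill simplex method can be sketched for the two-dimensional case as follows (a general implementation keeps n+1 points in n dimensions; the fixed dimension, plain arrays, and the demo objective are simplifying assumptions):

```c
#include <math.h>
#include <string.h>

/* 2-D downhill simplex (Nelder-Mead); p holds the 3 vertices on entry. */
void simplex2d(double F(const double*), double p[3][2], double tol){
    double f[3];
    for(int i = 0; i < 3; i++) f[i] = F(p[i]);
    for(int iter = 0; iter < 100000; iter++){
        int lo = 0, hi = 0; /* best and worst vertices */
        for(int i = 1; i < 3; i++){ if(f[i] < f[lo]) lo = i; if(f[i] > f[hi]) hi = i; }
        if(f[hi] - f[lo] < tol) break;
        double ce[2] = {0, 0}; /* centroid of the face opposite the worst point */
        for(int i = 0; i < 3; i++) if(i != hi){ ce[0] += p[i][0]/2; ce[1] += p[i][1]/2; }
        double re[2] = {2*ce[0] - p[hi][0], 2*ce[1] - p[hi][1]}; /* reflection */
        double fre = F(re);
        if(fre < f[lo]){
            double ex[2] = {3*ce[0] - 2*p[hi][0], 3*ce[1] - 2*p[hi][1]}; /* expansion */
            double fex = F(ex);
            if(fex < fre){ memcpy(p[hi], ex, sizeof ex); f[hi] = fex; }
            else         { memcpy(p[hi], re, sizeof re); f[hi] = fre; }
        }else if(fre < f[hi]){
            memcpy(p[hi], re, sizeof re); f[hi] = fre;
        }else{
            double co[2] = {(ce[0] + p[hi][0])/2, (ce[1] + p[hi][1])/2}; /* contraction */
            double fco = F(co);
            if(fco < f[hi]){ memcpy(p[hi], co, sizeof co); f[hi] = fco; }
            else for(int i = 0; i < 3; i++) if(i != lo){ /* reduction toward best */
                p[i][0] = (p[i][0] + p[lo][0])/2;
                p[i][1] = (p[i][1] + p[lo][1])/2;
                f[i] = F(p[i]);
            }
        }
    }
}

/* demo objective (purely illustrative): quadratic with minimum at (3,-1) */
double quad2(const double* v){
    return (v[0]-3)*(v[0]-3) + (v[1]+1)*(v[1]+1);
}
```

Note that the method needs only function values, no derivatives, which is why the same routine can be pointed at all three problems unchanged.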