Exercise "Root Finding"
Implement Newton's method with a simple backtracking linesearch, where the derivatives in the Jacobian matrix are provided by the user.
The interface to the function could be something like

void newton_with_jacobian( void f(const vector* x, vector* fx, matrix* J), vector* x, double epsilon );

where the parameters are: f — the user function which, given the point x, fills fx with f(x) and J with the Jacobian matrix ∂f_i/∂x_j; x — on input the starting point, on output the approximation to the root; epsilon — the accuracy goal.
An obvious convergence criterion is ‖f(x)‖<ε.
You should use your own routines for solving linear systems, but you may use BLAS routines for 'elementary' matrix/vector operations.
Solve the system of equations

A*x*y = 1 ,
exp(-x) + exp(-y) = 1 + 1/A ,

where A=10000.
Find the minimum of Rosenbrock's valley function,

f(x,y) = (1-x)² + 100(y-x²)²

by searching for the roots of its gradient.
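The gradient and its Jacobian (the Hessian of f) could be coded as below — a sketch with plain arrays, the function name being this sketch's own choice; the minimum at (1,1) is then a root of the gradient:

```c
/* Gradient of Rosenbrock's valley f(x,y) = (1-x)² + 100(y-x²)² and its
   Jacobian H (the Hessian of f), as a user function for the Newton
   root-finder; g vanishes at the minimum (1,1). */
void rosenbrock_grad(const double *p, double *g, double *H){
    double x = p[0], y = p[1];
    g[0] = -2*(1 - x) - 400*x*(y - x*x);
    g[1] = 200*(y - x*x);
    H[0] = 2 - 400*(y - x*x) + 800*x*x; H[1] = -400*x;
    H[2] = -400*x;                      H[3] = 200;
}
```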
Find the minimum of Himmelblau's function

f(x,y) = (x²+y-11)² + (x+y²-7)²

by searching for the roots of its gradient.
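Analogously, the gradient of Himmelblau's function and its Jacobian could be sketched as follows (the function name is this sketch's own; one of the four minima, (3,2), is a root of the gradient):

```c
/* Gradient of Himmelblau's f(x,y) = (x²+y-11)² + (x+y²-7)² and its
   Jacobian H (the Hessian of f); u and v abbreviate the two inner terms. */
void himmelblau_grad(const double *p, double *g, double *H){
    double x = p[0], y = p[1];
    double u = x*x + y - 11, v = x + y*y - 7;
    g[0] = 4*x*u + 2*v;
    g[1] = 2*u + 4*y*v;
    H[0] = 4*u + 8*x*x + 2; H[1] = 4*x + 4*y;
    H[2] = 4*x + 4*y;       H[3] = 4*v + 8*y*y + 2;
}
```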
(3 points) Newton's method with back-tracking linesearch and numerical Jacobian
void newton( void f(const vector* x, vector* fx), vector* x, double dx, double epsilon );

where dx is the finite-difference step to be used in the numerical evaluation of the Jacobian. It should be smaller than the typical scale over which the function changes appreciably, but probably not smaller than the square root of the machine epsilon. One should stop the iterations if the step-size becomes smaller than the dx parameter.
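The forward-difference Jacobian could be built as in the sketch below, again with plain C arrays instead of the exercise's vector/matrix types; the demo function `fdemo` is this sketch's own:

```c
/* Forward-difference Jacobian, a sketch with plain C arrays:
   J[i*n+j] ≈ ( f_i(x + dx*e_j) - f_i(x) ) / dx */
void numeric_jacobian(int n, void f(int n, const double *x, double *fx),
                      double *x, double dx, double *J){
    double fx[n], fy[n]; /* C99 variable-length arrays */
    f(n, x, fx);
    for(int j = 0; j < n; j++){
        x[j] += dx;             /* displace the j-th coordinate */
        f(n, x, fy);
        for(int i = 0; i < n; i++) J[i*n + j] = (fy[i] - fx[i]) / dx;
        x[j] -= dx;             /* restore */
    }
}

/* hypothetical test function: f = (x*y, x+y²), Jacobian [[y,x],[1,2y]] */
void fdemo(int n, const double *x, double *fx){
    (void)n;
    fx[0] = x[0]*x[1];
    fx[1] = x[0] + x[1]*x[1];
}
```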
(1 point) Newton's method with refined linesearch
Implement a more refined linesearch with, say, quadratic interpolation.
Compare the number of steps and the number of function calls for the refined linesearch and the simple backtracking.
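One possible refinement is sketched below: model g(λ) = ½‖f(x+λΔx)‖² by the quadratic through g(0), g'(0) and the rejected trial value g(λ), and jump to the model's minimizer instead of blindly halving λ. The function name and the safeguard interval [λ/10, λ/2] are this sketch's own choices:

```c
/* One quadratic-interpolation step of the linesearch: fit
   g(λ) ≈ g0 + gp0*λ + c*λ² through g(0)=g0, g'(0)=gp0 and the rejected
   trial value g(lambda)=gl, and return the λ minimizing the fit,
   safeguarded to stay inside [lambda/10, lambda/2]. */
double quadratic_step(double g0, double gp0, double lambda, double gl){
    double c = (gl - g0 - gp0*lambda) / (lambda*lambda);
    if(c <= 0) return lambda/2;    /* non-convex model: fall back to halving */
    double lmin = -gp0 / (2*c);    /* minimizer of the quadratic model */
    if(lmin > lambda/2)  lmin = lambda/2;
    if(lmin < lambda/10) lmin = lambda/10;
    return lmin;
}
```

Compared with plain halving, this typically reaches an acceptable step in fewer trial evaluations of f, which is what the step/function-call comparison should reveal.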