Exercise "Root Finding"

  1. (6 points) Newton's method with analytic Jacobian and back-tracking linesearch
    1. Implement Newton's method with a simple backtracking linesearch, where the derivatives in the Jacobian matrix are provided analytically by the user.

      The interface to the function could be something like

      void newton_with_jacobian(
      	void f(const vector* x, vector* fx, matrix* J),
      	vector* x,
      	double epsilon
      );
      
      where the parameters are:
      • function f: takes the input vector x, calculates analytically the vector f(x) and the Jacobian matrix of derivatives, and puts them into the vector fx and the matrix J;
      • vector x: on input contains the starting point, on output becomes the latest approximation to the root;
      • double epsilon: the accuracy goal.

      An obvious convergence criterion is ‖f(x)‖ < ε.

      You should use your own routines for solving linear systems, but you may use BLAS routines for 'elementary' matrix/vector operations.

    2. Solve the system of equations

      A*x*y = 1 ,
      exp(-x) + exp(-y) = 1 + 1/A,
      where A=10000.
    3. Find the minimum of Rosenbrock's valley function,

      f(x,y) = (1-x)² + 100(y-x²)²
      
      by searching for the roots of its gradient.
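      The gradient and its Jacobian (the Hessian of f) worked out by hand, in the same sketch interface:

```c
/* gradient of Rosenbrock's valley function f(x,y) = (1-x)^2 + 100(y-x^2)^2
   and its Jacobian (the Hessian of f); the root of the gradient is the
   minimum at (x,y) = (1,1) */
void rosenbrock_grad(const double x[2], double fx[2], double J[2][2])
{
	fx[0] = -2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]*x[0]);
	fx[1] = 200*(x[1] - x[0]*x[0]);
	J[0][0] = 2 - 400*(x[1] - x[0]*x[0]) + 800*x[0]*x[0];
	J[0][1] = -400*x[0];
	J[1][0] = -400*x[0];
	J[1][1] = 200;
}
```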
    4. Find the minimum of Himmelblau's function

      f(x,y) = (x²+y-11)² + (x+y²-7)²
      
      by searching for the roots of its gradient.
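      Likewise for Himmelblau's function, whose gradient has several roots (four minima plus saddle points and a maximum), so the root found depends on the starting point:

```c
/* gradient of Himmelblau's function f(x,y) = (x^2+y-11)^2 + (x+y^2-7)^2
   and its Jacobian (the Hessian of f); one of the four minima is at (3,2) */
void himmelblau_grad(const double x[2], double fx[2], double J[2][2])
{
	double p = x[0]*x[0] + x[1] - 11, q = x[0] + x[1]*x[1] - 7;
	fx[0] = 4*x[0]*p + 2*q;
	fx[1] = 2*p + 4*x[1]*q;
	J[0][0] = 12*x[0]*x[0] + 4*x[1] - 42;
	J[0][1] = 4*x[0] + 4*x[1];
	J[1][0] = 4*x[0] + 4*x[1];
	J[1][1] = 4*x[0] + 12*x[1]*x[1] - 26;
}
```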
  2. (3 points) Newton's method with back-tracking linesearch and numerical Jacobian

    1. Implement the version of Newton's method with back-tracking where the Jacobian matrix of the derivatives is evaluated numerically. The interface could be like this,
      void newton(
      	void f(const vector* x, vector* fx),
      	vector* x,
      	double dx,
      	double epsilon
      );
      
      where dx is the finite-difference step to be used in the numerical evaluation of the Jacobian. It should be smaller than the typical scale over which the function changes appreciably, but probably not smaller than the square root of machine epsilon. One should stop the iterations if the step-size becomes smaller than the dx parameter.
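      The numerical Jacobian can be built column by column from forward differences, reusing the already-computed f(x) so that the whole matrix costs only n extra function calls. A sketch for n=2 (the helper name and plain-array interface are illustrative):

```c
/* forward-difference Jacobian: column k is (f(x + dx*e_k) - f(x)) / dx;
   x is perturbed in place and restored, fx must already hold f(x) */
void numeric_jacobian2(void f(const double x[2], double fx[2]),
                       double x[2], const double fx[2], double dx, double J[2][2])
{
	for (int k = 0; k < 2; k++) {
		double fdx[2];
		x[k] += dx;
		f(x, fdx);
		x[k] -= dx;
		J[0][k] = (fdx[0] - fx[0]) / dx;
		J[1][k] = (fdx[1] - fx[1]) / dx;
	}
}

/* demo: the linear map f = (2x + 3y, x - y) has the constant
   Jacobian {{2,3},{1,-1}}, which forward differences recover exactly
   up to rounding */
void flinear(const double x[2], double fx[2])
{
	fx[0] = 2*x[0] + 3*x[1];
	fx[1] = x[0] - x[1];
}
```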
    2. Solve the same systems as in the first sub-exercise and compare the number of steps (and the number of function calls) the two routines take to find the root.
    3. Also compare the number of steps your root-finder takes with that of a GSL-root-finder (or any other library routine you can find).
  3. (1 point) Newton's method with refined linesearch

    Implement a more refined linesearch with, say, quadratic interpolation.

    Compare the number of steps and the number of function calls for the refined linesearch and the simple backtracking.
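    One common refinement: instead of blindly halving λ, model g(λ) = ‖f(x+λΔx)‖²/2 by the parabola through g(0), g'(0) and the rejected trial value g(λ), and jump to its minimum; for the Newton direction g'(0) = −2g(0). A sketch of one such interpolation step (the function name is illustrative; in practice the returned λ is also clamped to, say, [λ/10, λ/2]):

```c
/* given g(0), g'(0) < 0 and a rejected trial g(lambda), return the
   minimizer of the parabola g0 + gp0*s + c*s^2 fitted through them;
   falls back to plain halving if the fitted curvature is not positive */
double quadratic_step(double g0, double gp0, double lambda, double glambda)
{
	double c = (glambda - g0 - gp0*lambda) / (lambda*lambda); /* fitted curvature */
	if (c <= 0) return lambda/2;
	return -gp0 / (2*c); /* vertex of the parabola */
}
```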