Ordinary least-squares fit

  1. (6 points) Ordinary least-squares fit by QR-decomposition
    1. Implement a function that uses your QR-decomposition routines to make a least-squares fit of a given data set, {x_i, y_i, δy_i}, i = 1..n, with a linear combination F(x) ≐ ∑_{k=1..m} c_k f_k(x) of given functions f_k(x), k = 1..m.

      The parameters to the function should be the data to fit, {x_i, y_i, δy_i}, and the set of functions, {f_k}, a linear combination of which should fit the data. The function must calculate the vector of coefficients, {c_k}.

    2. Fit the following data,

      x  =     0.1    1.33    2.55    3.78       5    6.22    7.45    8.68     9.9
      y  =   -15.3    0.32    2.45    2.75    2.27    1.35   0.157   -1.23   -2.75
      dy =    1.04   0.594   0.983   0.998    1.11   0.398   0.535   0.968   0.478
      
      with a linear combination of the functions {log(x), 1, x}.

    Hint: the set of fitting functions {f_k(x)} can be implemented, e.g., as an array of one-variable functions; a sketch of this, together with the fitting routine itself, is given below.
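
    A minimal sketch in Python of the hint and of the fitting routine from part 1.1. The names qr_solve and lsfit are placeholders, and numpy's QR factorisation stands in for your own QR-decomposition routines from the earlier exercise.

      import numpy as np

      def qr_solve(A, b):
          Q, R = np.linalg.qr(A)              # A = QR, R upper triangular (stand-in for your routine)
          return np.linalg.solve(R, Q.T @ b)  # solve R c = Q^T b (your routine: back-substitution)

      def lsfit(fs, x, y, dy):
          """Weighted least-squares fit of y(x) +/- dy with F(x) = sum_k c_k f_k(x)."""
          n, m = len(x), len(fs)
          A = np.empty((n, m))
          b = np.empty(n)
          for i in range(n):
              b[i] = y[i] / dy[i]                 # weight each equation by 1/dy_i
              for k in range(m):
                  A[i, k] = fs[k](x[i]) / dy[i]   # weighted design matrix A_ik = f_k(x_i)/dy_i
          return qr_solve(A, b)                   # coefficient vector {c_k}

      # the set of fitting functions {log(x), 1, x} as an array of one-variable functions
      fs = [np.log, lambda z: 1.0, lambda z: z]

      x  = [0.1, 1.33, 2.55, 3.78, 5, 6.22, 7.45, 8.68, 9.9]
      y  = [-15.3, 0.32, 2.45, 2.75, 2.27, 1.35, 0.157, -1.23, -2.75]
      dy = [1.04, 0.594, 0.983, 0.998, 1.11, 0.398, 0.535, 0.968, 0.478]

      print("coefficients:", lsfit(fs, x, y, dy))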

  2. (3 points) Uncertainties of the fitting coefficients
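
    The standard route, assuming the weighted design matrix A of part 1 (rows divided by δy_i): the covariance matrix of the coefficients is S = (A^T A)^(-1), which with A = QR becomes S = R^(-1) (R^(-1))^T, and the uncertainties are δc_k = sqrt(S_kk). A minimal sketch (the function name is a placeholder, numpy stands in for your own routines):

      import numpy as np

      def lsfit_with_errors(fs, x, y, dy):
          """Least-squares fit returning coefficients, uncertainties and covariance matrix."""
          n, m = len(x), len(fs)
          A = np.array([[fs[k](x[i]) / dy[i] for k in range(m)] for i in range(n)])
          b = np.array([y[i] / dy[i] for i in range(n)])
          Q, R = np.linalg.qr(A)            # stand-in for your QR routine
          c = np.linalg.solve(R, Q.T @ b)   # coefficients from R c = Q^T b
          Rinv = np.linalg.inv(R)           # with your routines: solve R X = I column by column
          S = Rinv @ Rinv.T                 # covariance matrix S = (A^T A)^(-1) = R^(-1) (R^(-1))^T
          dc = np.sqrt(np.diag(S))          # uncertainties dc_k = sqrt(S_kk)
          return c, dc, S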

  3. (1 point) Ordinary least-squares solution by thin singular-value decomposition

    1. Implement a function that makes a singular-value decomposition of a (tall) matrix A by diagonalising A^T A (with your implementation of the Jacobi eigenvalue algorithm), A^T A = V D V^T (where D is a diagonal matrix with the eigenvalues of A^T A and V is the matrix of the corresponding eigenvectors), and then using A = U Σ V^T, where U = A V D^(-1/2) and Σ = D^(1/2).
    2. Implement a function that solves the linear least-squares problem Ax = b using your implementation of singular-value decomposition.
    3. Implement a function that makes an ordinary least-squares fit using your implementation of singular-value decomposition. A sketch of these three routines is given below.
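
    A minimal sketch of the three routines above in Python. The names svd_thin, svd_solve and svd_lsfit are placeholders, and numpy's symmetric eigensolver stands in for your Jacobi eigenvalue routine.

      import numpy as np

      def svd_thin(A):
          """Thin SVD A = U Σ V^T of a tall matrix via the eigendecomposition of A^T A."""
          d, V = np.linalg.eigh(A.T @ A)   # A^T A = V D V^T, d holds the eigenvalues
          sigma = np.sqrt(d)               # Σ = D^(1/2) (assumes full column rank, d > 0)
          U = (A @ V) / sigma              # U = A V D^(-1/2), column-wise division
          return U, sigma, V

      def svd_solve(A, b):
          """Least-squares solution of A x ≈ b: x = V Σ^(-1) U^T b."""
          U, sigma, V = svd_thin(A)
          return V @ ((U.T @ b) / sigma)

      def svd_lsfit(fs, x, y, dy):
          """Same fit as in part 1, but solved with the singular-value decomposition."""
          A = np.array([[f(xi) / dyi for f in fs] for xi, dyi in zip(x, dy)])
          b = np.asarray(y, dtype=float) / np.asarray(dy, dtype=float)
          return svd_solve(A, b)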