The list of examination projects (link above) is ready; you may
start working on your project now. You are welcome to ask me questions about
the formulation of the projects via email (or in person during week
26). Good luck!
It seems that you must upload the link to your repository
to the examination server on Friday 30/6/2023, 9:00-13:00. You should check
this in your personal examination plan.
You have to do your exam project in a separate directory (easily
identifiable, say "exam"), just like another homework, with the little
difference that here you have a bit more freedom to decide what to do
and how (within the limits of the project formulation, of course).
The points for your exam project will be given similarly to the
homeworks: 6 points for A-level, 9 points for B-level, and 10 points for
C-level. Here you have to decide on the level yourself (and inform me
of your decision in README.TXT or similar). It is part of the examination:
you must be able to critically assess your own achievements.
Your score will be calculated using the formula
score = 0.7*total_points_for_homeworks/10 + 0.3*points_for_exam_project
Your grade will be calculated from your score using the following formulae:
if 9.1 ≤ score : grade = 12 (A);
if 8.1 ≤ score < 9.1 : grade = 10 (B);
if 7.1 ≤ score < 8.1 : grade = 7 (C);
if 6.1 ≤ score < 7.1 : grade = 4 (D);
if 5.1 ≤ score < 6.1 : grade = 02 (E);
if score < 5.1 : grade = 00 (F);
If I remember correctly, we have decided to weight the grades
for the homeworks and the examination project in the 70%-30% proportion
(provided both are at least 02/E). Please send me an email if this
is wrong.
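A worked example with hypothetical numbers: 80 homework points and a 9-point exam project give score = 0.7*80/10 + 0.3*9 = 5.6 + 2.7 = 8.3, that is, grade 10 (B). As a small C# sketch (class and method names are just for illustration):

```csharp
using static System.Console;
public class ScoreExample {
    // Grade thresholds taken from the table above.
    public static int Grade(double score) =>
        score >= 9.1 ? 12 : score >= 8.1 ? 10 : score >= 7.1 ? 7 :
        score >= 6.1 ?  4 : score >= 5.1 ?  2 : 0;
    public static void Main(){
        double score = 0.7*80/10 + 0.3*9; // hypothetical: 80 hw points, 9-point exam
        WriteLine($"score={score:f1} grade={Grade(score)}"); // ≈ 8.3 → grade 10 (B)
    }
}
```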
(26/4) We have voted for the two extra topics for this semester
and the winners are: "Fast Fourier Transform" and "Error estimates in
nonlinear least squares analysis of data".
(3/4) Added an alternative task, about relativistic
precession of planet orbits, to the "ODE" homework.
(31/3) No teaching next week because of Easter holidays.
(31/3) Eigenvalues homework: remember to generate a
symmetric (!) matrix to be diagonalized.
(27/3) In your Makefiles please use relative paths to your
directories rather than absolute paths. For example, in your homeworks,
instead of
MATLIBDIR = $(HOME)/repos/ppnm/matlib
use (assuming you are in, say, $(HOME)/repos/ppnm/homeworks/splines)
MATLIBDIR = ../../matlib
Absolute paths do not generally work when somebody clones your repository
to their box.
(22/3) By popular demand, I have compiled all chapters
into a book, available at the link above.
(16/3) For C++ coders: I did the multiproc exercise both with
"pthreads" and with "std::thread" and it works as expected on my Termux
box, see [./examples/multiproc].
I've also run the exercise on another box and it worked there as well
(I had to compile it with -pthread though). So, the problem must be
in WSL.
- (16/3)
Could you please indicate (e.g. in your readme file) your
progress with the homeworks using a table like this:
=======================================
|  #  |   homework   | A | B | C |  Σ |
=======================================
|  1  | LinEq        | 6 | 3 | 1 | 10 |
---------------------------------------
|  2  | EVD          | 6 | - | - |  6 |
---------------------------------------
|  3  | LeastSquares | 6 | 3 | - |  9 |
---------------------------------------
| ... |
=======================================
| total points: 95                    |
=======================================
- (!) Remember that the exercises/homeworks in
your repositories must build with the command "make".
- (4/3) If you use "svg" format for your plots you might consider making
your background "white" (instead of the default "transparent"), like this,
set terminal svg background "white"
- (4/3) Hint: in your Makefile you can define macros
DLLS = $(addprefix -reference:,$(filter %.dll,$^))
CODE = $(filter %.cs,$^)
MKEXE = mcs -target:exe -out:$@ $(DLLS) $(CODE)
MKDLL = mcs -target:library -out:$@ $(DLLS) $(CODE)
and then do like this,
main.exe : main.cs sfuns.dll ; $(MKEXE)
sfuns.dll : gamma.cs erf.cs ; $(MKDLL)
- (2/3) You can instruct mono to look for your dll-files in a particular
folder, say "/path/to/my/dll/folder", like this: instead of our usual
mono main.exe >Out.txt
you should say
MONO_PATH=/path/to/my/dll/folder mono main.exe >Out.txt
- (24/2) It seems that mono looks for the dynamically
linked libraries (dll-files) first in the same directory
as the executable file; then in the directories specified by the
MONO_PATH environment variable; and then in the system-specific folder
where all the system dll's are located.
However, mono can build a standalone executable with the mkbundle
tool. For example, you can bundle 'main.exe' and 'sfuns.dll' into an
executable 'main' like this,
mkbundle -o main --simple main.exe sfuns.dll
You can then run the 'main' executable simply as
./main
- (21/2) In the lecture "4-io" the different methods to read numbers
into your program are now split into separate examples.
- (15/2) Added "manually calculated results" to the "complex" exercise
(for tests). Check them before using though.
- (13/2)
Moved the backup to sdfeu.org
(http://fedorov.sdfeu.org/prog).
-
Introduction.
• About the course; exercises/homeworks; code repositories; examination.
• POSIX [→]
(UNIX) systems; clusters and supercomputers
[CSCAA,
top500 supercomputers].
• The Computer Language Benchmarks Game
[→]
summary
[→].
• We shall only use open software in this course
[copyleft].
⊕ Exercise "posix".
-
Code repositories. Makefiles.
* Distributed Version Control[→]
systems and code repositories; Mercurial[→] (hg)
and Git[→] (git);
* POSIX make
utility[→]
for managing projects on a computer.
* C-sharp[→]
mono[→]
framework for POSIX programming.
+ Reading:
note "make",
introduction to Git from atlassian[→].
+ Exercises:
"git",
"hello";
- Extra reading: Chapter
"Introduction To Makefiles"[→]
in the "GNU make manual";
- Extra extra reading: Chapter
"make - maintain, update, and regenerate groups of programs",
of the POSIX standard
[IEEE Std 1003.1-2017 (POSIX.1-2017)]
+ Quiz
(you might want to install additional man-pages with the command
sudo apt-get install manpages-posix manpages-posix-dev
):
- What is your shell?
- In your shell prompt, what do the keys "up",
"down", "left", and "right" do?
- What are the "~", ".", and ".." directories?
- At your shell prompt, what does the command
!string
do?
- At your shell prompt, what does the command
!number
do?
- What are the personal configuration files for your shell?
- What do the commands echo, time, and date do?
- On your system, are echo and time programs or shell commands?
- On your system, does your shell-completion work for the long options
and/or Makefile targets?
-
Basics of C# programming.
*
Structure of a C#-program; Main function.
*
Basic types: int, double, string, complex;
Scope of variables; Variable shadowing.
*
Compiling, linking, and running C#-programs with mono.
*
Simple output with System.Console.Write
method.
*
Mathematical functions in the System.Math
class.
+ Reading: notes
"C#-program"[→],
"Compile/link/run"[→],
"Scope"[→],
"Simple output"[→];
+
Reading: Makefile automatic variables[→]:
$@ (the target), $< (the first prerequisite), $^ (all prerequisites).
+
Reading: Makefile Functions for Transforming Text[→].
-
Extra reading: C#-syntax / Program structure[→],
C#-syntax / Operators[→].
- Extra reading:
Math functions in the System.Math class[→];
the System.Console.Write method[→] for simple output.
+ Exercises:
"math".
+ Quiz
- Suppose you have messed something up terribly with the latest
(uncommitted) changes to your project. How can you discard all changes
you have done to your project since the last commit?
- Suppose you have realized that the last commit contained an
error. How can you discard the last commit? The last two commits?
- Suppose you have committed some changes to your project both at the
server (the place where you push) and at your box. What do you do now?
- What are the values of the following automatic variables
in a Makefile: $@, $<, $^?
- What does this statement do in a Makefile recipe:
$(filter %.cs,$^)
?
- What does this statement do in a Makefile recipe:
$(addprefix -reference:,$(filter %.dll,$^))
?
- When defining a macro in a Makefile, what is the difference between
"=" and ":="?
-
More C-sharp: conditionals, loops, classes.
*
Conditional if-else;
Loops for, while, do-while;
Loop foreach;
*
Arrays; Classes/structs; Value/reference type; Example: "vec" class;
+Reading:
C# conditionals [→];
C# loops [→];
C# arrays [→];
C# classes [→];
+Exercises:
"epsilon",
"vec".
+Quiz:
- Rewrite the for-loop "for(init;condition;increment)body"
using the while-loop.
- Rewrite the while-loop "while(condition)body" using
the for-loop.
- Rewrite the do-while-loop "do body while(condition)" using
the while-loop.
- Rewrite this piece of code, "(condition?iftrue:iffalse)",
using the "if" statement.
Hint: google "ternary conditional".
- What is the difference between "class", "static class", and "struct"?
- What is the difference between "value type" and "reference type"?
- Explain the modifier "e16" in
WriteLine($"x={x:e16}");
(try google "c# standard numeric format").
- Try
WriteLine( 0.1+0.1+0.1+0.1+0.1+0.1+0.1+0.1 == 8*0.1 );
Why does it produce a "false" result?
- Now, why does
WriteLine( 0.125+0.125+0.125+0.125+0.125+0.125+0.125+0.125 == 8*0.125 );
produce a "true" result?
-
Input/output.
*Standard streams, redirections, piping; File streams.
*Command-line arguments to Main function.
+Reading: note "Input/Output".
+Exercises:
input/output;
complex;
Wiki.
-Extra reading:
Standard streams [→];
Standard streams in C# [→];
Redirection and piping [→];
Microsoft docs: command line arguments [→].
+Quiz:
- What are the types of the objects System.Console.Out
and System.Console.In?
Hint: try
//docs.microsoft.com/dotnet/api/system.console.out
and
//docs.microsoft.com/dotnet/api/system.console.in
- What are the delimiter-characters in your command-line?
- What are the elements of the "args" array in the
"Main(string[] args)" function after the command
mono main.exe -input:in.txt -output:out.txt -numbers:1e0,2e0,3e0
- What is the maximum length of the command-line in your system?
Hint: try "getconf ARG_MAX" and/or (if you use GNU)
"echo|xargs --show-limits".
- How can you send the contents of a file into the program's standard
input stream?
- Explain the syntax `command` and $(command) in bash.
Hint: Bash command substitution; POSIX shell command substitution.
- How can you send the contents of a file as the argument to the program's
Main function?
- Why do you need double-dollar, $$(command), in the Makefile?
Hint: GNU make: variables in recipes.
- What will the following Makefile print (and why)?
pwd = a random string
test:
@echo pwd
@echo `pwd`
@echo $(pwd)
@echo $$(pwd)
-
More C-sharp: generics; delegates; closures.
* C-sharp generics;
* Passing functions around: delegates, anonymous delegates, closures;
+ Reading:
note "generics";
note "delegates";
+ Exercise "generic list";
- Extra reading:
C-sharp generics[→];
C-sharp delegates[→];
System.Func delegate[→];
Closures in computer programming[→];
+ Quiz
- Suppose we have a function
static void set_to_seven(double x){ x=7; }
Is the argument variable set to seven in the caller's scope (in the
scope where the function is called)?
Explain.
- Suppose we have a function
static void set_to_seven(double[] xs){ for(int i=0;i<xs.Length;i++)xs[i]=7; }
Are the elements of the argument array set to seven in the caller's scope?
Explain.
- Suppose we have a function
static void set_to_seven(double[] xs){
xs = new double[xs.Length];
for(int i=0;i<xs.Length;i++)xs[i]=7;
}
Are the elements of the argument array set to seven in the caller's scope?
Explain.
- Suppose we have a function
static void set_to_seven(ref double x){ x=7; }
Is the argument variable set to seven in the caller's scope?
Hint: google "csharp call by reference".
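If you want to experiment with these four questions, here is a self-contained harness; the functions are renamed a-d only because two of them would otherwise have identical signatures:

```csharp
using static System.Console;
public class CallSemantics {
    // The quiz functions, renamed a-d so they can coexist in one class.
    public static void set_to_seven_a(double x){ x=7; }
    public static void set_to_seven_b(double[] xs){ for(int i=0;i<xs.Length;i++)xs[i]=7; }
    public static void set_to_seven_c(double[] xs){
        xs = new double[xs.Length];
        for(int i=0;i<xs.Length;i++)xs[i]=7;
    }
    public static void set_to_seven_d(ref double x){ x=7; }
    public static void Main(){
        double x=1;       set_to_seven_a(x);     WriteLine($"a: x={x}");
        double[] u={1,1}; set_to_seven_b(u);     WriteLine($"b: u[0]={u[0]}");
        double[] v={1,1}; set_to_seven_c(v);     WriteLine($"c: v[0]={v[0]}");
        double y=1;       set_to_seven_d(ref y); WriteLine($"d: y={y}");
    }
}
```

Running it shows which of the four calls actually modify the caller's data; the "explain" part is still yours.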
-
Multithreading, multiprocessing, parallel computing.
* Symmetric multiprocessing; threads;
* System.Threading.Thread class: preparing, running, and joining threads;
* Pitfalls in parallelism;
* make --jobs[=4]
;
+ Reading:
note "multiprocessing";
+ Exercise
"multiprocessing";
- Extra reading: System.Threading.Thread class [→];
potential pitfalls in parallelism.
+ Quiz
Suppose you calculate the harmonic sum using system's
"Parallel.For"[→]
loop like this,
double sum=0; Parallel.For( 1, N+1, delegate(int i){sum+=1.0/i;} );
why does it run slower than the ordinary serial loop,
double sum=0; for(int i=1; i<N+1; i++) {sum+=1.0/i;}
despite using more processors? (It does so on my box.)
Suppose you've got the following Makefile,
TIME1 := time --output $@
TIME2 = time --output $@
test:
@echo $(TIME1)
@echo $(TIME2)
what will it print and why?
-
Scientific plots with gnuplot/plotutils.
⊕ Reading:
"plot" command from
gnuplot documentation,
and/or
graph
manual.
⊕ Exercise "plots";
-
Systems of linear equations
[chapter "Linear equations"]
[homework "linear equations"]
* Triangular systems; back- and forward-substitution.
* QR-decomposition algorithms: modified Gram-Schmidt orthogonalization, Householder reflections, Givens rotations.
* LU-decomposition: Doolittle algorithm (→exam).
* Cholesky decomposition (→exam).
* Determinant of a matrix.
* Matrix inverse.
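As an illustration of the first bullet, a minimal back-substitution sketch in C#, assuming a nonsingular upper-triangular matrix U (class and method names are illustrative):

```csharp
using System;
public class Triangular {
    // In-place back-substitution: solves U*x=b for upper-triangular U,
    // overwriting b with the solution x.
    public static void Backsub(double[,] U, double[] b){
        int n = b.Length;
        for(int i=n-1; i>=0; i--){
            double s = b[i];
            for(int k=i+1; k<n; k++) s -= U[i,k]*b[k]; // subtract known terms
            b[i] = s/U[i,i];
        }
    }
}
```

Forward-substitution for lower-triangular systems is the mirror image, running the outer loop from i=0 upwards.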
-
Eigenvalue (EVD) and singular value (SVD) decompositions of matrices.
[chapter "Eigenvalues"]
[homework "eigenvalues"]
* Eigenvalues and eigenvectors;
Eigenvalue decomposition (diagonalization) of a matrix;
* Singular value decomposition of a matrix (→exam);
* QR/QL algorithm; Jacobi eigenvalue algorithm;
* Rank-1 and Rank-2 updates of an EVD (→exam);
* Quiz:
- What happens if you apply your Jacobi EVD algorithm to a matrix that
is already diagonal? How many operations will it take?
- What change in the formulae is needed to make the algorithm
arrange the eigenvalues in descending order?
- Suppose a symmetric square matrix A has its EVD as
A = VDVᵀ. What is its SVD?
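A compact sketch of the cyclic Jacobi eigenvalue algorithm from the bullets above: each rotation zeroes one off-diagonal element A[p,q], and the sweeps stop once no diagonal element changes (one common convergence criterion, not necessarily the homework's):

```csharp
using System;
using static System.Math;
public class Jacobi {
    // Cyclic Jacobi diagonalization of a real symmetric matrix A (in place).
    // On exit A is (numerically) diagonal and V holds the eigenvectors as columns.
    public static int Diagonalize(double[,] A, double[,] V){
        int n = A.GetLength(0);
        for(int i=0;i<n;i++) for(int j=0;j<n;j++) V[i,j] = (i==j ? 1.0 : 0.0);
        bool changed; int sweeps = 0;
        do{
            changed = false; sweeps++;
            for(int p=0; p<n-1; p++)
            for(int q=p+1; q<n; q++){
                double app=A[p,p], aqq=A[q,q], apq=A[p,q];
                double theta = 0.5*Atan2(2*apq, aqq-app); // zeroes A[p,q]
                double c=Cos(theta), s=Sin(theta);
                double new_app = c*c*app - 2*s*c*apq + s*s*aqq;
                double new_aqq = s*s*app + 2*s*c*apq + c*c*aqq;
                if(new_app!=app || new_aqq!=aqq){          // stop when nothing changes
                    changed = true;
                    A[p,p]=new_app; A[q,q]=new_aqq; A[p,q]=0; A[q,p]=0;
                    for(int i=0;i<n;i++){
                        if(i!=p && i!=q){
                            double aip=A[i,p], aiq=A[i,q];
                            A[i,p]=c*aip-s*aiq; A[p,i]=A[i,p];
                            A[i,q]=c*aiq+s*aip; A[q,i]=A[i,q];
                        }
                        double vip=V[i,p], viq=V[i,q];     // accumulate V <- V*J
                        V[i,p]=c*vip-s*viq; V[i,q]=c*viq+s*vip;
                    }
                }
            }
        }while(changed);
        return sweeps;
    }
}
```

For example, A = {{2,1},{1,2}} is driven to diag(1,3), the eigenvalues 1 and 3.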
-
Least-squares problem; Ordinary least-squares fit.
[chapter "Ordinary least-squares"]
[homework "least-squares"]
• Least-squares problem.
• Least-squares solution with QR-factorization and with SVD
factorization; Pseudoinverse of a matrix (→exam?).
• Ordinary least-squares fit, fit errors and covariance matrix.
-
Interpolation 1
[chapter "Interpolation"]
[homework "splines"]
• Polynomial interpolation;
• Spline interpolation: linear, quadratic, cubic splines;
• Object oriented and functional programming styles.
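A minimal sketch of linear spline (polyline) evaluation, with a binary search for the enclosing interval; it assumes the x-array is sorted in ascending order:

```csharp
using System;
public class Linterp {
    // Evaluate the linear interpolant through (x[i],y[i]) at point z.
    // Assumes x is sorted ascending, x.Length == y.Length >= 2.
    public static double Eval(double[] x, double[] y, double z){
        int i=0, j=x.Length-1;                 // binary search for the interval
        while(j-i>1){ int m=(i+j)/2; if(z>x[m]) i=m; else j=m; }
        double slope = (y[i+1]-y[i])/(x[i+1]-x[i]);
        return y[i] + slope*(z-x[i]);
    }
}
```

Quadratic and cubic splines keep the same table lookup and only change the local polynomial.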
-
Interpolation 2
• Akima sub-spline (→exam) and monotone cubic interpolants;
• Rational function interpolation, Berrut interpolants (→exam);
• Bilinear interpolation on a rectilinear 2D grid (→exam);
-
Ordinary differential equations (ODE) - 1
["chapter ODE"]
[homework "ODE"]
• Runge-Kutta (one-step) methods;
• Local error estimate;
• Adaptive step-size strategy;
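The one-step idea with a local error estimate can be sketched with the embedded Euler/midpoint pair: the difference between the first-order (Euler) and second-order (midpoint) results serves as the local error estimate:

```csharp
using System;
public class RK12 {
    // One step of the embedded Euler/midpoint pair for y' = f(t,y).
    // Returns the 2nd-order (midpoint) estimate of y(t+h);
    // err receives the difference between the two orders (local error estimate).
    public static double[] Step(Func<double,double[],double[]> f,
                                double t, double[] y, double h, out double[] err){
        int n = y.Length;
        double[] k0 = f(t, y);                        // Euler slope
        double[] ymid = new double[n];
        for(int i=0;i<n;i++) ymid[i] = y[i] + 0.5*h*k0[i];
        double[] k1 = f(t+0.5*h, ymid);               // midpoint slope
        double[] yh = new double[n]; err = new double[n];
        for(int i=0;i<n;i++){
            yh[i]  = y[i] + h*k1[i];                  // 2nd-order step
            err[i] = (k1[i]-k0[i])*h;                 // 2nd-order minus 1st-order
        }
        return yh;
    }
}
```

An adaptive driver then accepts the step when the norm of err is within tolerance and adjusts h accordingly.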
-
Ordinary differential equations (ODE) - 2
• Implicit, multi-step, and predictor-corrector steppers;
-
Numerical integration - 1
[chapter "Integration"]
[homework "Integration"]
• Classical Newton-Cotes quadratures
(quadratures with regularly spaced abscissas);
• Adaptive algorithms with regularly spaced abscissas;
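The adaptive idea can be sketched as a recursive quadrature on four equally spaced (open) abscissas that reuses two function values on each subdivision; node placement and tolerance handling here follow one common textbook variant, not necessarily the homework's:

```csharp
using System;
public class Adapt {
    // Recursive adaptive quadrature of f on [a,b] with open, equally spaced
    // abscissas at (b-a)*{1/6,2/6,4/6,5/6}; two points are reused per subdivision.
    public static double Integrate(Func<double,double> f, double a, double b,
                                   double acc=1e-6, double eps=1e-6){
        double f2 = f(a+2*(b-a)/6), f3 = f(a+4*(b-a)/6); // points to be reused
        return Integrate(f, a, b, acc, eps, f2, f3);
    }
    static double Integrate(Func<double,double> f, double a, double b,
                            double acc, double eps, double f2, double f3){
        double f1 = f(a+(b-a)/6), f4 = f(a+5*(b-a)/6);
        double Q = (2*f1+f2+f3+2*f4)/6*(b-a);   // higher-order rule
        double q = (  f1+f2+f3+  f4)/4*(b-a);   // lower-order rule
        double err = Math.Abs(Q-q);
        if(err <= acc + eps*Math.Abs(Q)) return Q;
        return Integrate(f, a, (a+b)/2, acc/Math.Sqrt(2), eps, f1, f2)
             + Integrate(f, (a+b)/2, b, acc/Math.Sqrt(2), eps, f3, f4);
    }
}
```

On subdivision the parent's inner points land exactly on the children's node positions, so each recursion level costs only two new function evaluations per subinterval.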
-
Numerical integration - 2
• Quadratures with optimized nodes;
Gauss and Gauss-Kronrod quadratures;
• Variable transformation quadratures; Clenshaw-Curtis quadrature;
-
Monte Carlo integration
[chapter "montecarlo"]
[homework "Monte Carlo"]
• Plain Monte Carlo sampling; pseudo- and quasi-random (= low
discrepancy) sequences.
• Importance and Stratified sampling.
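A sketch of plain Monte Carlo integration over a rectangular box, returning both the estimate and its statistical error; the fixed random seed is only for reproducibility of the example:

```csharp
using System;
public class PlainMC {
    // Plain Monte Carlo integration of f over the box [a,b] in dim dimensions.
    // Returns the estimate; err receives the estimated statistical error.
    public static double Integrate(Func<double[],double> f,
            double[] a, double[] b, int N, out double err){
        int dim = a.Length;
        var rnd = new Random(1);                   // fixed seed: reproducible example
        double V = 1; for(int i=0;i<dim;i++) V *= b[i]-a[i];
        double sum = 0, sum2 = 0;
        double[] x = new double[dim];
        for(int k=0;k<N;k++){
            for(int i=0;i<dim;i++) x[i] = a[i] + rnd.NextDouble()*(b[i]-a[i]);
            double fx = f(x); sum += fx; sum2 += fx*fx;
        }
        double mean  = sum/N;
        double sigma = Math.Sqrt(Math.Max(0, sum2/N - mean*mean));
        err = sigma*V/Math.Sqrt(N);                // error scales as 1/sqrt(N)
        return mean*V;
    }
}
```

Replacing Random with a low-discrepancy (quasi-random) sequence improves the convergence rate, which is the point of the second bullet.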
-
Nonlinear equations / root-finding
[chapter "roots"]
[homework "roots"]
• Modified Newton's method with backtracking linesearch;
• Quasi-Newton methods, rank-1 updates, Broyden's update;
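For illustration, a scalar (one-dimensional) version of Newton's method with a numerical derivative and backtracking linesearch; the homework's vector version additionally needs a linear solver (e.g. your QR decomposition) for the Newton step:

```csharp
using System;
public class Roots {
    // 1D Newton's method: numerical derivative plus simple backtracking.
    public static double Newton(Func<double,double> f, double x, double acc=1e-9){
        while(Math.Abs(f(x)) > acc){
            double fx = f(x);
            double dx = Math.Max(Math.Abs(x),1)*1e-7;   // finite-difference step
            double dfdx = (f(x+dx)-fx)/dx;
            double step = -fx/dfdx, lambda = 1;
            // backtrack until the residual decreases sufficiently
            while(Math.Abs(f(x+lambda*step)) > (1-lambda/2)*Math.Abs(fx)
                  && lambda > 1.0/64) lambda /= 2;
            x += lambda*step;
        }
        return x;
    }
}
```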
-
Minimization 1: locally convergent methods
[chapter "Minimisation"]
[homework "Minimisation"]
• Modified Newton's method with backtracking linesearch;
• Quasi-Newton methods: Broyden and SR1 updates of Hessian matrix;
• Downhill simplex (Nelder-Mead) method;
-
Minimization 2: globally convergent methods
[chapter "Optimization"]
• Randomized local minimizers;
• Simulated annealing;
• Particle swarm optimization;
-
Artificial Neural Networks
[chapter "Artificial Neural Networks"]
[homework "neural network"]
• Artificial neurons and neural networks;
• Three-layer feed-forward neural networks; input, output, and hidden
neurons;
• Network training: supervised and unsupervised learning;
-
Nonlinear least squares fit: uncertainties of the fitting parameters
[chapter "Minimisation"]
- Non-linear least squares fit: Gauss-Newton method.
- Linearization of the problem at the minimum.
- Correlation matrix of the linearized problem.
-
Fast Fourier transform and applications
[chapter "FFT"]
- Discrete Fourier Transform and applications
- Fast Fourier Transform: Danielson-Lanczos lemma and Cooley-Tukey algorithm;
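The Cooley-Tukey idea, split into even and odd samples and combine with twiddle factors, can be sketched recursively for power-of-two lengths:

```csharp
using System;
using System.Numerics;
public class FFT {
    // Recursive radix-2 Cooley-Tukey FFT; requires x.Length to be a power of 2.
    public static Complex[] Transform(Complex[] x){
        int n = x.Length;
        if(n==1) return new Complex[]{ x[0] };
        var even = new Complex[n/2]; var odd = new Complex[n/2];
        for(int i=0;i<n/2;i++){ even[i]=x[2*i]; odd[i]=x[2*i+1]; }
        var E = Transform(even); var O = Transform(odd);
        var c = new Complex[n];
        for(int k=0;k<n/2;k++){
            Complex w = Complex.FromPolarCoordinates(1, -2*Math.PI*k/n); // twiddle
            c[k]     = E[k] + w*O[k];
            c[k+n/2] = E[k] - w*O[k];    // Danielson-Lanczos combination
        }
        return c;
    }
}
```

With mono you typically need to reference the numerics assembly, e.g. mcs -r:System.Numerics fft.cs.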
-
Eigenvalues: Rayleigh quotient methods;
Generalized eigenvalue problems
• Rayleigh quotient iteration;
• Lagrange multipliers method;
• Generalized eigenvalue problems;
-
Linear algebra: power methods and Krylov subspaces
[chapter "Power methods"]
• Eigenvalues: power and inverse iteration methods (→exam);
• Eigenvalues: Arnoldi and Lanczos methods (→exam);
• Linear equations: GMRES (Generalized Minimum RESidual) method;
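A minimal sketch of the power method: repeated multiplication by A aligns the vector with the dominant eigenvector, and the Rayleigh quotient then gives the corresponding eigenvalue (assumes the dominant eigenvalue is non-degenerate and the start vector has a component along its eigenvector):

```csharp
using System;
public class Power {
    // Power iteration: estimates the largest-|λ| eigenvalue of symmetric A.
    public static double LargestEigenvalue(double[,] A, int iterations=1000){
        int n = A.GetLength(0);
        double[] v = new double[n]; v[0]=1;      // start vector
        double lambda = 0;
        for(int it=0; it<iterations; it++){
            double[] w = new double[n];          // w = A*v
            for(int i=0;i<n;i++) for(int j=0;j<n;j++) w[i] += A[i,j]*v[j];
            double norm = 0; for(int i=0;i<n;i++) norm += w[i]*w[i];
            norm = Math.Sqrt(norm);
            for(int i=0;i<n;i++) v[i] = w[i]/norm;
            lambda = 0;                          // Rayleigh quotient vᵀAv
            for(int i=0;i<n;i++) for(int j=0;j<n;j++) lambda += v[i]*A[i,j]*v[j];
        }
        return lambda;
    }
}
```

Inverse iteration is the same loop applied to (A - sI)⁻¹, which converges to the eigenvalue closest to the shift s.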
-
Numerical methods for solving the few-body (partial differential)
Schrödinger equation
- Finite difference method;
- Spectral method (basis expansion);
- Variational methods;
- Spectral variational method;
- Diffusion (Green's function) Monte Carlo;
- Artificial neural networks.