Readings and topics
Main reference
- Gautschi: Walter Gautschi, Numerical Analysis, 2nd edition, Birkhäuser/Springer, 2012
Other references
- ATAP: Lloyd N. Trefethen, Approximation Theory and Approximation Practice, SIAM
- First course: Uri M. Ascher and Chen Greif, A First Course in Numerical Methods, SIAM
Last office hour
- Video
- Zoom-2021-05-06
Lecture 28
A-stable methods for ODEs, A-stability and eigenvalues, spectral methods, and a review of the class. A small Julia sketch of A-stability follows this entry.
- Readings
- Everything below
- Section 5.9
- Trefethen, Chapters 6 and 7
- Final Review Slides
- Julia
- Spectral differentiation (html) (ipynb)
- Notes
- Lecture 28
- Video
- Zoom-2021-04-29
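To make the stability discussion concrete, here is a minimal Julia sketch of the model problem y' = λy (my own illustrative code, not from the lecture materials; the step size and λ are arbitrary choices):

```julia
# A tiny A-stability demo on the stiff model problem y'(t) = λ y(t), y(0) = 1.
# Forward Euler's update is y ← (1 + hλ) y, stable only when |1 + hλ| ≤ 1.
# Backward Euler's update is y ← y / (1 - hλ), stable for every h > 0 when λ < 0.
function euler_demo(λ = -50.0, h = 0.1, n = 20)
    yf = yb = 1.0
    for _ in 1:n
        yf = (1 + h*λ) * yf     # forward Euler: |1 + hλ| = 4 here, so this blows up
        yb = yb / (1 - h*λ)     # backward Euler: decays, mirroring the true solution
    end
    yf, yb, exp(λ*h*n)          # compare with the exact value e^{λt} at t = nh
end
@show euler_demo()
```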
Lecture 27
Global error analysis and step length control
- Readings
- Section 5.7, 5.8, 5.9
- ODE Theory
- Video
- Zoom-2021-04-27
- Notes
- Lecture 27
- Julia
- ODEs and Error Analysis (html) (ipynb)
Lecture 26
One-step methods for ODEs and an intro to error analysis.
- Readings
- Section 5.3, 5.4, 5.5, 5.6, 5.8, 5.9
- ODE Theory
- Video
- Zoom-2021-04-22
- Notes
- Lecture 26
- Julia
- ODEs and Error Analysis (html) (ipynb)
Lecture 25
One-step methods for ODEs and an intro to error analysis.
- Readings
- Section 5.3, 5.4, 5.5, 5.6
- ODE Theory
- Video
- Zoom-2021-04-20
- Notes
- Lecture 25
Lecture 24
Intro to ODEs and Forward Euler. A small Julia sketch of forward Euler follows this entry.
- Readings
- Section 5.1
- Section 5.6.1
- Video
- Zoom-2021-04-15
- Notes
- Lecture 24
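As a concrete companion to the lecture topic, here is a minimal forward Euler sketch in Julia (illustrative only; the function name and the test problem are my own choices):

```julia
# Forward Euler for y'(t) = f(t, y), y(t0) = y0, with n equal steps to time T.
function forward_euler(f, y0, t0, T, n)
    h = (T - t0) / n
    t, y = t0, y0
    for _ in 1:n
        y += h * f(t, y)   # step along the tangent line
        t += h
    end
    return y
end

# Example: y' = -y, y(0) = 1, so y(1) = exp(-1).
@show forward_euler((t, y) -> -y, 1.0, 0.0, 1.0, 100) exp(-1)
```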
Lecture 23
Exam 2
Lecture 22
Newton's method, fixed points, and exam 2 review. A small Julia sketch of Newton's method follows this entry.
- Readings
- Section 4.6, 4.7
- Notes
- Lecture 22
[Missing a piece because the app crashed!]
- Video
- Zoom-2021-04-06
- Handouts
- Exam 2 Review
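A minimal Newton's method sketch in Julia, written as the fixed-point iteration named in the lecture topic (the function name, tolerance, and test problem are my own illustrative choices):

```julia
# Newton's method viewed as the fixed-point iteration x ← x - f(x)/f'(x).
function newton(f, fp, x; tol = 1e-12, maxit = 50)
    for _ in 1:maxit
        dx = f(x) / fp(x)
        x -= dx
        abs(dx) <= tol * abs(x) && break   # stop when the update is tiny
    end
    return x
end

# Example: the root of f(x) = x^2 - 2 is sqrt(2); convergence is quadratic.
@show newton(x -> x^2 - 2, x -> 2x, 1.0) sqrt(2)
```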
Lecture 21
I decided to re-use materials from a previous class recording (2016) on the convergence of the secant method, the method of fixed points for nonlinear equations, and Newton's method for nonlinear equations. There is a quick new (2021) intro video to give an overview of these ideas, but then we jump right into them! A small Julia sketch of the secant method follows this entry.
- Video
- Lecture-21-intro
- Lecture-21-Secant-analysis
- Lecture-21-Nonlinear-systems-2016
- Julia
- Secant Convergence Rates (html) (ipynb)
- 2d Bisection (html) (ipynb)
- Notes
- Lecture 21
- Readings
- Section 4.4, 4.9
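A minimal secant-method sketch in Julia (illustrative; the names and the test problem are my own):

```julia
# The secant method replaces f'(x) in Newton's method with the slope through
# the last two iterates, giving convergence of order (1 + sqrt(5))/2 ≈ 1.618.
function secant(f, x0, x1; tol = 1e-12, maxit = 50)
    f0, f1 = f(x0), f(x1)
    for _ in 1:maxit
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)   # secant step
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
        abs(x1 - x0) <= tol * abs(x1) && break
    end
    return x1
end

@show secant(x -> x^2 - 2, 1.0, 2.0) sqrt(2)
```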
Lecture 20
Rates of convergence and the method of false position. A small Julia sketch of false position follows this entry.
- Readings
- Section 4.2, 4.4
- Notes
- Lecture 20
- Julia
- How quickly sequences converge (html) (ipynb)
- Video
- Zoom-2021-03-30
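A minimal sketch of the method of false position in Julia (illustrative; the bracketing test problem is my own choice):

```julia
# False position keeps a bracket [a, b] with sign(f(a)) != sign(f(b)), like
# bisection, but picks the secant-line root instead of the midpoint.
function false_position(f, a, b; maxit = 100)
    fa, fb = f(a), f(b)
    @assert fa * fb < 0 "need a sign change on [a, b]"
    x = a
    for _ in 1:maxit
        x = b - fb * (b - a) / (fb - fa)   # root of the line through (a,fa), (b,fb)
        fx = f(x)
        fx == 0 && break
        if fa * fx < 0
            b, fb = x, fx                  # root lies in [a, x]
        else
            a, fa = x, fx                  # root lies in [x, b]
        end
    end
    return x
end

@show false_position(x -> x^2 - 2, 1.0, 2.0) sqrt(2)
```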
Lecture 19
Intro to nonlinear equation solving, example problems, and the method of bisection. A small Julia sketch of bisection follows this entry.
- Readings
- Section 4.1, 4.3
- Bisecting roots in Julia: some interesting details that arise with bisection in floating point.
- Notes
- Lecture 19
- Video
- Zoom-2021-03-25
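A minimal bisection sketch in Julia, including a floating point stopping test in the spirit of the reading above (the code itself is my own illustration, not the notebook's):

```julia
# Bisection: halve a bracketing interval until it cannot shrink any further
# in floating point. Each step gives one more correct bit of the root.
function bisect(f, a, b)
    fa = f(a)
    @assert fa * f(b) < 0 "need a sign change on [a, b]"
    while true
        m = (a + b) / 2
        (m == a || m == b) && return m   # no representable point strictly between a and b
        fm = f(m)
        if fa * fm <= 0
            b = m            # sign change in [a, m]
        else
            a, fa = m, fm    # sign change in [m, b]
        end
    end
end

@show bisect(x -> x^2 - 2, 1.0, 2.0) sqrt(2)
```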
Lecture 18
We continued with orthogonal polynomials and Gaussian quadrature and finished up
with how to compute eigenvalues of the Jacobi matrix to get the nodes of the Gauss
quadrature formula! And a bit about Gauss-Hermite quadrature. A small Julia sketch
of the Jacobi-matrix approach follows this entry.
- Readings
- Integration and Quadrature, Chapter 12, Spectral Methods in Matlab, Trefethen
- Section 3.2.2, 3.2.3, 3.2.5
- Section 3.2.4, 3.2.7 (Not Richardson though)
- Notes
- Lecture 18
- Video
- Zoom-2021-03-23
- Julia
- Orthogonal polynomials, eigenvalues, and quadrature (html) (ipynb)
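A minimal sketch of the Jacobi matrix idea for Gauss-Legendre quadrature in Julia, assuming the standard Legendre recurrence coefficients β_k = k^2/(4k^2 - 1) (illustrative code, not the class notebook):

```julia
using LinearAlgebra

# Golub-Welsch for Gauss-Legendre quadrature on [-1, 1]: the nodes are the
# eigenvalues of the symmetric tridiagonal Jacobi matrix built from the
# three-term recurrence for Legendre polynomials, and the weights come from
# the first components of the eigenvectors.
function gauss_legendre(n)
    β = [k^2 / (4k^2 - 1) for k in 1:n-1]
    F = eigen(SymTridiagonal(zeros(n), sqrt.(β)))
    x = F.values
    w = 2 .* F.vectors[1, :].^2      # μ₀ = length of [-1, 1] for the weight w(x) = 1
    return x, w
end

# Sanity check: a 5-point rule integrates x^8 over [-1, 1] exactly (2/9).
x, w = gauss_legendre(5)
@show sum(w .* x.^8) 2/9
```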
Lecture 17
Interpolatory quadrature via integrating polynomials exactly, and getting
better accuracy via the node polynomial. We discussed quadrature via the
method of undetermined coefficients, which is a handy way to compute these
rules, and we ended on how orthogonal polynomials arise in quadrature. A small
Julia sketch of undetermined coefficients follows this entry.
- Readings
- Section 3.2.2
- Section 3.2.5 Method of undetermined coefficients
- Notes
- Lecture 17
- Video
- Zoom-2021-03-16
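A minimal sketch of the method of undetermined coefficients in Julia, recovering Simpson's rule on [0, 1] (my own illustrative example):

```julia
# Method of undetermined coefficients: choose quadrature weights so the rule
# integrates 1, x, ..., x^(n-1) exactly on [0, 1]. This is a linear system
# with a Vandermonde-type matrix.
nodes = [0.0, 0.5, 1.0]
n = length(nodes)
V = [x^k for k in 0:n-1, x in nodes]    # row k enforces exactness on x^k
m = [1 / (k + 1) for k in 0:n-1]        # moments: ∫₀¹ x^k dx
w = V \ m
@show w        # recovers Simpson's rule weights [1/6, 2/3, 1/6]
```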
Lecture 16
We saw the introduction of numerical integration with the trapezoidal
and Simpson's rules and started thinking about interpolatory methods.
A small Julia sketch of both rules follows this entry.
- Readings
- Gautschi Section 3.2 (Trapezoidal and Simpson's rule)
- Section 3.2.2
- Notes
- Lecture 16
- Video
- Zoom-2021-03-11
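A minimal Julia sketch of the composite trapezoidal and Simpson's rules (illustrative; the test integral is my own choice):

```julia
# Composite trapezoidal and Simpson's rules on [a, b] with n subintervals
# (n even for Simpson). Trapezoid is O(h^2); Simpson is O(h^4).
function trapezoid(f, a, b, n)
    h = (b - a) / n
    h * (f(a)/2 + sum(f(a + k*h) for k in 1:n-1) + f(b)/2)
end

function simpson(f, a, b, n)
    @assert iseven(n)
    h = (b - a) / n
    s = f(a) + f(b) +
        4 * sum((f(a + k*h) for k in 1:2:n-1); init = 0.0) +
        2 * sum((f(a + k*h) for k in 2:2:n-2); init = 0.0)
    h/3 * s
end

# ∫₀¹ e^x dx = e - 1; compare the two rules at the same cost.
@show trapezoid(exp, 0, 1, 16) - (exp(1) - 1)
@show simpson(exp, 0, 1, 16) - (exp(1) - 1)
```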
Lecture 15
We saw a number of ways to approximate derivatives, starting
from polynomial interpolation and then going through Taylor
series and finite difference methods, before reviewing
Gautschi's perspective in the textbook. A small Julia sketch of
finite differences follows this entry.
- Readings
- Gautschi Section 3.1
- Notes
- Lecture 15
- Video
- Zoom-2021-03-09
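A minimal Julia sketch contrasting the first-order forward difference with the second-order central difference (the function and point are my own illustrative choices):

```julia
# Finite differences from Taylor series: the forward difference is O(h),
# the central difference is O(h^2). Halving h should cut the errors by
# factors of about 2 and 4, respectively.
f, fp, x = sin, cos, 1.0
for h in (1e-1, 5e-2, 2.5e-2)
    fwd = (f(x + h) - f(x)) / h
    ctr = (f(x + h) - f(x - h)) / (2h)
    println("h = $h  forward err = $(abs(fwd - fp(x)))  central err = $(abs(ctr - fp(x)))")
end
```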
Lecture 14
We had the first exam.
Lecture 13
We finished up splines and then started into derivates.
There is some julia
code to illustrate some non-obvious effects here, including
a good choice for the value of h in a finite difference
approximation. We also reviewed for the first exam on Thursday.
- Notes
- Lecture 13
- Video
- Zoom-2021-03-02
- Julia
- Approximating derivatives (html) (ipynb)
- Handouts
- Exam 1 Review
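A minimal Julia sketch of the "good choice of h" effect: truncation error shrinks with h while rounding error grows like eps/h (the test function is my own choice):

```julia
# Why h can't be too small: rounding error in f(x+h) - f(x) grows like eps/h
# while truncation error shrinks like h, so the forward-difference error is
# minimized near h ≈ sqrt(eps()) ≈ 1e-8 for Float64.
for k in 1:15
    h = 10.0^-k
    err = abs((exp(1.0 + h) - exp(1.0)) / h - exp(1.0))
    println("h = 1e-$k   error = $err")
end
```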
Lecture 12
Today was all about splines and piecewise interpolants. We saw how
piecewise interpolants are arbitrarily accurate and how easy it is to
get a good approximation of the best possible piecewise interpolant.
A small Julia sketch of piecewise linear interpolation follows this entry.
- Notes
- Lecture 12
- Video
- Zoom-2021-02-25
- Readings
- Gautschi Section 2.3
- Chapter 11: this has a good treatment of piecewise interpolants from the basis function perspective.
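A minimal Julia sketch of evaluating a piecewise linear interpolant and watching its error drop as the mesh refines (illustrative code, not the course notebook):

```julia
# Evaluate the piecewise linear interpolant of data y on nodes t[1] < ... < t[n].
# The error is O(h^2) in the node spacing h, so refining the mesh makes the
# interpolant arbitrarily accurate.
function pwlinear(t, y, x)
    i = clamp(searchsortedlast(t, x), 1, length(t) - 1)   # interval containing x
    θ = (x - t[i]) / (t[i+1] - t[i])
    (1 - θ) * y[i] + θ * y[i+1]
end

for n in (5, 10, 20)
    t = range(0, 1, length = n)
    y = sin.(t)
    err = maximum(abs(pwlinear(t, y, x) - sin(x)) for x in range(0, 1, length = 1001))
    println("n = $n   max error = $err")   # error is O(h^2) in the spacing h
end
```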
Lecture 11
We reviewed Newton interpolation and used it to derive Hermite
interpolation by studying what it would mean to take divided
differences at the same point. A small Julia sketch of divided
differences follows this entry.
- Notes
- Lecture 11
- Video
- Zoom-2021-02-23
- Readings
- Gautschi Section 2.2
- Julia
- Newton interpolation (html) (ipynb) (wrong name, it's really Newton)
- Handouts
- Hermite Interpolation (Not finished)
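A minimal Julia sketch of Newton interpolation via a divided-difference table (the node set and test function are my own illustrative choices):

```julia
# Newton interpolation: build the divided-difference table in place, then
# evaluate the Newton form with a Horner-like nested scheme.
function newton_coeffs(t, y)
    c = float.(y)                  # copy so we don't overwrite y
    for j in 2:length(t), i in length(t):-1:j
        c[i] = (c[i] - c[i-1]) / (t[i] - t[i-j+1])   # divided differences
    end
    return c
end

function newton_eval(t, c, x)
    p = c[end]
    for i in length(c)-1:-1:1
        p = p * (x - t[i]) + c[i]  # nested (Horner-style) evaluation
    end
    return p
end

t = [0.0, 1.0, 2.0, 3.0]
c = newton_coeffs(t, t.^3)         # interpolate f(x) = x^3
@show newton_eval(t, c, 1.5) 1.5^3 # a degree-3 interpolant reproduces x^3
```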
Lecture 10
Barycentric interpolation and Newton interpolation. A small Julia sketch of barycentric evaluation follows this entry.
- Notes
- Lecture 10
- Video
- Zoom-2021-02-18
- Readings
- Gautschi Section 2.2
- Berrut and Trefethen, Barycentric Lagrange Interpolation
- Julia
- Evaluating Polynomial Interpolants (html) (ipynb)
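A minimal Julia sketch of the second (true) barycentric formula (illustrative only; see Berrut and Trefethen above for the real treatment):

```julia
# Barycentric (second-form) Lagrange interpolation: O(n^2) work once for the
# weights, then O(n) per evaluation, and numerically stable.
function barycentric_weights(t)
    n = length(t)
    [1 / prod(t[j] - t[k] for k in 1:n if k != j) for j in 1:n]
end

function barycentric_eval(t, y, w, x)
    for (j, tj) in enumerate(t)
        x == tj && return y[j]      # exactly at a node
    end
    num = sum(w[j] * y[j] / (x - t[j]) for j in eachindex(t))
    den = sum(w[j] / (x - t[j]) for j in eachindex(t))
    num / den
end

t = range(-1, 1, length = 9)
w = barycentric_weights(t)
@show barycentric_eval(t, cos.(t), w, 0.3) cos(0.3)
```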
Lecture 9
Due to the snow emergency and the daycare closure, we had to improvise a bit.
I had grand visions of recording a fun lecture on the Lagrange interpolation
formula -- or what I think of as the pointwise cancellation form of the interpolating
polynomial -- along with Chebyshev nodes and Chebyshev polynomials. However, my first
attempt at doing this quickly failed. So in the interest of getting you material as quickly
as possible, I'm going to use an old lecture from Ahmed Sameh on this same topic.
I'm so sorry I didn't have time to do more for you, but I want to make sure you are
equipped for the homework.
Guest lecture by Ahmed Sameh on Lagrange interpolation, the error
formula, and Chebyshev nodes.
- Video
- Pointwise and Least Squares Error: 2021 video on a helpful relationship between pointwise error and least squares error
- CS514-2016-Lecture-9 (2016 guest lecture by Ahmed Sameh)
- Readings
- Gautschi Section 2.2
Lecture 8
We covered the normal equations, the Weierstrass approximation theorem,
and orthogonal functions. A small Julia sketch of the normal equations
follows this entry.
- Notes
- Lecture 8
- Video
- Zoom-2021-02-11
- Readings
- Gautschi Section 2.1
- Chapter 6
- Handouts
- Intro to Approx Theory
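A minimal Julia sketch of a least squares fit via the normal equations (the data and polynomial degree are my own illustrative choices):

```julia
# Least squares fit of a degree-2 polynomial to data: the normal equations
# are (A'A) c = A'b for the Vandermonde-type matrix A. (In practice A \ b,
# which uses a QR factorization, is the better-conditioned choice.)
x = range(0, 1, length = 20)
b = @. exp(x)                       # data to fit
A = [xi^k for xi in x, k in 0:2]    # columns: 1, x, x^2
c = (A'A) \ (A'b)                   # solve the normal equations
@show c
@show maximum(abs.(A*c .- b))       # small residual: exp is nearly quadratic on [0,1]
```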
Lecture 7
We saw how to formalize the best approximation problem in terms
of function norms. We got through integrals with a discrete measure,
then did a quick demo in Julia.
- Notes
- Lecture 7
- Video
- Zoom 2021-02-09
- Handouts
- Intro to Approx Theory
- Readings
- Gautschi Section 2.0
- Trefethen, Numerically computing with functions
- Chapter 1-3
- Links
- chebfun
- ApproxFun
- Julia
- Approximating functions with Julia (html) (ipynb)
Lecture 6
We did more Julia, derived the condition number of a matrix, and saw
some issues with the variance computation, then started into Chapter 2
on functions. A small Julia sketch of the variance issue follows this entry.
- Notes
- Lecture 6
- Readings
- Gautschi Section 2.0
- Sample variance computations
- Julia
- Julia In-class
- Video
- Zoom 2021-02-04
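A minimal Julia sketch of the variance issue: the one-pass "textbook" formula cancels catastrophically when the mean is large relative to the spread (the data here is my own illustrative choice):

```julia
# Two ways to compute a sample variance. The one-pass shortcut formula
# (sum(x^2) - n*mean^2)/(n-1) subtracts two nearly equal numbers when the
# mean is large relative to the spread, and cancellation destroys the answer.
x = 1e8 .+ [1.0, 2.0, 3.0, 4.0]     # tiny spread around a huge mean
n = length(x)
m = sum(x) / n
two_pass = sum((x .- m).^2) / (n - 1)      # stable: subtract the mean first
one_pass = (sum(x.^2) - n*m^2) / (n - 1)   # cancellation-prone shortcut
@show two_pass one_pass                    # the exact answer is 5/3
```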
Lecture 5
Today we completed our study of the overall floating point error
computation using the condition number of an algorithm. We saw
a few examples too, and then had a demo of Julia.
- Handouts
- Computer Error
- Notes
- Lecture 5
- Readings
- Gautschi Section 1.4, 1.5
- Julia
- Julia Intro (2021)
- Video
- Zoom 2021-02-02
- Zoom 2021-02-02-Julia: an extra little bit of lecture on Julia
Lecture 4
We introduced condition numbers for a function and studied two types:
a sharp estimate based on the condition numbers of the constituent
gradients, and a weaker condition number based on the norm of the Jacobian.
A small Julia sketch of a scalar condition number follows this entry.
- Readings
- Gautschi Section 1.3
- Notes
- Lecture 4
- Handouts
- Computer Error
- Video
- Zoom 2021-01-28
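A minimal Julia sketch of a scalar relative condition number (my own illustrative example; it also previews the difference between a well-conditioned problem and an inaccurate formula):

```julia
# Relative condition number of a scalar function, κ(x) = |x f'(x) / f(x)|.
# For f(x) = 1 - cos(x) near 0 the condition number is modest (≈ 2), yet the
# obvious formula still loses digits: conditioning of a problem and stability
# of an algorithm are different things.
κ(f, fp, x) = abs(x * fp(x) / f(x))
@show κ(x -> 1 - cos(x), sin, 1e-5)      # ≈ 2: a well-conditioned problem

naive = 1 - cos(1e-5)                    # the obvious formula
accurate = 2sin(1e-5 / 2)^2              # algebraically identical, but stable
@show (naive - accurate) / accurate      # relative error far above eps()
```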
Lecture 3
We got through the analysis of the fundamental floating point operations
with respect to input errors. A small Julia sketch of how subtraction
amplifies input error follows this entry.
- Notes
- Lecture 3
- Readings
- Gautschi Section 1.2
- Pentium FDIV Bug
- Fun reading that mentions cancellation as a plot point
- Once Dead by Richard Phillips (see the intro to Chapter 55, page 208)
- Video
- Zoom 2021-01-26
- Syllabus
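A minimal Julia sketch of how subtraction of nearly equal numbers amplifies relative input error (the numbers are my own illustrative choices):

```julia
# Each floating point input carries a relative error of about eps() ≈ 2.2e-16,
# but subtracting nearly equal numbers amplifies it: the computed (1 + x) - 1
# for tiny x can lose most of its digits relative to the exact answer x.
x = 1e-12
@show ((1 + x) - 1 - x) / x     # relative error of the computed difference
@show eps()                     # versus unit roundoff
```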
Helpful Julia Videos
These are from another class. They are more introductory than our type of material, but ought to be helpful. Some of the details are outdated, but most should still work.
- Julia-01 - Introduction -- getting started. But see https://techytok.com/julia-vscode/ instead for the modern equivalent. Everything else should be adaptable.
- Julia-02 - Types
- Julia-03 - Plots
- Julia-04 - Features
- Julia-05 - Control and Scope
- Julia-06 - Problems
- Julia-07 - Birthday Paradox
- Julia-08 - Julia Set
- Julia-09 - Julia Set P2
Lecture 2
We reviewed sources of numerical error in computers and got through the
format of IEEE floating point arithmetic and how the numbers are stored.
A small Julia sketch of the bit layout follows this entry.
- Notes
- Lecture 2
- Video
- Floating Point Accuracy
- Zoom 2021-01-21
- Readings
- Gautschi Section 1.1
- Constructions of the real numbers
- Decimal arithmetic
- Integers and floating point numbers in Julia: most of the things described here for Float64 will also work in Matlab, except Julia has some nicer functions.
- IEEE Floating point
- Michael Overton, Numerical Computing with IEEE Floating Point Arithmetic - a great textbook on IEEE arithmetic!
- Julia
- IEEE Floats in Julia
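A minimal Julia sketch of the Float64 storage format (illustrative; bitstring, exponent, and significand are standard Julia functions):

```julia
# How a Float64 is stored: 1 sign bit, 11 exponent bits, 52 fraction bits.
bits = bitstring(1.5)
@show bits[1]          # sign bit
@show bits[2:12]       # biased exponent: 01111111111 = 1023, i.e. 2^0
@show bits[13:64]      # fraction: 1.5 = (1.1)_2, so it starts 1000...
@show exponent(1.5) significand(1.5)
```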
Lecture 1
We introduced the class and reviewed the syllabus.
- Readings
- Syllabus
- What is numerical analysis, Trefethen
- Fast inverse square root, Wikipedia
- Notes or Slides
- Lecture 1
- Video
- Zoom-2021-01-19
- Julia
- Julia Intro