Algorithmic Differentiation of Numerical Methods
Abstract
We consider Algorithmic Differentiation (also known as Automatic Differentiation; AD) of numerical simulation programs that contain calls to direct solvers for systems of n linear equations. AD of the linear solvers yields a local overhead of O(n^3) for the computation of directional derivatives or adjoints of the solution vector with respect to the system matrix and right-hand side. The local memory requirement is of the same order in adjoint mode AD. Mathematical insight yields a reduction of the local computational complexity to O(n^2). The memory overhead in adjoint mode can likewise be reduced to O(n^2). We derive efficient tangent-linear and adjoint direct linear solvers and illustrate their use within tangent-linear and adjoint versions of the enclosing numerical simulation.
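For orientation, the reduction to O(n^2) rests on differentiating the linear system A x = b symbolically rather than differentiating the elimination algorithm line by line. A sketch of the resulting tangent-linear and adjoint models is given below; the dot/bar notation for tangents and adjoints follows common AD usage and may differ from the paper's exact conventions.

```latex
\begin{align*}
  % tangent-linear model: one extra solve with the already factorized A
  A\,\dot{x} &= \dot{b} - \dot{A}\,x \\
  % adjoint model: one solve with A^T, then increment the adjoints of b and A
  A^{T} z &= \bar{x}, \qquad
  \bar{b} \gets \bar{b} + z, \qquad
  \bar{A} \gets \bar{A} - z\,x^{T}
\end{align*}
```

Given the factorization of A from the primal solve, each of these amounts to one additional triangular solve plus a rank-one update, hence the O(n^2) local cost.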
Preprint (pdf)
Code
The code is provided for review purposes only. Compiling it requires dco/c++; please contact info@stce
Code from the paper's appendix (tar.gz); a schematic sketch of both solvers follows the list:
- tangent-linear direct linear solver (discrete/continuous)
- adjoint direct linear solver (discrete/continuous)
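The following self-contained C++ sketch illustrates the structure of such solvers using the relations above. It is not the dco/c++-based code from the archive: it uses plain dense Gaussian elimination for illustration, whereas in practice the factorization of A from the primal solve would be reused, making the extra solves O(n^2).

```cpp
// Illustrative "continuous" tangent-linear and adjoint models of a dense
// direct solve x = A^{-1} b (not the code from the archive).
#include <cmath>
#include <cstddef>
#include <utility>
#include <vector>

using Vec = std::vector<double>;
using Mat = std::vector<Vec>;  // dense, row-major

// Solve A x = b by Gaussian elimination with partial pivoting.
Vec solve(Mat A, Vec b) {
  const std::size_t n = b.size();
  for (std::size_t k = 0; k < n; ++k) {
    std::size_t p = k;  // pivot search in column k
    for (std::size_t i = k + 1; i < n; ++i)
      if (std::abs(A[i][k]) > std::abs(A[p][k])) p = i;
    std::swap(A[k], A[p]);
    std::swap(b[k], b[p]);
    for (std::size_t i = k + 1; i < n; ++i) {  // eliminate column k
      const double l = A[i][k] / A[k][k];
      for (std::size_t j = k; j < n; ++j) A[i][j] -= l * A[k][j];
      b[i] -= l * b[k];
    }
  }
  Vec x(n);
  for (std::size_t k = n; k-- > 0;) {  // back substitution
    double s = b[k];
    for (std::size_t j = k + 1; j < n; ++j) s -= A[k][j] * x[j];
    x[k] = s / A[k][k];
  }
  return x;
}

Mat transpose(const Mat& A) {
  const std::size_t n = A.size();
  Mat T(n, Vec(n));
  for (std::size_t i = 0; i < n; ++i)
    for (std::size_t j = 0; j < n; ++j) T[j][i] = A[i][j];
  return T;
}

// Tangent-linear solver: A x_dot = b_dot - A_dot * x.
Vec solve_tangent(const Mat& A, const Mat& A_dot, const Vec& b_dot, const Vec& x) {
  Vec r(b_dot);
  for (std::size_t i = 0; i < x.size(); ++i)
    for (std::size_t j = 0; j < x.size(); ++j) r[i] -= A_dot[i][j] * x[j];
  return solve(A, r);
}

// Adjoint solver: A^T z = x_bar;  b_bar += z;  A_bar -= z * x^T.
void solve_adjoint(const Mat& A, const Vec& x, const Vec& x_bar,
                   Mat& A_bar, Vec& b_bar) {
  const Vec z = solve(transpose(A), x_bar);
  for (std::size_t i = 0; i < x.size(); ++i) {
    b_bar[i] += z[i];
    for (std::size_t j = 0; j < x.size(); ++j) A_bar[i][j] -= z[i] * x[j];
  }
}
```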
Implementation of the case study using dco/c++ (tar.gz):
- Solves a (linear) PDE-constrained fitting problem with a gradient-based method; a generic sketch of such an outer loop is given below.
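As a rough illustration of what a gradient-based driver for such a fitting problem can look like, here is a hypothetical fixed-step steepest-descent loop. The objective callback, the toy quadratic objective, and all names are placeholders and not the actual dco/c++ case study, in which the gradient would be obtained from an adjoint evaluation of the PDE-constrained model.

```cpp
// Hypothetical sketch of a gradient-based outer loop; in the case study the
// gradient dJ/dp would come from an adjoint (reverse-mode) sweep with dco/c++.
#include <cstddef>
#include <functional>
#include <vector>

using Params = std::vector<double>;
// Callback returning J(p) and writing dJ/dp into g.
using ObjGrad = std::function<double(const Params&, Params&)>;

Params steepest_descent(ObjGrad f, Params p, double step, int iterations) {
  Params g(p.size());
  for (int it = 0; it < iterations; ++it) {
    f(p, g);                                    // one primal + one gradient evaluation
    for (std::size_t i = 0; i < p.size(); ++i)  // fixed-step parameter update
      p[i] -= step * g[i];
  }
  return p;
}

// Toy stand-in objective J(p) = 0.5 * ||p - target||^2 with analytic gradient,
// included only to make the sketch self-contained.
int main() {
  const Params target{1.0, -2.0, 0.5};
  auto toy = [&](const Params& p, Params& g) {
    double J = 0.0;
    for (std::size_t i = 0; i < p.size(); ++i) {
      g[i] = p[i] - target[i];
      J += 0.5 * g[i] * g[i];
    }
    return J;
  };
  steepest_descent(toy, Params(3, 0.0), 0.1, 200);
  return 0;
}
```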