Max Aehle, Reverse-Mode Automatic Differentiation of Compiled Programs

Algorithmic differentiation (AD) is a set of techniques to obtain accurate derivatives of computer-implemented functions. For gradient-based numerical optimization, the reverse mode of AD is especially well suited: the run time it needs to compute the gradient of the objective function is proportional to the run time of the objective function itself, and independent of the number of design parameters.
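
As a minimal illustration of this property (not taken from the talk or from Derivgrind), the following C++ sketch records a small scalar computation on a tape and then performs a single reverse sweep; that one sweep yields all partial derivatives at once, regardless of how many inputs there are.

```cpp
// Minimal reverse-mode AD tape sketch (illustrative only, not Derivgrind's API).
#include <cstdio>
#include <vector>

struct TapeEntry {          // one recorded elementary operation
  int arg1, arg2;           // indices of the arguments (-1 if unused)
  double d1, d2;            // partial derivatives w.r.t. the arguments
};

std::vector<TapeEntry> tape;
std::vector<double> value;  // primal values, indexed like the tape

int record(double v, int a1 = -1, double d1 = 0, int a2 = -1, double d2 = 0) {
  tape.push_back({a1, a2, d1, d2});
  value.push_back(v);
  return (int)tape.size() - 1;
}

int main() {
  // f(x0, x1) = x0 * x1 + x0, evaluated at (3, 4)
  int x0 = record(3.0), x1 = record(4.0);
  int p  = record(value[x0] * value[x1], x0, value[x1], x1, value[x0]);
  int f  = record(value[p] + value[x0], p, 1.0, x0, 1.0);

  // One reverse sweep over the tape computes the whole gradient.
  std::vector<double> adjoint(tape.size(), 0.0);
  adjoint[f] = 1.0;  // seed df/df = 1
  for (int i = (int)tape.size() - 1; i >= 0; --i) {
    if (tape[i].arg1 >= 0) adjoint[tape[i].arg1] += tape[i].d1 * adjoint[i];
    if (tape[i].arg2 >= 0) adjoint[tape[i].arg2] += tape[i].d2 * adjoint[i];
  }
  std::printf("df/dx0 = %g, df/dx1 = %g\n", adjoint[x0], adjoint[x1]);  // 5 and 3
}
```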

In practice, classical AD tools require that the source code of the computer-implemented function is available and written in one of a limited set of supported programming languages. As a step towards making AD applicable to cross-language or partially closed-source client programs, we developed the new AD tool Derivgrind [1]. Derivgrind leverages the dynamic binary instrumentation framework Valgrind to add forward-mode AD logic to the machine code of a compiled program.
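
To illustrate what this forward-mode AD logic computes per elementary operation, here is a dual-number sketch at the source level (operator overloading); Derivgrind instead injects the analogous dot-value updates directly into the machine code via Valgrind, so no source code is needed.

```cpp
// Dual-number sketch of forward-mode AD (source-level operator overloading,
// for illustration only; not how Derivgrind is implemented internally).
#include <cstdio>

struct Dual {
  double val;  // primal value
  double dot;  // derivative w.r.t. the chosen input direction
};

Dual operator*(Dual a, Dual b) { return {a.val * b.val, a.dot * b.val + a.val * b.dot}; }
Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.dot + b.dot}; }

int main() {
  Dual x0{3.0, 1.0};  // seed: differentiate w.r.t. x0
  Dual x1{4.0, 0.0};
  Dual f = x0 * x1 + x0;  // f = 15, df/dx0 = 5
  std::printf("f = %g, df/dx0 = %g\n", f.val, f.dot);
}
```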

In this talk, we present the new index-handling and tape-recording capabilities that we have added to Derivgrind over the past months [2]. In combination with a simple tape evaluator program, they enable operator-overloading-style reverse-mode AD for compiled programs.
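
As a rough sketch of the idea of a stand-alone tape evaluator (the actual tape format written by Derivgrind is described in [2]; the file name and record layout below are hypothetical), such a program reads the recorded elementary operations and replays them backwards to accumulate adjoints, separately from the instrumented run that recorded them.

```cpp
// Hypothetical tape evaluator sketch. Each line of "tape.txt" is assumed to
// hold one recorded operation:  result_index arg1_index partial1 arg2_index partial2
// with index 0 meaning "no argument".
#include <cstdio>
#include <vector>

struct Entry { long res, a1, a2; double d1, d2; };

int main() {
  std::FILE* fp = std::fopen("tape.txt", "r");
  if (!fp) { std::fprintf(stderr, "cannot open tape.txt\n"); return 1; }
  std::vector<Entry> tape;
  Entry e;
  long maxIndex = 0;
  while (std::fscanf(fp, "%ld %ld %lf %ld %lf", &e.res, &e.a1, &e.d1, &e.a2, &e.d2) == 5) {
    tape.push_back(e);
    if (e.res > maxIndex) maxIndex = e.res;
  }
  std::fclose(fp);

  std::vector<double> adjoint(maxIndex + 1, 0.0);
  adjoint[maxIndex] = 1.0;  // seed the (assumed single) output, taken as the highest index
  for (auto it = tape.rbegin(); it != tape.rend(); ++it) {  // reverse sweep
    double bar = adjoint[it->res];
    if (it->a1 > 0) adjoint[it->a1] += it->d1 * bar;
    if (it->a2 > 0) adjoint[it->a2] += it->d2 * bar;
  }
  for (long i = 1; i <= maxIndex; ++i)
    std::printf("adjoint[%ld] = %g\n", i, adjoint[i]);
}
```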

[1] Max Aehle, Johannes Blühdorn, Max Sagebaum, Nicolas R. Gauger. Forward-Mode Automatic Differentiation of Compiled Programs. arXiv:2209.01895, 2022.
[2] Max Aehle, Johannes Blühdorn, Max Sagebaum, Nicolas R. Gauger. Reverse-Mode Automatic Differentiation of Compiled Programs. arXiv:2212.13760, 2022.

How to join online

You can join online via Zoom, using the following link:
https://uni-kl-de.zoom.us/j/63123116305?pwd=Yko3WU9ZblpGR3lGUkVTV1kzMCtUUT09

Speaker: Max Aehle, Chair for Scientific Computing (SciComp), University of Kaiserslautern-Landau (RPTU)

Time: 11:45 a.m.

Location: Hybrid (Room 32-349 and via Zoom)