Technical Talk: Michele Pagani on automatic differentiation
On June 16 we had the pleasure of hosting a technical talk by Prof. Michele Pagani of the Institut de Recherche en Informatique Fondamentale (IRIF), Université de Paris.
Abstract:
Backpropagation is a classic automatic differentiation algorithm computing the gradient of functions specified by a certain class of simple, first-order programs, called computational graphs. It is a fundamental tool in several fields, most notably machine learning, where it is the key to efficiently training (deep) neural networks. Recent years have witnessed the rapid growth of a research field called differentiable programming, the aim of which is to express computational graphs more synthetically and modularly by resorting to actual programming languages endowed with control flow operators and higher-order combinators, such as map and fold. We extend the backpropagation algorithm to a paradigmatic example of such a programming language: we define a compositional program transformation from PCF (a Turing-complete simply-typed lambda-calculus) to itself augmented with a notion of linear negation, and prove that this computes almost everywhere the gradient of the source program with the same efficiency as first-order backpropagation. The transformation is completely effect-free and thus provides a purely logical understanding of the dynamics of backpropagation.
Joint work with Alois Brunel (Deepomatic) and Damiano Mazza (LIPN)
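For readers unfamiliar with the classic first-order algorithm the abstract starts from, here is a minimal sketch of reverse-mode automatic differentiation (backpropagation) over a computational graph. It is only an illustration: the Node/add/mul/backprop names are hypothetical, and this is the standard first-order algorithm, not the PCF program transformation presented in the talk.

```python
# Minimal reverse-mode automatic differentiation (backpropagation) over a
# computational graph. Illustrative only: the Node/add/mul/backprop names
# are hypothetical and this is the classic first-order algorithm, not the
# PCF program transformation described in the talk.

class Node:
    def __init__(self, value, parents=()):
        self.value = value        # result of the forward computation
        self.parents = parents    # pairs (parent node, local partial derivative)
        self.grad = 0.0           # adjoint, accumulated during the backward pass

def add(x, y):
    return Node(x.value + y.value, [(x, 1.0), (y, 1.0)])

def mul(x, y):
    return Node(x.value * y.value, [(x, y.value), (y, x.value)])

def backprop(output):
    # Topologically sort the graph so each node's adjoint is complete
    # before it is pushed back to its parents, then apply the chain rule.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, local_grad in node.parents:
            parent.grad += node.grad * local_grad

# Example: f(a, b) = (a + b) * b at a = 2, b = 3.
a, b = Node(2.0), Node(3.0)
f = mul(add(a, b), b)
backprop(f)
print(f.value, a.grad, b.grad)   # 15.0, df/da = 3.0, df/db = 8.0
```

The work presented in the talk can be seen as extending this graph-based procedure to full higher-order programs by means of a compositional source-to-source transformation.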
The talk was recorded.