Todd Waugh Ambridge: Global Optimisation via Constructive Real Numbers
Efficient gradient-descent-style local optimisation algorithms have been studied extensively and are applied abundantly throughout deep learning. By contrast, convergent global optimisation algorithms that never yield an incorrect result, such as branch-and-bound methods, are far less available in practice. Optimisation is a core component of supervised machine learning, with broad applications to function approximation algorithms such as (non-parametric) interpolation and (parametric) regression. The convergence properties of interpolation have been studied extensively via Weierstrass-style theorems, whereas the convergence of regression has received comparatively little attention.
In this talk, I will introduce the concepts behind convergent global optimisation and discuss why floating-point real numbers are an inappropriate data type for such algorithms. This leads us to explore data types for arbitrary-precision constructive real numbers, and to show how convergent global optimisation algorithms can be built from these types. Lastly, I will apply this framework to supervised machine learning, and show how convergence properties for regression on arbitrary ‘searchable’ data types can be satisfied via convergent global optimisation.
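For readers who have not met constructive real numbers before, here is a minimal sketch in Haskell (the talk's own development is not in Haskell, and the names `CReal`, `sqrt2`, and `globalMin` are illustrative, not taken from the talk). It encodes a real as a function from a requested precision to a rational approximation, and pairs it with a naive Lipschitz grid search, a simple stand-in for the branch-and-bound-style methods mentioned above, as a convergent global minimiser:

```haskell
import Data.Ratio ((%))

-- One standard encoding: a constructive real is a function that, given a
-- precision n, returns a rational within 2^-n of the number it denotes.
type CReal = Int -> Rational

-- Example: the square root of 2, computed by interval bisection.
sqrt2 :: CReal
sqrt2 n = go 0 2 (n + 1)
  where
    go lo hi 0 = (lo + hi) / 2
    go lo hi k
      | mid * mid <= 2 = go mid hi (k - 1)
      | otherwise      = go lo mid (k - 1)
      where mid = (lo + hi) / 2

-- Naive convergent global minimisation on [lo, hi]: if f is Lipschitz with
-- constant lip, sampling at spacing d guarantees the grid minimum is within
-- lip * d / 2 of the true global minimum, so shrinking eps gives convergence.
-- (Assumes hi > lo and lip > 0; a sketch, not the talk's algorithm.)
globalMin :: Rational -> Rational -> Rational
          -> (Rational -> Rational) -> Rational -> Rational
globalMin lip lo hi f eps = minimum (map f grid)
  where
    n    = max 1 (ceiling (lip * (hi - lo) / (2 * eps))) :: Integer
    grid = [lo + (hi - lo) * (i % n) | i <- [0 .. n]]

main :: IO ()
main = do
  -- Floating point has only finitely many values, so basic identities fail:
  print (0.1 + 0.2 == (0.3 :: Double))                       -- False
  -- A constructive real can be queried at any precision:
  print (fromRational (sqrt2 50) :: Double)                   -- ~1.4142135623...
  -- Global minimum of (x - 1)^2 on [0, 3] to within 1/100:
  print (fromRational (globalMin 4 0 3 (\x -> (x - 1) ^ 2) (1 % 100)) :: Double)
```

Unlike a `Double`, a `CReal` can be queried at any precision, which is exactly the property a convergent optimiser needs in order to certify that its answer lies within a stated error of the true global minimum.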
Date: 2021/03/11
Slides: Convergent Global Optimisation via Constructive Real Numbers