Year: 2021
Efficient gradient descent-style local optimisation algorithms have been studied extensively and applied widely throughout deep learning. By contrast, convergent global optimisation algorithms that never yield an incorrect result, such as branch-and-bound methods, are much less readily available in practice. Optimisation is a core component of supervised machine learning, having broad […]
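As an illustration of the local methods the excerpt mentions (not the post's own algorithm), a minimal gradient descent sketch might look like the following; the quadratic objective, learning rate, and step count are arbitrary choices for demonstration.

```python
# Illustrative sketch only: plain gradient descent on the 1-D quadratic
# f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient, starting from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Converges towards the minimiser x = 3 of f.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Unlike a branch-and-bound method, a run like this carries no certificate that `x_min` is a global optimum; it only follows the local gradient signal.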
The first Huawei Lab Workshop of 2021 will be held from the 14th to the 16th of June (mornings in Edinburgh, late afternoon in China). The Huawei Lab June workshop is the forum for researchers to share findings from their Huawei-sponsored research projects through discussion, posters and reports. Attendees will have the opportunity to hear […]
Abstract String diagrams are becoming the established mathematical language of diagrammatic reasoning, with the mantra of ‘only connectivity matters’: equal terms are represented as isomorphic (or isotopic) diagrams. Unfortunately, when adding more structure to categories in the form of additional axioms, this mantra is lost: we must now consider diagrams up to rewriting. To perform […]
Time: 2021-02-11 09:30-10:30 (UTC+00:00, Edinburgh). Video of the talk: https://youtu.be/sMWOQNfYIUw Speaker: James Wood, University of Strathclyde. Abstract: The metatheory of simple type systems presented using de Bruijn indices is well understood. We know to follow the principle that variable binding is the only interaction between the context and the typing rules other than the variable rule. From […]
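For readers unfamiliar with the representation the abstract builds on, here is a hypothetical minimal sketch of untyped lambda terms with de Bruijn indices, where `Var(0)` refers to the nearest enclosing binder and names never appear; the constructor names and helper functions are my own choices, not from the talk.

```python
# Hypothetical sketch: lambda terms with de Bruijn indices.
from dataclasses import dataclass

@dataclass
class Var: idx: int          # bound variable, counted from the innermost binder
@dataclass
class Lam: body: object      # lambda abstraction (binds index 0 in body)
@dataclass
class App: fn: object; arg: object  # application

def shift(t, d, cutoff=0):
    """Add d to every free variable (index >= cutoff) in t."""
    if isinstance(t, Var):
        return Var(t.idx + d) if t.idx >= cutoff else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, cutoff + 1))
    return App(shift(t.fn, d, cutoff), shift(t.arg, d, cutoff))

def subst(t, j, s):
    """Substitute s for variable j in t, adjusting indices under binders."""
    if isinstance(t, Var):
        return s if t.idx == j else t
    if isinstance(t, Lam):
        return Lam(subst(t.body, j + 1, shift(s, 1)))
    return App(subst(t.fn, j, s), subst(t.arg, j, s))

def beta(app):
    """One beta step: reduce App(Lam(body), arg)."""
    return shift(subst(app.fn.body, 0, shift(app.arg, 1)), -1)

# (\x. x) y reduces to y: App(Lam(Var(0)), Var(5)) -> Var(5)
result = beta(App(Lam(Var(0)), Var(5)))
```

The index bookkeeping in `shift` and `subst` is exactly the kind of boilerplate whose metatheory the talk describes as well understood for simple type systems.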
Jesse Sigal, University of Edinburgh. Automatic differentiation (AD) is an important family of algorithms which enables derivative-based optimization. We show that AD can be implemented simply with effects and handlers by doing so in the Frank language. By considering how our implementation behaves in Frank's operational semantics, we show how our code performs the […]
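This is not the Frank effects-and-handlers implementation the abstract describes; purely as an illustration of what AD computes, a forward-mode sketch using dual numbers might look like this (class and function names are my own).

```python
# Illustrative forward-mode AD via dual numbers, not the Frank code.

class Dual:
    """A number paired with its derivative; arithmetic propagates both."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (fg)' = f'g + fg'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x with a unit tangent to read off f'(x)."""
    return f(Dual(x, 1.0)).der

# d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7
slope = derivative(lambda x: x * x + 3 * x, 2.0)
```

An effects-and-handlers presentation, as in the talk, would instead expose the arithmetic operations as effects and let a handler interpret them, which makes it easy to swap in forward, reverse, or higher-order modes.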