Deep Declarative Networks

In this episode of the Talking Papers Podcast, I hosted Dylan Campbell to chat about his paper “Deep Declarative Networks – A New Hope”, published in TPAMI. Deep declarative networks (DDNs) are a general framework for embedding an optimization problem as a layer within a neural network and solving it in the forward pass. They are particularly useful when you need to enforce constraints or obtain guarantees about the solution. Dylan is currently a postdoc at the Visual Geometry Group (VGG) in Oxford; this work was done when he was a Research Fellow at the Australian National University (ANU) node of the Australian Centre for Robotic Vision (ACRV). Dylan has a unique perspective on optimization, particularly in applications involving geometry. He is a great researcher and a good friend, and it was a pleasure recording this episode with him.

AUTHORS

Stephen Gould, Richard Hartley, Dylan Campbell

ABSTRACT

We explore a new class of end-to-end learnable models wherein data processing nodes (or network layers) are defined in terms of desired behaviour rather than an explicit forward function. Specifically, the forward function is implicitly defined as the solution to a mathematical optimization problem. Consistent with nomenclature in the programming languages community, we name these models deep declarative networks. Importantly, we show that the class of deep declarative networks subsumes current deep learning models. Moreover, invoking the implicit function theorem, we show how gradients can be back-propagated through many declaratively defined data processing nodes thereby enabling end-to-end learning. We show how these declarative processing nodes can be implemented in the popular PyTorch deep learning software library allowing declarative and imperative nodes to co-exist within the same network. We also provide numerous insights and illustrative examples of declarative nodes and demonstrate their application for image and point cloud classification tasks.
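To make the idea concrete, here is a minimal sketch of a declarative node in PyTorch. This is my own illustrative example, not code from the authors' ddn repository: the pseudo-Huber penalty, the IRLS solver, the DELTA value, and the RobustAverage name are all assumptions made for illustration. It defines a robust average y(x) = argmin_u Σᵢ φ(u − xᵢ): the forward pass solves the argmin numerically, and the backward pass differentiates through the optimality condition using the implicit function theorem, so no solver iterations need to be unrolled or stored.

```python
import torch

# Illustrative scale for the pseudo-Huber penalty (an assumption, not a
# value taken from the paper or the ddn repository).
DELTA = 1.0

def phi_dd(r):
    # Second derivative of the pseudo-Huber penalty
    # phi(r) = DELTA^2 * (sqrt(1 + (r/DELTA)^2) - 1).
    return (1.0 + (r / DELTA) ** 2) ** (-1.5)

class RobustAverage(torch.autograd.Function):
    """Declarative node: y(x) = argmin_u sum_i phi(u - x_i)."""

    @staticmethod
    def forward(ctx, x):
        # Forward pass: solve the argmin numerically via iteratively
        # reweighted least squares (any solver would do; only the
        # solution itself matters for the gradient).
        u = x.mean()
        for _ in range(100):
            w = 1.0 / torch.sqrt(1.0 + ((u - x) / DELTA) ** 2)
            u = (w * x).sum() / w.sum()
        ctx.save_for_backward(x, u)
        return u

    @staticmethod
    def backward(ctx, grad_output):
        # Backward pass via the implicit function theorem: the optimality
        # condition sum_i phi'(y - x_i) = 0 holds identically in x, so
        # differentiating it gives H * dy/dx_i - phi''(y - x_i) = 0 with
        # H = sum_j phi''(y - x_j), i.e. dy/dx_i = phi''(y - x_i) / H.
        x, u = ctx.saved_tensors
        h = phi_dd(u - x)
        return grad_output * h / h.sum()

# Usage: the outlier barely moves the solution and receives almost no gradient.
x = torch.tensor([0.0, 1.0, 2.0, 100.0], requires_grad=True)
y = RobustAverage.apply(x)
y.backward()
print(y.item())  # ~1.6, near the inliers
print(x.grad)    # gradient concentrates on the inliers; ~0 for the outlier
```

Because the gradient comes from the optimality condition alone, the backward cost is independent of how the forward solution was obtained, which is what lets declarative and imperative nodes co-exist in the same network.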

RELATED PAPERS

📚 “On differentiating parameterized argmin and argmax problems with application to bi-level optimization”

📚 “OptNet: Differentiable Optimization as a Layer in Neural Networks”

CODE: 💻https://github.com/anucvml/ddn

💻Jupyter notebooks

Preprint Link: “Deep Declarative Networks: A New Hope”

Paper Link: “Deep Declarative Networks”

ECCV 2020 Tutorial

CVPR 2020 Workshop

This episode was recorded on March 31st, 2021.

CONTACT

If you would like to be a guest or a sponsor, or just want to share your thoughts, feel free to reach out via email: talking.papers.podcast@gmail.com


SUBSCRIBE AND FOLLOW

🎧Subscribe on your favourite podcast app: https://talking.papers.podcast.itzikbs.com

📧Subscribe to our mailing list: http://eepurl.com/hRznqb

🐦Follow us on Twitter: https://twitter.com/talking_papers

🎥YouTube Channel: