Differentiable Programming with Julia

Speaker: Mike Innes (Julia Computing)

Bio: Mike is a software engineer at Julia Computing, where he created the Flux machine learning stack and the Zygote automatic differentiation tool. Previously he was a researcher at MIT, where he built the Juno IDE for Julia, and he holds a degree in Physics from Oxford University.

Abstract: As researchers increasingly push the limits of frameworks like TensorFlow and PyTorch, they are turning to more powerful tools for the next generation of machine learning and statistical models. Where ML frameworks generalised machine learning beyond simple feed-forward chains to more complex architectures, new differentiable programming languages make virtually any numerical program a potential model.

This provides an elegant and expressive new way to do ML, without tradeoffs between performance and expressiveness. More importantly, it enables a differentiable library ecosystem: algorithms in areas as diverse as colour theory, finance, physics and ray tracing can be differentiated and used directly in models, opening up exciting new research areas. This talk will cover the recent advances and open challenges in the field, with a focus on both the underlying technology and the applications for ML, data science and statistics researchers.
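The abstract's central claim is that ordinary numerical programs can be differentiated directly. A minimal way to see this is forward-mode automatic differentiation via dual numbers, sketched below in Python. (Zygote itself performs source-to-source reverse-mode AD on Julia code; this toy dual-number class is only a conceptual analogue of the idea, not Zygote's implementation.)

```python
class Dual:
    """A number carrying its value and derivative (dual number)."""
    def __init__(self, val, eps=0.0):
        self.val = val   # the value f(x)
        self.eps = eps   # the derivative f'(x), propagated by the chain rule

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        o = self._wrap(other)
        return Dual(self.val + o.val, self.eps + o.eps)
    __radd__ = __add__

    def __mul__(self, other):
        o = self._wrap(other)
        # product rule: (fg)' = f'g + fg'
        return Dual(self.val * o.val, self.val * o.eps + self.eps * o.val)
    __rmul__ = __mul__


def derivative(f, x):
    """Differentiate an arbitrary numerical function f at x."""
    return f(Dual(x, 1.0)).eps


# An "arbitrary" numerical program: f(x) = 3x^2 + 2x, so f'(x) = 6x + 2.
def f(x):
    return 3 * x * x + 2 * x

print(derivative(f, 2.0))  # 6*2 + 2 = 14.0
```

Because `derivative` works on any code written against these overloaded operators, the same mechanism extends to the library algorithms the abstract mentions; differentiable programming systems generalise this by operating on the language's own programs rather than a restricted operator set.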

To view Mike's slides from the seminar, please click here.