The impact of differentiable programming: how ∂P is enabling new science in Julia (AMA)
Fully incorporating differentiable programming (∂P) into the Julia language has enabled composability between modern machine learning techniques and existing high-performance computing (HPC) modeling and simulation without sacrificing expressivity. Most notably, small neural networks can now be embedded within larger models whose remaining behavior is fully understood and concretely represented; smaller networks are, in turn, easier to train and interpret. It has also enabled complex computations to be embedded within cost functions for fast and robust reinforcement learning. In this talk, we'll walk through several concrete examples and demonstrate how the combination of ∂P with Julia's generic programming enables powerful and expressive new models.
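To make the idea concrete, here is a minimal, hypothetical sketch (not from the talk) using Flux.jl and Zygote.jl: a small neural network embedded as one term inside a hand-written simulation, with gradients taken through the entire computation. The `simulate` and `loss` functions and all numeric values below are invented for illustration.

```julia
using Flux, Zygote

# A tiny neural network standing in for an unknown term in a larger model.
nn = Chain(Dense(1 => 8, tanh), Dense(8 => 1))

# An explicit-Euler simulation mixing known physics (linear decay)
# with the learned correction term supplied by `model`.
function simulate(model, u0; steps = 100, dt = 0.01f0)
    u = u0
    for _ in 1:steps
        du = -0.5f0 * u + only(model([u]))  # known physics + learned term
        u += dt * du
    end
    return u
end

# A cost comparing the simulated endpoint to a target observation.
loss(model, u0, target) = abs2(simulate(model, u0) - target)

# Zygote differentiates through the simulation loop and the network alike,
# returning gradients with respect to the network's parameters.
grads = Zygote.gradient(m -> loss(m, 1.0f0, 0.2f0), nn)
```

Nothing about `simulate` is special here: it is ordinary Julia code, and the same gradient call would in principle work if it invoked a far larger simulation in place of the toy Euler loop.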
The discussion and AMA following this talk will be moderated by Ras Bodik.
Matt Bauman is a Senior Research Scientist at Julia Computing's Chicago outpost, where he spends much of his time working on arrays and broadcasting. He has been contributing to both the core language and multiple packages since 2014. At his previous position as a Data Science Fellow at the University of Chicago's Center for Data Science and Public Policy, he longed for dot-broadcasting in Python while working with local governments to use data science for social good. He recently defended his PhD dissertation in Bioengineering at the University of Pittsburgh, focusing on neural prosthetics.
Fri 20 Nov (Central Time, US & Canada)
15:00 - 15:40: Talk (40m)
The impact of differentiable programming: how ∂P is enabling new science in Julia (AMA), REBASE track
Matt Bauman, Julia Computing