Andrew Hearin, Argonne National Laboratory
Fundamental Physics with Differentiable Programming
Abstract: One of the core technical innovations that makes it possible to train neural networks is automatic differentiation (autodiff): the ability to efficiently compute exact gradients through arbitrarily complex functions. But autodiff is not limited to neural networks; it applies to any differentiable computation, including traditional physics models and calculations. Differentiable programming brings these computational tools to physics, combining the computational power of modern AI with the reliability of established methods of uncertainty quantification. In this talk, I'll give an overview of a growing movement to recast conventional physics prediction pipelines as differentiable programs, with examples from particle physics, climate science, and astrophysics. I will then focus on Diffsky: a new forward model of galaxies co-evolving with the dark matter halos they inhabit. Diffsky is a ground-up reformulation of the galaxy-halo connection designed to support joint analyses of two or more cosmological surveys, enabling tighter constraints on dark energy, neutrino mass, and primordial non-Gaussianity through multi-probe cosmology. I will present recent applications of Diffsky to generating synthetic data for DESI, the Rubin Observatory, and the Roman Space Telescope, and highlight how these tools are opening new frontiers in our ability to study fundamental physics with a new generation of astronomical data.
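As a concrete illustration of the autodiff idea described above (a minimal sketch, not taken from the talk or from Diffsky), JAX can differentiate an ordinary physics formula exactly, with no finite differencing. The toy function below is a hypothetical example chosen for simplicity:

```python
import jax
import jax.numpy as jnp

# Illustrative toy "physics model": circular-orbit speed v = sqrt(M / r)
# for enclosed mass M and radius r, with G set to 1 for simplicity.
def circular_velocity(params):
    M, r = params
    return jnp.sqrt(M / r)

# jax.grad transforms the function into one that returns exact gradients,
# using the same machinery that trains neural networks.
grad_v = jax.grad(circular_velocity)

# Analytically: dv/dM = 1 / (2 sqrt(M r)), dv/dr = -sqrt(M) / (2 r^{3/2}).
dM, dr = grad_v(jnp.array([4.0, 1.0]))
print(dM, dr)  # 0.25, -1.0
```

The same transformation composes through arbitrarily long pipelines of such functions, which is what makes gradient-based fitting of full physics prediction pipelines practical.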
3:00 PM in PAS 201 / Zoom https://arizona.zoom.us/j/86395646910
Refreshments in PAS 236, 2:30 PM

