Pipeline-level differentiable programming for the real world

Alessandro Angioi

Friday 14:00 in Hassium

The tools enabling automatic differentiation (AD), like JAX and PyTorch, are increasingly being adopted beyond machine learning to tackle optimization problems in various scientific and engineering contexts. These tools have catalyzed the development of differentiable simulators, solvers, 3D renderers, and other powerful components, under the umbrella of differentiable programming (DP).

However, building pipelines that propagate gradients effortlessly across components introduces unique challenges. Real-world pipelines often span diverse technologies, frameworks (e.g., JAX, TensorFlow, PyTorch, Julia), computing environments (local vs. distributed clusters; CPU vs. GPU), and teams with varying expertise. Additionally, legacy systems and non-differentiable components often need to coexist with modern AD-enabled frameworks.

This talk will provide an overview of differentiable pipelines: why they matter and the types of optimization problems they address. We will revisit foundational concepts of automatic differentiation to set the stage for understanding the intricacies of orchestrating differentiable pipelines in Python.
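As a taste of the AD foundations the talk revisits, here is a minimal, illustrative example (not taken from the talk) of computing a gradient of a toy simulator with JAX:

```python
import jax
import jax.numpy as jnp

def loss(theta, x):
    # Toy "simulator": a damped oscillation parameterized by theta.
    prediction = jnp.exp(-theta[0] * x) * jnp.cos(theta[1] * x)
    return jnp.sum(prediction ** 2)

# jax.grad builds a new function returning d(loss)/d(theta)
# (by default, the gradient with respect to the first argument).
grad_loss = jax.grad(loss)

theta = jnp.array([0.5, 2.0])
x = jnp.linspace(0.0, 1.0, 100)
print(grad_loss(theta, x))  # gradient with the same shape as theta
```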

Then, using our open-source project, Tesseract, as a case study, we will share lessons learned and best practices for designing AD-friendly APIs with tools like Pydantic and FastAPI, achieving seamless integration with JAX, packaging scientific software, and enabling end-to-end systems-level optimization.
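For a flavor of what an AD-friendly service boundary can look like, the sketch below exposes a JAX function and its vector-Jacobian product behind Pydantic-validated FastAPI endpoints. This is a hypothetical illustration under assumed endpoint names (`/apply`, `/vjp`) and schemas, not Tesseract's actual API:

```python
import jax
import jax.numpy as jnp
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

def simulate(params: jnp.ndarray) -> jnp.ndarray:
    # Stand-in for a differentiable physics model.
    return jnp.sin(params) * params ** 2

class ApplyRequest(BaseModel):
    params: list[float]

class VJPRequest(BaseModel):
    params: list[float]
    cotangent: list[float]  # upstream gradient to pull back through the model

@app.post("/apply")
def apply(req: ApplyRequest) -> list[float]:
    out = simulate(jnp.asarray(req.params))
    return out.tolist()

@app.post("/vjp")
def vjp(req: VJPRequest) -> list[float]:
    # jax.vjp returns the primal output and a pullback closure;
    # calling the pullback with a cotangent yields gradients w.r.t. the inputs.
    _, pullback = jax.vjp(simulate, jnp.asarray(req.params))
    (grad,) = pullback(jnp.asarray(req.cotangent))
    return grad.tolist()
```

Keeping the forward evaluation and the VJP as separate, explicitly typed endpoints is one way to let non-differentiable or cross-framework components participate in a gradient-propagating pipeline.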

Attendees will leave with practical insight into why differentiable programming matters and how to overcome the challenges of building real-world differentiable pipelines.

Alessandro Angioi

I work at the boundary between physical simulations and machine learning. I have 5+ years of experience in machine learning and data science, and my background is in theoretical physics. I was born in Sardinia, but I've been living in the Rhein-Neckar region for the past 10 years. Cat person.