LAFI 2022: The Seventh International Workshop on Languages for Inference (co-located with POPL 2022)

JAX: accelerating ML research with composable function transformations (Remote)
JAX is a system for high-performance machine learning research and numerical computing. It offers the familiarity of Python+NumPy together with a collection of user-wielded function transformations, including automatic differentiation, vectorized batching, end-to-end compilation (via XLA), parallelization over multiple accelerators, and more. Composing these transformations is the key to JAX’s power and simplicity.
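As a minimal sketch of the composition the abstract describes (our illustration, not material from the talk): `jax.grad` provides automatic differentiation, `jax.vmap` provides vectorized batching, and `jax.jit` compiles end to end via XLA, and the three nest freely.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared error of a linear model on a single example.
    return (jnp.dot(w, x) - y) ** 2

# Compose transformations: differentiate w.r.t. w, batch over
# (x, y) pairs, then compile the whole pipeline with XLA.
grad_loss = jax.grad(loss)                           # d(loss)/dw, one example
batched = jax.vmap(grad_loss, in_axes=(None, 0, 0))  # map over a batch
fast_batched = jax.jit(batched)                      # end-to-end compilation

w = jnp.ones(3)
xs = jnp.ones((8, 3))
ys = jnp.zeros(8)
per_example_grads = fast_batched(w, xs, ys)          # shape (8, 3)
```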
This talk presents an overview of the project today and highlights some of our discoveries so far. These include useful autodiff primitives as well as lessons from embedding strictly-typed pure functional languages within the unruly realm of research programming in Python.
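One such lesson can be sketched concretely (again our illustration, under the assumption that the talk covers JAX's tracing model): `jax.jit` traces a function with abstract values, so transformed code must behave as a pure function, and Python side effects such as `print` fire at trace time rather than on every call.

```python
import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    print("tracing with:", x)  # runs once, during tracing, with an abstract value
    return x * 2.0

f(jnp.arange(3.0))  # prints a tracer, e.g. Traced<ShapedArray(float32[3])>
f(jnp.arange(3.0))  # cached compiled code runs; nothing is printed
```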
Sun 16 Jan
Displayed time zone: Eastern Time (US & Canada)
13:30 - 14:45: Talks (LAFI)

13:30 (37m) Talk: JAX: accelerating ML research with composable function transformations (Remote)
Roy Frostig (Google Research)

14:07 (37m) Talk: Scalable structure learning and inference for domain-specific probabilistic programs (Remote)
Feras Saad (Massachusetts Institute of Technology)