A Dual Number Abstraction for Static Analysis of Clarke Jacobians
We present a novel abstraction for bounding the Clarke Jacobian of a Lipschitz continuous, but not necessarily differentiable, function over a local input region. To do so, we leverage an abstract domain built upon dual numbers, adapted to soundly over-approximate all first derivatives needed to compute the Clarke Jacobian. We formally prove that the resulting forward-mode dual interval evaluation produces a sound, interval domain-based over-approximation of the true Clarke Jacobian for a given input region.
Due to the generality of our formalism, we can compute and analyze interval Clarke Jacobians for a broader class of functions than previous works supported: specifically, arbitrary compositions of neural networks with Lipschitz, but non-differentiable, perturbations. We implement our technique in a tool called DeepJ and evaluate it on multiple deep neural networks and non-differentiable input perturbations to showcase both the generality and scalability of our analysis. Concretely, we obtain interval Clarke Jacobians to analyze the Lipschitz robustness and local optimization landscapes of both fully-connected and convolutional neural networks under rotation, contrast variation, and haze perturbations, as well as their compositions.
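To give a flavor of the underlying idea, the following minimal Python sketch (our own illustration, not the paper's DeepJ implementation; all class and helper names are hypothetical) lifts forward-mode dual numbers to intervals: each value carries an interval enclosing the primal output and an interval enclosing every first derivative, so a non-differentiable primitive such as ReLU can return a tangent that covers its Clarke generalized derivative when the input interval straddles the kink.

from dataclasses import dataclass

Interval = tuple  # (lo, hi) pair of floats

def i_add(a, b):
    # Interval addition.
    return (a[0] + b[0], a[1] + b[1])

def i_mul(a, b):
    # Interval multiplication: min/max over all endpoint products.
    ps = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(ps), max(ps))

@dataclass
class DualInterval:
    x: Interval   # interval enclosing the primal value
    dx: Interval  # interval enclosing all first derivatives (the tangent)

    def __add__(self, other):
        return DualInterval(i_add(self.x, other.x), i_add(self.dx, other.dx))

    def __mul__(self, other):
        # Product rule lifted to intervals: d(u*v) = u*dv + v*du.
        return DualInterval(
            i_mul(self.x, other.x),
            i_add(i_mul(self.x, other.dx), i_mul(other.x, self.dx)),
        )

def relu(d: DualInterval) -> DualInterval:
    lo, hi = d.x
    if lo >= 0:   # entire interval in the identity region
        return DualInterval(d.x, d.dx)
    if hi <= 0:   # entire interval in the zero region
        return DualInterval((0.0, 0.0), (0.0, 0.0))
    # The interval straddles the kink at 0, where the Clarke generalized
    # derivative of ReLU is [0, 1]; applying the chain rule with that slope
    # interval keeps the tangent a sound over-approximation.
    return DualInterval((0.0, hi), i_mul((0.0, 1.0), d.dx))

# Usage: bound the derivative of f(x) = relu(x) * x over x in [-1, 2].
x = DualInterval((-1.0, 2.0), (1.0, 1.0))  # seed tangent dx/dx = 1
y = relu(x) * x
print(y.x, y.dx)  # y.dx encloses every element of the true Clarke Jacobian

In this example the computed tangent interval encloses the true Clarke Jacobian range [0, 4] of f over [-1, 2], at the usual cost of interval over-approximation; the paper's abstract domain and soundness proof are, of course, considerably more refined than this sketch.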
Thu 20 Jan (times shown in Eastern Time, US & Canada)
10:20 - 12:00 | Foundation and Verification of Machine-Learning Systems (POPL) at Salon I | Chair(s): Gilbert Bernstein (University of California at Berkeley)

10:20 (25m) | Research paper, in person | A Dual Number Abstraction for Static Analysis of Clarke Jacobians | Jacob Laurel (University of Illinois at Urbana-Champaign), Rem Yang (University of Illinois at Urbana-Champaign), Gagandeep Singh (University of Illinois at Urbana-Champaign; VMware), Sasa Misailovic (University of Illinois at Urbana-Champaign)

10:45 (25m) | Research paper, remote | Provably Correct, Asymptotically Efficient, Higher-Order Reverse-Mode Automatic Differentiation | Faustyna Krawiec (University of Cambridge), Simon Peyton Jones (Microsoft Research), Neel Krishnaswami (University of Cambridge), Tom Ellis (Microsoft Research), Richard A. Eisenberg (Tweag), Andrew Fitzgibbon (Graphcore)

11:10 (25m) | Research paper, in person | Interval Universal Approximation for Neural Networks | Zi Wang (University of Wisconsin-Madison), Aws Albarghouthi (University of Wisconsin-Madison), Gautam Prakriya (Chinese University of Hong Kong), Somesh Jha (University of Wisconsin)

11:35 (25m) | Research paper, in person | PRIMA: General and Precise Neural Network Certification via Scalable Convex Hull Approximations | Mark Niklas Müller (ETH Zurich), Gleb Makarchuk (ETH Zurich), Gagandeep Singh (University of Illinois at Urbana-Champaign; VMware), Markus Püschel (ETH Zurich), Martin Vechev (ETH Zurich)