arXiv Analytics


arXiv:2101.09306 [cs.LG]

Partition-Based Convex Relaxations for Certifying the Robustness of ReLU Neural Networks

Brendon G. Anderson, Ziye Ma, Jingqi Li, Somayeh Sojoudi

Published 2021-01-22 (Version 1)

In this paper, we study certifying the robustness of ReLU neural networks against adversarial input perturbations. To diminish the relaxation error suffered by the popular linear programming (LP) and semidefinite programming (SDP) certification methods, we propose partitioning the input uncertainty set and solving the relaxations on each part separately. We show that this approach reduces relaxation error, and that the error is eliminated entirely upon performing an LP relaxation with an intelligently designed partition. To scale this approach to large networks, we consider coarser partitions that take the same form as this motivating partition. We prove that computing such a partition that directly minimizes the LP relaxation error is NP-hard. By instead minimizing the worst-case LP relaxation error, we develop a computationally tractable scheme with a closed-form optimal two-part partition. We extend the analysis to the SDP, where the geometry of the feasible set is exploited to design a two-part partition that minimizes the worst-case SDP relaxation error. Experiments on IRIS classifiers demonstrate a significant reduction in relaxation error, yielding certificates that would otherwise be void without partitioning. By independently increasing the input size and the number of layers, we empirically illustrate the regimes in which the partitioned LP and SDP are best applied.
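The motivating observation can be illustrated on a single unstable ReLU. The standard LP ("triangle") relaxation over an interval [l, u] with l < 0 < u upper-bounds ReLU(x) by the chord u(x - l)/(u - l), which is loose at x = 0; splitting the interval at the kink makes ReLU affine on each part, so the relaxation becomes exact. The sketch below is an illustrative computation of this gap, not code from the paper; the function name `triangle_gap` is our own.

```python
def triangle_gap(l: float, u: float) -> float:
    """Worst-case gap between the LP (triangle) upper envelope
    u*(x - l)/(u - l) and ReLU(x) on an unstable interval l < 0 < u.
    The gap is largest at x = 0, where ReLU(0) = 0, so the gap equals
    the envelope's value there: -u*l / (u - l)."""
    assert l < 0 < u, "gap is only nonzero when the ReLU is unstable"
    return u * (0 - l) / (u - l)  # envelope evaluated at x = 0

# Unpartitioned relaxation over [-1, 1]: worst-case error 0.5.
print(triangle_gap(-1.0, 1.0))  # 0.5

# Partitioning at the kink into [-1, 0] and [0, 1] makes ReLU
# identically 0 on the first part and identically x on the second,
# so the convex relaxation on each part is exact: zero error.
```

This is the single-neuron version of the paper's observation that a partition aligned with the activation boundaries eliminates LP relaxation error entirely; the coarser two-part partitions studied in the paper trade this exactness for tractability.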

Comments: This is an extension of our IEEE CDC 2020 conference paper arXiv:2004.00570
Categories: cs.LG, math.OC
Related articles:
arXiv:2006.06878 [cs.LG] (Published 2020-06-11)
Optimization Theory for ReLU Neural Networks Trained with Normalization Layers
arXiv:2212.01544 [cs.LG] (Published 2022-12-03)
Probabilistic Verification of ReLU Neural Networks via Characteristic Functions
arXiv:2305.05562 [cs.LG] (Published 2023-05-09)
SkelEx and BoundEx: Natural Visualization of ReLU Neural Networks