arXiv:1803.06567 [cs.LG]
A Dual Approach to Scalable Verification of Deep Networks
Krishnamurthy Dvijotham, Robert Stanforth, Sven Gowal, Timothy Mann, Pushmeet Kohli
Published 2018-03-17 (Version 1)
This paper addresses the problem of formally verifying desirable properties of neural networks, i.e., obtaining provable guarantees that the outputs of the neural network will always behave in a certain way for a given class of inputs. Most previous work on this topic was limited in its applicability by the size of the network, its architecture, and the complexity of the properties to be verified. In contrast, our framework applies to a much more general class of activation functions and specifications on neural network inputs and outputs. We formulate verification as an optimization problem and solve a Lagrangian relaxation of the optimization problem to obtain an upper bound on the verification objective. Our approach is anytime, i.e., it can be stopped at any time and a valid bound on the objective can be obtained. We develop specialized verification algorithms with provable tightness guarantees under special assumptions, and demonstrate the practical significance of our general verification approach on a variety of verification tasks.
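To make the weak-duality argument concrete, the following is a minimal NumPy sketch, not the paper's implementation, for a one-hidden-layer ReLU network. The coupling constraint z = Wx + b is dualized with a multiplier mu, the relaxed problem decomposes coordinate-wise, and any choice of mu yields a valid upper bound on the verification objective, which is what makes the approach anytime. The function name dual_bound and the interval-arithmetic pre-activation bounds are illustrative assumptions, not notation from the paper.

```python
# A minimal sketch (assumed setting, not the paper's implementation) of a
# Lagrangian dual bound for a one-hidden-layer ReLU network: an upper bound
# on  max_{l <= x <= u} c^T relu(W x + b).  By weak duality, the bound is
# valid for ANY dual variable mu, so optimization over mu can be stopped at
# any time with a valid bound in hand (the "anytime" property).
import numpy as np

def dual_bound(W, b, c, l, u, mu):
    """Upper bound on max_{l <= x <= u} c^T relu(W x + b), valid for any mu."""
    # Interval bounds on the pre-activations z = W x + b over the input box.
    mid, rad = (l + u) / 2, (u - l) / 2
    lz = W @ mid + b - np.abs(W) @ rad
    uz = W @ mid + b + np.abs(W) @ rad

    # Lagrangian: c^T relu(z) + mu^T (W x + b - z). Once the coupling
    # constraint z = W x + b is dualized, the maximizations over x and z
    # decouple and can each be solved in closed form.

    # max over x in [l, u] of (W^T mu)^T x: linear, solved coordinate-wise.
    a = W.T @ mu
    term_x = np.sum(np.where(a > 0, a * u, a * l)) + mu @ b

    # max over z in [lz, uz] of c_i relu(z_i) - mu_i z_i: piecewise linear
    # in each coordinate, so the max is at an endpoint or the kink z_i = 0.
    def h(ci, mi, lo, hi):
        cands = [lo, hi] + ([0.0] if lo < 0.0 < hi else [])
        return max(ci * max(z, 0.0) - mi * z for z in cands)
    term_z = sum(h(ci, mi, lo, hi) for ci, mi, lo, hi in zip(c, mu, lz, uz))

    return term_x + term_z

# Tiny example: mu = 0 already gives a loose but valid bound; tightening it
# by (sub)gradient descent on mu only improves the bound, and stopping the
# descent early still returns a certificate.
rng = np.random.default_rng(0)
W, b, c = rng.normal(size=(3, 2)), rng.normal(size=3), rng.normal(size=3)
l, u = -np.ones(2), np.ones(2)
print(dual_bound(W, b, c, l, u, mu=np.zeros(3)))
```

The design point this illustrates is the one emphasized in the abstract: because every dual feasible point certifies an upper bound, the method needs no convergence guarantee to be sound, only to be tight.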