Solution-aware vs global ReLU selection: partial MILP strikes back for DNN verification

Yuke Liao, Blaise Genest, Kuldeep Meel, Shaan Aryaman

Abstract

To handle complex DNN verification instances, we revisit a divide-and-conquer approach that breaks down the complexity: instead of a few complex branch-and-bound (BaB) calls, we rely on many small partial MILP calls. The crucial step is to select very few, but very important, ReLUs to encode with (costly) binary variables; previous attempts were suboptimal in this respect. To select these important ReLU variables, we propose a novel solution-aware ReLU scoring (SAS), and we adapt the BaB-SR and BaB-FSB branching functions into global ReLU scoring (GS) functions. We compare them theoretically and experimentally, and find that SAS is more efficient at selecting the set of variables to open with binary variables: compared with previous attempts, SAS reduces the number of binary variables by around 6 times while maintaining the same level of accuracy. Implemented in Hybrid MILP, which first calls α,β-CROWN with a short timeout to solve the easier instances and then runs partial MILP on the rest, this yields a very accurate yet efficient verifier, reducing the number of undecided instances by up to 40%, down to low levels (8-15%), while keeping the average runtime reasonable (46s-417s per instance), even for fairly large CNNs with 2 million parameters.
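For context, the "partial" in partial MILP refers to which ReLUs receive an exact encoding. The sketch below (a minimal illustration, not the authors' code) shows the two standard encodings for an unstable ReLU y = max(0, x) with pre-activation bounds l < 0 < u: the exact big-M encoding with one binary variable, used only for the few selected ReLUs, and the cheap triangle (Planet-style) linear relaxation used for all the others. The gurobipy modeling API is assumed here purely for illustration.

```python
import gurobipy as gp
from gurobipy import GRB

def add_relu(model, x, l, u, exact, name):
    """Encode y = max(0, x) for an unstable ReLU (l < 0 < u).

    exact=True  -> big-M MILP encoding with one binary variable (tight, costly).
    exact=False -> triangle LP relaxation (loose, cheap).
    """
    y = model.addVar(lb=0.0, ub=u, name=f"y_{name}")
    model.addConstr(y >= x)                      # holds in both encodings
    if exact:
        a = model.addVar(vtype=GRB.BINARY, name=f"a_{name}")
        model.addConstr(y <= x - l * (1 - a))    # a = 1 forces y <= x
        model.addConstr(y <= u * a)              # a = 0 forces y = 0
    else:
        # Upper face of the triangle relaxation: y <= u * (x - l) / (u - l).
        model.addConstr((u - l) * y <= u * (x - l))
    return y
```

Opening a ReLU then simply means passing exact=True for the k highest-scoring neurons; since every opened ReLU adds a binary variable, selecting very few but very important ones is what keeps each partial MILP call small:

```python
# Hypothetical glue code: 'pre_acts' holds (x, l, u) triples for the unstable
# ReLUs of one layer, and 'scores' comes from SAS or a global scoring (GS)
# function; only the top-k ReLUs are opened with binary variables.
opened = set(sorted(range(len(pre_acts)), key=lambda i: -scores[i])[:k])
post = [add_relu(m, x, l, u, exact=(i in opened), name=str(i))
        for i, (x, l, u) in enumerate(pre_acts)]
```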
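The abstract does not spell out the SAS formula, so the following is only one plausible reading of "solution-aware" scoring, stated as an assumption: solve the fully relaxed LP once, then score each unstable ReLU by how strongly the relaxed optimum violates the exact constraint y = max(0, x). ReLUs whose relaxed values already satisfy the constraint score zero and gain nothing from a binary variable. A global scoring function (GS), by contrast, would rank ReLUs from bounds and weights alone, without reference to any particular relaxed solution.

```python
def solution_aware_scores(model, relus):
    """Hypothetical solution-aware scoring (the paper's exact SAS
    definition is not given in the abstract): score each unstable ReLU
    by the violation of y = max(0, x) at the LP-relaxed optimum.

    'relus' is a list of (x, y) gurobipy variable pairs where every
    ReLU was encoded with the triangle relaxation (exact=False above).
    """
    model.optimize()  # solve the pure LP relaxation once
    return [abs(y.X - max(0.0, x.X)) for (x, y) in relus]
```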
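Finally, the two-stage control flow of Hybrid MILP can be summarized as below. The verifier callables are parameters here because the abstract describes the composition of the two stages, not a concrete API.

```python
def hybrid_verify(instance, fast_verifier, partial_milp_verifier,
                  fast_timeout=30.0):
    """Two-stage sketch of Hybrid MILP: a fast verifier (α,β-CROWN in
    the paper) with a short timeout handles the easy instances; only
    the undecided ones fall through to the partial-MILP stage.
    """
    status = fast_verifier(instance, timeout=fast_timeout)
    if status != "undecided":
        return status              # already "verified" or "falsified"
    return partial_milp_verifier(instance)
```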
