
SmoothNets: Optimizing CNN architecture design for differentially private deep learning

Nicolas W. Remerscheid; Alexander Ziller; Daniel Rueckert; Georgios Kaissis


Abstract

The arguably most widely employed algorithm to train deep neural networks with Differential Privacy is DP-SGD, which requires clipping and noising of per-sample gradients. This introduces a reduction in model utility compared to non-private training. Empirically, it can be observed that this accuracy degradation is strongly dependent on the model architecture. We investigated this phenomenon and, by combining components which exhibit good individual performance, distilled a new model architecture termed SmoothNet, which is characterised by increased robustness to the challenges of DP-SGD training. Experimentally, we benchmark SmoothNet against standard architectures on two benchmark datasets and observe that our architecture outperforms others, reaching an accuracy of 73.5\% on CIFAR-10 at $\varepsilon=7.0$ and 69.2\% at $\varepsilon=7.0$ on ImageNette, a state-of-the-art result compared to prior architectural modifications for DP.
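
For context, the DP-SGD step the abstract refers to clips each per-sample gradient to a fixed L2 bound and adds calibrated Gaussian noise before the parameter update. Below is a minimal PyTorch sketch of that mechanism; all names and hyperparameter values (`clip_norm`, `noise_multiplier`, `lr`) are illustrative assumptions, not taken from the paper, and a production implementation would use a vetted library such as Opacus.

```python
import torch

def dp_sgd_step(model, loss_fn, xs, ys, lr=0.1, clip_norm=1.0, noise_multiplier=1.1):
    """One illustrative DP-SGD update over a batch of (xs, ys) samples."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]

    # 1. Per-sample gradients: compute each example's gradient separately.
    for x, y in zip(xs, ys):
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)

        # 2. Clip the per-sample gradient so its global L2 norm is <= clip_norm.
        total_norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (total_norm + 1e-6)).clamp(max=1.0)
        for s, g in zip(summed, grads):
            s.add_(g * scale)

    # 3. Add Gaussian noise calibrated to the clipping bound, average, and step.
    batch_size = len(xs)
    with torch.no_grad():
        for p, s in zip(params, summed):
            noise = torch.normal(0.0, noise_multiplier * clip_norm, size=p.shape)
            p.add_(-(lr / batch_size) * (s + noise))
```

The per-sample loop is what makes DP-SGD expensive and is also why architecture choice matters: the clipping in step 2 distorts gradients differently depending on the model's layers, which is the accuracy degradation the paper targets.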

Code Repositories

NiWaRe/DPBenchmark (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
image-classification-on-cifar-10 | SmoothNetV1 | Percentage correct: 73.5
image-classification-on-imagenette | SmoothNetV1 | Accuracy: 69.7
