Adaptive Neural Connections for Sparsity Learning

Prakhar Kaushik, Hava Siegelmann, Alex Gain

Abstract

Sparsity learning aims to decrease the computational and memory costs of large deep neural networks (DNNs) by pruning neural connections while retaining high accuracy. A large body of work has developed sparsity learning approaches, with recent large-scale experiments showing that two main methods, magnitude pruning and Variational Dropout (VD), achieve similar state-of-the-art results on classification tasks. We propose Adaptive Neural Connections (ANC), a method for explicitly parameterizing fine-grained neuron-to-neuron connections via adjacency matrices at each layer that are learned through backpropagation. Explicitly parameterizing neuron-to-neuron connections confers two primary advantages: (1) sparsity can be optimized for explicitly via norm-based regularization on the adjacency matrices; and (2) when combined with VD (which we term ANC-VD), the adjacencies can be interpreted as learned weight-importance parameters, which we hypothesize leads to improved convergence for VD. Experiments with ResNet18 show that architectures augmented with ANC outperform their vanilla counterparts.
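
To make the core idea concrete, the following is a minimal illustrative sketch (not the authors' reference implementation) of an ANC-style layer in PyTorch: a learned adjacency matrix gates each neuron-to-neuron connection, and a norm-based penalty on the adjacencies is added to the loss to encourage sparsity. The class name ANCLinear, the method sparsity_penalty, and all hyperparameters are assumptions for illustration only.

```python
# Hypothetical sketch of an ANC-style layer: a learned adjacency matrix A gates
# each neuron-to-neuron connection, and an L1 penalty on A is added to the task
# loss to encourage sparsity. Names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ANCLinear(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # Adjacency parameters, one per connection, learned by backpropagation.
        self.adjacency = nn.Parameter(torch.ones(out_features, in_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each connection's weight is scaled by its learned adjacency entry.
        return F.linear(x, self.weight * self.adjacency, self.bias)

    def sparsity_penalty(self) -> torch.Tensor:
        # Norm-based regularization on the adjacency matrix (L1 norm here).
        return self.adjacency.abs().sum()


# Usage: add the adjacency penalty to the task loss so sparsity is optimized directly.
layer = ANCLinear(128, 64)
x = torch.randn(32, 128)
out = layer(x)
loss = out.pow(2).mean() + 1e-4 * layer.sparsity_penalty()
loss.backward()
```

After training, connections whose adjacency entries are driven toward zero by the penalty can be pruned; when the layer is combined with Variational Dropout (as in ANC-VD), the adjacency entries play the role of learned weight-importance parameters.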

Benchmarks

Benchmark                        | Methodology | Metrics
sparse-learning-on-cinic-10-1    | ResNet18    | Sparsity: 92.43
sparse-learning-on-imagenet32-1  | ResNet18    | Sparsity: 93.63
