HyperAI



Sparsity Invariant CNNs

Jonas Uhrig, Nick Schneider, Lukas Schneider, Uwe Franke, Thomas Brox, Andreas Geiger


Abstract

In this paper, we consider convolutional neural networks operating on sparse inputs with an application to depth upsampling from sparse laser scan data. First, we show that traditional convolutional networks perform poorly when applied to sparse data, even when the location of missing data is provided to the network. To overcome this problem, we propose a simple yet effective sparse convolution layer which explicitly considers the location of missing data during the convolution operation. We demonstrate the benefits of the proposed network architecture in synthetic and real experiments with respect to various baseline approaches. Compared to dense baselines, the proposed sparse convolution network generalizes well to novel datasets and is invariant to the level of sparsity in the data. For our evaluation, we derive a novel dataset from the KITTI benchmark, comprising 93k depth-annotated RGB images. Our dataset allows for training and evaluating depth upsampling and depth prediction techniques in challenging real-world settings and will be made available upon publication.
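The sparse convolution described in the abstract can be sketched as follows: the input is multiplied by a binary validity mask before convolution, the response is normalized by the number of valid pixels in each receptive field, and the mask is propagated to the next layer via max pooling. This is a minimal NumPy sketch of that idea, not the authors' implementation; the function name `sparse_conv2d` and the single-channel, stride-1 setup are illustrative assumptions.

```python
import numpy as np

def sparse_conv2d(x, mask, weight, bias=0.0, eps=1e-8):
    """Hedged sketch of a sparsity-invariant convolution.

    x:      (H, W) input map (values at invalid pixels are ignored)
    mask:   (H, W) binary validity mask (1 = observed, 0 = missing)
    weight: (k, k) convolution kernel (single channel, stride 1)
    Returns the normalized response and the propagated mask.
    """
    k = weight.shape[0]
    pad = k // 2
    # Zero out invalid pixels, then pad input and mask identically.
    xp = np.pad(x * mask, pad)
    mp = np.pad(mask, pad)
    H, W = x.shape
    out = np.zeros((H, W))
    new_mask = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            patch_x = xp[i:i + k, j:j + k]
            patch_m = mp[i:i + k, j:j + k]
            # Normalize by the count of valid pixels in the receptive field,
            # so the response is invariant to the sparsity level.
            norm = patch_m.sum()
            out[i, j] = (patch_x * weight).sum() / (norm + eps) + bias
            # Propagate validity: output is valid if any input pixel was.
            new_mask[i, j] = patch_m.max()
    return out, new_mask
```

With a uniform kernel this reduces to averaging over only the observed pixels, so an invalid measurement (mask 0) contributes nothing regardless of its stored value, which is the property a dense convolution lacks.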

Code Repositories

PeterTor/sparse_convolution (TensorFlow)

Benchmarks

Benchmark: depth-completion-on-kitti-depth-completion
Methodology: SparseConvs
Metrics:
MAE: 481
RMSE: 1601
Runtime [ms]: 10
