CKConv: Continuous Kernel Convolution For Sequential Data
David W. Romero; Anna Kuzina; Erik J. Bekkers; Jakub M. Tomczak; Mark Hoogendoorn

Abstract
Conventional neural architectures for sequential data present important limitations. Recurrent networks suffer from exploding and vanishing gradients and small effective memory horizons, and must be trained sequentially. Convolutional networks cannot handle sequences of unknown size, and their memory horizon must be defined a priori. In this work, we show that all these problems can be solved by formulating the convolutional kernels of CNNs as continuous functions. The resulting Continuous Kernel Convolution (CKConv) allows us to model arbitrarily long sequences in a parallel manner, within a single operation, and without relying on any form of recurrence. We show that Continuous Kernel Convolutional Networks (CKCNNs) obtain state-of-the-art results on multiple datasets, e.g., permuted MNIST, and, thanks to their continuous nature, are able to handle non-uniformly sampled datasets and irregularly sampled data natively. CKCNNs match or outperform neural ODEs designed for these purposes, while being faster and simpler.
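To make the idea concrete, below is a minimal PyTorch sketch of a continuous kernel convolution. It is not the authors' implementation: the kernel network is a small sine-activation (SIREN-style) MLP mapping relative positions to kernel values, and the hidden size, the `omega_0` frequency scale, and the normalized position range in `[-1, 1]` are all illustrative assumptions; refer to the official code for the paper's actual parameterization.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class KernelNet(nn.Module):
    """MLP that maps relative positions to convolutional kernel values.

    Sketch of a sine-activation (SIREN-style) parameterization; the
    hidden size and omega_0 scale are illustrative assumptions.
    """

    def __init__(self, in_channels, out_channels, hidden=32, omega_0=30.0):
        super().__init__()
        self.in_channels = in_channels
        self.out_channels = out_channels
        self.omega_0 = omega_0
        self.l1 = nn.Linear(1, hidden)
        self.l2 = nn.Linear(hidden, hidden)
        self.l3 = nn.Linear(hidden, in_channels * out_channels)

    def forward(self, positions):
        # positions: (kernel_size, 1), relative positions in [-1, 1]
        h = torch.sin(self.omega_0 * self.l1(positions))
        h = torch.sin(self.omega_0 * self.l2(h))
        k = self.l3(h)  # (kernel_size, in_channels * out_channels)
        # Reshape to a conv1d weight: (out_channels, in_channels, kernel_size)
        k = k.view(-1, self.out_channels, self.in_channels)
        return k.permute(1, 2, 0).contiguous()


class CKConv(nn.Module):
    """Causal 1D convolution whose kernel is sampled from a continuous function."""

    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.kernel_net = KernelNet(in_channels, out_channels)

    def forward(self, x):
        # x: (batch, in_channels, seq_len)
        seq_len = x.size(-1)
        # Sample the continuous kernel once per input step, so the kernel
        # spans the whole sequence: the memory horizon is not fixed a priori.
        positions = torch.linspace(-1.0, 1.0, seq_len, device=x.device).unsqueeze(-1)
        weight = self.kernel_net(positions)
        # Left zero-padding makes the convolution causal.
        return F.conv1d(F.pad(x, (seq_len - 1, 0)), weight)


# Usage: a global-memory causal convolution over a flattened image sequence.
layer = CKConv(in_channels=1, out_channels=16)
x = torch.randn(4, 1, 784)  # e.g., sMNIST-style sequences of length 784
y = layer(x)                # -> (4, 16, 784)
```

Because the kernel is a function of continuous positions rather than a fixed table of weights, the same kernel network can in principle be evaluated at arbitrary, irregularly spaced timestamps, which is what allows CKCNNs to handle non-uniformly sampled data without architectural changes.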
Code Repositories
dwromero/ckconv (official, PyTorch)
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| sequential-image-classification-on-sequential | CKCNN (1M) | Permuted Accuracy: 98.54%; Unpermuted Accuracy: 99.32% |
| sequential-image-classification-on-sequential | CKCNN (100k) | Permuted Accuracy: 98.00%; Unpermuted Accuracy: 99.31% |
| sequential-image-classification-on-sequential-1 | CKCNN (1M) | Unpermuted Accuracy: 63.74% |
| sequential-image-classification-on-sequential-1 | CKCNN (100k) | Unpermuted Accuracy: 62.25% |
| time-series-on-speech-commands | CKCNN (100k) | Test Accuracy: 95.27% |