Exploring Randomly Wired Neural Networks for Image Recognition
Saining Xie, Alexander Kirillov, Ross Girshick, Kaiming He

Abstract
Neural networks for image recognition have evolved through extensive manual design, from simple chain-like models to structures with multiple wiring paths. The success of ResNets and DenseNets is due in large part to their innovative wiring plans. Studies of neural architecture search (NAS) now explore the joint optimization of wiring and operation types; however, the space of possible wirings remains constrained and, despite being searched, is still driven by manual design. In this paper, we explore a more diverse set of connectivity patterns through the lens of randomly wired neural networks. To do this, we first define the concept of a stochastic network generator that encapsulates the entire network generation process. This encapsulation provides a unified view of NAS and randomly wired networks. We then use three classical random graph models to generate randomly wired graphs for networks. The results are surprising: several variants of these random generators yield network instances with competitive accuracy on the ImageNet benchmark. These results suggest that new efforts focused on designing better network generators may lead to new breakthroughs by exploring less constrained search spaces with more room for novel design.
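To make the idea concrete, here is a minimal sketch of one of the classical random graph models the abstract mentions, the Watts-Strogatz (WS) small-world model, followed by the conversion of the undirected graph into a DAG by orienting edges from lower to higher node index. This is an illustrative simplification, not the paper's exact generator; the function names and parameter choices (`n=8, k=4, p=0.25`) are assumptions for the example.

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Sketch of a Watts-Strogatz small-world graph as a set of edges.

    Start from a ring lattice where each node connects to its k nearest
    neighbors (k even), then rewire each edge with probability p.
    """
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add(tuple(sorted((i, (i + j) % n))))
    rewired = set()
    for (u, v) in sorted(edges):
        if rng.random() < p:
            # Rewire: keep endpoint u, pick a new endpoint while
            # avoiding self-loops and duplicate edges.
            choices = [w for w in range(n)
                       if w != u
                       and tuple(sorted((u, w))) not in edges
                       and tuple(sorted((u, w))) not in rewired]
            if choices:
                v = rng.choice(choices)
        rewired.add(tuple(sorted((u, v))))
    return rewired

def to_dag(edges):
    """Orient each undirected edge from the lower-indexed node to the
    higher-indexed one, which guarantees an acyclic wiring."""
    return {(min(u, v), max(u, v)) for (u, v) in edges}

graph = watts_strogatz(n=8, k=4, p=0.25)
dag = to_dag(graph)
print(sorted(dag))
```

In a randomly wired network, each DAG node would become an operation (e.g. a convolution block) and each directed edge a data path; the generator's randomness is fully determined by the seed, so a "network instance" is reproducible.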
Benchmarks
| Benchmark | Model | Metrics |
|---|---|---|
| image-classification-on-imagenet | RandWire-WS | 7.9 GFLOPs; 61.5M params; 80.1% top-1 accuracy |
| image-classification-on-imagenet | RandWire-WS (small) | 0.583 GFLOPs; 5.6M params; 74.7% top-1 accuracy |
| neural-architecture-search-on-imagenet | RandWire-WS (small) | 583M FLOPs; 5.6M params; 25.3% top-1 error |