MoCaE: Mixture of Calibrated Experts Significantly Improves Object Detection

Kemal Oksuz Selim Kuzucu Tom Joy Puneet K. Dokania

Abstract

Combining the strengths of many existing predictors to obtain a Mixture of Experts which is superior to its individual components is an effective way to improve performance without having to develop new architectures or train a model from scratch. However, surprisingly, we find that naïvely combining expert object detectors in a similar way to Deep Ensembles can often lead to degraded performance. We identify that the primary cause of this issue is that the predictions of the experts do not match their performance, an issue referred to as miscalibration. Consequently, the most confident detector dominates the final predictions, preventing the mixture from leveraging all the predictions from the experts appropriately. To address this, when constructing the Mixture of Experts, we propose to combine their predictions in a manner which reflects the individual performance of the experts; an objective we achieve by first calibrating the predictions before filtering and refining them. We term this approach the Mixture of Calibrated Experts and demonstrate its effectiveness through extensive experiments on 5 different detection tasks using a variety of detectors, showing that it: (i) improves object detectors on COCO and instance segmentation methods on LVIS by up to $\sim 2.5$ AP; (ii) reaches state-of-the-art on COCO test-dev with $65.1$ AP and on DOTA with $82.62$ $\mathrm{AP_{50}}$; (iii) outperforms single models consistently on recent detection tasks such as Open Vocabulary Object Detection.
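The exact calibration, filtering, and refinement procedures are detailed in the paper and the official fiveai/MoCaE repository; the snippet below is only a rough sketch of the core idea, assuming torchvision-style (boxes, scores, labels) detector outputs and placeholder per-expert calibration functions. Names such as mixture_of_calibrated_experts and calibrators are illustrative and not taken from the official code.

```python
# A minimal, hypothetical sketch (not the authors' implementation): pool detections
# from several expert detectors, but first pass each expert's raw confidence scores
# through a calibration map so that confidences reflect how well that expert
# actually performs. Only then are the pooled boxes filtered with NMS.
import torch
from torchvision.ops import batched_nms

def mixture_of_calibrated_experts(expert_outputs, calibrators, iou_threshold=0.65):
    """expert_outputs: list of (boxes [N,4], scores [N], labels [N]), one per expert.
    calibrators:      list of callables mapping raw scores to calibrated scores
                      (e.g. fitted isotonic-regression or temperature maps)."""
    boxes, scores, labels = [], [], []
    for (b, s, l), calibrate in zip(expert_outputs, calibrators):
        boxes.append(b)
        scores.append(calibrate(s))   # align confidence with expected performance
        labels.append(l)
    boxes, scores, labels = torch.cat(boxes), torch.cat(scores), torch.cat(labels)
    # Class-aware NMS over the pooled, calibrated detections; without calibration,
    # an overconfident expert's boxes would systematically suppress the others'.
    keep = batched_nms(boxes, scores, labels, iou_threshold)
    return boxes[keep], scores[keep], labels[keep]
```

The point of calibrating before the NMS-style filtering is that the suppression step is driven by confidence scores, so a single miscalibrated, overconfident expert would otherwise dominate the mixture regardless of its actual accuracy.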

Code Repositories

fiveai/MoCaE (official, PyTorch), mentioned in GitHub

Benchmarks

Benchmark                                      Methodology   Metrics
object-detection-in-aerial-images-on-dota-1    MoCaE         mAP: 82.62%
object-detection-on-coco                       MoCaE         box mAP: 65.1
