MeMOTR: Long-Term Memory-Augmented Transformer for Multi-Object Tracking

Ruopeng Gao Limin Wang

Abstract

As a video task, Multiple Object Tracking (MOT) is expected to capture temporal information of targets effectively. Unfortunately, most existing methods only explicitly exploit the object features between adjacent frames, while lacking the capacity to model long-term temporal information. In this paper, we propose MeMOTR, a long-term memory-augmented Transformer for multi-object tracking. Our method is able to make the same object's track embedding more stable and distinguishable by leveraging long-term memory injection with a customized memory-attention layer. This significantly improves the target association ability of our model. Experimental results on DanceTrack show that MeMOTR impressively surpasses the state-of-the-art method by 7.9% and 13.0% on HOTA and AssA metrics, respectively. Furthermore, our model also outperforms other Transformer-based methods on association performance on MOT17 and generalizes well on BDD100K. Code is available at https://github.com/MCG-NJU/MeMOTR.
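
To make the two ideas named in the abstract concrete, here is a minimal PyTorch sketch of a per-object long-term memory combined with a memory-attention layer. The module name, the running-average update with a small `momentum` coefficient, and the use of a standard `nn.MultiheadAttention` layer are illustrative assumptions for this sketch, not MeMOTR's exact formulation; see the official repository for the real implementation.

```python
# Sketch (not the official implementation): a slowly updated long-term memory per
# tracked object, injected into the current track embeddings via attention.
import torch
import torch.nn as nn


class LongTermMemoryInjection(nn.Module):
    def __init__(self, dim: int = 256, num_heads: int = 8, momentum: float = 0.01):
        super().__init__()
        self.momentum = momentum  # small coefficient -> memory drifts slowly over frames
        self.memory_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def update_memory(self, memory: torch.Tensor, track_embed: torch.Tensor) -> torch.Tensor:
        # Running-average update keeps the memory stable across long time spans.
        return (1.0 - self.momentum) * memory + self.momentum * track_embed

    def forward(self, track_embed: torch.Tensor, memory: torch.Tensor):
        # track_embed, memory: (num_tracks, dim); tracks are treated as a sequence.
        q = track_embed.unsqueeze(0)   # queries: current track embeddings
        kv = memory.unsqueeze(0)       # keys/values: long-term memory
        injected, _ = self.memory_attn(q, kv, kv)
        track_embed = self.norm(track_embed + injected.squeeze(0))
        memory = self.update_memory(memory, track_embed.detach())
        return track_embed, memory


if __name__ == "__main__":
    layer = LongTermMemoryInjection()
    tracks = torch.randn(5, 256)       # 5 active tracks
    memory = tracks.clone()            # initialize memory from the first frame
    for _ in range(3):                 # propagate over three subsequent frames
        tracks, memory = layer(tracks, memory)
    print(tracks.shape, memory.shape)  # torch.Size([5, 256]) torch.Size([5, 256])
```

The point of the slow update is that the memory changes little from frame to frame, so the attention over it pulls each track embedding toward a stable long-term representation of that object, which is what makes embeddings of the same target more consistent and therefore easier to associate.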

Code Repositories

mcg-nju/memotr (official, PyTorch): https://github.com/MCG-NJU/MeMOTR

Benchmarks

| Benchmark | Method | HOTA | DetA | AssA | IDF1 | MOTA |
|---|---|---|---|---|---|---|
| multi-object-tracking-on-dancetrack | MeMOTR (Deformable DETR) | 63.4 | 77.0 | 52.3 | 65.5 | 85.4 |
| multi-object-tracking-on-dancetrack | MeMOTR | 68.5 | 80.5 | 58.4 | 71.2 | 89.9 |
| multi-object-tracking-on-sportsmot | MeMOTR (Deformable DETR) | 68.8 | 82.0 | 57.8 | 69.9 | 90.2 |
| multi-object-tracking-on-sportsmot | MeMOTR | 70.0 | 83.1 | 59.1 | 71.4 | 91.5 |
