Attend to Who You Are: Supervising Self-Attention for Keypoint Detection and Instance-Aware Association

Sen Yang; Zhicheng Wang; Ze Chen; Yanjie Li; Shoukui Zhang; Zhibin Quan; Shu-Tao Xia; Yiping Bao; Erjin Zhou; Wankou Yang

Abstract

This paper presents a new method that solves keypoint detection and instance association with a Transformer. Bottom-up multi-person pose estimation models need to detect keypoints and learn associative information between them. We argue that both problems can be solved entirely by a Transformer. Specifically, self-attention measures the dependency between any pair of locations, which can provide association information for grouping keypoints. However, naive attention patterns are not explicitly controlled, so there is no guarantee that keypoints will always attend to the instances they belong to. To address this, we propose a novel approach that supervises self-attention for multi-person keypoint detection and instance association. By using instance masks to supervise self-attention to be instance-aware, we can assign detected keypoints to their corresponding instances based on pairwise attention scores, without the pre-defined offset vector fields or embeddings used by CNN-based bottom-up models. An additional benefit of our method is that instance segmentation results for any number of people can be obtained directly from the supervised attention matrix, thereby simplifying the pixel assignment pipeline. Experiments on the COCO multi-person keypoint detection challenge and the person instance segmentation task demonstrate the effectiveness and simplicity of the proposed method and show a promising way to control self-attention behavior for specific purposes.
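
The following is a minimal sketch (not the authors' code) of the two ideas the abstract describes: supervising the self-attention matrix with instance masks so that every location attends to the instance it belongs to, and grouping detected keypoints by reading pairwise attention scores. All tensor shapes and names (attn, instance_masks, keypoint_locations, instance_centers) are illustrative assumptions rather than the paper's actual interface.

```python
# Sketch of instance-aware attention supervision and attention-based keypoint grouping.
# Assumptions: attention is given over L flattened feature-map locations, and binary
# instance masks are available over the same grid during training.

import torch
import torch.nn.functional as F


def attention_supervision_loss(attn: torch.Tensor,
                               instance_masks: torch.Tensor) -> torch.Tensor:
    """Binary cross-entropy between attention scores and an instance-aware target.

    attn:           (L, L) attention map over L flattened locations, values in [0, 1].
    instance_masks: (K, L) binary masks, one row per person instance.
    """
    # Target(i, j) = 1 if locations i and j belong to the same instance, else 0.
    target = (instance_masks.t().float() @ instance_masks.float()).clamp(max=1.0)
    return F.binary_cross_entropy(attn, target)


def group_keypoints(attn: torch.Tensor,
                    keypoint_locations: torch.Tensor,
                    instance_centers: torch.Tensor) -> torch.Tensor:
    """Assign each detected keypoint to the instance location it attends to most.

    keypoint_locations: (M,) flattened indices of detected keypoints.
    instance_centers:   (K,) flattened indices of candidate instance locations.
    Returns: (M,) index of the assigned instance for every keypoint.
    """
    # Pairwise attention scores between keypoints and candidate instances.
    scores = attn[keypoint_locations][:, instance_centers]   # (M, K)
    return scores.argmax(dim=1)


if __name__ == "__main__":
    L, K = 16, 2
    # Toy attention map with rows normalized like softmax outputs.
    attn = torch.rand(L, L)
    attn = attn / attn.sum(dim=1, keepdim=True)
    # Two toy instance masks covering disjoint halves of the feature grid.
    instance_masks = torch.zeros(K, L)
    instance_masks[0, :8] = 1
    instance_masks[1, 8:] = 1
    loss = attention_supervision_loss(attn, instance_masks)
    groups = group_keypoints(attn, torch.tensor([1, 5, 9, 13]), torch.tensor([3, 11]))
    print(loss.item(), groups.tolist())
```

Because each row of the supervised attention matrix is trained to highlight the locations of a single instance, the same matrix can also be thresholded to read off person instance masks, which matches the additional benefit noted in the abstract.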

Benchmarks

Benchmark                                        Methodology                  Metrics
multi-person-pose-estimation-on-coco             Supervising Self-Attention   AP: 0.665
multi-person-pose-estimation-on-coco-test-dev    Supervising Self-Attention   AP: 66.5
