
Abstract
We introduce associative embedding, a novel method for supervising convolutional neural networks on joint detection and grouping tasks. Many computer vision problems can be framed this way, including multi-person pose estimation, instance segmentation, and multi-object tracking. Whereas the grouping of detections is usually handled by multi-stage pipelines, our approach trains a network to output detections and group assignments simultaneously. The technique can be easily integrated into any state-of-the-art network architecture that produces pixel-wise predictions. We show how to apply the method to multi-person pose estimation and instance segmentation, and report state-of-the-art multi-person pose estimation results on the MPII and MS-COCO datasets.
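To make the "simultaneous detection and grouping" idea concrete, the sketch below implements the pull/push grouping loss on 1-D embedding tags, in the spirit of the paper: tags of one person's joints are pulled toward that person's mean tag, while mean tags of different people are pushed apart. This is a minimal NumPy illustration, not the authors' implementation; the `sigma` parameter and the exact normalization are assumptions.

```python
import numpy as np

def associative_embedding_loss(tags_per_person, sigma=1.0):
    """Grouping loss on scalar embedding tags (illustrative sketch).

    tags_per_person: list of 1-D arrays, one array of joint tags per person.
    Returns (pull_loss, push_loss).
    """
    # Reference embedding for each person: the mean of its joint tags.
    means = np.array([t.mean() for t in tags_per_person])

    # Pull term: squared deviation of each joint tag from its person's mean.
    pull = float(np.mean([np.mean((t - m) ** 2)
                          for t, m in zip(tags_per_person, means)]))

    # Push term: penalize pairs of people whose mean tags are close,
    # via a Gaussian of the pairwise differences (diagonal excluded).
    n = len(means)
    diffs = means[:, None] - means[None, :]
    gaussian = np.exp(-diffs ** 2 / (2.0 * sigma ** 2))
    push = float((gaussian.sum() - n) / max(n * (n - 1), 1))
    return pull, push
```

With well-separated tags per person, both terms are near zero; if two people share the same mean tag, the push term approaches 1, driving the network to assign them distinct embeddings.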
Code Repositories
open-mmlab/mmpose
pytorch
princeton-vl/pose-ae-train
pytorch
baodi23/hourglass-facekeypoints-detection
pytorch
Mentioned in GitHub
raymon-tian/hourglass-facekeypoints-detection
pytorch
Mentioned in GitHub
stevehjc/pose-ae-demo-tf
tf
Mentioned in GitHub
Benchmarks
| Benchmark | Method | Metrics |
|---|---|---|
| 2d-human-pose-estimation-on-coco-wholebody-1 | AE | WB: 27.4 body: 40.5 face: 47.7 foot: 7.7 hand: 34.1 |
| 2d-human-pose-estimation-on-ochuman | Associative Embedding+ | Test AP: 32.8 Validation AP: 40.0 |
| 2d-human-pose-estimation-on-ochuman | Associative Embedding | Test AP: 29.5 Validation AP: 32.1 |
| keypoint-detection-on-coco | Pose-AE | Test AP: 62.8 |
| keypoint-detection-on-coco-test-dev | AE | AP50: 86.8 AP75: 72.3 APL: 72.6 APM: 60.6 AR: 70.2 AR50: 89.5 AR75: 76.0 ARL: 78.1 ARM: 64.6 |
| keypoint-detection-on-mpii-multi-person | Associative Embedding | mAP@0.5: 77.5% |
| keypoint-detection-on-ochuman | Associative Embedding+ | Test AP: 32.8 Validation AP: 40.0 |
| keypoint-detection-on-ochuman | Associative Embedding | Test AP: 29.5 Validation AP: 32.1 |
| multi-person-pose-estimation-on-coco | Associative Embedding | AP: 0.655 |
| multi-person-pose-estimation-on-mpii-multi | Associative Embedding | AP: 77.5% |
| pose-estimation-on-ochuman | Associative Embedding | Test AP: 29.5 Validation AP: 32.1 |
| pose-estimation-on-ochuman | Associative Embedding+ | Test AP: 32.8 Validation AP: 40.0 |