Few-Shot Adaptive Gaze Estimation

Seonwook Park, Shalini De Mello, Pavlo Molchanov, Umar Iqbal, Otmar Hilliges, Jan Kautz

Abstract

Inter-personal anatomical differences limit the accuracy of person-independent gaze estimation networks. Yet there is a need to lower gaze errors further to enable applications requiring higher quality. Further gains can be achieved by personalizing gaze networks, ideally with few calibration samples. However, over-parameterized neural networks are not amenable to learning from few examples, as they can quickly over-fit. We embrace these challenges and propose a novel framework for Few-shot Adaptive GaZE Estimation (FAZE) for learning person-specific gaze networks with very few (≤ 9) calibration samples. FAZE learns a rotation-aware latent representation of gaze via a disentangling encoder-decoder architecture, along with a highly adaptable gaze estimator trained using meta-learning. It can adapt to any new person with as few as 3 samples, yielding state-of-the-art performance of 3.18° on GazeCapture, a 19% improvement over prior art. We open-source our code at https://github.com/NVlabs/few_shot_gaze
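The core idea of the abstract — meta-learning a gaze estimator that personalizes from a handful of calibration samples — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the linear gaze head, the `adapt_gaze_estimator` function, and the toy per-person bias are all hypothetical stand-ins, and the adaptation is plain gradient descent on the calibration samples rather than the paper's full meta-learning pipeline.

```python
import numpy as np

def adapt_gaze_estimator(w, b, feats, gaze, steps=500, lr=0.1):
    """Few-shot personalization sketch: starting from person-independent
    parameters (w, b), run gradient descent on the mean squared error over
    the few calibration samples. Hypothetical stand-in for FAZE's
    meta-learned, highly adaptable gaze estimator."""
    w, b = w.copy(), b.copy()
    n = len(feats)
    for _ in range(steps):
        pred = feats @ w + b          # (n, 2) predicted gaze (pitch, yaw)
        grad = (pred - gaze) / n      # gradient of 0.5 * mean squared error
        w -= lr * feats.T @ grad
        b -= lr * grad.sum(axis=0)
    return w, b

def mse(w, b, feats, gaze):
    return float(np.mean((feats @ w + b - gaze) ** 2))

# Toy setup: a person-specific offset models the inter-personal anatomical
# differences that limit person-independent networks.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(8, 2))          # shared latent-to-gaze mapping
person_bias = np.array([0.05, -0.03])     # this user's anatomical offset

calib_x = rng.normal(size=(9, 8))         # <= 9 calibration samples, as in FAZE
calib_y = calib_x @ W_true + person_bias

w0, b0 = W_true.copy(), np.zeros(2)       # person-independent starting point
mse_before = mse(w0, b0, calib_x, calib_y)
w1, b1 = adapt_gaze_estimator(w0, b0, calib_x, calib_y)
mse_after = mse(w1, b1, calib_x, calib_y)
```

After adaptation the calibration error drops well below the person-independent baseline; in FAZE the same effect is achieved on the learned rotation-aware latent features, with meta-learning ensuring the estimator adapts well from so few samples instead of over-fitting.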

Code Repositories

NVlabs/few_shot_gaze (official, PyTorch)

Benchmarks

Benchmark | Methodology | Metrics
gaze-estimation-on-mpii-gaze | FAZE | Angular Error: 3.14
