Perceptual Loss for Robust Unsupervised Homography Estimation

Daniel Koguciuk, Elahe Arani, Bahram Zonooz

Abstract

Homography estimation is often an indispensable step in many computer vision tasks. Existing approaches, however, are not robust to illumination and/or large viewpoint changes. In this paper, we propose the bidirectional implicit Homography Estimation (biHomE) loss for unsupervised homography estimation. biHomE minimizes the distance in feature space between the warped image from the source viewpoint and the corresponding image from the target viewpoint. Since we use a fixed pre-trained feature extractor and the only learnable component of our framework is the homography network, we effectively decouple homography estimation from representation learning. We add a photometric distortion step to synthetic COCO dataset generation to better represent the illumination variation of real-world scenarios. We show that biHomE achieves state-of-the-art performance on the synthetic COCO dataset, comparable to or better than supervised approaches. Furthermore, empirical results demonstrate that our approach is more robust to illumination variation than existing methods.
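
For concreteness, the PyTorch snippet below sketches the core idea under stated assumptions; it is a simplified illustration, not the authors' implementation (the exact loss formulation lives in the official NeurAI-Lab/biHomE repository). The `warp_with_homography` helper, the L1 feature distance, and the choice of a frozen ResNet-34 truncated after its first residual stage as the perceptual feature extractor are all illustrative assumptions, and the homographies are assumed to be expressed in the normalized [-1, 1] coordinates that `grid_sample` expects.

```python
import torch
import torch.nn.functional as F
import torchvision

# Fixed, pre-trained feature extractor, kept frozen: the homography
# network is the only trainable component, which decouples homography
# estimation from representation learning. Truncating ResNet-34 after
# its first residual stage is an illustrative choice, not necessarily
# the extractor used in the paper.
_resnet = torchvision.models.resnet34(weights="IMAGENET1K_V1")
feature_extractor = torch.nn.Sequential(
    _resnet.conv1, _resnet.bn1, _resnet.relu, _resnet.maxpool, _resnet.layer1
).eval()
for p in feature_extractor.parameters():
    p.requires_grad = False


def warp_with_homography(img, H):
    # Hypothetical helper: inverse-warp a batch of images (B, C, H, W)
    # with 3x3 homographies (B, 3, 3) given in normalized [-1, 1]
    # coordinates. Assumes positive homogeneous coordinates.
    b, _, h, w = img.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=img.device),
        torch.linspace(-1, 1, w, device=img.device),
        indexing="ij",
    )
    pts = torch.stack([xs, ys, torch.ones_like(xs)], dim=-1)  # (H, W, 3)
    pts = pts.view(1, -1, 3).expand(b, -1, -1)                # (B, H*W, 3)
    mapped = pts @ torch.inverse(H).transpose(1, 2)           # inverse warp
    grid = mapped[..., :2] / mapped[..., 2:].clamp(min=1e-8)
    return F.grid_sample(img, grid.view(b, h, w, 2), align_corners=True)


def bihome_loss(src, tgt, H_src_to_tgt, H_tgt_to_src):
    # Bidirectional perceptual loss: warp each image toward the other
    # viewpoint and compare it against the reference image in the frozen
    # extractor's feature space, in both directions.
    f_src = feature_extractor(src)
    f_tgt = feature_extractor(tgt)
    f_src_warped = feature_extractor(warp_with_homography(src, H_src_to_tgt))
    f_tgt_warped = feature_extractor(warp_with_homography(tgt, H_tgt_to_src))
    return F.l1_loss(f_src_warped, f_tgt) + F.l1_loss(f_tgt_warped, f_src)
```

Because the extractor is frozen, gradients flow only into whichever homography network produced H_src_to_tgt and H_tgt_to_src, which is what decouples homography estimation from representation learning in this setup.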

Code Repositories

NeurAI-Lab/biHomE (official, PyTorch)

Benchmarks

Benchmark                            Methodology     Metric
homography-estimation-on-pds-coco    PFNet+biHomE    MACE: 2.11
homography-estimation-on-s-coco      PFNet+biHomE    MACE: 1.79
