Learning data-adaptive interest points through epipolar adaptation
Research output: Contribution to journal › Conference article › Research › peer-review
Interest point detection and description are cornerstones of many computer vision applications. Handcrafted methods such as SIFT and ORB target generic interest points and do not lend themselves to data-driven adaptation. Recent deep learning models are generally supervised either with expensive 3D information or with synthetic 2D transformations such as homographies, which lead to improper handling of nuisance features such as occlusion junctions. In this paper, we propose an alternative form of supervision that leverages the epipolar constraint associated with the fundamental matrix. This approach brings useful 3D information to bear without requiring full depth estimation of every point in the scene. Our proposed approach, Epipolar Adaptation, fine-tunes both the interest point detector and the descriptor using a supervision signal provided by the epipolar constraint. We show that our method improves upon the baseline on a target dataset annotated with epipolar constraints, and that the epipolar-adapted models learn to correctly remove correspondences involving occlusion junctions.
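The epipolar constraint the abstract refers to states that, for a fundamental matrix F relating two views, a true correspondence of homogeneous image points x1 and x2 satisfies x2ᵀ F x1 = 0. The magnitude of this residual (or its first-order geometric refinement, the Sampson distance) can score putative matches without any depth estimation. The sketch below is purely illustrative and is not the authors' implementation; the function names and toy setup are our own:

```python
import numpy as np

def epipolar_error(F, x1, x2):
    """Algebraic epipolar residual x2^T F x1 for homogeneous points.

    Zero (up to noise) for a true correspondence; its magnitude can
    serve as a supervision signal for filtering candidate matches.
    """
    return float(x2 @ F @ x1)

def sampson_error(F, x1, x2):
    """Sampson distance: first-order approximation of the squared
    geometric error of a correspondence with respect to F."""
    Fx1 = F @ x1        # epipolar line of x1 in image 2
    Ftx2 = F.T @ x2     # epipolar line of x2 in image 1
    r = x2 @ F @ x1
    denom = Fx1[0]**2 + Fx1[1]**2 + Ftx2[0]**2 + Ftx2[1]**2
    return float(r**2 / denom)

# Toy case: pure translation along x with identity intrinsics gives
# F = [t]_x, so true correspondences share the same y coordinate.
F = np.array([[0.0, 0.0,  0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0,  0.0]])
good = (np.array([0.2, 0.5, 1.0]), np.array([0.7, 0.5, 1.0]))  # same y
bad  = (np.array([0.2, 0.5, 1.0]), np.array([0.7, 0.9, 1.0]))  # y shifted
```

Thresholding such a residual is one simple way a match involving an occlusion junction, whose two observations do not arise from a single 3D point, could be flagged and discarded.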
Original language | English |
---|---|
Journal | IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops |
Pages (from-to) | 1-7 |
Number of pages | 7 |
ISSN | 2160-7508 |
Publication status | Published - Jun 2019 |
Externally published | Yes |
Event | 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 - Long Beach, United States. Duration: 16 Jun 2019 → 20 Jun 2019 |
Conference
Conference | 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 |
---|---|
Country | United States |
City | Long Beach |
Period | 16/06/2019 → 20/06/2019 |
Bibliographical note
Funding Information:
This work was supported in part by a research gift from Magic Leap.
Publisher Copyright:
© 2019 IEEE Computer Society. All rights reserved.