MARKERLESS TRACKING USING POLAR CORRELATION OF CAMERA OPTICAL FLOW

Title: MARKERLESS TRACKING USING POLAR CORRELATION OF CAMERA OPTICAL FLOW.
Name(s): Gupta, Prince, Author
da Vitoria Lobo, Niels, Committee Chair
University of Central Florida, Degree Grantor
Type of Resource: text
Date Issued: 2010
Publisher: University of Central Florida
Language(s): English
Abstract/Description: We present a novel, real-time, markerless, vision-based tracking system employing a rigid orthogonal configuration of two pairs of opposing cameras. Our system uses optical flow over sparse features to overcome the limitation of vision-based systems that require markers or a pre-loaded model of the physical environment. We show how opposing cameras enable cancellation of the common components of optical flow, leading to an efficient tracking algorithm that captures five degrees of freedom, including the direction of translation and the angular velocity. Experiments comparing our device with an electromagnetic tracker show that its average tracking accuracy is 80% over 185 frames and that it can track large-range motions even in outdoor settings. We also show how opposing cameras in vision-based inside-looking-out systems can be used for gesture recognition. To demonstrate our approach, we discuss three different algorithms that recover the motion parameters of the multi-camera rig from optical flow in the opposing cameras, at different levels of completeness. Experimental results show gesture recognition accuracies of 88.0%, 90.7%, and 86.7% for our three techniques, respectively, across a set of 15 gestures.
Identifier: CFE0003163 (IID), ucf:48611 (fedora)
Note(s): 2010-05-01
M.S.
Engineering and Computer Science, School of Electrical Engineering and Computer Science
Masters
This record was generated from author-submitted information.
Subject(s): computer vision
user interface
device
optical flow
motion
tracking
egomotion
navigation
Persistent Link to This Record: http://purl.flvc.org/ucf/fd/CFE0003163
Restrictions on Access: public
Host Institution: UCF
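
The abstract above states that rigidly opposed cameras let common components of optical flow cancel, yielding an efficient tracker. The sketch below illustrates only that separation step in a hedged way: it is not the thesis' algorithm (which uses polar correlation over sparse features), and the function names, sign conventions, and synthetic data are assumptions introduced here for illustration.

```python
# Minimal, illustrative sketch (not the authors' implementation) of the
# opposing-camera flow-cancellation idea. It assumes two rigidly mounted,
# oppositely facing cameras and that a sparse optical-flow routine has
# already produced matched feature positions for each view. All names
# below are hypothetical.

import numpy as np


def mean_sparse_flow(prev_pts: np.ndarray, next_pts: np.ndarray) -> np.ndarray:
    """Average 2-D flow vector over sparse tracked features (N x 2 arrays)."""
    return (next_pts - prev_pts).mean(axis=0)


def split_opposing_flow(flow_cam_a: np.ndarray, flow_cam_b: np.ndarray):
    """Split the mean flows of one opposing-camera pair into common and
    differential parts.

    Averaging cancels whatever flow component appears with opposite sign in
    the two views; differencing cancels whatever appears with the same sign.
    Which physical motion (rotation about a given rig axis vs. translation
    along it) lands in each part depends on the rig's axis conventions,
    which are assumed here rather than taken from the record.
    """
    common = 0.5 * (flow_cam_a + flow_cam_b)
    differential = 0.5 * (flow_cam_a - flow_cam_b)
    return common, differential


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic example: a shared (rotation-like) flow plus an equal-and-
    # opposite (translation-like) flow, with per-feature noise.
    shared, opposing = np.array([1.5, 0.0]), np.array([0.0, -0.8])
    pts = rng.uniform(0, 640, size=(100, 2))
    flow_a = mean_sparse_flow(pts, pts + shared + opposing + rng.normal(0, 0.1, (100, 2)))
    flow_b = mean_sparse_flow(pts, pts + shared - opposing + rng.normal(0, 0.1, (100, 2)))
    common, diff = split_opposing_flow(flow_a, flow_b)
    print("common component:      ", common)  # approx. [1.5, 0.0]
    print("differential component:", diff)    # approx. [0.0, -0.8]
```

Because the differential component depends on scene depth while the common component does not, separating the two before estimating motion is what makes an efficient five-degree-of-freedom estimate plausible; the thesis' polar-correlation step, which this sketch does not attempt, is what turns these separated flows into actual motion parameters.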
