Target Identification Through Human Pointing Gesture Based on Human-Adaptive Approach

Abstract

We propose a human-adaptive approach for calculating human pointing targets, which integrates (1) calculating the user's subjective pointing direction from the measured finger direction, (2) combining sensory information obtained from user pointing with contextual information such as the user's action sequences, and (3) arranging target candidates based on the user's pointing characteristics and action sequences. The user's subjective pointing direction is approximated by a linear function of the finger direction. Integrating sensory and contextual information with a probabilistic model enables the system to calculate the target accurately. Using a force-directed approach, we obtained target placements that reduce false estimations while deviating little from the initial placement. Experimental results demonstrate the usefulness of our proposal.
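The abstract's first two ideas can be sketched in a few lines: a linear map from measured finger direction to the user's subjective pointing direction, followed by a Bayesian-style fusion of a pointing likelihood with a contextual prior over target candidates. This is a minimal illustration under assumed forms; the coefficients `a`, `b`, the Gaussian likelihood, and the function names are all hypothetical, not the paper's fitted model.

```python
import math

# Hypothetical linear correction: the paper approximates the user's
# subjective pointing direction as a linear function of the measured
# finger direction. Coefficients a and b are illustrative placeholders.
def subjective_direction(finger_angle, a=0.85, b=0.05):
    return a * finger_angle + b

# Fuse a Gaussian pointing likelihood with a contextual prior over
# target candidates (e.g. derived from the user's action sequence),
# then normalize -- a generic probabilistic fusion, assumed here as a
# stand-in for the paper's model.
def posterior_over_targets(pointing_dir, target_angles, prior, sigma=0.15):
    likelihoods = [
        math.exp(-((t - pointing_dir) ** 2) / (2 * sigma ** 2))
        for t in target_angles
    ]
    unnorm = [l * p for l, p in zip(likelihoods, prior)]
    z = sum(unnorm)
    return [u / z for u in unnorm]
```

With a uniform contextual prior, the posterior simply peaks at the candidate nearest the corrected pointing direction; a non-uniform prior lets the action-sequence context override a noisy pointing measurement.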

Publication
Journal of Robotics and Mechatronics, 20 (4)
Yusuke Tamura
Associate Professor
