CUED Publications database

Pantomimic Gestures for Human-Robot Interaction

Burke, M and Lasenby, J (2015) Pantomimic Gestures for Human-Robot Interaction. IEEE Transactions on Robotics, 31. pp. 1225-1237. ISSN 1552-3098

Full text not available from this repository.


This paper introduces a pantomimic gesture interface, which classifies human hand gestures using unmanned aerial vehicle (UAV) behavior recordings as training data. We argue that pantomimic gestures are more intuitive than iconic gestures and show that a pantomimic gesture recognition strategy using micro-UAV behavior recordings can be more robust than one trained directly using hand gestures. Hand gestures are isolated by applying a maximum information criterion, with features extracted using principal component analysis and compared using a nearest neighbor classifier. These features are biased in that they are better suited to classifying certain behaviors. We show how a Bayesian update step accounting for the geometry of training features compensates for this, resulting in fairer classification results, and introduce a weighted voting system to aid in sequence labeling.
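The pipeline the abstract describes — per-frame features matched to behaviour exemplars with a nearest-neighbour rule, a prior-based correction for class bias, and a weighted vote over the sequence — can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the feature vectors, behaviour labels, and the simple inverse-distance vote weight are all hypothetical stand-ins (the paper uses PCA-derived features and a Bayesian update based on training-feature geometry).

```python
import math
from collections import defaultdict

# Illustrative sketch (not the paper's code): classify per-frame
# feature vectors against behaviour "training" exemplars with a
# nearest-neighbour rule, then label the whole sequence by a
# weighted vote. Per-class priors stand in for the Bayesian
# correction that compensates for biased features.

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_frame(frame, exemplars):
    """Return (label, distance) of the nearest training exemplar."""
    return min(
        ((label, euclidean(frame, ex)) for label, ex in exemplars),
        key=lambda pair: pair[1],
    )

def label_sequence(frames, exemplars, priors):
    """Weighted vote over a gesture sequence: closer matches
    contribute more, scaled by a per-class prior that can
    down-weight behaviours the features favour unfairly."""
    votes = defaultdict(float)
    for frame in frames:
        label, dist = classify_frame(frame, exemplars)
        votes[label] += priors[label] / (1.0 + dist)
    return max(votes, key=votes.get)

# Hypothetical 2-D features for two UAV behaviours.
exemplars = [("circle", (0.0, 1.0)), ("hover", (1.0, 0.0))]
priors = {"circle": 0.5, "hover": 0.5}
sequence = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.2)]
print(label_sequence(sequence, exemplars, priors))  # → circle
```

In this toy run, two of the three frames fall nearest the "circle" exemplar and contribute larger (closer-match) votes, so the sequence is labelled "circle" despite one "hover" frame — the kind of robustness to per-frame misclassification the weighted vote is meant to provide.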

Item Type: Article
Divisions: Div F > Signal Processing and Communications
Date Deposited: 17 Jul 2017 19:34
Last Modified: 09 Sep 2021 03:17
DOI: 10.1109/TRO.2015.2475956