Human Head Motion Modeling Using Monocular Cues for Interactive Robotic Applications

Zukhraf Jamil, Abdullah Gulraiz, W. S. Qureshi, Chyi Yeu Lin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

In this paper, generalized 2D trajectory models are developed for human motion during walking and running, intended for user-centered Human-Computer Interaction design. Head motion trajectories are mapped through monocular cues to compute instantaneous displacement and velocity. The novelty of the work lies in using only a non-contact sensor, which avoids complex body-contact sensors and reduces data storage and processing requirements. The 2D trajectory models are tested on 30 subjects, and the results show that the average head motion trajectory of a running person differs from that of a person walking on a treadmill. The quantification of head velocity as a function of walking and running velocity is also tested, which can be used to identify different locomotion velocities from head velocities.
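The abstract describes computing instantaneous displacement and velocity from a tracked 2D head trajectory. The paper's actual pipeline is not given here; the following is a minimal finite-difference sketch under assumed inputs (the function name, a per-frame `(x, y)` position array in metres, and a known camera frame rate are all illustrative assumptions, not the authors' method):

```python
import numpy as np

def head_kinematics(positions, fps):
    """Finite-difference displacement and velocity from 2D head positions.

    positions: (N, 2) sequence of per-frame (x, y) head coordinates in metres
               (assumed already recovered from monocular cues).
    fps: camera frame rate in frames per second.
    Returns per-frame displacement magnitudes (m) and velocities (m/s).
    """
    positions = np.asarray(positions, dtype=float)
    dt = 1.0 / fps
    # Displacement vectors between consecutive frames, shape (N-1, 2).
    displacement = np.diff(positions, axis=0)
    # Magnitude of each step, in metres per frame.
    step = np.linalg.norm(displacement, axis=1)
    # Instantaneous speed, in metres per second.
    velocity = step / dt
    return step, velocity
```

Averaging such per-frame velocities over a gait cycle is one plausible way the head-velocity-versus-locomotion-velocity relationship mentioned above could be quantified.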

Original language: English
Title of host publication: 2019 International Conference on Robotics and Automation in Industry, ICRAI 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728130583
DOIs
Publication status: Published - Oct 2019
Externally published: Yes
Event: 3rd International Conference on Robotics and Automation in Industry, ICRAI 2019 - Rawalpindi, Pakistan
Duration: 21 Oct 2019 - 22 Oct 2019

Publication series

Name: 2019 International Conference on Robotics and Automation in Industry, ICRAI 2019

Conference

Conference: 3rd International Conference on Robotics and Automation in Industry, ICRAI 2019
Country/Territory: Pakistan
City: Rawalpindi
Period: 21/10/19 - 22/10/19

Keywords

  • 2D trajectory
  • Human-Computer Interaction
  • human motion modeling
  • monocular
  • non-contact sensor

