Estimating Gesture Accuracy in Motion-Based Health Games

Authors

  • Christian Barrett, Purdue University
  • Jacob Brown, Purdue University
  • Jay Hartford, Purdue University
  • Michael Hoerter, Purdue University
  • Andrew Kennedy, Purdue University
  • Ray Hassan, Purdue University
  • David Whittinghill, Purdue University https://orcid.org/0000-0002-2011-7893

DOI:

https://doi.org/10.20385/1860-2037/11.2014.8

Keywords:

Kinect, RGB-D camera, algorithms, application development, cerebral palsy, health games, physical therapy, serious games

Abstract

This manuscript details a technique for estimating gesture accuracy within the context of motion-based health video games using the Microsoft Kinect. We created a physical therapy game that requires players to imitate clinically significant reference gestures. Player performance is represented by the degree of similarity between the performed and reference gestures and is quantified by collecting the Euler angles of the player's gestures, converting them to a three-dimensional vector, and computing the magnitude of the difference between the performed and reference vectors. Lower difference values represent greater gestural correspondence and therefore greater player performance. A group of thirty-one subjects was tested. Subjects achieved gestural correspondence sufficient to complete the game's objectives while also improving their ability to perform reference gestures accurately.
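The metric described in the abstract can be illustrated with a short sketch. The Python snippet below is an illustrative interpretation rather than the authors' implementation: it assumes each gesture is summarized by three Euler angles, treats those angles as a three-dimensional vector, and reports the Euclidean magnitude of the difference between the performed and reference vectors, so that smaller values indicate closer correspondence. The function name and the sample angle values are hypothetical.

```python
import math

def gesture_difference(performed_angles, reference_angles):
    """Treat each gesture's three Euler angles as a 3D vector and return
    the magnitude of the difference between the performed and reference
    vectors. Lower values indicate greater gestural correspondence."""
    dx = performed_angles[0] - reference_angles[0]
    dy = performed_angles[1] - reference_angles[1]
    dz = performed_angles[2] - reference_angles[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz)

# Hypothetical example: a clinician-defined reference gesture versus a
# player's attempt captured from the Kinect skeleton (angles in degrees).
reference = (90.0, 15.0, 0.0)
performed = (78.0, 20.0, 4.0)

score = gesture_difference(performed, reference)
print(f"Gesture difference: {score:.2f}")  # smaller = better performance
```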

Published

2014-12-01

How to Cite

Barrett, C., Brown, J., Hartford, J., Hoerter, M., Kennedy, A., Hassan, R., & Whittinghill, D. (2014). Estimating Gesture Accuracy in Motion-Based Health Games. Journal of Virtual Reality and Broadcasting, 11. https://doi.org/10.20385/1860-2037/11.2014.8

Issue

Vol. 11 (2014)

Section

Articles