Abstract
This paper presents a simplified, intuitive representation of upper-body gestures for developers. The representation is based on user motion parameters, in particular the rotational and translational components of the body segments involved in a gesture. The resulting static representation aims to provide a rapid visualization of the movement complexity of each body segment taking part in the gesture. To illustrate the model, the model and algorithms used to produce the representation have been applied to a dataset of 10 representative gestures.