Acted Rhythmic Gestures Dataset


The database contains 20 sequences acquired with a Microsoft Kinect sensor. Participants were recorded while sitting in front of the sensor and acting different rhythmic gestures, such as hands fidgeting, legs fidgeting and rocking.

For more information on the use of this database, please visit the Automatic Multimodal Descriptors of Rhythmic Body Movement project page.


Marwa Mahmoud

Louis-Philippe Morency

Peter Robinson


Marwa Mahmoud, Louis-Philippe Morency, and Peter Robinson
'Automatic Multimodal Descriptors of Rhythmic Body Movement'
International Conference on Multimodal Interaction, Sydney, Australia, December 2013



To download the Acted Rhythmic Gestures Dataset, please download, sign, and return the agreement form to this e-mail address.

There are 20 zip files, one for each participant.

Each zip file contains the recorded video, depth data, the skeleton file generated by Microsoft Kinect, and a CSV annotation file containing the gesture label per frame. For questions, please contact Marwa Mahmoud.
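As a minimal sketch of working with the per-frame annotations, the snippet below reads one participant's CSV file and tallies how many frames carry each gesture label. The column layout is an assumption (one row per frame, with the frame number followed by the gesture label); the real files may differ, so adjust the column indices accordingly.

```python
import csv
from collections import Counter


def load_frame_labels(path):
    """Read a per-frame annotation CSV and return {frame_number: gesture_label}.

    Assumed layout (not confirmed by the dataset description): each row is
    (frame_number, gesture_label), optionally preceded by a header row.
    """
    labels = {}
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) < 2:
                continue
            frame, label = row[0].strip(), row[1].strip()
            if frame.isdigit():  # skips a header row, if one is present
                labels[int(frame)] = label
    return labels


def label_histogram(labels):
    """Count how many frames carry each gesture label."""
    return Counter(labels.values())
```

For example, `label_histogram(load_frame_labels("participant01.csv"))` would summarise how much of the recording each rhythmic gesture occupies.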


Automatic Multimodal Descriptors of Rhythmic Body Movement Project page

ICT Multicomp Lab

Institute of Creative Technologies