Using Kinect + OpenNI to Embody an Avatar in Second Life


Download the software to connect the Microsoft Kinect to Second Life.

At the MxR Lab at the University of Southern California Institute for Creative Technologies, we are developing methods of recognizing social gestures in order to explore the transference of emotion and gesture between a virtual world and the real world. Thai Phan, an engineer at the MxR Lab, has developed new software, built on the OpenNI toolkit, that uses the Kinect to read gestures and trigger corresponding server-side scripts within Second Life. These methods may allow users to feel a deeper emotional connection to the social gestures performed by their virtual avatars, regardless of the bond that already exists between a user and their recipient. Instead of having to think about pressing the right sequence of keys to make a ‘wave’ gesture, the user can simply raise their hand and wave.
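The pipeline described above — tracking a hand with the Kinect and mapping a recognized motion to an avatar command — can be sketched in miniature. This is an illustrative outline, not the MxR Lab's software: in the real system OpenNI's skeleton tracker supplies the joint coordinates from the Kinect each frame, and a detected gesture would trigger the corresponding server-side script in Second Life. All function names, thresholds, and sample data here are assumptions.

```python
# Hypothetical sketch of wave detection from tracked hand positions.
# In the real pipeline, OpenNI's skeleton tracker supplies per-frame
# hand/head joint coordinates; here we feed in a recorded sequence.

def is_hand_raised(hand_y, head_y):
    """A hand counts as 'raised' when it is above the head
    (smaller y in image coordinates)."""
    return hand_y < head_y

def count_direction_changes(xs, min_delta=0.05):
    """Count left/right reversals of horizontal hand motion,
    ignoring jitter smaller than min_delta."""
    changes = 0
    last_dir = 0
    for prev, cur in zip(xs, xs[1:]):
        delta = cur - prev
        if abs(delta) < min_delta:
            continue
        direction = 1 if delta > 0 else -1
        if last_dir and direction != last_dir:
            changes += 1
        last_dir = direction
    return changes

def detect_wave(frames, head_y=1.0, min_reversals=2):
    """frames: list of (hand_x, hand_y) samples. A wave is a raised
    hand whose x position reverses direction min_reversals times."""
    raised = [(x, y) for x, y in frames if is_hand_raised(y, head_y)]
    if len(raised) < 3:
        return False
    return count_direction_changes([x for x, _ in raised]) >= min_reversals

# A raised hand oscillating left and right is recognized as a wave;
# the client would then fire the avatar's 'wave' gesture in Second Life.
frames = [(0.0, 0.5), (0.2, 0.5), (0.4, 0.5), (0.2, 0.5), (0.0, 0.5), (0.2, 0.5)]
print(detect_wave(frames))  # prints True
```

The point of the sketch is the mapping: gesture recognition runs on joint trajectories client-side, and only the resulting high-level command (e.g. "wave") is sent to the virtual world, so the user never touches the keyboard.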


This project is made possible through the use of the OpenNI toolkit, the Kinect, and Second Life. The software was developed by Thai Phan, an engineer at the MxR Lab at the USC Institute for Creative Technologies and a computer science graduate student in the USC Viterbi School of Engineering.

Phan was first introduced to the MxR Lab through his work with Diane Tucker, a third-year MFA student in the Interactive Media Department, on her thesis project exploring the embodiment of gestures and how they affect a player's emotions.

In choosing the navigation technique for this demo, we looked to proven control mechanisms that harness the familiar yet unconscious feedback systems humans have developed for negotiating the real world.