
Flexible Action and Articulated Skeleton Toolkit (FAAST)

Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas

Project Email Address: faast@ict.usc.edu

 

DOWNLOAD FAAST 1.2

32-bit
(recommended for most users)

64-bit
(for advanced users)

 

Note from Evan Suma, the developer of FAAST: I have recently transitioned to a faculty position at USC, and unfortunately that means I have very limited time for further development of the toolkit. Future updates may occur but will likely be sporadic. Additionally, it is no longer possible for me to respond to support and feature requests from the community.  

You may also view our online video gallery, which contains videos that demonstrate FAAST’s capabilities, as well as interesting applications that use the toolkit.

 

Have a Kinect for Windows v2?

We have developed an experimental version of FAAST with support for the Kinect for Windows v2, available for download here (64-bit only).  Please note that you must already have the Microsoft Kinect SDK v2 installed and the KinectService application running.  This version is based on preliminary software and/or hardware, and is subject to change.

 

Recent News

December 12, 2013
FAAST 1.2 has been released, adding compatibility for Windows 8.  This is also the first version to include both 32-bit and 64-bit binaries.

Summary

FAAST is middleware to facilitate integration of full-body control with games and VR applications using either OpenNI or the Microsoft Kinect for Windows skeleton tracking software. FAAST includes a custom VRPN server to stream up to four user skeletons over a network, allowing VR applications to read the skeletal joints as trackers using any VRPN client. The toolkit can also emulate keyboard input triggered by body posture and specific gestures. This allows the user to add custom body-based control mechanisms to existing off-the-shelf games that do not provide official support for depth sensors.

FAAST is free to use and distribute for both commercial and noncommercial purposes.  However, you must still abide by the licensing terms of any third party software you install for skeleton tracking (either OpenNI software or Microsoft Kinect for Windows).  Please see the websites of these libraries for more information.

If you use FAAST to support your research project, we request that any publications resulting from the use of this software include a reference to the toolkit.  Additionally, we encourage you to send us an email about your project, so we can compile a list of projects that use FAAST. This will help us pursue funding to maintain the software and add new functionality.  The publication to reference is:

E. Suma, D. Krum, B. Lange, S. Koenig, A. Rizzo, and M. Bolas. Adapting user interfaces for gestural interaction with the flexible action and articulated skeleton toolkit. Computers & Graphics, 37(3):193–201, 2013.

FAAST is currently available for Windows only.

Support

FAAST can easily be integrated with existing game engines, VR toolkits, and other applications using a networked client-server architecture provided by the Virtual Reality Peripheral Network (VRPN).  We have also developed a plugin for the Unity3D engine, which we will be posting online soon.  FAAST is also officially supported by several commercial toolkits, including 3DVIA Studio and the Vizard toolkit from WorldViz.

Installation

To run FAAST with a Kinect sensor, you will need to install the Microsoft Kinect for Windows 1.8 runtime.  FAAST should then run out of the box without any additional setup.  Both the OpenNI and Microsoft Kinect for Windows skeleton trackers are supported.

If you encounter an error on startup, you may also need to install the Microsoft Visual C++ 2012 Redistributable Package.

Skeleton Usage

FAAST streams up to four user skeletons over a VRPN server.  The four skeletons are identified as Tracker0, Tracker1, Tracker2, and Tracker3.  For example, to read from the first skeleton, you should connect to “Tracker0@ip_address” in your VRPN client (“Tracker0@localhost” if running on the same machine as the client). The server automatically starts when the toolkit connects to a sensor.  For each skeleton, a total of 24 skeleton joint transformations (including position and rotation) are streamed as sensors. Following the OpenNI framework, the joints are ordered as follows:

Sensor  Joint             Sensor  Joint
0       Head              12      Right Elbow
1       Neck              13      Right Wrist
2       Torso             14      Right Hand
3       Waist             15      Right Fingertip
4       Left Collar       16      Left Hip
5       Left Shoulder     17      Left Knee
6       Left Elbow        18      Left Ankle
7       Left Wrist        19      Left Foot
8       Left Hand         20      Right Hip
9       Left Fingertip    21      Right Knee
10      Right Collar      22      Right Ankle
11      Right Shoulder    23      Right Foot
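For convenience, the sensor-to-joint mapping above can be expressed as a simple lookup table.  This is an illustrative Python sketch, not code shipped with FAAST:

```python
# Joint names indexed by VRPN sensor number (0-23), in OpenNI order,
# matching the table above.
FAAST_JOINTS = [
    "Head", "Neck", "Torso", "Waist",
    "Left Collar", "Left Shoulder", "Left Elbow", "Left Wrist",
    "Left Hand", "Left Fingertip",
    "Right Collar", "Right Shoulder", "Right Elbow", "Right Wrist",
    "Right Hand", "Right Fingertip",
    "Left Hip", "Left Knee", "Left Ankle", "Left Foot",
    "Right Hip", "Right Knee", "Right Ankle", "Right Foot",
]

def joint_name(sensor):
    """Return the joint name for a VRPN sensor index (0-23)."""
    if not 0 <= sensor < len(FAAST_JOINTS):
        raise ValueError("sensor index out of range: %d" % sensor)
    return FAAST_JOINTS[sensor]
```

A VRPN client's tracker callback receives the sensor number with each report, so a table like this lets you label incoming joint data by name.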

Note: The joint positions and orientations can be streamed in either the world coordinate system or the local coordinate system (each joint relative to its parent).  Positions are reported in meters relative to the sensor’s position, and orientations are reported as quaternions, in accordance with the VRPN standard units.
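As an illustration of how a client might use these orientation values, the sketch below rotates a vector by a joint’s quaternion, assuming VRPN’s (x, y, z, w) component ordering.  This is a hypothetical helper for illustration, not part of FAAST or VRPN:

```python
def _cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q, given in (x, y, z, w) order.

    Uses the identity v' = v + w*t + (q_xyz x t), where t = 2*(q_xyz x v).
    """
    qv = (q[0], q[1], q[2])
    w = q[3]
    t = tuple(2.0 * c for c in _cross(qv, v))
    u = _cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))
```

For example, a quaternion representing a 90-degree rotation about the vertical axis would map a forward-pointing bone vector onto the sideways axis; this is handy when converting streamed joint orientations into direction vectors for an avatar.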

Input Emulator Usage

FAAST includes a graphical user interface for designing custom gestures and mapping them to a series of input events.  Multiple input and output events can be specified either simultaneously or in sequence.  Any event listed immediately after a previous event will be considered simultaneous (e.g. moving two hands forward at once).  Two or more events separated by a time interval will be treated as sequential, allowing for more complicated movements over a period of time (e.g. drawing a specific symbol in the air).
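The simultaneous-versus-sequential distinction can be sketched as follows.  This is a hypothetical illustration of the matching logic only, not FAAST’s actual gesture format; all event names are invented:

```python
def matches(gesture, observations, window=1.0):
    """Check whether a gesture occurs in a stream of body-event observations.

    gesture:      list of steps; each step is a set of event names that must
                  all be active at once (simultaneous events).
    observations: time-ordered list of (timestamp_seconds, active_event_set).
    window:       maximum seconds allowed between consecutive matched steps.
    """
    step = 0
    last_t = None
    for t, active in observations:
        # If too much time passed since the last matched step, the
        # sequential gesture fails.
        if step > 0 and last_t is not None and t - last_t > window:
            return False
        # A step matches only when every event in it is active simultaneously.
        if gesture[step] <= active:
            last_t = t
            step += 1
            if step == len(gesture):
                return True
    return False
```

For example, a two-step gesture such as “left hand forward, then right hand forward” matches only if the second step follows the first within the time window, while a single step containing two events requires both to be held at the same instant.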

More detailed documentation, along with a video tutorial, will be posted soon.