Evan A. Suma, Belinda Lange, Skip Rizzo, David Krum, and Mark Bolas
Project Email Address: firstname.lastname@example.org
Note from Evan Suma, the developer of FAAST: I have recently transitioned to a faculty position at USC, and unfortunately that means I have very limited time for further development of the toolkit. Future updates may occur but will likely be sporadic. Additionally, it is no longer possible for me to respond to support and feature requests from the community.
You may also view our online video gallery, which contains videos that demonstrate FAAST’s capabilities, as well as interesting applications that use the toolkit.
September 30, 2013
FAAST 1.1 has been released. This version adds support for the Microsoft Kinect for Windows SDK 1.8 and OpenNI 2.2. A number of new features have been added, including support for seated mode, a new configurable graphical user interface, and the ability to dynamically disconnect from and reconnect to the sensor.
FAAST is middleware that facilitates the integration of full-body control with games and VR applications, using either the OpenNI or Microsoft Kinect for Windows skeleton tracking software. FAAST includes a custom VRPN server that streams up to four user skeletons over a network, allowing VR applications to read the skeletal joints as trackers using any VRPN client. The toolkit can also emulate keyboard input triggered by body posture and specific gestures. This allows the user to add custom body-based control mechanisms to existing off-the-shelf games that do not provide official support for depth sensors.
FAAST is free to use and distribute for both commercial and noncommercial purposes. However, you must still abide by the licensing terms of any third party software you install for skeleton tracking (either OpenNI software or Microsoft Kinect for Windows). Please see the websites of these libraries for more information.
E. Suma, D. Krum, B. Lange, S. Koenig, A. Rizzo, and M. Bolas. Adapting user interfaces for gestural interaction with the flexible action and articulated skeleton toolkit. Computers & Graphics, 37(3):193–201, 2013.
FAAST is currently available for Windows only.
FAAST can easily be integrated with existing game engines, VR toolkits, and other applications using a networked client-server architecture provided by the Virtual Reality Peripheral Network (VRPN). We have also developed a plugin for the Unity3D engine, which we will be posting online soon. FAAST is also officially supported by several commercial toolkits, including 3DVIA Studio and the Vizard toolkit from WorldViz.
To run FAAST, you first need to install a skeleton tracker. Both the OpenNI and Microsoft Kinect for Windows implementations are supported. Only one tracker needs to be installed, and you can select the one you want to use at run-time.
Option 1: Microsoft Kinect for Windows
To use the Microsoft skeleton tracker, you must download and install the Microsoft Kinect for Windows SDK 1.8. This is the simplest option for novice users.
Option 2: OpenNI
To use the OpenNI skeleton tracker, you must download and install the following software packages from the OpenNI website.
You will also need to install a driver for the sensor. If you are using the Kinect, you can install the Microsoft Kinect for Windows SDK, and its drivers will work with the OpenNI skeleton tracker. If you are using a different sensor (e.g. the PrimeSense Carmine), drivers can typically be found on the manufacturer's or OpenNI's website.
Switching Between Trackers
You can have both the OpenNI and Microsoft software installed at the same time. As of version 1.1, you can now switch between the two skeleton trackers dynamically without having to restart FAAST!
If you encounter an error on startup, you may also need to install the Microsoft Visual C++ 2012 Redistributable Package.
FAAST streams up to four user skeletons over a VRPN server. The four skeletons are identified as Tracker0, Tracker1, Tracker2, and Tracker3. For example, to read the first skeleton, connect to “Tracker0@ip_address” in your VRPN client (“Tracker0@localhost” if the client is running on the same machine as FAAST). The server starts automatically when the toolkit connects to a sensor. For each skeleton, a total of 24 skeleton joint transformations (including position and rotation) are streamed as sensors. Following the OpenNI framework, the joints are ordered as follows:
| Sensor | Joint           | Sensor | Joint            |
|--------|-----------------|--------|------------------|
| 0      | Head            | 12     | Right Elbow      |
| 1      | Neck            | 13     | Right Wrist      |
| 2      | Torso           | 14     | Right Hand       |
| 3      | Waist           | 15     | Right Fingertip  |
| 4      | Left Collar     | 16     | Left Hip         |
| 5      | Left Shoulder   | 17     | Left Knee        |
| 6      | Left Elbow      | 18     | Left Ankle       |
| 7      | Left Wrist      | 19     | Left Foot        |
| 8      | Left Hand       | 20     | Right Hip        |
| 9      | Left Fingertip  | 21     | Right Knee       |
| 10     | Right Collar    | 22     | Right Ankle      |
| 11     | Right Shoulder  | 23     | Right Foot       |
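As an illustration (not part of FAAST itself), the joint ordering and the VRPN device naming convention can be captured in a small lookup sketched here in Python; the `device_name` helper is hypothetical, shown only to make the “TrackerN@host” convention concrete:

```python
# Hypothetical helper (not part of FAAST): VRPN sensor index -> joint name,
# following the OpenNI-based ordering used by FAAST.
JOINTS = [
    "Head", "Neck", "Torso", "Waist",
    "Left Collar", "Left Shoulder", "Left Elbow", "Left Wrist",
    "Left Hand", "Left Fingertip",
    "Right Collar", "Right Shoulder", "Right Elbow", "Right Wrist",
    "Right Hand", "Right Fingertip",
    "Left Hip", "Left Knee", "Left Ankle", "Left Foot",
    "Right Hip", "Right Knee", "Right Ankle", "Right Foot",
]

def device_name(skeleton: int, host: str = "localhost") -> str:
    """Build the VRPN device string for one of the four tracked skeletons."""
    if not 0 <= skeleton <= 3:
        raise ValueError("FAAST streams at most four skeletons (0-3)")
    return f"Tracker{skeleton}@{host}"

print(len(JOINTS))     # 24 joints per skeleton
print(device_name(0))  # Tracker0@localhost
```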
Note: The joint positions and orientations can be streamed in either the world coordinate system or local coordinate system (each joint relative to the parent). Positions are reported in meters relative to the sensor’s position and orientations are reported as quaternions, in accordance with the VRPN standard units.
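For readers unfamiliar with quaternion orientations, the following sketch (plain Python, independent of FAAST) shows how a client might rotate a direction vector by a reported joint orientation. It assumes the (x, y, z, w) component order used by VRPN's quatlib; verify the ordering against your own client library.

```python
# Illustrative only: rotate a 3-vector by a unit quaternion, as a VRPN
# client might when interpreting a joint orientation.
# Assumes quaternion component order (x, y, z, w).
import math

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (x, y, z, w)."""
    x, y, z, w = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    # v' = v + w * t + cross(q.xyz, t)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

# A 90-degree rotation about the vertical (y) axis applied to the forward axis:
s, c = math.sin(math.pi / 4), math.cos(math.pi / 4)
rotated = quat_rotate((0.0, s, 0.0, c), (0.0, 0.0, 1.0))
# rotated is approximately (1, 0, 0)
```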
Input Emulator Usage
FAAST includes a graphical user interface for designing custom gestures and mapping them to a series of input events. Multiple input and output events can be specified, either simultaneously or in sequence. Any event listed immediately after a previous event is considered simultaneous with it (e.g. moving both hands forward at once). Two or more events separated by a time interval are treated as sequential, allowing for more complicated movements over a period of time (e.g. drawing a specific symbol in the air).
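The simultaneous-versus-sequential distinction can be illustrated with a toy model (this is not FAAST's actual gesture engine): events grouped into one step must fire together, while events in later steps must be observed afterward, in order.

```python
# Toy model (not FAAST's implementation): a gesture is a list of steps,
# each step a set of events that must occur simultaneously; separate
# steps must be matched in time order.
def matches(gesture, observed):
    """gesture: list of sets of events (one set per time step).
    observed: list of sets of events actually seen, in time order.
    Returns True if the gesture's steps appear in order in observed."""
    step = 0
    for frame in observed:
        if step < len(gesture) and gesture[step] <= frame:
            step += 1
    return step == len(gesture)

# "Both hands forward at once" is a single simultaneous step...
punch = [{"left_hand_forward", "right_hand_forward"}]
# ...while a two-part wave is sequential.
wave = [{"right_hand_left"}, {"right_hand_right"}]

print(matches(punch, [{"left_hand_forward", "right_hand_forward"}]))  # True
print(matches(wave, [{"right_hand_left"}, {"right_hand_right"}]))     # True
print(matches(wave, [{"right_hand_right"}, {"right_hand_left"}]))     # False
```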
More detailed documentation, along with a video tutorial, will be posted soon.