Category: Virtual Reality

You Can Change the Laws of Physics and Win the Best Poster Award at the IEEE Virtual Reality Conference

Can we adapt if the laws of physics change on us? In virtual reality environments, the laws of physics are very malleable. Yes, we can adapt, as long as we are given good visual feedback in a timely manner. Our research into “Motor Adaptation in Response to Scaling and Diminished Feedback in Virtual Reality” was awarded Best Poster at the 2017 IEEE Virtual Reality Conference. Congratulations to David Krum, Thai Phan, and Sin-Hwa Kang.
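The poster's methods are not detailed here, but the core idea of scaled visual feedback can be sketched: the virtual hand moves by a gain times the real hand's displacement, and with visual feedback users adapt their reaches to compensate. A minimal illustration; the gain value and function name are ours, not taken from the study:

```python
# Minimal sketch of scaled visual feedback in VR (illustrative only).
# A gain other than 1.0 makes the virtual hand travel farther (or less
# far) than the real hand; given visual feedback, users adapt over time.

def virtual_position(real_start, real_current, gain):
    """Map real hand displacement to virtual hand position via a scaling gain."""
    displacement = [c - s for s, c in zip(real_start, real_current)]
    return [s + gain * d for s, d in zip(real_start, displacement)]

# Real hand moves 10 cm along x; with a gain of 2.0 the virtual hand moves 20 cm.
start = [0.0, 0.0, 0.0]
moved = [0.10, 0.0, 0.0]
print(virtual_position(start, moved, gain=2.0))  # [0.2, 0.0, 0.0]
```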


ICT Mixed Reality Lab “Tested” by Norm Chan


Norm Chan, from Adam Savage’s Tested, recently visited ICT’s Mixed Reality Lab. We talked about our research, which helped create low-cost virtual reality displays, and described our latest research projects. See the video.


Ryan Spicer donning some headgear to explore neurological motor control in VR. This project is with Dr. Sook-Lei Liew, who has joint appointments with the USC Division of Biokinesiology and Physical Therapy and the Keck School of Medicine of USC, Department of Neurology.


Near-Field VR Wins Immersive Realities Contest at SIGGRAPH 2015



The MxR Lab has been hard at work creating a unique immersive experience entitled “Discovering Near-Field VR: Stop Motion with a Touch of Light-Fields and a Dash of Redirection,” which just won the Immersive Realities AR/VR Contest at SIGGRAPH 2015. The contest was held to showcase the best immersive reality applications with live demonstrations in the new VR Village venue.

“Discovering Near-Field VR” combined efforts across disciplines at USC with students from the School of Cinematic Arts and the Viterbi School of Engineering collaborating with researchers at the Institute for Creative Technologies to produce a unique piece that introduces the art of stop motion animation to the field of virtual reality. The goal was to create a surreal experience by developing virtual reality techniques that look unlike traditional game engine graphics and leverage the perceptual affordances of the near-field. Light field rendering and redirected walking techniques were leveraged to create an interactive full-body experience.
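Redirected walking, one of the techniques mentioned above, is commonly implemented by applying a small gain to the user's tracked head rotation, so the virtual scene turns slightly more or less than the head does and the walking path can be steered within a limited physical space. A minimal sketch of a rotation gain; the gain value and function name are illustrative, not taken from this demo:

```python
import math

# Illustrative redirected-walking rotation gain (not this demo's parameters).
# Each frame, the virtual camera yaw advances by the real head-yaw change
# multiplied by a gain; gains slightly away from 1.0 are typically
# imperceptible but let the system steer users inside a small tracked space.

def apply_rotation_gain(virtual_yaw, real_yaw_delta, gain=1.1):
    """Return the new virtual yaw after scaling the real yaw change."""
    return (virtual_yaw + gain * real_yaw_delta) % (2 * math.pi)

# A real 90-degree head turn becomes a 99-degree virtual turn at gain 1.1.
yaw = apply_rotation_gain(0.0, math.radians(90))
print(round(math.degrees(yaw), 1))  # 99.0
```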

The Immersive Realities AR/VR contest had 48 submissions from all over the world. A total of ten pieces were demonstrated at the VR Village, with the winner selected from among the top three finalists.

This effort brought together the contributions of numerous people across multiple disciplines. We would like to express our sincere gratitude to everyone who made “Discovering Near-Field VR” possible:

USC School of Cinematic Arts and Institute for Creative Technologies:
Mark Bolas

USC School of Cinematic Arts:
Vangelis Lympouridis
Fernando Rabelo
Christine Barron
Catalina Matamoros
Cristina Brous
Alicja Jasina
Yawen Zheng
Wasef El-Kharouf
Anshul Pendse
Lindsey Townley

USC Institute for Creative Technologies:
Thai Phan
Evan Suma
Andrew Jones
Paul Debevec
David M. Krum
Timofey Grechkin
David Nelson
Ryan Spicer
Rhys Yahata

USC Viterbi School of Engineering:
Ashok Kuruvilla
Shravani Chintalapudi
Joy D’Souza
Nathan Iskandar
Ashley Yu-Chih
Jin Zhang
Mahdi Azmandian

Philip Eberhart

Otherworld Interactive:
Mike Murdoch
Robyn Gray
Mitch Thompson

PhaseSpace Inc.:
Tracy McSherry
Kan Anant

We would like to thank the US Army for funding research that made this work possible. Statements and opinions expressed do not necessarily reflect the position or the policy of the United States Government, and no official endorsement should be inferred.

DARPA to release software developed by Mixed Reality Lab

The Defense Advanced Research Projects Agency (DARPA) is responsible for funding and developing blue-sky advancements in technology and science for the US Department of Defense. These have included significant roles in developing GPS, the Internet, autonomous cars, integrated circuits, and hypersonic aircraft. DARPA has now made some of its software available for free through an Open Source Software Catalog.

ICT’s Mixed Reality Lab is proud to be represented in this catalog with software related to “Immersive Body Based Interactions”, a research project that seeks to understand and address the human-computer interface challenges raised by Big Data. MxR team members David Krum and Thai Phan developed and contributed software for low-cost, tablet-based virtual reality displays and innovative multi-touch software for accessing large databases.




Additional coverage:
Information Week

MxR Lab Organizes Trip to Mars at SIGGRAPH with Virtual Viewing Party

Mark Bolas, Perry Hoberman, and the team at the Mixed Reality Lab (MxR) at the Institute for Creative Technologies at USC brought attendees of SIGGRAPH to the red planet. Well, at least they provided an immersive virtual reality experience in which people could traverse the surface of the Gale Crater, the site where the Curiosity Rover would land that very night. The price for this interplanetary journey? A free app on your iPhone or Android and a low-cost immersive viewer, the plans for which are free to download on the MxR Lab’s website.

“Technology has progressed so far in the past year that it is truly possible to put VR into everyone’s hands,” remarked Professor Bolas.

Using the FOV2GO (a smartphone- and tablet-based, low-cost DIY VR viewer platform with stereoscopic software designed by Perry Hoberman, which won Best Demo at this year’s IEEE Virtual Reality Conference), lead designer Thai Phan optimized a detailed stereo model of Mars’ Gale Crater, courtesy of JPL-CalTech Multimedia and NASA/JPL-CalTech, to provide an immersive 3D experience.
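Viewers like FOV2GO work by rendering a side-by-side stereo pair: the scene is drawn once per eye, with each eye's camera offset by half the interpupillary distance, into the left and right halves of the phone screen. A minimal sketch of the per-eye camera placement; the IPD value and function name are our assumptions, not taken from the FOV2GO software:

```python
# Sketch of side-by-side stereo camera placement for a phone-based viewer
# in the spirit of FOV2GO (values and names are illustrative assumptions).

IPD = 0.064  # typical interpupillary distance in meters (assumed)

def eye_positions(head_pos, right_axis, ipd=IPD):
    """Offset the head position by +/- half the IPD along the head's right axis."""
    half = ipd / 2.0
    left  = [h - half * r for h, r in zip(head_pos, right_axis)]
    right = [h + half * r for h, r in zip(head_pos, right_axis)]
    return left, right  # render the scene once per eye, into each half-screen

left, right = eye_positions([0.0, 1.6, 0.0], [1.0, 0.0, 0.0])
print(left, right)  # [-0.032, 1.6, 0.0] [0.032, 1.6, 0.0]
```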

To celebrate Curiosity’s exploration of our neighboring planet, the group hosted a “Virtual Viewing Party” on the opening night of SIGGRAPH, coordinated with a live broadcast of the event projected on two screens in the LA Convention Center screening room. The team provided an assortment of FOV2GO viewers, enabling people to navigate in and around the crater, finding points of interest and unique facts provided by NASA.

The team can foresee virtually bringing you to other amazing places as well. Right now it’s FOV2GO to MARS, but eventually it could be FOV2GO to the MOON, the Indian Ocean, Paris, or anywhere you can imagine.


SIGGRAPH 2012 attendees experience FOV2GO to MARS

The team included project leaders Mark Bolas and Perry Hoberman; producers David Nelson, David Krum, Peggy Weil, and Nonny de la Peña; and lead designer Thai Phan.