You Can Change the Laws of Physics and Win the Best Poster Award at IEEE Virtual Reality Conference

Can we adapt if the laws of physics change on us? In virtual reality environments, the laws of physics are very malleable. Yes, we can adapt, as long as we are given good visual feedback in a timely manner. Our research into “Motor Adaptation in Response to Scaling and Diminished Feedback in Virtual Reality” was awarded Best Poster at the 2017 IEEE Virtual Reality Conference. Congratulations to David Krum, Thai Phan, and Sin-Hwa Kang.


MxR Studio “Left of Boom” Student Developer Team Top 3 Finalist for Body Computing Conference VR Hackathon

The student team working as part of the MxR Lab’s research effort to develop the “Language of VR” and determine best practices for creating rich mixed environments was selected as one of three finalists in the 2nd Annual VR Hackathon at the 10th Annual USC Global Body Computing Conference, “The Future of Digital Health,” with their submission, “The Bystander Project.”

The VR logline for the project is: A participant puts on the HTC Vive headset and is transported to a college house party, where they are encouraged to look for signs of potential sexual harassment or abuse and act upon them. This leads the participant to discover that they have the power to manipulate time: to pause, rewind, and change the outcome of the scenario through the use of their presence and voice, ultimately recognizing the importance of being an active bystander.

The Left of Boom team comprised Allison Comrie, John Francis, Brian Handy, Duc Ho, Chris Horrigan, Jyotsna Kadimi, Mari Kyle, Atley Loughridge, and Vathsal Shashidhar, working under the supervision of the MxR Studio team:

Executive Director, Todd Richmond
Creative Director, David Nelson
Technical Director, Rhys Yahata


ICT Mixed Reality Lab “Tested” by Norm Chan


Norm Chan, from Adam Savage’s Tested, recently visited ICT’s Mixed Reality Lab. We talked about our research that helped create low-cost virtual reality displays, and we described our latest research projects. See the video.

technology soup

We were asked what companies/technologies we are currently using in our work. As we looked around the lab, we made a list but only checked it once. For those that are curious: HTC Vive, HoloLens, Oculus Rift, Unity, Kinect, Leap Motion, Intel RealSense, Perception Neuron (motion capture), PhaseSpace (positional tracking), zSpace (3D fishtank VR), OpenBCI (brain-computer interface), Emotiv EPOC+ (brain-computer interface), Neuroelectrics Starstim 32 (EEG/transcranial stimulation), Agisoft PhotoScan (photogrammetry), Pix4D (photogrammetry), DJI Phantom (quadcopter for photogrammetry capture), Faro (lidar plus camera scanner), Tobii EyeX (eye tracking), Scosche (heart rate monitor), Crazyflie (micro UAV), Loco Positioning (RF tracking), Raspberry Pi (avatar capture), Arduino (microcontrollers).

And we’ll see what’s up next…

doing science 29jul16

Ryan Spicer donning some headgear to explore neurological motor control in VR. This project is with Dr. Sook-Lei Liew, who has joint appointments with the USC Division of Biokinesiology and Physical Therapy and the Department of Neurology at the Keck School of Medicine of USC.



David Nelson

As part of the “Rosetta Project,” the MxR Lab and Studio’s effort to study the developing Language of VR and formulate best practices for immersive Mixed Reality content development, I have tried to compose an analog of a film logline for a Virtual Reality Experience (VRE). The intention is to help illustrate the various types of experiences one can have in VR and to communicate the creative intention of the VRE. This effort is mostly sponsored by the Army Research Office, under our Emerging Concepts in Virtual Environments for Training project. The template (below) may require additional development, but it has proven quite useful in communicating the projects we are developing at the MxR Studio with our Summer Crunch student teams.


A participant puts on the [HMD Platform] and is transported to [Location/Environment], where they are able to [Level of Agency/Interaction], which leads them to discover [Theme or Objective]. The experience ends and the participant experiences an [Emotional Impact].




Loglines are a tool used to summarize a feature-length script in one or two sentences, illustrating the essential elements of the story. A logline can serve as a story barometer: if one cannot articulate the sum of a script in a succinct sentence or two, that may highlight problems with the story from the outset.

Loglines typically contain a few key elements:

  • The main character.
  • The world where the story takes place.
  • The main character’s goal or desire.
  • The obstacle/opposition that prevents the main character from achieving their goal.
  • The stakes (sometimes the theme or emotional hook is stated here as well).


An unrelenting CIA operative must track down the elusive Osama Bin Laden as she risks it all against his fanatical followers and her own bureaucratic agency. (Zero Dark Thirty)

A precocious private high school student whose life revolves around his school competes with its most famous and successful alumnus for the affection of a first grade teacher. (Rushmore)

Three bumbling groomsmen lose their about-to-be-wed buddy during their drunken misadventures, then must retrace their steps in order to find him before the wedding begins. (The Hangover)



A participant puts on the [HMD Platform] and is transported to [Location/Environment], where they are able to [Level of Agency/Interaction], which leads them to discover [Theme or Objective]. The experience ends and the participant experiences an [Emotional Impact].

A user puts on an Oculus Rift and is transported to the bridge of a ship, where they are able to use a virtual touchscreen to control the ship and “teleport” to various views around the ship, providing a deeper level of situational awareness than is available in the physical world. The experience leaves the user inspired by a glimpse of what a future Naval workplace might be like. (BlueShark, ICT)

A participant puts on an HTC Vive and is transported to a stark open space where they are able to use their hand controllers as paintbrushes. They find that they can create 3D paintings and sculptures in the space around them, discovering their own artistic talent in a fantastic and magical experience. (Tilt Brush, Google)

A participant puts on a Samsung Gear VR headset and is transported to the shores of Liberia, where they are able to look around to take in the sights and sounds around them. The voice of a woman narrator is heard in prayer, discussing her experience with a recent outbreak of Ebola, as the participant is brought to witness many scenes of recovery around present day Liberia. The intimate sense of presence evokes a feeling of empathy in the participant, having become privy to one person’s story amidst this global event. (Waves of Grace, Within)
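The slot-filling pattern in the examples above can be sketched in code. This is purely an illustration, not part of the MxR template itself: the variable names and fill-in values are my paraphrase of the BlueShark example, using Python's standard `string.Template`.

```python
from string import Template

# The five slots of the VRE logline, expressed as a fill-in-the-blank
# string (slot names paraphrased from the template above).
VRE_LOGLINE = Template(
    "A participant puts on $platform and is transported to $environment, "
    "where they are able to $agency, which leads them to discover $theme. "
    "The experience ends and the participant experiences $impact."
)

# Filling the slots with values paraphrased from the BlueShark example.
blueshark = VRE_LOGLINE.substitute(
    platform="an Oculus Rift",
    environment="the bridge of a ship",
    agency="use a virtual touchscreen to control the ship",
    theme="a deeper level of situational awareness",
    impact="a sense of inspiration about a future Naval workplace",
)
print(blueshark)
```

Swapping in a different platform, environment, or emotional impact yields a logline for a different VRE, which is exactly how the template is used across the examples in this post.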

The MxR Lab continues to research and experiment with VR content creation in search of the new grammar that will serve as the building blocks of the developing Language of VR.