
Category: Virtual Reality

VR and Stroke Rehabilitation

REINVENT is a collaboration between the ICT Mixed Reality Lab and Dr. Sook-Lei Liew to leverage virtual reality and EEG for stroke rehabilitation. The story was recently covered by USC News. “It’s a blend of tech, industry, science and the clinic,” says Dr. Liew. “It really takes it to a whole new level.”

REINVENT: Virtual Reality Rehab at USC Division of Biokinesiology and Physical Therapy from USC Dentistry on Vimeo.

You Can Change the Laws of Physics and Win the Best Poster Award at IEEE Virtual Reality Conference

Can we adapt if the laws of physics change on us? In virtual reality environments, the laws of physics are very malleable. It turns out we can adapt, as long as we are given good visual feedback in a timely manner. Our research into “Motor Adaptation in Response to Scaling and Diminished Feedback in Virtual Reality” was awarded Best Poster at the 2017 IEEE Virtual Reality Conference. Congratulations to David Krum, Thai Phan, and Sin-Hwa Kang.

 

ICT Mixed Reality Lab “Tested” by Norm Chan


Norm Chan, from Adam Savage’s Tested (www.tested.com), recently visited ICT’s Mixed Reality Lab. We talked about our research, which helped create low-cost virtual reality displays, and described our latest research projects. See the video.


Ryan Spicer donning some headgear to explore neurological motor control in VR. This project is with Dr. Sook-Lei Liew, who has joint appointments with the USC Division of Biokinesiology and Physical Therapy and the Keck School of Medicine of USC, Department of Neurology.


Near-Field VR Wins Immersive Realities Contest at SIGGRAPH 2015

 


The MxR Lab has been hard at work creating a unique immersive experience entitled “Discovering Near-Field VR: Stop Motion with a Touch of Light-Fields and a Dash of Redirection,” which just won the Immersive Realities AR/VR Contest at SIGGRAPH 2015. The contest was held to showcase the best immersive reality applications with live demonstrations in the new VR Village venue.

“Discovering Near-Field VR” combined efforts across disciplines at USC, with students from the School of Cinematic Arts and the Viterbi School of Engineering collaborating with researchers at the Institute for Creative Technologies to produce a unique piece that introduces the art of stop motion animation to the field of virtual reality. The goal was to create a surreal experience by developing virtual reality techniques that look unlike traditional game engine graphics and that leverage the perceptual affordances of the near-field. Light field rendering and redirected walking techniques were combined to create an interactive full-body experience.
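To give a flavor of one of these techniques: redirected walking typically works by applying a small gain to the user’s physical head rotation, so the virtual scene turns slightly faster (or slower) than the head does, steering the user within a limited physical space. The sketch below is purely illustrative — the gain value, function name, and structure are assumptions for explanation, not the project’s actual implementation.

```python
import math

def apply_rotation_gain(virtual_yaw, delta_yaw, gain=1.2):
    """Return the updated virtual yaw after a physical head turn.

    delta_yaw is the physical head rotation (radians) since the last
    frame. A gain slightly above 1.0 makes the virtual world rotate
    faster than the head, nudging the user away from physical walls;
    gains near 1.0 are generally imperceptible to the user.
    """
    return (virtual_yaw + gain * delta_yaw) % (2 * math.pi)

# A physical quarter turn (90 degrees) with a 1.2x gain becomes a
# 108-degree turn in the virtual environment.
yaw = apply_rotation_gain(0.0, math.pi / 2, gain=1.2)
```

Accumulating these small per-frame discrepancies over many turns is what lets a large virtual environment fit inside a much smaller tracked room.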

The Immersive Realities AR/VR contest received 48 submissions from all over the world. A total of ten pieces were demonstrated at the VR Village, with the contest winner selected from among the top three finalists.

This effort brought together the contributions of numerous people across multiple disciplines. We would like to express our sincere gratitude to everyone who made “Discovering Near-Field VR” possible:

USC School of Cinematic Arts and Institute for Creative Technologies:
Mark Bolas

USC School of Cinematic Arts:
Vangelis Lympouridis
Fernando Rabelo
Christine Barron
Catalina Matamoros
Cristina Brous
Alicja Jasina
Yawen Zheng
Wasef El-Kharouf
Anshul Pendse
Lindsey Townley

USC Institute for Creative Technologies:
Thai Phan
Evan Suma
Andrew Jones
Paul Debevec
David M. Krum
Timofey Grechkin
David Nelson
Ryan Spicer
Rhys Yahata

USC Viterbi School of Engineering:
Ashok Kuruvilla
Shravani Chintalapudi
Joy D’Souza
Nathan Iskandar
Ashley Yu-Chih
Jin Zhang
Mahdi Azmandian

Music:
Philip Eberhart

Otherworld Interactive:
Mike Murdoch
Robyn Gray
Mitch Thompson

Phasespace Inc.:
Tracy McSherry
Kan Anant

We would like to thank the US Army for funding research that made this work possible. Statements and opinions expressed do not necessarily reflect the position or the policy of the United States Government, and no official endorsement should be inferred.

DARPA to release software developed by Mixed Reality Lab

The Defense Advanced Research Projects Agency (DARPA) is responsible for funding and developing blue-sky advancements in technology and science for the US Department of Defense. These have included significant roles in developing GPS, the Internet, autonomous cars, integrated circuits, and hypersonic aircraft. DARPA has now made some of its software available for free through an Open Source Software Catalog:
http://www.darpa.mil/opencatalog/

ICT’s Mixed Reality Lab is proud to be represented in this catalog with software related to “Immersive Body Based Interactions”, a research project that seeks to understand and address the human-computer interface challenges raised by Big Data. MxR team members David Krum and Thai Phan developed and contributed software for low-cost tablet-based virtual reality displays and innovative multi-touch software for accessing large databases.

 


Additional coverage:
Wired
Gizmodo
Information Week
Slate
Slashdot