Multi-modal 3D Tele-Immersion Research Project holds its Annual Meeting

Multi-modal 3D Tele-Immersion is a collaborative project involving the University of Texas at Dallas (the lead institution), the University of Illinois at Urbana-Champaign, the University of California at Berkeley, and the Dallas Veterans Affairs Hospital. The project, funded by a $2.4 million grant from the National Science Foundation, aims to design and develop a collaborative, multi-modal immersive virtual reality environment for medical personnel and remote patients. The system uses several 3D cameras to create avatars of people in two different locations and places them together in the same virtual space, where they can interact. Because it must track movement and represent multiple views, the system processes large amounts of data, which can create a serious bottleneck in the form of lag or transmission delay. The UT Dallas research team, consisting of Drs. Balakrishnan “Prabha” Prabhakaran, Xiaohu Guo, Roozbeh Jafari, Mark Spong, and their students, is responsible for creating the algorithms and software needed to transmit the data over the Internet in real time.
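To see why transmission delay is a concern, a back-of-envelope estimate helps. The sketch below uses illustrative, assumed numbers (Kinect-class RGB-D cameras at VGA resolution, a four-camera rig), not figures from the project itself:

```python
# Back-of-envelope estimate of raw sensor bandwidth for a multi-camera
# 3D tele-immersion rig. All parameters are illustrative assumptions
# (Kinect-class RGB-D cameras), not numbers from the project.

def raw_rate_bytes_per_sec(width, height, fps, bytes_per_pixel):
    """Uncompressed data rate for one video stream, in bytes per second."""
    return width * height * bytes_per_pixel * fps

depth = raw_rate_bytes_per_sec(640, 480, 30, 2)   # 16-bit depth map
color = raw_rate_bytes_per_sec(640, 480, 30, 3)   # 24-bit RGB image
per_camera = depth + color                        # one RGB-D camera
num_cameras = 4                                   # assumed rig size

total_gbit_per_sec = per_camera * num_cameras * 8 / 1e9
print(f"{total_gbit_per_sec:.2f} Gbit/s raw")     # ~1.47 Gbit/s
```

Even at this modest resolution, the uncompressed streams sum to well over a gigabit per second, far beyond typical Internet links of the time, which is why compression, prediction, and transport-protocol work are central to the project.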

On April 20 and 21, principal investigators of the project and their students from the participating institutions met on the UT Dallas campus to review the project’s progress and to share the results obtained so far. Presentations by researchers from UT Dallas covered a wide spectrum of topics, including predictive techniques for low-latency tele-immersion, integration and calibration of inertial body sensors with 3D cameras, tracking human skeletons with multiple 3D cameras, using the new MPEG Media Transport (MMT) international standard for 3D Tele-Immersion (3DTI), and advanced techniques for faster, higher-quality rendering of 3DTI images. The researchers from the University of Illinois at Urbana-Champaign presented their work on view-based rendering of a 3D amphitheater, demonstrating 3D virtual environments that emulate amphitheater scenarios. Here, the audience’s viewpoint is captured by body sensors, and the performers’ scene is rendered in 3D based on that viewpoint. The Berkeley team discussed the performance of a 3DTI system deployed between Berkeley and an overseas facility.

The workshop participants also gave presentations on 3D tele-rehabilitation, showing how haptic devices and 3D cameras can be integrated for tele-rehabilitation applications. This integration enables force-feedback-based immersive interactions between a remote therapist or physician and a patient. The system will enter field trials in the summer and early fall, during which physical medicine and rehabilitation specialists at the Dallas Veterans Affairs (VA) Hospital will test its features by having patients interact with remote physical therapists. The project is expected to have a significant impact in both education and health care by providing tele-rehabilitation with increased accuracy and flexibility.

The UT Dallas Computer Science Department maintains a very active research program, with more than $8.4 million in research expenditures in 2014 and approximately $38 million in new funding received during the last four years. This funding comes from federal agencies and from local high-tech companies. The Multi-modal 3D Tele-Immersion project is one of many such ongoing research projects.

For more information on this subject, you can read a previous article from UT Dallas News.