SMILE VR Project Lands $750K NSF Grant to Revolutionize Learning and Training
Virtual reality (VR) offers a captivating and immersive way to transform learning and training, especially when it comes to safety-related concepts.
Despite its potential, the intricate and costly development process has slowed its widespread adoption. In an exciting breakthrough, computer science professors Drs. Balakrishnan Prabhakaran, Shiyi Wei, Yu Xiang, and Jin Ryong Kim of the University of Texas at Dallas have secured a $750,000 National Science Foundation (NSF) grant for their Scan to Multi-sensorial Interactive Learning Environment (SMILE) project. This funding highlights the innovation behind SMILE and marks a significant step toward bringing cutting-edge VR learning experiences to life.
The SMILE project is a testament to the interdisciplinary nature of modern research. Its primary goal is to design and develop an innovative infrastructure that enables nearly automated construction of VR environments. These environments are designed to mimic real-world indoor scenes and support interactions with virtual objects that engage multiple senses: sight, hearing, touch, and smell. This broad approach underscores the project's potential impact and relevance across a variety of fields.
SMILE has the potential to significantly lower the cost of training students to use STEM laboratories in an intuitive way, reducing barriers to entry for underrepresented students in STEM fields and fostering a more diverse and inclusive academic landscape.