
UT Dallas CS and Samsung Join Forces to Offer First Ever Mobile VR Course

This past fall, UT Dallas, in collaboration with Samsung Electronics America, provided UT Dallas Computer Science students the opportunity to participate in an in-depth Virtual Reality (VR) course. Samsung donated 30 Samsung Gear VR Innovator Edition head-mounted displays (HMDs) along with 30 Samsung Galaxy S6 smartphones to the UT Dallas Virtual Reality course instructed by Dr. Ryan McMahan.

Dr. McMahan’s CS 6334 Virtual Reality course afforded students an in-depth overview of VR, including input and output devices, 3D navigation techniques, 3D selection and manipulation, system control techniques, interaction/scenario/display fidelity, design guidelines, and evaluation methods. In contrast with prior years, the students who were enrolled in the course this past fall semester were able to learn all the same concepts, but in the context of mobile VR, thanks to Samsung’s donation.

Over the span of the course, 43 students divided into seven teams worked together to develop various applications. That experience taught students how to create graphical user interfaces within virtual environments, create multiplayer experiences, develop 3D interaction techniques such as ray-casting selection, and integrate other wearables into their projects.

At the final presentations, Samsung executives were able to experience the Gear VR applications that the students had created with the donated equipment. During the final presentations, each team set up their Gear VRs so that other students, faculty members, and Samsung executives could walk around and experience their final product. Among the project demonstrations was a Virtual Reality Endoscopic Surgery.

This project was a simulation of immersive endoscopic surgery enhanced with a haptic force feedback device. The mission of the project was to provide a risk-free and highly immersive environment for pre-surgical planning and the training of surgeons. A few of the techniques that the group utilized during production of their project included 3D modeling, haptic rendering and interaction using the Geomagic Touch Haptic Device, and data streaming using the Photon multiplayer network. Other student projects included in the final demonstrations were a first-person shooter game using wearables, a robot tank battle game involving in-game programming, an augmented reality (AR) viewer with see-through filters, an AR navigational wayfinding aid, a 3D menu system for viewing media, and a 3D menu system for viewing file structures.

The donations made by Samsung were valued at $28K. With the donations, UT Dallas became the first university to offer a college-level course that incorporates the Samsung Gear VR to teach virtual reality.

Below are descriptions of each team’s project and the techniques they used while building it.


Project Name: Menu of Loci (3D menu system for viewing media)

Project Description: The Menu of Loci takes relatively static virtual reality menus and re-imagines them as part of a larger 3D space where items reside in virtual locations, hence the “loci” in the project’s name. This design places the user in a virtual galaxy where a number of planets represent different categories in an online video store. For example, a user “travels” to a category’s planet, which brings up a number of video choices, allowing the user to view the video of their choice. The project also included a second solar system to demonstrate how different types of stores could expand the user’s space, giving multiple VR storefronts a unique look while tying them together in a unified space.

Techniques Used: The project utilized the ray-casting technique. Ray-casting is a technique in which the user directs a virtual line, or ray, in a direction, which is then used as a pointer to select objects with a touch on the touchpad. In this project, the ray points in the direction the user is looking. Pointing this ray at a planet and touching the touchpad moves the user to that planet. Once at the planet of their choosing, the user can do the same with a video thumbnail, except now touching the touchpad instructs the system to play the video. This gave the system high learnability for the user and rapid implementation for the group, which allowed the group to focus more on creating the space itself.
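The article does not include the team’s Unity code, but gaze-based ray-casting selection of this kind can be sketched in plain Python. The planet names, positions, and spherical collision shapes below are illustrative assumptions, not details from the project:

```python
import math

# Minimal sketch of gaze-based ray-casting selection: a ray starts at the
# user's head position, points along the gaze direction, and the nearest
# intersected sphere (a "planet") is the selected object.

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None if missed."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic 'a' is 1: direction is normalized
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

def pick(origin, gaze, spheres):
    """Select the closest sphere the gaze ray intersects, by name."""
    gaze = normalize(gaze)
    best = None
    for name, center, radius in spheres:
        t = ray_hits_sphere(origin, gaze, center, radius)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None

# Hypothetical category planets in front of the user.
planets = [("Comedy", (0, 0, 10), 2.0), ("Drama", (5, 0, 10), 2.0)]
print(pick((0, 0, 0), (0, 0, 1), planets))  # gazing straight ahead -> Comedy
```

In the real application the touchpad tap, not the gaze itself, confirms the selection; here the `pick` call stands in for that confirmation step.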


Project Name: Magic Camera (Augmented Reality (AR) viewer with see-through filters)

Project Description: Magic Camera is an application that allows Samsung Gear VR users to see the real world, through the front camera on the Samsung Galaxy S6 smartphone, under different types of shaders. Magic Camera implements 14 different shaders, including thermal vision, night vision, a raining effect, a fish-eye effect, pixelation, black and white, and the ripple effect from the Ripple World app.

Techniques Used: This team used Unity to create the application. Creating the effects relied more on implementing algorithms than on VR control techniques. To change the camera shaders, users simply swipe forward or backward on the Gear VR’s control pad.
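The swipe-to-cycle interaction described above can be sketched as a small state machine. The real app applied Unity shaders to the live camera feed; the shader names and the swipe handler below are illustrative assumptions:

```python
# Sketch of cycling through camera shaders with forward/backward swipes.
# The list is a stand-in for the app's 14 shaders.

SHADERS = ["thermal", "night_vision", "fisheye", "pixelate", "grayscale"]

class ShaderCycler:
    def __init__(self, shaders):
        self.shaders = shaders
        self.index = 0  # start on the first shader

    def on_swipe(self, direction):
        """direction: +1 for a forward swipe, -1 for a backward swipe."""
        self.index = (self.index + direction) % len(self.shaders)
        return self.shaders[self.index]

cam = ShaderCycler(SHADERS)
print(cam.on_swipe(+1))  # night_vision
print(cam.on_swipe(-1))  # back to thermal
```

The modulo wrap-around means the user can keep swiping in one direction and loop through all shaders, which matches the continuous swipe gesture on the Gear VR touchpad.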


Project Name: 3D Menu System (3D menu system for file structures)

Project Description: The objective of the project was to develop a novel approach to accessing files, folders, and directories dynamically. The idea for the project was modeled after the movie “The Matrix.” To achieve this concept, the group designed a system of doors and racks in a hallway. As the user navigates the hallway, directories within a 2-meter radius come into view while previous directories are removed, so as not to obstruct the user’s view.

After the group received feedback from several users regarding the usability of their project, the team designed a traditional floating menu system to provide the user with an alternative interface. The group notes, “While many users liked the floating menu, most users preferred the hallway system as it was a more fun way to access files and folders. However, Samsung thought it was more of a ‘Hollywood’ style of accessing files and folders.”

Techniques Used: To create some of the features of the 3D menu system, the group used the ray-casting technique, so that the user can select any directory, subfolder, or other selectable feature in the menu. To simplify navigation, the group added a system control that groups files and folders into alphabetical ranges, such as A-F, G-P, and so on.


Project Name: Augmented Reality – Choose What You Want (AR navigational wayfinding aid)

Project Description: The project’s mission was to attempt augmented reality, where virtual elements are superimposed onto the real world.

The group’s goal for the project was to make navigation possible while wearing the Gear VR by having the user choose a destination from a suggested list using voice recognition and a search engine.

Through the Samsung Galaxy S6 smartphone attached to the Gear VR, the user sees the real world through the camera view and can walk around, with a map and compass overlaid in their view. To choose a desired destination, the user inputs a keyword using voice recognition.

Techniques Used: The group used multiple Google application programming interfaces (APIs) to fetch data for all the location-based necessities. Once collected, the data was used to calculate and display user-friendly information, such as distance and bearings.
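The distance and bearing calculations mentioned above are standard great-circle formulas over latitude/longitude pairs. The team’s actual code (which pulled coordinates from Google APIs) is not shown in the article; this is a generic sketch of that math:

```python
import math

# Great-circle distance (haversine) and initial compass bearing between
# two latitude/longitude points, as a wayfinding aid would display them.

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine great-circle distance in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360) % 360
```

The bearing is what drives the on-screen compass arrow; the distance would be re-computed as the user walks and their GPS position updates.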


Project Name: Virtual Reality Endoscopic Surgery

Project Description: The project is a simulation of immersive endoscopic surgery, enhanced with haptic force feedback. The project aims to provide a risk-free and highly immersive environment for pre-surgical planning and training for surgeons.

Techniques Used: The project utilized techniques such as 3D modeling and rendering, haptic rendering and interaction based on the Geomagic Touch Haptic Device (formerly named the Phantom Omni), and data streaming using the Photon multiplayer network.


Project Name: First Person Shooter (FPS) Game using Wearables

Project Description: The group’s project is a proof-of-concept for a first-person shooter game using wearables, such as the Samsung Gear VR and an Android smartwatch. The group implemented three levels of increasing difficulty.

Techniques Used: The group utilized the gyroscope sensors in the Android smartwatch to control the movement of the player within the game’s environment. The group used Bluetooth to collect the sensor readings from the smartwatch and transfer them to the Samsung Galaxy S6 smartphone in the Gear VR.
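One common way to turn raw gyroscope rates into player movement is to apply a dead zone (to ignore wrist jitter) and clamp the rest to a maximum speed. The thresholds, axis mapping, and units below are assumptions for illustration, not details of the team’s sensor pipeline:

```python
# Sketch of mapping smartwatch gyroscope readings (angular rates in rad/s)
# to a per-frame movement step. Values below are illustrative assumptions.

DEAD_ZONE = 0.15  # rad/s below which a reading is treated as noise
SPEED = 2.0       # meters per second at full tilt

def movement_from_gyro(pitch_rate, roll_rate, dt):
    """Map angular rates to a (dx, dz) step for one frame of duration dt."""
    def axis(rate):
        if abs(rate) < DEAD_ZONE:
            return 0.0  # inside the dead zone: ignore jitter
        return max(-1.0, min(1.0, rate)) * SPEED * dt
    return axis(roll_rate), axis(pitch_rate)

# Tilting forward (pitch) moves the player; a tiny roll reading is ignored.
print(movement_from_gyro(0.5, 0.05, dt=0.016))
```

In the actual project these readings arrived over Bluetooth each frame; here the function is called directly with sample values.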


Project Name: Robot Tank Battle Game

Project Description: For this project, the team created a robot tank battle game involving in-game programming, similar to the programming game “Robocode” and its predecessors. In the VR game, the player writes a short script to control a tank, which then battles an enemy robot tank in arena-style combat.

Techniques Used: What makes this project unique is that the coding is done in a virtual world. The team wanted a proof-of-concept showing that programming is possible (and not too difficult) inside VR. To make this possible, the team created a block scripting language similar to Scratch or Snap!. In the game, the user drags and drops 3D code blocks, which then form a program.
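A block program like this ultimately boils down to an ordered list of (block, argument) pairs that an interpreter executes against the tank. The block names and the simplified tank model below are illustrative assumptions, not the team’s actual language:

```python
# Sketch of interpreting a drag-and-drop block program that drives a tank.
# The movement model is deliberately simplified to one axis for illustration.

class Tank:
    def __init__(self):
        self.x = 0        # position (simplified to 1D)
        self.heading = 0  # facing, in degrees

    def forward(self, n):
        self.x += n  # real game would move along the heading vector

    def turn(self, deg):
        self.heading = (self.heading + deg) % 360

def run_blocks(tank, blocks):
    """Execute a list of (block_name, argument) pairs in order."""
    ops = {"forward": tank.forward, "turn": tank.turn}
    for name, arg in blocks:
        ops[name](arg)

tank = Tank()
run_blocks(tank, [("forward", 3), ("turn", 90), ("forward", 2)])
print(tank.x, tank.heading)  # 5 90
```

In the VR game, the list of blocks is whatever the player assembled by dragging 3D code blocks together; the interpreter then runs it during the battle.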


 

First Person Shooter using Wearables
Menu of Loci
Augmented Reality – Choose What You Want (AR navigational wayfinding aid)
Magic Camera (Augmented Reality (AR) viewer with see-through filters)
Virtual Reality Endoscopic Surgery
Robot Tank Battle Game with in-game programming
3D Menu System (3D menu system for file structures) – Samsung executive testing out the product
First Person Shooter team
3D Menu System’s team



About the UT Dallas Computer Science Department

The UT Dallas Computer Science program is one of the largest Computer Science departments in the United States with over 1,600 bachelor’s-degree students, more than 1,100 master’s students, 160 PhD students, and 80 faculty members, as of Fall 2015. With The University of Texas at Dallas’ unique history of starting as a graduate institution first, the CS Department is built on a legacy of valuing innovative research and providing advanced training for software engineers and computer scientists.
