Wireless AR/VR with Edge/Cloud Computing

Principal Investigator: Sujit Dey


With the recent arrival of head-mounted display (HMD) devices such as Oculus Rift, HTC Vive, and Samsung Gear VR, significant interest has developed in virtual reality (VR) systems, experiences, and applications. However, current HMD devices are still heavy and bulky, which negatively affects the user experience. Moreover, current VR approaches perform rendering locally, either on a mobile device or on a computer/console tethered to the HMD.

In our projects, we investigate how to enable a truly portable and mobile VR/AR experience: lightweight VR/AR glasses connect wirelessly to edge/cloud computing devices that perform the rendering remotely. We explore and develop techniques that enable three-degrees-of-freedom (3DoF) and six-degrees-of-freedom (6DoF) immersive experiences, with both natural videos and VR applications, under ultra-low latency requirements. Such immersive VR applications involve additional user activities: head rotation (for both 3DoF and 6DoF) and body movement (for 6DoF). While an acceptable response time to user control commands is 100-200 ms (similar to first-person shooter games), a much lower latency of 10-20 ms is needed for head rotation and body movement.
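To make these budgets concrete, the following minimal Python sketch sums the stages of a remote-rendering pipeline into a motion-to-photon figure and checks it against the requirements above. The stage names and millisecond values are illustrative assumptions, not measurements from these projects.

```python
# Illustrative motion-to-photon latency budget for remotely rendered VR.
# All stage costs below are hypothetical placeholders, not measured values.

LATENCY_BUDGET_MS = {
    "head_rotation": 20,     # upper bound of the 10-20 ms requirement
    "control_command": 200,  # upper bound of the 100-200 ms requirement
}

def motion_to_photon_ms(sensing, uplink, render, encode, downlink, decode, display):
    """Sum the pipeline stages (all in milliseconds) between a head
    movement and the corresponding photons reaching the eye."""
    return sensing + uplink + render + encode + downlink + decode + display

# Example with plausible (assumed) stage costs for edge-based remote rendering.
total = motion_to_photon_ms(
    sensing=1, uplink=2, render=5, encode=3, downlink=2, decode=2, display=4
)
budget = LATENCY_BUDGET_MS["head_rotation"]
print(f"motion-to-photon: {total} ms (budget: {budget} ms) -> "
      f"{'OK' if total <= budget else 'over budget'}")
```

Under these assumed numbers the pipeline fits the head-rotation budget, but note how little headroom remains once any network hop contributes more than a few milliseconds.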

Enabling wireless VR/AR with edge/cloud computing poses two main challenges: the ultra-high throughput required and the ultra-low latency required. Our projects aim to address both, and we summarize our three associated sub-projects below. Broadly, there are three possible approaches to realizing wireless VR/AR with a cloud/edge-based implementation, compared in the sketch below: (a) rendering on the cloud server; (b) rendering on a remote edge server; (c) rendering on a local edge device.
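As a rough way to see why the placement of the renderer matters, the sketch below compares the three options against the head-rotation latency budget. The round-trip-time and compute-capacity values are purely assumed for illustration.

```python
# A rough, assumed comparison of the three rendering placements (a)-(c).
# Round-trip-time figures are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class RenderingPlacement:
    name: str
    network_rtt_ms: float  # assumed round-trip time to the renderer
    compute: str           # qualitative rendering capability

PLACEMENTS = [
    RenderingPlacement("cloud server",       network_rtt_ms=40.0, compute="highest"),
    RenderingPlacement("remote edge server", network_rtt_ms=8.0,  compute="high"),
    RenderingPlacement("local edge device",  network_rtt_ms=2.0,  compute="moderate"),
]

for p in PLACEMENTS:
    # Crude feasibility check: network RTT alone must leave room within
    # the 10-20 ms head-rotation budget for rendering, coding, and display.
    feasible = p.network_rtt_ms <= 10
    print(f"{p.name:18s} RTT~{p.network_rtt_ms:5.1f} ms, compute={p.compute:8s} -> "
          f"{'can meet' if feasible else 'hard to meet'} head-rotation latency")
```

The trade-off this illustrates is the one driving the three sub-projects: moving rendering closer to the user lowers network latency but also lowers the compute available for rendering.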
