Real-Time Mixed Reality Rendering for Underwater 360° Videos

Learn about the Underwater Toolkit, a mixed reality toolkit that enables seamless blending of virtual objects into underwater 360° videos in real time.

Real-time mixed reality underwater rendering into a 360° video. The top-left image is rendered without our method; the top-right image is rendered with our underwater lighting method applied. The bottom row shows each of the underwater lighting effects individually. (The effects throughout the paper may appear brighter than in the actual 360° video; they have been brightened to make them easier to see in the images.)

Abstract

We present the Underwater Toolkit, a mixed reality (MR) toolkit that enables seamless blending of virtual objects into underwater 360° videos (360-videos) in real time. It is fully integrated into commercial game engines such as Unity3D and Unreal Engine 4 (UE4), providing a complete pipeline for underwater 360-videos. The toolkit provides real-time underwater lighting (caustics, god rays, fog, and particulates) so that virtual objects are lit consistently with, and blend seamlessly into, each frame of the underwater video, semi-automatically and in real time. Our toolkit’s user-friendly interface enables users to fine-tune the underwater lighting parameters to match the lighting observed in the 360-video, improving visual quality and seamless blending. In our demonstration, users will be able to immerse themselves in underwater 360-videos using a head-mounted display (HMD). Using motion controllers, they will be able to interact with fish by feeding or catching them. An additional user will be able to operate our toolkit, adjusting underwater lighting parameters to seamlessly blend the fish into the 360-video in real time.
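To make the fog effect concrete, below is a minimal sketch of depth-based underwater fog blending, assuming a simple Beer-Lambert attenuation model (a common choice for underwater fog). The function applyUnderwaterFog, the density value, and the water colour are illustrative assumptions written as plain C++ for readability; the toolkit itself runs such effects as shaders inside Unity3D/UE4 and may use a different formulation.

```cpp
// Minimal sketch of depth-based underwater fog blending using
// Beer-Lambert attenuation. All names and constants here are
// illustrative assumptions, not the toolkit's implementation.
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

// Linear interpolation between two colours.
static Color lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Fade a virtual object's shaded colour toward the water colour with
// distance, so it attenuates like the real scene in the 360-video.
Color applyUnderwaterFog(const Color& shaded, const Color& waterColor,
                         float distance, float density) {
    // Transmittance falls off exponentially with distance (Beer-Lambert).
    float transmittance = std::exp(-density * distance);
    // transmittance is 1 at the camera (object colour) and approaches
    // 0 far away (pure water colour).
    return lerp(waterColor, shaded, transmittance);
}

int main() {
    const Color fishColor  = {0.90f, 0.60f, 0.30f};  // warm-coloured fish
    const Color waterColor = {0.05f, 0.25f, 0.35f};  // blue-green water
    const float distances[] = {1.0f, 5.0f, 20.0f};   // metres from camera
    for (float d : distances) {
        Color c = applyUnderwaterFog(fishColor, waterColor, d, 0.15f);
        std::printf("d = %5.1f m -> (%.2f, %.2f, %.2f)\n", d, c.r, c.g, c.b);
    }
    return 0;
}
```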

Water surface normal maps with the corresponding caustic maps, and a virtual object rendered with caustics.
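For illustration, here is a hedged sketch of the common texture-projection approach to caustics that the figure above suggests: the object's world position is projected onto an animated caustic map from above, and the sampled intensity modulates the lighting. sampleCausticMap and all scale factors below are assumptions for this sketch; the toolkit derives its caustic maps from water-surface normal maps, whereas this stand-in uses a cheap procedural pattern.

```cpp
// Hedged sketch of caustic lighting via texture projection. The caustic
// pattern, scale factors, and function names are assumptions for this
// sketch, not the toolkit's actual method.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Procedural placeholder for a caustic-map texture lookup: two scrolled
// sine layers give a cheap shimmering pattern in [0, 1].
float sampleCausticMap(float u, float v, float time) {
    float a = std::sin(6.0f * u + 1.3f * time) *
              std::sin(6.0f * v + 0.9f * time);
    float b = std::sin(11.0f * (u + v) - 1.7f * time);
    return 0.5f + 0.25f * a + 0.25f * b;
}

// Modulate lighting by caustic light arriving through the water surface.
float causticGain(const Vec3& worldPos, const Vec3& normal,
                  float time, float strength) {
    // Project the world position straight down onto the caustic plane
    // (assumes a mostly vertical light direction, i.e. sunlight above).
    float caustic = sampleCausticMap(worldPos.x * 0.2f,
                                     worldPos.z * 0.2f, time);
    // Upward-facing surfaces catch more of the projected caustic light.
    float upFacing = std::fmax(normal.y, 0.0f);
    return 1.0f + strength * caustic * upFacing;
}

int main() {
    const Vec3 p{2.0f, -3.0f, 1.5f};  // a point on a virtual object
    const Vec3 n{0.0f, 1.0f, 0.0f};   // normal facing the water surface
    const float times[] = {0.0f, 0.5f, 1.0f};
    for (float t : times)
        std::printf("t = %.1f s -> lighting gain %.3f\n",
                    t, causticGain(p, n, t, 0.8f));
    return 0;
}
```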
Demonstration setup. The first user (left) is interacting with fish in the underwater mixed reality 360° environment. To seamlessly blend the virtual objects into the video, a second user (right) is editing the underwater lighting parameters in our toolkit in real time.

Publications and authors

ISMAR 2019 paper "Real-Time Mixed Reality Rendering for Underwater 360° Videos"
Stephen Thompson, Andrew Chalmers and Taehyun Rhee (Computational Media Innovation Centre, Victoria University of Wellington, NZ)

ISMAR 2019 demo "Underwater Toolkit: Mixed Reality Object Blending for 360° Videos"
Stephen Thompson, Andrew Chalmers, Daniel Medeiros and Taehyun Rhee (Computational Media Innovation Centre, Victoria University of Wellington, NZ)

IEEE VR 2019 poster "Real-time Underwater Caustics for Mixed Reality 360° Videos"
Stephen Thompson, Andrew Chalmers and Taehyun Rhee (Computational Media Innovation Centre, Victoria University of Wellington, NZ)

Acknowledgments

This project was supported by the Entrepreneurial University Programme funded by the Tertiary Education Commission (TEC) and in part by the Smart Ideas project funded by the Ministry of Business, Innovation and Employment (MBIE) in New Zealand. We thank Boxfish Research for providing 360-videos captured with their underwater 360° camera.