Networked Virtual Reality and Kvasir-VR


Kvasir-VR: Networked VR Overview


“Kvasir-VR” refers to our networked collaborative VR approaches, especially those that let teachers guide students who are immersed in VR environments. Kvasir-VR enables immersed students to meet remote experts for guided activities, or to be guided and assisted more directly by a nearby teacher. Our most recent work captures a teacher with a 3D camera (Kinect) and streams the data over a network for incorporation into live VR field trips. Classroom-deployable VR stations and our specialized teacher interface allow us to bring the approaches “into the field” to study them in real classrooms. The following sections outline Kvasir-VR:



Teacher-guided Networked VR Field Trips (2014+)

We applied Kvasir-VR field trip approaches in the Networked Virtual Energy Center application to allow a live remote teacher to guide students through a virtual solar plant. A standalone version with prerecorded 3D teacher clips can also provide the field trips independently. Our most recent experiment studied the effects of the live (networked) vs. prerecorded (standalone) approaches. The system has been deployed to hundreds of students in high school classrooms, and to thousands through outreach events. Major demonstrations of the live version included interstate and intercontinental operation using high-performance research networks such as Internet2.

Images

A remote teacher guides students from a large TV interface with a Kinect mounted at its bottom. Visuals include: 1) a mirror view of the environment, 2) synchronized student views at the TV’s lower left, 3) webcam views of students, in ovals hovering at positions that let the teacher make eye contact with the students, and 4) pointing cues to help the teacher point correctly in 3D.

A student wearing a head-mounted display stands at a virtual tower overlooking a virtual solar plant (the background duplicates the student’s view for illustration). A teacher (seen to the left) provides an introduction to the plant and is represented by a mesh captured by a Kinect depth camera.



Video accompanying a paper submission to the IEEE VR 2018 conference: “Teacher-Guided Educational VR: Assessment of Live and Prerecorded Teachers Guiding Field Trips”. The paper describes the “Kvasir-VR” framework for embedding teachers into VR and assesses live (networked) and prerecorded teachers in high schools. The teacher imagery comes from a depth camera (Kinect), so it looks rougher than green-screen imagery, but it is 3D and does not require a special background.
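The teacher mesh in the video is built from depth-camera frames. As a rough illustration of the depth-to-mesh step (not the papers' actual implementation), the sketch below back-projects a depth image with assumed Kinect-like intrinsics and triangulates the pixel grid, skipping invalid pixels and silhouette edges; streaming, compression, and color texturing are omitted:

```python
import numpy as np

# Illustrative Kinect-style depth intrinsics (assumed values, not calibrated).
FX, FY = 365.0, 365.0      # focal lengths in pixels
CX, CY = 256.0, 212.0      # principal point

def depth_to_vertices(depth_mm):
    """Back-project a depth image (millimeters) to camera-space points (meters).

    Invalid pixels (depth == 0) become NaN vertices so triangulation skips them.
    """
    h, w = depth_mm.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_mm.astype(np.float32) / 1000.0
    z[depth_mm == 0] = np.nan
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return np.stack([x, y, z], axis=-1)   # shape (h, w, 3)

def grid_triangles(depth_mm, max_jump_mm=50):
    """Two triangles per pixel quad, dropping quads that contain an invalid
    pixel or span a large depth discontinuity (a silhouette edge)."""
    h, w = depth_mm.shape
    d = depth_mm.astype(np.int32)
    tris = []
    for r in range(h - 1):
        for c in range(w - 1):
            quad = d[r:r+2, c:c+2]
            if (quad == 0).any() or quad.max() - quad.min() > max_jump_mm:
                continue
            i = r * w + c
            tris.append((i, i + 1, i + w))
            tris.append((i + 1, i + w + 1, i + w))
    return np.array(tris, dtype=np.int32)
```

Dropping triangles across depth jumps is what keeps the teacher's outline from smearing into the background, at the cost of small holes along silhouettes.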


Selected Publications

  • IEEE VR 2018 paper overviewing Kvasir-VR and evaluating networked and standalone approaches. (PDF) (Video)
  • IEEE Workshop paper (Everyday Virtual Reality 2017) discussing a TV-based teacher interface and evaluating visual pointing cues. (PDF) (DOI)
  • IEEE VR 2017 demo abstract about the teacher-student collaboration interface. Winner of the “Best Research Demo” award. (PDF) (Video) (DOI)
  • ISVC (Visual Computing) paper describing techniques for mesh streaming, rendering, and the teacher’s interface. (PDF) (DOI)



Tiny House Walkthrough and Other Collaboration with High Schools (2017-2018)

With funding from the Mozilla Foundation, we brought our approaches to educational VR applications envisioned by high school faculty and students. We set up Kvasir-VR between two schools, allowing a student at one school to remotely guide a VR walkthrough of a tiny house to explain its purpose and features. The high school students are involved in constructing the real tiny house. Our mentoring of students and other project activities are further documented in our Mozilla Gigabit Community Fund Blog.

Local high school student Andre Garcia remotely guides a Carencro High student through his virtual Tiny House model.


Our project and department were featured in the following wrap-up documentary from Mozilla and AOC Community Media.


Remotely-guided VR for Geosciences Interpretation (2005+)

Research Assistants: Kaushik Das, Jan-Phillip Tiesel, Vijay Baiyya, Chris Best, Adam Guichard

Overview

This application of networked VR and asynchronous displays allowed a geologist to use a low-cost desktop VR display to remotely guide users immersed via head-mounted displays or large projection displays. The geologist guided viewers through live interpretations of various datasets associated with the Chicxulub impact crater area, which is believed to be related to a sudden mass extinction. A proposed expansion of this project to national research networks received support from US Ignite in 2014 and then became the Networked Virtual Energy Center project above.

Remotely-guided VR with the multiple networked display types was demonstrated, using local networking, at a special session of the GCAGS 2006 conference (link). Our first deployment connecting truly remote sites was at a 2006 conference of the International Association of Drilling Contractors, during which our VR lab was connected to a large-scale visualization facility (LITE).

Geological study of an impact crater with networked asynchronous displays for remotely-guided VR interpretation. Users annotate data surfaces and check several datasets for correspondence. The picture was taken during GCAGS 2006 setup and later appeared in the Handbook of Virtual Environments (2nd Edition, 2014).

Also see our videos Composable Volumetric Lenses and 3D Windows in Geosciences Exploration System.


Selected Publications

  • IEEE VR 2008 paper discussing a new rendering approach for composable volumetric lenses. (DOI)
  • Gulf Coast Association of Geological Societies 2008 convention paper examining the use of VR to interpret and map the historical geology near Lafayette, LA. (link)
  • CGVR 2007 paper detailing volumetric window applications, rendering techniques, and performance evaluations. (PDF) (DBLP)
  • Gulf Coast Association of Geological Societies 2006 convention paper analyzing a VR geological interpretation refinement system for geophysical and topographic data from the Chicxulub impact crater. (link)
  • Gulf Coast Association of Geological Societies 2005 convention paper discussing an immersive visualization system that allows remote interpreters to collaborate in real time while using different display types. (link)

Networked Virtual Tennis with Heterogeneous Displays (2005)

Research Assistants: Alp V. Asutay, Arun P. Indugula

Our first study of networked asynchronous displays had pairs of users compete in virtual tennis. A “reach-in fishtank” display gave one user a bird’s-eye view of the tennis court and a VR character that followed the user’s hand at the surface of a desk. The other user experienced a conventional first-person view in a VR headset, with matching player motion and action triggering. The motivation for our study was to consider the relative strengths and weaknesses of these different view styles, with possible implications for techniques like multi-scale collaboration in VR.

Images

Students using each display type (each giving a different view of the virtual tennis application). The egocentric head-mounted display (left) allowed more accurate aiming, and the exocentric fishtank display (right) provided better timing cues. The picture was taken in 2004 and later appeared in Interpretation (2015) vol. 3 issue 3.

Student HMD View

An egocentric view inside the virtual tennis game.


Publications

  • IEEE DS-RT 2005 paper showing an early example and evaluation of asymmetrical, networked user interfaces. (DOI)
  • Interpretation journal article showing the application of networked VR collaboration to geological visualization. (DOI)



Networked Touch with 2D Tactile Grids (2003+)

We developed a 2D tactile (haptic) grid and control methods for remote networked touch. At the EuroHaptics 2004, IEEE Haptics Symposium 2004, and IEEE VR 2009 conferences, we demonstrated a setup allowing one user to drag their finger on a touch surface (laptop touchpad) and another user to feel the movement through vibration patterns on our tactile array. Our work on 2D tactile rendering and control functions lets the Networked Touch system render moving paths relatively smoothly, with consistent intensities and positions.

The illustration above shows (on top) the 2004 demo using a laptop touchpad and 2D tactile array to communicate touch and (on bottom) conceptual application to smartphones with palm-facing arrays, allowing one person to “draw” on a remote person’s hand.
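The key to smooth path rendering on a coarse tactor grid is distributing intensity across neighboring tactors rather than snapping to the nearest one. A minimal sketch of that idea using bilinear weighting (the 4x8 grid size and unit-intensity model are assumptions for illustration, not the published rendering method):

```python
import numpy as np

# Assumed grid size for illustration; the real array dimensions may differ.
GRID_ROWS, GRID_COLS = 4, 8

def render_point(x, y, intensity=1.0):
    """Map a continuous contact position (in tactor-grid units) onto the
    discrete tactor grid with bilinear weights, so the total commanded
    intensity stays constant as the contact moves between tactors."""
    frame = np.zeros((GRID_ROWS, GRID_COLS))
    c0, r0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - c0, y - r0          # fractional position within the cell
    for dr, dc, w in [(0, 0, (1 - fx) * (1 - fy)),
                      (0, 1, fx * (1 - fy)),
                      (1, 0, (1 - fx) * fy),
                      (1, 1, fx * fy)]:
        r, c = r0 + dr, c0 + dc
        if 0 <= r < GRID_ROWS and 0 <= c < GRID_COLS:
            frame[r, c] = intensity * w
    return frame
```

For example, a contact halfway between two tactors drives each at half intensity, which is what avoids the abrupt jumps of nearest-tactor ("bi-level") rendering as a touchpad stroke is replayed remotely.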

Publications

  • Christoph W. Borst and Alp V. Asutay, “Bi-level and Anti-aliased Rendering Methods for a Low-Resolution 2D Vibrotactile Array”, IEEE World Haptics 2005 conference, pp. 329-335. (PDF)
  • Christoph W. Borst and Charles D. Cavanaugh, “Touchpad-Driven Haptic Communication using a Palm-Sized Vibrotactile Array with an Open-Hardware Controller Design,” EuroHaptics 2004 conference.
  • Technical report: Christoph W. Borst and Charles D. Cavanaugh, “Haptic Controller Design and Palm-Sized Vibrotactile Array.” (PDF)



Earlier Telerobotics Work (1990s)

Dr. Borst’s interest in networked VR stems from his 1990s work on computer graphics and VR for telerobotics.

Telepresence: An ANRCP project used VR headsets, trackers, and video techniques to let a user experience an environment from a robot’s perspective, for safer inspection of nuclear material in a secure facility.

Predictive displays: A NASA project involved physics simulation and 3D graphics for controlling a flying camera during substantial network/communication delays. Borst’s approach conceptually matched “predictive simulation and smoothing” techniques that have since become common in networked video games: the user “flew” a predictive camera whose physics simulation incorporated the latest (delayed) telemetry, and smoothing avoided sudden jumps when the simulation was corrected to remove differences between past predictions and incoming telemetry.
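The predict-then-smooth idea can be sketched as follows. This is a toy 1-D model, not the NASA project's implementation: the class, the `BLEND` constant, and the single-axis state are illustrative assumptions. The local simulation responds to user input immediately, and when delayed telemetry reveals a prediction error, that error is bled off over several frames instead of snapping the view:

```python
BLEND = 0.2  # fraction of the residual error removed per frame (assumed)

class PredictiveCamera:
    """Toy 1-D predictive camera with smoothed telemetry correction."""

    def __init__(self, pos=0.0, vel=0.0):
        self.pos = pos      # displayed (predicted) position
        self.vel = vel
        self.error = 0.0    # residual prediction error still being smoothed out

    def step(self, thrust, dt):
        """Advance the local physics prediction, then bleed off part of the
        accumulated error so corrections never cause a visible jump."""
        self.vel += thrust * dt
        self.pos += self.vel * dt
        correction = self.error * BLEND
        self.pos -= correction
        self.error -= correction

    def on_telemetry(self, true_pos_then, predicted_pos_then):
        """Delayed telemetry arrives: compare it with what we predicted for
        that same past moment and queue the difference for smoothing."""
        self.error += predicted_pos_then - true_pos_then
```

Because only a fraction of the error is removed each frame, the displayed camera converges geometrically toward the telemetry-corrected trajectory while remaining responsive to the user's inputs during the delay.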