XR Symposium 2026

Virtual Glovebox Training with Haptic Gloves and VR
Members: Scott Kuang (lead), Henry Mack, Emily Portales, Anthony Capponi
Tech stack:
- SenseGlove Nova 2 Gloves
- VIVE Focus 3 VR headset
- VIVE Focus 3 wrist trackers
- Companion computer
- Unity 3D game engine
- SteamVR
- VIVE Business Streaming
The Mixed Emerging Technology Integration Lab collaborated with the Oak Ridge Enhanced Technology and Training Center to produce a virtual glovebox training simulation using haptic feedback gloves, 3D software, and a virtual reality (VR) headset. The simulation supports both single-player and multiplayer scenarios to provide enhanced training in a VR environment, drawing on real scenarios in which sensitive or complex procedures are performed. Interactions with objects inside the virtual glovebox give the user immersive physical feedback through a set of haptic gloves, so the user experiences sensations akin to handling a tangible physical object in a safe and economical simulation.
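As a sketch of how contact feedback can be wired up in Unity, the snippet below pulses a vibration when a grabbable object touches the hand rig. The IHapticGlove interface stands in for whatever vibration call the SenseGlove Unity plugin actually exposes; it, the field names, and the values are illustrative assumptions, not the project's actual code.

    using UnityEngine;

    // Hypothetical stand-in for the glove SDK's vibration call; the real
    // SenseGlove Unity plugin exposes its own API, which would replace this.
    public interface IHapticGlove
    {
        void SendVibration(float amplitude, float durationSeconds);
    }

    // Attached to a grabbable object inside the virtual glovebox: when the
    // hand rig's collider touches it, a short vibration pulse simulates contact.
    public class GloveboxHapticContact : MonoBehaviour
    {
        public float amplitude = 0.6f;  // vibration strength, 0..1
        public float duration = 0.1f;   // pulse length in seconds

        private void OnTriggerEnter(Collider other)
        {
            // The hand rig is assumed to carry a component implementing IHapticGlove.
            var glove = other.GetComponentInParent<IHapticGlove>();
            if (glove != null)
            {
                glove.SendVibration(amplitude, duration);
            }
        }
    }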

SenseGlove and Sony Spatial Reality Display Project
Members: Anthony Capponi (lead), Scott Kuang, Henry Mack
Tech stack:
- SenseGlove Nova 2 Gloves
- VIVE Trackers 2.0
- VIVE Base Stations
- Sony Spatial Reality Display ELF-SR2
- Companion computer
- SteamVR
- Unity 3D game engine
- Sony SRD Unity packages
- SenseCom
The goal of this project is to facilitate learning and improve user experience by pairing SenseGlove Nova 2 Gloves with a small-scale digital twin workspace projected from a stereoscopic Sony Spatial Reality Display (Sony SRD). In this condensed digital twin workspace, users move objects around and place them in canisters using the haptic gloves. The SRD tracks the user's eyes, allowing some freedom of motion in the digital twin environment, while the SenseGloves provide grabbing capabilities and haptic and tensile resistance to simulate handling the objects.
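A minimal sketch of the grab logic is below, assuming a hypothetical glove input that reports a normalized grip value (0 = open hand, 1 = closed fist) and a hand transform driven by the VIVE tracker; the real SenseGlove and SRD integration goes through their respective SDKs.

    using UnityEngine;

    // Parents the nearest object to the tracked hand while the grip is closed,
    // and releases it back to physics when the grip opens.
    public class SimpleGloveGrabber : MonoBehaviour
    {
        public Transform handAnchor;        // pose driven by the VIVE tracker
        public float grabThreshold = 0.7f;  // grip value that counts as a grab

        private Rigidbody held;

        // Called each frame with the current grip value and the closest
        // grabbable rigidbody (null if nothing is in reach).
        public void UpdateGrip(float gripValue, Rigidbody nearest)
        {
            if (held == null && nearest != null && gripValue > grabThreshold)
            {
                held = nearest;
                held.isKinematic = true;               // follow the hand directly
                held.transform.SetParent(handAnchor);
            }
            else if (held != null && gripValue < grabThreshold)
            {
                held.transform.SetParent(null);        // hand opened: release
                held.isKinematic = false;
                held = null;
            }
        }
    }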

xAPI Integration in Unity
Members: Vivi Lazo (lead), Aurela Broqi
Tech stack:
- Unity 6.4
- xAPI (C# TinCan library)
- SCORM Cloud Learning Record Store (LRS)
- Python 3.12 + Flask
- Blackboard Learning Management System (LMS)
The purpose of this project is to implement the sending of xAPI statements to a Learning Record Store (LRS) from Unity. Certification of Glovebox training participants currently requires examiners to submit results manually. This project aims to automate that step by integrating the Glovebox project with an LRS, and to serve as boilerplate code that can be incorporated into any Unity project requiring a connection to an LRS. The work involves building a library to send xAPI statements from Unity and a server that connects the LRS to the LMS. Statement data from the Glovebox project is sent to an LRS, and a completion status is then sent from the LRS to an instance of a Blackboard Learning Management System (LMS).
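The sketch below shows roughly what sending a completion statement with the TinCan C# library looks like from Unity; the endpoint, credentials, and activity ID are placeholders, and property names may differ slightly between library versions.

    using System;
    using TinCan;
    using TinCan.LRSResponses;
    using UnityEngine;

    // Sends a "completed" xAPI statement for a learner to the configured LRS.
    public class XapiReporter : MonoBehaviour
    {
        private RemoteLRS lrs;

        private void Awake()
        {
            // Placeholder SCORM Cloud endpoint and credentials.
            lrs = new RemoteLRS("https://cloud.scorm.com/lrs/XXXX/", "key", "secret");
        }

        public void ReportCompletion(string learnerEmail)
        {
            var verb = new Verb
            {
                id = new Uri("http://adlnet.gov/expapi/verbs/completed"),
                display = new LanguageMap()
            };
            verb.display.Add("en-US", "completed");

            var statement = new Statement
            {
                actor = new Agent { mbox = "mailto:" + learnerEmail },
                verb = verb,
                // Placeholder activity ID for the Glovebox training scenario.
                target = new Activity { id = "https://example.org/activities/glovebox-training" }
            };

            StatementLRSResponse response = lrs.SaveStatement(statement);
            Debug.Log(response.success ? "xAPI statement saved" : "xAPI error: " + response.errMsg);
        }
    }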


Integration of Haptic Gloves with Holographic Display Table

Members: Scott Kuang (lead), Armando Rodriguez
Tech stack:
- Avalon Holographics NOVAC holographic table
- Companion computer
- SenseGlove Nova 2 Gloves
- VIVE Trackers 3.0
- VIVE Base Stations
- Unity 3D game engine
- SteamVR
- SenseCom
The Mixed Emerging Technology Integration Lab collaborated with the Oak Ridge Enhanced Technology and Training Center to render 3D holographic objects on a NOVAC holographic display table from Avalon Holographics and manipulate them using haptic gloves. The holographic display can render various 3D environments, such as tabletop military exercises, natural disasters, and other geospatial applications. The platform can be extended to simulations or strategic sandbox exercises to help plan and train personnel, with the haptic gloves used to manipulate objects and actors within a focused environment, enabling multimodal interaction with the holographic display.
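As an illustration of the manipulation loop, the sketch below drags a tabletop actor with the tracked glove while a pinch is held, keeping it on the table surface; the pinch signal and field names are assumptions standing in for the actual glove SDK.

    using UnityEngine;

    // Drags the nearest actor while the glove pinch is held, clamping its
    // position to the holotable surface so units stay on the terrain plane.
    public class TabletopDragger : MonoBehaviour
    {
        public Transform handAnchor;    // tracked glove pose
        public Transform tableSurface;  // table plane; its y is the surface height
        public float pickRadius = 0.05f;

        private Transform dragged;

        // Called by the glove input layer when the pinch state changes.
        public void SetPinch(bool pinching)
        {
            if (pinching && dragged == null)
            {
                Collider[] hits = Physics.OverlapSphere(handAnchor.position, pickRadius);
                if (hits.Length > 0)
                {
                    dragged = hits[0].transform;  // pick the first actor in reach
                }
            }
            else if (!pinching)
            {
                dragged = null;
            }
        }

        private void Update()
        {
            if (dragged == null) return;
            Vector3 p = handAnchor.position;
            p.y = tableSurface.position.y;  // clamp to the table plane
            dragged.position = p;
        }
    }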

Speech to Text Scene Switcher with Integrated AI Model Pipeline for Holographic Display Table
Members: Armando Rodriguez
Tech stack:
- Avalon Holographics NOVAC holographic table
- Companion computer
- Unity (C#)
- Unity Mirror
- System.Net
- Raydiance
- Python
- RealtimeSTT
- Socket
- Meshy API
The project allows users to give speech commands to the holotable to switch between different Unity scenes in real time. The scene switcher manages the Raydiance setup each scene needs, so that non-Raydiance scenes without an attached camera can still be used. Additionally, the Meshy AI API has been integrated so users can request an AI-generated model that is created in real time and inserted into the running project. This command uses the keyword "insert"; users can also say the keyword "fast", which makes the API return an untextured model that a custom Unity script fills in, resulting in a white/beige model with all other details intact.
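A sketch of the Unity side of that pipeline follows, assuming the Python RealtimeSTT process streams transcript lines over a local TCP socket; the port, keywords, and scene name are placeholders, and the Meshy API call is reduced to a log line.

    using System.Collections.Concurrent;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;
    using System.Threading;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    // A background thread reads transcript lines from the speech-to-text
    // process; Update() reacts to keywords on the main thread, since Unity's
    // scene APIs are not thread-safe.
    public class SpeechCommandListener : MonoBehaviour
    {
        private readonly ConcurrentQueue<string> lines = new ConcurrentQueue<string>();
        private TcpListener listener;

        private void Start()
        {
            listener = new TcpListener(IPAddress.Loopback, 5005);  // placeholder port
            listener.Start();
            new Thread(ReadLoop) { IsBackground = true }.Start();
        }

        private void ReadLoop()
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    lines.Enqueue(line.ToLowerInvariant());
                }
            }
        }

        private void Update()
        {
            while (lines.TryDequeue(out string command))
            {
                if (command.Contains("insert"))
                {
                    bool fast = command.Contains("fast");  // request an untextured model
                    Debug.Log("Would request a Meshy model, fast = " + fast);
                }
                else if (command.Contains("switch"))
                {
                    SceneManager.LoadScene("ExampleScene");  // placeholder scene name
                }
            }
        }

        private void OnDestroy()
        {
            if (listener != null) listener.Stop();
        }
    }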
