AI Assistant

VUI for VR medical simulation

Role

IxD, UI, UX

Duration

June 2025

Tools

ShapesXR, Figma, UMG

Team

Asif Moideen

Introduction

Problem

The current VR applications by i3 Simulations rely on UI such as menus and modals to perform actions, make enquiries, or instruct other participants (human/AI). During a usability study, users frequently reported errors of "assumed functionality," expecting voice control to be a feature when it was not. This disconnect between user expectations and the product's capabilities leads to frustration and a less intuitive training experience.

Goal

Design a VUI-based interaction system for AI interactions in the simulation environment.


Enhance Usability: Provide a more intuitive and hands-free interaction method.

Meet User Expectations: Integrate features that align with users' mental models.

Communication: Natural language commands for simulating role-based communication in emergency scenarios.

Research

User research

The core finding was that a significant "gulf of execution" exists in the current system: users assume certain features, like voice control, will be available when they are not. This qualitative finding points to a disconnect between the users' mental models and the product's design.

Analysis of previous designs

Contextual menus

Configuration UI

Tools and devices

Communications

In previous designs, actions are performed through menus contextual to items in the virtual emergency room. This does not always recreate a real-life scenario, since tasks may involve multiple tools and actions, and users have to recall where to begin.

Identifying roles

AI assistant

Nurse AI

Participant AI

Caregiver AI

The roles were divided into two categories based on how medical simulations are performed: an AI assistant as an interface for actions performed by the user, and NPC AIs simulating the roles of nurse, caregiver, and other participants/teammates.

Ideation

Agentic system

A high-level design of the agentic system was created to enable the use of VUI in multiplayer scenarios. The intent is to let users execute actions via an AI assistant without disrupting other participants, while also interacting naturally and intuitively with the Nurse AI, Participant AI, Caregiver AI, etc.
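As a rough illustration of this routing idea (the function and role names below are hypothetical, not the production Unreal Engine implementation), an utterance addressed to an NPC role could be broadcast to all participants, while everything else stays private to the speaker's assistant:

```python
# Illustrative sketch only: route_utterance and the role keywords are
# assumptions for this example, not the shipped system.

NPC_ROLES = {"nurse", "participant", "caregiver"}

def route_utterance(text: str) -> str:
    """Route a voice command to the private AI assistant or a shared NPC agent.

    Commands addressed to an NPC role (e.g. "Nurse, start CPR") are broadcast
    so all participants see the NPC respond; everything else goes to the
    user's personal assistant without disrupting teammates.
    """
    first_word = text.split(",")[0].strip().lower()
    if first_word in NPC_ROLES:
        return f"npc:{first_word}"   # shared, visible to all participants
    return "assistant"               # private to the speaking user

print(route_utterance("Nurse, administer epinephrine"))  # npc:nurse
print(route_utterance("Show me the patient vitals"))     # assistant
```

In practice the addressing decision would come from an intent model rather than a keyword match; the sketch only shows the private/shared split.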

VUI Flow

To ensure feedback at each stage, visual and sound cues, spoken responses, and states for cancellation and error recovery were mapped.
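The mapped flow can be sketched as a small state machine; the state names and transitions below are illustrative assumptions, not the engine implementation:

```python
from enum import Enum, auto

class VuiState(Enum):
    IDLE = auto()        # waiting for push-to-talk
    LISTENING = auto()   # mic open, live transcription shown
    PROCESSING = auto()  # request sent, spinner/status cue
    RESPONDING = auto()  # AI response spoken and displayed
    ERROR = auto()       # mic disabled, server failure, etc.

# Allowed transitions: every active state can be cancelled back to IDLE,
# and failures route to ERROR so recovery cues can be shown.
TRANSITIONS = {
    VuiState.IDLE:       {VuiState.LISTENING},
    VuiState.LISTENING:  {VuiState.PROCESSING, VuiState.IDLE, VuiState.ERROR},
    VuiState.PROCESSING: {VuiState.RESPONDING, VuiState.IDLE, VuiState.ERROR},
    VuiState.RESPONDING: {VuiState.IDLE, VuiState.ERROR},
    VuiState.ERROR:      {VuiState.IDLE},
}

def advance(current: VuiState, target: VuiState) -> VuiState:
    """Move to `target` if the flow allows it; otherwise stay in place."""
    return target if target in TRANSITIONS[current] else current
```

Modelling the flow this way makes the cancellation and error-recovery paths explicit, so each state can be tied to a distinct visual and sound cue.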

Layout

To provide visual context to the VUI, a GUI was ideated to display system status, transcriptions, and responses, and to reduce ambiguity. Different UI layouts were playtested in VR, and Fig. 3 was selected as the least obstructive yet accessible format.

Fig.1

Fig.2

Fig.3

Fig.4

Prototypes

Detailed interactions were prototyped and playtested for implementation in Unreal Engine.

Status for Listening

Realtime feedback for error correction

Disabled microphone

Misclick/improper use of push to talk

Status for processing

Server & system errors

AI generated response

User feedback request

Disambiguation

Scope

Suggestions

Implementation

Currently being implemented in new simulation scenarios: after receiving positive feedback from end-user representatives, the system is now in development.