AU Class

Using AI-enabled speech control to increase immersion for XR design review


    Description

    When using immersive systems such as VR, AR, and MR for design review, a key attribute of the system is seamless interaction with at-scale, realistic digital models. Autodesk VRED is a well-established manufacturing design review application offering close-to-truth photorealism and a highly immersive XR interface. Adding AI-enabled speech control to VRED increases the level of immersion: users can interact directly with the digital model without scene-occluding GUIs, and can interact with VRED naively, enabling more users to perform unassisted design reviews in XR. Project Mellon is NVIDIA's internal speech-enablement project; it combines the state-of-the-art Riva ASR with a prototype dialogue manager and zero-shot-learning NLP to provide a developer-friendly integration of AI-enabled speech. In this session we will demonstrate Mellon with VRED and discuss how Mellon makes it easy to update command lists without the need for extensive NLP training.
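
    To make the intent-and-slot idea concrete, the listing below is a minimal, illustrative sketch of how a zero-shot command matcher of the kind described above can work. Project Mellon's own APIs are not shown here; the command names, slot values, and the use of the Hugging Face zero-shot-classification pipeline are assumptions made for illustration only.

        # Illustrative only: approximates a zero-shot intent/slot matcher.
        # The intents, slots, and model choice are assumptions; this is not
        # Project Mellon's actual API.
        from transformers import pipeline

        # A "command list": each intent is an action the review app can perform,
        # and each slot lists the values a spoken command may fill in.
        COMMANDS = {
            "change_paint_color": {"color": ["red", "blue", "silver"]},
            "open_door":          {"door": ["driver", "passenger", "trunk"]},
            "switch_environment": {"scene": ["studio", "outdoor", "night"]},
        }

        classifier = pipeline("zero-shot-classification",
                              model="facebook/bart-large-mnli")

        def interpret(utterance: str):
            """Map a transcribed utterance to (intent, slots) with no task-specific training."""
            # Score the utterance against every intent label in a single zero-shot call.
            result = classifier(utterance, candidate_labels=list(COMMANDS))
            intent = result["labels"][0]  # highest-scoring intent
            # Fill slots by simple keyword spotting over the allowed slot values.
            slots = {name: next((v for v in values if v in utterance.lower()), None)
                     for name, values in COMMANDS[intent].items()}
            return intent, slots

        # Adding a new command is just another dictionary entry; no NLP retraining needed.
        print(interpret("make the car red"))  # e.g. ('change_paint_color', {'color': 'red'})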

    Key Learnings

    • Understand how speech was used to drive immersion with VRED in XR
    • Learn how NLP uses an architecture of intents and slots to understand system commands and command variables
    • Learn how artificial intelligence is used in automatic speech recognition (ASR) and natural language processing (NLP) models
    • Discover how a unique user experience can be built using variant sets in VRED combined with NVIDIA's Project Mellon (see the sketch after this list)
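
    The following sketch shows how a recognized command could drive a VRED variant set, as referenced in the last key learning. It is intended for VRED's Python script editor; the variant set names and the dispatch table are assumptions, and selectVariantSet() is the VRED scripting call used here to trigger a variant set.

        # Sketch of the VRED side, intended for VRED's Python script editor.
        # The variant set names and the dispatch table below are assumptions.

        # Map (intent, slot value) pairs from the speech pipeline onto variant sets.
        VARIANT_SET_FOR = {
            ("change_paint_color", "red"):    "Paint_Red",
            ("change_paint_color", "blue"):   "Paint_Blue",
            ("open_door",          "driver"): "Door_Driver_Open",
        }

        def apply_command(intent, slots):
            """Trigger the variant set that realizes a spoken command, if one exists."""
            for value in slots.values():
                name = VARIANT_SET_FOR.get((intent, value))
                if name:
                    selectVariantSet(name)  # VRED switches the scene state
                    return True
            return False  # no matching variant set; leave the scene unchanged

        # Example: the NLP stage recognized "make the car red".
        apply_command("change_paint_color", {"color": "red"})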