NVIDIA ACE: Creating Realistic Virtual Characters with AI Animation and Voice Technology

2023-12-06

NVIDIA has updated the NVIDIA Avatar Cloud Engine (ACE) with new animation and speech capabilities for AI-driven virtual characters and digital humans, focusing on natural conversation and emotional expression. Through the early access program, developers can now access cloud APIs for automatic speech recognition (ASR), text-to-speech (TTS), neural machine translation (NMT), and Audio2Face (A2F). Combined with popular rendering tools such as Unreal Engine 5, these APIs enable creators to build advanced virtual character experiences.

On the animation side, ACE now includes expanded support for A2F emotion and an Animation Graph microservice for body, head, and eye movements, aimed at producing more expressive digital humans. The new microservices support both rendering production and real-time inference, and improved A2F quality delivers more accurate lip synchronization for more realistic digital humans.

On the speech side, the ACE suite adds support for additional languages, including Italian, European Spanish, German, and Mandarin, and improves ASR accuracy. The cloud APIs simplify access to these Speech AI capabilities, and the new Voice Font microservice enables customized TTS output for creating unique voices across a range of applications.

ACE Agent, a new dialogue management and system integration tool, coordinates the connections between microservices to provide a seamless experience. Developers can also integrate NVIDIA NeMo Guardrails, NVIDIA SteerLM, and LangChain for more controlled and precise responses.

Finally, the updates make these tools easier to use across different rendering and coding environments. New features include blendshape support in the Avatar Configurator for integration with rendering engines such as Unreal Engine, a new A2F application for Python users, and a reference application for developing customer service virtual assistants.
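To make the microservice flow concrete, here is a minimal Python sketch of the kind of pipeline a dialogue manager such as ACE Agent coordinates: ASR transcribes the user's audio, a dialogue stage generates a reply, TTS synthesizes speech, and A2F turns that speech into facial animation data. Every function below is a stand-in stub for illustration only; the real ACE cloud APIs and their signatures are not shown here.

```python
from dataclasses import dataclass, field

@dataclass
class AvatarResponse:
    text: str                          # reply text from the dialogue stage
    audio: bytes                       # synthesized speech from the TTS stage
    blendshapes: list = field(default_factory=list)  # per-frame facial weights from A2F

def transcribe(audio: bytes) -> str:
    """Stub standing in for the ASR microservice: audio in, transcript out."""
    return "hello avatar"

def generate_reply(transcript: str) -> str:
    """Stub standing in for the dialogue stage (an LLM, optionally moderated
    by a guardrails layer such as NeMo Guardrails)."""
    return f"You said: {transcript}"

def synthesize(text: str) -> bytes:
    """Stub standing in for the TTS microservice (the stage a Voice Font
    would customize)."""
    return text.encode("utf-8")

def animate(audio: bytes) -> list:
    """Stub standing in for Audio2Face: audio in, blendshape weights out."""
    return [[0.0, 0.1, 0.2]]  # one dummy frame of facial weights

def run_pipeline(user_audio: bytes) -> AvatarResponse:
    """Chain the stages in the order a dialogue manager would invoke them."""
    transcript = transcribe(user_audio)
    reply = generate_reply(transcript)
    audio = synthesize(reply)
    frames = animate(audio)
    return AvatarResponse(text=reply, audio=audio, blendshapes=frames)

response = run_pipeline(b"\x00\x01")
print(response.text)  # → You said: hello avatar
```

The value of a coordinator like ACE Agent is that each stage stays an independent microservice: any stub above could be swapped for a cloud API call without changing the rest of the pipeline.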