News

AI-powered NPCs: NVIDIA ACE for Games revolutionizes interactions in games

Imagine a game in which you could hold intelligent, unscripted, dynamic conversations with non-player characters (NPCs) that have persistent personalities evolving over time, along with precise facial animations and expressions, all in your native language.

Generative AI technologies make this possible, and at COMPUTEX 2023 we got a glimpse of the future of NPCs with NVIDIA Avatar Cloud Engine (ACE) for Games. NVIDIA ACE for Games is a custom AI model service that aims to transform games by bringing intelligence to NPCs through natural, AI-powered language interactions.

“Generative AI has the potential to revolutionize the interactivity players can have with game characters and dramatically increase immersion in games,” said John Spitzer, vice president of development and performance technology at NVIDIA. “Leveraging our expertise in AI and decades of experience working with game developers, NVIDIA is at the forefront of using generative AI in games.”

Middleware, tool, and game developers can use NVIDIA ACE for Games to build and deploy custom AI models for speech, conversation, and animation in their software and games, across the cloud and PCs.

Optimized base AI models include:

  • NVIDIA NeMo, which provides base language models and model customization tools so developers can further refine models into character-specific personalities that fit the developer's game world. With NeMo Guardrails, developers can better align player interactions with the context of a scene through programmable rules for NPCs.
  • NVIDIA Riva, which offers automatic speech recognition (ASR) and text-to-speech (TTS) capabilities to enable live voice chat with NVIDIA NeMo.
  • NVIDIA Omniverse Audio2Face, which instantly creates expressive facial animation for game characters from an audio source. Audio2Face offers Omniverse connectors for Unreal Engine 5, which allows developers to directly add facial animation to MetaHuman characters.
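The "programmable rules for NPCs" idea behind NeMo Guardrails can be illustrated with a minimal sketch. This is not the actual NeMo Guardrails API; the topics, patterns, and replies below are invented to show how scene-scoped rules keep a character on topic:

```python
import re

# Illustrative scene rules for a ramen-shop NPC.
# Each rule pairs a topic pattern with an in-character reply;
# all content here is hypothetical, not from the Kairos demo.
RULES = [
    (re.compile(r"\b(menu|ramen|noodles)\b", re.I),
     "Tonight we have shoyu and miso ramen."),
    (re.compile(r"\b(rumor|district|trouble)\b", re.I),
     "Strange things have been happening in the district lately."),
]

# Off-topic questions fall back to a reply that stays inside the scene.
FALLBACK = "I'm just a ramen cook -- I only know my shop and this street."

def npc_reply(player_line: str) -> str:
    """Return the first rule-matched reply, or an in-character fallback."""
    for pattern, reply in RULES:
        if pattern.search(player_line):
            return reply
    return FALLBACK
```

For example, `npc_reply("What's on the menu?")` returns the menu line, while a question about the real world gets the fallback, keeping the character inside its fictional context.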

How NVIDIA ACE For Games works

To showcase the power of NVIDIA ACE for Games and preview how developers will build NPCs in the future, NVIDIA partnered with Convai, an NVIDIA Inception startup building a platform for creating and deploying AI characters in games and virtual worlds, to optimize and integrate ACE modules into Convai's platform.

"With NVIDIA ACE for Games, Convai's tools can achieve the latency and quality needed to make non-playable AI characters available to nearly any developer in a cost-effective way," said Purnendu Mukherjee, founder and CEO of Convai.

The Kairos demo used NVIDIA Riva for speech recognition and speech synthesis, NVIDIA NeMo to power the conversational AI, and Audio2Face for AI-driven facial animation from voice input. These modules were integrated into Convai's service platform and fed into Unreal Engine 5 and MetaHuman to bring Jin to life.
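The flow of one conversational turn in such a pipeline can be sketched as follows. The stub functions stand in for Riva ASR/TTS, a NeMo language model, and Audio2Face; none of them are the real NVIDIA APIs, and their bodies are placeholders chosen only to make the data flow runnable:

```python
def speech_to_text(audio: bytes) -> str:
    """Stand-in for Riva ASR: decode the player's audio into text."""
    return audio.decode("utf-8")  # stub: 'audio' is just encoded text here

def generate_reply(text: str, persona: str) -> str:
    """Stand-in for a NeMo language model conditioned on a persona."""
    return f"[{persona}] You said: {text}"  # stub reply, not a real LLM

def text_to_speech(text: str) -> bytes:
    """Stand-in for Riva TTS: synthesize audio for the NPC's reply."""
    return text.encode("utf-8")

def audio_to_blendshapes(audio: bytes) -> list[float]:
    """Stand-in for Audio2Face: derive facial-animation weights from audio."""
    return [len(audio) % 10 / 10.0]  # stub: one fake blendshape weight

def npc_turn(player_audio: bytes, persona: str) -> tuple[bytes, list[float]]:
    """One turn: player audio in, reply audio plus animation weights out."""
    text = speech_to_text(player_audio)          # ASR
    reply = generate_reply(text, persona)        # conversational AI
    reply_audio = text_to_speech(reply)          # TTS
    weights = audio_to_blendshapes(reply_audio)  # facial animation
    return reply_audio, weights
```

The point of the sketch is the chaining: each module consumes the previous module's output, and the final audio drives both playback and the character's facial animation in the engine.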

Jin and his ramen restaurant scene were created by the NVIDIA Lightspeed Studios art team and rendered entirely in Unreal Engine 5, using NVIDIA RTX Direct Illumination (RTXDI) for ray-traced lighting and shadows, as well as DLSS for the highest possible frame rates and image quality.

With the introduction of NVIDIA ACE for Games and its generative AI technologies, the future of NPCs in video games looks bright. Players will be able to enjoy smarter, more immersive interactions with virtual characters featuring evolving personalities, realistic dialogue, and precise facial expressions. This breakthrough opens up new possibilities for future games, delivering more engaging experiences and deeper integration of NPCs into virtual worlds. Generative AI thus propels NPCs to new horizons in the video game industry.