Computex 2023: Nvidia Demonstrates Generative AI Model Avatar Cloud Engine for Realistic NPCs in Games

As generative AI models grow more popular by the day, Nvidia is the next company in line to show off a model of its own. The chipmaker unveiled its Avatar Cloud Engine (ACE) at Computex 2023. ACE is a generative AI-based model that makes developing NPCs less labour-intensive for game developers. The technology makes NPCs smarter by generating better conversations in response to a player's inputs, while also making them look more realistic.

In its press release for Avatar Cloud Engine, Nvidia highlights what a boon generative AI is when developing NPCs for games. The blog post mentions, "generative AI can make NPCs more intelligent by improving their conversational skills, creating persistent personalities that evolve over time, and enabling dynamic responses that are unique to the player." This is what Nvidia's own Avatar Cloud Engine intends to build on. To that end, the company has collaborated with Convai.

What does Avatar Cloud Engine do?

Convai is working to optimise and integrate Avatar Cloud Engine modules into games, by building a “platform for creating and deploying AI characters.” Purnendu Mukherjee, founder and CEO at Convai said:

"With NVIDIA ACE for Games, Convai's tools can achieve the latency and quality needed to make AI non-playable characters available to nearly every developer in a cost-efficient way."

Some of the AI foundation models used to develop ACE include Nvidia Riva for speech-to-text and text-to-speech capabilities, and Nvidia NeMo for the large language model that drives the conversation. It also includes Audio2Face for AI-powered facial animation generated from voice inputs.

Nvidia showcases ACE in a demo video at Computex 2023

Nvidia demonstrated the capabilities of the Avatar Cloud Engine with a video at Computex 2023. The video, built in Unreal Engine 5, features the MetaHuman NPC Jin and his conversation with the player, Kai. The ramen shop where the video is set was rendered using RTX Direct Illumination (RTXDI) and DLSS 3.

Game developers have already started using Nvidia's generative AI technologies, as in the case of GSC Game World's upcoming S.T.A.L.K.E.R. 2: Heart of Chornobyl. Indie developer Fallen Leaf has also used Audio2Face for character facial animation in its upcoming sci-fi thriller Fort Solis.
