News Technologies 03-19-2024 at 18:42

Nvidia AI can turn game characters into chatbots – NPCs with «unique» interaction


Vadym Karpus

News writer

During the Game Developers Conference, Nvidia showed how developers can use its "digital human" artificial intelligence tools to voice, animate, and create dialogue for video game characters.

Nvidia presented a video of a technical demo called Covert Protocol, which shows how its AI tools can let NPCs react in unique ways to player interactions, generating new responses that match the gameplay.

In the demo, the player takes on the role of a private detective, completing objectives based on conversations with NPCs managed by artificial intelligence. Nvidia claims that each playthrough is "unique," as real-time player interaction leads to different game outcomes. John Spitzer, Nvidia's vice president of developer and performance technology, says the company's AI technology can deliver "the complex animations and conversational speech required to make digital interactions feel real."

The Covert Protocol demo doesn't show how effective these AI-based NPCs are in real-world gameplay; instead, it presents a selection of clips of NPCs delivering different voice lines. The voice delivery and lip-sync animations look robotic, as if a real chatbot were talking to you through the screen.

Covert Protocol was created in collaboration with gaming startup Inworld AI, using Nvidia's Avatar Cloud Engine (ACE) technology. Inworld plans to release the Covert Protocol source code "in the near future" to encourage other developers to adopt ACE's digital human technology. Inworld also announced a partnership with Microsoft in November 2023 to help develop Xbox tools for AI-powered character, story, and quest creation.

Nvidia also demonstrated its Audio2Face technology in a video of the upcoming MMO World of Jade Dynasty, showing character lip-syncing in both English and Chinese. The idea is that Audio2Face will make it easier to ship games in multiple languages without manually re-animating characters. Another video, from the upcoming action game Unawake, demonstrates how Audio2Face can be used to create facial animation during cinematics and gameplay.

Source: The Verge
