
New Nvidia AI demo gives NPCs even more reasons to complain at you

Thanks to integrated AI character creation from Inworld AI, Nvidia's updated ACE API can give game characters comprehensive backstories.


Nvidia has just demoed the latest addition to Nvidia ACE, the company's interface for making it easy for game developers to create AI-generated NPCs. With the addition of the Inworld Engine from Inworld AI, developers can now quickly and easily add personality traits and backstories to their characters, with those traits then informing how each character interacts with the player.

Nvidia previously demonstrated similar tech, powered by Convai, at CES 2024. I had the chance to interact with that demo and was impressed by how well it could hear my voice, interpret my responses, and make the character respond in a natural-feeling way. This was all powered locally using, of course, one of the best graphics card options from Nvidia.


While the Convai demo concentrated on natural interactions between AI characters, both with each other and in response to prompts from the player character, the emphasis with the new Inworld AI demo is on how characters can be given characteristics that inform their personality. You can define the character's mood, knowledge, goals, relationships, and more, all expressed in natural language. For instance, the demo character Tae is described in part by the following sentences:

“{character} was born and raised in Highlandtown, Baltimore – a neighborhood with a historically significant Korean immigrant community.

{character}’s grandparents were among the wave of Koreans who settled in the Highlandtown, Baltimore area during the 1970s and 80s, drawn by opportunities and a desire to build a better life.

While neither Alex nor Jin-Soo (the character’s friends) chose to continue post-secondary education, {character} surprised his family by enrolling in the hospitality program at a nearby Baltimore community college.”
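
To make that format concrete, here's a minimal, hypothetical sketch of how a profile like Tae's might be assembled in code before being handed to a character engine. The field names, placeholder scheme, and structure below are illustrative assumptions on my part, not Inworld's or Nvidia's actual API.

```python
# Hypothetical sketch of assembling a natural-language character profile.
# Field names and structure are illustrative only, not a real Inworld/Nvidia API.

CHARACTER_NAME = "Tae"

# Backstory template using the {character} placeholder seen in the demo text.
BACKSTORY = (
    "{character} was born and raised in Highlandtown, Baltimore - a "
    "neighborhood with a historically significant Korean immigrant community. "
    "{character} surprised his family by enrolling in the hospitality program "
    "at a nearby Baltimore community college."
)

# Trait fields mirroring the categories the demo exposes: mood, goals,
# relationships, and so on, all written in plain natural language.
character_profile = {
    "name": CHARACTER_NAME,
    "mood": "upbeat but a little guarded with strangers",
    "goals": "finish the hospitality program and make his family proud",
    "relationships": "close friends with Alex and Jin-Soo",
    "backstory": BACKSTORY.format(character=CHARACTER_NAME),
}

if __name__ == "__main__":
    for field, text in character_profile.items():
        print(f"{field}: {text}")
```

The appeal of this approach is that a writer, not a programmer, can author every one of those fields, since they're just sentences rather than stats or scripted dialogue trees.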

Where Nvidia ACE comes into the equation is that it already provides the speech recognition, text-to-speech, and speech-to-facial-animation technology that bookends and facilitates these character interactions.
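
In other words, the character engine sits in the middle of a speech-in, speech-out loop. The stub functions below are a purely illustrative sketch of that data flow; none of them are real Nvidia or Inworld API calls, just stand-ins for the stages the demo describes.

```python
# Illustrative sketch of the interaction loop a stack like Nvidia ACE plus a
# character engine could implement. Every function is a hypothetical stand-in.

def transcribe_speech(audio: bytes) -> str:
    """Stand-in for the speech recognition stage."""
    ...

def generate_reply(character_profile: dict, player_text: str) -> str:
    """Stand-in for the character engine producing an in-character reply."""
    ...

def synthesize_speech(reply_text: str) -> bytes:
    """Stand-in for the text-to-speech stage."""
    ...

def animate_face(reply_audio: bytes) -> None:
    """Stand-in for driving facial animation from the generated audio."""
    ...

def interaction_loop(character_profile: dict, player_audio: bytes) -> None:
    # Speech in: turn the player's voice into text.
    player_text = transcribe_speech(player_audio)
    # The profile (mood, goals, backstory) shapes the generated reply.
    reply_text = generate_reply(character_profile, player_text)
    # Speech out: voice the reply and drive the character's face from it.
    reply_audio = synthesize_speech(reply_text)
    animate_face(reply_audio)
```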

Annoyingly, the demo doesn’t show any example interactions with Tae, or indeed any examples of how all this information is used to inform NPC responses. There’s just one character giving three different responses to a scenario, but the context of the interaction isn’t given.

Still, it goes to show that the march of AI should hold at least some positives for game players, in the form of more natural interactions with NPCs.