
Nvidia releases plugins to improve digital human realism on Unreal Engine 5



Nvidia released its latest tech for creating AI-powered characters who look and behave like real humans.

At Unreal Fest Seattle 2024, Nvidia released its new Unreal Engine 5 on-device plugins for Nvidia Ace, making it easier to build and deploy AI-powered MetaHuman characters on Windows PCs. Ace is a suite of digital human technologies that provide speech, intelligence and animation powered by generative AI.

Developers can now access a new Audio2Face-3D plugin for AI-powered facial animation (where lips and faces move in sync with audio speech) in Autodesk Maya. The plugin gives developers a simple, streamlined interface that speeds up and simplifies avatar development in Maya. It also ships with source code, so developers can dive in and build a plugin for the digital content creation (DCC) tool of their choice.
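
As a rough illustration of the kind of data such a plugin works with, here is a minimal sketch, written against Maya's standard Python API rather than Nvidia's plugin source: audio-driven facial animation ultimately lands in Maya as keyframes on blendshape target weights. The node and target names are placeholders.

```python
# Illustrative sketch only -- not Nvidia's plugin source code.
# Audio-driven facial animation typically arrives as per-frame blendshape
# weights, which a Maya client would key onto the character's blendshape node.
from maya import cmds  # runs inside Maya's Python interpreter


def key_blendshape_target(blendshape_node, target, frames_and_weights):
    """Key one blendshape target (e.g. 'jawOpen') with per-frame weights."""
    for frame, weight in frames_and_weights:
        cmds.setKeyframe(blendshape_node, attribute=target, time=frame, value=weight)


# Hypothetical node/target names and a short weight curve for one target;
# a real session would key dozens of targets for every frame of audio.
key_blendshape_target("blendShape1", "jawOpen", [(1, 0.0), (5, 0.6), (9, 0.1)])
```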

Nvidia also built an Unreal Engine 5 renderer microservice that leverages Epic's Unreal Pixel Streaming technology. The microservice now supports the Nvidia Ace Animation Graph microservice and the Linux operating system in early access. The Animation Graph microservice enables realistic and responsive character movements, and with Unreal Pixel Streaming support, developers can stream their MetaHuman creations to any device.



Nvidia is making it easier to make MetaHumans with ACE.

The Nvidia Ace Unreal Engine 5 sample project serves as a guide for developers looking to integrate digital humans into their games and applications. The sample project now includes an expanded set of on-device Ace plugins:

  • Audio2Face-3D for lip sync and facial animation
  • Nemotron-Mini 4B Instruct for response generation
  • RAG for contextual information

Nvidia said developers can build a database full of contextual lore for their intellectual property, generate relevant responses at low latency and have those responses drive corresponding MetaHuman facial animations seamlessly in Unreal Engine 5. Each of these microservices was optimized to run on Windows PCs with low latency and a minimal memory footprint.
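
In code, that flow boils down to a retrieve-generate-animate loop. The sketch below is a conceptual outline rather than Nvidia's sample project; embed, generate_response, synthesize_speech and animate_face are hypothetical stand-ins for the RAG embedding model, Nemotron-Mini 4B Instruct, a speech service and Audio2Face-3D.

```python
# Conceptual sketch of the sample project's flow -- not Nvidia's actual code.
# The helper callables passed in are hypothetical stand-ins for the microservices.
import numpy as np


def retrieve_lore(question, lore_db, embed):
    """Return the lore passage whose embedding is most similar to the question."""
    q = embed(question)

    def score(vec):
        return float(q @ vec / (np.linalg.norm(q) * np.linalg.norm(vec)))

    return max(lore_db, key=lambda item: score(item[1]))[0]


def answer_as_character(question, lore_db, embed, generate_response,
                        synthesize_speech, animate_face):
    context = retrieve_lore(question, lore_db, embed)   # RAG: pull contextual lore
    reply_text = generate_response(question, context)   # LLM (e.g. Nemotron-Mini 4B Instruct)
    reply_audio = synthesize_speech(reply_text)         # speech for the character's voice
    blendshapes = animate_face(reply_audio)             # Audio2Face-3D facial animation
    return reply_text, reply_audio, blendshapes         # drive the MetaHuman with these
```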

Nvidia unveiled a series of tutorials for setting up and using the Unreal Engine 5 plugins. The new plugins are coming soon; to get started today, developers should download the appropriate Nvidia Ace plugin and the Unreal Engine sample, along with a MetaHuman character.

Autodesk Maya offers high-performance animation functions for game developers and technical artists to create high-quality 3D graphics. Now developers can more easily generate high-quality, audio-driven facial animation for any character with the Audio2Face-3D plugin. The user interface has been streamlined, and developers can seamlessly transition to the Unreal Engine 5 environment. The source code and scripts are highly customizable and can be modified for use in other digital content creation tools.

To get started on Maya, devs can get an API key or download the Audio2Face-3D NIM. Nvidia NIM is a set of easy-to-use AI inference microservices that speed up the deployment of foundation models on any cloud or data center. Developers also need Autodesk Maya 2023, 2024 or 2025. Then they can access the Maya ACE GitHub repository, which includes the Maya plugin, gRPC client libraries, test assets and a sample scene: everything needed to explore, learn and innovate with Audio2Face-3D.
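Since the repository ships gRPC client libraries, a call to the Audio2Face-3D service might look roughly like the sketch below. The module, stub and message names (audio2face_pb2, audio2face_pb2_grpc, Audio2FaceStub, AudioRequest, GetFacialAnimation) are placeholders for illustration; the real definitions come from the .proto files and generated clients in the Maya ACE repository.

```python
# Hypothetical sketch of a gRPC call to an Audio2Face-3D endpoint.
# All service/message names below are placeholders, not the real API surface.
import grpc

import audio2face_pb2        # placeholder: generated from the repo's .proto files
import audio2face_pb2_grpc   # placeholder: generated gRPC stubs


def request_facial_animation(audio_path, host="localhost:52000"):
    with open(audio_path, "rb") as f:
        audio_bytes = f.read()

    # A local NIM deployment would typically be reachable on a plain channel;
    # a hosted endpoint would use grpc.secure_channel plus an API key in metadata.
    with grpc.insecure_channel(host) as channel:
        stub = audio2face_pb2_grpc.Audio2FaceStub(channel)
        request = audio2face_pb2.AudioRequest(audio=audio_bytes, sample_rate=16000)
        response = stub.GetFacialAnimation(request)

    # The response would carry per-frame blendshape weights to key onto the character.
    return response


if __name__ == "__main__":
    print(request_facial_animation("line_01.wav"))
```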

Developers deploying digital humans through the cloud are trying to reach as many customers as possible at once; however, streaming high-fidelity characters requires significant compute resources. Today, the latest Unreal Engine 5 renderer microservice in Nvidia Ace adds support for the Nvidia Animation Graph microservice and the Linux operating system in early access.

Animation Graph is a microservice that facilitates the creation of animation state machines and blend trees. It gives developers a flexible node-based system for animation blending, playback and control.
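To make those two concepts concrete, here is a tiny, generic sketch of an animation state machine plus a blend node in Python. It is not the Animation Graph microservice's API, just the underlying ideas: a state machine decides which clips should be active, and a blend node mixes per-clip poses by weight.

```python
# Generic illustration of an animation state machine and a blend node.
# Not Nvidia's Animation Graph API.
from dataclasses import dataclass, field


@dataclass
class BlendNode:
    # Maps clip name -> blend weight; weights are normalized when evaluated.
    weights: dict[str, float] = field(default_factory=dict)

    def evaluate(self, poses: dict[str, list[float]]) -> list[float]:
        total = sum(self.weights.values()) or 1.0
        size = len(next(iter(poses.values())))
        out = [0.0] * size
        for clip, weight in self.weights.items():
            for i, value in enumerate(poses[clip]):
                out[i] += value * (weight / total)
        return out


@dataclass
class AnimationStateMachine:
    transitions: dict[tuple[str, str], str]  # (state, event) -> next state
    state: str = "idle"

    def on_event(self, event: str) -> str:
        self.state = self.transitions.get((self.state, event), self.state)
        return self.state


# Example: switch from "idle" to "talking" when speech audio arrives, and blend
# an idle pose with a talking pose during the transition.
machine = AnimationStateMachine(transitions={("idle", "speech_started"): "talking",
                                             ("talking", "speech_ended"): "idle"})
machine.on_event("speech_started")
blend = BlendNode(weights={"idle": 0.3, "talking": 0.7})
pose = blend.evaluate({"idle": [0.0, 0.1, 0.0], "talking": [0.6, 0.4, 0.2]})
print(machine.state, pose)
```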

The new Unreal Engine 5 renderer microservice with pixel streaming consumes data coming from the Animation Graph microservice, allowing developers to run their MetaHuman character on a server in the cloud and stream its rendered frames and audio to any browser or edge device over Web Real-Time Communication (WebRTC).
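
On the receiving end, that is a standard WebRTC session. The sketch below shows what a minimal receiver could look like using the aiortc Python library; the signaling exchange is simplified placeholder JSON, since Pixel Streaming's actual signaling protocol has its own message schema, and the ws:// URL is a stand-in for a real signaling server.

```python
# Minimal, assumption-heavy sketch of receiving a WebRTC stream with aiortc.
# The signaling messages and endpoint below are placeholders, not Pixel Streaming's
# real signaling protocol.
import asyncio
import json

import websockets
from aiortc import RTCPeerConnection, RTCSessionDescription
from aiortc.contrib.media import MediaRecorder


async def run(signaling_url: str) -> None:
    pc = RTCPeerConnection()
    recorder = MediaRecorder("metahuman_stream.mp4")  # save incoming video/audio to disk

    @pc.on("track")
    def on_track(track):
        # The renderer microservice would publish rendered-frame video and audio tracks.
        recorder.addTrack(track)

    async with websockets.connect(signaling_url) as ws:
        # Placeholder offer/answer exchange: the server sends an SDP offer,
        # the client answers. Real signaling also relays ICE candidates.
        offer_msg = json.loads(await ws.recv())
        await pc.setRemoteDescription(
            RTCSessionDescription(sdp=offer_msg["sdp"], type=offer_msg["type"])
        )
        answer = await pc.createAnswer()
        await pc.setLocalDescription(answer)
        await ws.send(json.dumps({"type": "answer", "sdp": pc.localDescription.sdp}))

        await recorder.start()
        await asyncio.sleep(30)  # receive ~30 seconds of the stream
        await recorder.stop()
        await pc.close()


if __name__ == "__main__":
    asyncio.run(run("ws://localhost:80"))  # placeholder signaling endpoint
```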

Devs can apply for early access to download the Unreal Engine 5 renderer microservice today. You can learn more about Nvidia Ace and download the NIM microservices to begin building game characters powered by generative AI.

Developers will be able to apply for early access to download the Unreal Engine 5 renderer microservice with support for the Animation Graph microservice and Linux OS. The Maya Ace plugin is available to download on GitHub.


