Nvidia creates a realistic metaverse

Software and hardware maker Nvidia has unveiled an update to Omniverse, its collaboration platform for 3D design and real-time, scalable simulation. The updates are intended to make it easier to build more realistic metaverse environments and digital avatars.

The updates were announced at SIGGRAPH and include new AI capabilities, simulation features, and other creative tools.

“New features allow users to create physically accurate digital doubles and realistic avatars, as well as redefine the way they create and perceive virtual worlds,” the company said in its announcement.

Specific updates include accurate facial animation and avatar emotion controls in Audio2Face, as well as improvements to existing PhysX features that help 3D virtual worlds obey the laws of physics.


During the presentation, Nvidia also introduced the Omniverse Avatar Cloud Engine (ACE), a suite of cloud-based AI models that lets developers of games, chatbots, digital twins, and virtual worlds build and deploy interactive avatars.

“With Omniverse ACE, developers can build, customize, and deploy their avatar apps on virtually any engine, in any public or private cloud,” Nvidia added.

Meanwhile, Nvidia is no stranger to Web3-related technology: its line of cryptocurrency mining chips generates a significant portion of the company’s revenue.
