As part of its Omniverse metaverse program, Nvidia has announced Avatar Cloud Engine (ACE), a collection of cloud-native AI tools to assist metaverse developers.
Omniverse ACE provides a set of cloud-based AI models and services that let developers build, customize, and deploy lifelike virtual assistants and digital humans on virtually any engine, in any public or private cloud. Avatars created with ACE can understand multiple languages, interact with their environment, and respond to a variety of requests.
ACE builds on several existing Nvidia software suites and frameworks. Its capabilities are delivered through the tools and APIs of Nvidia's Unified Compute Framework, which includes:
- Nvidia Riva for developing speech AI applications
- Nvidia Metropolis for computer vision and intelligent video analytics
- Nvidia Merlin for high-performance recommender systems
- Nvidia NeMo Megatron for large language models with natural language understanding
- Nvidia Omniverse for AI-enabled animation
Developers using the Nvidia Omniverse Kit will have access to the update, which is compatible with applications such as Nucleus, Audio2Face, and Machinima.
Nvidia has also released an improved version of its real-time physics simulation engine, Nvidia PhysX. With it, developers can make objects in the metaverse react realistically to interactions, obeying the laws of physics, as in the sketch below.
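To give a sense of what real-time rigid-body simulation with PhysX looks like in practice, here is a minimal C++ sketch that drops a box onto a ground plane and steps the scene at 60 Hz. It assumes the standalone PhysX SDK headers (PxPhysicsAPI.h) are available and is a bare-bones illustration, not Nvidia's Omniverse or ACE integration.

```cpp
#include <PxPhysicsAPI.h>
using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main() {
    // Core SDK objects
    PxFoundation* foundation = PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics*    physics    = PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene with standard gravity and a small CPU dispatcher
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.cpuDispatcher = dispatcher;
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Static ground plane
    PxMaterial*    material = physics->createMaterial(0.5f, 0.5f, 0.6f);
    PxRigidStatic* ground   = PxCreatePlane(*physics, PxPlane(0, 1, 0, 0), *material);
    scene->addActor(*ground);

    // Dynamic 1 m box dropped from a height of 10 m
    PxRigidDynamic* box = PxCreateDynamic(*physics, PxTransform(PxVec3(0, 10, 0)),
                                          PxBoxGeometry(0.5f, 0.5f, 0.5f), *material, 1.0f);
    scene->addActor(*box);

    // Step the simulation for 5 seconds at 60 Hz
    for (int i = 0; i < 300; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    // After the fall, the box should rest on the plane (y close to 0.5)
    PxTransform pose = box->getGlobalPose();
    (void)pose;

    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}
```

In an interactive metaverse scene the same simulate/fetchResults loop would run every frame, with avatar and user interactions applying forces or spawning actors between steps.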