NVIDIA ACE platform rolls out, bringing cutting-edge AI tools to the cloud and PCs

NVIDIA has just launched its suite of Digital Human Microservices, making the NVIDIA ACE platform generally available for cloud services and offering early access for RTX AI PCs. This development marks a significant leap forward in the creation and operation of lifelike digital humans, with applications spanning customer service, gaming, and healthcare. Notable companies like Dell Technologies, ServiceNow, and Perfect World Games are already integrating these advanced technologies.

The ACE platform includes a comprehensive set of tools designed to enhance the realism and functionality of digital humans:

  • NVIDIA Riva for automatic speech recognition, text-to-speech conversion, and translation,
  • NVIDIA Nemotron for language understanding and contextual response generation,
  • NVIDIA Audio2Face™ for realistic facial animations based on audio tracks,
  • NVIDIA Omniverse RTX for real-time, path-traced realistic skin and hair rendering.

In addition to these established tools, NVIDIA has announced new technologies:

  • NVIDIA Audio2Gesture™ for generating body gestures based on audio tracks, set to be available soon,
  • NVIDIA Nemotron-3 4.5B, a small language model (SLM) purpose-built for low-latency, on-device RTX AI PC inference, now in early access.
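Taken together, these microservices form a pipeline: speech recognition feeds a language model, whose reply is synthesized to audio and then drives facial animation. The sketch below illustrates that flow in Python; the function names, data shapes, and return values are placeholders for illustration only, not the actual NVIDIA ACE APIs.

```python
# Illustrative sketch of a digital-human pipeline chaining ACE-style stages.
# Every function here is a hypothetical stand-in, not a real NVIDIA API call.

def speech_to_text(audio_chunk: bytes) -> str:
    # Placeholder for an ASR stage (Riva-style): audio in, transcript out.
    return "hello, how can I help?"

def generate_reply(transcript: str) -> str:
    # Placeholder for a language-model stage (Nemotron-style).
    return f"You said: {transcript}"

def text_to_speech(reply: str) -> bytes:
    # Placeholder for a TTS stage: text in, synthesized audio out.
    return reply.encode("utf-8")

def animate_face(audio: bytes) -> dict:
    # Placeholder for an Audio2Face-style stage: audio in,
    # facial-animation parameters (e.g. blendshape weights) out.
    return {"audio_bytes": len(audio), "blendshapes": ["jawOpen", "mouthSmile"]}

def run_pipeline(audio_chunk: bytes) -> dict:
    # Chain the four stages end to end, as the article's list describes.
    transcript = speech_to_text(audio_chunk)
    reply = generate_reply(transcript)
    audio = text_to_speech(reply)
    return animate_face(audio)
```

In a real deployment each stage would be a network call to the corresponding microservice; the point of the sketch is only the ordering of the stages.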

NVIDIA’s CEO, Jensen Huang, emphasized the transformative potential of these technologies, stating, “Digital humans will revolutionize industries.” Huang highlighted that breakthroughs in multi-modal large language models and neural graphics are paving the way for more natural human-computer interactions.

With the deployment of NVIDIA ACE microservices, developers can now bring digital humans to the installed base of 100 million RTX AI PCs and laptops. The new NVIDIA AI Inference Manager software development kit simplifies this process by preconfiguring PCs with necessary AI models, engines, and dependencies, seamlessly orchestrating AI inference across both PCs and the cloud.
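The orchestration idea described above, running inference on-device when possible and falling back to the cloud otherwise, can be sketched as a simple routing policy. The class and function names below are assumptions for illustration; they are not the actual NVIDIA AI Inference Manager SDK interface.

```python
# Hypothetical sketch of hybrid local/cloud inference routing, in the
# spirit of the orchestration the article describes. Not the real SDK.

from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    available: bool          # engine loaded and ready to serve?
    est_latency_ms: float    # rough expected latency per request

def pick_backend(local: Backend, cloud: Backend) -> Backend:
    # Prefer on-device inference when the local engine is ready,
    # fall back to the cloud endpoint, and fail loudly if neither works.
    if local.available:
        return local
    if cloud.available:
        return cloud
    raise RuntimeError("no inference backend available")
```

A usage example: with a ready local RTX engine, `pick_backend(local, cloud)` returns the local backend even if the cloud endpoint is also up, keeping latency low and traffic on-device.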

The ACE platform is already making a significant impact:

  • Aww Inc., a leading virtual human company, is leveraging ACE Audio2Face microservices to enhance real-time animation and user interaction,
  • Perfect World Games is using ACE in its new mythological wilderness tech demo, allowing players to interact with AI-driven non-playable characters (NPCs) in multiple languages,
  • Inventec is integrating ACE Audio2Face into its VRSTATE platform to improve virtual consultations in healthcare, providing a more engaging experience for patients,
  • ServiceNow showcased ACE NIM in a generative AI service agent demo, illustrating the potential for digital avatars to enhance interactions across various industries.

Additionally, NVIDIA’s art teams used ACE to create a “digital Jensen” avatar for COMPUTEX 2024, demonstrating the platform’s capabilities alongside generative AI tools like Synthesia and Hour One. For more information on NVIDIA ACE and its applications, watch Jensen Huang’s COMPUTEX keynote or visit the NVIDIA News Center.

Leave a comment on which of these technologies you would like us to explore in-depth next.

Additional information: https://nvidianews.nvidia.com/news/digital-humans-ace-generative-ai-microservices

