Nvidia bangs its fist on the metaverse table: discover the new tools for developers

In order to stimulate the development of the metaverse and make it more realistic, Nvidia has just unveiled new tools for creators of virtual worlds. On the program: realistic, emotive avatars, embodied virtual assistants, physics simulations, and even a 3D internet…

Since the beginning of summer 2022, the hype around the metaverse seems to be fading. The virtual real estate market is in free fall, and the creator of Ethereum recently estimated that Mark Zuckerberg’s project is doomed.

Nevertheless, one of the major players in the metaverse industry sees things differently. The graphics card manufacturer Nvidia has not said its last word and intends to rekindle enthusiasm for the metaverse.

On Tuesday, August 9, 2022, at the SIGGRAPH conference in Vancouver, Nvidia announced a series of new Omniverse tools for creators and developers of virtual worlds. The objective is to make metaverse environments more realistic.

The firm notably unveiled new artificial intelligence and simulation tools. Users of the Omniverse Kit and tools like Nucleus, Audio2Face and Machinima will be able to take advantage of new creative possibilities.

According to Nvidia, these tools will mainly improve the construction of “look-alike digital twins and realistic avatars”.

Omniverse Avatar Cloud Engine (ACE): a tool for creating realistic 3D avatars

The quality of interactions in the metaverse is a subject of debate. Developers and users are still looking for the right balance between quantity and quality of experience.

For example, in the spring of 2022, during the first Fashion Week organized in the metaverse, most participants deplored the poor quality of the digital environments, the virtual clothing and especially the avatars.

With the Avatar Cloud Engine (ACE) included in the new Omniverse toolkit, Nvidia intends to solve this problem. This engine will improve the conditions for creating “virtual assistants and digital humans”.

According to the firm, with Omniverse ACE, developers can “build, configure, and deploy avatar applications on almost any engine, on any public or private cloud”.

As Simon Yuen, Nvidia’s Senior Director of Avatar Technology, explains, ACE is built on a 3D framework that models the human skeleton and muscles. Uploading a single photo is enough to create a 3D model of the person.

Audio2Face maps emotions onto avatars’ faces

The Audio2Face application also benefits from an update focused on digital identity. This technology makes it possible to match an avatar’s expressions to the words it speaks.

In addition, the Audio2Emotion tool now makes it possible to change an avatar’s facial expression according to the emotion conveyed by the user’s words.

According to Nvidia’s official press release, users can now control the emotions of digital avatars, including their facial animations. Developers may choose to accentuate emotions or, on the contrary, to attenuate them.
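
As a concrete illustration of this kind of control, here is a minimal Python sketch in which detected emotion weights are scaled up or down before driving an avatar’s facial animation. It is purely illustrative: the names and the weight format are hypothetical and do not come from Nvidia’s Audio2Face API.

```python
# Hypothetical sketch, not Nvidia's Audio2Face API: per-emotion gains
# let a developer accentuate (>1.0) or attenuate (<1.0) detected emotions.

# Emotion weights as an audio analysis stage might output them (0.0-1.0).
emotion_weights = {"joy": 0.7, "anger": 0.1, "sadness": 0.05}

# Developer-tunable gain per emotion.
EMOTION_GAIN = {"joy": 1.3, "anger": 0.5, "sadness": 1.0}

def adjust_emotions(weights, gains):
    """Scale each emotion weight by its gain and clamp the result to [0, 1]."""
    return {name: min(1.0, max(0.0, w * gains.get(name, 1.0)))
            for name, w in weights.items()}

adjusted = adjust_emotions(emotion_weights, EMOTION_GAIN)
print(adjusted)  # {'joy': 0.91, 'anger': 0.05, 'sadness': 0.05}
```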

Virtual assistants with artificial intelligence

The ACE engine will also be used to create digital assistants embodied by avatars for the metaverse. Nvidia’s CEO, Jensen Huang, wants to enable the creation of veritable robots capable of perceiving their environment, exploiting their knowledge, and acting accordingly.

According to him, “avatars will populate virtual worlds to help us create and build things, will be brand ambassadors and customer service agents, will help you find information on a website, will take your order at the drive-thru, or will recommend insurance”.

According to Technavio analysts, the metaverse market is expected to surpass a $50 billion valuation by 2026. By these predictions, the technology should therefore take off and attract new users.

Already today, the virtual world is developing: new events, offices and even classrooms are appearing. As it becomes populated, this digital universe will attract more and more users.

These users will seek to create digital versions of themselves. Developing tools to support this mass adoption of the metaverse is therefore essential.

Universal Scene Description: Towards a 3D Internet?

In the eyes of Nvidia’s CEO, the metaverse is “the next evolution of the internet”. He affirms that “the metaverse is the internet in 3D, a network of persistent and connected virtual worlds. The metaverse will expand 2D pages into 3D spaces and worlds. Hyperlinking will evolve into hyperjumping between 3D worlds”.

However, for this prediction to come true, the metaverse will need a standard that plays the role HTML plays for web pages. This is why Nvidia supports the development of the Universal Scene Description (USD) format, originally created by the Pixar animation studio.

Nvidia’s ambition is to help develop the USD format so that it can support complex, changing 3D worlds, not just animated movie scenes.
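
For readers unfamiliar with the format, here is a minimal sketch of authoring a USD scene with Pixar’s open-source Python bindings (the pxr module, installable as the usd-core package). The file name and prim names are illustrative choices, not part of Nvidia’s announcement.

```python
# Minimal USD authoring sketch using Pixar's official Python bindings.
from pxr import Usd, UsdGeom

# Create a new stage backed by a human-readable .usda file.
stage = Usd.Stage.CreateNew("hello_metaverse.usda")

# Define a transform prim to serve as the scene root.
UsdGeom.Xform.Define(stage, "/World")

# Add a simple sphere under the root and set its radius.
sphere = UsdGeom.Sphere.Define(stage, "/World/Ball")
sphere.GetRadiusAttr().Set(0.5)

# Persist the layer to disk; any USD-aware tool can open the result.
stage.GetRootLayer().Save()
```

The resulting .usda file is plain text, which hints at why USD is a plausible “HTML of the metaverse”: scenes can be referenced, layered and composed across tools, much like linked web pages.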

Other companies supporting this new standard include Apple, Epic Games, Autodesk, BMW, Disney, Industrial Light & Magic and DreamWorks.

Another tool unveiled by Nvidia is PhysX, a realistic real-time physics simulation engine. Developers will be able to use it to add physics-based reactions to the metaverse.
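
To give an idea of what such an engine computes each frame, here is a minimal Python sketch of semi-implicit Euler integration, the basic update step behind a rigid body falling under gravity. This is a generic illustration, not PhysX code (PhysX itself is a C++ SDK).

```python
# Generic physics-step sketch, not PhysX code: semi-implicit Euler
# integration of a single body under gravity.

GRAVITY = -9.81  # m/s^2 along the vertical (y) axis

def step(position, velocity, dt):
    """Advance one body by a single timestep dt (seconds)."""
    # Update velocity first (semi-implicit Euler), then position.
    velocity = (velocity[0], velocity[1] + GRAVITY * dt, velocity[2])
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity

# Drop a body from 10 m and simulate one 60 Hz frame.
pos, vel = step((0.0, 10.0, 0.0), (0.0, 0.0, 0.0), 1.0 / 60.0)
```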

Since the emergence of the metaverse, Nvidia has been one of the pioneers of this new industry. Its artificial intelligence technology is widely used by creators, and these new developer tools will further strengthen the firm’s influence.
