How Google continues to improve human-computer interactions with AI

At its Google I/O conference, the American group unveiled multiple artificial intelligence advances woven into its flagship applications.

During its annual conference, held on May 11 and 12, the Mountain View firm presented its Pixel 6a and Pixel Buds Pro. The giant also took advantage of the event to preview its future Pixel Watch, Pixel 7 and Pixel Tablet, as well as a prototype of augmented reality glasses designed to make conversations easier.

Initially aimed at developers, Google I/O was an opportunity for Sundar Pichai's group to showcase a whole series of new features that will soon be available within its services. The juggernaut highlighted several advances in artificial intelligence (AI), notably around translation. Its tools have evolved significantly over the past few years, with the integration of automatic, real-time translation of YouTube content, the transcription of dialogues and, above all, support for an ever-growing number of languages in Google Translate. The use of artificial intelligence at the heart of collaborative work features, the voice assistant and online search was also central to the annual conference.

Automatic summaries powered by AI in Google Docs and Google Chat

Emails and instant conversations are an integral part of professionals' daily lives. Between meetings and tasks piling up in the calendar, it can sometimes be difficult to find the time (and the courage!) to dive into a long report or meeting minutes. Google Docs should soon help users save valuable time while keeping track of essential information. With the support of artificial intelligence, Google Docs will soon offer an automatic summary: a way to quickly scan the main topics covered in a document before reading it in more detail later. This should limit the risk of information loss, a major issue for many companies today.

This feature will also be available in Google Chat and Spaces: a quick and efficient way to discover the key ideas that have been exchanged, without having to reread dozens or even hundreds of lines of discussion.
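Google has not detailed the model behind these summaries, but the feature is an instance of abstractive summarization. Here is a minimal sketch of the technique using the publicly available google/pegasus-xsum checkpoint and the Hugging Face transformers library; this is an illustrative stand-in, not the model Docs actually runs:

```python
# Minimal abstractive-summarization sketch, NOT Google's actual Docs pipeline.
# Assumes the `transformers` library and the public google/pegasus-xsum checkpoint.
from transformers import pipeline

summarizer = pipeline("summarization", model="google/pegasus-xsum")

document = (
    "The quarterly review meeting covered the migration to the new CRM, "
    "hiring plans for the data team, and a revised launch date for the "
    "mobile app, now scheduled for October after QA flagged login issues."
)

# Generate a one-to-two-sentence summary of the document.
summary = summarizer(document, max_length=60, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```

The idea is the same as what the article describes: the model writes a short digest in its own words rather than extracting sentences verbatim, so the reader gets the main topics at a glance.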

Chat more naturally with Google Assistant

Look and Talk: the days of saying "Ok Google" are over; one look is enough to trigger the voice assistant. ©Google

Voice assistants are everywhere. They make it possible to control smartphones, but also a large number of connected home devices. Over time, it can be tiring to have to say "Ok Google" before every new request. To streamline interactions with the voice assistant and make the human-machine interface less obtrusive, Google announced the "Look and Talk" function for its Nest Hub Max smart display.

With this new feature, all you have to do is look at the device and then state your request ("what is the weather today?"). The device's camera and microphone run facial and/or voice recognition to verify that it is indeed an authorized user. The sensors then analyze around a hundred signals, such as head tilt and lip movement, to detect whether the person intends to issue a voice command.
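Google has not published the decision logic, but the behavior described amounts to gating the microphone on a combination of identity and intent signals. Here is a toy sketch of that kind of gating; every signal name, weight and threshold below is a hypothetical placeholder, not the device's real model:

```python
# Toy sketch of "Look and Talk"-style gating, not Google's implementation.
# All signals, weights and thresholds here are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class Signals:
    face_match: bool       # camera recognized an enrolled user
    voice_match: bool      # microphone recognized an enrolled voice
    gaze_on_device: float  # 0..1 estimate that the user is looking at the display
    head_toward: float     # 0..1 estimate that the head is tilted toward the device
    lips_moving: float     # 0..1 estimate that the lips are articulating speech

def should_listen(s: Signals) -> bool:
    # Require an authorized user before anything else.
    if not (s.face_match or s.voice_match):
        return False
    # Combine intent cues; the real device reportedly weighs ~100 such signals.
    intent = 0.5 * s.gaze_on_device + 0.3 * s.head_toward + 0.2 * s.lips_moving
    return intent > 0.7

print(should_listen(Signals(True, True, 0.9, 0.8, 0.6)))   # True: clear intent
print(should_listen(Signals(True, False, 0.2, 0.3, 0.1)))  # False: not engaging
```

The two-stage structure mirrors the article's description: first confirm who is looking, then decide whether they are actually addressing the device.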

Quick phrases also on the program for frequent commands

In addition, still with a view to offering an alternative to the "Ok Google" wake phrase, but also to allow the machine to better grasp the subtleties of human language, the voice assistant on the Pixel 6 and Nest Hub Max will integrate "quick phrases". As the American company specifies, these are pre-programmed short sentences corresponding to the most common requests, such as "turn on the hallway lights" or "set a timer for 10 minutes".
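Functionally, quick phrases amount to a small whitelist of commands matched against the speech transcription without a wake phrase. A minimal sketch of such a lookup, where the phrase list and handlers are invented for illustration:

```python
# Minimal quick-phrase matcher sketch; phrases and handlers are illustrative.
QUICK_PHRASES = {
    "turn on the hallway lights": lambda: print("hallway lights: on"),
    "set a timer for 10 minutes": lambda: print("timer: 10:00 started"),
}

def handle_transcription(text: str) -> bool:
    """Run the matching action if the utterance is an enabled quick phrase."""
    action = QUICK_PHRASES.get(text.strip().lower())
    if action:
        action()   # no "Ok Google" needed for these pre-enrolled phrases
        return True
    return False   # otherwise, fall back to requiring the usual wake phrase

handle_transcription("Set a timer for 10 minutes")
```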

Initially, these features will only be available in the United States. The group has not yet announced an international rollout date.

The future of visual search is shaping up

In addition to these two practical everyday features, AI will also bolster visual search through Lens. It will be possible to find a nearby restaurant based on a photo of a dish, or a store offering a specific item. All you have to do is add "near me" to the query for the feature to locate the item in stock around you, or the nearest restaurant where you can taste the dish.
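Google has not documented how Lens wires this together, but conceptually the flow is: identify the object in the photo, then query a local index with the user's position. A rough sketch with hypothetical stand-in functions (classify_image and local_inventory are invented for illustration, not a real Lens API):

```python
# Conceptual sketch of a "near me" multisearch flow, not the Lens API.
# classify_image() and local_inventory() are hypothetical stand-ins.
from typing import List

def classify_image(photo_bytes: bytes) -> str:
    """Stand-in for an image model that names the dish or product."""
    return "chicken tikka masala"  # hypothetical model output

def local_inventory(item: str, lat: float, lon: float) -> List[str]:
    """Stand-in for a local index of restaurants/stores near (lat, lon)."""
    return ["Spice Route, 0.4 km", "Curry House, 1.1 km"]

def multisearch_near_me(photo: bytes, lat: float, lon: float) -> List[str]:
    item = classify_image(photo)            # what is in the picture?
    return local_inventory(item, lat, lon)  # where can I get it nearby?

print(multisearch_near_me(b"...", 48.8566, 2.3522))
```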

Finally, Google is becoming more inclusive, with support for a wider range of skin tones, hair colors and textures. Queries are thus refined and each characteristic better indexed, for more relevant results.
