Google I/O 2022: Search, Maps, Translate, Meet, YouTube, all the news – CNET France


Before getting to Android 13 and, perhaps, the Pixel 6a, the Pixel Watch and other hardware, Google unveiled new features for Maps, YouTube, Meet and Search.

A massive update for Google Translate

Sundar Pichai, Google's CEO, said the company now uses artificial intelligence to make translation easier, and it is adding 24 new languages to Google Translate.
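Google did not detail the models behind this on stage, but the underlying task, neural machine translation, can be sketched with open-source tools. The minimal Python example below assumes the Hugging Face transformers library and the public Helsinki-NLP/opus-mt-en-fr model as stand-ins, not Google's own system:

# Illustrative only: open-source neural machine translation, not Google Translate.
from transformers import pipeline

# Load a pretrained English-to-French translation model from the Hugging Face hub.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")

result = translator("Google is adding 24 new languages to Translate.")
print(result[0]["translation_text"])  # the French translation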

Immersive View for Maps

Google Maps uses computer vision to increase the number of buildings it can display on the map.

3D mapping is also progressing: a user can, for example, explore a 3D view of London and check the weather and traffic. Google has also introduced more eco-friendly routes, which help users save fuel.


3D view of London in Google Maps

It is also possible to see the interior of places such as restaurants, thanks to drone footage.

YouTube updates and accurate information from Ukraine

Auto-generated chapters will help users find the right part of a YouTube video: using speech recognition models, YouTube will generate chapters automatically and add auto-translated captions.
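YouTube has not described its exact pipeline, but the idea of turning a speech-recognition transcript into chapter markers can be sketched with open-source tools. The example below assumes the openai-whisper package and a hypothetical local audio file, and simply groups transcript segments into rough two-minute chapters:

# Illustrative only: deriving rough chapter markers from a speech transcript,
# not YouTube's pipeline. Assumes the open-source openai-whisper package and a
# hypothetical "video_audio.mp3" file.
import whisper

model = whisper.load_model("base")
result = model.transcribe("video_audio.mp3")

# Start a new chapter roughly every 120 seconds and title it with the first
# words spoken in that chapter.
CHAPTER_LEN = 120.0
chapters = []
for seg in result["segments"]:
    if not chapters or seg["start"] - chapters[-1]["start"] >= CHAPTER_LEN:
        chapters.append({"start": seg["start"], "title": seg["text"].strip()[:60]})

for ch in chapters:
    minutes, seconds = divmod(int(ch["start"]), 60)
    print(f"{minutes:02d}:{seconds:02d}  {ch['title']}")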

Google is adding Ukrainian to auto-translated captions to bring more accurate information from YouTubers to viewers around the world.

TL;DR

Google is introducing an automatic-summary feature that uses artificial intelligence to give users a short summary (TL;DR) of a long document. The feature will also come to Google Chat, where it will summarize long conversations.
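Google has not said which model powers this, but automatic summarization itself can be illustrated with open-source tools. A minimal sketch, assuming the Hugging Face transformers library and the public facebook/bart-large-cnn model as a stand-in:

# Illustrative only: automatic summarization with an open-source model,
# not Google's Docs or Chat feature.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

document = (
    "Google I/O 2022 brought updates across Search, Maps, Translate, Meet and "
    "YouTube: 24 new languages in Translate, an Immersive View for Maps, "
    "auto-generated chapters on YouTube and automatic summaries in Docs and Chat."
)

tl_dr = summarizer(document, max_length=40, min_length=10, do_sample=False)
print(tl_dr[0]["summary_text"])  # the generated TL;DR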

Improved image quality on Google Meet

Google Meet’s audio and video quality will be enhanced with technology from Project Starline, first announced at Google I/O 2021. Studio-grade virtual lighting is also coming to Google Meet, brightening the user's face so the image looks more lifelike.

Understanding natural language with Google Multisearch

Google unveiled Multisearch earlier this year. The service lets you take a photo, search with that photo, and refine the search with text. For example, if you take a photo of a red dress, you get similar results, and you can add “green” to see that dress in another shade.

Google is adding a “near me” feature that lets you take a photo and see if that item is available nearby. So if you’re looking for a specific part to repair a sink, you can take a picture of that part and see if it’s for sale near you.
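Google has not explained how Multisearch works internally, but combining a photo with a text refinement into a single query can be sketched with the open-source CLIP model. Everything below, including the red_dress.jpg file and the refinement string, is a hypothetical illustration rather than Google's implementation:

# Illustrative only: fusing an image and a text refinement into one query
# embedding with open-source CLIP, not Google's Multisearch.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("red_dress.jpg")        # the photo the user took (hypothetical)
refinement = "the same dress in green"     # the text added to the search

inputs = processor(text=[refinement], images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    image_emb = model.get_image_features(pixel_values=inputs["pixel_values"])
    text_emb = model.get_text_features(input_ids=inputs["input_ids"],
                                       attention_mask=inputs["attention_mask"])

# Add the two embeddings into a single query vector; candidate product images
# can then be ranked against it by cosine similarity.
query = torch.nn.functional.normalize(image_emb + text_emb, dim=-1)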


Scene exploration is a powerful new search tool. You can point the camera at a scene and get relevant information about what is in it. For example, aim it at a store shelf full of candy bars to see which bars have the best reviews, which helps you choose the best one instantly instead of searching for each of them individually.


Better representation of skin tones

Google will use the Monk Skin Tone Scale, developed by Dr. Ellis Monk, to better represent skin tone in photos and searches. When searching for makeup, users will be able to filter based on skin tone.

Google says it’s making its skin tone guide open source, allowing all companies and researchers to use the technology. The goal is to improve it over time with the rest of the industry. Those interested can visit skintone.google.


The skin tone scale is open source

LaMDA 2: Google’s most advanced conversational AI

Google developed LaMDA to help Search understand what people are looking for. LaMDA 2 keeps the conversational AI on topic better and keeps delivering relevant information. In a live demonstration of the prompt “I want to plant a vegetable garden”, LaMDA gave not only relevant information but also additional advice the user might not have thought of, surfacing tips the tester had never seen before.

On stage, Sundar Pichai acknowledged that the AI could produce offensive or inaccurate responses. Pichai wants to work with users and software developers to improve LaMDA.

AI helps make search smarter

Using chain-of-thought prompting, Google's PaLM model lets developers see how the AI reasons its way to an answer.
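Google has not shared PaLM's prompts, but chain-of-thought prompting as a technique is simple to illustrate: the model is shown a worked, step-by-step example before the real question. The sketch below is hypothetical; generate stands in for whatever large-language-model API is available:

# Illustrative only: chain-of-thought prompting. The worked example and the
# `generate` callable are hypothetical stand-ins, not Google's PaLM API.
COT_PROMPT = """Q: A cafeteria had 23 apples. It used 20 to make lunch and bought 6 more. How many apples does it have?
A: It started with 23 apples, used 20, leaving 23 - 20 = 3. Buying 6 more gives 3 + 6 = 9. The answer is 9.

Q: {question}
A:"""

def ask_with_chain_of_thought(generate, question: str) -> str:
    """Prepend the worked example so the model spells out its reasoning steps."""
    return generate(COT_PROMPT.format(question=question))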

More interestingly, Mr. Pichai showed that the model can work across languages. For example, when someone searches in Bengali for the best pizza in New York, the model can run the query in English and return the correct information to a user in Bangladesh.

This article will be regularly updated.

Source: CNET.com
