You may know thispersondoesnotexist.com, a site that uses an AI from NVIDIA to generate fake selfies. Google is working on a similar but far more advanced concept called Imagen. Its operation is simple: you type a short text description, and the AI generates a matching image. The company shared a few examples in a blog post, and the results are stunning; judge for yourself on the company's blog.
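To give a concrete idea of what such a prompt-to-image call looks like in practice, here is a minimal sketch. Imagen itself has no public API, so the example assumes the open-source Hugging Face diffusers library with a publicly released text-to-image model standing in for Google's system; the model name and prompt are purely illustrative.

```python
# Minimal prompt-to-image sketch. This is NOT Imagen (which has not been
# released); it assumes the open-source `diffusers` library and a publicly
# available text-to-image model as a stand-in.
from diffusers import DiffusionPipeline

# Load a text-to-image pipeline (the model id here is an illustrative assumption).
pipe = DiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4")
pipe = pipe.to("cuda")  # use a GPU if one is available

# A short description goes in, an image comes out.
prompt = "a corgi wearing sunglasses riding a bicycle in Times Square"
image = pipe(prompt).images[0]  # the pipeline returns generated PIL images
image.save("corgi.png")
```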
Google is not the first in this segment: similar software already exists with DALL-E, developed by OpenAI, whose second version was released last month. According to the Mountain View firm, its own tool is more powerful. Google compared Imagen's output with that of DALL-E, and its study shows that human evaluators clearly prefer Imagen to other methods.
As impressive as these results are, a caveat is in order: the teams undoubtedly picked their best outputs and left out the blurry or off-the-mark images. DALL-E, for example, struggles with negations ("a bowl of fruit without apples"), with faces, and with text. Google offers a small demo on its site that lets you play with a limited version of the AI using only a handful of preset words.

It must be said that the sometimes striking results produced by this AI do not encourage releasing its code as open source at a time when fake news spreads at full tilt. "Potential risks of misuse raise concerns regarding responsible open-sourcing of code and demos," Google's teams explain.
In addition, the researchers explain that they fed their algorithm a large amount of unsorted data scraped from the web. In other words, it ingested just about anything, from pornography to hateful content. "These datasets tend to reflect social stereotypes, oppressive viewpoints, and derogatory, or otherwise harmful, associations to marginalized identity groups," the text specifies.

In addition, the AI reportedly shows an overall bias towards generating images of people with lighter skin tones, as well as a "tendency to align images depicting different professions with Western gender stereotypes." Imagen's competitors share the same concerns: DALL-E tends to depict flight attendants as women and CEOs as men.
Google also points out that its AI has serious limitations when generating images of people. All of this leads the company to conclude that its product "is not suitable for public use without additional safeguards in place."