Photos captured by Roomba vacuum cleaners leaked to the internet

We were talking just yesterday about the fact that we are constantly monitored through various channels. One of these channels relies on the smart connected objects that have invaded domestic life. iRobot's autonomous vacuum cleaners are not supposed to collect anything other than dust from your home. And yet several images captured by development versions of a new vacuum cleaner model have ended up on the Internet, which raises many questions about the way iRobot manages its data.

The images in question were taken by development versions of iRobot's Roomba J7 robot vacuum. They were sent to Scale AI, a startup that hires workers around the world to tag audio, photo, and video data, which is then used to train the artificial intelligence integrated into the company's robots. In 2020, some of these freelancers shared a few images in private online groups (on Facebook, Discord and elsewhere).

Although access to this kind of data is normally strictly controlled, journalists from MIT Technology Review managed to obtain around fifteen screenshots of these private photos, which had been posted in restricted social media groups. While most show only banal domestic scenes, taken from the floor, some are rather intimate (one still shows a woman sitting on the toilet…). We also see a child lying on the ground, staring at the vacuum cleaner, and many details of the homes concerned (furniture, objects, decoration, etc.). Some of these images are labeled (cupboard, chair, shelf, lamp, table, etc.).

An entire data supply chain

iRobot, which is being acquired by Amazon for $1.7 billion, confirmed that these images were captured by its Roomba vacuums in 2020. They were passed on to the workers responsible for annotating them, who were instructed to “remove anything they deem sensitive from any space in which the robot operates, including children,” the company adds.

Image captured by one of iRobot's development robot vacuums. The person's face was originally visible, but was obscured by MIT Technology Review. © MIT Technology Review

All of the images came from in-development models whose hardware and software improvements never made it into the robots available for purchase, according to the company. The affected devices were labeled with a bright green sticker reading “video recording in progress”. In other words, the manufacturer maintains that the people who appear in the photos and videos (the “data collectors”, who are paid in exchange for their contribution) had necessarily agreed to let their Roomba record them.

James Baussmann, a spokesperson for iRobot, said in an email that the company had “taken all necessary precautions to ensure that personal data is treated securely and in accordance with applicable law” and that the screenshots obtained by MIT Technology Review are a “violation of a written nondisclosure agreement between iRobot and an image annotation service provider.”

MIT Technology Review points out, however, that iRobot refused to let its reporters see the data collectors' consent agreements and declined to put them in touch with data collectors or employees to discuss their understanding of the terms. Since then, iRobot has reportedly ended its relationship with the service provider that leaked these images and has taken steps to prevent similar data leaks in the future, the magazine says.

These images “demonstrate the widespread and growing practice of sharing potentially sensitive data for training algorithms” and “reveal an entire data supply chain — and new points from which personal information could leak — few of which consumers are aware,” writes MIT Technology Review.

As Justin Brookman, director of technology policy at Consumer Reports and former director of the Federal Trade Commission's Office of Technology Research and Investigation, points out, data collectors surely didn't imagine that real humans would review the photos to annotate them. According to iRobot, the consent form indicated that “service providers” would do so.

Collect as much data as possible to model a complex environment

When a connected object relies on artificial intelligence, its development requires a lot of data. The machine learning through which this intelligence is trained requires as much data as possible, so that the algorithm learns to recognize patterns. And the closer this data is to reality, the better the AI will be trained. Data from real environments (here, the inside of homes) is therefore very valuable. And very often, we give our consent, more or less informed, for our data to be collected from our connected devices and then analyzed.
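To make this concrete, here is a minimal sketch of what such a training step can look like, written with PyTorch. It is purely illustrative and not iRobot's actual pipeline; the labeled_frames directory layout and the object classes it implies are assumptions.

```python
# Minimal, illustrative sketch (not iRobot's pipeline): fine-tune an image
# classifier on annotated home-interior photos. The directory layout and the
# class names it encodes (chair, shelf, cable, ...) are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Annotated frames organised as labeled_frames/<class_name>/*.jpg
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("labeled_frames", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a generic pretrained backbone and retrain only the final layer
# on the domestic object classes found in the dataset.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

The more varied the homes behind those frames, the better such a model generalizes, which is precisely why real interior footage is so sought after.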

Robot vacuum cleaners today are equipped with powerful sensors. The Roomba J7 series robots, in particular, are equipped with advanced cameras and artificial intelligence, which ensure optimal navigation and cleaning. We are now a long way from the very first robot vacuum cleaner, the Electrolux Trilobite, marketed in 2001, which relied only on ultrasonic, bump, and cliff sensors.

High-end devices are also equipped with computer vision. As early as 2015, the Roomba 980 was able to map a house, adapt its cleaning strategy to the size of each room and identify basic obstacles to avoid. But for this vision to work well, manufacturers need to train it on high-quality, diverse datasets. The domestic environment, however, is particularly complex: the interior of every house is different, nothing is standardized, and all the more so from one country to another.
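As an equally hypothetical follow-up to the sketch above (reusing its model, transform and dataset, and again not iRobot's actual logic), obstacle handling at runtime boils down to classifying each camera frame and reacting only when the model is confident enough:

```python
# Illustrative follow-up to the training sketch above; reuses its
# `model`, `transform` and `dataset`. The confidence threshold is an
# arbitrary assumption, not iRobot's actual decision logic.
from PIL import Image
import torch

model.eval()
frame = transform(Image.open("frame.jpg").convert("RGB")).unsqueeze(0)

with torch.no_grad():
    probs = torch.softmax(model(frame), dim=1).squeeze(0)

best = int(torch.argmax(probs))
label, confidence = dataset.classes[best], probs[best].item()

# Only treat the prediction as an obstacle if the model is reasonably sure.
if confidence > 0.8:
    print(f"Obstacle detected: {label} ({confidence:.0%}), planning a detour")
else:
    print("No confident detection, continuing the cleaning pattern")
```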

That's why iRobot, like other manufacturers, needs as many images and videos as possible from inside homes. In addition to images from its own employees and from volunteers recruited by third-party data providers, the company has also begun offering consumers the ability to contribute training data through its app, which lets them send images of specific obstacles to the company's servers in order to improve its algorithms. A company representative told MIT Technology Review, however, that these images have not yet been used to train algorithms.

Devices and applications collect more and more personal information. And the companies that sell these products write their privacy policies in sometimes vague and ambiguous terms. “When a company says it will never sell your data, that doesn't mean it won't use it or share it with others for analysis,” notes MIT Technology Review. The 12 robot vacuum manufacturers considered in this survey all allow data to be used for the purpose of “improving products and services”, a formulation that ultimately opens the door to a lot of things.

Source: MIT Technology Review
