A woman was photographed on the toilet by her Roomba robot vacuum

A Roomba robot vacuum recorded images of a woman on the toilet, and they ended up being posted on Facebook. Whose fault is it?

A photo of a young woman wearing a t-shirt, sitting on the toilet with her shorts pulled down to mid-thigh… This grainy, low-quality shot is part of a batch of about fifteen others that MIT Technology Review was able to view in closed social media groups. All of these images have one thing in common: they were captured at ground level. And for good reason: they were not taken by a person, but by a Roomba J7 robot vacuum made by iRobot, the largest supplier of this type of device.

So how did photos taken by vacuum cleaners end up on social networks, private as they may be? And why do these vacuums take pictures in the first place? As with the vast majority of so-called smart devices, the manufacturer collects a great deal of data, which is used to feed its machine learning algorithms. That training is what improves the services and capabilities of the device. In the case of a robot vacuum, the sensors are numerous, and the Roomba J7 even carries smart cameras on board. The robot can send images, voices, faces, geolocation data, house floor plans and a whole range of other personal information.
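To give a rough idea of what such a collection can look like, here is a purely illustrative Python sketch of a single data upload from a camera-equipped robot vacuum. The structure and field names are assumptions made for illustration, not iRobot's actual format, which has not been made public.

```python
# Purely illustrative sketch of the kinds of data a camera-equipped
# robot vacuum could send back to its manufacturer. The structure and
# field names are assumptions, not iRobot's real upload format.
upload_batch = {
    "device_id": "roomba-j7-dev-0042",
    "captured_at": "2020-06-15T14:32:10Z",
    "camera_frames": ["frame_001337.jpg"],   # ground-level images
    "audio_clips": ["clip_0007.wav"],        # voices picked up in the home
    "floor_plan": {"rooms": ["kitchen", "bathroom"], "area_m2": 64},
    "geolocation": {"lat": 42.36, "lon": -71.06},
}

print(f"{len(upload_batch['camera_frames'])} frame(s) queued for upload")
```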

All of this serves a single purpose: training the algorithms. This collection is generally disclosed in the privacy policy that nobody takes the time to read. In this specific case, the photos did not come from robots sold to the public, but from development models used by employees and paid testers recruited by the brand.

Humans to label the data collected

The incident is said to date from 2020, and these people had signed written agreements acknowledging that they were sending data streams, including video, back to the company. In other words, this particular collection did not involve the brand’s customers. That is the beginning of an explanation, but it does not justify the fact that these photos later ended up on social networks. This sensitive data left home networks in North America, Europe and Asia to be stored on iRobot’s servers in Massachusetts (USA).

“This is the only way to teach the AI to recognize its environment and identify a loose cable or a sock, for example.”

But, according to MIT Technology Review, some of this data was sent to a San Francisco contractor called Scale AI. Its mission is to have the data analyzed by humans, who assign descriptive labels to it. This is the only way to teach the AI to recognize its environment and identify a loose cable or a sock, for example. Faced with mountains of data, however, Scale AI turns out to rely on subcontractors all over the world. Among them, contract workers in Venezuela posted these photos to private groups on social networks such as Facebook and Discord. The leak therefore did not come from a security breach at all, but from a weak link in the brand’s subcontracting chain.
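To make the labeling step more concrete, here is a minimal, hypothetical sketch in Python of what an annotation record produced by human labelers might look like. The field names and label categories are assumptions for illustration, not the actual schema used by iRobot or Scale AI.

```python
# Hypothetical annotation record for one robot-vacuum camera frame.
# Field names and label categories are illustrative assumptions,
# not the actual schema used by iRobot or Scale AI.
frame_annotation = {
    "frame_id": "dev-unit-0042_frame_001337",
    "source": "development_device",          # test fleet, not a customer robot
    "labels": [
        # Bounding boxes in pixel coordinates: (x, y, width, height)
        {"category": "loose_cable", "bbox": (112, 304, 85, 22)},
        {"category": "sock",        "bbox": (340, 290, 60, 40)},
    ],
}

# Labels like these are what let an object-detection model learn which
# ground-level obstacles the vacuum should steer around.
for label in frame_annotation["labels"]:
    print(label["category"], label["bbox"])
```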

The problem is that iRobot told MIT Technology Review it had shared more than two million images taken by Roombas with its subcontractors. If this happened with data from development devices, it could just as well happen with any customer’s vacuum cleaner. And it must be said that most users find it inconceivable that a human could be watching through their vacuum cleaner. Following the MIT Technology Review investigation, iRobot explained that it had ended its collaboration with the company whose employees had leaked the images. Still, human intervention remains the only way to improve the AI.
