Episode 3

A private moment, caught by a Roomba, ended up on Facebook. Eileen Guo explains how

In 2020, a photo of a woman sitting on a toilet, her shorts pulled halfway down her thighs, was shared on Facebook. It was posted by someone whose job was to look at that photo and, by labeling the objects in it, help train an artificial intelligence system for a robot vacuum.

Bizarre? Yes. Unique? No. 


In December, MIT Technology Review investigated the data collection and sharing practices of iRobot, the developer of the popular Roomba robot vacuums. In its reporting, MIT Technology Review discovered a series of 15 images, all captured by development versions of Roomba vacuums. Those images were eventually shared with third-party contractors in Venezuela tasked with "annotation"—labeling photos with identifying information. This work of, say, tagging a cabinet as a cabinet, a TV as a TV, or a shelf as a shelf, would help the robot vacuums "learn" about their surroundings inside people's homes.


In response to MIT Technology Review's reporting, iRobot stressed that none of the images found by the outlet came from customers. Instead, the images were "from iRobot development robots used by paid data collectors and employees in 2020." That meant the images came from people who had agreed to be part of a testing or "beta" program for non-public versions of the Roomba vacuums, and that everyone who participated had signed an agreement about how iRobot would use their data.


According to the company's CEO in a post on LinkedIn: "Participants are informed and acknowledge how the data will be collected."


But after MIT Technology Review published its investigation, people who'd previously participated in iRobot's testing program reached out. Several of them said they felt misled.


Today, on the Lock and Code podcast with host David Ruiz, we speak with the investigative reporter of the piece, Eileen Guo, about how all of this happened, and about how, she said, this story illuminates a broader problem in data privacy today.


"What this story is ultimately about is that conversations about privacy, protection, and what that actually means, are so lopsided because we just don't know what it is that we're consenting to."


Tune in today.


You can also find us on Apple Podcasts, Spotify, and Google Podcasts, plus whatever preferred podcast platform you use.


Show notes and credits:


Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)

Licensed under Creative Commons: By Attribution 4.0 License

http://creativecommons.org/licenses/by/4.0/

Outro Music: “Good God” by Wowa (unminus.com)

About your host


David Ruiz

Lock and Code host and Senior Privacy Advocate for Malwarebytes. Hates surveillance.