19.06.2015

BEWARE THE LISTENING MACHINES

When dolls and friendly robots can listen and respond to what people say, where's the line between personal assistance and mass surveillance?

One of my great pleasures in life is attending conferences on fields I'm intrigued by, but know nothing about. (A second pleasure is writing about these events.) So when my friend Kate Crawford invited me to a daylong “Listening Machine Summit,” I could hardly refuse.

What's a listening machine? The example on everyone's lips was Hello Barbie, a version of the impossibly proportioned doll that will listen to your child speak and respond in kind. Here’s how The Washington Post described the doll back in March: “At a recent New York toy fair, a Mattel representative introduced the newest version of Barbie by saying: ‘Welcome to New York, Barbie.’ The doll, named Hello Barbie, responded: ‘I love New York! Don't you? Tell me, what's your favorite part about the city? The food, fashion, or the sights?’”

Barbie accomplishes this magic by recording your child’s question, uploading it to a speech recognition server, identifying a recognizable keyword (“New York”) and offering an appropriate synthesized response. The company behind Barbie’s newfound voice, ToyTalk, uses your child’s utterance to help tune their speech recognition, likely storing the voice file for future use.
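In outline, the pipeline described above is simple: record an utterance, ship it to a server for transcription, scan the transcript for a known keyword, and play back a canned reply. Here is a minimal sketch of that keyword-spotting flow in Python. ToyTalk's actual service, API, and keyword list are not public, so every name below is hypothetical, and the server-side speech recognition step is stubbed out:

RESPONSES = {
    # keyword -> canned reply the toy plays back (illustrative, not Mattel's)
    "new york": "I love New York! What's your favorite part about the city?",
    "food": "Yum! What's your favorite thing to eat?",
}

FALLBACK = "That's so interesting! Tell me more."

def transcribe(audio: bytes) -> str:
    """Stub for the cloud speech-recognition step.

    A real device would upload the recording to a server-side ASR
    service and get text back; that round trip is exactly where the
    child's voice recording leaves the home.
    """
    raise NotImplementedError("replace with a real ASR call")

def respond(transcript: str) -> str:
    """Pick a reply by scanning the transcript for known keywords."""
    text = transcript.lower()
    for keyword, reply in RESPONSES.items():
        if keyword in text:
            return reply
    return FALLBACK

if __name__ == "__main__":
    # Use a pre-transcribed utterance so the sketch runs without audio.
    print(respond("Welcome to New York, Barbie."))  # prints the New York reply

Note that nothing in this loop requires keeping the recording once a reply has been chosen; retaining the voice file to tune the recognizer, as described above, is a separate design choice.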

And that’s the trick with listening systems. If you can imagine reasons why you might not want Mattel maintaining a record of things your child says while talking to his or her doll, you should be able to imagine the possible harms that could come from the use, abuse, or interrogation of other listening systems. (“Siri, this is the police. Give us the last hundred searches Mr. Zuckerman asked you to conduct on Google. Has he ever searched for bomb-making instructions?”)

As one of the speakers put it (we’re under the Chatham House Rule, so I can’t tell you who), listening machines trigger all three aspects of the surveillance holy trinity:

1. They're pervasive, starting to appear in all aspects of our lives.

2. They're persistent, capable of keeping records of what we've said indefinitely.

3. They process the data they collect, seeking to understand what people are saying and acting on what they're able to understand.

To reduce the creepy nature of their surveillant behavior, listening systems are often embedded in devices designed to be charming, cute, and delightful: toys, robots, and smooth-voiced personal assistants. Proponents of listening systems see them as a major way technology integrates itself more deeply into our lives, making it routine for computers to become our helpers, playmates, and confidants. A video of a robot designed to be a shared household companion sparked a great deal of debate, both about whether we would want to interact with a robot in the ways proposed by the product’s designers and about how a sufficiently powerful companion robot should behave.

If a robot observes spousal abuse, should it call the police? If the robot is designed to be friend and confidant to everyone in the house, but was paid for by the mother, should we expect it to rat out one of the kids for smoking marijuana? (Underlying these questions is the assumption that the robot will inevitably be smart enough to understand and interpret complex phenomena. One of our best speakers made the case that robots are very far from having this level of understanding, but that well-designed robots are systems built to deceive us into believing they have these deeper levels of understanding.)

Despite the helpful provocations offered by real and proposed consumer products, the questions I found most interesting focused on being unwittingly and unwillingly surveilled by listening machines. What happens when systems like ShotSpotter, currently designed to identify shots fired in a city, begin dispatching police to other events, like a rowdy pool party (just to pick a timely example)? Workers in call centers already have their interactions recorded for review by their supervisors. What happens when Uber drivers and other members of the 1099 economy are required to record their interactions with customers for possible review? (A friend points out that many already do, as a way of defending themselves from possible firing in light of bad reviews.) It’s one thing to choose to invite listening machines into your life, confiding in Siri or a cuddly robot companion, and something entirely different to be heard by machines installed by your employer or by local law enforcement.

A representative of one of the consumer regulatory agencies in the United States gave an excellent talk in which she outlined some of the existing laws and principles that could potentially be used to regulate listening machines in the future. While the U.S. does not have comprehensive privacy legislation in the way many European nations do, there are sector-specific laws that can protect against abusive listening machines: the Children's Online Privacy Protection Act, the Fair Credit Reporting Act, HIPAA, and others. She noted that electronic surveillance systems had been the subject of two regulatory actions in the U.S., where Federal Trade Commission protections against “unfair and deceptive acts in commerce” led to action against the Aaron’s rent-to-own chain, which installed privacy-violating software on the laptops it rented out, capturing images of anyone in front of the camera.

The FTC argued that this was a real and concrete harm to consumers with no offsetting benefits, and Aaron’s settled, disabling the software. I found the idea that existing regulations and longstanding ideas of fairness could provide a framework for regulating listening machines fascinating, but I'm not sure I buy it. Outside the enforcement context, I doubt these ideas offer a robust enough framework for future regulation, because I’m not sure anyone yet understands these systems well enough to anticipate where they might lead. A day thinking about eavesdropping dolls and personal assistants left me confident of only one thing: no one has thought enough about the implications of these systems to posit possible, desirable futures for their use.

(...) We need a better culture of policymaking in the IT world. We need a better tradition of talking through the “whethers, whens, and hows” of technologies like listening machines. And we need more conversations that aren’t about what’s possible, but about what’s desirable.

Ethan Zuckerman is the director of the Center for Civic Media at MIT and a principal research scientist at MIT's Media Lab. He is the author of Rewire: Digital Cosmopolitans in the Age of Connection.

Read more: http://www.theatlantic.com/technology/archive/2015/06/listening-machines/396179/