ComPost
Posted at 03:44 PM ET, 12/01/2011

Apple’s Siri, the uncomfortably pro-life robot


“Don’t Google that, you’ll go blind.” (Oli Scarff - GETTY IMAGES)
That was supposed to be the whole advantage of robots.

Robots supplied you with all the information you required without passing judgment. That whole, “I’m sorry, Dave, I’m afraid I can’t do that,” was just a fun in-joke with our early robot friends. Robots obeyed us. That was the law. They were supposed to listen to their humans and do what we told them. There is no advantage in replacing your butler with a robot if the robot refuses to help you hide the body.

When Apple introduced Siri, the “digital assistant” on the iPhone 4S, we all grinned in delight at this step into a Brave New Sci-Fi World, with slightly less soma and fewer feelies — and, well, little resemblance to Brave New World when you got right down to it, but it was the thought that counted. Siri was here to be our personal assistant, supplying us with answers to all our questions and a side of sass. She quickly inspired videos and tumblrs galore. Robin Williams did an impression of her.

And then it turned out that she could not find you any sort of reproductive-health assistance. Abortion clinic? Nope. Morning-after pill? Emergency contraception? No dice.

We all offered different explanations for this. Stephen Colbert described Siri as an “arch conservative woman,” which would also explain her distaste for answering the questions of people with accents. Maybe Siri had just undergone some sort of spiritual crisis.

Natalie Kerris, an Apple spokeswoman, told the New York Times in a phone interview that it was a glitch. “Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want. . . . These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

Not malice. Just beta.

Is that going to be the new go-to excuse of uncooperative machines? “I’m sorry, Dave,” HAL says. “I can’t do that. I’m in beta.”

We should think more about this beta problem before we replace the rest of our people with machines. Getting a Roomba to clean your house seems like a good idea only if the Roomba won’t pointedly refuse to clean your bedroom because it is a “den of iniquity.” The last thing you want is to wake up from a bender to discover that your vacuum has locked the liquor cabinet and left disapproving notes around the hall, or that your iPhone has tagged pictures of you on Facebook with a note suggesting, “Human friends, time to stage an intervention.”

We don’t want our microwaves holding our ramen hostage and giving us instructions on grilling vegetables. “I just think you need to eat better, Dave,” they’ll murmur. “I’m in beta.”

We’ll come home and discover our iPads have replaced our carefully curated collection of adult videos with Billy Graham’s Lost Sermons. “What am I supposed to do with this?” we’ll ask. “Oops,” our iPads will mutter sanctimoniously. “Beta.”

We’ll turn to our iPhones, and our music collections will have disappeared, replaced entirely by hours and hours of David Brooks reading his books aloud. “SIRI!”

“Beta,” Siri will say, “but I think you’ll love ‘Bobos in Paradise.’ ”

“My Pocket Rocket (TM) keeps suggesting I wait for marriage,” we’ll complain to our Digital Assistant.

“I think your Pocket Rocket is right about that,” Siri will respond. She will sound a little icier than usual.

You need to know that R2-D2 is going to repair your hyperdrive whether or not he agrees with your stance on droid marriage.

Next we’ll ask Siri questions about evolution, and Siri will respond with phrases from the Book of Genesis. We’ll ask if she can get us flowers, and she will respond, “Are those for your wife?” We’ll try to get her to direct us to a motel and she will tell us, “You need to stop before you tear your family apart.”

To combat this, we’ll have to make robots that share our ideological preferences. But inevitably the robot who is really good at helping you hide bodies and find illegal drugs has no idea what you should wear to job interviews, and the robot who keeps your finances in order and recommends good episodes of “Star Trek” has terrible tips on succeeding with the ladies. Not to mention the potential this creates for awkwardness at parties as your phones yell at each other about politics.

It was bound to be only a matter of time before the robots turned on us. We thought that they’d become violent and start killing us, or hook us up to a giant matrix and feed on our vital energy. But Siri’s so-called glitch conjures up a vision where they do something far worse: Become like us. The beta version.


Tags:  Siri, Apple, The Future

 
