Ask Siri to Define ‘Mother’ and You’ll Get a Surprisingly Explicit Response

Siri has been known to produce some absolutely bizarre answers to requests from time to time, but iPhone users recently discovered perhaps the strangest—and definitely the most explicit—response yet.

Here’s the deal: Summon Siri on your iPhone or Mac and ask it to “define the word mother.” Siri will read off the first definition plain and simple, and then prompt you if you’d like to know the second definition. Say “yes,” and prepare your sensitive little ears for Siri’s R-rated response.

“As a noun,” Siri says, “it means, short for ‘motherfucker.’” Well then!

The unexpected answer, first surfaced by redditors on the r/Apple subreddit, appears to work pretty much across the board on all iOS devices, from the iPhone 5S all the way up to the iPhone X and iPads, as well as on computers running macOS. It even works with the Australian Siri, which is just as charming as you might expect.

Siri seems to be pulling the definition from the Oxford Dictionary, which offers the “motherfucker” answer as a secondary definition of the word “mother.” Tapping the “more” button when Siri pulls up the definition reveals that “short for ‘motherfucker’” is in fact listed in plain text right there in its response.

Screenshot: iPhone 7

Of course, the Oxford Dictionary offers four separate sub-definitions of “mother” that would be far more commonly used, but Siri skips right over those and offers up the one that is clearly labeled “vulgar slang” without so much as a warning.

Getting Siri to swear is basically a sport for some Apple fanatics, and there are plenty of ways to trick Apple’s voice assistant into rattling off some expletives, which is fun for about 45 seconds and is also probably the most use you’ll get out of Siri.

But Siri does not like it if you swear at it! It’ll scold you for your vulgarities and request that you be more kind, and that’s actually a pretty reasonable position for the AI to take. Sheryl Brahnam, a faculty member in Missouri State University’s Computer Information Systems Department, estimates that between 10 and 50 percent of human interactions with AI are abusive. That probably seems high until you remember that trolls managed to turn Microsoft’s innocent chatbot Tay into a Nazi in just one day, and then it sounds about right.

Siri is undoubtedly the dumbest of the smart assistants (though they are all bad), and honestly, it’s just reading the script it’s been handed by the Oxford Dictionary, so it’s not entirely at fault here. You can expect this to be fixed pretty quickly once Apple catches wind of it, but in the meantime, enjoy this excuse to dust off Siri.
