A Guardian report last week was framed as a Siri bombshell: Apple contractors are listening to your Siri recordings and “regularly hear confidential medical information, drug deals, and recordings of couples having sex.” The report echoed a similar exposé published a few weeks earlier by Belgian broadcaster VRT NWS, in which Google Assistant recordings were exposed for all the world to hear. In that leak, VRT was able to track down some of the voices through “addresses and other sensitive information.”
Both companies claim that the data collected and analyzed is crucial to the development of their voice assistants’ smarts. Like Google, Apple says that “less than 1 percent of daily Siri activations are used for grading, and those used are typically only a few seconds long.” To that end, the whistleblower who provided the Guardian with the recordings said he was “tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.”

Siri may be listening more often than you think.
Apple also stressed that the Siri data it reviews is randomly sampled and “not linked to other data that Apple may have from your use of other Apple services,” but the whistleblower told the Guardian that the recordings “are accompanied by user data showing location, contact details, and app data.”
It’s not clear how this information is connected to the recordings, since Apple maintains a strict separation between a user’s Apple ID and Siri. The Guardian points out that “there is no specific name or identifier attached to a record and no individual recording can be easily linked to other recordings.” Apple has long promoted its privacy stance when it comes to Siri and claims that “What happens on your iPhone stays on your iPhone.”
Additionally, Apple told the Guardian that “Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” So, unless some super sleuthing is being done, Apple appears to be going to great lengths to protect your Siri conversations and make sure the data therein can’t be traced back to you.
But that doesn’t mean it won’t be. While we have no reason to believe that anything nefarious is being done with the data collected, the fact remains that Siri and any other voice-powered assistant can be triggered by phrases that sound like their wake word. And therein lies the crux of the problem. If your phone or HomePod is accidentally triggered, it will start recording whatever it hears, which could be a sensitive conversation or romantic encounter that you probably don’t want on record. According to the whistleblower, Siri regularly activates without the explicit “Hey Siri” command, including something as innocuous as “the sound of a zip” on a jacket.

All you need to do is raise your wrist to get Siri to start listening on your Apple Watch.
If that snippet is then selected as one of the samples used for grading, a contractor could hear it. The possibility of an accidental trigger rises significantly with the Apple Watch, which only needs to be raised to activate Siri. And since it’s always on your person, the probability of Siri inadvertently recording a sensitive conversation is higher than with a phone or a HomePod.
Let the user decide
Human analysis of Siri interactions is important. Apple can do all of the internal testing it wants, but without random samples of real-world interactions it won’t know how well Siri is actually doing its job. That’s not in dispute. But random or not, we ought to have the right to opt out of being included in the pool that Apple draws from.
Apple asks about everything else (location, ad tracking, app crashes, microphone access, etc.), but there’s no screen during setup that lets you opt out of random data collection, and no toggle in Settings to do it later. As the Guardian points out, Apple “does not explicitly state that that work is undertaken by humans who listen to the pseudonymised recordings” in its lengthy privacy policy.

Your HomePod might be hearing things it shouldn’t.
While I’m sure Apple will amend its privacy policy with new language that covers the human element, that’s not enough for a company that consistently promotes privacy as the main reason to buy an iPhone over an Android phone. Users should be able to easily opt out of any and all Siri data capture via an option during setup. That would go a long way toward maintaining users’ trust.
But we should also have a way to see and delete Siri queries. As it stands, the only way to see your conversation history is to ask Siri, and the only way to delete it is to clear your Safari history and disable Siri dictation. That’s hardly a substitute for the granular timeline of queries that Google and Amazon offer. Not only would such a timeline let you see and delete things that you don’t want reaching the wrong ears, it would also give you an overview of what Siri is listening to when you haven’t summoned it, which is a big part of the problem here.
Yes, you can turn off Hey Siri on your phone or Raise to Speak on your Watch, but that limits functionality and convenience. A simple toggle that lets you opt out of audio data collection would make Apple’s data collection that much easier to swallow. Had we known all along that Apple collects recordings to be analyzed as part of Siri’s development, the report wouldn’t have been nearly as salacious, even with the scandalous bits. It might even have prompted some users to dive into Settings to turn off access.
We all want Siri to be better, and I understand that Apple needs to collect and listen to some of our recordings to get there. I just want to be able to decide for myself if I’m a part of it.