Nearly a month ago, a report in
The Guardian revealed that third-party contractors had been listening to a small percentage of Siri requests as part of a “Siri grading” program. Apple
promised to halt the program while it conducted a “thorough review,” which left us
wondering how the company would move forward, since human grading is an essential part of training and improving machine-learning systems.
Apple now appears to have finished its review and has
issued a statement apologizing for the way this program had been carried out so far. The company plans to reinstate the program this fall after making some important changes.
The apology begins with a familiar statement: “At Apple, we believe privacy is a fundamental human right.” It then describes how Apple designed Siri to protect your privacy—collecting as little data as possible, using random identifiers instead of personally identifiable information, never using data to build marketing profiles or sell to others.
The statement then goes on to make sure you understand that using your data helps make Siri better, that “training” on real data is necessary, and that only 0.2 percent of Siri requests were graded by humans.
After all of this, Apple does get around to the actual apology that should have been in the first paragraph.
From the statement: “As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize.” Apple will resume the Siri grading program this fall, but only after making the following changes:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
This is the right move, and it once again puts Apple ahead of other tech giants in protecting your privacy and security. Apple is making the program opt-in rather than opt-out, an important distinction, since the vast majority of users never stray from the default settings. The company is also keeping these audio samples in-house rather than sharing them with third-party contractors.
Hopefully, this spotlight on Siri’s training, evaluation, and grading will benefit not only user privacy but also the speed at which Siri improves.