What happened
- Apple apologized for a series of privacy oopsies involving its voice assistant, Siri
- Apple came under fire for hiring hundreds of third-party contractors to listen to recordings of user interactions with Siri to train and improve the artificial intelligence (AI) product.
- Apple said less than 0.2% of all Siri requests were reviewed using audio samples.
- Still, the program meant humans were listening to the more private things other humans do, like having sex, (possibly) committing crimes, and trying to hit the high note in “Shallow.”
The bigger problem? Apple didn’t explicitly disclose the grading to users. Amazon and Google both engage in similar practices, and both changed course after also getting in trouble: Amazon now lets users opt out of reviews, and Google’s reviews are still suspended.
Which brings us to today: Apple said, “We realize we haven’t been fully living up to our high ideals, and for that, we apologize.”
What’s changing
Apple suspended the grading program a few weeks ago when the internet caught on. It’ll bring the practice back this fall after sending us all flowers and installing some software updates to give users more privacy controls.
- Apple will by default “no longer retain audio recordings of Siri interactions.” It’ll still use automated, anonymized transcripts to improve Siri’s AI, but users will have to opt in before their audio is reviewed.
- Third-party contractors are out. Apple reportedly axed over 300 contractors in Europe as it officially benched the old grading program.
Looking ahead… Apple faces a class-action lawsuit over human-listener privacy violations. As a company that claims to prioritize privacy, it’s also doing time as the butt of the joke—this is the same company that put up the “What happens on your iPhone, stays on your iPhone” billboard at CES.
Content source: Grant, K. (2019) Apple Says I’m Sorry Over Siri Recordings. Morning Brew. Available from: https://www.morningbrew.com/daily/stories/2019/08/28/apple-says-im-sorry-siri-recordings [Accessed 30 August 2019]
