by Brandon Butler
Hey Siri, Did You Just Hear Me Commit a Felony? 10/10/2019

TechCrunch has a good update on the new Siri privacy settings coming to iOS 13.2, including Siri audio grading being completely opt-in:

Apple also is launching a new Delete Siri and Dictation History feature. Users can go to Settings > Siri and Search > Siri History to delete all data Apple has on their Siri requests. If Siri data is deleted within 24 hours of making a request, the audio and transcripts will not be made available to grading.

I’m personally hoping a lot of people opt in — I will, and I encourage you to do the same. Nobody at Apple is spying on you or tracking you by listening to snippets of audio queries from Siri.

Yes, Siri can be triggered accidentally and pick up drug deals or sexy times, but these are win-the-lottery rare cases. If Siri were constantly being invoked by accident, it would be in Apple’s best interest to fix that; that’s a lot of unnecessary noise, and it would be a huge, noticeable drain on batteries. And Apple has made it easy to delete history, so if you believe Siri may have overheard you commit murder, it’s easy to purge the evidence.

Siri will only get better if more people opt in to the grading program.