Your IT Consultant

Information Technology Blog
by John W. Simek, Vice President of Sensei Enterprises, Inc.

Siri Sends Accidental Recordings to Contractors

July 29, 2019

First it was Google and Amazon, and now Apple is in the crosshairs. The Guardian reported that Apple contractors regularly hear confidential details on Siri recordings. The Guardian stated, "Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or 'grading', the company's Siri voice assistant." A whistleblower told the Guardian, "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data."

Apparently, the recordings are generated when Siri is activated, even by accident. Anything that remotely sounds like "Hey Siri" will trigger a recording. Last year, the UK's then Defence Secretary Gavin Williamson found this out when Siri piped up as he spoke to Parliament about Syria. Even the sound of a zipper may be enough to activate Siri. The whistleblower said that if an Apple Watch detects it has been raised and then hears speech, Siri is automatically activated. I guess the smart watch is pretty stupid, or maybe that's the way Apple wants it to work.

Email: Phone: 703.359.0700
Digital Forensics/Cybersecurity/Information Technology
https://www.linkedin.com/in/johnsimek
https://amazon.com/author/johnsimek
https://senseient.com