Siri is the flagship voice assistant on Apple products. But star or not, Siri does not get the same treatment as other features already built into the iPhone or the Mac. On the contrary: to improve Siri, Apple does not hesitate to hand its review over to numerous subcontractors. The problem is that Siri sometimes activates by mistake, and private data ends up being heard by those same subcontractors without the user's knowledge.
The Guardian is the source of this revelation. One of its journalists was able to speak with an employee who had access to the audio recordings. The employee explains that Siri can activate itself because it thinks it hears "Hey Siri". Except that is not always the case: employees have heard people having sex, disclosing identities, and even dealing drugs.
It is theoretically impossible to trace a Siri "call": the data is anonymized, and nobody knows whom a request came from. But sometimes, given the information gathered, it can be easy to make the connection, the same employee noted. Apple responded to the Guardian article: "A small portion of Siri requests is analyzed to improve Siri and dictation. Requests are not associated with the user's Apple ID. Siri responses are analyzed in secure facilities, and all reviewers are required to adhere to Apple's strict confidentiality requirements." The company adds that less than 1% of daily requests are analyzed, and those recordings last only a few seconds.