Apple is paying contractors to listen to recorded Siri conversations, according to a new report from The Guardian, with a former contractor revealing that workers have heard accidental recordings of users’ personal lives, including doctor’s appointments, addresses, and even possible drug deals.
According to that contractor, Siri interactions are sent to workers, who listen to each recording and are asked to grade it on a variety of factors: whether the request was intentional or a false positive that accidentally triggered Siri, for example, or whether the response was helpful.
But Apple doesn’t explicitly say that it has humans listening to those recordings, and whatever admissions it does make to that effect are likely buried deep in a privacy policy that few (if any) Siri users have ever read. Apple does note on its privacy page that “To help them recognize your pronunciation and provide better responses, certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols,” but nowhere does it mention that human workers will be listening to and analyzing that data.

In a statement to The Guardian, the company acknowledged that “A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” Apple also noted that less than 1 percent of daily activations are analyzed under this system.
The fact that humans listen to voice assistant recordings isn’t exactly news. Both Amazon (for Alexa) and Google (for Assistant) have been revealed to run similar programs in which human workers listen to recorded conversations to improve those systems. It makes sense: a smart assistant can’t tell the difference between a false positive and an actual query (if it could, it wouldn’t be a false positive), and anyone who’s used one can tell you that false positives are still very common at this stage of their evolution.
But until recently, it wasn’t clear for any of these three companies just how extensively they were listening in on customers.
Apple’s system may also be more concerning for a few reasons, starting with the sheer pervasiveness of Apple’s products. Where Alexa is largely limited to smart speakers and Google Assistant to speakers and phones, Siri is also on Apple’s hugely popular Apple Watch, which sits on millions of wrists every waking moment. Plus, Siri on an Apple Watch activates any time a user raises their wrist, not just when it thinks it has heard the “Hey Siri” wake phrase.
