Judge rules that Apple can be sued for recording snippets of Siri’s interactions with users for grading purposes
At the time, Apple said that less than 1% of Siri’s daily activations were being sent to the contractors, whose job was to determine whether Siri was activated on purpose or by accident. The firm also graded whether Siri responded appropriately to a user’s request or query. A small number of snippets were used to try to improve Siri’s diction.
A federal judge says class action lawsuit against Apple can proceed
One Siri user said that he was having a private conversation with his doctor about a “brand name surgical treatment” and soon received targeted ads for the procedure. Two other Siri users complained that conversations they had about “Air Jordan sneakers, Pit Viper sunglasses and ‘Olive Garden’” resulted in both receiving online ads for these specific brands.
Judge White wrote in his decision that “Apple faults plaintiffs for not alleging the contents of their communications, but the private setting alone is enough to show a reasonable expectation of privacy.” The judge added that the plaintiffs can pursue claims that Apple violated the federal Wiretap Act and California’s privacy laws, and breached its contract. The judge did, however, dismiss the plaintiffs’ claim accusing Apple of unfair competition.
Amazon has also been sued for transcribing conversations that users had with Alexa
Back in July, a different federal judge in California ruled that users of Google Assistant, represented by the same law firm as those suing Apple, could take on Google in a class-action suit. Both Amazon and Google, like Apple, used recordings or transcriptions to make sure that their digital assistants were responding appropriately to requests or queries made by users.
Besides having products mentioned to Siri end up being advertised on iPhone users’ phones, more serious privacy breaches occurred. Accidental activations of the digital assistant allowed those working for the third-party company grading Siri’s responses to hear couples having sex. Recordings of drug deals and conversations containing private medical information were also turned over to the third-party firm.
Siri tells users “I respect your privacy” and claims to listen only when being talked to
Apple said that there was no way the third-party firm could determine the identity of the voices on the recordings. Back in July 2019, the tech giant said, “User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
However, the current version of Apple’s Terms of Service states, “We may collect and store details of how you use our services, including search queries. This information may be used to improve the relevancy of results provided by our services. Except in limited instances to ensure quality of our services over the Internet, such information will not be associated with your IP address.”
The plaintiffs said in the original court filing
that “Apple has sold millions of Siri Devices to consumers during the Class Period. Many of these consumers would not have bought their Siri Devices if they had known Apple was recording their conversations without consent.” Interestingly, if you ask Siri “are you always listening?”, the response is “I respect your privacy and only listen when you’re talking to me.”
Apple eventually allowed users to opt out
of having their moments with Siri recorded and sent to a third party for scoring Siri’s responses. The lawsuit involving Siri is known in legal circles as Lopez et al v. Apple Inc., U.S. District Court, Northern District of California, No. 19-04577.