On Wednesday, aggrieved Apple users filed a lawsuit in California against the Silicon Valley giant. The plaintiffs allege the corporation violated Golden State law by recording them without their consent. The suit follows a July article by The Guardian that revealed the firm’s human contractors listen to consumers’ interactions with Siri.
Furthermore, the individuals suing Apple filed for class-action status, meaning that more than 40 million American Siri users could join the lawsuit.
Apple’s Secret Siri Listening Program
According to The Guardian, Apple maintained a program to improve the quality of its voice assistant, Siri. The corporation hired human contractors to listen to recordings made by the application and grade its effectiveness. Apple argued that the initiative helped make its digital assistant more responsive and accurate. The firm noted that it used less than one percent of all daily Siri interactions in its optimization project.
However, consumers and commentators have taken issue with two aspects of Apple’s human review program.
For one, Apple didn’t explicitly tell its customers that it cataloged their interactions with Siri. Secondly, an internal whistleblower told The Guardian that Siri could be activated accidentally after registering something that sounded like its wake word. As a result, Apple recorded “countless” private discussions, business meetings, and crime-related conversations.
Also, as the sound of a zipper can allegedly activate Siri, the assistant captured audio from several sexual encounters.
The group behind the new lawsuit accuses Apple of violating California law by recording its customers without their consent. The plaintiffs also allege that the corporation lied to Congress when it testified that Siri couldn’t be activated without the “clear, unambiguous audio trigger ‘Hey Siri.’”
After The Guardian story went live, Apple suspended its Siri optimization program. Furthermore, the tech giant said that when it restarts the program, it will allow users to opt out.
An Industry-Wide Issue
Though Apple is the most recent technology company to face blowback for the way it uses its voice assistant data, it’s hardly the only one.
Earlier this week, Amazon announced an update for its Alexa products that would prevent the company from using customer audio recordings without authorization. In April, a Bloomberg article revealed the corporation had human workers review customer interactions with the digital assistant.
Like Apple, Amazon defended itself by saying it only used a fraction of its overall Alexa requests. It also noted that it disclosed its app optimization program on Alexa’s frequently asked questions page and gave customers opt-out capabilities. Nevertheless, Bloomberg found that human reviewers frequently encountered recordings of users’ private activities.
Earlier this summer, Google had its own voice-recording scandal. On July 10, Belgian news site VRT reported that it had obtained over 1,000 private recordings made by Google Assistant. The publication also revealed that the audio files contained users’ personally identifiable information. A day later, Google confirmed that the data VRT obtained was genuine and went on to discontinue its human review program.
In August, British, German, and Irish regulators announced they’d launched investigations to determine if Amazon, Apple, and Google had violated their respective consumer data protection laws.
In theory, hefty international fines and court judgments should persuade any industry to change its ways. However, given the tech sector’s past behavior concerning the exploitation of user privacy, reform seems unlikely.