Apple doubles down on privacy after Siri-snooping settlement

Apple has vehemently denied that it ever abused recordings of Siri requests by using those records for marketing, ad sales, or any of the other creepy nonsense we’re being forced to tolerate with other connected devices.

The company’s denial follows a recent $95 million settlement concerning a widely reported episode in which it emerged that the company had human contractors grading people’s spoken Siri requests. Many of us were shocked at the nature of what was being recorded and shared with those contractors. To be fair, Apple swiftly took steps to remedy the situation, which it said had been necessary to improve Siri’s accuracy.

The plaintiffs claimed that Apple’s systems had been used to trigger ads targeted at them, which Apple denied even as it settled the case. It’s thought the company chose to settle in order to head off further challenges to its privacy commitments.

An unforced error with big consequences

The company has always denied that it abused the Siri request records in any way, and has consistently pointed out that the recordings were never directly connected to any individual user — very unlike the treatment you get from other connected devices. That denial wasn’t enough in this case.

That’s because devices that lack Apple’s commitment to privacy are the ones responsible for ads that spookily reflect private conversations you may have had. Apple says its systems don’t do that.

Some companies deny doing this, but the fact that others continue to do so leaves most of us deeply uncomfortable and erodes trust.

In a statement following the resolution of the lawsuit, an Apple spokesperson said: “Apple has never used Siri data to build marketing profiles, never made it available for advertising, and never sold it to anyone for any purpose. Privacy is a foundational part of the design process, driven by principles that include data minimization, on-device intelligence, transparency and control, and strong security protections that work together to provide users with incredible experiences and peace of mind.”

Apple’s track record is a good one

Apple has committed vast resources to creating privacy protections across its systems. Everything from Lockdown Mode to tools that block aggressive ad targeting and device fingerprinting reflects the breadth of those efforts, work that touches almost every part of the company’s ecosystem.

A looming problem, of course, is that while Apple may be keeping its pro-privacy promise, not every third-party developer is likely to share the same commitment, despite the privacy labeling scheme the company has in place at the App Store.

This might become an even bigger problem as Apple is forced to open up to third-party stores. It seems plausible that some popular apps sold via those stores will choose to gather user data for profit.

With that monster visible on the horizon, Apple has also confirmed that it has teams working to build new technologies that will enhance Siri’s privacy. It also said, “Apple does not retain audio recordings of Siri interactions unless users explicitly opt in to help improve Siri, and even then, the recordings are used solely for that purpose.”

How Apple already protects Siri privacy

Apple pointed to several protections it already has in place for Siri requests:

  • Siri is designed to do as much processing as possible right on a user’s device — though some requests require external help, many, such as search suggestions, do not.
  • Siri searches and requests are not associated with your Apple Account. 
  • Apple does not retain audio recordings of Siri interactions unless users explicitly opt in to help improve Siri.

Apple has another protection it is putting into place: Private Cloud Compute. This means that Apple Intelligence requests made through Siri that cannot be handled on the device itself are directed to Apple’s own cloud servers, which offer industry-leading security. “When Siri uses Private Cloud Compute, a user’s data is not stored or made accessible to Apple, and Private Cloud Compute only uses their data to fulfil the request,” the company said.

To some degree, the need to make these statements is a problem Apple foolishly created for itself through the way it initially handled Siri request grading. That episode tarnished its reputation for privacy — unfortunate, given the company knows very well that, in the current environment, digital privacy is something that must be fought for.

There is a silver lining to the clouded sky. That Apple is now making these statements means it can once again raise privacy as a consideration as we move through the next chapters of AI-driven digital transformation.

All the same, raising the conversation does not in any way guarantee that privacy will win the debate, despite how utterly essential it is to business and personal users in this digital, connected era.

You can follow me on social media! You’ll find me on BlueSky, LinkedIn, Mastodon, and MeWe.
