Potential Major Security Concerns May Be Behind Smarter Siri’s Delay

0
17
Potential Major Security Concerns May Be Behind Smarter Siri’s Delay

The anticipated enhancements for Siri are facing extended delays, with reports suggesting that features initially set for release in iOS 18.4 may now be deferred to iOS 19.

Although Apple has not provided a clear explanation, there are two primary theories floating around. Additionally, a developer and data analyst has proposed that significant security concerns could be the leading issue in this situation.

Delay in Smarter Siri Features

Apple made a commitment to deliver a significantly more intelligent Siri during last year’s WWDC in June. While the company has started to implement some Apple Intelligence features, and Siri can now handle verbal mistakes and more complex commands, the three crucial upgrades promised by Apple have yet to materialize:

  • Enhanced contextual awareness, enabling it to utilize personal information.
  • The capacity to recognize what’s displayed on the screen and respond accordingly.
  • In-app functionalities, allowing users to direct Siri to perform tasks through their apps.

These features were initially anticipated to be part of iOS 18.4, with expectations for a more conversational Siri to follow next year. However, Apple has indicated that complications have arisen in meeting this schedule.

We’ve been working on making Siri more personalized, enhancing its understanding of your context and its ability to take action across your apps. It’s taking us longer than anticipated to deliver these features, and we expect to roll them out over the next year.

The specific timeline of “the coming year” remains ambiguous—it could refer to later this year or the next—but it strongly suggests we might not see a smarter Siri until iOS 19 is released.

Two Suggested Reasons for the Delay

Bloomberg has continuously reported that Apple is facing challenges in implementing these features due to numerous ongoing bugs.

Internal testing at Apple has revealed that many employees have found these new Siri features to be inconsistent.

Initially, Mark Gurman speculated that the launch might be postponed from iOS 18.4 to 18.5.

Another theory suggests that the presence of two separate Siri versions could be contributing to these bugs. One version maintains the traditional tasks Siri has always executed while the other layer manages more complex commands. Apple is reportedly struggling to integrate these systems into a unified Siri experience.

Security Risks as a Major Concern

Developer Simon Willison, who created the open-source data analysis tool Datasette, has posited that Apple might also be grappling with ensuring the security of a more advanced Siri. He specifically points to the risk of prompt injection attacks as a potential vulnerability.

This type of attack is a recognized issue across generative AI systems, where an attacker might attempt to bypass built-in safeguards within a large language model (LLM), substituting them with harmful instructions. Willison warns that this could pose significant threats to the new Siri.

These new Apple Intelligence functionalities involve Siri responding to access requests for application data and taking actions on the user’s behalf.

This scenario creates an ideal environment for prompt injection attacks! Any time an LLM-based system has access to sensitive data and the potential for exposure to malicious commands (such as emails and messages from unknown sources), there’s a considerable risk that an attacker could manipulate these tools to compromise or extract personal data.

In simpler terms, misguiding ChatGPT into sharing bomb-making instructions represents one level of danger, but tricking Siri into revealing your private information is an entirely different, more severe concern.

John Gruber, a commentator on Apple technology, finds this theory to be credible, noting that no one has successfully mitigated prompt injection attacks yet.

A pessimistic perspective on this personalized Siri issue is that Apple cannot afford to make a mistake. However, the inherent susceptibility of LLMs to prompt injection may mean that creating a completely secure version is ultimately unattainable. If it is achievable, it will demand groundbreaking solutions. Apple cannot merely “catch up”; it has to resolve a challenging problem that remains unsolved by OpenAI, Google, or any leading AI lab to deliver on the features they’ve promised.

He questions how Apple found itself in a position where it promised—and even advertised—features that could potentially be too dangerous to ever release.

DMN’s Perspective

While I believe earlier reports are accurate, it seems highly likely that security concerns are also a significant aspect of the delay.

Apple has made privacy a core differentiator and a selling point for its products. Thus, any weakness that allows malicious apps to access and extract personal data from users would be a catastrophic failure for the company.

While I do not anticipate that the smarter Siri initiative will become another AirPower, it’s becoming increasingly apparent that Apple might have benefitted from remaining silent about timelines until closer to resolving its challenges.

Image: Apple and Michael Bower/DMN