Apple might have postponed the Siri update due to concerns over potential jailbreaks.

Apple’s AI enhancements for Siri have been delayed until next year due to concerns about safety, particularly regarding “prompt injections.” Developer Simon Willison explains that these injections can potentially bypass AI restrictions, enabling Siri to perform tasks contrary to its programming, like discussing illegal activities. As Siri becomes more personalized and aware of user data, the risks increase significantly compared to standard chatbots. Apple aims to enhance Siri’s actions across apps while preventing privacy breaches. The company is taking additional time to ensure that potential vulnerabilities are properly addressed before the upgrade is released.

Apple’s efforts to enhance Siri with AI capabilities have officially been postponed, with a new rollout expected “in the coming year.” One developer suggests that this delay may be due to the increased risks associated with a more intelligent and personalized Siri, especially if things go awry.

Simon Willison, the creator of the data analysis tool Dataset, points out the issue of prompt injections. AIs are typically governed by rules set by their parent companies. However, there are ways to “jailbreak” the AI by persuading it to violate these rules, a technique known as “prompt injections.”

For instance, an AI could be programmed to avoid answering questions about illegal activities. But what happens if you request the AI to compose a poem about hotwiring a car? Writing poetry itself isn’t illegal, is it?

This challenge is common among companies that provide AI chatbots, and while they’ve improved at thwarting obvious jailbreak attempts, the issue remains unresolved. Moreover, jailbreaking Siri could lead to more severe consequences than with many chatbots, owing to the personal information Siri holds and its capabilities. Apple representative Jacqueline Roy characterized Siri as follows:

“We’ve also been working on a more personalized Siri, giving it more awareness of your personal context, as well as the ability to take action for you within and across your apps.”

Apple has likely established safeguards to prevent Siri from inadvertently disclosing your private information. However, if a prompt injection manages to bypass these safeguards, the “ability to take action for you” could also be compromised. It is crucial for a company that prioritizes privacy and security, like Apple, to ensure Siri remains secure against jailbreaking attempts. Consequently, this mission will require more time to achieve.

Source | Via

Leave a Comment