Apple’s new Apple Intelligence system aims to inject generative artificial intelligence into the core of iOS. It provides users with a range of new services, including text and image generation as well as organization and scheduling features. But while the system offers impressive new capabilities, it also introduces complexity. On the one hand, the AI relies on a large amount of iPhone user data, which poses potential privacy risks. On the other, artificial intelligence’s enormous demand for computing power means Apple will have to rely increasingly on its cloud systems to serve user requests.
Apple has a history of providing strong privacy protections for iPhone users; it’s an important part of the company’s brand. Part of those privacy guarantees is the ability to choose when mobile data is stored locally and when it’s sent to the cloud. While increased reliance on the cloud may set off some privacy alarm bells, Apple has anticipated these concerns and created a new system it calls Private Cloud Compute (PCC). It is essentially a cloud-based security architecture designed to keep users’ data safe from prying eyes even as that data is used to help fulfill AI-related requests.
On paper, Apple’s new privacy system sounds really impressive. The company claims to have created “the most advanced security architecture ever deployed for large-scale cloud artificial intelligence computing.” But what seems like a huge achievement on paper could end up raising broader user privacy concerns in the future. It’s unclear, at least for now, whether Apple can deliver on its lofty promises.
How Apple’s Private Cloud Compute is supposed to work
In many ways, cloud systems are just giant databases. If bad actors gain access to such a system, they can view the material it contains. Apple’s Private Cloud Compute (PCC), however, brings a number of unique protections designed to prevent that kind of access.
Apple says it has implemented security at both the software and hardware levels. The company has built custom servers to house the new cloud system, and those servers undergo a rigorous screening process during manufacturing to ensure their integrity. “We inventory and perform high-resolution imaging of the components of the PCC node,” the company claims. The servers are also equipped with physical security mechanisms such as tamper-evident seals. iPhone users’ devices will only connect to servers that have been certified as part of the protected system, and those connections are end-to-end encrypted, meaning the data being transferred is effectively untouchable in transit.
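To make that idea concrete, here is a minimal sketch of the client-side gating Apple describes: a device refuses to send anything unless the server presents a software measurement the device already trusts, and the payload is encrypted before it leaves the phone. Every name here (ServerAttestation, PCCClient, and so on) is a hypothetical illustration, not Apple’s actual API.

```swift
import Foundation
import CryptoKit

// A sketch of the client-side gating described above: the phone only talks
// to servers whose software measurement it trusts, and encrypts the payload
// end to end. All names are hypothetical illustrations, not Apple's API.

struct ServerAttestation {
    let softwareMeasurement: SHA256.Digest // hash of the server's software image
}

enum PCCError: Error { case untrustedServer }

struct PCCClient {
    // Measurements the device trusts, e.g. taken from a published log.
    let trustedMeasurements: Set<SHA256.Digest>
    let sessionKey: SymmetricKey // in practice, negotiated per session

    func send(_ request: Data, to server: ServerAttestation) throws -> Data {
        // 1. Only talk to servers certified as part of the protected system.
        guard trustedMeasurements.contains(server.softwareMeasurement) else {
            throw PCCError.untrustedServer
        }
        // 2. Encrypt so the request is unreadable in transit.
        let sealed = try ChaChaPoly.seal(request, using: sessionKey)
        return sealed.combined
    }
}
```

The key design point this illustrates is that trust is decided before any user data moves: a server that can’t prove what software it runs never sees the request at all.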
Once the data reaches Apple’s servers, additional protections kick in to keep it private. Apple says its cloud leverages stateless computation to build a system in which user data is not retained after it has been used to fulfill an AI request. So, according to Apple, your data won’t have a long lifespan on its systems. The data will travel from your phone to the cloud, interact with Apple’s high-octane artificial intelligence algorithms to satisfy whatever question or request you submit (“Draw me a picture of the Eiffel Tower on Mars”), and then (again, according to Apple) it will be deleted.
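In code terms, the stateless promise amounts to something like the sketch below: user data exists only for the lifetime of a single request handler and is never logged, cached, or written to disk. The types and names are illustrative assumptions, not Apple’s implementation.

```swift
import Foundation

// A sketch of "stateless computation": user data lives only for the
// lifetime of one request. Types and names are illustrative assumptions.

struct AIRequest {
    let prompt: String    // e.g. "Draw me a picture of the Eiffel Tower on Mars"
    let userContext: Data // personal data shipped from the phone
}

func handle(_ request: AIRequest, model: (AIRequest) -> Data) -> Data {
    // All work happens on values scoped to this function call. There is no
    // database write, no log line containing userContext, and no cache.
    let response = model(request)
    // When this function returns, the only surviving value is the response;
    // the request (and the user data inside it) is released.
    return response
}
```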
Apple has put in place a range of other security and privacy measures, which you can read more about on the company’s blog. These defenses, while varied, all appear designed to do one thing: prevent any breach of the company’s new cloud systems.
But is it legit?
Companies make big cybersecurity promises all the time, but it’s often impossible to verify whether they’re telling the truth. Failed cryptocurrency exchange FTX once claimed to store users’ digital assets in air-gapped servers; later investigation showed that was pure nonsense. Apple, for its part, wants to prove to outside observers that it actually secures its cloud. The company says it will roll out something called a “transparency log,” which includes full production software images (basically a copy of the code used by the system). It plans to release these logs periodically so that outside researchers can verify the cloud is operating as Apple says it does.
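Based on that public description, the verification step could look something like this sketch: a researcher recomputes the digest of a published software image, and a device accepts a server only if its reported measurement appears in the log. The structures here are assumptions for illustration, not Apple’s actual verification protocol.

```swift
import Foundation
import CryptoKit

// A sketch of checking a release against the transparency log. The
// structures are illustrative assumptions, not Apple's actual protocol.

struct TransparencyLogEntry {
    let releaseTag: String
    let imageDigest: SHA256.Digest
}

// A researcher recomputes the digest from the published image bytes...
func digest(ofImage imageBytes: Data) -> SHA256.Digest {
    SHA256.hash(data: imageBytes)
}

// ...and a device accepts a server only if its running image matches an
// entry Apple has publicly committed to.
func isMeasurementPublished(_ measurement: SHA256.Digest,
                            in log: [TransparencyLogEntry]) -> Bool {
    log.contains { $0.imageDigest == measurement }
}
```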
What people are saying about PCC
Apple’s new privacy system has clearly polarized the tech world. While many are impressed by the project’s massive effort and unparalleled transparency, some are wary of the broader impact it could have on mobile privacy. Most notably (a.k.a. loudly), Elon Musk has already begun promoting the idea that Apple has betrayed its customers.
Web developer and programmer Simon Willison told Gizmodo that he was impressed by the “ambition” of the new cloud system.
“They are solving multiple extremely difficult problems in privacy engineering simultaneously,” he said. “I think the most impressive part is the auditability: they will publish images for audit in transparency logs, which devices can use to ensure they are only communicating with servers running the disclosed software. Apple employs some of the best privacy engineers in the industry, but even by their standards, this is a tough job.”
But not everyone is so enthusiastic. Matthew Green, a professor of cryptography at Johns Hopkins University, expressed skepticism about Apple’s new system and the promises that come with it.
“I don’t like it,” Green sighed. “My biggest concern is that it will centralize more user data in data centers, whereas most of that data is now stored on people’s actual phones.”
Historically, Apple has made local data storage a mainstay of its mobile designs because cloud systems are known for their privacy shortcomings.
“Cloud servers are not secure, so Apple has avoided this approach,” Green said. “The problem is that with all this AI stuff going on, Apple’s on-device chips aren’t powerful enough to do what it wants. So it needs to send data to servers, and it’s trying to build these super-protected servers that no one can hack.”
He understands Apple’s reasons for this move, but doesn’t necessarily agree with it because it means greater reliance on the cloud.
Green said Apple hasn’t made it clear whether it will tell users which data will be kept locally and which will be shared with the cloud, meaning users may not know what data leaves their phones. Apple also hasn’t made it clear whether iPhone users will be able to opt out of the new PCC system. If users are forced to share some portion of their data with Apple’s cloud, the average user could end up with less autonomy, not more. Gizmodo reached out to Apple for clarification on both points and will update this story if the company responds.
For Green, Apple’s new PCC system marks the mobile phone industry’s shift toward greater reliance on the cloud. This could lead to a less secure overall privacy environment, he said.
“I have very mixed feelings about it,” Green said. “I think enough companies will deploy very sophisticated artificial intelligence [to the point that] no company wants to be left behind. I think consumers may punish companies that don’t have strong AI capabilities.”