Apple Intelligence Is Gambling on Privacy as a Killer Feature

WIRED

As Apple's Worldwide Developers Conference keynote concluded on Monday, market watchers couldn't help but notice that the company's stock price was down, perhaps a reaction to Apple's relatively low-key approach to incorporating AI compared to most of its competitors. Still, Apple Intelligence-based features and upgrades were plentiful, and while some are powered using the company's privacy and security-focused cloud platform known as Private Cloud Compute, many run locally on Apple Intelligence-enabled devices. Apple's new Messages screening feature automatically moves texts from phone numbers and accounts you've never interacted with before to an "Unknown Sender" folder. The feature automatically detects time-sensitive messages like login codes or food delivery updates and will still deliver them to your main inbox, but it also scans for messages that seem to be scams and puts them in a separate spam folder. All of this sorting is done locally using Apple Intelligence.
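Apple hasn't published the exact decision logic for this screening, but as a minimal illustrative sketch (all names hypothetical), the routing behavior described above might look like:

```python
from enum import Enum

class Folder(Enum):
    MAIN = "Main"                       # regular inbox
    UNKNOWN_SENDER = "Unknown Sender"   # first-time contacts
    SPAM = "Spam"                       # suspected scams

def route_message(known_sender: bool, time_sensitive: bool, looks_like_scam: bool) -> Folder:
    """Hypothetical routing for the on-device screening described above."""
    if known_sender:
        return Folder.MAIN
    if looks_like_scam:
        # Suspected scams from unknown senders go to a separate spam folder.
        return Folder.SPAM
    if time_sensitive:
        # Login codes, delivery updates, etc. still reach the main inbox.
        return Folder.MAIN
    return Folder.UNKNOWN_SENDER
```

The actual classification of "time-sensitive" and "scam-like" would be done by on-device models; this sketch only captures the folder-routing priorities implied by the article.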


Apple offers $1 million bounty to anyone who can hack its new AI system

Daily Mail - Science & tech

Apple is willing to bet big on the safety of Apple Intelligence, so much so that the tech giant has offered up to a $1 million bounty to anyone who can hack it. The company announced Thursday that it's inviting 'all security researchers - or anyone with interest and a technical curiosity' to perform 'their own independent verification of our claims.' The public has been challenged to test the security of 'Private Cloud Compute,' the servers that will receive and process user requests for Apple Intelligence when the AI task is too complex for on-device processing. The system, according to Apple, features end-to-end encryption and immediately deletes a user's request once the task is fulfilled. There are different payouts for specific discoveries, but the $1 million goes to anyone who can run code on the system undetected and access sensitive data.


Apple Intelligence Promises Better AI Privacy. Here's How It Actually Works

WIRED

The generative AI boom has, in many ways, been a privacy bust thus far, as services slurp up web data to train their machine learning models and users' personal information faces a new era of potential threats and exposures. With the release of Apple's iOS 18 and macOS Sequoia this month, the company is joining the fray, debuting Apple Intelligence, which the company says will ultimately be a foundational service in its ecosystem. But Apple has a reputation to uphold for prioritizing privacy and security, so the company took a big swing. It has developed extensive custom infrastructure and transparency features, known as Private Cloud Compute (PCC), for the cloud services Apple Intelligence uses when the system can't fulfill a query locally on a user's device. The beauty of on-device data processing, or "local" processing, is that it limits the paths an attacker can take to steal a user's data.


Apple's Biggest AI Challenge? Making It Behave

WIRED

Apple has a history of succeeding despite being late to market so many times before: the iPhone, the Apple Watch, AirPods, to name a few cases. Now the company hopes to show that the same approach will work with generative artificial intelligence, announcing today an Apple Intelligence initiative that bakes the technology into just about every device and application Apple offers. Apple unveiled its long-awaited AI strategy at the company's Worldwide Developers Conference (WWDC) today. "This is a moment we've been working towards for a long time," said Apple CEO Tim Cook at the event. "We're tremendously excited about the power of generative models."


How does Apple send your data to its cloud AI servers? Very carefully, it claims.

Engadget

For years, Apple has touted privacy as its major advantage over rivals like Google and Microsoft. Instead of relying on cloud processing to improve or organize your images, which requires sending your photos to Google's servers, Apple handles those tasks directly on your device. But with the advent of Apple Intelligence, its take on artificial intelligence, the company is stepping out of its comfort zone with "Private Cloud Compute." It says "private" right in the name, so it has to be secure, right? While Apple Intelligence will run some models locally, it will occasionally have to send data to Apple's servers for complex requests.