Sunday, December 22, 2024

Why Apple Intelligence won’t run on older iPhones, or Vision Pro – 9to5Mac


When asked why Apple Intelligence won’t be available on older iPhones, the company has so far said that the chips simply aren’t powerful enough to provide a good experience: responses would take too long.

But many have asked: if their phones aren’t up to the task, why not just use Apple Intelligence servers – aka Private Cloud Compute? After all, that’s what already happens with a lot of Siri requests today. We now have an answer to this …

John Gruber spoke with Apple about this.

One question I’ve been asked repeatedly is why devices that don’t qualify for Apple Intelligence can’t just do everything via Private Cloud Compute. Everyone understands that if a device isn’t fast or powerful enough for on-device processing, that’s that. But why can’t older iPhones (or in the case of the non-pro iPhones 15, new iPhones with two-year-old chips) simply use Private Cloud Compute for everything?

From what I gather, that just isn’t how Apple Intelligence is designed to work. The models that run on-device are entirely different models than the ones that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can execute with on-device processing and which require Private Cloud Compute or ChatGPT.

Gruber acknowledges this likely isn’t the only reason; Apple is already having to provide a lot of server-side computing to handle requests that can’t be processed on-device, and the server demand would be massively higher if it had to handle all requests from older phones. But the reason given does sound plausible.
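To picture what Gruber is describing, here’s a minimal Swift sketch of that kind of routing step. It’s purely illustrative: Apple hasn’t published any API for this, and the type names (RequestRouter, ExecutionTarget) and the length-based check are invented stand-ins for what is, in reality, a learned on-device model.

```swift
// Conceptual sketch only: Apple has not published an API for this routing step.
// `ExecutionTarget`, `RequestRouter`, and the checks below are hypothetical,
// used to illustrate the idea of an on-device heuristic deciding where a task runs.

enum ExecutionTarget {
    case onDevice              // small local model handles the task
    case privateCloudCompute   // larger Apple-run model on Apple silicon servers
    case chatGPT               // handed off only with explicit user permission
}

struct IntelligenceRequest {
    let prompt: String
    let needsWorldKnowledge: Bool   // e.g. open-ended questions beyond personal context
}

struct RequestRouter {
    /// Hypothetical stand-in for the on-device model Gruber describes:
    /// it classifies each request before any processing happens.
    func route(_ request: IntelligenceRequest) -> ExecutionTarget {
        if request.needsWorldKnowledge {
            return .chatGPT
        }
        // A crude proxy for "can the local model handle this?"
        // The real heuristic is itself a learned model, not a length check.
        return request.prompt.count < 500 ? .onDevice : .privateCloudCompute
    }
}

// The triage itself requires local inference, which is why unsupported
// devices can't simply forward every request to Private Cloud Compute.
let router = RequestRouter()
let target = router.route(IntelligenceRequest(prompt: "Summarize this email thread",
                                              needsWorldKnowledge: false))
print(target)   // prints "onDevice" in this toy heuristic
```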

One other surprising revelation is that Apple Intelligence won’t be supported on Vision Pro, despite the M2 chip being powerful enough. The reason, says Gruber, is that the chip is already running close to capacity, leaving too little headroom for Apple Intelligence.

According to well-informed little birdies, Vision Pro is already making significant use of the M2’s Neural Engine to supplement the R1 chip for real-time processing purposes — occlusion and object detection, things like that. With M-series-equipped Macs and iPads, the Neural Engine is basically sitting there, fully available for Apple Intelligence features. With the Vision Pro, it’s already being used.

Again, the explanation makes sense, though it’s a great shame, since the platform would seem almost tailor-made for AI – and is the precursor to an eventual Apple Glasses product which will surely include Apple Intelligence.

Another snippet from Gruber’s round-up: Prepare to be consistently annoyed by the permission request for handoff to ChatGPT. At least as things stand in Apple’s internal versions, there’s no Always Allow option.

Some people are going to want an “Always allow” option for handing requests to ChatGPT, but according to Apple reps I’ve spoken with, such an option does not yet exist.

I suspect that will change, as it’s quickly going to become a consistent irritation, but Apple likely wants to play it safe on the privacy front in the early days of the service.
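For those curious what the difference amounts to in practice, here’s a tiny, purely hypothetical Swift sketch of a consent gate with and without a stored “Always Allow” preference. The ChatGPTHandoffGate type and the preference key are invented for illustration and are not Apple’s API.

```swift
import Foundation

// Purely illustrative; not Apple's implementation. The "alwaysAllow" preference
// shown here is hypothetical: as Gruber reports, no such option currently exists,
// so in practice every handoff would prompt the user.
struct ChatGPTHandoffGate {
    private let alwaysAllowKey = "chatgpt.handoff.alwaysAllow"   // hypothetical key

    func shouldPromptUser() -> Bool {
        // With no stored "Always Allow" preference, this is always true,
        // i.e. the permission sheet appears on every single handoff.
        !UserDefaults.standard.bool(forKey: alwaysAllowKey)
    }
}

let gate = ChatGPTHandoffGate()
if gate.shouldPromptUser() {
    print("Ask the user before sending this request to ChatGPT")
}
```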

Photo by AltumCode on Unsplash

