Saturday, November 2, 2024

Apple Just Made It Easier to Get Away With Sneaky Stuff


For years, Apple has tried to make privacy a core part of its brand. The keyword here is “tried,” since multiple investigations (including one from Gizmodo) have shown that the company doesn’t always live up to its lofty privacy promises. As part of its annual Worldwide Developers Conference, Apple rolled out several new features that aim to offer enhanced digital protections for macOS, iOS, and iPadOS users.

One feature that is sure to be popular is a new option that allows you to “lock” and/or hide mobile apps. Locking an app closes it off to outside inspection, and the only way to unlock it is via Face ID. The feature also lets users hide apps by concealing them within a hidden folder. Hidden apps can also be locked, making them basically impenetrable to outside observers.
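For a sense of what that kind of Face ID gate looks like in practice, here is a minimal Swift sketch that uses Apple’s LocalAuthentication framework to put a biometric check in front of some piece of content. To be clear, this is purely illustrative of the sort of check involved, not Apple’s actual App Lock implementation; the function name, prompt text, and flow are assumptions made up for the example.

```swift
import Foundation
import LocalAuthentication

// Illustrative only: gate access behind Face ID (or Touch ID on supported
// hardware). This is NOT Apple's App Lock code; the function name and
// prompt text are hypothetical.
func unlockHiddenContent(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // First check that the device can perform biometric authentication at all.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // Prompt the user for Face ID / Touch ID and report the result.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock this app") { success, _ in
        DispatchQueue.main.async {
            completion(success)
        }
    }
}
```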

This feature seems potentially useful but also sorta funny, since it’s so clearly designed to let users get away with shady stuff. Want to cheat on your wife? Hide Tinder using App Lock! Want to sell drugs but also maintain an air of respectability on your mobile screen? Try App Lock! It’s also worth noting that Android phones have offered an identical feature for years.

The more technically impressive privacy feature that Apple rolled out Monday is related to its newly announced artificial intelligence system. Apple debuted Apple Intelligence (or AI), a new generative AI suite that draws on the data stored on a user’s phone, tablet, or computer to offer automated assistance. However, the company has acknowledged the invasive potential of this technology, given that it relies on the totality of a user’s mobile digital activities to inform its automation.

“You should not have to hand over all the details of your life to be warehoused and analyzed in someone’s AI cloud,” said Craig Federighi, Apple’s senior vice president of software engineering, during his presentation on the new system. Federighi claims that Apple’s new AI models offer “powerful privacy” by processing data “on-device,” meaning that it isn’t shared with Apple but instead stays on the user’s phone or computer. “A cornerstone of Apple Intelligence is on-device processing, and many of the models that power it run entirely on device,” the company says.

For more complicated, energy-intensive models that require more processing power, Apple says it offers something called Private Cloud Compute, a system that interacts with Apple’s cloud but does not store user data and protects that data via cryptographic defenses. Impressively, Apple has said that it wants “independent experts” to inspect this new feature and ensure that it provides the protections the company says it does. Since there’s a lot that’s still unknown about Private Cloud Compute and the other privacy features that Apple just debuted, outside inspection seems like a great idea.
