Sunday, December 22, 2024

Could Apple’s ChatGPT partnership expose it to legal risks?

Apple’s newly announced partnership with the AI chatbot-maker OpenAI will put the latest ChatGPT model on the next iPhones, iPads, and Macs. As OpenAI continues to face legal actions from news outlets and authors, the new deal with Apple raises the question: could Apple be found liable if courts rule that OpenAI violated copyrights?

Over the past year, authors and news organizations — including The Intercept and The New York Times — have filed multiple lawsuits accusing the ChatGPT maker of copyright infringement. The writers say OpenAI trained its AI models on their original work without permission or compensation, effectively stealing from them.

Neil Elan, senior counsel at Stubbs Alderton & Markiles LLP, says Apple probably wouldn’t be held liable in such cases. “I don’t think it would have a direct impact on Apple,” he said.

According to Elan, however, if courts rule in favor of plaintiffs, they might issue injunctions preventing OpenAI from continuing to collect data in an unauthorized way. On one hand, that could damage ChatGPT’s reputation and impact Apple negatively by virtue of association. On the other, such an outcome could be good for Apple users looking to protect their data when they use ChatGPT through Siri.

Data privacy is a bigger legal risk for Apple when it comes to its partnership with OpenAI than copyright infringement, Elan said. That’s because “there’s no guarantee that it will be successful and it will protect all the users in the way that Apple represents.”

Apple put privacy at the forefront of its AI announcements Monday during its Worldwide Developers Conference. Unlike other AI software, Apple said, its AI tools will in most cases run on hardware that sits on the user's Apple device. When more complex AI tasks require more computational power, Apple will use a new system called Private Cloud Compute, which sends iPhone users' data more securely to servers running on Apple hardware.

But Elan said "the private cloud compute policy might not be strong enough in certain areas" to fully shield Apple from legal concerns over customers' privacy. And Apple has less control over users' privacy when Siri hands queries off to ChatGPT. Still, users will be able to opt in or opt out of using ChatGPT, and OpenAI made a big concession: the startup won't store Apple customers' requests or IP addresses.

Apple has also done more for user privacy to date than many other AI companies. "If Apple's [new privacy standards are] accepted and hold up in practice, then it could establish a standard that other AI operators would then need to be held to," Elan said. That, he added, would "cause a lot of other AI providers and operators to increase their security processes and protections and adopt similar safeguards."
