Monday, November 18, 2024

How AirPods Pro will know when you’re trying to silently interact with Siri


In addition to revealing its initial plans for AI and annual updates to iOS, macOS and more at WWDC 2024, Apple also discussed new capabilities coming to the second-gen AirPods Pro. Siri Interactions will let you respond to the assistant by nodding your head yes or shaking your head no. Apple also plans to introduce improved Voice Isolation that further reduces background noise when you’re on a call. Both features are exclusive to the most recent AirPods Pro because, like the existing Adaptive Audio, Personalized Volume and Conversation Awareness tools, they rely on the company’s H2 chip.

Like those advanced audio tools already available on AirPods Pro, Siri Interactions and Voice Isolation use the processing abilities of the H2 chip in tandem with the power of a source device, such as an iPhone or MacBook Pro. Drawing on the processing power of both sides while keeping latency very low is what will continue to unlock these types of features on AirPods Pro. The pairing also helps ensure the system doesn’t respond when you don’t intend it to, in part because it can reliably predict what you’re doing.
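Apple hasn’t said exactly how that work is divided, so treat the following Swift sketch as an illustration of the idea rather than a description of Apple’s pipeline: a lightweight earbud-side stage summarizes raw sensor readings into compact frames, a heavier stage on the paired device turns a window of frames into an intent, and nothing is reported unless the whole exchange fits a tight latency budget. Every type, name and threshold here is made up for illustration.

```swift
import Foundation

// Hypothetical stand-ins for the two halves of the pipeline; none of these
// names are Apple APIs.
enum SiriResponse { case confirm, dismiss }

/// Compact per-sample summary an earbud-side stage might emit.
struct MotionFrame {
    var pitchDelta: Double  // nod ("yes") axis, radians per sample
    var yawDelta: Double    // shake ("no") axis, radians per sample
}

/// Source-device stage: in reality an advanced model, here a crude threshold.
func classify(window: [MotionFrame]) -> SiriResponse? {
    let nodEnergy = window.reduce(0) { $0 + abs($1.pitchDelta) }
    let shakeEnergy = window.reduce(0) { $0 + abs($1.yawDelta) }
    // Require pronounced motion on one axis so incidental movement is ignored.
    guard max(nodEnergy, shakeEnergy) > 0.8 else { return nil }
    return nodEnergy > shakeEnergy ? .confirm : .dismiss
}

/// Only report an intent if the exchange fits a tight (made-up) latency budget,
/// echoing the low-latency requirement described above.
func respond(to window: [MotionFrame], budget: TimeInterval = 0.05) -> SiriResponse? {
    let start = Date()
    let result = classify(window: window)
    return Date().timeIntervalSince(start) <= budget ? result : nil
}

// Simulated nod: alternating pitch swings with almost no yaw.
let nod = (0..<20).map { i in
    MotionFrame(pitchDelta: i % 2 == 0 ? 0.1 : -0.1, yawDelta: 0.005)
}
print(respond(to: nod) == .confirm)  // prints "true"
```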

For Siri Interactions, Apple employs several sensors in addition to the H2 chip to detect a nod yes or a shake no. The company hasn’t divulged specifics on those, but the motion-detecting accelerometer inside AirPods Pro likely plays a role. Those sensors work alongside an advanced set of transformer models to predict whether you’re trying to confirm or dismiss Siri’s alert. The system can also tell deliberate gestures apart from ordinary head movements, so AirPods Pro shouldn’t be tricked by a quick glance to the side or some other incidental motion. The intent is for Siri Interactions to work just as well when you’re stationary as when you’re moving or in the middle of a workout. And with an AI-infused update coming for Siri, making exchanges with the assistant more natural and convenient means you might use it more.
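Apple hasn’t detailed which sensors are involved, but CoreMotion’s CMHeadphoneMotionManager, a public API that already works with AirPods Pro, gives a feel for the raw signal: head attitude and rotation data streamed from the earbuds. The snippet below only logs that data; the actual nod/shake decision in the shipping feature comes from Apple’s transformer models, not from anything this simple, and treating pitch and yaw as the “yes” and “no” axes is shorthand on my part.

```swift
import Foundation
import CoreMotion

/// Streams head-motion data from AirPods (or other supported headphones).
/// Requires an NSMotionUsageDescription entry in the app's Info.plist.
final class HeadMotionLogger {
    private let manager = CMHeadphoneMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else {
            print("Headphone motion data isn't available on this audio route.")
            return
        }
        manager.startDeviceMotionUpdates(to: .main) { motion, error in
            if let error = error {
                print("Motion error: \(error)")
                return
            }
            guard let attitude = motion?.attitude else { return }
            // Pitch roughly tracks a nod ("yes"), yaw a shake ("no").
            print(String(format: "pitch: %+.2f rad, yaw: %+.2f rad",
                         attitude.pitch, attitude.yaw))
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```

A rolling window of those pitch and yaw samples is the kind of signal a gesture classifier would consume; the hard part, as Apple frames it, is rejecting incidental movement like that quick glance to the side.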

Despite the unchanged design, Apple has packed an assortment of updates into the new AirPods Pro. All of the conveniences from the 2019 model are here as well, alongside additions like Adaptive Transparency, Personalized Spatial Audio and a new touch gesture. There’s room to further refine the familiar formula, but Apple has given iPhone owners several reasons to upgrade.

Call quality was already a key aspect of AirPods Pro. But, as with Siri Interactions, Apple is using the combined power of the H2 and a source device to improve voice performance. The company is deploying more advanced computational audio models than what’s currently at work on AirPods Pro, with the goal of further reducing background distractions from everyday scenarios: wind noise, the clamor of a busy city street, construction-site racket and interruptions at home, like cooking, kids, pets or a vacuum. Apple is also improving overall voice quality, not just the real-time noise reduction, and doing so with very low latency. That means you should sound better on calls in general, not only because background noise is reduced.
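Voice Isolation at this level is system processing rather than something developers switch on, so the improved models aren’t exposed as an API. For context, the snippet below shows the existing, public way an app can check which microphone mode the system prefers and bring up Apple’s own picker (AVFoundation, iOS 15 and later); whether the upgraded AirPods Pro processing will surface under this same Voice Isolation mode is my assumption, not something Apple has spelled out.

```swift
import AVFoundation

// Checks the system-wide microphone mode and offers Apple's picker.
func reportMicrophoneMode() {
    switch AVCaptureDevice.preferredMicrophoneMode {
    case .voiceIsolation:
        print("Voice Isolation is the preferred microphone mode.")
    case .wideSpectrum:
        print("Wide Spectrum is preferred, which keeps ambient sound in.")
    case .standard:
        print("Standard microphone processing is preferred.")
    @unknown default:
        print("Unrecognized microphone mode.")
    }

    // Apps can't set the mode themselves; they can only present the same
    // Control Center panel users reach during a call.
    AVCaptureDevice.showSystemUserInterface(.microphoneModes)
}
```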

Since these features lean so heavily on the processing power of the H2 chip, any future AirPods models would need that component, or something with even more computational horsepower, in order to offer them. Apple doesn’t comment on future products, of course, but the company is clear that the H2 is foundational to unlocking these types of advanced audio tools. And if the rumors are true, we won’t have to wait long to see whether the new “regular” AirPods will also let you shake your head to dismiss a call.

Catch up here for all the news out of Apple’s WWDC 2024.
