One of the new features of iOS 16, and something that was again highlighted during Apple’s event on Wednesday, is personalized spatial audio. Once you’ve installed the latest iOS release on your iPhone beginning September 12th, you’ll be able to create a custom sound profile that should improve the sense of immersion and overall spatial audio experience you get from AirPods.
To produce this personalized tuning, Apple uses the iPhone’s front-facing TrueDepth camera to scan your ears. The process, which involves holding your iPhone about 10 to 20 centimeters from the side of your head, takes under a minute, and the resulting data is then used to optimize spatial audio for your unique ear shape. “The way we all perceive sound is unique, based on the size and shape of our head and ears,” Apple’s Mary-Ann Rau said during the keynote. “Personalized spatial audio will deliver the most immersive listening experience by precisely placing sounds in space that are tuned just for you.”
But Apple isn’t the first company to go down this path. Sony has offered “personalized 360 Reality Audio” since 2019 for supported music services like Amazon Music, Tidal, Deezer, and Nugs.net. Conceptually, it’s very similar: both Sony and Apple are trying to determine your ear structure and adjust spatial audio processing to account for the unique folds and contours of your ears. The goal is to maintain that 3D audio experience and eliminate any audio quirks that lessen the sensation.
Here’s how Sony explained the benefits to me back in June, courtesy of Kaz Makiyama, vice president of video and sound at Sony Electronics:
Humans are able to recognize spatial sound sources by the subtle shifts in the intensity and time of sound entering the left and right ears from the sound source. Plus, the sound may depend on our head and ear shape. So, by analyzing and reproducing the characteristics of both ears by taking pictures of the ears, this technology enables reproduction of the sound field while using headphones.
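Those "subtle shifts in the intensity and time" of sound are what audio researchers call interaural level and time differences, and they're the cues any personalized spatial audio system has to model. As a rough illustration only — this is Woodworth's textbook spherical-head formula, not anything Apple or Sony has published — here's how the time delay between your ears varies with a sound's direction:

```swift
import Foundation

// Woodworth's spherical-head approximation of the interaural time
// difference (ITD): the delay between a sound reaching the near ear
// and the far ear. A classic textbook model, not Apple's or Sony's
// actual algorithm.
let speedOfSound = 343.0   // m/s, in air at room temperature
let headRadius = 0.0875    // m, average adult head (assumed value)

/// Estimated ITD in seconds for a source at `azimuthDegrees`
/// (0° = straight ahead, 90° = directly to one side).
func interauralTimeDifference(azimuthDegrees: Double) -> Double {
    let theta = azimuthDegrees * .pi / 180
    return (headRadius / speedOfSound) * (theta + sin(theta))
}

// A source directly to the side arrives only ~0.66 ms earlier at the
// near ear -- tiny, but enough for the brain to localize it.
print(interauralTimeDifference(azimuthDegrees: 90) * 1000) // ≈ 0.66 ms
```

That maximum delay of well under a millisecond is part of why small differences in head and ear geometry matter so much to how convincing spatial audio feels.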
Sony’s approach, however, is slightly more awkward than Apple’s. The AirPods technique is built right into iOS settings. But to build a personalized sound field with Sony’s products, you have to snap an actual photo of each ear with the Headphones Connect app and your phone’s camera.
These images are uploaded to Sony’s servers for analysis — and then Sony holds on to them for 30 additional days so they can be used for internal research and feature improvements. The company says the ear pics are not personally associated with you during this window.
That’s not to say that Apple has completely nailed the ear-scanning procedure, either. Throughout the iOS 16 beta period, some across social media and Reddit have mentioned that the process can feel tedious and sometimes fails to detect an ear. I think the truth of the matter is there’s no dead simple way to pull this off while also getting a good, accurate read of your ear shape.
The consensus seems to be that it's worth the effort: these personalized profiles often make a noticeable difference and can improve our perception of spatial audio. And Apple isn't taking actual photos: the TrueDepth camera captures a depth map of your head and ears, in much the same way that Face ID learns your facial features.
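For the curious, that depth stream is the same one third-party apps can reach through ARKit's face tracking. Apple hasn't said how the ear scan converts depth data into a listening profile, so treat this as a minimal sketch of reading the raw TrueDepth depth map with standard ARKit APIs, nothing more:

```swift
import ARKit

// A minimal sketch of reading the TrueDepth camera's depth map --
// the same sensor data Apple's ear scan is built on. This only
// demonstrates that the front camera yields per-pixel distances,
// not photographs.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is a distance map, not an image.
        guard let depth = frame.capturedDepthData else { return }
        let map = depth.depthDataMap
        print("Depth map:", CVPixelBufferGetWidth(map), "x",
              CVPixelBufferGetHeight(map))
    }
}
```

The takeaway is that what leaves the sensor is a grid of distances rather than a picture of your ear, which is a meaningful privacy distinction from Sony's photo-upload approach.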
Apple’s website notes that once you’ve created a personalized spatial audio profile from an iPhone, it will be synced across your other Apple devices, including Macs and iPads, to maintain a consistent experience. That’ll be true starting in October at least: you’ll need upcoming updates to macOS and iPadOS for the syncing to work. Personalized spatial audio will be supported on the third-generation AirPods, both generations of AirPods Pro, and the AirPods Max.
Apple has never claimed to be pulling off any firsts with personalized spatial audio. The company’s executives have routinely stated that their goal is to come up with the best execution of meaningful features, even if others — in this case, Sony — were already pushing in that direction.