One of the new features of iOS 16, highlighted again at Apple’s event on Wednesday, is personalized spatial audio. Once you install the latest iOS release on your iPhone, available September 12, you’ll be able to create a custom sound profile to enhance the immersion and overall spatial audio experience you get from AirPods.
To produce this personalized tuning, Apple uses the TrueDepth camera on the front of the iPhone to scan your ears. The process, which involves holding your iPhone 10 to 20 centimeters away from the side of your head, takes less than a minute, and the resulting data is used to optimize spatial audio for your unique ear shape. “The way we all perceive sound is unique, based on the size and shape of our heads and ears,” Apple’s Mary-Ann Rau said during the keynote. “Personalized Spatial Audio delivers the most immersive listening experience by precisely placing sounds tuned for you in space.”
But Apple isn’t the first company to go this route. Sony has offered personalized 360 Reality Audio since 2019 for supported music services such as Amazon Music, Tidal, Deezer and Nugs.net. Conceptually, it’s very similar: Both Sony and Apple are trying to capture your ear structure and adjust spatial audio processing to account for the unique folds and shapes of your ears. The goal is to maintain that 3D audio experience and remove any audio quirks that dampen the sensation.
Here’s how Sony explained the benefits to me in June, courtesy of spokesperson Chloe Canta:
Humans are able to identify spatial sound sources by subtle changes in the intensity and timing of the sound entering the left and right ears from the sound source. In addition, the sound may depend on the shape of our head and ear. So, by analyzing and reproducing the characteristics of both ears by taking pictures of the ears, this technology reproduces the sound field while using headphones.
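The timing cue Sony’s explanation describes, the delay between a sound reaching one ear and then the other, can be approximated with a simple spherical-head model. Here’s a minimal sketch in Python using Woodworth’s classic formula; the head radius and speed of sound are assumed averages, not values from either company:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature
HEAD_RADIUS = 0.0875    # m, an assumed average adult head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth's spherical-head approximation of the interaural
    time difference (ITD), in seconds.

    azimuth_deg: source angle from straight ahead (0 degrees)
    toward one ear (90 degrees).
    """
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(theta) + theta)

# A source directly ahead produces no delay between the ears;
# a source at 90 degrees arrives at the far ear about 0.65 ms later.
print(f"{interaural_time_difference(0) * 1e6:.0f} us")
print(f"{interaural_time_difference(90) * 1e6:.0f} us")
```

Personalization goes a step further: because real ears and heads differ from this idealized sphere, both companies measure your anatomy to replace generic averages like these with values tuned to you.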
Sony’s approach, however, is a little more awkward than Apple’s. Apple’s ear scanning is built right into iOS settings. But to create a personalized sound field with Sony products, you need to take an actual photo of each ear with the Headphones Connect app and your phone’s camera.
These images are uploaded to Sony’s servers for analysis – and then Sony keeps them for an additional 30 days so they can be used for internal research and feature improvements. The company says that ear images are not associated with you personally during this window.
That’s not to say that Apple has completely nailed the ear scanning approach. During the iOS 16 beta period, some on social media and Reddit claimed that the process was laborious and sometimes failed to detect the ear. The truth of the matter is that there’s probably no simple way to pull this off while also getting a good, accurate read of your ear shape.
The consensus is that it’s worth the effort: these personalized profiles often make a noticeable difference and improve how convincing spatial audio sounds. And Apple isn’t taking actual photos: the TrueDepth camera captures a depth map of your head and ears, the same way Face ID learns your facial features.
Apple’s website states that once you create a personalized spatial audio profile on your iPhone, it will sync across your other Apple devices, including Macs and iPads, to maintain a consistent experience. That won’t be fully true until later this fall: you’ll need the upcoming macOS and iPadOS updates for sync to work. Personalized spatial audio is supported on the third-generation AirPods, both generations of AirPods Pro, and the AirPods Max.
Apple never claimed it was pulling off any firsts with personalized spatial audio. Company executives routinely say their goal is to refine meaningful features rather than be first to market, and in this case another company, Sony, was already moving in that direction.