AirPods users will be able to scan their ears to set up Personalized Spatial Audio, which arrives with iOS 16 when it launches. Apple claims that knowing what your ears look like lets it deliver a “more accurate and immersive listening experience”.
iOS 16 was introduced at Apple’s WWDC22 keynote on June 6, and the first beta has been available to developers since then. You might not yet have heard about Apple’s new Personalized Spatial Audio feature, though, which aims to improve audio quality by using the iPhone’s TrueDepth camera to map the shape of a user’s ears. It’s no exaggeration to say that sounds like magic.
Spatial Audio, through its Dolby Atmos integration, already gives music streamed via Apple Music a 3D sound experience. With Personalized Spatial Audio, Apple says its iPhones can produce a 3D snapshot of your ears and then use that data to fine-tune the 3D sound you hear.
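Apple has not published the details of how the ear scan feeds into playback, but spatial audio in general works by filtering sound differently for each ear. As a rough, purely illustrative sketch (not Apple’s actual algorithm), the Python below pans a mono tone into stereo using interaural time and level differences; a personalised profile would, in effect, tune parameters like the head-geometry term below to the individual listener:

```python
import numpy as np

SAMPLE_RATE = 44_100
HEAD_RADIUS_M = 0.0875     # average head radius; the value a personal profile would adjust
SPEED_OF_SOUND = 343.0     # m/s

def spatialize(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Pan a mono signal to stereo using interaural time and level differences.

    A crude stand-in for an HRTF: real spatial audio applies ear-shape-specific
    filters, which is the part an ear scan could personalise.
    """
    az = np.radians(azimuth_deg)
    # Woodworth ITD model: extra path length the sound travels around the head
    itd = HEAD_RADIUS_M * (abs(az) + abs(np.sin(az))) / SPEED_OF_SOUND
    delay = int(round(itd * SAMPLE_RATE))
    # Simple level difference: attenuate the far ear by up to ~6 dB
    far_gain = 10 ** (-abs(np.sin(az)) * 6 / 20)
    delayed = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    if azimuth_deg >= 0:   # source to the right: left ear hears it later and quieter
        left, right = far_gain * delayed, mono
    else:                  # source to the left: mirror image
        left, right = mono, far_gain * delayed
    return np.stack([left, right], axis=1)

# Place a one-second 440 Hz tone 60 degrees to the listener's right
tone = np.sin(2 * np.pi * 440 * np.arange(SAMPLE_RATE) / SAMPLE_RATE)
stereo = spatialize(tone, azimuth_deg=60)
```

With the source at 60° to the right, the right channel is louder and slightly ahead of the left, which is the cue the brain reads as direction.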
The result, Apple says, is an even more accurate and immersive listening experience: listeners can build a custom Spatial Audio profile with their iPhone’s TrueDepth camera, tailored to their own ears.
Apple has not announced a precise date, but the public beta of iOS 16, which will include this and other new features, is expected to be released next month. Anyone who does not wish to join the beta will have to wait until the autumn to get their hands on Personalized Spatial Audio: Apple’s next software upgrade is anticipated in September, alongside the new iPhone 14 models.
Spatial Audio is one of the finest iPhone features for Apple Music users, and we can’t wait to see how the personalised version performs when we get our hands on it.