Allan Devantier, head of Samsung’s Audio Lab, suggests mixed reality headsets like the Meta Quest 3S and Apple Vision Pro will cause fidelity standards in earphones to skyrocket.

Why? Mixed reality continually highlights any unnatural tonal characteristics in sound.

Devantier says while most of us quickly get used to the sonic peculiarities of our speakers and headphones, the way mixed reality works will put those irregularities under the spotlight as soon as you put on the headset. And that needs to be fixed.

“What happens with mixed reality is you’re going to have earbuds in your ear and you’re going to have a live person in the room with you. [Their voice] is going to have to go through the microphone at the start of the earbud and get replayed by the earbud,” says Devantier.

“And that’s going to give you the sonic sorbet that’s going to tell you you’ve got all these timbral problems with your earbuds.”

He says this will be an issue whenever you have “live sound mixed with synthesized sounds,” where you’ll instinctively have a sense of what that external audio should sound like.

What Is Samsung Audio Lab?

Samsung Audio Lab is based in LA, and is responsible for the audio tuning and design of some of Samsung’s most important audio-visual products, including headphones, TV speakers and soundbars.

Devantier says the issue of perfecting the response of an earphone is complicated by the fact all of our ear canals are different.

“When you put your bud inside your ear, you now basically have the ear bud in a tube and it’s just changed the standing wave modes in your ear. And of course, everybody’s designing an earbud to correct for that and it’s based on an average ear.

“We know from a scan of 300 inner ears that the current standard that’s out there is pretty good, but it’s not perfect,” says Devantier.
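As a rough illustration of the standing-wave shift Devantier describes, a simple one-dimensional tube model (an assumption for illustration; real ear canals are neither straight nor uniform) predicts how sealing the canal with an earbud moves its resonant modes:

```python
# Hedged sketch: idealized 1-D tube model of an ear canal.
# The ~25 mm length and the model itself are illustrative assumptions.
SPEED_OF_SOUND = 343.0   # m/s, in air at room temperature
CANAL_LENGTH = 0.025     # m, rough average adult ear canal

def open_canal_resonances(length, n_modes=2):
    """Open at the entrance, closed at the eardrum: quarter-wave modes."""
    return [(2 * n - 1) * SPEED_OF_SOUND / (4 * length)
            for n in range(1, n_modes + 1)]

def blocked_canal_resonances(length, n_modes=2):
    """Sealed by an earbud at one end, eardrum at the other: half-wave modes."""
    return [n * SPEED_OF_SOUND / (2 * length)
            for n in range(1, n_modes + 1)]

print(open_canal_resonances(CANAL_LENGTH))     # first mode near 3.4 kHz
print(blocked_canal_resonances(CANAL_LENGTH))  # first mode near 6.9 kHz
```

In this toy model, inserting the bud roughly doubles the first resonance frequency, which is the kind of change a one-size-fits-all correction curve has to undo.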

One of Samsung Audio Lab’s current projects is working out a way to mitigate these differences, and the team recently submitted a research paper outlining the findings of that scan of 300 people’s ears.

“We can correct for those small differences. So an earphone is not tuned based on an average ear but it’s actually tuned for your ear and how you inserted the earbuds in your ear in the first place,” he says.

“There’s two flavors of the algorithm. The first one deals with leakage and low frequencies, which are relatively easy to compensate for. And then the harder one is higher frequencies, because that’s related to the distance of the earbud’s transducer to your ear, and correcting for all those things. And that’s research that we’re working on,” he says.
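A heavily simplified sketch of those two flavors (the names, models and numbers here are assumptions for illustration, not Samsung’s actual algorithm): the low-frequency branch estimates how much bass boost an imperfect seal requires, while the high-frequency branch works from the transducer-to-eardrum distance, reduced here to the propagation delay it implies:

```python
import math

# Illustrative only: not Samsung Audio Lab's algorithm.

def low_frequency_gain_db(leakage_ratio):
    """Bass boost (dB) needed to restore level lost through an imperfect seal.
    leakage_ratio is the fraction of low-frequency energy escaping (0 = perfect seal)."""
    retained = max(1.0 - leakage_ratio, 1e-6)
    return -20.0 * math.log10(retained)

def transducer_delay_us(distance_m, speed_of_sound=343.0):
    """One-way propagation delay (microseconds) from transducer to eardrum; the
    harder, distance-dependent high-frequency correction hinges on quantities
    like this, which vary ear to ear and insertion to insertion."""
    return distance_m / speed_of_sound * 1e6

print(round(low_frequency_gain_db(0.5), 2))  # about 6 dB if half the bass leaks out
print(round(transducer_delay_us(0.02), 1))   # about 58 microseconds for a 20 mm path
```

The asymmetry Devantier describes shows up even in this toy version: the leakage fix is a single broadband gain, while the distance-dependent part changes with every insertion.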

Anatomy Of A Smart Earphone

The kind of tech needed to make this sort of earphone customization feasible is already here, though. It uses the speaker and microphone present inside today’s more capable active noise cancelling earphones.

“[In ANC earphones] there’s usually a microphone near the transducer, and that microphone is actually in the ideal location to figure out how big their ear canal is and how far their eardrum is away from the transducer,” says Devantier.
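That placement suggests a simple inversion: measure the sealed canal’s first resonance with the in-ear microphone, then solve a tube model for the distance that would produce it. A minimal sketch, again assuming an ideal half-wave tube (a real product would fit a far richer acoustic model):

```python
SPEED_OF_SOUND = 343.0  # m/s

def eardrum_distance_m(first_resonance_hz):
    """Transducer-to-eardrum distance implied by the first half-wave resonance
    of a sealed canal. Idealized sketch, not a product algorithm."""
    return SPEED_OF_SOUND / (2.0 * first_resonance_hz)

print(eardrum_distance_m(6860.0))  # about 0.025 m, an average-length adult canal
```

A measured resonance above or below that figure would flag a shorter or longer canal, which is exactly the per-ear information a personalized tuning needs.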

At this point followers of headphone tech may be reminded of Nura, which made self-customizing headphones that analyzed the wearer’s ear anatomy and hearing. The high-end pairs were fascinating, but Nura was acquired by Denon in April 2023, and announced the cessation of its own-brand lines shortly after.

This vein of consumer audio improvement might also be considered an evolution of the work headphone makers have put into the transparency modes of active noise cancelling earphones and headphones. And, sure enough, fidelity on that front has improved hugely over the last four years.

But will mixed reality actually have as much of a progress-pushing effect as the basic transparency mode? That’s up to headsets like the Meta Quest 3S, whose job it is to push mixed reality to a mainstream audience in the way the Quest 2 did somewhat successfully for basic virtual reality, with an estimated 20 million units sold.
