RIP, stereo. You’ll be missed. But the audio world has moved on and now we’re ready for something more than two channels of audio.
But how far can this brave new world of audio tech take us? Well, if Sony’s recently announced 360 Reality Audio format is anything to go by, it’s around 24 object-based channels arranged in a 360-degree soundstage.
360 Reality Audio is an ambitious attempt to create a new immersive music format designed to be streamed over mobile music streaming services and played through compatible headphones. However, it also works through speakers, from complex rigs to one-box speakers that we’re pretty sure Sony is on the cusp of launching.
Sony specifically told us that 360 Reality Audio is aimed at streaming services, and it’s actually already got support for the launch from Tidal, Deezer, nugs.net and Qobuz. Expect to see 360 Reality Audio on those services’ premium plans soon or, at the very least, in some kind of opt-in beta-only trial.
However, creating object-based audio – so-called 360-degree sphere-mapping, where each individual instrument, vocal and effect needs to be specifically placed within a sphere – is something Sony will need to persuade music producers to do before 360 Reality Audio can stand a chance of becoming an actual reality.
While Sony hosted a demo of 360 Reality Audio at its vast booth on the CES show floor, we were treated to a run-through of the tech away from the hubbub in the Mirage Hotel… on the one condition we wouldn’t take any photos.
So instead, we’ll have to describe the setup.
Upon first entering the room, the whole idea seems crazy. There are speakers everywhere, 13 in total, and we were seated essentially within a sound rig. A key part of the 360 Reality Audio demo was having the hearing characteristics of our ears measured and, to do that, Sony had to place incredibly sensitive sensors in each ear to create a personalised head-related transfer function (HRTF) that’s crucial to making 360 Reality Audio work – but more on that later.
The initial phase of the demo was played through this speaker-heavy set-up, and it sounded incredible. Powerful and bassy, yes, and throughout the track the vocals subtly changed position in the 360-degree soundstage while percussive details appeared in unexpected places, but it was the separation that impressed us most.
Elsewhere, in a separate public demo at Sony’s CES booth, a compatible circular 360-degree speaker – calibrated to the room’s acoustics – was playing Morgan Saint’s Glass House for the audience, and the immersion was so impressive that it left folks wondering if all the music was really coming from that one little speaker.
The answer, of course, was yes. Yes it was.
So how is Sony cooking up this audio magic? Sony told us that it’s all based on the MPEG-H 3D Audio format, and that each track can contain a maximum of 24 objects and streams at approximately 1.5Mbps. It’s working with Fraunhofer IIS on all this, but the name of the delivery format is yet to be confirmed.
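To make the idea of ‘object-based’ audio concrete, here’s a toy sketch of what an object renderer does: each object is a mono signal plus a position, and the playback system computes per-speaker gains on the fly. The function name and the simple angular-falloff panning law are our own illustration, not Sony’s or MPEG-H’s actual implementation.

```python
import math

def speaker_gains(obj_azimuth_deg, speaker_azimuths_deg):
    """Toy object panner: spread a sound object across a ring of speakers
    by angular proximity, then normalise for constant power."""
    weights = []
    for sp in speaker_azimuths_deg:
        # Shortest angular distance between object and speaker (degrees)
        diff = abs((obj_azimuth_deg - sp + 180) % 360 - 180)
        # Linear falloff: a speaker more than 90 degrees away contributes nothing
        weights.append(max(0.0, 1.0 - diff / 90.0))
    total = math.sqrt(sum(w * w for w in weights)) or 1.0
    return [w / total for w in weights]  # constant-power normalisation

# A vocal object placed at 30 degrees in a ring of four speakers:
gains = speaker_gains(30, [0, 90, 180, 270])
```

The point is that the mix stores *where* each sound sits, not a fixed channel feed – which is why the same track can be rendered to a 13-speaker rig, a single circular speaker, or headphones.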
Sony also told us that it intends to disclose details to third-party manufacturers so they can make compatible products. The availability of compatible products will be key to making this format a success.
What’s both brilliant and bothersome about 360 Reality Audio is that for the 360-degree sphere to be replicated in a pair of headphones, it requires the streaming apps that decode the format to apply signal processing unique to each listener.
Getting those calculations just right requires a setup procedure that takes a few minutes and, as of right now, some special equipment.
With sensors placed in each ear, some loud test tones followed, and then we were shown a graph of our exact personalised ear image. It’s essentially the shape of our ears and everything else that affects how sound travels from each speaker to the ear. Head shape, chin shape, size of nose, height of shoulders – all of that makes a difference because it gets in the way of sound and affects what the listener hears.
Surely not everyone who wants to stream 360 Reality Audio content on Tidal or Deezer is going to get an HRTF done? Sony says that it’s developing a phone app with which users take photos of their own ears, and the app then calibrates audio to them. Convenient, perhaps, but an HRTF is about a lot more than ear shape, so we’ll just have to wait and see if Sony’s more basic approach can work.
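For the curious, this is roughly why the HRTF matters for headphone playback: each positioned object is filtered through the listener’s left- and right-ear head-related impulse responses (HRIRs) for its direction, and the tiny timing and level differences between ears are what sell the illusion. The HRIRs below are crude made-up stand-ins; a real renderer would look them up from the personalised measurement (or Sony’s photo-based estimate).

```python
import numpy as np

def binauralize(mono, hrir_left, hrir_right):
    """Render a mono object to a stereo headphone feed by convolving it
    with a left-ear and a right-ear head-related impulse response."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])

# An object off to the listener's right: the right ear hears it slightly
# earlier and louder than the left (toy delay-and-attenuate HRIRs).
mono = np.random.default_rng(0).standard_normal(1000)
hrir_right = np.array([0.0, 1.0, 0.0, 0.0])  # arrives early, full level
hrir_left = np.array([0.0, 0.0, 0.0, 0.6])   # arrives later, quieter
stereo = binauralize(mono, hrir_left, hrir_right)
```

Because those impulse responses depend on the shape of your ears, head and shoulders, a filter set measured on someone else’s head simply won’t place sounds as convincingly – hence the personalisation step.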
What we do know is that it’s worth getting a proper HRTF done, because we thought 360 Reality Audio actually sounded better through headphones than from the speaker rig all around us. Not as powerful, and certainly not as bassy, but just as detailed, and thoroughly convincing.
With so many sounds coming from above, below, behind and so forth, our first thought was that the sound wasn’t coming from the headphones at all. When tech is as invisible as this, it’s a real wow moment. It’s properly immersive, and so audibly removed from the headphones that it feels like you’re inside a piece of music. That’s a weird experience. However, it was also a high-end experience, because we were using Sony’s high-end MDR-Z7M2 headphones – not exactly an everyday pair – so we’ll have to see what Sony’s able to do with more budget-friendly cans.
Overall, Sony’s 360 Reality Audio is hugely impressive, but it’s hard to describe. Even Sony is struggling, stating that it provides ‘an immersive music experience that feels just like being at a live concert’.
It definitely doesn’t do that.
In our demo the Sony rep suggested that if, say, we were listening to the Rolling Stones, we could have Mick Jagger’s vocals coming from one side of the soundstage, and Keith Richards’ guitar from the other. However, there’s no point pretending that live music is like that; go to a live concert and the audio comes out of a bank of speakers on either side of the stage. So Sony’s concept is way more complicated than replicating live music, and in any case, it’s the 360-ness that’s most impressive.
If anything, it sounds like you’re on the stage, or in an orchestra, not spectating from in front of a band.
Sony’s CES booth in 2019 may have had fewer new products than expected, but what it did have was an intriguing concept for what it calls ‘an entirely new world of music entertainment’. It’s welcome evidence that Sony, the geeky tech company, and Sony Music are finally in sync.
That being said, a lot will depend on the sound engineers who produce 360 Reality Audio mixes; even if Sony leverages its industry connections and in-house producers, there are going to be albums that sound incredible in this format, and others that come across as pointless novelty.
So when will we hear more about Sony’s new format? Expect news at IFA 2019 in Berlin this September about Sony’s headphones, standalone speakers and speaker systems that support the new distribution format.
- Check out all of TechRadar’s CES 2019 coverage. We’re live in Las Vegas to bring you all the breaking tech news and launches, plus hands-on reviews of everything from 8K TVs and foldable displays to new phones, laptops and smart home gadgets.