Immersive accessibility: introducing a narrator
Remember when the beautiful blue species of Avatar leapt onto silver screens in all-consuming 3D and made the world gasp in awe at immersive virtual reality?
The year was 2009, and several formats have emerged since then. Dynamic 3D followed soon after and, eight years on, we are seeing formats converge into an altogether new form of storytelling: 360-degree video. A headset transports you into a mesmerising, even edgily hypnotic world with 3D sound. Gaming enthusiasts are probably more used to this degree of immersion but, for the rest of us, the novelty is still exciting.
In the UK, the BBC was one of the first broadcasters to experiment with the 360-degree format, transporting viewers to Strictly’s dance floor or on a walk with the dinosaurs guided by David Attenborough. While the idea sounds great, it is not surprising that people with any form of sensory disability find this new environment challenging – not just to interact with, but even to experience its magic fully.
One of the obstacles has been delivering audio description for this content. Audio description is a fairly straightforward asset to produce and consume for traditional two-dimensional content: the describer prioritises what to describe in the time available between dialogue, whether that is the on-screen action, the time of day or the characters in shot. In 360-degree content, although the principles of description are unlikely to change, it is hard to predict what the viewer will do next – look left, where the snow-capped peaks of the Swiss Alps rise, or right, where a meadow filled with daisies soothes the senses. So what happens when viewers have complete control over what they see and when they see it? Perhaps such scenarios require further controls that allow people to activate descriptions as they change direction.
In a recent development, Google has introduced “heatmaps” showing where people focus the most when watching 360-degree content on YouTube. According to Google’s data, people rarely bother to turn their heads to take in the full 360-degree experience. What does this mean for audio description?
The Immersive Accessibility project (ImAc), funded by the European Union’s Horizon 2020 Research and Innovation programme, aims to answer this and many similar questions for users of access features.
Follow the developments on @ImAcProject.
Broadcast Relationships and Audio Description Manager
Royal National Institute of Blind People