Can broadcast live subtitling be improved?

Image shows part of a television screen with live subtitling reading "With the latest here's Andy Moore"

For deaf people, accurate live subtitling can allow full engagement with the visual content. That’s why, as a deaf person, and a Sense Digital Champion, I was particularly interested to learn more about the development of subtitling technologies at a recent Ofcom round-table event, feeding back on the quality of broadcast live subtitles.

Attending the event

Present in person or by phone link were around 25 people, including representatives of Ofcom, two members from Ericsson and two from Deluxe – both access service providers – plus others representing professional and charitable bodies with a special interest in this subject.

The round-table event was set up to establish what guidance we might give broadcasters, while reviewing sampling activities that looked at speed, the delay between speech and live subtitle delivery, and the number of errors broadcast.

We had excellent assistance via two sign language interpreters and a Palantype system to project simultaneous text on a big screen (yes, a bit like live subtitling), with an exceptionally competent stenographer. If only all subtitles could be this good!

By far the most interesting part of the event was the contribution from Alan McGuffog, Head of Service Delivery, Broadcast and Media Services, at Ericsson, and his colleague. They were able to explain the complications behind subtitling and illuminate the reasons why there are problems.

Discouragingly, I found the feeling around subtitling, in general, was somewhat negative. There was much criticism and little praise for what I feel are tremendous improvements over the last year or two. As someone who has experience of interpreting – a very similar skill – I can assure you that it is extremely difficult to translate rapid conversation into coherent text on screen. You can’t go back and correct mistakes, you just have to keep going and hope what you are writing makes sense.

It soon became apparent that there is one key problem faced by those who manage live subtitling for us: either you get an immediate transcription, which may not be very accurate, or you have a long delay before the transcription appears – by which time the news item, or whatever, has moved on.

This problem quite rightly became the focus of our deliberations. It was agreed that there would not be a ‘one size fits all’ solution, that there would always be a trade-off between the ideal and what we see. It was thought that the best we could achieve would be to aim for the maximum impact to benefit as many people as possible.

My thoughts

I must say that my own view is that a delay in delivery of subtitles is seriously problematic. Most people who use subtitles are deaf – if they were not, they would not need subtitles. Most of us deaf people – whether consciously or not – lip read to a certain extent, according to our degree of difficulty in hearing. It is extremely frustrating to ‘see’ someone on screen talking, while at the same time seeing the script of a previous speaker. What you want is to see the spoken and the written word as close together as possible.

Feeding back your thoughts

This is where all of you come in. The broadcasters need to know how we feel about this issue, and any others concerning their product. They can’t put things right if they don’t know what is wrong and an occasional comment on a particularly good piece of subtitling would do no harm. It would be hugely helpful if you – the customers, as it were – could provide feedback and examples of good and bad practice, which can be passed directly to the people who matter. Ofcom is looking into ways in which they can help with this. Any thoughts on this can be left in the comments section of this blog and they will be passed on to Ofcom.

One thing which came up – which I have to say had not occurred to me – was the number of different platforms, or portals, on which people now watch TV. What may look fine on a broadcaster’s equipment, for example yellow text on a red background, may appear quite differently on a different box in Lancashire or Devon – such as light grey on a white background, which could be illegible. The broadcasting companies need to know this. If they don’t know there is a problem, they can’t fix it.

It was a most interesting meeting, and I hope it will lead to better communications with the people at the sharp end.

Read about the use of Technology at Sense.

As more of our lives happen online, how do people with sensory impairments stay connected?

Author: Janet Caldwell

Janet is a Sense Digital Champion, as part of the Online Today project, helping people with sensory loss across the UK get online.
