Over 38 million people in the US are deaf or hard-of-hearing, and Facebook wants them to be able to watch the 1 in 5 of its videos that are broadcast live. So today it’s taking the first step toward making Facebook Live more accessible by allowing publishers to create and stream closed captions for their Live videos, either on their own or with the help of a technology vendor like Ai-Media or Telestream.
However, Facebook isn’t offering automatic closed captioning through its own tech the way it does for free for recorded and uploaded Page videos. That would significantly increase the coverage of captions on Live videos, but could be prohibitively difficult or expensive to offer at scale with the low-latency, real-time delivery necessary for Live broadcasts.
Instead, only publishers using the Live API who generate CEA-608 standard closed captions, or who work with a vendor that does, will be able to add subtitles.
“Making Facebook accessible to everyone is a key part of building global community,” write Facebook product manager Supratik Lahiri and director of accessibility Jeffrey Wieland. These captions will appear to users on both mobile and desktop if they enable the setting.
The 360 million people worldwide who are hard-of-hearing aren’t a small niche that Facebook can ignore. In fact, they represent a sizable growth opportunity. If Facebook can use closed captioning to get them watching videos, they’ll spend more time connecting with people on its apps, and see more ads that generate revenue.
For comparison, Twitter lacks any closed captioning ability beyond captions that publishers have already overlaid on their uploads. Snapchat added a formal closed caption option to Discover last year. YouTube was a pioneer, adding closed captions in 2006 and automatic speech recognition-based captioning a few years later. It’s since expanded to Live videos and more languages.
But closed captions can be inaccurate, especially for Live broadcasts, as Facebook discovered this month when Harvard’s faulty captioning system spawned plenty of jokes during Mark Zuckerberg’s commencement speech.
So perhaps it’s for the best that Facebook isn’t trying to assume responsibility for accurate Live captioning just yet. But as its speech recognition improves and its tech scales, it might be able to make every Live video more accessible.
For more on accessibility in tech, check out coverage of our one-day TechCrunch Sessions: Justice conference today in San Francisco, where I’ll be chatting with Facebook’s first blind engineer, accessibility specialist Matt King.