Making Your Content Accessible To The Deaf & HOH Community

[Image: buttons on a TV remote control]

By: Veronica Figueroa Fernandez, PRSA Orlando Diversity & Inclusion Chair

Three decades ago, President George H. W. Bush signed the Americans with Disabilities Act (ADA) into law, transforming the lives of Americans and ensuring equal opportunities and access for the 61 million adults living with disabilities in employment, government services, public accommodations, commercial facilities, and transportation.

And while increasing physical accessibility has been a central focus, it is important to continue addressing areas that need improvements, such as making your social media and video content accessible to the deaf and hard of hearing (HOH) community.

According to the National Association of the Deaf, in the United States alone, there are 48 million individuals who are deaf and HOH, including late-deafened, deafblind, and deaf-mobile, among others. 

Social media is a huge part of our culture, and it brings people together. What was once seen as a tool for sharing personal updates is now a central and essential component of business strategy. Brands and marketers need to make the effort to make their content more accessible, ensuring that everyone can enjoy it.

For Deaf History Month, we’ve made a list of ways marketers can make their content accessible and inclusive to the deaf and HOH community.

Provide Multiple Contact Options

Just like your hearing audience, your deaf and HOH customers want to get in touch with you. However, many small businesses prefer to connect with their customers over the phone because they do not have the time to dedicate to social media. To quote the movie Pretty Woman, “Big mistake. Big. Huge.” A 2018 report showed that individuals with hearing difficulties represent the disability category with the greatest amount of discretionary income, at $9 billion. Provide additional contact options, such as Facebook Messenger or email, where these customers can best reach you – and stay on top of your notifications!

Closed Captions

These are the most common types of captions used by major broadcasters and video streaming services like Netflix, Hulu, and YouTube. Closed captions are often indicated by a [CC] symbol, letting the viewer know that closed captioning is available.

A closed caption file contains the text of what is said throughout the video, with time codes for when each line of text should be displayed, as well as position and style information. The most common closed caption file types, according to the Alliance for Access to Computing Careers (AccessComputing), are listed below, followed by a sample .srt snippet:

  • Timed Text (or DFXP): The acronym stands for Distribution Format Exchange Profile, a World Wide Web Consortium (W3C) draft standard. This XML markup language is designed for marking up timed text or captions.
  • SMIL: Also known as a W3C standard and an XML markup language, Synchronized Multimedia Integration Language is designed to allow for the synchronized presentation of various media components such as video, text, images, and audio.
  • SAMI: Synchronized Accessible Media Interchange is Microsoft’s format for delivering closed captions; the SAMI files themselves contain the caption text.
  • SubRip (.srt) and SubViewer (.sub): These text formats are officially supported by YouTube, along with the DFXP and SAMI formats. Facebook also uses the .srt format.
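
To make the .srt format concrete, here is a minimal sketch that writes a two-cue SubRip file with Python; the file name and caption text are purely illustrative.

```python
# Write a minimal SubRip (.srt) caption file. Each cue has a sequence
# number, a start --> end timecode (HH:MM:SS,mmm), and caption text.
srt_content = """\
1
00:00:00,000 --> 00:00:03,500
Welcome to our accessibility webinar.

2
00:00:03,500 --> 00:00:07,000
[upbeat music playing]
"""

# "captions.srt" is an illustrative name; upload the resulting file
# alongside your video on platforms that accept SubRip captions.
with open("captions.srt", "w", encoding="utf-8") as f:
    f.write(srt_content)
```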

Pros: There are many benefits to using closed captions, such as giving the user the ability to turn them on or off, and allowing the video publisher to edit and re-upload the caption files should there be a mistake or inconsistency. With so many file format options for closed captions, video publishers are able to share their content on multiple platforms and media players.

In the past, televisions needed a separate decoder in order to display closed captions. Since the passage of the Television Decoder Circuitry Act of 1990, however, most television manufacturers have been required to include closed captioning display capabilities.

Cons: The biggest drawback of closed captions is the number of file formats available and the need to know which format is compatible with each platform. Additionally, viewers are dependent on the video publisher to ensure that the closed captions are visible against all backgrounds. Finally, some viewers may not have the physical ability to turn captions on or off.

Fun Fact: Many use the words “closed captions” and “subtitles” interchangeably, but subtitles are typically used when the viewer doesn’t speak the language in the video. 💡

Open Captions

These captions are permanently visible on the video; viewers do not have to turn them on. Social media users are familiar with the open caption concept, as brands and media outlets are beginning to recognize that there are many reasons why someone may prefer to watch a video without sound. They may be in a public or quiet setting, HOH, or deaf.

A survey of U.S. consumers found that 92% view videos with the sound off on mobile, and 83% watch with the sound off overall. The report recommends that advertisers caption their advertisements because 80% of consumers are more likely to watch an entire video when captions are available. Armed with this knowledge, brands need to rethink their approach to video marketing.

Pros: For video publishers uploading to platforms without closed captioning functionality, open captions are a dream: they do not require additional files, because the captions are permanently “burned” onto the video. Because they are automatically displayed, viewers who lack the physical ability to toggle captions on do not have to worry about doing so. Open captions also allow more versatility in choosing the font color and size.

Cons: Because these captions are embedded in the video, users who do not want to see them cannot turn them off. Additionally, if there is a typo or mistake, the publisher has to delete the video, edit it, and re-upload it. Viewers watching the video at a low streaming quality may also have difficulty reading the captions.
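
For publishers who want to burn captions in themselves, one common approach is to run FFmpeg’s subtitles filter from a short script. The sketch below assumes FFmpeg is installed, and input.mp4 and captions.srt are placeholder names for your own files.

```python
import subprocess

# Hard-code ("burn") the captions from captions.srt into the video frames.
# File names are placeholders; FFmpeg must be installed and on the PATH.
subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mp4",                # source video
        "-vf", "subtitles=captions.srt",  # render each caption onto the frames
        "-c:a", "copy",                   # keep the original audio untouched
        "output_open_captions.mp4",
    ],
    check=True,
)
```

Because the text becomes part of the picture, fixing a typo means re-running the command and re-uploading the video – the trade-off described above.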

Share Transcripts

Consider providing transcripts for your video or audio content. According to YouTube, transcripts work best with videos that are less than an hour long, with good sound quality and clear speech.
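
If you already caption your videos, a plain-text transcript can be derived from the caption file itself. Below is a rough sketch that strips the sequence numbers and timecodes from a SubRip file; captions.srt and transcript.txt are placeholder file names.

```python
import re

# Turn an .srt caption file into a plain-text transcript by dropping
# sequence numbers and "HH:MM:SS,mmm --> HH:MM:SS,mmm" timecode lines.
timecode = re.compile(r"^\d{2}:\d{2}:\d{2},\d{3} --> \d{2}:\d{2}:\d{2},\d{3}")

text_lines = []
with open("captions.srt", encoding="utf-8") as srt:
    for line in srt:
        line = line.strip()
        if not line or line.isdigit() or timecode.match(line):
            continue  # skip blank lines, cue numbers, and timecodes
        text_lines.append(line)

with open("transcript.txt", "w", encoding="utf-8") as out:
    out.write(" ".join(text_lines))
```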

Even podcasts have begun using transcripts so as not to exclude their Deaf and HOH audiences. In 2011, This American Life (TAL), the popular and beloved weekly public radio show hosted by Ira Glass, then in its 16th year, decided to transcribe its entire audio archive and make the transcripts available to website visitors. The goals were to increase inbound traffic from organic search, increase the number of inbound links, engage users in alternate ways, make content accessible to the Deaf and HOH community, and make it easier to pull quotes.

In doing so, TAL improved its SEO and increased the number of unique visitors to its website by 4.18%. The number of unique visitors who discovered TAL through organic search results increased by 6.68%, and 7.23% of website visitors engaged with the transcripts. The results also showed that transcripts provide an effective way to increase authority through additional inbound links. Today, transcripts are posted within 24 hours after a program airs and linked from the main episode page.

Provide an ASL Interpreter

Sign language interpreters are an important part of live video, especially broadcasts dealing with topics as important as public health and safety. And while this sounds straightforward, we have seen many agencies butcher this tactic. Last year, Baltimore Mayor Young apologized to the deaf community after cutting off an ASL interpreter during a coronavirus news conference. When the mayor’s voice was drowned out by protesters, he paused his remarks; the interpreter, however, continued to sign the protesters’ message to explain to viewers what was happening, and the mayor asked the interpreter to stop. “You interpret for us,” he said.

And countless times we have seen phony interpreters worm their way into important news conferences, such as a 2017 incident in which officials in Manatee County, Florida, came under fire after an interpreter warned viewers about pizza and monsters during an emergency briefing on Hurricane Irma.

Agencies, companies, and organizations looking to work with an ASL interpreter should vet interpreters through reputable service providers.

Explore Live Captions

Companies like Ai-Live understand the difficulties Deaf or HOH people encounter when participating in workplace, educational, and live event settings. Ai-Live displays live captions on web-enabled devices and can even display captions via a projector during in-person events. The great thing about this service is that it uses real people, not computers, to convert speech to text. The captioners either use a stenotype machine with a phonetic keyboard or re-speak what they hear into voice recognition software that they have trained to their own voice.

In 2019, Google introduced Live Caption, an automatic captioning system for Android smartphones. With the exception of phone and video calls, Live Caption captions videos and spoken audio on your device in real time, even if you don’t have cell data or WiFi. The captions are private, never leave your phone, and can be positioned anywhere on the screen.
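
Live Caption itself runs entirely on the device, but if you want a feel for how automatic live captioning works, here is a rough, conceptual sketch using the open-source SpeechRecognition package. It is not Google’s implementation, and recognize_google() sends audio to a web service, so unlike Live Caption it is neither offline nor private.

```python
import speech_recognition as sr  # pip install SpeechRecognition pyaudio

# Conceptual sketch of automatic live captioning: listen to the microphone
# in short phrases and print each recognized chunk as a caption line.
recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    print("Live captions (Ctrl+C to stop):")
    while True:
        audio = recognizer.listen(source, phrase_time_limit=5)
        try:
            print(recognizer.recognize_google(audio))
        except sr.UnknownValueError:
            pass  # nothing intelligible in this chunk; keep listening
```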

Tools such as Clipomatic allow smartphone users to record themselves, turning everything they say into a caption. Users can then post their videos on social media after checking and editing any caption errors. In the past, we’ve seen Congresswoman Alexandria Ocasio-Cortez and Queer Eye host Karamo Brown use the app on Instagram. Programs such as these break barriers and provide access to spoken dialogue.

The original version of this blog post appeared on Laughing Samurai.