People with hearing loss have a new ally in navigating the world: captions are no longer confined to their TV screens and streaming services.
The COVID-19 pandemic has disrupted daily life around the world, but many people with hearing loss have found the resulting isolation particularly hard to endure. “I just can’t understand it at all when everyone’s wearing a mask,” said Pat Olken of Sharon, Massachusetts, whose hearing aids were no longer adequate. (A new cochlear implant has helped her a lot.)
So when her grandson’s bar mitzvah was streamed on Zoom early in the pandemic, long before the service offered captions, Olken turned to Otter, an app designed for transcribing business meetings. Reading along as the ceremony’s speakers talked made the app “a huge resource,” she said.
People with hearing loss (an estimated 40 million U.S. adults) have long embraced technology to help them navigate the hearing world, from Victorian-era ear trumpets to modern digital hearing aids and cochlear implants.
But hearing aids today can cost as much as $5,000, are usually not covered by insurance and are not suitable for everyone. Nor do these devices bring sound into focus the way glasses instantly correct vision; instead, hearing aids and cochlear implants require the brain to interpret sound in new ways.
“Existing solutions are clearly not a one-size-fits-all model, nor do they meet the needs of many people based on cost, access and many other factors,” said Frank Lin, director of the Cochlear Center for Hearing and Public Health at Johns Hopkins University. And it’s not just a communication issue: researchers have found a correlation between untreated hearing loss and a higher risk of dementia.
Cheaper over-the-counter hearing aids are on the way. But for now, only about 20 percent of people who could benefit from hearing aids actually use them.
Captions, by contrast, are generally far more accessible. They have long been available on televisions and are increasingly appearing in videoconferencing apps like Zoom, streaming services like Netflix, social media videos on TikTok and YouTube, movie theaters and live arts venues.
In recent years, smartphone apps have emerged as well: Otter; Google’s Live Transcribe; Ava; InnoCaption, for phone calls; and GalaPro, for live theater performances. Some of these services, which target people with hearing loss, use human reviewers to ensure the captions are accurate.
Others, like Otter and Live Transcribe, rely on so-called automatic speech recognition, which uses artificial intelligence trained to recognize speech. ASR still has problems with accuracy and lag when transcribing spoken language, and built-in biases reduce transcription accuracy for women, people of color, non-native speakers and deaf speakers, said Christian Vogler, a professor at Gallaudet University who specializes in accessible technology.
Jargon and slang can also be stumbling blocks. But users and experts say ASR has improved a lot.
While popular, none of these solutions are perfect. Even when she uses Otter to transcribe the conversation, her book club can be exhausting, said Toni Iacolucci of New York City. The captions aren’t always accurate and don’t identify individual speakers, which can make them difficult to follow, she said.
“It kind of works,” said Iacolucci, who lost her hearing nearly two decades ago. She would often be so drained from straining to keep up with the conversation that she had to lie down when she got home. “It just takes so much energy.” She received a cochlear implant a year ago, which has greatly improved her hearing; she can now have one-on-one conversations without captions, though they still help in group discussions, she said.
In a statement, Otter said it welcomes feedback from the deaf and hard-of-hearing community, noting that it now offers a paid software assistant that can join virtual meetings and transcribe them automatically.
Transcription lag can create other problems, among them the fear that conversation partners will grow impatient with the delay. “Sometimes you have to say, ‘Sorry, I just need to look at my captions to catch what you said,’” said Richard Einhorn, a musician and composer in New York. “I’m aware that this can sometimes be an issue for other people.”
Other problems crop up as well. When Chelle Wyatt of Salt Lake City visited her doctor’s office, the Wi-Fi there wasn’t strong enough for her transcription app to work. “It was hand gestures and writing things down, and making sure I got a written report afterward so I knew what was said,” she said.
Movie theaters provide equipment to amplify sound, as well as glasses and separate small screens that display captions. But these devices are not always comfortable, and they are sometimes poorly maintained or simply broken. Many people with hearing loss want more films to be shown with captions right on the big screen, just as they appear at home.
A new law that went into effect in New York City on May 15 requires movie theaters to provide open captions for up to four showtimes per movie per week, including on Friday nights and weekends, the most popular times to see films. Hawaii passed a state law in 2015 requiring each film to be shown twice a week with open captions. And AMC, the large theater chain, says it shows some movies with open captions in about a third of its U.S. locations.
Captions are increasingly available for live performances, too. Several Broadway theaters have promoted a smartphone app that provides captions for live shows, and handheld personal captioning devices are available as well. Some performances are presented with “open captions” that the entire audience can see.
The shift to online meetings and classes during the pandemic made videoconferencing services a survival tool, but captions came only after a big push. Zoom added live transcription to its free tier in October 2021, though meeting hosts must enable it. Google Meet moved sooner, making captions free for everyone in May 2020; Microsoft Teams, the workplace messaging app, did so in June.
“We need captions everywhere; we need people to be more responsive,” Olken said. “The more I advocate, the more other people benefit.”