AI-Mimi builds inclusive TV experiences for the deaf and hard of hearing in Japan

There is increasing demand for subtitles all over the world. For example, in the UK the BBC reports that while subtitles are primarily intended to benefit viewers who are hard of hearing, they are used by a much wider audience: about 10 percent of broadcast viewers regularly use closed captions, rising to 35 percent for some online content, and the majority of these viewers are not hard of hearing.

Similar trends are recorded around the world for television, social media and other channels that deliver video content.

It is estimated that more than 360,000 people in Japan are deaf or hard of hearing. Around 70,000 of them use sign language as their primary form of communication, while the rest prefer written Japanese as their main way to access content. Moreover, with almost 30 percent of people in Japan aged 65 or older, the Japan Hearing Aid Industry Association estimates that 14.2 million people have a hearing impairment.

Major Japanese broadcasters provide subtitles for most of their programs, a process that requires dedicated staff and specialized equipment costing tens of millions of yen. “More than 100 local TV stations in Japan face barriers to providing subtitles for live programs due to high equipment costs and staffing constraints,” said Muneya Ichise of SI-com. These local stations are of great importance to the communities they serve, and their local news programs deliver important updates about the area and its people.

To meet this need for accessibility, SI-com and its parent company, ISCEC Japan, have been piloting innovative, cost-effective ways to introduce closed captioning into live broadcasts with local TV stations since 2018. Their solution for subtitling live broadcasts, AI-Mimi, combines human input with the power of Microsoft Azure Cognitive Services; this hybrid format produces subtitles that are both faster and more accurate than either approach alone. In addition, ISCEC can compensate for the local shortage of subtitle typists by using its own specialized staff. AI-Mimi has also been introduced at Okinawa University, and the innovation was recognized with a Microsoft AI for Accessibility Scholarship.
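AI-Mimi's actual pipeline is not public, but the hybrid idea described above can be sketched in a few lines: a speech-to-text service (stubbed here; Azure Cognitive Services in the real system) produces draft caption segments, and a human operator can override any segment before it is committed to the broadcast caption stream. All function names and the correction format below are assumptions for illustration.

```python
def machine_transcribe(audio_chunk: str) -> str:
    """Stand-in for a speech-to-text call (e.g. Azure Cognitive Services).

    Here we just lowercase the input to simulate a recognition result.
    """
    return audio_chunk.lower()


def hybrid_caption_stream(audio_chunks, human_corrections):
    """Yield final caption segments: a human correction wins when present."""
    for i, chunk in enumerate(audio_chunks):
        draft = machine_transcribe(chunk)
        # A human operator reviews each draft and may replace it.
        yield human_corrections.get(i, draft)


chunks = ["KONNICHIWA", "KYOU NO NYUUSU DESU"]
corrections = {1: "kyou no nyuusu desu."}  # operator fixed segment 1
captions = list(hybrid_caption_stream(chunks, corrections))
```

The point of the hybrid format is that the machine keeps latency low while the human keeps accuracy high: uncorrected drafts go out immediately, and the operator only intervenes where the recognizer errs.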

Based on extensive testing and user feedback, which centered on the need for larger fonts and better on-screen display of subtitles, SI-com created a display model with more than 10 lines of subtitles on the right side of the TV screen, a departure from the more commonly used format of just two lines at the bottom. In December 2021, they demonstrated the technology for the first time in a live broadcast, in collaboration with a local TV station in Nagasaki.
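The display model above can be illustrated with a small sketch: instead of a two-line strip at the bottom of the screen, captions are wrapped into short lines and kept in a rolling panel of up to 10 lines at the right edge. The panel height and line width below are assumptions for illustration, not SI-com's actual parameters.

```python
import textwrap

PANEL_LINES = 10   # assumed panel height (lines kept on screen)
LINE_WIDTH = 12    # assumed characters per subtitle line


def update_panel(panel, new_caption):
    """Append a wrapped caption and keep only the newest PANEL_LINES lines."""
    panel = panel + textwrap.wrap(new_caption, LINE_WIDTH)
    return panel[-PANEL_LINES:]


panel = []
for caption in ["Breaking news from Nagasaki", "Live captions demo"]:
    panel = update_panel(panel, caption)
```

A rolling multi-line panel like this lets viewers re-read recent captions at a larger font size, which is exactly the feedback the two-line bottom format could not accommodate.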

TV screenshot of the demo with a local TV channel in Nagasaki: two presenters on a live show, with subtitles provided in real time on the right using a combination of AI and human input.
