Does closed captioning still serve deaf people? Gary Robson at TEDxBozeman
Translator: TED Translators admin
Reviewer: Denise RQ

Hello, Bozeman! What do you think of when you hear the word accessibility? Wheelchair ramps? Handicapped stalls in a public restroom? Braille on an ATM, perhaps? I think of something less obvious, but with a dramatic impact on the lives of 360 million people around the world.

Imagine a deaf person, about 50 years ago, who had a television set turned on and saw this: (Silence)

That experience would have been a little bit different for a hearing person.

[Woman: That's right.]
[Man: Ain't it a little early for that? Ain't Nana coming tomorrow?]
[Man 2: Here is a bulletin from CBS News: President Kennedy shot today just as his motorcade left downtown Dallas.]

When President Kennedy was assassinated, television was all but inaccessible. Fifteen years later, the advent of closed captioning promised to open the world of television to deaf audiences. Now, 30 years after that first captioned broadcast, we're asking ourselves: "Does closed captioning still serve the deaf and hard of hearing audience for whom it was created?"

Captions are text on a video picture that allows people who can't hear what's going on to read it instead. Closed captions are hidden until you press the CC button on your remote control, so that people who don't need the captions don't have to see them.
The first phase in caption development is...development. Development can take a long time.
Television broadcasts in the United States began in 1928. It was over 40 years before we had closed captioning for deaf people, and 60 years before we had descriptive video service for blind people. But a lot of work had to be done first.
Caption decoders and encoders had to be invented, software tools developed, and training programs put together for captioners. After this first phase of development was complete, we had captions, for a small audience, on a few shows. The second phase of accessibility, broadening the base, depended largely upon the law.
The Americans With Disabilities Act, a landmark act in almost all respects, barely mentioned closed captioning.
Later laws required that television sets contain decoding circuitry, and eventually mandated the presence, although not the quality, of closed captioning on TV. Thanks to laws like that, TV captioning today is ubiquitous. However, as law and technology have pushed captioning forward, the availability of captions has often been offset by a decline in quality and a lack of focus on what's important to the deaf community.

Just last week, or last month, the FCC unanimously approved new standards for captioning. These led us directly onto the path to the third phase of accessibility: quality. Now, TED talks have to be prepared well in advance. My talk was ready when this announcement was made, causing me to rip out quite the lecture on why the FCC should be mandating quality. (Laughter) But that's okay! I don't mind the last-minute changes; it's for a good cause. I just wish they'd waited until after my talk so at least I could take credit for it! (Laughter)

Defining quality can be a sticky issue. The dictionary calls it a "level of excellence." In captioning, a more useful definition would be understandability: do the captions help someone who can't hear what's going on to see what's going on? A deaf friend of mine once said: "We're not asking for special treatment. All we want is what the rest of you take for granted."

Legislating quality is even more difficult. Industry experts have been arguing for decades over how to put a numeric score on closed caption quality. Arguing is what we do! But the one thing we agree on is that it begins with accuracy. NCRA, the organization that certifies realtime captioners (the ones who do captioning at 250 words a minute on live events), measures caption quality by errors and omissions: the fewer errors you make, the higher your quality. This, though, supposes that all words have equal importance. Do they? Take this well-known sentence from a Dr. Seuss classic: "I do not like green eggs and ham."
If we were to drop the second word, it wouldn't change the meaning of the sentence at all. But if we drop the third word, the sentence means the exact opposite: "I do like green eggs and ham!" Clearly, all words don't have equal importance.

In realtime captioning, errors are inevitable, and often funny. The keyboard that realtime closed captioners use is chorded, meaning you press more than one key at a time, like a piano chord. A simple misfingering doesn't lead to a letter being wrong, but a syllable, a word, or a phrase. This is what led the closed captioning on a network news broadcast to introduce a lawyer as a liar. (Laughter) A fun guy as fungi. And a golfer's nice putt as a nice butt. (Laughter)

Postproduction captioners have plenty of time in a studio to carefully craft the text, timing, and placement. But the only way to assure high-quality realtime closed captioning is to hire and train the best people for the job.

The FCC's definition of accuracy begins with matching the captions to the spoken dialogue, but it continues to include background noises.
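To make the scoring debate concrete, here is a small illustrative sketch, not NCRA's actual formula, contrasting an all-words-equal accuracy score with one that weights meaning-carrying words more heavily. The specific weights are hypothetical:

```python
# Illustrative sketch only: real caption-quality scoring is more involved.
# A naive score treats every word equally; a weighted score lets
# meaning-changing words (like "not") count for more.

def naive_accuracy(total_words: int, errors: int) -> float:
    """Percent of words captioned correctly, all words counted equally."""
    return 100.0 * (total_words - errors) / total_words

def weighted_accuracy(word_weights: list[float], error_flags: list[bool]) -> float:
    """Percent accuracy where each word carries its own importance."""
    total = sum(word_weights)
    lost = sum(w for w, bad in zip(word_weights, error_flags) if bad)
    return 100.0 * (total - lost) / total

# "I do not like green eggs and ham" -- dropping "do" barely matters,
# dropping "not" flips the meaning, so "not" gets a heavy weight.
weights  = [1, 1, 5, 1, 1, 1, 1, 1]
drop_do  = [False, True,  False, False, False, False, False, False]
drop_not = [False, False, True,  False, False, False, False, False]

print(naive_accuracy(8, 1))                  # 87.5 either way: can't tell the errors apart
print(weighted_accuracy(weights, drop_do))   # about 91.7
print(weighted_accuracy(weights, drop_not))  # about 58.3: the meaning-flipping error scores far worse
```

The point of the weighted variant is exactly the speaker's: a raw error count gives the harmless dropped "do" and the meaning-reversing dropped "not" the same score.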
This is an example of focusing on the needs of the deaf audience. Imagine the captions going away on a television show. How is a deaf viewer to know whether there's a technical glitch or whether there's just nobody speaking at the moment? A simple bracketed caption like "(Applause)", "(Laughter)", or "(Silence)" can answer that question.

High-quality captions must also be well synchronized. A significant delay between the video and the captions can make a program hard to understand, and I have recently measured delays of over 12 seconds on broadcast television. If you don't think that makes it hard for a deaf person to follow a program, try watching a TV show or a movie with the sound lagging 12 seconds behind the picture. These delays can also lead to a loss of caption data: if the captions are running 12 seconds behind, every time you go to a commercial you're going to lose entire sentences.

And the fourth critical component to caption quality is placement. Nobody wants the captions covering the score on the ball game or the weatherman's map.
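The arithmetic behind that data loss is simple enough to sketch. Assuming the 250-words-a-minute live speech rate mentioned earlier, a 12-second caption lag cut off at a commercial break swallows about 50 words:

```python
# Back-of-the-envelope sketch of caption data lost at a commercial break.
# The 250 wpm live-speech rate comes from the talk; the rest is arithmetic.

def words_lost(delay_seconds: float, words_per_minute: float = 250.0) -> float:
    """Dialogue still un-captioned when the broadcast cuts away.

    Captions lag the audio by delay_seconds; whatever hasn't been
    displayed by the time of the cut is simply never seen.
    """
    return delay_seconds * words_per_minute / 60.0

print(words_lost(12))  # -> 50.0 words, easily a few whole sentences
```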
So what can we do to help?
First, we can care. We wouldn't tolerate grainy pictures, sloppy camera work, poor audio quality, or bad lighting. Why should we tolerate bad captioning? The World Health Organization estimates 360 million people around the world have disabling hearing loss. 360 million people! That's equivalent to the entire population of the United States. Those people matter!

The next thing we can do is talk about it. Broadcasters don't get a lot of feedback from their deaf audiences.
A lot of deaf people don't want to be seen as complainers, or they feel they should be grateful for whatever they're getting. But their captions are every bit as important as our sound.

Quality captioning serves everybody. Captions can help children learn to read. Captions aid in fighting adult illiteracy. Captions help us follow a TV program in a noisy airport, bar, or gym. But captions are a lot more than that to deaf people. Captions can save lives in an emergency broadcast, telling deaf people when they need to evacuate their homes or what roads to avoid. And the same technology that's used for television closed captioning is used in educational and business settings to give deaf people equal access.

Captioning has been around now for over 30 years. It's time to retune, refocus, and remember who closed captioning was created for in the first place.