
Video Script for Perception


Video script for Perception from the page Tools and Techniques (in the 2020 Update version).


Script

Each scene below lists its sequence number (Seq.), time code, audio narration, and visual description.

Seq. 0 (0:00 - 0:00)
Audio: How people with disabilities use the web; perception - hearing, feeling and seeing.
Visual: [Front plate.] Box with the text “Perception - hearing, feeling and seeing”.

Seq. 1 (0:00 - 0:00)
Audio: When content is accessible, people can perceive it through different senses depending on their needs and preferences. For example, some people who can’t see a screen or hear audio rely on websites and apps that can present the information in different ways.
Visual: We see a collage of five people going about their lives in their different settings [they are the protagonists that we will see in the coming scenes].

Seq. 2 (0:00 - 0:00)
Audio: People who are blind rely on hearing or touch, rather than sight. For video, content creators need to ensure that visual information is provided in auditory form as well. Often these descriptions of visual information can be placed in existing pauses in the audio, or the script can be created from the start with sufficient description of the scenes, characters, and other important visual information. [Brief pause to listen to a short slice of audio description while the video is being shown in the visuals.]
Visual: [New scene; zoom into the context/setting of one of the people (#16) shown in the collage in the previous scene.] We see the person (#16) starting an online video and switching on the “Audio Description” feature. [The person (#16) is wearing dark glasses but otherwise there is no visible indication that the person (#16) has any disability.] We see a brief part of the video that the person (#16) starts while we hear the audio descriptions for that scene.

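Note: on the web, one way to attach audio descriptions to a video is a text track of kind “descriptions”. A minimal sketch follows; the file names are placeholders, and because player support for this track kind varies, many productions instead publish a separately mixed, described version of the audio.

    <video controls>
      <source src="video.mp4" type="video/mp4">
      <!-- "video.mp4" and "descriptions.vtt" are placeholder names.
           A "descriptions" track holds text that a player or screen
           reader can speak during pauses in the main audio. -->
      <track kind="descriptions" src="descriptions.vtt" srclang="en"
             label="Audio description">
    </video>
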
Seq. 3 (0:00 - 0:00)
Audio: People who are blind may use assistive technology on computers and mobile phones called screen readers. These software tools read the information on the screen out loud, or they can present it in braille. Screen readers process websites and apps much better when headings, lists, links, buttons, and other structures are correctly coded. [Brief pause to listen to a short slice of screen reader output (through the headphones) at moderately high speed while the video is being shown in the visuals.]
Visual: [New scene; zoom into the context/setting of one of the people (#3/Ilya) who is featured in a separate video (see Ilya’s script).] We see the person (#3/Ilya) using a screen reader and headphones. We scan over the content on the screen as the narration mentions these structures. [This scene is similar to scene 2 in the Visual script and to scene 2 in the Ilya script.]

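Note: “correctly coded” here means using native semantic elements, so that screen readers can announce roles such as “heading”, “list”, or “button”. A minimal sketch, with invented page content:

    <h1>Weekly news</h1>
    <nav>
      <!-- a real list lets screen readers announce "list, 2 items" -->
      <ul>
        <li><a href="/articles">Articles</a></li>
        <li><a href="/podcasts">Podcasts</a></li>
      </ul>
    </nav>
    <!-- a native button, rather than a styled <div>, is announced
         as "button" and is keyboard-operable by default -->
    <button type="button">Play podcast</button>
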
Seq. 4 (0:00 - 0:00)
Audio: People who are deaf-blind also use screen readers to present the information using a device called a refreshable braille display. Braille characters are a combination of raised or lowered dots, which people read by scanning over them with their fingertips.
Visual: [New scene; switch to person (#7/Noor) who is featured in a separate video (see Noor’s script).] We see the person (#7/Noor) using a portable braille display to read what’s on the screen. [We see the screen turned off.] The person (#7/Noor) is scanning the braille display with their fingertips, and switching to the keyboard to type. [This scene is similar to scene 7 in the Auditory script.]

Seq. 5 (0:00 - 0:00)
Audio: Like people who are blind or deaf-blind, people who are Deaf or hard of hearing may also use haptic feedback; for example, through vibration alerts on a mobile phone instead of auditory notifications.
Visual: [New scene; switch to person (#11); there is no visible indication that the person (#11) has any disability.] We see the phone of the person (#11) vibrating when a notification appears on the screen. The attention of the person (#11) is drawn to the phone because of the vibration.

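Note: besides system-level vibration alerts, web apps themselves can request haptic feedback through the Vibration API. A minimal sketch; support varies by browser and device, so the code feature-tests first:

    <script>
      // Not all browsers implement the Vibration API, so check first.
      if ("vibrate" in navigator) {
        // Vibrate-pause-vibrate pattern, in milliseconds.
        navigator.vibrate([200, 100, 200]);
      }
    </script>
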
Seq. 6 (0:00 - 0:00)
Audio: People who are Deaf or hard of hearing often rely on seeing instead of hearing. For example, while watching a video they may use captions or sign language as an alternative to the audio.
Visual: [New scene; switch to person (#6/Martine) who is featured in a separate video (see Martine’s script).] We see the person (#6/Martine) in an online meeting with sign language interpreters pinned and with captions turned on. [This scene is similar to scene 2 in the Martine script.]

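Note: for web video, captions are typically attached as a WebVTT text track; the file names below are placeholders. Unlike subtitles, captions also convey meaningful non-speech sounds.

    <video controls>
      <source src="meeting.mp4" type="video/mp4">
      <!-- kind="captions" (not "subtitles") signals that the track
           also describes sounds such as [door slams] or [phone rings] -->
      <track kind="captions" src="captions-en.vtt" srclang="en"
             label="English captions" default>
    </video>
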
Seq. 7 (0:00 - 0:00)
Audio: Some people with language disabilities use tools to convert content into formats they can better understand; for example, converting text into symbols. The symbols could be pictures of the words, such as a picture of an apple instead of the word “apple”. They might also be recognizable icons for certain functionality, such as a home icon for “back to home” or a cash icon for “purchase”.
Visual: [New scene.] We see the person (#22) using adaptation tools to show symbols instead of text. The focus is more on the symbols on the screen appearing in sequence with the narration than on the person. [Note: this is someone using augmentative and alternative communication (AAC).]

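Note: adaptation tools need a machine-readable hook in the content to swap text for symbols. The attribute below, data-symbol, is invented here purely for illustration; the W3C WAI-Adapt work is developing real vocabulary for this kind of mapping.

    <!-- "data-symbol" is a hypothetical attribute, used only to
         illustrate how a tool might map text to symbols or icons -->
    <a href="/" data-symbol="home">Back to home</a>
    <button type="submit" data-symbol="purchase">Buy now</button>
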
Seq. 8 (0:00 - 0:00)
Audio: [Individual protagonists:] These are ways to make technology work for me. [Narrator:] Accessibility: It’s about people.
Visual: [New scene.] We see the five protagonists from the previous scenes. They are looking into the camera as they speak their lines to the viewers [either individually or all at the same time, to be decided]. We see more and more protagonists from the other videos appear on the screen [to illustrate many people] as the narrator speaks their line. [Note: some protagonists might be signing rather than speaking, or speaking with the help of software or hardware tools.]

Seq. 9 (0:00 - 0:00)
Audio: For more information on how people with disabilities use the web, visit w3.org/WAI.
Visual: [End plate.] We see the URL from the narration.
