This is an unpublished draft preview that might include content that is not yet approved. The published website is at w3.org/WAI/.

Video Script for Dhruv (was Martine)


Video script for Dhruv (formerly Martine) from the page Stories of Web Users (in the 2020 Update version).


Summary

Script

Seq. Time Audio Visual
0 0:00 - 0:00 [Narrator:] How people with disabilities use the web; Dhruv, older adult student who is Deaf. [Front plate.] Box with the text “Dhruv, older adult student who is Deaf.”
1 0:00 - 0:15 [Dhruv:] Hello! I’m Dhruv. I’m Deaf. I can hear some sounds but not enough to understand speech. Sign language is my native language – I think and dream in signs. We see Dhruv signing directly to us viewers [documentary style into the camera]. We briefly see a wide shot of Dhruv and realize that he is communicating through a sign language interpreter who is speaking what he signs [the voice we hear].
2 0:15 - 0:35 I recently enrolled in an online degree. For my classes, I schedule interpreters who interpret the lectures and conversations, and voice for me. I need video conferencing apps with functionality to “pin” videos of the interpreters so that I can always see them. [New scene.] We see Dhruv in a video conference meeting with multiple people online. We see two sign language interpreters join the meeting. The windows for the interpreters are labeled accordingly, and we see Dhruv pay particular attention to these two windows. We see Dhruv use a “pin to screen” function for the interpreters and for the current speaker. We see him signing back to the interpreter to speak up in the meeting. We don’t really see the exact signs or hear the interpreter while the scene transitions; we just see that he is actively participating in the meeting.
3 0:35 - 1:15 Some lectures have realtime captioning typed by human captioners. This is more accurate than automatic captions, which often don’t recognize specialized terms in the lectures. For our assigned videos, I rely on good captions. These are edited so that each sentence appearing on the screen is not too long and is synchronized with the audio. I also often find myself needing to adjust the text size and colors of the captions, to be able to read them better. Some apps also allow me to move the captions to the top or bottom of the video, so that they are not in the way. [New scene.] We see Dhruv watching an online lecture with captions. [We see that the captions indicate who is speaking, but we don’t see enough to actually read the entire text.] We see Dhruv switching from the live lecture to a pre-recorded video (e.g. browsing through a collection of videos and selecting one of them). We see Dhruv switching on captions for that video and adjusting the text size. We also see Dhruv moving the captions above the video, where he can see them better.
4 1:15 - 1:40 In some situations, I find myself lip reading. For example, when I’m on unplanned calls without interpreters or other people who can sign. I learned lip reading over the years because of such situations, but it’s not always reliable for me. I need to see the person’s mouth and have them speak clearly, and seeing their gestures also helps. Yet it’s exhausting and not really something I can keep doing for too long. [New scene.] We see Dhruv in a video conference meeting with multiple people online, similar to that in scene #2. This time, there are no sign language interpreters present. He is focusing on the mouth of the person speaking. He is trying to understand what they are saying, but the person sometimes turns away from the camera (e.g. to point at a board or otherwise gesture). Dhruv is looking increasingly tired.
5 1:40 - 1:55 I also use the chat function. Some chat tools support realtime text rather than line-by-line text messages only. This is much more interactive, and I can engage with my friends in text conversations as others do by voice. [Continuation from previous scene.] We see Dhruv using the chat function to exchange messages with other people on the call. Dhruv is engaged in the discussion and smiling as he types.
6 1:55 - 2:20 Of course, the classes also come with a lot of reading. I’m fine with that, except when the writing is unnecessarily complex and without structure like lists and headings. People don’t realize that for me reading text involves extra effort, because sign language is my native language. So, as for everyone else, clear and simple writing makes text easier to understand. [New scene.] We see Dhruv having difficulty understanding a page with long, justified paragraphs, little spacing, few and hardly distinguishable headings, and no structure.
7 0:00 - 0:00 [Dhruv:] You can help make technology accessible to me. [Narrator:] Accessibility: It’s about people. [New scene.] We see Dhruv signing his line directly to us, as in the first scene [documentary style into the camera, in the same style and continuing the first scene]. We see more and more protagonists from the other videos appear on the screen [to illustrate many people] as the narrator speaks their line.
8 0:00 - 0:00 For more information on how people with disabilities use the web, visit w3.org/WAI [End plate.] We see the URL from the narration.
