
The stenographer's friend. This 1910 advertisement by the Edison Manufacturing Company promotes the phonograph as a text-based device.

Library of Congress, Motion Picture, Broadcasting, and Recorded Sound Division

Text description of The Stenographer's Friend




Explore further: Talking Book Timeline

Reading Sound

As braille was being developed and debated, the Talking Book arrived as an alternative medium for literacy and “from the start [for some], the Talking Book was destined to overtake braille, speedily and permanently, as the broadest channel of literature and information for blind people of all ages” (Koestler, 2004, p. 144). The American Foundation for the Blind’s Talking Book program began in 1932 with support from the Carnegie Foundation; it included the production of both audio recordings and playback machines. Talking books were desired not only for convenience but also because some blind readers did not have the tactile sensitivity6 necessary for braille finger-reading; demand for talking books increased when blinded veterans returned from war. Talking books were originally recorded by human voices on phonograph records, then on audiocassette. More recently (beginning in 2009), the National Library Service began distributing digital books through its Digital Talking Books program. Beyond the government-funded Talking Book program, audio books for individuals with a range of print disabilities are available through Bookshare and Learning Ally (formerly Recording for the Blind and Dyslexic), which is “the largest producer of recorded textbooks” (Kendrick, 2007).


Further, with improvements in the availability and portability of audio playback devices and the development of text-to-speech technologies, talking books have become more popular and mainstream; audio books (particularly mass market best-sellers) are an increasingly available reading option for sighted readers. However, digital talking book technology for individuals with print disabilities is far more functional than commercial digital book technology: DAISY (Digital Accessible Information System), a common standard for digital talking books meant for individuals with print disabilities, allows readers to search and scan a text, set bookmarks, and add highlights and notes. DAISY allows audio reading to be dynamic and spatial rather than linear, as it is with most commercial audio books.

A Literacy of Sound

Inscribing speech, and thereby replacing writing, was an early intention for audio recording technology, even outside of blindness advocacy. As media historian Lisa Gitelman (1999) describes, “The first glimpse of a paperless world. . . was exhilarating. Paperlessness was a way to keep things live, to save the vibrance and authenticity of experiences” (p. 65). Edison originally viewed his phonograph as a business machine, not the read-only music playback device it became. Many of the primary purposes he envisioned for the device were sound-based literacy practices: letter writing and dictation, phonographic books for the blind.7 A literacy of sound.


The possibilities of recorded sound as a mode of literacy were imagined in Edward Bellamy’s 1889 piece of speculative fiction, “With the Eyes Shut.” In the story, train travelers entertain themselves with phonographed books, listening by holding two-pronged forks (called “indispensables”) up to the ear. Clocks announce the time, correspondence is conveyed through sound, and print literacy is on the wane. Bellamy’s narrator speaks to the pleasures of an audio book:

A good story is highly entertaining even when we have to get at it by the roundabout means of spelling out the signs that stand for the words, and imagining them uttered, and then imagining what they would mean if uttered. What, then, shall be said of the delight of sitting at one's ease, with closed eyes, listening to the same story poured into one's ears in the strong, sweet, musical tones of a perfect mistress of the art of story-telling, and of the expression and excitation by means of the voice of every emotion? (Bellamy, 1889, n.p.; italics added)

Bellamy describes the loss that comes from speech being conveyed through signs, the printed words presenting a barrier between the story and the reader; the story assumes “that utterance is the root of all writing and that arbitrary signs are its bane: artificial, unnatural, irrational, and imperfect” (Gitelman, p. 65). 

Bellamy’s embrace of sound as a more direct and delightful mode of literacy reflects Plato’s concerns about the silences of writing, as described in the Phaedrus: “Writing is unfortunately like painting; for the creations of the painter have the attitude of life, and yet if you ask them a question they preserve a solemn silence.” The Phaedrus returns us to the past, when writing was “new media,” and offers us an opportunity to explore assumptions about the value of writing. In the dialogue, Socrates argues that writing seems lifelike, as if the writer is present and speaking, yet if the reader poses a question of the writing it will not/cannot respond. In Plato’s view, the very things that we currently value about writing are what make it inferior to the spoken word. Writing leaves words on their own where “they cannot protect or defend themselves” (Phaedrus, n.p.). Or, as Peter Elbow (2012) colloquially summarizes Plato’s view: “This newfangled technology of writing ruined all our memories” (Vernacular Eloquence, p. 41).

The role of sound as a mode of literacy has been explored in the literature on blindness and rehabilitation beyond the work of the NFB, but with little resolution, perhaps because of what Phil Hatlen (1996), former superintendent of the Texas School for the Blind and Visually Impaired, describes as the “historical ambivalence of our profession with regard to reading through listening” (p. 173). For many professionals within blindness education and rehabilitation, listening is viewed as an option for literacy within the context of a broader definition of reading and writing: “Reading could be defined better as the recognition, interpretation, and assimilation of the ideas represented by symbolic material, whether it is displayed visually, tactilely, or aurally” (Tuttle, p. 173) or simply “a means by which to communicate with others” (Hatlen and Spungin, p. 390). Regardless of pronouncements against the validity of sound as a mode of literacy, many blind individuals do use listening to access texts. “Do we simply dismiss this use of reading by listening as not literacy?” (Hatlen and Spungin, p. 391).

Multimodality

Within composition and literacy studies, a shift towards sound as literacy is signaled by the New London Group’s 1996 manifesto on multiliteracies, which argued for “the increasing multiplicity and integration of significant modes of meaning-making, where the textual is also related to the visual, the audio, the spatial, the behavioral, and so on” (n.p.). While the New London Group’s expanded definition of literacy addressed a wide range of modes, the visual has received dominant focus, both in subsequent research by members of the New London Group and more generally in composition and rhetoric research. However, interest in sound studies seems to be burgeoning; as Sirc and Ceraso noted recently, “Though the history of composition is teeming with clusters of scholarship about aurality, disciplinary excitement about sound and music has become noticeably amplified in the past five years” (n.p.). The “sonic turn” in the field can be marked by a 2006 special issue of Computers and Composition: Sound in/as Composition Space; in the introduction to the issue, guest editors Cheryl Ball and Byron Hawk argue that since the field has moved “from linguistic to visual meaning-making, a logical progression is to include other modes of meaning including audio” (p. 263).

In 2009, coincidentally the same year as the NFB’s report on the braille literacy crisis, Cynthia Selfe “encourage[d] teachers and scholars of composition, and other disciplines, to adopt an increasingly thoughtful understanding of aurality and the role it—and other modalities—can play in contemporary communication tasks” (p. 616). A compelling conflict is presented when a dominant voice within our own field of research makes a strong argument for listening as literacy, while others state adamantly that listening is not literacy. We need to take the conversations about the changing nature of literacy outside of our discipline, to view their impact where, as the NFB says, “the rubber meets the road.” In the 2006 special issue of Computers and Composition mentioned previously, the editors note that disability and accessibility are important topics missing from the collection; I argue that the experiences of individuals with disabilities should not only be considered in our explorations of multimodal literacies but are essential for understanding how literacy functions through a range of modes.

Notes

6Tactile sensitivity is considered a key part of braille readiness, and may include being able to locate braille marks on a page and grading textures of sandpaper (McComiskey). Tactile sensitivity can diminish with age or with conditions such as diabetes. There is disagreement, however, about whether tactile sensitivity is a physiological state or a skill that can be developed (Cryer and Home, 2011).

7Another blindness-related innovation that Edison explored was raised printing. The process was never realized, but included coating pages with a variety of chemicals and writing with arsenic ink.