STIRRED, NOT SHAKEN: AN ASSESSMENT REMIXOLOGY
Susan H. Delagrange, Ben McCorkle, and Catherine C. Braun
The three of us are in concert on several aspects of our approaches to assessment. Perhaps the most prominent shared principle involves designing flexibility into the process of evaluation, which is also one of the more consistent bits of advice from other scholars on the subject (cited in this chapter and throughout this collection). We do not propose a unified, homogenized approach to assessment, which might achieve brevity and clarity but would fail to respond to the specific needs and concerns of individual programs. Instead, we offer some models for evaluation that take into account the responsiveness and complexity inherent in a contextualized rhetorical approach to the composing process.
In addition to flexibility, another shared principle is transparency. Letting students see the process—giving them a glimpse “behind the curtain”—demystifies the act of evaluation and leads to a greater sense of understanding and control. The result is more trust in the process and an improved, less hierarchical relationship between students and teachers.
As we see it, seeking buy-in from students is also essential to this process. Student engagement helps them take ownership of the assessment process, allowing them to experience the contextual nature of textuality and making the work they do in our classes more meaningful to them. Ideally, this experience transfers to other classes and to other contexts in which they must create or assess texts. As Manion and Selfe (2012) observed of wikis, digital remix assignments “can shift the social dynamics of the classroom” and consequently “represent an opportunity to examine approaches to assessment that help us think differently about how we teach and how we engage our students as they learn” (p. 26). This is true, they pointed out, because “assessment is tied to wider systems of activity that reflect particular local, field-specific ways of thinking as well as the immediate contingencies of an always evolving context” (p. 26). Our respective practices take this always-evolving context into account by incorporating an evolving rubric, utilizing distributed assessment practices, and blending field-specific ways of thinking about texts (i.e., rhetoric and composition) with those of other fields (design, etc.). Ultimately, we see these assessment practices functioning in ways that are critically activist, in that they introduce students to the politics underlying the educational process.
A final shared principle involves designing assessment instruments that encourage critical thinking. A sustained, collaborative focus on both formative and summative assessment throughout the composing process strengthens habits of engagement, persistence, responsibility, and “thinking about thinking.” These qualities of mind extend beyond multimodal remix and enhance student experiences in school, at work, and in their communities.
As we look to the future, we realize that the question of effective assessment models is far from settled: technological, institutional, and cultural changes will continue to make demands on our ability to help students make sense of the work they do. Nevertheless, by adhering to these principles, digital media compositionists can ensure that they are building their models on solid, sustainable ground.