Monthly Archives: February 2014

Reducing the Clutter in the Mindmap

I realized that I am off on my Mind Map updates. I thought they were for the NEXT week’s work, not the previous one’s, and that I had to do the new reading before updating. So last week’s more comprehensive updates were really for both Foucault pieces, and the Bazerman, Miller, Popham, and Digital Writing Assessment material is for AFTER this week’s class. Sometimes I’m a little slow on the uptake.

So this week I spent time cleaning up my Popplet and attempting to reorganize it. I made two color schemes, corresponding to structuralism and deconstruction, aligning Biesecker and Foucault together, and Bitzer and Vatz together (and the genre folks will end up here, I suspect): red for deconstruction (it bleeds …. it hurts us) and blue for structure (it calms us … it gives a false sense of unity). I actually deleted several Popples that were no longer needed, along with some extraneous connections. I also came up against the limitations of the interface. I wanted to create a “super-Popple” containing other Popples, to show nested categories rather than simple linear connections. My exercise became one of dealing with the limitations rather than freely making connections. I believe that this iteration may prepare me to absorb the next round of theorists, but I suspect my basis for categorization will shift again as the map seeks to encompass more.

“Toward a Rhetorically Sensitive Assessment Model for New Media Composition” – Crystal VanKooten Annotated Bibliography Entry

VanKooten, Crystal. “Toward a Rhetorically Sensitive Assessment Model for New Media Composition.” Digital Writing Assessment and Evaluation. Eds. Heidi A. McKee and Dànielle Nicole DeVoss. Computers and Composition Digital Press, 2013. Web. 3 Feb. 2014.

Situating herself within three established assessment models (Paul Allison, 2009; Eve Bearne, 2009; and Michael Neal, 2011), Van Kooten creates a new model meant to assess new media composition. This adaptable model takes into account both process and product and both functional and rhetorical literacies, and it requires student self-assessment and reflection. Van Kooten details the theory and framework for the model, then demonstrates, using student voices, how it is implemented and applied. The chapter functions as a true multi-modal text, replete with seven short videos (with accompanying transcripts) that demonstrate the kinds of “multifaceted logic” and “layers of media” employed, using a variety of rhetorical and technical features to accomplish a specific purpose for a particular audience.

Van Kooten narrates the difficulty of creating a new media assessment model and her journey toward the model she unveils here. She notes that her first attempts, made with students who collaboratively created the rubric for their own work, were outgrowths of print media rubrics, and that they quickly revealed their shortcomings because of the affordances of the various media that could be incorporated. For example, arguments are presented differently using sound and visuals, and what constitutes evidence and organization varies by medium. Michael Neal (2011) proposed that the proliferation of multi-modal texts has created what he calls a “kairotic juncture,” an opportunity for a new model. Van Kooten responds to that opportunity, however cautiously, noting that “there is currently no agreed-upon language or vocabulary for discussing new media texts,” nor any stability in new media genres. She hopes the model she proposes opens the conversation about new media assessment and creates the opportunity for further evolutions that remain grounded in “a solid theory of writing assessment itself.”

Van Kooten offers three criteria for assessment:

  1. fulfillment of purpose and direction to audience;
  2. the use of a multifaceted logic through consideration of layers of media; and
  3. the use of rhetorical and technical features for effect.

Van Kooten’s model for assessment of New Media Composition includes both functional and rhetorical literacies.

She also offers two worksheets to help students set both functional and rhetorical goals for their work, involving them in the assessment process and requiring metacognition and reflection — ways of assessing both process and product.

It is my opinion that Van Kooten has created a plausible model, grounded in both (print) writing assessment theory and multi-modal composition theory, that will become an oft-cited text as this conversation continues. This article is a useful source for a go-to model that can be adapted for classroom use.

As mentioned, there are seven accompanying videos that demonstrate new media compositions and turn the chapter itself into a multi-modal piece. Here is a metacognitive piece where one of Van Kooten’s students overdubs his piece with his own narration of the process. This itself is a viable product, as we have director’s cuts with commentary on the special editions of movies and TV episodes, where the audience is privileged to have a window into the mind of the director or actor.

(I am unable to upload and embed the video as it exceeds the maximum allowable file size for WordPress).



Implications Beyond Those Noted — Digital Writing Assessment

I enjoyed reading the Foreword, Preface, and Afterword of the Digital Writing Assessment and Evaluation book. I was forced to read on screen, which I am getting better at, although I prefer paper and being able to use an actual pencil to annotate. These were not too “heady” texts, so I didn’t feel stylus withdrawal. The compulsion to mark up, and the feeling of loss, wasn’t as strong as it is when I try to read something more difficult that requires more processing and connections.

As a community college English faculty member, I feel compelled to enter this conversation, perhaps in some of the gaps identified by McKee and DeVoss in their foreword. Interestingly, they did not identify the dearth of scholarship about writing at the community college, let alone digital writing and digital assessment, as one of these gaps. I checked, and not one of the chapters in this volume takes community colleges as its object of study, nor is any chapter written by or in collaboration with a community college faculty member. This is a HUGE gap, given that nearly half of all undergraduates in the United States are enrolled at two-year schools, and, because of their open-access nature, they are among the biggest users of Automated Essay Scoring (AES). In addition, they are less likely to have a coordinated writing program, although they are under the same pressures from accrediting bodies as the four-year schools to demonstrate student learning of course and program outcomes.

I was especially struck by the lack of regard for the implications when I read the footnote to the Foreword: “1. Interestingly, a form of credit is an option available for $190 in Coursera’s (Duke’s MOOC provider) ‘Signature Track.’” Yet, as Steve Krause (2013) astutely noted, because Duke itself will not recognize that credit but offers it to other institutions, “it seems a little shady to me that places like Duke are perfectly happy to offer their MOOC courses for credit at other institutions but not at Duke.”

Where’s the “Duh” Hammer? Hello????? Duke and other universities may well find themselves with students who earned their credit from a MOOC, despite their posturing that they do not accept it. Accepting (and OFFERING) credit for the Comp MOOC creates a slippery slope for Duke because of articulation agreements with colleges, such as community colleges, that may allow credit for the MOOC. Two-year schools are under increasing pressure to award credit for prior learning, in a rush by politicians to decrease the time to a degree (a barrier, they note, to completion of a credential). Thus, the number of ways to be awarded credit for courses (CREDIT, mind you, not placement) increases; examples include CLEP, DANTES, IB, Cambridge, and AP. So, if you earn an associate’s degree from a two-year school that awarded you credit for a MOOC, then you are eligible for guaranteed transfer to some pretty elite institutions, including Duke. As stipulated in the agreement:

Transfer students will be considered to have satisfied the UNC Minimum Course Requirements (MCR) in effect at the time of their graduation from high school if they have:

  1. received the associate in arts, the associate in science, the associate in fine arts, the baccalaureate or any higher degree, or
  2. completed the 44-hour general education core, or
  3. completed at least six (6) semester hours in degree-credit in each of the following subjects: English, mathematics, the natural sciences, and social/behavioral sciences, and (for students who graduate from high school in 2003-04 and beyond) a second language.
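Read as a decision rule, the MCR language above is a simple disjunction of three conditions. Here is a minimal sketch of that logic; the field names are my own illustrative choices, not taken from the agreement itself:

```python
# Toy encoding of the three MCR conditions quoted above.
# Field names and the 2003-04 graduation cutoff handling are illustrative.
SUBJECTS = ("english", "mathematics", "natural_sciences", "social_sciences")

def satisfies_mcr(has_degree: bool, completed_44_hour_core: bool,
                  hours_by_subject: dict, second_language_hours: int = 0,
                  hs_grad_year: int = 2004) -> bool:
    """A transfer student satisfies MCR if ANY one condition holds."""
    # Condition 1: an associate, baccalaureate, or higher degree.
    # Condition 2: the 44-hour general education core.
    if has_degree or completed_44_hour_core:
        return True
    # Condition 3: at least six semester hours in each listed subject,
    # plus a second language for students graduating high school 2003-04 on.
    six_in_each = all(hours_by_subject.get(s, 0) >= 6 for s in SUBJECTS)
    needs_language = hs_grad_year >= 2004
    return six_in_each and (second_language_hours >= 6 or not needs_language)
```

The point of the sketch is simply that any one branch suffices, which is why the degree path (condition 1) makes the underlying MOOC credits invisible to the receiving institution.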

Not only would these four-year institutions not go back to see how the credits comprising the degree were earned at the associate-degree-granting institution; the articulation agreement appears to prohibit them from doing so. While no community college currently awards credit for a MOOC, the door has been opened by a 2013 Florida law, which allows MOOC credit in certain cases and requires K-12 schools and colleges to create rules and procedures for accepting credit for these courses. Between the pressure to award more credit for prior learning and the inroads MOOCs have made with politicians who see them as a cost-saving measure (though not so much with students who want to use them for college credit on the cheap), more discussion of this topic is sure to follow, and community colleges cannot be left out of these conversations.

Here are three quotes from Edward White in the Afterword that I would like to address:

“This pencil [meaning the computer] has gotten out of hand and has entered our bloodstream.”

“But what is not discussed is what I consider the elephant in the room, which from my perspective is distinctly oppressive: assessment by computer and by various instructional platforms. While we talk pleasantly about the brave new world of writing that computers have ushered in, a darker side of technology has been making important inroads into the very center of writing itself.”


“Students will write to machines, a natural enough move for generations brought up challenging machines on computer games, rather than writing for their peers or their teachers. Students will write to machines just as surely as they now write their SAT or AP essays to the armies of dulled readers” (Afterword).

Again, leaving community colleges out of this equation is at compositionists’ peril. Virginia’s community colleges, for example, have partnered with McCann and Associates, a division of Vantage Learning, to use its IntelliMetric engine to develop the Virginia Placement Tests, which all students entering the college must take in mathematics and in English reading and writing. The results of this test determine whether you require placement in developmental coursework or are “ready for college-level work.” The writing test can be gamed by playing to structure: repeat key terms, use a clear thesis and conclusion, use markers and transition words (first, next, then), and you’ll score well, regardless of substance. The program counts the number of words per sentence and per paragraph, looks for expected clauses and distance from punctuation, and checks for common errors entered into its database. You might suspect that it doesn’t do well with ESL students or students for whom Standard Academic Discourse is an L2. Furthermore, students have the choice between two prompts, which point to two kinds of essays: one more expository, the other more analytical.

Some of my colleagues at Northern Virginia Community College analyzed data from the tests, which was very difficult to obtain, as the company keeps results close to the vest: students and instructors see only the resulting placement, not an actual score, let alone an explanation of how the score was derived. They found that students who self-selected the harder prompt tended to score lower on the essay portion but substantially higher on the multiple-choice portion. However, since the two scores are combined for a placement (using an unknown algorithm), students with very high scores on the reading and sentence-correction portion, who also challenged themselves with the more rhetorically difficult essay, were being placed in developmental English, while other students who took the easier prompt (which could be answered SOL-style) and performed poorly on the closed-response questions testing reading comprehension and textual analysis could be placed into the credit courses. After this inherent testing bias was uncovered at the Developmental Education Peer Group conference in Fall 2013, the VCCS requested that the prompts be changed and has promised that they are now “more aligned.” I have grave concerns about outsourcing the scoring of essays to a for-profit company that refuses to share metrics or results with faculty.
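To make the gaming problem concrete, here is a toy sketch of how a purely surface-feature scorer behaves. This is my own illustration, not IntelliMetric or any real AES product; the features (words per sentence, transition-word counts) come from the description above, but the weights and the "ideal" sentence length are invented:

```python
# Hypothetical surface-feature "scorer": rewards structure, ignores substance.
# NOT a real AES engine; weights and thresholds are invented for illustration.
import re

TRANSITIONS = {"first", "next", "then", "finally", "however", "therefore"}

def surface_score(essay: str) -> float:
    """Score an essay (0-100) on shallow structural features alone."""
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    words = essay.lower().split()
    if not sentences or not words:
        return 0.0
    avg_len = len(words) / len(sentences)  # words per sentence
    transition_hits = sum(w.strip(",;") in TRANSITIONS for w in words)
    # Reward sentences near an arbitrary "ideal" length of 20 words...
    length_score = max(0.0, 1.0 - abs(avg_len - 20) / 20)
    # ...and reward transition markers, capped after three.
    transition_score = min(1.0, transition_hits / 3)
    return round(50 * length_score + 50 * transition_score, 1)
```

Under a scheme like this, a vacuous essay peppered with "first, next, then" and mid-length sentences outscores a terse but substantive answer, which is exactly the bias described above: the metric measures form, and students who learn the triggers can play to it.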

As White notes, our students are already well-versed in writing to the SAT and AP readers, and using stringent rubrics to grade writing for in-common assessments across course sections invites further standardization. Students inherently understand the rhetorical nature of audience; they spend a great deal of time figuring out what the teacher wants to hear. If they are writing for a computer program for a grade, they will quickly figure out the triggers to obtain a good one without actually saying anything factual, accurate, or coherent. As I tell my students, you can check off everything on a rubric as complete, but that does not determine whether you have actually communicated with your audience. Writing is more than the sum of its parts, which, at this point, cannot be effectively measured by an emotionless algorithm that cannot decode symbolic representations.

Works Cited

Bernstein, Kenneth. “Warnings from the Trenches.” Academe. American Association of University Professors. January-February 2013. Web. 3 Feb. 2014.
Fain, Paul. “College Credit Without College.” Inside Higher Ed. 7 May 2012. Web. 2 Feb. 2014.

“Independent Comprehensive Articulation Agreement Between Signatory Institutions of the North Carolina Independent Colleges and Universities and the North Carolina Community College System.” Presidents of the Signatory Institutions of the North Carolina Independent Colleges and Universities and the State Board of the North Carolina Community College System. 2007, revised 2010. Web. 2 Feb. 2014.
Kolowich, Steve. “A University’s Offer of Credit for a MOOC Gets No Takers.” Chronicle of Higher Education. 8 July 2013. Web. 2 Feb. 2014.
Lunsford, Andrea. “Foreword.” Digital Writing Assessment and Evaluation. Eds. Heidi A. McKee and Dànielle Nicole DeVoss. Computers and Composition Digital Press, 2013. Web. 2 Feb. 2014.
“SAT Reasoning Test – Essay Scoring Guide.” CollegeBoard. n.p., 2012. Web. 3 Feb. 2014.
“Transfer Agreements with Independent Colleges.” Sandhills Community College. n.d. Web. 2 Feb. 2014.

“‘Watered Down’ MOOC Bill Becomes Law in Florida.” Inside Higher Ed. 1 July 2013. Web. 2 Feb. 2014.

White, Edward M. “Afterword: Not Just a Better Pencil.” Digital Writing Assessment and Evaluation. Eds. Heidi A. McKee and Dànielle Nicole DeVoss. Computers and Composition Digital Press, 2013. Web. 3 Feb. 2014.