Implications Beyond Those Noted — Digital Writing Assessment

I enjoyed reading the Foreword, Preface, and Afterword of the Digital Writing Assessment and Evaluation book. I was forced to read on screen, which I am getting better at, although I prefer paper and being able to use an actual pencil to annotate. These were not too “heady” texts, so I didn’t feel stylus withdrawal. The compulsion to mark up and the feeling of loss weren’t as strong as they are when I try to read something more difficult, something that requires more processing and connections.

As a Community College English faculty member, I feel compelled to enter this conversation, perhaps in some of the gaps identified by McKee and DeVoss in their foreword. Interestingly, they did not identify the dearth of scholarship about writing at the community college, let alone digital writing and digital assessment, as one of these gaps. I checked, and not one of the chapters in this volume deals with community colleges as its object of study, nor is it written by or in collaboration with a community college faculty member. This is a HUGE gap, given that nearly half of all undergraduates in the United States are enrolled at two-year schools, and because of their open-access nature, they are one of the biggest users of Automated Essay Scoring (AES). In addition, they are less likely to have a coordinated writing program, although they are under the same pressures from accrediting bodies as the four-year schools to demonstrate student learning of course and program outcomes.

I was especially struck by the lack of regard for the implications when I read the footnote on the Foreword: “1. Interestingly, a form of credit is an option available for $190 in Coursera’s (Duke’s MOOC provider) ‘Signature Track.’ Yet, as Steve Krause (2013) astutely noted, because Duke itself will not recognize that credit but offers it to other institutions, ‘it seems a little shady to me that places like Duke are perfectly happy to offer their MOOC courses for credit at other institutions but not at Duke.’”

Where’s the “Duh” Hammer? Hello????? Duke and other universities may well find themselves with students who earned their credit from a MOOC, despite their posturing that they do not accept it. Accepting (and OFFERING) credit for the Comp MOOC creates a slippery slope for Duke because of articulation agreements with colleges, such as community colleges, that may allow credit for the MOOC. Two-year schools are under increasing pressure to award credit for prior learning, in a rush by politicians to decrease the time to a degree (a barrier, they note, to completion of a credential). Thus, the number of ways to be awarded credit for courses (CREDIT, mind you, not placement) increases; examples include CLEP, DANTES, IB, Cambridge, and AP. So, if you earn an associate’s degree from a two-year school that awarded you credit for a MOOC, then you are eligible for guaranteed transfer to some pretty elite institutions, including Duke. As stipulated in the agreement:

Transfer students will be considered to have satisfied the UNC Minimum Course Requirements (MCR) in effect at the time of their graduation from high school if they have:

  1. received the associate in arts, the associate in science, the associate in fine arts, the baccalaureate or any higher degree, or
  2. completed the 44-hour general education core, or
  3. completed at least six (6) semester hours in degree-credit in each of the following subjects: English, mathematics, the natural sciences, and social/behavioral sciences, and (for students who graduate from high school in 2003-04 and beyond) a second language.

Not only would these four-year institutions not go back to see how the credits comprising the degree were derived at the associate-degree-granting institution; doing so also appears to be prohibited by the articulation agreement. While no community college currently awards credit for a MOOC, the door has been opened by a 2013 Florida law, which allows MOOC credit in certain cases and requires K-12 schools and colleges to create rules and procedures for accepting credit for these courses. Between the pressure to award more credit for prior learning and the inroads MOOCs have made with politicians who see them as a cost-savings measure (though not so much with students who want to use them for college credit on the cheap), more discussion of this topic is sure to follow, and community colleges cannot be left out of these conversations.

Here are three quotes from Edward White in the Afterword that I would like to address:

“This pencil [meaning the computer] has gotten out of hand and has entered our bloodstream.”

“But what is not discussed is what I consider the elephant in the room, which from my perspective is distinctly oppressive: assessment by computer and by various instructional platforms. While we talk pleasantly about the brave new world of writing that computers have ushered in, a darker side of technology has been making important inroads into the very center of writing itself.”


“Students will write to machines, a natural enough move for generations brought up challenging machines on computer games, rather than writing for their peers or their teachers. Students will write to machines just as surely as they now write their SAT or AP essays to the armies of dulled readers” (Afterword).

Again, leaving community colleges out of this equation is at the peril of compositionists. Virginia’s Community Colleges, for example, have partnered with McCann Associates, a division of Vantage Learning, to use their IntelliMetric engine to develop the Virginia Placement Tests, which all students entering the college must take in Mathematics and in English Reading and Writing. The results of this test determine whether you require placement in developmental coursework or are “ready for college-level work.” The writing test can be gamed by playing to structure: repeat key terms, use a clear thesis and conclusion, and use markers and transition words (first, next, then), and you’ll score well, regardless of substance. The program counts the number of words per sentence and per paragraph, checks for expected clauses and distance from punctuation, and looks for common errors entered into its database. You might suspect that it doesn’t do well with ESL students or with students for whom Standard Academic Discourse is an L2. Furthermore, students have the choice between two prompts, which point to two kinds of essays: one more expository, the other more analytical. Some of my colleagues at Northern Virginia Community College analyzed some data from the tests, which was very difficult to obtain, as the company keeps results close to the vest: students and instructors see only the resulting placement, not an actual score, let alone an explanation of how the score was derived. They found that students who self-selected the harder prompt tended to score lower on the essay portion but substantially higher on the multiple-choice portion.
However, since the two scores are combined into a placement (using an unknown algorithm), students with very high scores on the reading and sentence-correction portions, who also challenged themselves with the more rhetorically difficult essay, were being placed in developmental English, while other students who took the easier prompt (which could be answered SOL-style) and performed poorly on the closed-response questions testing reading comprehension and textual analysis could be placed into the credit courses. After this testing bias was presented at the Developmental Education Peer Group conference in Fall 2013, the VCCS requested that the prompts be changed and has promised that they are now “more aligned.” I have grave concerns about outsourcing the scoring of essays to a for-profit company that refuses to share metrics or results with faculty.
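To make concrete why “playing to structure” works, here is a toy surface-feature scorer. This is purely a hypothetical sketch of the kind of metric-counting the paragraph above describes (transition words, thesis/conclusion phrasing, sentence-length regularity); it is emphatically not the actual IntelliMetric algorithm, whose metrics are proprietary and undisclosed.

```python
# Toy surface-feature essay scorer: a hypothetical illustration of how a
# naive AES engine could reward structure over substance. NOT the actual
# IntelliMetric algorithm, whose workings are proprietary.

TRANSITIONS = {"first", "next", "then", "finally", "furthermore", "however"}
THESIS_MARKERS = {"in this essay", "in conclusion"}

def score_essay(text: str) -> float:
    words = text.lower().split()
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    if not sentences:
        return 0.0

    # Reward transition words (a structural cue, not a meaning check)
    transition_hits = sum(w.strip(",.;") in TRANSITIONS for w in words)

    # Reward explicit thesis/conclusion phrasing
    marker_hits = sum(m in text.lower() for m in THESIS_MARKERS)

    # Reward an "expected" average sentence length (arbitrary 15-25 word band)
    avg_len = len(words) / len(sentences)
    length_score = 1.0 if 15 <= avg_len <= 25 else 0.5

    # Nothing here checks whether the essay is factual, accurate, or coherent.
    return transition_hits + 2 * marker_hits + length_score
```

Under this sketch, a vacuous essay stuffed with “First… Next… In conclusion…” outscores a substantive paragraph that lacks those markers, which is exactly the gaming strategy, and the bias, described above.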

As White notes, our students are already well versed in writing to the SAT and AP readers, and using stringent rubrics to grade writing for in-common assessments across course sections invites further standardization. Students inherently understand the rhetorical nature of audience; they spend a great deal of time figuring out what the teacher wants to hear. If they are writing for a computer program for a grade, they will quickly figure out the triggers to obtain a good one, without actually saying anything factual, accurate, or coherent. As I tell my students, you can check off everything on a rubric as complete, but that does not determine whether you have actually communicated with your audience. Writing is more than the sum of its parts, which, at this point, cannot be effectively measured by an emotionless algorithm that cannot decode symbolic representations.

Works Cited

Bernstein, Kenneth. “Warnings from the Trenches.” Academe. American Association of University Professors. January-February 2013. Web. 3 Feb. 2014.
Fain, Paul. “College Credit Without College.” Inside Higher Ed. 7 May 2012. Web. 2 Feb. 2014.

“Independent Comprehensive Articulation Agreement Between Signatory Institutions of the North Carolina Independent Colleges and Universities and the North Carolina Community College System.” Presidents of the Signatory Institutions of the North Carolina Independent Colleges and Universities and the State Board of the North Carolina Community College System. 2007, revised 2010. Web. 2 Feb. 2014.
Kolowich, Steve. “A University’s Offer of Credit for a MOOC Gets No Takers.” Chronicle of Higher Education. 8 July 2013. Web. 2 Feb. 2014.
Lunsford, Andrea. “Foreword.” Digital Writing Assessment and Evaluation. Ed. Heidi A. McKee and Dànielle Nicole DeVoss. Computers and Composition Digital Press, 2013. Web. 2 Feb. 2014.
“Transfer Agreements with Independent Colleges.” Sandhills Community College. n.d. Web. 2 Feb. 2014.
“SAT Reasoning Test – Essay Scoring Guide.” CollegeBoard. N.p., 2012. Web. 3 Feb. 2014.

“‘Watered Down’ MOOC Bill Becomes Law in Florida.” Inside Higher Ed. 1 July 2013. Web. 2 Feb. 2014.

White, Edward M. “Afterword: Not Just a Better Pencil.” Digital Writing Assessment and Evaluation. Ed. Heidi A. McKee and Dànielle Nicole DeVoss. Computers and Composition Digital Press, 2013. Web. 3 Feb. 2014.