
Improving Writing and Thinking through Assessment


reviewed by David Allen - November 16, 2012

Title: Improving Writing and Thinking through Assessment
Author(s): Teresa L. Flateby (ed.)
Publisher: Information Age Publishing, Charlotte
ISBN: 1607524074, Pages: 288, Year: 2011


Assessments of incoming students’ writing skills are ubiquitous across the higher education landscape. Courses in composition or college writing are now a familiar part of the freshman experience. At the university where I teach, a new assessment of incoming students’ writing was introduced in 2010. More recently, faced with a significant downsizing of the university-wide general education requirements imposed by central administration, faculty fought to make sure the required credits for composition remained unchanged. Like many institutions, my college maintains a writing center for students and a writing across the curriculum program for faculty; these are helpful but insufficient in addressing many students’ writing needs. While there is less explicit attention to students’ critical thinking skills, the level of analysis, inference, persuasive use of evidence, and other thinking skills of many students is a significant concern for faculty.


This book is unique in its attention to assessment and improvement of both students’ writing and thinking skills. It does so without taking the easy road of claiming that we automatically assess students’ thinking when we assess their writing, recognizing that “although related, writing and thinking are not synonymous, nor are they perfectly correlated” (p. 3). Nevertheless, while the book treats the two as separate domains, the only assessments of critical thinking it considers are conducted through students’ writing—an omission that invites attention to other methods for assessing thinking, for example, through visual arts, mathematical problem solving, or oral presentation or discussion. Here, higher educators might learn from assessments developed by innovative secondary schools, such as the assessment of habits of mind at Central Park East Secondary School (Meier, 1996).


The book does not impose its own typology of either writing skills or thinking skills. It takes the view that each institution must define these skills for itself before developing an assessment system and program to support students in developing them. The first part of the book offers suggestions for how to develop an assessment program, as well as resources to help institutions do just that, for example, guidelines for writing assessment from the Executive Committee of the Conference on College Composition and Communication (2006), Linn et al.’s (1991) criteria for performance assessments, and Wiggins’s (1998) questions to consider in developing rubrics. While these will be familiar to educators involved in assessment development, it is useful to have them compiled here. This part also provides examples of assessment tasks and rubrics, many from the University of South Florida, where the editor is director of a unit devoted to academic assessment. These provide useful examples of the principles addressed in this part of the book.


The second part comprises many—perhaps too many (14!)—accounts of assessment programs from the individual chapter authors’ institutions. The chapters are uneven: While some provide detailed descriptions of how the program was developed and instituted, others do not go much beyond articulating the general principles engaged by their institutions. Below, I discuss two that exemplify the former approach, one focused on writing skills, the second on thinking skills.


Laroche and Wood describe three attempts at Doña Ana Community College to create assessment tools for student writing. After successfully developing a rubric used across the college’s composition program, the authors experienced frustration in an attempt to develop a similar tool for writing across the curriculum, i.e., for use in disciplines outside of English. Because the development process “predominately reflected the values of the composition program” (p. 102), adoption by professors in other disciplines was minimal. Learning from their missteps, the composition faculty undertook a much more inclusive process of creating a writing-across-the-curriculum assessment program. This time they involved instructors from other departments in “discussing, developing, revising and piloting the assessment tools,” including not only the rubric but also tools to help instructors design writing assignments and assess their students’ writing. (The tools, like many described in other chapters in this part, are included as an appendix.)


An example of an institution that focused on students’ thinking skills is provided by Kelly-Riley, who describes a university-wide effort at Washington State University to improve its students’ critical thinking outcomes. Not surprisingly, given the book’s approach, the effort built upon the university’s efforts in the assessment of writing. The project developed a guide to rating critical thinking skills. The goal was for faculty across the curriculum to integrate critical thinking skills consistent with “their thinking styles, course levels, and disciplinary conventions” (p. 166). The instrument identifies seven areas of critical thinking, including “identification of a problem or issue,” “establishment of a clear perspective on the issue,” and “recognition of alternative perspectives.”


Once again, success in this story hinges on the broad inclusion in the program of faculty members across disciplines. In this case, the assessment instrument was presented to general education course faculty as something of a menu: “Faculty members chose the criteria they wanted to highlight in their classrooms” (p. 167). A large number of faculty members served as evaluators of student papers collected for the project. The authors describe how they took on the role of “facilitators,” rather than that of “experts,” to productively involve their colleagues in using the tool.


The lessons from these stories are consistent with those of other contributors to the book:


As an institution, define criteria for writing and/or critical thinking;

Involve faculty from across the institution in developing the assessments; and

Provide faculty development in using the assessment(s) to productively support students’ writing and thinking.


The book does not make empirical claims about the effectiveness of the various assessment programs in improving student writing or thinking skills. Instead, it offers practical resources and real-institution examples for the faculties and administrators of other colleges and universities engaged in evaluating, developing, or improving their own assessment programs to support students’ writing and critical thinking.


References


Meier, D. (1996). The power of their ideas: Lessons for America from a small school in Harlem. Boston, MA: Beacon Press.






Cite This Article as: Teachers College Record, Date Published: November 16, 2012
https://www.tcrecord.org ID Number: 16937, Date Accessed: 10/21/2021 8:47:29 PM

About the Author
  • David Allen
    College of Staten Island, CUNY
    DAVID ALLEN is an assistant professor at the College of Staten Island, City University of New York. His most recent book is Coaching Whole School Change: Lessons in Practice from a Small High School. He is currently researching collaborative practices in the theatre arts and teacher learning.