What Can Student Drawings Tell Us About High-Stakes Testing in Massachusetts?


by Anne Wheelock, Damian J. Bebell & Walt Haney - November 02, 2000

Many high-stakes testing policies rest on the belief that attaching consequences to test scores will persuade students of the importance of academics and will motivate them to exert greater effort to achieve at passing levels. This investigation explores this assumption through an examination of students' drawings of themselves taking the Massachusetts high-stakes test. Student drawings conveyed a range of opinions about test difficulty, length, and content. A small minority of drawings depicted students as diligent problem-solvers and thinkers. A larger percentage of drawings portrayed students as anxious, angry, bored, pessimistic, or withdrawn from testing. The overall patterns that emerge challenge the belief that the high stakes associated with MCAS will enhance the motivation and effort of students in a uniform way.

INTRODUCTION

All rumors of [the test's] esoteric and labyrinthine questions are true, and it personally took me over 17 hours to complete the [MCAS].
- Christian Drake, Northampton, Massachusetts, Class of 2000

The questions were really hard. I don't know how they expected us to answer them.
- Sarah Dauphinais, Westfield, Massachusetts, Class of 2003

We think it's unfair because they're testing us on things we haven't even learned yet.
- Yaraliz Soto, Holyoke, Massachusetts, Class of 2002

After the first two days of tests, your fingers and your mind hurt. A lot of kids didn't try after that.
- Arthur Page, Wareham, Massachusetts, Class of 2003

Yes, the tests are challenging, but they are not unfair.
- James Peyser, Chairman, Massachusetts Board of Education

Over the past decade, in response to the requirements of Goals 2000 and Title I legislation, states have greatly expanded testing programs for public school students. Fashioned to fit state curriculum frameworks, state tests typically reinforce the "content standards" the frameworks represent. In the name of "accountability," many states have also adopted policies that use test scores to determine graduation from high school, grade promotion, distribution of rewards, and state intervention. As local and national media bombard the public with daily news of test results (see, for example, http://www.educationnews.org; this URL and all others listed below were checked and verified as active as of September 15, 2000), and as political leaders and the press push for higher scores, educators across the country are changing curriculum and instruction to boost students' chances of passing the tests (Groves & Richardson, 2000; McNeil & Valenzuela, 2000; McNeil, 2000; Shea, 2000).

In contrast to the enthusiasm of the media and policy officials, however, researchers increasingly warn that relying on standardized test scores to make educational decisions related to curriculum and instruction, allocation of resources, or students' futures is unwise (Dorn, 1998; Heubert & Hauser, 1999; Linn, 2000; Mehrens, 1998; Noble & Smith, 1994; Stake, 1998; Whitford & Jones, 2000). Widely reviewed trade books with a focus on standardized testing further caution that such testing has a history of backfiring, resulting in narrowing rather than expanding students' opportunities to learn (Kohn, 1999; Lemann, 1999; Ohanian, 1999; Sacks, 1999). Yet despite extensive public discussion, few have attended to how those most affected, namely students, perceive and react to testing, or how students' responses might affect test results.

As a step toward addressing this gap, this paper focuses on student responses to high stakes testing in Massachusetts. Using students' drawings of themselves taking the statewide test, our study describes the range of responses of students to high stakes testing. It also provides electronic links to the drawings themselves, the coding scheme used to analyze the drawings, and brief suggestions on using drawings.


THE NATURE OF STATEWIDE TESTING IN MASSACHUSETTS

In the spring of 1998, the Massachusetts Department of Education implemented the first version of the Massachusetts Comprehensive Assessment System (MCAS), prepared under contract with Advanced Systems of New Hampshire. The MCAS is a paper-and-pencil test that includes both multiple-choice and open-response items. Scores are determined by measuring student performance against "standards" that can be set at high or low levels. Starting with the class of 2003, all Massachusetts high school students will be required to attain "passing" scores on tenth grade mathematics and English tests in order to receive a high school diploma.

MCAS is a lengthy test, taking more time than the Graduate Record Examinations or the standardized tests required for admission to medical, law, or business school. The 1998 testing schedule called for students to sit for seven sessions in English/language arts, three in mathematics, and three in science and technology, with most sessions running a minimum of one hour. The 1999 schedule required fourth graders to sit for five sessions in English, two in mathematics, two in science and technology, and one session of tryout questions in history and social studies. In 2000, the state reduced fourth grade testing time to five sessions in English, two in mathematics, and two in science and technology, but spread testing into the fifth grade for the first time, ostensibly to reduce what Board of Education Vice-Chairperson Roberta Schaefer called "testing burnout" (McFarland, 1999). Eighth and tenth grade students sat for 13 sessions, with tryout questions scheduled for sixth and seventh graders. Test times published by the Department of Education are approximate; in all grades, actual test times vary from, and often exceed, those recommended.

MCAS is meant to be a difficult test. From the beginning, policy-makers intended MCAS results to be scored at a more demanding level than nationally normed standardized tests. According to John Silber, then-Chairman of the Massachusetts Board of Education, as reported in the minutes of the Board's February 1997 meeting (http://www.doe.mass.edu/boe/minutes/97/min21097.html):

If on one of the nationally normed tests that is given in grades 4, 8 and 10, students should turn out with a B, and on the Advanced Systems test they came out with a C, we might conclude that Advanced Systems has pegged it just right with a more demanding standard, a standard that would approach international standards. On the other hand, if the situation were reversed, where a nationally normed test shows that the students were performing at about a C level and Advanced Systems had a B level, then we would know that the standards in that exam were perhaps not rigorous enough.

Since setting the "cut-off" scores for MCAS is a political matter, "passing" scores can be set at a higher level relative to other tests. Indeed, at its January 2000 meeting, the Board of Education established the official requirement that students in the graduating class of 2003 meet or exceed the threshold scaled score of 220 on English and mathematics MCAS grade 10 tests in order to satisfy requirements of the "Competency Determination" for a high school diploma (http://www.doe.mass.edu/boe/bib/bib00/12800.html). This decision followed by two months the news that nearly half the state's tenth grade students did not meet this performance standard.

MCAS is also a controversial test. Despite a concerted campaign by business, media, and policy leaders to promote MCAS as a tool for accountability and reform (Guenther, 2000; Hayward, 2000b), those closer to student learning have become increasingly vocal about their disenchantment with high stakes testing in Massachusetts. Educators from all grade levels, including several former Massachusetts Teachers of the Year and MacArthur Fellow Deborah Meier, have questioned the value of such testing for student learning (Associated Press, 2000; Greenebaum, 2000; Hayward, 2000a; Lindsay, 2000; Lord, 1999; Marcus, 1999; Meier, 2000; Penniman, 2000; Sukiennik, 2000; Tantraphol, 2000a). Parents have launched a petition drive and lobbied elected officials to reject the tests as a means to determine graduation (Daley, D., 2000; Downs, 1999; Walsh, 2000a, 2000b; Wilson, 2000; also, http://www.massparents.org). Finally, the Student Coalition Against MCAS (http://www.scam-mcas.org) has distributed position papers on MCAS and testified at school committee meetings. In the spring of 2000, hundreds of students from across the state organized a test boycott, rallies, and the delivery of letters to the governor protesting MCAS (Crittenden, 2000; Gentile & Sukiennik, 2000; Glading-DiLorenzo, 2000; Hayward, 2000d; Scherer, 2000; Shartin, 2000; Steinberg, 2000; Sweeney, 2000; Tantraphol, 2000b; Thesing, 2000; Vaishnav & Vigue, 2000; Vigue & Yaekel, 2000).


PARTICIPATION IN AND RESULTS OF MCAS TESTING


In May 1998, Massachusetts administered its first round of testing in English/language arts, mathematics, and science/technology to 201,749 students. With participation required of virtually all students, including students with disabilities and those whose first language is not English, the testing pool included 96.6% of all students enrolled in the fourth, eighth, and tenth grades. Although the 1999 participation rate for students with disabilities dropped by several percentage points in each grade tested, overall participation for the second year of testing was similar to that of 1998 (Massachusetts Department of Education, October 1999a; Massachusetts Department of Education, November 1999).

MCAS results are reported in four categories: "Advanced," "Proficient," "Needs Improvement," and "Failing." Results of the first round of testing, reported in December 1998, were disappointing. Only in the Grade 8 English/language arts test did more than half the students score in the "Advanced" and "Proficient" categories. In May 1999, another 215,045 students sat for the second round of testing, and results reported in December 1999 had changed little from the previous year. As in 1998, only in English/language arts in Grade 8 did more than half the students score above the "Needs Improvement" category, results Governor Cellucci called "unacceptable" (Mashberg, 1999).

Results for African American and Latino students were more disheartening. According to an analysis of 1998 results prepared by the Mauricio Gaston Institute for Latino Community Development and Public Policy at the University of Massachusetts in Boston (http://www.gaston.umb.edu), 49% of the African American and 58% of the Latino tenth graders failed the English portion of MCAS compared with 19% of the state's white tenth graders. While 80% of African American and 83% of Latino tenth graders failed mathematics, 43% of the state's white tenth graders did so (Uriarte & Chavez, 2000). The Gaston Institute's analysis also revealed that high numbers of minority students had failed MCAS in specific districts. Cities like Boston, Springfield, Worcester, New Bedford, Lowell, Holyoke, Fitchburg, Chicopee, Chelsea, and Salem posted still higher failing rates for their African American and Latino tenth graders than for those students statewide. A Department of Education analysis based on 1999 scores revealed similar patterns (http://www.doe.mass.edu/mcas/race_report99/default.html). In response, researchers, parents, and community leaders warned that without a change in the policy linking MCAS scores to graduation, urban students would begin to drop out of school in increasing numbers (Allen, 1999; National Center for Fair and Open Testing, 2000; Rodriguez, 1999; Vigue, 2000).

MCAS "Failing" scores also put large numbers of students with disabilities and students whose first language is not English at high risk of leaving high school without a diploma. According to reports released by the Massachusetts Department of Education, 64% of the tenth graders with disabilities and 59% of students with limited English proficiency received a "Failing" score on MCAS in 1998. Failing rates were still higher in 1999 when 71% of the tenth graders with disabilities and 66% of students with limited English proficiency received "Failing" grades (Massachusetts Department of Education, October 1999a; Massachusetts Department of Education, November 1999).


CHALLENGES TO MCAS


Although MCAS results have been announced with great fanfare and assurances of their reliability, MCAS may, in fact, mistakenly classify competent students as ill-prepared for life after high school. Comparing the scores of students who had taken both the MCAS and one of four other standardized tests (the Iowa Test of Basic Skills (ITBS), the Stanford 9 Achievement Tests (SAT9), the Educational Records Bureau tests (ERB), and the Preliminary Scholastic Aptitude Test (PSAT)), researchers have found that many students with scores in the upper ranges on each of these four tests could readily fall into any one of the four MCAS score categories (Horn, Ramos, Blumer, & Madaus, 2000). Likewise, the state Board of Education's own technical report summary notes that student achievement as measured by national tests varies widely within MCAS categories. This October 1999 report noted that the group of fourth graders receiving "Failing" MCAS scores in 1998 included students who had received "average" scores, up to the 50th percentile, on the Grade 3 1997 Iowa Test of Basic Skills (ITBS). Further, fourth graders receiving "Needs Improvement" MCAS scores included students whose ITBS scores ranged from the 30th percentile to the 80th percentile (Massachusetts Department of Education, 1998 MCAS Technical Report Summary, Figure 2, 1999). These findings raise the possibility that MCAS scores may unreliably report what Massachusetts students know and can do and may misclassify individual students into the "Failing" or "Needs Improvement" categories when, in fact, their achievement is above average.

Professional associations and university educators have also cautioned against the potential misuse of MCAS. In a statement to the Massachusetts legislature, Jacob Ludes III, Executive Director of the New England Association of Schools and Colleges, warned, "The notion that a single high-stakes test can be used to set policy, and reward or punish schools and the children in them, is indeed appalling" (Katz, 2000). Harvard professor Vito Perrone, summarizing an analysis of publicly-released 1998 MCAS items, has raised additional concerns about both the educational quality of MCAS questions and the effects of MCAS on student learning and engagement. Writing on behalf of the Coalition for Authentic Reform in Education (CARE), Perrone (CARE, 1998: 4) states:

In general, the conclusion [from a review of MCAS items] was that the tests were very much like all the other standardized tests we have reviewed over the years; that they might be difficult for many students, not because they are particularly rigorous or challenging, but because they are long, tedious, lacking in a genuine performance base, and filled with ambiguities; and that the tests are likely to dampen student achievement by undermining quality.

A year later, in a similar review of items made public after the second round of MCAS, Perrone (CARE, 1999: 3, 25) added:

In general, the 1999 tests are much like the previous year's tests.... MCAS tests were described as generating no genuine interest, viewed by students as tedious and ambiguous, representing a lack of trust in their abilities and commitments and constituting a waste of their time. It is not surprising that the tests didn't receive students' best efforts.


STUDENTS' PERCEPTIONS OF STANDARDIZED TESTING


The implication that the most important challenge to MCAS comes from the responses of students to the tests invites further inquiry into students' perceptions of MCAS. If students judge a test as unworthy of their participation, scores from even well designed assessments will not accurately reflect what they have learned in school or what they can do with their knowledge. If students do not put their best effort into testing, both test results and the value of the test as an educational tool are called into question. In addition, student anxiety, stress, fatigue, and motivation to learn compromise test results, a phenomenon known as "test score pollution" (Haladyna, Nolen, & Haas, 1991; Madaus, 1988).

USING STUDENTS' DRAWINGS TO EXPLORE THEIR PERCEPTIONS OF HIGH STAKES TESTING

With a view to understanding the phenomenon of test score pollution, a few researchers have surveyed and interviewed students (Debard & Kubow, 2000; Haney & Scott, 1987; Paris, Lawton, Turner, & Roth, 1991; Paris, Herbst, & Turner, in press; Thorkildsen, 1999; Urdan & Davis, 1997). However, although surveys and interviews are valuable tools for gathering information from students, they are often difficult for teachers, school leaders, or community decision-makers to use in their classrooms, schools, and communities. In contrast, asking students to draw pictures of themselves as test-takers in order to examine their perceptions of testing is cost-effective, unobtrusive, and compatible with many classroom routines. Using drawings to elicit impressions of schooling can capture the perspectives of students for whom reading or completing survey forms might be difficult, including students with disabilities and students whose first language is not English. Teachers seeking a way to document changes in classroom organization, teaching, and learning can easily collect drawings for reflection and discussion (Haney, Russell, & Jackson, 1997; Tovey, 1996). In addition, drawings are proving to be a valid and reliable way of illuminating how individual students and groups of students understand their own learning processes (Lifford, Byron, Eckblad, & Ziemian, 2000; Russell & Haney, 1999). See Appendix A: "Using drawings to spur reflection and change."

Despite these advantages, we acknowledge potential drawbacks to drawings as a form of research and inquiry. Based on past experience, we know that when asked to draw aspects of their learning and school experience, students sometimes fall back on visual stereotypes or highly unusual but memorable events. Also, we have seen that older students, in their mid-teens, sometimes decline to draw because they view drawing as childish or have been taught to doubt their own artistic ability. Additionally, we have learned that it can be hazardous to interpret the meaning of individual drawings without being able to talk with the artists who created them (see Appendix A). Ideally, in order to inquire more fully into students' reactions to high stakes testing, we would like to use not just drawings, but also interviews and observations. Nonetheless, we think that this exploratory study provides a unique window on the perspectives of a group whose views are far too often ignored in debates about high stakes testing, namely the students who are subject to such testing.

Gathering drawings from Massachusetts classrooms

In May 1999, shortly after the second round of MCAS testing ended, we sent out an e-mail invitation to a small listserv of Massachusetts teachers asking for their help in an exploration of students' perceptions of MCAS based on students' drawings of themselves as MCAS test-takers. Fifteen educators from fifteen schools in eight different districts responded, and in June, all asked their students to follow the simple prompt: "Draw a picture of yourself taking the MCAS." Subsequently, these teachers sent us drawings by 411 students. Of these, 303 (73.7%) were from 4th graders, 58 (14.1%) were from 8th graders, and 50 (12.2%) were from 10th graders. Disaggregated by type of community, the 411 drawings included 109 (26.5%) from classrooms described as urban, 209 (50.9%) from classrooms described as suburban, and 93 (22.6%) from those described as rural.

We asked teachers to note the student's grade level and the general location of each school (urban, suburban, or rural) on the back of each drawing. We assured teachers that any presentation of the drawings or a summary of patterns would not identify contributing students, teachers, schools, or communities. We have received their permission to use the drawings anonymously to illustrate the general patterns and findings from our sample.

Developing a coding scheme

Over the fall of 1999, we used a random sample of drawings to identify surface features apparent in the drawings, then clustered those features into a coding scheme reflecting broad categories that emerged. Our intention was not to look at the drawings through a psychoanalytic lens, but rather to describe facets of the testing situation that students chose to include in their drawings. To a great extent, then, we listed explicit aspects of the drawings, including such unambiguous features as student postures, testing materials, and the presence of other students or teachers. In addition, the coding scheme included affective responses that were clearly discernible in the drawings. Finally, we allowed for space to note specific and individual features from students' self-portraits, including comments written in thought bubbles, speech bubbles, or captions. (For a copy of the coding scheme used, see Appendix B.)
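For readers who wish to adapt this approach, here is a minimal sketch, in Python, of how the codes for a single drawing might be recorded for later tallying. The field names and values are our hypothetical illustrations, not a prescribed format; the full list of codes appears in Appendix B.

    # Hypothetical record for one coded drawing. The category lists mirror the
    # broad groupings of the Appendix B scheme and are not mutually exclusive.
    drawing_record = {
        "grade": 4,                                   # 4, 8, or 10
        "community": "urban",                         # urban, suburban, or rural
        "depiction": ["seated_writing"],              # posture, people present
        "equipment": ["test_booklet", "pencil_held"],
        "responses": ["anxiety", "question_marks"],   # affective codes
        "commentary": "Is it over yet?",              # thought/speech bubble text
    }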

Coding the drawings

Following the development of the coding scheme, we coded each drawing individually, noting the characteristics listed in the scheme as well as any additional characteristics, including written commentary by students. Categories were not mutually exclusive, and single drawings encompassing multiple characteristics were coded for all characteristics. All drawings were coded by the first author. In addition, a graduate assistant coded a randomly selected sample of 40 drawings to assess the inter-rater reliability of the coding scheme, and the first author recoded another set of 60 randomly selected drawings to assess intra-rater reliability. The agreement in independent ratings was greater than 90%. To further study the reliability of coding of drawings, we had five graduate students independently code a random sample of drawings. In this study we used not just percent agreement, but also Cohen's kappa (which adjusts observed agreement for the probability of chance agreement). Across all ten pairs of comparisons, the mean Cohen's kappa was 0.67. In his methodological note on kappa in Psychological Reports, Kvalseth (1989) suggests that a kappa coefficient of 0.61 represents "reasonably good" overall agreement. Our results show that the reliability of coding drawings surpasses the standard suggested by Kvalseth. Finally, in January 2000, we met with educators who had submitted the drawings. Reviewing the drawings together, we elicited their comments, discussed their interpretations based on their knowledge of testing conditions, and considered the implications.
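For readers who want to replicate this kind of reliability check, the sketch below computes percent agreement and Cohen's kappa for two raters who have coded the same drawings. It is a generic illustration of the statistic with invented toy data, not our analysis script.

    from collections import Counter

    def agreement_and_kappa(rater_a, rater_b):
        """Percent agreement and Cohen's kappa for two raters on the same items."""
        n = len(rater_a)
        # Observed agreement: share of items both raters coded identically.
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement expected from each rater's marginal code frequencies.
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
        return p_o, (p_o - p_e) / (1 - p_e)

    # Toy data: presence (1) or absence (0) of one code across ten drawings.
    p_o, kappa = agreement_and_kappa([1, 0, 0, 1, 1, 0, 0, 0, 1, 0],
                                     [1, 0, 0, 1, 0, 0, 0, 0, 1, 0])
    print(f"agreement = {p_o:.2f}, kappa = {kappa:.2f}")  # agreement = 0.90, kappa = 0.78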

Analyzing the findings

Given our dependence on busy teacher volunteers, the sample used in our analysis is an opportunity sample and does not represent all Massachusetts students in the grades tested, or all students in urban, suburban, or rural communities. Recognizing the limitations of our sample, then, we have analyzed and reported our findings so as to reduce misinterpretation as much as possible. To this end, we have combined drawings gathered from eighth and tenth graders into a larger sample of drawings from secondary students. In addition, the differences in proportions we report for various categories of responses may reflect the vagaries of our sampling approach; some apparent differences may simply be artifacts of sampling chance. For these reasons, we generally avoid discussing differences in the proportions of drawings showing particular features when those differences are less than 0.10, or 10 percentage points.
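To illustrate why small gaps should be treated as possible sampling noise, the following back-of-the-envelope sketch computes the standard error of a difference between two subgroup proportions. It assumes simple random sampling, which our opportunity sample does not satisfy, so the true uncertainty is, if anything, larger; this is one reason for the conservative 10-point threshold.

    import math

    def diff_se(p1, n1, p2, n2):
        """Standard error of the difference between two independent proportions."""
        return math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

    # Example with subgroup sizes from this study: 109 urban and 209 suburban
    # drawings, with 15.6% and 5.7% describing the test as "hard".
    diff = 0.156 - 0.057
    margin = 1.96 * diff_se(0.156, 109, 0.057, 209)  # approximate 95% margin
    print(f"difference = {diff:.3f}, margin = +/-{margin:.3f}")

With subgroup sizes in the one-to-two-hundred range, such margins run to seven or eight percentage points even under idealized assumptions.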


STUDENTS' DESCRIPTIONS OF MCAS DIFFICULTY, LENGTH, AND CONTENT


The drawings provide a variegated picture of how students in elementary, middle, and high school grades view high stakes testing. Overall, our sample of student drawings depicted the task of sitting through the multiple MCAS sessions as a solitary experience. Almost three-quarters of the drawings (70.7%) showed students seated alone at their desk or table. Click here to see a sample of nine such drawings. By comparison, 9.0% of the drawings included other students, and 2.7% included an adult, presumably a teacher. A little over a third (36.8%) of the drawings showed students both seated and writing, while almost half (45.9%) showed them seated but not writing. The test booklet was visible in about three quarters (76.3%) of the drawings; 41.0% of all drawings depicted the test booklet with writing, and 12.7% showed "bubbles" to be filled in.

Of the 411 drawings, 152 (37.0%) contained no evidence of what students thought or felt about taking the MCAS. These self-portraits convey a picture of students complying with MCAS requirements with no marked reaction to the task at hand. Contrasting with these relatively neutral drawings, the remaining 259 drawings provided some explicit information about students' perceptions of MCAS. Using thought bubbles, speech bubbles, written captions, and unambiguous postures or gestures, students critiqued the test, referring to test difficulty, content, and length.

PERCEPTIONS OF TEST DIFFICULTY

About one out of six (17.6%) drawings referred to test difficulty in captions or thought bubbles. Students described the test explicitly as "hard" (9.3%) more than five times as often as "easy" (1.7%). Some described MCAS as a mixture of hard and easy items. Drawings from fourth graders and urban students were most likely to allude to test difficulty. Urban students (15.6%) were more likely than suburban students (5.7%) to describe MCAS as "hard." Click here for a sample of nine drawings depicting student perceptions of test difficulty.

PERCEPTIONS OF TEST CONTENT AND ITEMS

A second group of drawings conveyed students' reactions to MCAS content, including content they found "tricky" or confusing. About one out of twelve drawings (8.5%) included question marks, often in thought bubbles, sometimes without text, sometimes as part of specific questions. In some of these drawings, students pictured themselves asking for help from the teacher. In others, they raised questions about specific test items, as in one drawing that asked, "Who was Socrates? Who was Socrates? What kind of question is that." In still other drawings, students' questions conveyed simply "What? What? What?" or "What is this?" or "Huh?" A handful of drawings offered self-portraits of students who appeared to be "stuck" or "blanked out" in response to test content. Click here for a sample of nine drawings depicting student perceptions of test content and items.

PERCEPTIONS OF TEST LENGTH

Six percent of all drawings alluded to the length of the MCAS, in terms of the number of pages, the time required to take it, or both. Urban students were most likely to draw themselves taking a test they described as "too long." While only 3.3% of suburban students and none of the rural students commented on test length, 16.5% of the drawings from urban students described MCAS as "too long."

One eighth grader, steam coming from her ears, drew herself with a test booklet of 6,021,000 pages in front of her. A fourth grader labeled a booklet of 1,000,956,902 pages with the title "Stinkn' test" and portrayed herself saying "TO (sic) MUCH TESTING." Another frowning student drew herself thinking "5 pages more." Drawings also portrayed students saying, "Is it over yet?" and, perhaps alluding to the daily repetition of the MCAS ritual, exclaiming, "Not MCAS again!" A related set of drawings, while not referring directly to test length, portrayed students as feeling tired or rushed to complete the test. Click here for a sample of nine drawings depicting student perceptions of length.


STUDENTS' AFFECTIVE RESPONSES TO MCAS


A number of students' portraits of themselves as test-takers departed from critiquing the test itself and instead delineated the wide range of affective responses students have toward MCAS. In some drawings, students' reactions to MCAS were generally positive or negative, and we coded these as "nonspecific positive" (Click here for a sample of nonspecific positive drawings.) or "nonspecific negative" (Click here for a sample of nonspecific negative drawings.). While 2.9% of all students drew pictures that conveyed a "nonspecific positive" response to MCAS, over six times that number (19.3%) used the drawings to communicate a "nonspecific negative" response. We considered drawings that communicated "I like MCAS" as "nonspecific positive" responses to MCAS. We included drawings with captions such as "I hate MCAS," "This test is stupid," and "This feels like jial (sic)" as "nonspecific negative" responses. Other drawings conveyed more distinct personal attitudes and feelings toward MCAS, both positive and negative.

POSITIVE RESPONSES TO MCAS

Additional drawings displayed particular positive responses to MCAS, including diligence and persistence, thinking and problem solving, and confidence, and we coded them accordingly. These categories are not mutually exclusive, and some drawings were coded in several categories.

Diligence and persistence

Many who put their faith in high stakes testing believe that attaching consequences to test scores will push otherwise lackadaisical students to take testing seriously (Manzo, 1997). For example, Martha Wise, president of the Ohio State Board of Education, asserts, "Unless we say these [tests] are tremendously important for our students and then tie high stakes to them, students and others will tend to find excuses for not taking the tests [or] for not achieving high scores" (Lawton, 1997).

In our sample, 18.0% of the drawings offered portraits of students as diligent and motivated test-takers. In this category, we included drawings in which students presented themselves as thinking, solving problems, confident, or working hard for an "A" or "100." Fourth graders were most likely to portray themselves as diligent and persistent (21.5%), compared with 8.3% of eighth and tenth graders. Drawings from urban and suburban students were more likely (at 21.1% and 20.1% respectively) to suggest diligence and persistence than those of rural students (9.7%). Click here for a sample of four drawings depicting diligence and persistence.

Thinking and solving problems

Supporters of high stakes testing in Massachusetts assert that MCAS is better than other standardized tests in that its questions stir students to use critical thinking skills. In 7.3% of the drawings, students depicted themselves thinking and solving problems. In some of these drawings, students were shown considering specific questions from the English and mathematics portions of the test. In others, they appeared engaged in a thinking process, sometimes involving the weighing of various answers to problems, sometimes using test-taking skills. Fourth graders (9.2%), suburban students (9.1%), and rural students (8.6%) were slightly more likely than the sample as a whole to draw themselves using thinking skills, solving problems, or using test-taking skills. Click here for a sample of nine drawings depicting thinking and solving problems.

Confidence

Students approach testing with different views of themselves as learners and different levels of confidence. Those who view themselves as confident learners and test-takers may be more inclined to persist, even when test items are tedious and ambiguous. Confident test-takers may be more likely to ponder each test item rather than guess or search for the single right answer. Confident test-takers may also be more likely to check their work and correct mistakes.

In our sample, 5.4% of all drawings depicted "confident" test-takers. These drawings highlighted students as self-regulators of their work. Some showed no signs of students' working for specific marks on MCAS; others portrayed students anticipating an "A" or "100." Fourth graders (6.3%) and suburban students (8.6%) were slightly more likely to depict themselves as confident test-takers, but given our sample size these differences are not statistically significant. Click here for a sample of seven drawings depicting confidence.

NEGATIVE RESPONSES TO MCAS

Drawings also conveyed distinct negative responses to the MCAS, including anxiety, anger, pessimism, boredom, and loss of motivation.

Anxiety

Since the first administration of MCAS, reports of schools' efforts to respond to students' anxiety have circulated widely. In one suburban Boston school, where student absence, illness, and headaches have been attributed to anxiety about MCAS, teachers have arranged for a yoga instructor to teach fourth graders yogic breathing before MCAS testing (Hayes, 1999). Other schools have sought to relieve student fears about testing (whether general worries about failing, lack of time to think, and "blanking out," or specific concerns about multiplying decimals, spelling, and historical dates) through after-school test preparation programs (Yaekel, 2000). Mixed messages abound as teachers prepare students in test-taking skills while reminding them, "Don't stress; it's only a test." On one hand, experts counsel parents to downplay MCAS results (Meltz, 1999). On the other, Kaplan/Simon & Schuster has promoted a "No-Stress Guide to the Eighth Grade MCAS" for parents to use at home. As Kaplan publisher Maureen McMahon commented, "Anyone could see there was a need for a guide that would take some of the anxiety away from this process" (Glading-DiLorenzo, 2000).

We coded 13.4% of the MCAS drawings from all grades as showing anxiety. These included students' self-portraits showing them sweating or commenting on the test as "nerve-wracking." Other drawings coded for anxiety included thought bubbles with a prayer or wish for the arrival of help (as distinct from asking for help from the teacher). Still others alluded to fear of failing and having to go to summer school. Students at all grade levels and in all kinds of communities portrayed themselves as worried about MCAS. In our sample, the rate of drawings presenting anxiety was not dramatically different for fourth graders (14.2%) and secondary students (11.1%), or for urban (17.4%), rural (14.0%), and suburban students (11.0%). Click here for a sample of nine drawings depicting anxiety.

Anger and hostility

Ten percent (10%) of the drawings portrayed students as angry about MCAS testing. These drawings went beyond the "I hate MCAS" message of the "nonspecific negative" drawings. Rather, these drawings portrayed students as "mad." Others included thought bubbles in which students were setting fire to MCAS or marching on City Hall. Some drawings detailed the reasons for some students' hostility: their encounter with "hard problems" or content they were not familiar with, the belief that time spent in testing was time stolen from learning, and the feeling that the test was designed to reveal "what you don't know."

Drawings that distinctly conveyed hostility varied considerably by grade level and kind of community. While 6.6% of the fourth graders pictured themselves as angry, 19.4% of the eighth and tenth graders portrayed themselves in this way. Urban students were four times as likely as rural students to depict themselves as angry: 17.4% and 4.3% respectively. Click here for a sample of nine drawings depicting anger and hostility.

Boredom

In our sample, 4.9% of the drawings highlighted boredom as a response to MCAS. Some students conveyed this reaction in commentary on test questions as "easy, but boring." Others drew themselves with thought bubbles, noting "I am sooooo bored. This is really annoying."

Secondary students were generally more likely to depict themselves as bored, with 10.2% of the eighth and tenth grade drawings communicating boredom as compared with 3.0% of those from fourth graders. Eleven percent (11.0%) of urban students portrayed themselves as bored with MCAS. Click here for a sample of five drawings depicting boredom.

Sadness, disappointment, and pessimism

Drawings also depicted students as sad or pessimistic about their experience with MCAS. We coded 2.7% of the drawings as "sad." Another 2.2% of the drawings contained explicit references to students anticipating failure, grade retention, or a poor score. A few students depicted themselves as disappointed, including one who drew a large heart cracked through the middle with the caption "Heart broken because of MCAS." Urban students were slightly more likely than others to portray themselves as sad, disappointed, or pessimistic. Click here for a sample of seven drawings depicting sadness, disappointment, and pessimism.

Loss of motivation and withdrawal from testing

After days of testing on material that may seem confusing, ambiguous, or unfamiliar, students who feel anxious, angry, test-weary, or pessimistic about their prospects for success are not likely to put sustained effort into testing. In fact, during the testing period in May 1999, absenteeism was up in some districts, while in others, participating students "just put any old answer down," stopped answering questions, and put their heads down on their desks (Berard & Pearlman, 1999; Curtis, 1999a; Johnson, 1999; O'Shea, 1999). As one student from the first class required to pass MCAS for graduation observed, "After the first two days of tests, your fingers and your mind hurt. A lot of kids didn't try after that" (Curtis, 1999b).

MCAS drawings showed student effort stalled in various postures. Several artists described how they felt fresh and eager in the early hours of MCAS but petered out and became careless as the hours and days of testing continued. In 5.3% of the drawings, students portrayed themselves sleeping during testing, or daydreaming about things unrelated to MCAS. Secondary students (7.4%) and urban students (6.4%) were slightly more likely to show themselves as sleeping through MCAS. Click here for a sample of seven drawings depicting loss of motivation and withdrawal from testing.

Relief

A final set of drawings, representing 3.9% of all drawings, depicted students as relieved that testing had ended. Some students portrayed themselves proclaiming the test "done" while others cheered, "Yeah, it's all over!" Drawings from eighth graders (5.2%) and urban students (6.4%) were most likely to convey relief that testing had finished. Click here for a sample of three drawings depicting relief.


CONCLUSION


Our study of students' self-portraits as MCAS test-takers began with an invitation to Massachusetts teachers to gather drawings from their fourth, eighth, and tenth grade students following the second round of MCAS testing in 1999. The drawings generated by this invitation have allowed us to explore how students respond to MCAS testing. Although we have been working with a small opportunity sample of drawings, the patterns that emerge, we believe, raise questions about the assumptions that undergird high stakes testing policies. In particular, the drawings challenge the belief that the high stakes associated with MCAS will enhance the motivation and effort of students in a uniform way. To the contrary, the considerable range of responses to MCAS from one student to another, by grade level, and by school location suggests that the connection of high stakes testing to students' motivation is not as simple as policy makers often assume. These variations also invite reflection on how students' perceptions of high stakes testing may interact with other aspects of their schooling. We discuss the patterns that emerge in the MCAS drawings in our related paper (Wheelock, Bebell, & Haney, 2000).


REFERENCES


Allen, M. (1999). Panel blasts MCAS: Minorities don't get fair shake, group says. New Bedford Standard-Times, 16 November: A01.

Associated Press. (1999). MCAS changes considered. Northampton Gazette, 3 July: http://www.gazettenet.com/schools/07031999/13970.htm.

Associated Press. (2000). Teacher opposed to MCAS tests. Northampton Gazette, 1 April: http://www.gazettenet.com/schools/04012000/23647.htm.

Barry, S. (2000). Pressure on schools to improve. Springfield Union-News, 13 April: http://www.masslive.com/news/pstories/ae413mca.html.

Berard, D. & Pearlman, H. P. (1999). Latest MCAS scores: Good, bad, and ugly. Lowell Sun, 8 December: 01.

Coalition for Authentic Reform in Education (CARE). (1998). MCAS Review. Unpublished report, 23 November.

Coalition for Authentic Reform in Education (CARE). (1999). MCAS Review #2. Unpublished report, 10 December.

Crittenden, J. (2000). Boycotters score points as most students take MCAS, Boston Herald, 13 April: 03.

Curtis, M. J. (1999a). MCAS controls their future. New Bedford Standard-Times, 12 December: 01 (http://www.s-t.com/daily/12-99/12-12-99/a01lo005.htm).

Curtis, M. J. (1999b). Scores draw mixed reaction from parents, students. New Bedford Standard Times, 8 December: 01 (http://www.s-t.com/daily/12-99/12-08-99/a01lo013.htm).

Daley, B. & Vigue, D.I. (2000). Mass. seen as ground zero in test tussle, Boston Globe, 11 April: B01.

Daley, D. (2000). Parents lash back at MCAS, Newton Tab, 27 April.

Debard, R. & Kubow, P. K. (2000). Impact of Proficiency Testing: A Collaborative Evaluation. Report prepared for Perrysburg (OH) Schools. Bowling Green State University and Partnerships for Community Action.

Drake, C. (2000). Flaws of MCAS justify boycott. Northampton Gazette, 25 April: http://www.gazettenet.com/04252000/opinion/24535.htm.

Dorn, S. (1998). The political legacy of school accountability systems. Educational Policy Analysis Archives 6(1), 2 January: http://olam.ed.asu.edu/epaa/v6n1.html.

Downs, A. (1999). Cambridge parents pledge to end MCAS. Boston Sunday Globe City Weekly, 31 October: 01.

Gentile, D. (2000a). Monument students working to remove MCAS requirement. Berkshire Eagle, 14 April: 01.

Gentile, D. (2000b). Some students to shun MCAS. Berkshire Eagle, 11 April: 01.

Gentile, D. & Sukiennik, G. (2000). Students at Wahconah, Monument stage 'quiet protest' against MCAS. Berkshire Eagle, 13 April: 01.

Glading-DiLorenzo, J. (2000). A high-stakes revolution: Across the state, parents, students, and educators are saying 'no' to the controversial MCAS, Valley Advocate, 13 January: 16, 19, 21.

Greenebaum, M. (2000). School visits: MCAS alternative. Northampton Gazette, 14 April: http://www.gazettenet.com/04142000/opinion/24160.htm.

Groves, M. & Richardson, L. (2000). 'Test Prep' Moving Into Primary Grades, Los Angeles Times, 1 April: A1.

Guenther, W. (2000). MCAS tests skills that matter. Boston Globe, 25 April: E4.

Haladyna, T. M., Nolen, S. B., & Haas, N. S. (1991). Raising standardized achievement test scores and the origins of test score pollution. Educational Researcher 20(5), June-July: 2-7.

Haney, W., Russell, M., & Jackson, L. (1997). Using drawings to study and change education and schooling. Research proposal to the Spencer Foundation from the Center for the Study of Testing, Evaluation, and Educational Policy at Boston College, September 1997.

Haney, W. & Scott, L. (1987). Talking with children about tests: An exploratory study of test item ambiguity. In Freedle, R. O. and Duran, R. P. (Eds.). Cognitive and Linguistic Analyses of Test Performance. Norwood, NJ: Ablex.

Hayes, K. (1999). Stressed out children gaining relief in yoga. Boston Sunday Globe South Weekly, 6 June: 01.

Hayward, E. (2000a). Ex-teacher of the year now fights MCAS. Boston Herald, 27 March: 20.

Hayward, E. (2000b). Hub teachers, local leaders agree to back MCAS exam. Boston Herald, 15 April: http://www.bostonherald.com/news/local_regional/mcas04152000.htm.

Hayward, E. (2000c). Minority pupils lagging on MCAS exams. Boston Herald, 18 May: http://www.bostonherald.com/news/local_regional/mcas05182000.htm.

Hayward, E. (2000d). Schools gear up for MCAS: Boycotting kids face penalties as testing begins. Boston Herald, 12 April: http://www.bostonherald.com/news/local_regional/mcas04122000.htm.

Hellman, S. (2000). Parents unite as revolt widens against MCAS. Boston Sunday Globe West Weekly, 16 April: 01.

Heubert, J. P. & Hauser, R. M., (Eds.). (1999). High Stakes: Testing for Tracking, Promotion, and Graduation. Washington, DC: National Research Council, National Academy Press. (Available on line at http://books.nap.edu/html/highstakes/index_pdf.html.)

Horn, C., Ramos, D., Blumer, I., & Madaus, G. (2000). Cut scores: Results may vary. NBETPP Statements, Vol. 1, No. 4, National Board on Educational Testing and Public Policy, Boston College: http://www.nbetpp.bc.edu/reports.html.

Johnson, J. (1999). MCAS brings good news and bad. Norton Town Online, 9 December: http://www.townonline.com/neponset/norton/news/newNMMCAS121027.html.

Katz, M. (2000). Top educator criticizes reliance on MCAS tests. Providence Journal, 2 February.

Kohn. A. (1999). The Schools Our Children Deserve: Moving Beyond Traditional Classrooms and "Tougher Standards." Boston: Houghton Mifflin.

Kvalseth, T. O. (1989). Note on Cohen's kappa. Psychological Reports, 65, 223-26.

Lawton, M. (1997). States' boards leaders call for assessments bearing consequences. Education Week, 22 October: http://www.edweek.org/ew/1997/08nasbe.h17.

Lemann, N. (1999). The Big Test: The Secret History of the American Meritocracy. New York: Farrar Strauss & Giroux.

Lifford, J., Byron, B., Eckblad, J., & Ziemian, C. (2000). Reading, responding, reflecting. English Journal 84(4), March: 46-57.

Lindsay, D. (2000). CON-test. Education Week, 5 April, http://www.edweek.org/ew/ewstory.cfm?slug=30mass.h19.

Linn, R. (2000). Assessments and accountability. Educational Researcher, 29(2), March: http://www.aera.net/pubs/er/arts/29-02/linn01.htm.

Lord, R. (1999). Harwich teacher refused to hand out MCAS test. Cape Cod Times, 3 June.

Madaus, G. F. (1988). The influence of testing on curriculum. In L.N. Tanner (Ed.), Critical issues in curriculum: Eighty-seventh yearbook of the National Society for the Study of Education (pp. 83-121). Chicago, IL: University of Chicago Press.

Manzo, K. K. (1997). High Stakes: Test Truths or Consequences. Education Week, 22 October: http://www.edweek.org/ew/1997/08nc.h17.

Marcus, J. (1999). The shocking truth about our public schools: They're better than you think. Boston Magazine, October: 70-81; 138-141.

Mashberg, T. (1999). Mass. kids join legions with poor test scores. Boston Herald, 14 November: 01.

Massachusetts Department of Education. (October 1999a). Massachusetts Comprehensive Assessment System: 1998 Technical Manual.

Massachusetts Department of Education. (October 1999b). Massachusetts Comprehensive Assessment System: 1998 MCAS Technical Report Summary.

Massachusetts Department of Education. (November 1999). Massachusetts Comprehensive Assessment System: Report of 1999 State Results.

McFarland, C. (1999). Education board increases number of MCAS tests. Worcester Telegram and Sun, 26 May: 01.

McNeil, L. (2000). Contradictions of Reform: Educational Costs of Standardized Testing. New York: Routledge.

McNeil, L. & Valenzuela, A. (2000). The harmful impact of the TAAS system of testing in Texas: Beneath the accountability rhetoric. Harvard University, The Civil Rights Project: http://www.law.harvard.edu/civilrights/conferences/testing98/drafts/mcneil_valenzuela.html.

Mehrens, W. A. (1998). Consequences of assessment: What is the evidence? Educational Policy Analysis Archives 6(13), 14 July: http://olam.ed.asu.edu/epaa/v6n13.html.

Meier, D. (2000). Will Standards Save Public Education? Boston: Beacon Press.

Meltz, B. F. (1999). How to handle MCAS scores. Boston Sunday Globe City Weekly, 11 November: 01.

National Center for Fair and Open Testing (FairTest), (2000). MCAS: Making the Dropout Crisis Worse, MCAS Alert, September: http://www.fairtest.org/care/MCAS%20Alert%20Sept.html.

Noble, A. J. & Smith, M.L. (1994). Old and new beliefs about measurement-driven reform: 'Build it and they will come.' Educational Policy 8(2), 111-136.

Ohanian, S. (1999). One Size Fits Few : The Folly of Educational Standards. Portsmouth, NH: Heinemann.

O'Shea, M. E. (1999). Absenteeism seen as a factor in state test results. Springfield Union-News, 10 December: http://www.masslive.com/news/stories/ae129atv.html.

Paris, S.G., Lawton, T.A., Turner, J.C., & Roth, J.L. (1991). A developmental perspective on standardized achievement testing. Educational Researcher 20(5), June-July: 12-20.

Paris, S. G., Herbst, J.R., & Turner, J.C. (In press.) Developing disillusionment: Students' perceptions of academic achievement tests. Issues in Education.

Penniman, B. (2000). After great expectations, hard times: Why assessment in Massachusetts undermines the early promise of education reform. The State Education Standard, 1 (Journal of the National School Boards Association). Spring: 22-26.

Rodriguez, C. (1999). MCAS stirs fears that minorities will drop out. Boston Globe, 12 November: A01.

Russell, M. & Haney, W. (1999). Validity and reliability gleaned from information in student drawings. Paper presented at the annual meeting of the American Educational Research Association, Montreal.

Sacks, P. (1999). Standardized Minds: The High Price of America's Testing Culture and What We Can Do To Change It. Cambridge, MA: Perseus Books.

Scherer, M. (2000). Political statement about MCAS. Northampton Gazette, 12 April: http://www.gazettenet.com/schools/04122000/24076.htm.

Shartin, E. (2000). High stakes for making the grade. Metrowest Daily News, 16 May.

Shea, C. (2000). It's come to this. Teacher Magazine, May: http://www.teachermagazine.org/tm/tm_printstory.cfm?slug=08profit.h11.

Stake, R. (1998). Some comments on assessment in U. S. Education. Educational Policy Analysis Archives, 6 (14): http://olam.ed.asu.edu/epaaa/v6n14.html

Steinberg, J. (2000). Bluebooks closed, students protest state tests. New York Times, 13 April: 01.

Sukiennik, G. (2000). MCAS tests fatally flawed, teachers say. Berkshire Eagle, 26 January: 1.

Sweeney, E. (2000). Students stage MCAS walk-out. Brookline TAB, 13 April: 01.

Tantraphol, R. (2000a). Educator: Delay test-diploma link. Springfield Union-News, 15 March: http://www.masslive.com/news/pstories/ae315tan.html.

Tantraphol, R. (2000b). Students boycott 'unfair' test. Springfield Union-News, 13 April: http://www.masslive.com/news/stories/ae413not.html.

Thesing, E. (2000). Teens tell governor he is failing them. Boston Globe, 6 June: B02: http://www.boston.com/dailyglobe2/158/metro/Teens_tell_governor_he_is_failing_them-.shtml.

Thorkildsen, T. A. (1999). The way tests teach: Children's theories of how much testing is fair in school. In M. Leicester, C. Modgil & S. Modgil (Eds.) Education, Culture, and Values, Vol. III. Classroom Issues: Practice, Pedagogy, and Curriculum. London: Falmer.

Tovey, R. (1996). Getting kids into the picture: Student drawings help teachers see themselves more clearly. Harvard Education Letter XII(6), 5-6.

Urdan, T. & Davis, H. (1997, June). Teachers' and students' perceptions of standardized tests. Paper presented at the Interdisciplinary Workshop on Skills, Test Scores, and Inequality. The Roy Wilkins Center for Human Relations and Social Justice, University of Minnesota. Minneapolis, Minnesota.

Uriarte, M. & Chavez, L. (2000). Latino Students and the Massachusetts Public Schools. Boston: Mauricio Gaston Institute for Latino Community Development and Public Policy, University of Massachusetts; http://www.gaston.umb.edu.

Vaishnav, A. & Vigue, D. I. (2000). Hundreds of parents, teachers, students rally against MCAS. Boston Globe, 16 May: B3.

Vigue, D. I. & Yaekel, T. (2000). Hundreds of students boycott MCAS tests. Boston Globe, 13 April: A01.

Vigue, D. I. (2000). Race gap endures on MCAS results: Minority leaders, state plan talks. Boston Globe, 19 May: B01.

Walsh, J. (2000a). Board to be asked to oppose test. Springfield Union-News, 13 April: http://masslive.com/news/pstories/hf413mca.html.

Walsh, J. (2000b). Parents pass word of test opposition. Springfield Union-News, 11 April: http://www.masslive.com/news/pstories/hf411mca.html.

Walsh, J. (2000c). Test content reaches school board. Springfield Union-News, 16 April. http://www.masslive.com/news/pstories/hf415mca.html.

Wheelock, A., Bebell, D. J., & Haney, W. (2000). Student self-portraits as test-takers: Variations, contextual differences, and assumptions about motivation. Teachers College Record, http://www.tcrecord.org ID Number: 10635.

Whitford, B. L. & Jones, K. (2000). Accountability, Assessment, and Teacher Commitment: Lessons from Kentucky's Reform Efforts. Albany: State University of New York Press.

Wilson, C. B. (2000). Parents seek anti-MCAS stand. Northampton Gazette, 12 April: http://www.gazettenet.com/04122000/schools/24078.htm.

Yaekel, T. (2000). Students scramble, sacrifice, sweat in countdown to MCAS. Boston Globe, 10 April: B01.


ACKNOWLEDGEMENTS

The authors most gratefully acknowledge the contribution of the Massachusetts teachers who generously collaborated with us on this project, gathered drawings, and provided commentary on our early analysis of findings. We also extend special thanks to Mimi Coughlin, Lauren McGrath, Christine Mills, and Genia Young for their invaluable assistance in our data input, reliability analysis, and technical production tasks, and to Christina Capodilupo, Scott Paris, Kathleen Rhoades and Alan Stoskopf for comments on earlier drafts of this paper. Thanks also to two anonymous TCR reviewers who provided helpful suggestions on a previous version of this paper. We alone take responsibility for our conclusions. Finally, we are grateful to the Spencer Foundation for support in the completion of this project.


APPENDIX A: USING DRAWINGS TO SPUR REFLECTION AND CHANGE

For more than five years we have used drawings not just in the sort of research reported here, but also in less formal efforts to document and change the educational ecology of schools and classrooms (see Haney, Russell, Gulek, & Fierros, 1998, for a summary of how this work evolved). We have also seen a number of teachers use drawings to help students reflect on their own thinking and learning (see, for example, Black, 1991; Lifford, Byron, Eckblad, & Ziemian, 2000).

Here are a variety of prompts we or our colleagues have used to elicit drawings of different sorts:

  • Think about the teachers and the kinds of things you do in your classrooms. Draw a picture of one of your teachers working in his or her classroom.
  • Think about all of the different things your teachers do with you in the classroom. Draw a picture of what a camera would see when one of your teachers is working in the classroom.
  • Draw a picture of yourself doing math.
  • Think about all of the different things you do when you read. Draw a cartoon or pictogram of everything you do when you read a book.
  • Draw a picture of yourself using a computer.
  • Think about the steps you take when writing a paper for school. In the space below, draw a picture or series of pictures that reflect your writing process.
  • Think about the teachers and the kinds of things you have done in your class today. Draw a picture of your teacher teaching and yourself learning.

We have the most experience using the first prompt. After collecting drawings from all or a representative sample of students in a school, we assemble one or more random sets of 50 drawings (50 is large enough to afford representative results, but small enough for review in a reasonable period). We present the drawings to teachers in small discussion groups of three or four people. The teachers are asked to flip through the drawings, look for patterns, speculate about their causes, and think about what they might do differently based on the drawings. Finally, small groups share their findings. The school staff are encouraged to reach consensus on priorities for change and to set concrete goals for changes they hope to see in future rounds of drawings.
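For those who want to automate the set-assembly step, here is a minimal sketch, assuming drawings are identified by simple IDs; the function name and parameters are our own invention, not part of any established procedure.

    import random

    def review_sets(drawing_ids, set_size=50, n_sets=2, seed=7):
        """Draw non-overlapping random sets of drawings for small-group review."""
        pool = list(drawing_ids)
        random.Random(seed).shuffle(pool)  # fixed seed makes the sets reproducible
        return [pool[i * set_size:(i + 1) * set_size] for i in range(n_sets)]

    # e.g., two sets of 50 drawn from a collection of 411 drawings
    first_set, second_set = review_sets(range(411))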


Some practical tips on using drawings:

1. Use plain white paper, not lined paper.
2. When setting your classroom up for the drawing activity, consider the pros and cons of asking students for drawings in color, pen, or pencil. Drawings in color may allow for greater expression, but the costs of reproducing multiple copies for discussion may be higher.
3. Before sharing drawings outside particular classrooms or schools, be sure to get appropriate consent from the students and educators involved, and remove words, such as personal names, that might allow identification of individuals.
4. Be cautious in interpreting the meaning of individual drawings. Below is an example of why.

Caution in Interpretation!!!

In one school, two students' drawings depicted a most unusual scene: an image of their science teacher standing in front of the blackboard with flames coming out of his pocket. This unusual scene might readily provoke speculation as to its meaning, but what became apparent in discussion was that both students were depicting an incident that had actually happened some months before. The teacher in question had a nervous habit of fiddling with the change in his pocket as he taught, and on one occasion two books of matches in his pants pocket had rubbed together and caught fire.


APPENDIX B: CODING MATRIX

Student depiction
  • Student seated at desk, writing
  • Student seated at desk, not writing
  • Student standing
  • Student outside of school
  • Student, face only
  • Other students in picture
  • Teacher
  • No people in drawing
  • Facing away

Student facial expression
  • Positive
  • Negative
  • Neutral
  • No facial features

Testing "equipment"/trappings
  • Test booklet
  • Pencils
    -- Student holding pencil
    -- Pencil laid on desk or test booklet
    -- Extra pencil(s) available
    -- No pencil shown
  • Eraser
  • Food and/or drink
  • Other

Classroom features: Functional
  • Clock
  • Telephone
  • Computer
  • Door
  • Window/outside view
  • Table
  • Teacher's desk
  • Blackboard
  • Student desk alone
  • Student desks in rows
  • Student desks clustered
  • Other

Classroom features: Posters
  • Instructional posters
  • Motivational posters
  • MCAS-related posters
  • Other

Test difficulty
  • Hard
  • Not too hard, not too easy
  • Easy
  • Tricky
  • Can't discern
  • Other

Test content/format
  • Math/Science
  • Writing
  • Reading/literature
  • History/social studies
  • Bubbles to fill in (multiple choice)
  • Blank space on test booklet
  • Unfamiliar content

Response to testing/test
  • Thinking/problem-solving
  • Using test-taking skills
  • Feeling confidence
  • Help needed/asking question
  • Anxiety/stress/nervousness
  • Boredom
  • Anger
  • Tiredness/sleeping (while testing)
  • Sadness
  • Being overwhelmed/too long
  • Disappointment
  • Diligent, motivated
  • Not serious
  • Anticipating a grade
  • Anticipating a good grade
  • Anticipating a bad grade
  • Question marks
  • Thinking about other things/post-test
    -- Sleeping
    -- Relaxing
    -- Friends/fun
  • Feeling relief
  • Other positive
  • Other negative



About the Author
  • Anne Wheelock
    Independent Education Policy Writer & Researcher
    Anne Wheelock, an independent education policy analyst and writer, is author of Crossing the Tracks: How 'Untracking' Can Save America's Schools (1992) and Safe To Be Smart: Building a Culture for Standards-Based Reform in the Middle Grades (1998). She is also co-author (with Christina Capodilupo) of a research analysis of Massachusetts dropout rates in the era of high stakes testing, posted on line at http://www.fairtest.org/care/MCAS%20Alert%20Sept.html.
  • Damian Bebell
    Boston College
    Damian J. Bebell is a doctoral student at Boston College, where he is employed at the Center for the Study of Testing, Evaluation, and Educational Policy (CSTEEP). His research interests include educational philosophy, alternative forms of assessment, and addressing student perspectives in education. His related work with the Massachusetts Teacher Test is on line at http://epaa.asu.edu/epaa/v7n4/.
  • Walt Haney
    Boston College
    Walt Haney, Ed.D., Professor of Education at Boston College and Senior Research Associate in the Center for the Study of Testing, Evaluation, and Educational Policy (CSTEEP), specializes in educational evaluation and assessment and educational technology. He has published widely on testing and assessment issues in scholarly journals such as the Harvard Educational Review, Review of Educational Research, and Review of Research in Education, and in wide-audience periodicals such as Educational Leadership, Phi Delta Kappan, the Chronicle of Higher Education, and the Washington Post. His recent work, "The Myth of the Texas Miracle in Education," can be found on line at http://epaa.asu.edu/epaa/v8n41/. Also, a recent discussion of the gap between testing and technology in schools is available at http://epaa.asu.edu/epaa/v8n19.html.
 
Member Center
In Print
This Month's Issue

Submit
EMAIL

Twitter

RSS