Understanding the Interaction Between High-Stakes Graduation Tests and English Learners


by Julian Vasquez Heilig - 2011

Background/Context: The prevailing theory of action underlying No Child Left Behind’s high-stakes testing and accountability ratings is that schools and students held accountable to these measures will automatically increase educational output as educators try harder, schools will adopt more effective methods, and students will learn more. In Texas, the centerpiece of high school accountability is the pressure to improve exit test scores, a battery of minimum competency exams that students have to pass to graduate from high school. Despite the theory underlying accountability, it is unknown whether policies that reward and sanction schools and students based on high-stakes tests improve English learner (EL) student outcomes over the long term.

Purpose/Objective/Research Question/Focus of Study: The purpose of the research is to better understand the interaction between high-stakes testing, accountability, and ELs. This study asks the following questions: Have student outcomes for ELs improved since the inception of accountability in Texas? To what extent does social capital theory inform our understanding of the impact of high-stakes exit testing on EL exit test performance in Texas high schools? What are the perceptions of teachers, principals, and students regarding the effects of high-stakes testing and accountability on ELs?

Research Design: This article reviews longitudinal student outcomes (test scores, dropout, grade retention, and graduation rates) for Texas ELs from the inception of accountability in 1993. To understand the interaction between ELs and high-stakes exams, the researcher undertook qualitative field work in high schools in four Texas districts with large numbers of ELs to understand how the life contexts of ELs interact with Texas-style high-stakes testing and accountability policies. Via administrator, teacher, and student perceptions of exit testing, the article attempts to shed light on the academic challenges faced by ELs in the current accountability context.

Conclusions/Recommendations: This article underscores the legitimacy of the concern that ELs experience unintended consequences associated with high-stakes exit testing and accountability policy and suggests that social justice and equity are ratiocinative critiques of high-stakes testing and accountability policies. The next round of federal and state educational policy must mandate support for ELs to meet performance standards through evidence-based solutions: appropriate curriculum, pedagogy, and well-trained teachers. Furthermore, policy makers, practitioners, and researchers should be cognizant of the less intrusive approach that many ELs and their families have toward schools by reconsidering whether “one size fits all” high-stakes exit testing policies are plausible for increasingly heterogeneous student populations. The use of multiple measures of EL student success in content areas, such as portfolios, is an accountability mechanism that makes sense, not just for ELs, but for all students.

English learners (ELs) constitute a large sector of students vulnerable to poor school performance because many of these youth arrive having received uneven or irregular instruction in their home countries (Olsen, 1997). ELs exhibit low academic achievement, poor performance on standardized exams, low graduation rates, and high dropout rates (Gándara, Rumberger, Maxwell-Jolly, & Callahan, 2003). ELs often do not receive specialized instruction in the classroom (August & Hakuta, 1997), leading to a “failure” track that can be exacerbated by fast-tracked standardized testing in English.


National achievement trends are also reflected in aggregate Texas data. U.S. Census Bureau (2008) data show that Texas ranks second nationally in its percentage of immigrant Latinos aged 25 years and older without a high school diploma. Valenzuela, Fuller, and Vasquez Heilig (2006) found that ELs disappear from Texas schools at twice the rate of other students. Valencia and Villarreal (2004) showed that Texas ELs have dramatically lagged behind English-proficient students on the state-mandated Texas Assessment of Knowledge and Skills (TAKS). Vasquez Heilig and Darling-Hammond (2008) found that less than a quarter of ELs were graduating from a large urban school district in Texas. Hence, lagging achievement and low completion rates are twin problems for ELs in Texas.


A PANACEA? HIGH-STAKES TESTING AND ACCOUNTABILITY


To address long-standing gaps between minority and majority student achievement, the Texas Legislature enacted Texas Senate Bill 7 (1993), the incipient statute for the creation of the Texas public school accountability system, to rate school districts and evaluate campuses.1 The prevailing theory of action underlying Texas-style high-stakes testing and accountability ratings is that schools and students held accountable to these measures automatically will increase educational output as educators try harder, that schools will adopt more effective methods, and that students will learn more (Vasquez Heilig & Darling-Hammond, 2008). Under this theory, pressure to improve test scores will produce genuine gains in student achievement (McNeil, 2005).


McNeil (2005) related that Texas-style high-stakes testing and accountability policy, by force of federal law, has become the driving education policy for the entire nation with the 2002 reauthorization of the Elementary and Secondary Education Act as the No Child Left Behind Act (NCLB). President George W. Bush and former secretary of education Rod Paige, two primary arbiters of NCLB, lassoed their ideas for federal education policy from Texas. NCLB replicated the Texas model of accountability by injecting public rewards and sanctions into national education policy and ushered in an era in which states and localities are required to build state accountability systems on high-stakes assessments. The centerpiece of NCLB requires that schools and districts meet the federally established goal of adequate yearly progress (AYP) associated with minimum levels of improvement on high-stakes testing assessments for demographic subgroups, including ELs, or face federal sanctions and penalties.


Texas-style accountability policies have the potential to dramatically impact the schooling of ELs nationwide via the batteries of high-stakes testing that buttress education policy systems. NCLB-encouraged accountability policies increasingly use tests as the basis of decisions that determine the progression of children through school, access to education, student achievement progress, and the amount of resources a school receives to educate students (Darling-Hammond, 2003). Understanding whether high-stakes testing and accountability policies impact ELs differently than their U.S.-born counterparts is an important investigation because these policies are not overtly racial or EL specific and are applied regardless of student status.


As test-based accountability commenced in Texas, publicly reported achievement gains across grade levels, conjoined with increases in high school graduation rates and decreases in dropout rates, brought nationwide acclaim to the Texas accountability “miracle” (Haney, 2000). Yet, although accountability’s theory of action intuitively seemed plausible, at the point of NCLB’s national implementation, the “Texas Miracle” was the primary source of evidence fueling the notion that accountability positively impacted the long-term success of low-performing students and the schools that served them (Nichols, Glass, & Berliner, 2006). The successes of the Lone Star State’s accountability policy in the midst of the Texas Miracle have been debated vociferously in the literature (Carnoy, Loeb, & Smith, 2001; Haney, 2000; Klein, Hamilton, McCaffrey, & Stecher, 2000; Linton & Kester, 2003; McNeil, Coppola, Radigan, & Vasquez Heilig, 2008; Toenjes & Dworkin, 2002; Vasquez Heilig & Darling-Hammond, 2008). Yet the question remains: Do policies that reward and sanction schools and students based on high-stakes tests improve student outcomes over the long term?


Understanding EL outcomes in the accountability environment is an insufficiently explored area in the literature. Accordingly, this article begins by reviewing longitudinal student outcomes (test scores, dropout rates, grade retention, and graduation rates) for Texas ELs from the inception of accountability in 1993. To understand the interaction between ELs and high-stakes exams, the researcher undertook qualitative field work in high schools in four Texas districts with large numbers of ELs to understand how the life contexts of ELs interact with Texas-style high-stakes testing and accountability policies. Via administrator, teacher, and student perceptions of exit testing, the article attempts to shed light on the academic challenges faced by ELs in the current accountability context.


The purpose of the research is to better understand the interaction between high-stakes testing, accountability, and ELs. This study asks the following questions: Have student outcomes for ELs improved since the inception of accountability in Texas? To what extent does social capital theory inform our understanding of the impact of high-stakes exit testing on EL exit test performance in Texas high schools? What are the perceptions of teachers, principals, and students regarding the effects of high-stakes testing and accountability on ELs?


PREVIOUS RESEARCH


A continuing question in the literature is whether high-stakes testing that rewards or sanctions schools based on averaged student scores has unintended consequences for students. Prior work has focused on the language dependency of high-stakes tests for ELs (Abedi, 2002; Durán, 1989). Emerging research suggests that high-stakes testing policies could have a negative effect on ELs’ educational outcomes because of low-quality pedagogy from teaching to tests (Hamilton et al., 2007; McNeil, 2000). Schools also have sought to manipulate the accountability system by excluding special populations from testing through exemptions and other means in order to show overall increased educational achievement (Cullen & Reback, 2006; Jacob, 2005; Jennings & Beveridge, 2009). Such manipulation leads to high rates of grade retention, increased dropout rates, and depressed high school completion (Vasquez Heilig & Darling-Hammond, 2008).


The debate surrounding the relationship between one form of high-stakes assessment, exit exams, and student outcomes emerged as early as the 1980s. Catterall (as cited in Firestone, Rosenblum, Bader, & Massell, 1991) found that students who failed graduation tests were more likely to drop out. More recent evidence suggests that high-stakes graduation tests in Texas are not associated with negative outcomes such as increased high school dropout rates. Warren and Jenkins (2005) considered Texas and Florida graduating classes from Current Population Survey data (1971–2000) and found no evidence that state high school exit examinations were independently associated with higher dropout rates or greater inequalities in dropout rates. However, their study had some important limitations. The first issue is the severe oversampling of Whites in their study. They reported that 83% of their Texas Current Population Survey sample in 2000 was White; however, the Texas Education Agency (TEA, 2000) reported that only 43% of the students in the state were White. The second issue lies in the lack of specificity in defining and examining the relationship between exit testing and dropout for minority populations. The authors included Asians with Latinos and African Americans in a White/non-White dichotomous variable, even though the Asian graduation rate was 20% higher than that of Latinos and African Americans.2 Thus, the impact of exit testing on EL dropout is uncertain. Warren and Jenkins addressed this limitation by stating, “It is not clear to us how serious this issue may be” (p. 135). They assumed that inclusion of ELs was as likely in their sample as other demographic groups; however, the oversampling of Whites in their population suggests otherwise. Furthermore, even with perfect data, it would be difficult for their quantitative research to develop a theoretical relationship between EL dropout and exit testing. The current research does descriptively consider student outcomes but also seeks to understand how high-stakes exit testing may impact EL outcomes such as dropout rates.


To date, empirical research detailing the positive relationships between Texas-style high-stakes exit testing and minority student success is scarce, and none has examined EL responses to the incentives and pressure of exit testing. There is little in the research literature on how high-stakes exit testing impacts the long-term achievement and academic progress of ELs. As a result, this research undertakes an initial inquiry into the largely unexamined impact of Texas exit testing in the lives of ELs.


EL EDUCATIONAL ATTAINMENT, SOCIAL CAPITAL, AND CONFIANZA


LaCelle-Peterson and Rivera (1994) argued that ELs and their communities interact with schools differently than do monolingual students. In the current accountability environment, it is unclear how ELs and their families intervene in the process of schooling to ensure educational attainment. To conceptualize the relationship between ELs and high schools in a context of accountability policies, this theoretical framework first considers traditional conceptions of social capital. Next, the article considers con respeto, a form of respect that Mexican families afford teachers and schools. Finally, the article examines whether a weakening of confianza, a conception of social trust in schools, can hinder EL achievement and progress in an environment of high-stakes exit tests.


Coleman (1988) provided insights into the workings of social capital by contextualizing class differences in both parental involvement and views that parents hold about their children’s schooling. Coleman emphasized supportive relationships, bonds of trust, and shared norms and values, similar to the exchange relationships found among many Mexican adults (Keefe & Padilla, 1987) and youth (Valenzuela, 1999), and in relationships between school personnel and youth (Stanton-Salazar, 2001). Horvat, Weininger, and Lareau (2003) demonstrated the broader role of social capital in the securing of middle-class parents’ educational advantage for their children. Lareau (1987) argued that home advantage refers to a level of understanding and types of constructive involvement that link students to their own class position. In other words, families can marshal their social resources to gain positive schooling outcomes for their children or experience the constraints associated with limited social capital.


The existing literature on the educational attainment of immigrants, although recent, has shown that varying levels of social capital influence achievement (Kao & Rutherford, 2007; White & Kaufman, 1997). Social capital theory suggests that the added complexity of high-stakes testing and accountability policies to the process of schooling may introduce additional hurdles for ELs to leap, compounding the invisible forces of advantage that are already at play even before a student reaches the established and ever-increasing bars that signify proficiency across subject areas. As schools are attempting to take steps to be more broadly inclusive of all students and “leave no child behind,” children from families with more social capital will be better positioned to secure learning opportunities and positive achievement outcomes.


Putnam (1993) defined social capital as the “trust, norms, and networks that facilitate social coordination for mutual benefit” (p. 167). Trust can be operationalized as a collection of expectations shared by individuals involved in the exchange and cooperation of human interaction (Burt, 1997). Valdés (1996) described patterns of trust among Mexican families to honor schools, teachers, and policies con respeto and pointed out that Latino parents do not necessarily see it as their place to initiate communication and contact, and intervene in the schooling of their children. Valdés related, “Hispanic parents tend to see the school as the main force responsible for their children’s education and academic development” (p. 44). Chrispeels and Rivero (2001) argued that the circumscribed sense of Latino engagement with the school indicates differing cultural perceptions about the construction of the parents’ roles in the process of schooling. This contrasts with parents from U.S.-born working- and middle-class White families, who instead may question many characteristics of the schooling process, such as teacher qualifications, standardized testing, and outdated textbooks (Baker & Stevenson, 1986).


An unexplored area in the literature is how high-stakes exit testing impacts ELs and their families’ trust in the process of school and whether the exams incentivize achievement motivation, as the theory of action underlying accountability suggests, or perhaps create a disincentive to remain in school.3 To consider this proposition, the article now turns to an accompanying notion of confianza to conceptualize the interaction among ELs, schools, and high-stakes exit testing. Confianza refers to trust between people based on a confidence in the other person to have one’s best interests in mind (McLaughlin & Bryan, 2003). Confianza, social trust or widespread trust, is an emerging concept in social capital research and is defined by costs, benefits, and beliefs that influence the decision to trust (Vásquez, 2004). Stanton-Salazar (2001) found that working-class Mexican American students’ falta de confianza (the absence of trust) initiated students’ avoidance strategies as self-reliance to develop modes of self-preservation and individuation to persist in school. However, the reality that the students persisted in school exemplifies that they had drawn on social capital while signaling respeto for the broader system of school. Immigrant families attribute high instrumental value to formal schooling (Goldenberg, Gallimore, Reese, & Garnier, 2001) and trust that their children’s enrollment is in their best interest. If continuing failure on a high-stakes assessment causes EL students or their families to question their confianza in schools, then self-preservation could diminish the belief in the benefit of continuing enrollment.


Con respeto implies that Latino parents tend to see the school as the main force responsible for their children’s education. Research has yet to explore the relationship between ELs’ social capital and their conception of the confianza in schools when faced with failure associated with high-stakes graduation testing. Accountability endeavors to create quality schools through overt signals to school staff and the public, such as test scores and ratings banners that adorn buildings. According to the conventional understanding of the role of high-stakes testing, the derived incentives work concurrently as motivational and assessment tools (Hamilton, Stecher, Russell, Marsh, & Miles, 2008). However, considering Valdés’s (1996) conception of respeto, I sought to understand whether high-stakes exit assessments instead may diminish ELs’ confianza in the process of schooling, leading to deleterious outcomes such as dropout.


METHODS


I used a mixed-methods approach to understand ELs’ school experiences and progression through school in the midst of accountability. This approach combined descriptive analyses of longitudinal state-level data available from the TEA with interviews with students and staff to learn about their direct experience with exit exams.


OVERVIEW OF QUANTITATIVE DATA


The Texas Comptroller of Public Accounts (2001) indicated that the Public Education Information Management System (PEIMS) was created in 1983 for TEA to provide a uniform accounting system to collect all information about public education, including student demographics, academic performance, personnel, and school finances. The PEIMS lies at the heart of the Texas student accountability system, and the wealth of information gathered from school districts offers the opportunity to gauge the success of Texas-style accountability for ELs over time. At the time of writing, there were no publicly available TEA reports or published research considering cross-sectional EL achievement and progress through school in Texas. As a result, this article gathers data from multiple state reports to descriptively consider EL Texas Assessment of Academic Skills (TAAS) and TAKS Exit testing achievement, grade retention, dropout rates, and graduation rates for more than 15 years of Texas-style accountability.4


OVERVIEW OF QUALITATIVE DATA


Site description. To understand the interactions of ELs with high-stakes exit testing and accountability policy, the research presented here also involved in-depth interviews with ELs and their teachers and administrators from four public high schools located along the U.S.-Mexico border region of South Texas. A review of the contexts across the four high schools provided an understanding of the similarities and differences of each of the schools in the sample. The first school, El Camino High School, is a large school serving over 2,000 students. El Camino is located in a suburban area of several small contiguous cities. Del Oro, the highest performing school in the study, is a large urban school serving nearly 2,500 students. Palma High School is located in a small city that has experienced rapid growth over the past decade. At the time the study was conducted, the high school had about 2,500 students. Finally, Tierra High School has an enrollment of slightly under 2,000 students and is located in a rural farming community. As a result, the sample contains high schools located in rural, small city, suburban, and urban areas.


The four high schools are fairly representative, given that averaged TAKS scores and completion rates have trended in the same direction as the state of Texas.5 Initially the sample high schools’ test scores were somewhat lower than the state average, but they have experienced math and English language arts TAKS gains. In 2008, the four sample high schools exceeded the state average completion rate, although completion rates for both the sample schools and the state have declined since 2003 (see Table 1).


Table 1. Percentage Outcomes for All Students in Texas and Rural, Small City, Suburban, and Urban High Schools in Sample (2003–2009)

Test                             2003   2004   2005   2006   2007   2008   2009
TAKS English language arts
  Average of sample districts      65     69     73     81     82     80     83
  Texas                            79     80     83     87     89     86     90
TAKS Math
  Average of sample districts      48     50     56     58     62     60     69
  Texas                            69     67     72     75     77     80     82
Completion rate
  Average of sample districts      95     96     91     85     87     89    N/A
  Texas                            96     96     92     89     87     83    N/A

TAKS = Texas Assessment of Knowledge and Skills.


Sample selection. The selection of the sample high schools began by randomly choosing six candidate high schools based on locality (rural, small city, suburban, and urban) and whether 60% or more of their student population was classified by the state of Texas as ELs with limited English proficiency. Once schools were identified, written permission was sought from the six high schools to conduct the research. Four high schools agreed to participate in the research. One school declined, and the other did not respond to repeated e-mails and phone calls. To ensure privacy and confidentiality, schools are referred to using pseudonyms.


This study used qualitative interview research to gather perceptions of principals, teachers, and ELs about high-stakes exit testing. The project began with preliminary visits to each high school. Researchers sought out counselors at each of the schools to provide an overview of the research and to outline the scope of the fieldwork. Once relationships were developed with counselors, we sought introductions to principals in each of the high schools. Next, the researchers met with principals to discuss the overall structure of the research and describe the overall project. The principals were then asked to recommend a key contact to coordinate interviews. The key contact at each high school assisted the researchers with the respondent sampling and was asked to populate teacher and administrator focus groups by randomly selecting individuals who had significant amounts of interaction with ELs.


Data collection. Interviews were conducted over a span of several months, with several full days in each of the four high schools. Researchers began by interviewing school staff and students about their perspectives on the impact of high-stakes exit testing and accountability on ELs. All the interviews were semistructured and lasted between 1 and 2 hours. Interviews with students, teachers, and school staff took place in secluded school settings. For example, in two schools, the interviews took place in school textbook supply rooms where the door could be closed to limit noise and intrusion from the school setting.


Most teacher focus groups had between 2 and 5 individuals and occurred during teacher planning periods. In one school, the angst of the EL department regarding the underperformance of ELs on the TAKS resulted in the entire department participating in a focus group. Although the teachers were mainly interviewed in focus groups, administrators and staff were typically interviewed individually. The key contact person at each school was also asked to randomly select ELs from data provided by the district. Students were then approached by researchers and school staff and told that they had the opportunity to be a part of a research study with the University of Texas at Austin. Student focus groups typically contained 3–5 individuals 18 years or older. The informants in this study were 50 students, 7 administrators, 33 teachers, and 11 staff members (see Table 2).


Table 2. Description of Study Participants

High school   Students (Grades 9–12)   Teachers^a   Principals, asst. principals   Staff^b   Total
Camino                 16                    3                    1                   6        26
Del Oro                 8                    8                    2                   2        20
Palma                  18                   12                    2                   2        34
Tierra                  8                   10                    2                   1        21
Total                  50                   33                    7                  11       101

a Special education, English as a second language (ESL), English, math, social studies.
b Public Education Information Management System (PEIMS) coordinators, ESL program coordinators, counselors.


Data analysis. The Glaserian comparative analysis method is a concurrent data gathering and analyzing process that emphasizes induction and emergence during the research process (Strauss & Corbin, 1990).6 The field-based qualitative research involved school visitations to contextualize the experiences of ELs in relation to high-stakes exit testing and the perceptions and meanings attached to these experiences. Comparative analysis is grounded in flexibility as the research evolves (Glaser, 1992). As a result, the interviews and focus groups initially used a rubric of open-ended questions based on high-stakes exit testing and accountability themes identified in the literature to understand EL perspectives, attitudes, involvement, and knowledge about high-stakes exit testing and accountability policy (see the appendix). However, as the research proceeded, questions on the protocol were eliminated or amended in favor of others that arose during the research process. In essence, to gather richer data, the research was grounded in emerging themes revealed by the participants.


Charmaz (2005) suggested that constant comparative analysis stimulates the inductivity necessary to illuminate social justice issues that otherwise might be neglected in the research process. Social justice issues, such as the relationship between high-stakes exams and the views of ELs about school, may be unobservable and inaccessible when using survey research or analyzing large-scale data sets. The flexibility of qualitative comparative analysis empowers researchers to move beyond surface observations to delve deeper into phenomena of interest with informants as they arise during the research process.7


A graduate student transcribed all the interviews. The transcripts were then analyzed using the constant comparative method (Patton, 1990). I coded phrases that had meaning in relation to the main topics and purposes of the study.8 Next, I sought to define axial relationships to identify consistent emerging themes within the phrase coding (Borgatti, 2005). For synthesis, informant counts by category were conducted to understand the representativeness of the dominant codes generated in the field interviews. After coding the interviews, I wrote thematic summaries to create the descriptions of participants’ motives and circumstances that are presented in the Qualitative Findings section. To check the authenticity of the work and moderate the validity threats of description and researcher bias, several funded graduate students conducted member checks: they carried out field interviews, examined the data, helped to develop the emerging themes, and participated in group sessions to review the completed manuscript (see Table 3).
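As a minimal illustration of that synthesis step (before turning to Table 3), the sketch below tallies how many informants in each role mentioned a given code. The records and code labels are hypothetical stand-ins rather than the study’s transcripts or code book; the sketch is offered only to make the counting procedure concrete.

```python
from collections import defaultdict

# Hypothetical coded interview records: (informant id, role, codes mentioned).
# These are illustrative stand-ins, not the study's data.
records = [
    ("t01", "teacher", {"teaching to test", "stress"}),
    ("t02", "teacher", {"teaching to test"}),
    ("a01", "administrator", {"pressure of accountability", "teaching to test"}),
    ("s01", "student", {"stress", "test causing dropout"}),
]

# Count distinct informants per role who mentioned each code.
counts = defaultdict(lambda: defaultdict(set))
for informant, role, codes in records:
    for code in codes:
        counts[code][role].add(informant)

# Report counts such as "teaching to test: 2 teacher(s), 1 administrator(s)".
for code, by_role in sorted(counts.items()):
    summary = ", ".join(f"{len(ids)} {role}(s)" for role, ids in by_role.items())
    print(f"{code}: {summary}")
```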


Table 3. Topics, Codes, and Emerging Themes in the Qualitative Data

Topic: Testing and accountability
Codes: General impressions; Student outcomes; Stress; Commitment of school staff; Alternative assessments; Pressure of accountability
Emerging themes: Unintended consequences
Examples of supporting data: Test causing dropout; Students passing classes aren’t able to pass TAKS Exit test; Better to drop out and work than fail TAKS; Negative emotional consequences such as stress and anxiety

Topic: Curriculum and pedagogy
Codes: Curriculum; Pedagogy; School supports; Professional development; Teaching to test
Emerging themes: Narrowing of the curriculum; Teaching to test
Examples of supporting data: Worksheets for 100% of chemistry class; Schools playing the “game” to manipulate TAKS scores; No homework; TAKS test prep books instead of textbooks

Topic: English learners and their families
Codes: School contact; TAKS and communication; Challenges; Knowledge of school; Expectations of school; Involvement with school
Emerging themes: Parental confianza
Examples of supporting data: More likely to depend on the school to educate their children; Do not typically initiate contact with school; Positive view of school; Trust mislaid

TAKS = Texas Assessment of Knowledge and Skills.


Ethical considerations. The interviews were digitally recorded, and all participants in the interviews were made aware of standard anonymity and confidentiality research practices. For students with limited English proficiency, the researchers communicated in Spanish. About half of the ELs were more comfortable communicating in Spanish. These transcripts were later translated into English.


High-stakes testing and accountability is a hot-button political issue in Texas, so I sought to ensure that respondents would not be harmed as a result of the study. I used pseudonyms for each high school throughout the study. Geographic locations and school data are described but disguised. Additionally, I limited the amount of information describing participants. For example, I attributed quotations of principals and assistant principals to administrators. Finally, these techniques and strategies were submitted to the University of Texas at Austin Institutional Review Board, which approved their use.


QUANTITATIVE FINDINGS


To address the question of whether student outcomes for ELs have improved since the inception of accountability in Texas, the descriptive statistical analyses begin by focusing on EL high-stakes exit test score trends from 1994 to 2008. The first generation of Texas accountability (1994–2002) used the 10th-grade TAAS Exit. From 2004 onward, the second generation of NCLB-inspired educational policy has employed the 11th-grade TAKS Exit as the underpinning of the current accountability system.


TAAS EXIT EXAM


Figure 1 shows that ELs dramatically increased their achievement on the TAAS Exit Math, from only 25% meeting minimum standards in 1994 to 71% by 2002. The state also showed large gains in the number of students meeting minimum standards. The achievement gap between all students and ELs also narrowed from 30% in 1994 to about 20% in 2002.


Figure 2 shows large gains in the percentage of students meeting minimum standards on the TAAS Exit Reading. By 2002, TEA reported that 66% of ELs and 94% of all students in the state had met minimum standards on the TAAS Exit Reading. ELs showed an increase of 37% in students meeting minimum standards, whereas the state showed an overall increase of 19%. The achievement gap closed from 46% to 28% but was still larger than the TAAS Exit Math gap noted above.


Figure 1. Texas Assessment of Academic Skills (TAAS) Exit Math: Percent meeting minimum standards (1994–2002)



EL = English learner.

Source: Statewide TAAS Results, by the Texas Education Agency (2003b). Retrieved from http://ritter.tea.state.tx.us/student.assessment/reporting/results/swresults/august/g10all_au.pdf.


Figure 2. Texas Assessment of Academic Skills (TAAS) Exit Reading: Percent meeting minimum standards (1994–2002)



EL = English learner.

Source: Statewide TAAS Results, by the Texas Education Agency (2003b). Retrieved from http://ritter.tea.state.tx.us/student.assessment/reporting/results/swresults/august/g10all_au.pdf.


TAKS EXIT EXAM


In 2003, the TAKS replaced the TAAS as the exit exam in Texas. As shown in Figure 3, between 2003 and 2009, the percentage of ELs passing the TAKS Exit Math increased from 15% to 47%, a gain of 32%. Overall, the state experienced a 37% gain in students meeting minimum standards on the TAKS Exit Math. In contrast to the narrowing of the achievement gap observed on the TAAS Exit Math, the TAKS Exit Math gap between ELs and all students expanded from 29% in 2003 to 34% by 2009 (see Figure 3).


Figure 3. Texas Assessment of Knowledge and Skills (TAKS) Exit Math: Percent meeting minimum standards (2003–2009)



EL = English learner.

Source: Statewide TAKS Performance Results, by the Texas Education Agency (2009). Retrieved from http://www.tea.state.tx.us/index3.aspx?id=3220&menu_id3=793.


During the past 7 years of TAKS Exit testing, the percentage of ELs passing the TAKS Exit English Language Arts (ELA) increased 29%, while the proportion of all students meeting minimum standards increased 31% (see Figure 4). In contrast to the closing of the achievement gap noted on the TAAS Exit Reading, the gap between ELs and all students increased slightly, from 41% to 43%. By 2009, 92% of all students passed the TAKS Exit ELA, whereas only 49% of ELs met minimum standards.


Figure 4. Texas Assessment of Knowledge and Skills (TAKS) Exit English Language Arts: Percent meeting minimum standards (2003–2009)



EL = English learner.

Source: Statewide TAKS Performance Results, by the Texas Education Agency (2009). Retrieved from http://www.tea.state.tx.us/index3.aspx?id=3220&menu_id3=793.


GRADE RETENTION


Haney (2000) reported that one of the primary mechanisms by which students were exempted from testing in Texas-style accountability reforms of the 1990s was the use of the ninth grade as a holding bin prior to the 10th-grade high-stakes TAAS Exits. Yet, Carnoy et al. (2001) disputed that grade retention rose in the midst of Texas-style accountability. Whether grade retention increased in the midst of high-stakes testing and accountability is a salient issue; policies that increase retention rates are problematic because of the deleterious academic outcomes long associated with these actions (Darling-Hammond & Falk, 1997). Figure 5 shows retention for Grades 7–12 for Texas since 1994. The longitudinal trend line shows that ELs have been retained at rates higher than all other students. ELs experienced a rise in grade retention of about 2%, to a statewide total of 14%, while all other students have remained at about 6% since the inception of accountability in Texas.


Figure 5. Grade retention rates for Grades 7–12 (1994–2006)




Sources: Grade-Level Retention in Texas Public Schools, 1996-97, by Texas Education Agency, 1998, Austin: Author; Grade-Level Retention in Texas Public Schools, 2006-07, by Texas Education Agency, 2008a, Austin: Author.


DROPOUT


In the 1998–1999 school year, TEA introduced tracking of individual students in cohorts between Grades 9 and 12 (TEA, 2001). As a result, the longitudinal cohort dropout analysis begins in 1999 instead of 1994. To understand student leavers, a cohort method is more desirable because it considers what happens to a group of students over time and is based on repeated measures of each cohort to reveal how students progress in school. The cohort method is more accurate than the yearly snapshot dropout rate that TEA historically has reported. Figure 6 shows that TEA-reported EL cohort dropout rates halved between 1999 and 2005. However, after 2005, when the state began to use the National Center for Education Statistics (NCES) dropout definition for leaver reporting, a 7.1% increase in the number of publicly reported dropouts for the state occurred.
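To make the two calculations concrete, the sketch below contrasts a single-year snapshot rate with a four-year cohort rate using hypothetical enrollment and leaver counts; the numbers are illustrative only and are not drawn from PEIMS or TEA reports.

```python
# Illustrative only: a hypothetical cohort of 100 ninth graders tracked for four years.
# A yearly "snapshot" rate counts dropouts in a single year against that year's
# enrollment; a cohort rate follows the same students from Grade 9 through Grade 12.

cohort_size = 100                 # students entering Grade 9 together (hypothetical)
dropouts_by_grade = [4, 5, 6, 5]  # cohort members who leave in Grades 9, 10, 11, 12

# Snapshot rate for one year (Grade 10): that year's dropouts / that year's enrollment.
enrolled_grade10 = cohort_size - dropouts_by_grade[0]
snapshot_rate = dropouts_by_grade[1] / enrolled_grade10       # about 5% for that year

# Cohort rate: all leavers across Grades 9-12 / students who started in the cohort.
cohort_rate = sum(dropouts_by_grade) / cohort_size            # 20% over four years

print(f"Snapshot dropout rate (one year): {snapshot_rate:.1%}")
print(f"Cohort dropout rate (Grades 9-12): {cohort_rate:.1%}")
```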


Figure 6. Cohort dropout rates (1999–2007)




Source: Secondary school completion and dropout data from the Texas Education Agency.

In 1999, TEA first reported longitudinal EL dropout rates for students who left the system between Grades 9 and 12 for reasons other than obtaining a GED certificate or graduation (TEA, 2002). Data from TEA, 2001, 2003a, 2004b, 2005, 2006, 2007, 2008b.


Notably, the increase was far larger for ELs: the adoption of the NCES standard raised the EL cohort dropout rate by about 19%, nearly three times the statewide increase. These numbers align with empirical research critical of TEA’s publicly reported dropout numbers (Losen, Orfield, & Balfanz, 2006; Vasquez Heilig & Darling-Hammond, 2008), suggesting that student leavers were underreported for quite some time for the state, even more so for ELs.


TEA (2006) detailed the Texas dropout definition that was revised in 2005 as a student who is enrolled in public school in Grades 7–12; does not return to public school the following fall; is not expelled; and does not graduate, receive a General Education Development (GED) certificate, continue school outside the public school system, begin college, or die. In prior years, TEA auditors found that PEIMS codes were used to obscure student leavers on the local level (Vasquez Heilig & Darling-Hammond, 2008). For example, in some districts and schools, a missing student was taken off the books by a school if the student was presumed either to be in school elsewhere or to have graduated, when in fact that student might well have dropped out. On the state level, students without an official PEIMS code for the TEA cohort graduation calculation were dropped from the denominator (TEA, 2004a). As a result, it is apparent in Figure 6 that Texas has vastly undercounted dropouts in its accountability system.


The Intercultural Development Research Association (IDRA) argued that adopting the NCES national dropout definition for Texas has provided a more accurate, yet still understated, representation of the magnitude of the overall dropout problem in Texas (Johnson, 2008). More than two decades of IDRA’s yearly high school attrition studies of PEIMS data have suggested that TEA has consistently and severely undercounted student leaving in publicly reported dropout and graduation rates. IDRA found that the overall student attrition rate of 33% was the same in 2007–2008 as it was more than two decades ago (Johnson). In contrast, TEA had reported annual dropout rates that declined from 5% to 1%, and longitudinal cohort dropout rates that declined from about 35% to around 5% over the same time frame. IDRA also posited that the high school attrition rates for Latino and African American students accounted for more than two thirds of the estimated 2.8 million students lost from Texas public high school enrollment since the 1980s (Johnson). IDRA does not report the longitudinal attrition rates of ELs in Texas.


GRADUATION RATES


If ELs were being retained and dropping out of school, then cohort graduation rates should be correspondingly low. Figure 7 shows that EL graduation rates hovered just above 50% throughout the accountability era and then dipped when NCES standards were instituted in 2005.


Figure 7. Graduation rates (1994–2007)




Source: High school completion data from the Texas Education Agency.  

Data from TEA, 1999, 2001, 2003a, 2004b, 2005, 2006, 2007, 2008b.


In a study of Texas dropout data, Losen et al. (2006) argued that Texas graduation rates historically have been overstated. They examined PEIMS data for individual students and proffered that between 1994 and 2003, the state’s graduation rate increased from 56% to 67%. In contrast, TEA’s publicly released statistics locate the graduation rates at 72% and 84% for the same period—a difference of 17% by 2003, about 46,000 students. Losen et al. noted that the overstatement of graduation rates in Texas occurred partly because PEIMS included many ways that students could be excluded from the enrollment data used to calculate graduation rates. Instead of using PEIMS to define away the dropout and graduation numbers in Texas, the NCES definition has created more transparency in the state while calling into question whether gains have actually occurred in Texas since the inception of accountability in 1994.


In summary, the TAAS Exit exams showed that ELs apparently made dramatic achievement gains and narrowed the achievement gaps during the first generation of Texas-style accountability. The TAKS Exits also suggested achievement gains, but the achievement gap between ELs and all students increased. Notably, the statewide EL success rates on both TAKS Exit exams were lower than those on the TAAS. Also, the cross-sectional student progress analysis showed that grade retention, dropout rates, and graduation rates for ELs in Texas do not appear to have improved after about 15 years of accountability policy; in fact, the situation may have worsened. EL grade retention has risen moderately but steadily since the inception of accountability. Cohort dropout rates appeared to have improved. However, when more accurate NCES accounting standards were adopted by TEA in 2005, it became apparent that EL dropout rates were higher than had ever been reported. Most important, in 2007, about 60% of ELs were not graduating.


QUALITATIVE FINDINGS: ENGLISH LEARNERS AND EXIT TESTING


In the midst of accountability, EL exit test scores soared, while school completion plummeted. To consider how EL test scores can show dramatic gains concomitant with depressed graduation rates, the article now turns to qualitative findings to address the research question concerning the complexity of the interaction between ELs and high-stakes exit testing by considering confianza as an extension of social capital theory.


EL PARENTAL CONFIANZA


Valdés (1996) suggested that in Latin American countries, many communities look to schools as beacons of learning. The maestro (teacher) is often one of the more educated persons in the community and considered con respeto (Valdés). The current research study aligns with Valdés; teachers (3 of 4 schools) and administrators (4 of 4 schools) reported that low-income EL parents were more likely to trust the school with the success of their children. A Del Oro administrator stated, “I know that the parents of our ELs are more trusting. . . . They really put a lot of trust into the system.” Another administrator said, “They [EL parents] feel comfortable, and it’s just in their nature to trust us. . . . They come with a very positive view of ‘you all are making sure that my child has a good education.’ They don’t have doubts that we’re not going to do that.”


A teacher from Camino High School said, “I think that the [EL] parents have 100% trust in the schools. They depend mostly on the school, that the school knows what they’re doing, and so they back off. They think that the school should do the job.”


An administrator from Del Oro stated,


I know that the parents of our ELs are more trusting. In meeting with them versus just having general parent meetings, they really put a lot of trust into the system to the point of aquí está, “he’s yours; do what the teacher says, do what the administrator says” is what they tell their kids, versus the other parents who might be more affluent, they’re like “no, I’m going to take the bull by the horns, I’m going to be meeting with the teachers, I want this and I want that.” So yes, there’s a pretty big contrast with the parents we have.


School staff also noted that this trust could be misplaced. Recent surveys by the Pew Hispanic Center indicated that Latino parents are more likely to have positive views of their neighborhood school than other groups and believe that standardized testing has benefited their children (Kaiser Family Foundation, 2004). However, faculty at Tierra and Palma suggested that their confianza might be misplaced. A teacher from Tierra said,


I don’t want to say it’s passive, I don’t want to say meek, but I guess they’re more respectful. . . . They trust the administration. You place my son here, well, good teacher
. . . they’re not. In that certain aspect your non-English language learner parent, they’re more aggressive, they’re more assertive as far as their son or daughter’s education. But I don’t know, I’ve come across some very good parents of the EL students whereas the non [EL] parents are more abrasive, more brash, more down on the system.


A teacher from Palma said,


I think a lot of them [parents] may think that we’re taking an interest in their child, and their caring. . . . They’re not going to question, they’re not going to question whether you’re right or wrong. And you hope that the teacher’s doing the best for the student but sometimes that’s not the case, that doesn’t happen all the time.


EL families place confianza in schools, trusting that their children’s best interests will be served even in an environment pervaded by accountability and undergirded by high-stakes exit testing. As mentioned, a continuing question in the literature is whether high schools respond to the pressure of high-stakes testing and accountability in ways that may increase test scores for a high school but have deleterious unintended consequences for ELs (Vasquez Heilig & Darling-Hammond, 2008). As a result, the analysis of the qualitative data now turns to developing an understanding of the perceptions of students and high school faculty on how the incentives of exit testing are impacting the schooling process of ELs.


EXIT TAKS AND THE CURRICULUM


As Texas-style accountability commenced, an ongoing debate in the literature considered how high-stakes TAAS testing was impacting classroom pedagogy and curriculum (McNeil & Valenzuela, 2001; Valencia & Bernal, 2000). In the current TAKS testing era, these concerns remain, but there is a dearth of literature on how the current round of exit tests is impacting ELs in Texas schools. Because exit tests are directly tied to graduation in Texas, understanding how schools are preparing English learners for high-stakes exit testing is of primary concern.


How high schools narrowed curriculum and pedagogy in response to Texas high-stakes testing in the TAAS era was extensively explored by McNeil and Valenzuela (2001). In the current study, teachers (11 of 33) and principals (6 of 7) from each of the four high schools detailed aspects of “teaching to the test” and the impact of exit testing on the narrowing of the curriculum of EL students. An administrator at Del Oro High School acknowledged that schools are paying attention to constraints created by the current educational policy system:


There’s no way around it, I mean you’d be a fool if you did not play that game, I guess you can call it. . . . You can easily end up being labeled unacceptable if you did not prepare the students to take the test. . . . Two weeks before the TAKS date we pull out the kids. . . . We let the teachers know you’re not going to see these kids for 4 days. For 4 days we do what we call the TAKS blitz.


A staff member at Del Oro High School responsible for planning programming for ELs and monitoring ELs’ progress toward graduation was concerned that the attention being paid to the TAKS testing was distracting from the core mission of educating students. The respondent had been on staff at Del Oro for over a decade, which provided a long view of how the policy was changing the school environment:


I think personally that we’re so caught up in this game of making the numbers look good you know, so that your AYP and all this other stuff, the report card, that we’ve forgotten why we’re here as educators. . . . I think it’s very transparent, to them [students] too, that the emphasis is on teaching the tests, and manipulating, in a sense, the figures, rather than on focusing on really teaching.


The students also seemed concerned with how exit testing was impacting instruction and the quality of the curriculum in the school. As the staff member mentioned, the tensions associated with the TAKS testing were also on the minds of ELs at Del Oro High School. When asked whether the TAKS appeared in the daily curriculum, EL students related that they had noticed that many of their courses had a heavy TAKS preparation focus. One student stated, “In my pre-AP [Advanced Placement] we were working in class on one problem by day or sometimes, some days with practice like, many problems during the class. But in regular classes they just give you the [TAKS] book and the class was about that, was according with that book.”


The ELs who faced the heaviest test-prep-focused courses were those who had not passed TAKS during prior testing opportunities. Students related that they were tracked into courses in which the amount of TAKS curriculum was increased. A Del Oro student who had failed previous administrations of the TAKS Exit related,


Every class I have, it’s based on the TAKS, you know? Like we do exercises that are in the TAKS. . . . Right now I’m taking the TAKS classes for the tests I need and basically he lectures the whole period [on TAKS]. . . . We don’t have homework.


Teachers were also asked about the TAKS preparation components in each high school. The breadth of the TAKS preparatory curriculum appeared to be differentiated by student and school contexts. However, the common thread was that each high school enacted intensive TAKS training for ELs to prepare them for exit testing. Several teachers attributed their test prep approaches for exit testing to their school’s failure to meet AYP: “Whatever I’m covering in class, the objectives I’m covering, I’m trying to get them ready for February [TAKS].” A teacher from Tierra High School said,


I don’t review [for TAKS] every day, however, I tell them look this is what we have to do
. . . but I don’t give them those TAKS review sheets, no I don’t drill the test. I teach around [the test], the center, the focus. It has to be because of the situation the school has been in.


Student informants at Tierra related that the curriculum was saturated with teaching to the exit tests. They reported that their chemistry class entailed 100% TAKS test preparation—no textbooks, labs, experiments, or other traditional means of science curriculum. The entire chemistry course was solely designed to drill students for science exit testing by using multiple-choice worksheets. The idea seemed somewhat implausible until the Tierra chemistry teacher was randomly chosen to participate in a focus group. She characterized the worksheets in the course as being entirely geared for the TAKS: “Mine is not going through a 15-minute bell-ringer, then going on to teaching chemistry. No, no me it’s everything, so mine are actual [TAKS] lessons. . . . I don’t just teach my course, now I teach towards the TAKS.”


A logical question that arises from the interview data is, How does one readily assess whether a teacher in a high school has changed traditional modes of classroom pedagogy and curriculum in response to the TAKS? Is the chemistry teacher who devotes 100% of daily class activity to TAKS preparation merely an outlier? At El Camino High School, without prodding or a request from the researcher, the EL program coordinator led a tour down the hallways after school. The EL coordinator opened classroom doors to describe her view on readily distinguishing a TAKS-driven classroom from a non-TAKS-driven classroom. From her perspective, classrooms at El Camino that lacked innovative materials, manipulatives, and student work were mainly relying on worksheets and test prep materials as pedagogical tools to prepare students for the exit exams.9


McNeil and Valenzuela (2001) identified Houston schools as “teaching to the TAAS” almost a decade ago—the current research suggests that schools outside Houston are also using worksheets as the curriculum for the TAKS Exit tests. Informants from all four high schools in the current research have responded to high-stakes exit tests in a similar manner by narrowing instruction to the multiple-choice structure of the TAKS Exit tests. However, it is also important to understand that some teachers may continue traditional teaching methods, whereas others may focus on test prep multiple-choice exercises. Research considering the contrast between novice and veteran teachers could provide one possible explanation for this difference and is an important area for future research.


IMPACT OF TAKS EXIT ON ELS


To address the question of the effects of high-stakes testing and accountability on ELs, it was essential to learn about the perceptions of teachers, principals, and EL students. Ultimately, exit tests were designed to improve the success and learning of students. The accountability and testing policies adopted in each state to meet the mandates of NCLB were intended to improve educational output. However, informants at all four high schools identified an unintended effect of accountability linked to high-stakes exit exams: negative emotional consequences for ELs. A teacher from Palma stated, “Just dealing with the stress of ‘if I don’t pass this test I don’t graduate,’ that’s got to be incredibly difficult.” A teacher from El Camino said, “I can see it in their [ELs’] eyes, the anxiety, they’re overwhelmed” when preparing for the TAKS. An administrator from Del Oro High School noted, “It’s one test, but for some [EL] kids it just becomes a block.”


ELs weighed in on the incentives associated with exit testing pressure. Conversations with 7 EL students highlighted that they felt overwhelmed when faced with the exit exams, whereas others focused on the impact of not graduating even if they had completed all other high school requirements. A student from Del Oro explained, “They’re always talking about TAKS. People talking in town . . . in the magazines, and newspapers. Everything is talking about TAKS. . . . I’m like, oh my God.” A Tierra student said, “’Cause, like, if you’re doing well in all your classes, you’re passing all your classes, and the TAKS comes and you fail that, you’re like, damn!”


Well I have one of my friends, she passed all of her tests except for the science and she’s doing good in everything, but she has to still retake it and retake the test because she hasn’t been able to pass it. She wants to graduate this year. She’s a senior and she hasn’t been able to pass the TAKS. — Student, Tierra High School


Proponents of high-stakes testing posit that without exit examinations, the U.S. educational system would graduate large numbers of illiterate and unprepared students. Others have argued that high-stakes testing has caused undue harm to low-performing students who have not had access to the same academic resources (Valenzuela, 2004). However, in the quest to create more stringent graduation requirements, several students and school staff suggested that qualified and otherwise academically successful ELs were unnecessarily caught in the crossfire. The impact of otherwise qualified ELs not receiving a high school diploma because of failure on exit exams is a thesis that requires more attention in the research literature.


The Texas Miracle, the underpinning of NCLB, was based on apparent decreases in the dropout rate, a closing achievement gap, and increasing graduation rates. However, accountability and testing may not actually lead to increased educational output for ELs and instead may have exacerbated deleterious outcomes. Understanding why ELs continue to experience depressed graduation rates and increased dropout rates in the current round of NCLB-inspired accountability is an important line of inquiry. Conversations with students and high school staff at all four high schools revealed that the traditional problems ELs have experienced in school are compounded by the current system because the additional hurdles have pushed some ELs to drop out.


An El Camino teacher said, “I hear it a lot from my [EL] students, ‘well I’m just going to drop out of school. . . .’  It’s [TAKS] discouraging them, it really is. Their frustration level, their anxiety.”


In general, like, school is easy, but the TAKS test . . . make you feel like if you don’t pass it you’re like, “Why am I going to school?” . . . I have friends who doesn’t pass the TAKS and they don’t even want to come because they’re not going to graduate, ‘cause the tests. So they feel like, “oh, I spent my whole, my three, my four years here for nothing. . . .” They want to drop out. — Student, Del Oro High School


Another Del Oro student said, “Many of the guys that drop out, they pass their classes. They have good grades, but they drop just cause they don’t have the tests.” Another confirmed, “Some of my classmates are like, ‘oh if I don’t pass that TAKS next time, I’ll drop out.’”


Afecta muchos de los Hispanos que venimos y muchos salen de la escuela porque dicen ‘¿de qué perder tiempo si no voy a poder [finish] por el TAKS? Mejor me salgo y me pongo trabajar.’ [It affects many of the Hispanics who come, and many leave school because they say, “Why waste my time if I’m not going to be able to [graduate] because of the TAKS? It’s better for me to get out and work.”] — Student, Palma High School


A Palma teacher confirmed the students’ views: “[TAKS] is also impacting our dropout rate, because the kids get very discouraged and they say, ‘You know what, what for, para qué, why am I going to come to school, if I’m not going to pass?’” A Palma administrator simply stated, “I have many students who cannot graduate due to the TAKS.”


The voices of the respondents, in conjunction with the statewide statistical data, suggest that large gains on the exit tests can occur concomitantly with high dropout rates and decreasing graduation rates. In his examination of the TAAS era, Haney (2000) suggested that schools and districts could increase high-stakes test passing proportions (or average scores) not only by increasing the numerator, but also by decreasing or manipulating the denominator. Vasquez Heilig and Darling-Hammond (2008) demonstrated quantitatively that Texas high schools that increased their numbers of student leavers also increased their exit test scores and accountability ratings. Interviews with students and staff at the rural, small-city, suburban, and urban high schools revealed unintended consequences of high-stakes testing pressure for ELs who were otherwise academically able, even as scores soared. Informants suggested that exit tests narrowed the curriculum, needlessly pushed ELs out of school, and unjustifiably obstructed graduation.
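
Haney’s numerator–denominator point can be made concrete with a minimal arithmetic sketch. The figures below are hypothetical and are not drawn from the study data; they simply illustrate how a reported passing rate can rise when lower scoring students leave the tested cohort, even though no additional students pass.

    def passing_rate(passers, test_takers):
        # Reported passing rate = numerator (students who pass) / denominator (students tested)
        return passers / test_takers

    # Year 1 (hypothetical): 300 of 500 tenth graders pass the exit exam.
    year_1 = passing_rate(300, 500)        # 0.60, reported as 60%

    # Year 2 (hypothetical): 60 students unlikely to pass leave the cohort before testing
    # (dropout, grade 9 retention, or other "leaver" codes); the same 300 students pass.
    year_2 = passing_rate(300, 500 - 60)   # about 0.68, reported as 68%

    print(f"Year 1: {year_1:.0%}  Year 2: {year_2:.0%}")
    # The reported rate climbs roughly 8 percentage points with no change in the
    # number of students who actually passed.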


DISCUSSION


This research is important to the realm of educational policy because it furthers the understanding of EL learning and progress, in the midst of high-stakes exit testing, in high schools located in communities of varying types. This article reviewed cross-sectional EL outcomes statewide and the percolation of accountability-based incentives into high schools after 15 years of accountability in Texas, and it provided some insight into how pedagogy and curricula are being affected by exit testing. I have described both stakeholder perspectives on accountability incentives and the mechanisms by which test scores can improve while other EL outcomes deteriorate. Ultimately, considering the interaction of social capital, respeto, and confianza with high-stakes exit testing has the potential to inform policy makers and researchers about how the construction of various accountability-based reforms impacts micro-level realities and can result in deleterious outcomes for historically underserved ELs.


Furthermore, this research is significant because it informs a reframing of the accountability debate. Recently, Peter Cunningham, assistant secretary for communications at the U.S. Department of Education in the Obama administration, weighed in on NCLB-inspired testing: “Education has been corrupted. In addition to narrowing the curriculum by abandoning other topics, what this kind of system does is create incentives to game the system. We’re actually harming the education of students in this country” (Olney, 2010).


The pending reauthorization of NCLB offers a unique opportunity to reconsider evidence and information about ELs and their experiences in the current high-stakes testing and accountability policy context. Despite soaring test scores, the lack of improvement in EL grade retention, dropout rates, and graduation rates after 15 years of Texas-style accountability suggests that the focal point should shift to holding policy makers accountable for responding with less insular and more effective educational policy.


Considering the dramatic demographic changes in Texas and elsewhere, the next round of federal and state educational policy must be a mandate that provides support for ELs to meet performance standards by providing evidence-based solutions: appropriate curriculum, pedagogy, and well-trained teachers (Darling-Hammond, 2007). Furthermore, considering the informants’ exposition of respeto and confianza, policy makers, practitioners, and researchers should be cognizant of the less intrusive approach that many ELs and their families have toward schools and reconsider whether one-size-fits-all high-stakes exit testing policies are plausible for increasingly heterogeneous student populations. Neill (2005) proposed the use of multiple measures of EL student achievement in content areas as one solution. As Darling-Hammond, Rustique-Forrester, and Pecheone (2005) suggested, using multiple measures to consider student achievement, such as portfolios, is an accountability mechanism that makes sense, not just for ELs, but for all students.


With looming changes in immigration laws and a shift away from racial or ethnic grouping as a basis for making claims on the state, the current national environment has encouraged a decline in the validity of race and EL status as concepts for making social equity claims. This article underscores the legitimacy of the concern that ELs experience unintended consequences associated with high-stakes exit testing and accountability policy and suggests that social justice and equity are ratiocinative critiques of such policies.


Acknowledgments


Special thanks to Patricia D. Lopez, Elsa S. Billings, Angelica Aguilar, Angela Valenzuela, and Jo Bennett, who contributed to the research with their time and expertise.


Notes


1. For more information on the history of the Texas accountability system, see http://www.tea.state.tx.us/perfreport/account/2009/manual/.

2. Warren and Jenkins (2005) stated that they reestimated the models for Latino and African American students separately in a more refined classification scheme from 1991 onward (the results were not included) and found no association between exit tests and dropout rates. However, the lack of comparison with the earlier years without exit exams, in conjunction with the oversampling of Whites, could bias their results.

3. At a fundamental level, parents, regardless of background, must trust a schooling system enough to send their children to it each morning.

4. The most recent publicly available data were used in the research.

5. To protect the anonymity of the high schools, TAKS scores and graduation rates were averaged together.

6. Alternatively, Straussian comparative analysis focuses on structured validation criteria and systematic hypothesis testing.

7. Although not the main focus of the methodology, to provide the researchers with background and contextual information, qualitative data were compared with field notes, archival materials provided by schools, and local press reports.

8. Respondent passages could receive multiple codes to be considered by research team members. Disagreement was rare due to the overlap of the multiple coding strategy. Axial relationships were considered in the development of emerging themes from passages that received singular and multiple codes.

9. There is a paucity of research examining how prevalent TAKS-driven versus non-TAKS-driven classrooms are in Texas, an important area for future study that is beyond the scope of this article.


References


Abedi, J. (2002). Standardized achievement tests and English language learners: Psychometrics issues. Educational Assessment, 8, 231–257.


August, D., & Hakuta, K. (1997). Improving schooling for language-minority children. Washington, DC: National Academy Press.


Baker, D., & Stevenson, D. (1986). Mothers’ strategies for children’s school achievement: Managing the transition to high school. Sociology of Education, 59, 156–166.


Borgatti, S. (2005). Introduction to grounded theory. Retrieved May 15, 2010, from the Analytic Technologies Web site: http://www.analytictech.com/mb870/introtoGT.htm


Burt, R. (1997). Contingent value of social capital. Administrative Science Quarterly, 42, 339–365.


Carnoy, M., Loeb, S., & Smith, T. (2001). Do higher state test scores in Texas make for better high school outcomes? (Research Report No. RR-047). Philadelphia: Consortium for Policy Research in Education.


Charmaz, K. (2005). Grounded theory in the 21st century: A qualitative method for advancing social justice research. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (3rd ed., pp. 507–535). Thousand Oaks, CA: Sage.


Chrispeels, J. H., & Rivero, E. (2001). Engaging Latino families for student success: How parent education can reshape parents’ sense of place in the education of their children. Peabody Journal of Education, 76, 119–169.


Coleman, J. S. (1988). Social capital in the creation of human capital. American Journal of Sociology, 94, 95–120.


Cullen, J., & Reback, R. (2006). Tinkering toward accolades: School gaming under a performance accountability system (NBER Working Paper No. 12286). Cambridge, MA: National Bureau of Economic Research.


Darling-Hammond, L. (2003, February). Standards and assessments: Where we are and what we need. Teachers College Record. Retrieved June 7, 2010, from http://www.tcrecord.org/Content.asp?ContentID=11109


Darling-Hammond, L. (2007). Race, inequality and educational accountability: The irony of “No Child Left Behind.” Race Ethnicity and Education, 10, 245–260.


Darling-Hammond, L., & Falk, B. (1997). Using standards and assessments to support student learning: Alternatives to grade retention. New York: National Center for Restructuring Education, Schools, and Teaching.


Darling-Hammond, L., Rustique-Forrester, E., & Pecheone, R. L. (2005). Multiple measures approaches to high school graduation. Stanford, CA: Stanford University School Redesign Network. Retrieved February 4, 2010, from http://www.srnleads.org/data/pdfs/multiple_measures.pdf


Durán, R. P. (1989). Testing of linguistic minorities. In R. L. Linn (Ed.), Educational measurement (3rd ed., pp. 573–587). New York: Macmillan.


Firestone, W. A., Rosenblum, S., Bader, B. D., & Massell, D. (1991). Education reform from 1983–1990: State action and district response. Rutgers, NJ: Consortium for Policy Research in Education.


Gándara, P., Rumberger, R., Maxwell-Jolly, J., & Callahan, R. (2003). English learners in California schools: Unequal resources, unequal outcomes. Education Policy Analysis Archives, 11(36). Retrieved June 5, 2009, from http://epaa.asu.edu/epaa/v11n36/


Glaser, B. (1992). Basics of grounded theory analysis. Mill Valley, CA: Sociology Press.


Goldenberg, C., Gallimore, R., Reese, L., & Garnier, H. (2001). Cause or effect? Longitudinal study of immigrant Latino parents’ aspirations and expectations, and their children’s school performance. American Educational Research Journal, 38, 547–582.


Hamilton, L. S., Stecher, B. M., Marsh, J., McCombs, J. S., Robyn, A., Russell, J., et al. (2007). Implementing standards-based accountability under No Child Left Behind: Experiences of teachers and administrators in three states. Santa Monica, CA: RAND.


Hamilton, L. S., Stecher, B. M., Russell, J. L., Marsh, J. A., & Miles, J. (2008). Accountability and teaching practices: School-level actions and teacher responses. Research in Sociology of Education, 16, 31–66.


Haney, W. (2000). The myth of the Texas Miracle in education. Education Policy Analysis Archives, 8(41). Retrieved from http://epaa.asu.edu/epaa/v8n41


Horvat, E., Weininger, E., & Lareau, A. (2003). From social ties to social capital: Class differences in the relations between schools and parent networks. American Educational Research Journal, 40, 319–351.


Jacob, B. (2005). Accountability, incentives and behavior: The impact of high-stakes testing in the Chicago Public Schools. Journal of Public Economics, 89, 761–796.


Jennings, J., & Beveridge, A. (2009). How does test exemption affect schools’ and students’ academic performance? Educational Evaluation and Policy Analysis, 31, 153–175.


Johnson, R. (2008, October). Texas public school attrition study, 2007–08: At current pace, schools will lose many more generations. IDRA Newsletter. Retrieved from http://www.idra.org/newsletterplus/October_2008/


Kaiser Family Foundation. (2004). Latinos are optimistic about schools and education.  Retrieved June 18, 2009, from http://www.kff.org/kaiserpolls/pomr012604nr.cfm


Kao, G., & Rutherford, L. (2007). Does social capital still matter? Immigrant minority disadvantage in school-specific social capital and its effects on academic achievement. Sociological Perspectives, 50, 27–52.


Keefe, S., & Padilla, A. (1987). Chicano ethnicity. Albuquerque: University of New Mexico Press.


Klein, S. P., Hamilton, L. S., McCaffrey, D. F., & Stecher, M. B. (2000). What do test scores in Texas tell us? (Issue paper). Santa Monica, CA: RAND.


Koyama, J. (2004). Appropriating policy: Constructing positions for English language learners. Bilingual Research Journal, 28, 401–423.


La Celle-Peterson, M., & Rivera, C. (1994). Is it real for all kids? A framework for equitable assessment policies for English language learners. Harvard Educational Review, 64, 55–75.


Lareau, A. (1987). Social class differences in family-school relationships: The importance of cultural capital. Sociology of Education, 60, 73–85.


Linton, T. H., & Kester, D. (2003, March 14).  Exploring the achievement gap between White and minority students in Texas: A comparison of the 1996 and 2000 NAEP and TAAS eighth grade mathematics test results. Education Policy Analysis Archives, 11(10). Retrieved from http://epaa.asu.edu/epaa/v11n10/


Losen, D., Orfield, G., & Balfanz, R. (2006). Confronting the graduation rate crisis in Texas. Cambridge, MA: Harvard University, Civil Rights Project.


McLaughlin, H. J., & Bryan, L. A. (2003). Learning from rural Mexican schools about commitment and work. Theory Into Practice, 42, 289–295.


McNeil, L. (2000). Contradictions of school reform: Educational costs of standardized testing. New York: Routledge.


McNeil, L. (2005). Faking equity: High-stakes testing and the education of Latino youth. In A. Valenzuela (Ed.), Leaving children behind: How “Texas-style” accountability fails Latino youth (pp. 57–111). Albany: State University of New York Press.


McNeil, L. M., Coppola, E., Radigan, J., & Vasquez Heilig, J. (2008). Avoidable losses: High-stakes accountability and the dropout crisis. Education Policy Analysis Archives, 16(3). Retrieved from http://epaa.asu.edu/epaa/v16n3/


McNeil, L., & Valenzuela, A. (2001). The harmful impact of the TAAS system of testing in Texas: Beneath the accountability rhetoric. In M. Kornhaber & G. Orfield (Eds.), Raising standards or raising barriers? Inequality and high-stakes testing in public education (pp. 127–150). New York: Century Foundation Press.


Neill, M. (2005). Assessment of ELL students under NCLB: Problems and solutions. Retrieved June 6, 2010, from the FairTest Web site: http://www.fairtest.org/files/NCLB_assessing_bilingual_students_0.pdf


Nichols, S. L., Glass, G. V., & Berliner, D. C. (2006). High-stakes testing and student achievement: Does accountability pressure increase student learning? Education Policy Analysis Archives, 14(1). Available from http://epaa.asu.edu/epaa/v14n1/


Olney, W. (Moderator). (2010, January 12). To the point [Radio broadcast]. Santa Monica, CA: KCRW.


Olsen, L. (1997). Made in America: Immigrant students in our public schools. New York: New Press.


Patton, M. Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: Sage.


Putnam, R. (1993). Making democracy work: Civic traditions in modern Italy. Princeton, NJ: Princeton University Press.


Stanton-Salazar, R. (2001). Manufacturing hope and despair: The school and kin support networks of U.S.-Mexican youth. New York: Teachers College Press.


Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. San Francisco: Sage.


Texas Comptroller of Public Accounts. (2001). Special report: Undocumented immigrants in Texas, a financial analysis of the impact to the state budget and economy (Publication No. 96-1224). Retrieved June 1, 2009, from http://www.window.state.tx.us/specialrpt/undocumented/undocumented.pdf


Texas Education Agency. (1998). Grade-level retention in Texas public schools, 1996–97 (Document No. GE08 601 07). Austin: Author.


Texas Education Agency. (1999). 1996–97 report on high school completion rates. Retrieved June 16, 2009, from http://ritter.tea.state.tx.us/research/pdfs/9697comp.pdf


Texas Education Agency. (2000). Academic Excellence Indicator System reports, SY 2000–01. Retrieved June 1, 2009, from http://ritter.tea.state.tx.us/perfreport/aeis/2000/state.html


Texas Education Agency. (2001). Secondary school completion and dropouts in Texas public schools 1999–00. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/9900drpt.pdf


Texas Education Agency. (2002). Three-year follow-up of a Texas public high school cohort. Retrieved June 11, 2009, from http://ritter.tea.state.tx.us/research/pdfs/wp06.pdf


Texas Education Agency. (2003a). Secondary school completion and dropouts in Texas public schools 2000–01. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/0001drpt_reprint.pdf


Texas Education Agency. (2003b). Statewide TAAS results—Percent passing tables Spring 1994–Spring 2002, Grade 10, Reading, Mathematics, Writing. Retrieved June 11, 2009, from http://ritter.tea.state.tx.us/student.assessment/reporting/results/swresults/august/g10all_au.pdf


Texas Education Agency. (2004a). Academic Excellence Indicator System: 2003–2004. Austin: Author.


Texas Education Agency. (2004b). Secondary school completion and dropouts in Texas public schools 2001–02. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/0102drpt.pdf


Texas Education Agency. (2005). Secondary school completion and dropouts in Texas public schools 2002–03. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/dropcomp_2002-03.pdf


Texas Education Agency. (2006). Secondary school completion and dropouts in Texas public schools 2003–04. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/dropcomp_2003-04.pdf


Texas Education Agency. (2007). Secondary school completion and dropouts in Texas public schools 2005–06. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/dropcomp_2005-06.pdf


Texas Education Agency. (2008a). Grade-level retention in Texas public schools, 2006–07 (Document No. GE09 601 01). Austin: Author.


Texas Education Agency. (2008b). Secondary school completion and dropouts in Texas public schools 2006–07. Retrieved June 15, 2009, from http://ritter.tea.state.tx.us/research/pdfs/dropcomp_2006-07.pdf


Texas Education Agency. (2009). Statewide TAKS performance results. Retrieved January 27, 2010, from http://www.tea.state.tx.us/index3.aspx?id=3220&menu_id3=793


Texas Senate Bill 7, 73rd Texas Legislature, Education Code § 16.007 (1993).


Toenjes, L. A., & Dworkin, A. G. (2002). Are increasing test scores in Texas really a myth, or is Haney’s myth a myth? Education Policy Analysis Archives, 10(17). Retrieved from http://epaa.asu.edu/epaa/v10n17/


U.S. Census Bureau. (2008). 2004 Annual social and economic supplement. Current Population Survey. Washington, DC: Author.


Valdés, G. (1996). Con respeto: Bridging the differences between culturally diverse families and schools: An ethnographic portrait. New York: Teachers College Press.


Valencia, R. R., & Bernal, E. M. (2000). An overview of conflicting opinions in the TAAS case. Hispanic Journal of Behavioral Sciences, 22, 423–443.


Valencia, R. R., & Villarreal, B. J. (2004). Texas’ second wave of high-stakes testing: Anti-social promotion legislation, grade retention, and adverse impact on minorities. In A. Valenzuela (Ed.), Leaving children behind: How “Texas-style” accountability fails Latino youth (pp. 113–152). Albany: State University of New York Press.


Valenzuela, A. (1999). Subtractive schooling: U.S.–Mexican youth and the politics of caring. Albany: State University of New York Press.


Valenzuela, A. (Ed.). (2004). Leaving children behind: How “Texas-style” accountability fails Latino youth. Albany: State University of New York Press.  


Valenzuela, A., Fuller, E., & Vasquez Heilig, J. (2006). The disappearance of high school English language learners from Texas high schools. Williams Review, 1, 166–200.


Vásquez, F. (2004). ¿Por qué confiar? Formas de creación de confianza social [Why trust? Ways of creating social trust]. Revista Mexicana de Sociología, 66, 605–626.


Vasquez Heilig, J., & Darling-Hammond, L. (2008). Accountability Texas-style: The progress and learning of urban minority students in a high-stakes testing context. Educational Evaluation and Policy Analysis, 30(2), 75–110.


Warren, J. R., & Jenkins, K. N. (2005). High school exit examinations and high school dropout in Texas and Florida, 1971–2000. Sociology of Education, 78, 122–143.


White, M., & Kaufman, G. (1997). Language usage, social capital, and school completion among immigrants and native-born ethnic groups. Social Science Quarterly, 78, 385–398.


APPENDIX


Interview Rubrics


Rubric introduction: The underlying theory of NCLB, Texas accountability system formulas, and the TAKS is that they raise the state’s educational standards and allow students, including ELs, to have an education that better prepares them for adult life. During this conversation, we hope to get your thoughts about how the current national and state policies are impacting EL students.


Teacher Interview Rubric


Teacher – Testing and Accountability

What are your general impressions about the current Texas and NCLB policy on EL student achievement (TAKS) and progress (Dropout, Retention, Graduation)? Positive effects? Unintended negative impacts?

Does the TAKS provide an accurate evaluation of an EL student’s academic level? Are the expectations of the TAKS realistic? How about AYP? TEA accountability rating system?

Judging from the EL students in your school (or classroom), are the expectations of the TAKS realistic? How about AYP?

Does the TAKS impact EL students differently than non-EL students? If so, how?

Have the TAKS and Texas accountability system impacted your commitment to your career as a teacher?

If you could have your wish, what kind of test or other assessment tools would you like for your students? What mixture would be most effective for learning?

What kind of support do you receive in order to meet the demands of current TAKS tests? What kind of support do you need?

How will the End-of-Course tests impact your school and EL students?


Teacher – Curriculum and Pedagogy

To what extent is the current curriculum tailored to the current NCLB environment? Does it allow you to engage EL students in their learning?

Do the TAKS and Texas accountability system inspire you to improve your teaching for EL students?

Do you think you have received adequate professional development to implement the required curriculum effectively and prepare students for testing?

Are EL students mixed in with existing classes, or are they educated separately?

How long are students called “EL”?  Do you have any kind of designation for students who have “exited” English instruction or “sheltered” instruction?

There have been reports from some large urban districts of classroom teachers “teaching to the test.” Do you think that occurs in the Rio Grande Valley?


Teacher – Parent Involvement

Approximately what percentage of EL parents would you say you’ve had communication with outside of scheduled teacher-parent conferences?

Have you noted any differences in your interactions with parents of EL students and non-EL students? The school’s or district’s interactions?

Is your communication with EL parents impacted by the TAKS and Texas accountability system?

What forms of services are you aware of that help make parent involvement accessible for EL parents?

If a Spanish-speaking parent needed to speak with you, could you communicate with him or her?

Do you take part in any after-school programs or have any affiliation with community-run services outside of the classroom that may impact EL students?

What methods of parental involvement do you see being used for EL parents?

In your opinion, can EL parents be more involved in their children’s schooling? If so, how?


Principal Interview Rubric


Principal – Testing and Accountability

What are your general impressions about the current Texas and NCLB policy on EL student achievement (TAKS) and progress (Dropout, Retention, Graduation)? Positive effects? Unintended negative impacts?

Does the TAKS provide an accurate evaluation of an EL student’s academic level? Are the expectations of the TAKS realistic? How about AYP? TEA accountability rating system?

Does the TAKS impact EL students differently than non-EL students? If so, how?

Have the TAKS and Texas accountability system impacted your commitment to your career as a principal?

If you could have your wish, what kind of test or other assessment tools would you like for your students? What mixture would be most effective for learning?

What kind of support does your school receive in order to meet the demands of current TAKS tests? How are these supports paid for? (From what funds do they come?) Are the funds different for each school, that is, according to need, or are the funds evenly distributed? What kind of support do you need?

Should the Texas accountability system consider an immigrant status variable as an additional accountability subgroup?

How will the End-of-Course tests impact your school and EL students?


Principal – Curriculum and Pedagogy

What is the process for selecting the curriculum that you use for EL students? Does the district make recommendations? Do you offer recommendations? Do the teachers?

To what extent is the current curriculum tailored to the current NCLB environment?

Do the TAKS and Texas accountability system inspire teachers to improve teaching and learning? There have been reports, from Houston and other large urban districts, of classroom teachers “teaching to the test.” Do you think that occurs in the Rio Grande Valley?

What kind of professional development do you offer your staff to meet the demands of the TAKS test?

Are EL students mixed in with existing classes, or are they educated separately?

How long are students called “EL”?  Do you have any kind of designation for students who have “exited” English instruction or “sheltered” instruction?

In a typical school year, approximately what percentage of your EL parents would you say you communicate with?


Principal – Parent

Have you noted any differences in your interactions with parents of EL students and non-EL students? The district’s interactions?

Is your communication with EL parents impacted by the TAKS and Texas accountability system?

What forms of services are you aware of that help make parent involvement accessible for EL parents?

If a Spanish-speaking parent needed to speak with you, could you communicate with him or her?

Do you take part in any after-school programs or have any affiliation with community-run services outside of the classroom that may impact EL students?

What methods of parental involvement do you see being used for EL parents?

In your opinion, can EL parents be more involved in their children’s schooling?  If so, how?


Student Interview Rubric


Student – Test

What can you tell me about the TAKS test?

Have you taken it?

What do other students say about it?

Are you worried about the test?

Do you know other students who dropped out because of the TAKS?

Have you heard of the new End-of-Course tests?


Student – Pedagogy

What materials does your teacher use to prepare you for the test?

Are the activities for the test different from your other activities?


Student – Teacher

What does your teacher do to help you study for the test?

What does your teacher tell you about the test?

How much time does your teacher spend preparing you for the test?




Cite This Article as: Teachers College Record, Volume 113, Number 12, 2011, pp. 2633–2669.


About the Author

Julian Vasquez Heilig, University of Texas at Austin

JULIAN VASQUEZ HEILIG is an award-winning researcher and teacher. His current research includes examining how high-stakes testing and accountability-based reforms and incentive systems impact minority students. Additionally, his work considers the sociological mechanisms by which student achievement and progress occur, for different kinds of students, in relation to specific NCLB-inspired accountability policies in districts and schools. He has recently published articles in Educational Evaluation and Policy Analysis, Journal of Latinos in Education, and Arts Education Policy Review.
 