
Instruction Matters: Lessons From a Mixed-Method Evaluation of Out-Of-School Time Tutoring Under No Child Left Behind

by Annalee G. Good, Patricia Ellen Burch, Mary S. Stewart, Rudy Acosta & Carolyn Heinrich - 2014

Background/Context: Under supplemental educational services (“supplemental services”), a parental choice provision of the No Child Left Behind Act (NCLB), schools that have not made adequate yearly progress in increasing student achievement are required to offer low-income families free, afterschool tutoring. Existing research shows low attendance rates among eligible students and little to no aggregate effects on achievement for students who do attend.

Focus of Study: We employ a framework grounded in examining the instructional setting, or "instructional core," and we draw on the unique contributions of qualitative research to help explain the limited effects of supplemental services on student achievement. Specifically, we address the following research question: How can in-depth examination of the instructional core explain the impact of supplemental services on student learning?

Research Design: Our findings draw on data from an ongoing mixed-method and multisite study of the implementation and impact of supplemental educational services in five urban school districts located in four states. Although this paper includes quantitative data from this study, analysis focuses on qualitative data, including observations of tutoring sessions using a standardized observation instrument; semistructured interviews with district staff, provider administrators, and tutors; focus groups with parents of eligible students; and document analysis.

Findings: We identify two primary reasons for a lack of effects. First, there is a “treatment exposure” problem where most students receive far less than 40 hours of tutoring over the course of a school year, a critical threshold for seeing significant effects on achievement. In addition, there are discrepancies between an invoiced hour of tutoring and actual instructional time. Second, supplemental services has an instructional quality problem. Instruction lacks innovation; the curriculum typically does not align to that of the day school; programs do not meet all students' instructional needs, especially students with disabilities and English language learners; and there can be considerable variation in quality within the same provider.

Conclusions: Our findings lay the foundation not only for establishing best practices for supplemental services, but also for suggesting policy changes to facilitate those practices, and they offer insights to a host of other parental choice, out-of-school time (OST), and accountability-based reforms.


Under supplemental educational services ("supplemental services"), a parental choice provision of the No Child Left Behind Act (NCLB), schools that have not made adequate yearly progress in increasing student achievement are required to offer low-income families the opportunity to receive free afterschool tutoring.1 There are a growing number of studies examining the impact of this intervention, and the trend is one of little to no aggregate effects of supplemental services on student achievement (Barnhart, 2011; Burch, 2009; Deke, Dragoset, Bogen, & Gill, 2012; Heinrich, Meyer, & Whitten, 2010). These same studies also show consistently low demand for supplemental services among eligible students (on average 12–17%), with even lower rates in rural schools (Barley & Wegner, 2010).

In this paper, we argue that consistently low attendance rates among eligible students and limited effects on student achievement warrant a nuanced look at the core of the intervention: academic instruction in supplemental services tutoring sessions. We make this argument based on insights emerging from our ongoing mixed-method study of supplemental services in five urban school districts. In this paper, we employ a framework grounded in examining the instructional setting and draw on the unique contributions of qualitative research to help explain the limited effects of supplemental services on student achievement. Specifically, we address the following research question: How can in-depth examination of the instructional core explain the impact of supplemental services on student learning?

We suggest new directions for research and policy relevant to all out-of-school time (OST) programs, as well as to parental choice interventions in general. These insights are particularly timely as Congress enters the reauthorization process for NCLB and as states, which have been offered the opportunity to apply for waivers of certain NCLB requirements, weigh the benefits and costs of such programs.

In-depth and rigorous analysis of supplemental services is not only important for improving access to and the quality of this major instructional intervention; it is also applicable to a broad spectrum of reforms, because supplemental services is part of an accountability movement firmly embedded in the current educational policy context. In addition, we must continue to develop tools for looking inside all types of instructional settings—including new ones, such as publicly funded OST programs in which private providers deliver a large portion of the instructional services—and connect this analysis to existing instructional research.


The No Child Left Behind Act of 2001, the Bush Administration's reauthorization of the Elementary and Secondary Education Act, incorporates market-based, parental choice provisions that are similar in design to voucher plans proposed since the 1960s (Friedman, 1962). Under NCLB, districts not making test score targets for three or more consecutive years must offer supplemental services to the parents of students attending "failing" schools. Supplemental services is free afterschool tutoring from a choice of providers, where tuition is paid for by the district out of Title I funds. Providers can be almost any type of organization (e.g., public, private, for-profit, not-for-profit, secular, faith-based, local, or national) and offer services in a number of formats (e.g., online, software-based, in person, at school, at home, or in community settings). Although providers must complete an application process with the state and sign a contract with the district, these organizations provide tutoring services with very little regulation on program quality.

Thus, supplemental services has been described as "mini-vouchers" where the axis of parental choice is on the provider level (Burch, Steinberg, & Donovan, 2007). Provider-level information such as format, curriculum, and tutor–student ratios is used as a proxy for knowledge of instructional practice and, theoretically, the ability of parents to choose the appropriate provider for their child turns on this information (U.S. Department of Education [USDOE], 2005, 2008). Recently, many of the states that have applied for and/or received flexibility from the federal government in implementing NCLB have asked to make significant changes to supplemental services or eliminate the requirement completely, while retaining or developing other models of OST instructional programs (USDOE, 2012).


In our qualitative design and analysis, we draw on two bodies of literature: (a) the theoretical frame of Cohen and Ball (1999) on the importance of the instructional core in understanding an intervention's impacts (or lack thereof), and (b) existing research on supplemental services and OST to provide a framework of important indicators of afterschool program quality, which will assist in assessing the instructional core of supplemental services.


First, we draw on existing research of instructional practices in K–12 school settings (Coburn, 2004; Cohen & Ball, 1999; Elmore, 2004; Spillane, 1996). These studies, which vary in both foci and methods, generally explore the dynamics of instructional change internal to classrooms and argue for greater precision and attention to the importance of the interactive processes within classroom settings. Specifically, Cohen and Ball (1999) argued that in understanding why and how interventions contribute to improved student outcomes, researchers must attend carefully to the interactions within these dynamics—how students interact with teachers, how teachers view and approach tasks, and how instructional resources are employed in the setting.

Other educational researchers looking at the relationship between policy and classroom practice conducted studies along similar lines in arguing for greater attention to the instructional core (Elmore, 2004) and to the practices of teaching and instructional leadership (Coburn, 2004; Spillane, 1996). Their findings illustrate how educators' work within the instructional core is largely shaped by their interactions with and interpretations of the surrounding setting, policy frameworks, and other stakeholders (e.g., parents, private entities, and researchers).

While there is an abundant body of qualitative research that adopts the above framework for understanding day school classroom practices, it has yet to be leveraged in research on supplemental educational services in particular, or on market-oriented education reforms in general. Therefore, we use this well-established, nuanced framework for examining the instructional core, but apply it to the context of a market-based OST intervention, through the example of supplemental services. Specifically, the framework guides the collection and analysis of rich, triangulated data that include detailed observations of tutoring sessions, in-depth interviews, focus groups, and archived documents, as well as statistical analysis from our quantitative team. This allows us to capture the dynamic interactive elements of the instructional core within supplemental services and is invaluable in explaining the limited effects of such interventions.


Existing Research on Supplemental Services

Although there has been little systematic research on the nuances of the implementation process and actual instructional landscape of supplemental services, existing literature does suggest some patterns in attendance and effects. A number of studies document the problem of low attendance rates, which have leveled off over time and typically range between 12% and 17% of those eligible to participate (Burch, Heinrich, Good, & Stewart, 2011; Heinrich et al., 2010; USDOE, 2009; Zimmer, Hamilton, & Christina, 2010). Another common finding is that elementary school students are more likely to attend supplemental services than middle school or high school students (see Burch et al., 2011; Springer, Pepper, & Ghosh-Dastidar, 2009; Zimmer, Gill, Razquin, Booker, & Lockwood, 2007).

The second consistent finding in existing supplemental services literature relates to the impact of the intervention on student outcomes. A number of studies by districts and outside researchers have used statistical methods to examine the effects of supplemental services on student achievement. The overall trend in findings is one of little to no statistically significant effects on student achievement (Barnhart, 2011; Burch, 2009; Heinrich et al., 2010; Heistad, 2007; Springer et al., 2009; Zimmer et al., 2007; Zimmer, Hamilton, & Christina, 2010). When supplemental services does show an impact on academic gains, the effect is approximately a third of the effect size of similar interventions (Hill, Bloom, Black, & Lipsey, 2008). An important insight from the broader literature on out-of-school tutoring programs, consistent with supplemental services evaluations to date, is that reaching some minimum threshold of tutoring hours appears to be critical to producing measurable effects on students' achievement (as measured primarily by test scores) (e.g., Lauer et al., 2006).

Quantitative data on the impact of supplemental services on student achievement offer valuable insight that helps us see the relationship between the inputs of supplemental services (money, time for additional supplemental instruction, etc.) and outcomes (i.e., student achievement). However, statistical analyses offer little purchase for understanding what happens between these inputs and outcomes. The data give us some broad measures of processes inside of classrooms, communities, and schools that might contribute to effects, but there is much more to understand about the particularities of how supplemental services is actually delivered in specific settings.

Unfortunately, qualitative research on supplemental services has been limited and, when present, largely descriptive and exploratory, focused on the challenges of implementation in a context of limited capacity and/or will on the part of districts and states to inform parents of their options and to monitor and report on the quality of tutoring provided (see Burch, 2007b; Burch et al., 2007; Center on Education Policy, 2007; Fusarelli, 2007; Gill et al., 2008; Government Accountability Office, 2006; Potter et al., 2007; Sunderman, 2006; Sunderman & Kim, 2004; Zimmer et al., 2007). An important limitation of this research is the absence of context-rich data on the nature and quality of tutoring: the majority of studies were conducted far from the instructional setting and relied solely on interview reports from providers, school administrators, and district and state officials. We view this sole reliance on interviews as a shortcoming.

Best Practices in OST

Although relatively little research has been done on best practices specific to supplemental services, prior research on OST programs indicates that quality programs share a number of characteristics. First, a quality OST curriculum is content-rich, differentiated to student needs, and connected to students' school day (Beckett et al., 2009; Vandell, Reisner, & Pierce, 2007). Second, instruction is organized into small grouping patterns (no larger than 10:1 and ideally 3:1 or less) and instructional time is consistent and sustained (Beckett et al., 2009; Farkas & Durham, 2006; Lauer et al., 2006; Little, Wimer, & Weiss, 2008). Furthermore, instructional strategies are varied (both structured and unstructured, independent and collective, etc.), active (not desk time, worksheets, etc.), focused (program components devoted to developing skills), sequenced (using a sequenced set of activities designed to achieve skill development objectives), and explicit (targeting specific skills) (Beckett et al., 2009; Vandell et al., 2007).

In addition to elements specific to curriculum and instruction, quality OST programs not only hire and retain tutors with both content and pedagogical knowledge, but also provide instructional staff with continuous support and authentic evaluation (Little et al., 2008; Vandell et al., 2007). Lastly, research suggests the importance of OST programs actively supporting positive relationships on the classroom level among tutors and students (Durlak & Weissberg, 2007; Vandell et al., 2007), as well as between the program and the surrounding community (Little et al., 2008). In the absence of a rigorous research base specific to instruction in supplemental services, we draw upon research on OST in subsequent sections when examining instructional quality in supplemental services programs. This is an appropriate choice because, in many ways, the afterschool context is distinct from that of the day school. Therefore, we draw on best practices literature specific to OST, as opposed to general research on instructional quality or on tutoring during day school hours.


The findings from this paper primarily draw on the qualitative data from an ongoing mixed-method and multisite study of the implementation and impact of supplemental educational services. The central purpose of this study is to understand whether and how providing students with academically focused OST tutoring in reading and mathematics contributes to improvements in their academic performance, specifically in reading and mathematics. We are conducting this research in five urban school districts located in four states: Milwaukee, Wisconsin; Minneapolis, Minnesota; Chicago, Illinois; Austin, Texas; and Dallas, Texas. These five districts offer large sample sizes of eligible and enrolled students from a variety of demographic backgrounds. In addition, the Chicago context includes a district-run provider with considerable market share.


Qualitative case study research "involves exploration of a 'bounded system' or a case (or multiple cases) over time through detailed, in-depth data collection involving multiple sources of information rich in context" (Creswell, 1998, p. 61). This tradition is well suited to and widely used in the fields of education, urban studies, and political science (Creswell, 1998; Yin, 1991). Case sampling—conducting multiple observations of tutoring practices for each provider—was intended to help us see tutoring practices in greater depth. As noted above, the focus of case study research is the case, which becomes the unit of analysis. Our case was tutoring practices across five districts and 25 tutoring providers of varying formats and types: in-person, digital, at-home, in-school, and community settings; for-profit and not-for-profit (see Table 1).

Table 1. Qualitative Sample of Supplemental Services Providers (2009–2011)

[Table body not recoverable in this copy. Columns included profit status, location of tutoring sessions, and special services offered (e.g., for students with disabilities).]

Note: Providers may fit multiple criteria.

*Provider included digital instruction as a consistent and purposeful element of its program (i.e., online or software-based).

The provider characteristics used in our initial sample selection included high market share, high attendance levels relative to other providers in the same district, two or more years providing supplemental services in the district, and equal sampling among digital, in-home, in-school, and community-based tutoring, as well as among for-profit, not-for-profit, and district-provided. When possible, we also attempted to include providers that advertise that they target English language learner (ELL) populations and students with disabilities.2


Qualitative Data Collection Activities

The first two years of the study involved the following data sources:

Observations of full tutoring sessions (n=94) used a classroom observation instrument (described below) designed to capture key features of instructional settings.

Semistructured interviews with provider administrators (n=52) covered the structure of the instructional program, choice of curricula and assessments, challenges in implementation, and staffing decisions.

Semistructured interviews with tutoring staff (n=73) covered instructional formats, curriculum, adaptations for special student needs, and staff professional background and training.

Semistructured interviews with district and state administrators (n=20) focused on program implementation.

Focus groups with parents (n=168) of students eligible to receive supplemental services, most with children currently receiving them, discussed families' experiences with the program. Two focus groups of approximately 1.5 hours each were conducted in each site, and translation was offered in Spanish, Hmong, and Somali (other than English, the primary languages spoken by parents in these communities).

Document analysis focused on formal curriculum materials from providers; provider staff training manuals; diagnostic, formative, or final assessments used; and federal, state, or district policy documents concerning the implementation of supplemental services.

Note that these sample sizes (n) are cross-site totals for the 2009–2010 and 2010–2011 research years, on which the analysis in this paper is based.

A central piece of our qualitative work is a standardized observation instrument we developed to more accurately capture the nature of the supplemental services intervention.3 Systematic analysis of structured observation protocols offers critical insight in the evaluation of accountability-based programs (Pianta & Hamre, 2009). Drawing on Cohen and Ball (1999), our observation instrument is intended to measure instructional capacity—the capacity to produce worthwhile and substantial learning as a function of interactions among the numerous stakeholders in the implementation process, such as students and teachers, and instructional materials and technologies. The instrument not only provides descriptive information on the instructional materials and teaching methods in use, but also detects the impact of different formats, resources (curriculum materials, staffing, etc.), and instructional methods on students' observed levels of engagement.

The observation instrument includes indicator ratings at two 10–15 minute observation points, as well as a rich description in the form of a vignette and follow-up information provided by the tutor(s).4 Once observation data are collected, they are categorized into clusters of indicators organized by areas of OST best practice: varied, active, focused, explicit, relationships, tutor knowledge, differentiation, and student engagement (see Table 2). This clustering of qualitative indicators allows us to see which best practices are predominant in observations and which are rare or missing. Although the observation instrument uses a numeric rating system, the process of clustering the indicators under each best practice area is fully qualitative. OST cluster numbers are calculated by adding the total ratings for each indicator in each cluster and dividing that sum by the total possible ratings.5
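The cluster-score arithmetic described above can be sketched in a few lines. This is a hypothetical illustration only: the indicator ratings and the 0–3 rating scale below are our assumptions for the example, not values taken from the instrument itself.

```python
# Sketch of the OST cluster-score calculation described in the text:
# sum the ratings given to each indicator in a cluster, then divide by
# the total possible ratings. The 0-3 scale is an assumption for illustration.

def cluster_score(ratings, max_rating=3):
    """Fraction of the total possible rating achieved by a cluster."""
    total_possible = max_rating * len(ratings)
    return sum(ratings) / total_possible

# Hypothetical "focused instruction" cluster with five indicator ratings
focused_ratings = [2, 3, 1, 3, 2]
print(round(cluster_score(focused_ratings), 2))  # prints 0.73
```

A score near 1.0 would indicate that a best practice area was strongly and consistently observed across a set of sessions; a score near 0 would flag it as rare or missing.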

Table 2. OST Best Practices Indicator Clusters

[Each cluster below lists the indicators, with instrument item numbers, grouped under that best practice area; the Average Rating column of the original table did not survive in this copy.]

Curriculum that is differentiated to student needs: Curriculum differentiated to ELL (3.2.11); Curriculum differentiated for students with disabilities (3.2.12); Show evidence of accommodations (3.3.10); Check that ELL students understand (3.3.9); Deal effectively with language barriers (3.3.14); Inclusive practices (3.3.15)

Instruction that is varied (both structured and unstructured, independent and collective, etc.): Encourage students to solve their own problems (3.3.7); Actively facilitate discussion (3.3.12); Choose what they do or how they do something (3.3.19); Provide direct instruction (3.3.3); Demonstrate/model concept (3.3.4)

Instruction that is active (not desk time, worksheets, etc.): Actively facilitate discussion (3.3.12); Community- and family-linked activities (3.2.7); Artistic/physical recreation activities (3.2.6)

Instruction that is focused (program components devoted to developing skills): Cognitive/enrichment activities (3.2.5); Clearly focused on instruction (3.3.11); Students demonstrate understanding of a concept/skill (3.3.23); Constructively critique/offer feedback (3.3.6); Students focused/actively participating (3.4.10)

Instruction that is explicit (targeting of specific skills): Provide clear instructions (3.2.9); Indicate skill focus (3.2.2); Communicate goals and purpose (3.3.2); Demonstrate/model a concept or skill (3.3.4)

Positive relationships between tutor and student, as well as between peers: Listen actively and attentively (3.4.2); Work cooperatively (3.3.18); Engage in peer–peer tutoring (3.3.22); Praise/encourage students (3.4.3); Engage positively with students (3.4.4); Student interactions positive (3.4.8); Students respond to staff (3.4.9)

Tutors who have both content and pedagogical knowledge: Provide accurate answers (3.3.8); Actively facilitate discussion (3.3.12); Demonstrate/model concept or skill (3.3.4); Clearly focused on instruction (3.3.11)

Student engagement: Materials used by student in goal of instruction (3.2.19); Discuss/ask questions about materials (3.2.20); Listen actively to presentation (3.3.17); Work cooperatively (3.3.18); Choose what they do or how they do something (3.3.19); Participate in structured discussions (3.3.20); Ask why, how, or what-if questions (3.3.21); Engage in peer–peer tutoring (3.3.22); Demonstrate understanding of a concept or skill (3.3.23); Push themselves intellectually, creatively, etc. (3.3.25); Student interactions positive with one another (3.4.8); Respond to staff directions (3.4.9); Focused and/or actively participating (3.4.10)

Quantitative Data Collection Activities

The sample frame for our quantitative data includes eligible students, registered students, and those attending supplemental services in the five districts in our study. We draw upon elementary, middle, and high school data from the administration of standardized tests and administrative databases for managing supplemental services programs, as well as student transcript and demographic data from the districts. These data are used in constructing measures of receipt of supplemental services, student-level controls to account for selection into supplemental services, and the outcome measures (i.e., changes in test scores). In estimating the impacts of supplemental services on student achievement, our quantitative team used an interrupted time series design with internal comparison groups and multiple nonexperimental approaches (value-added, student fixed effects, school and student fixed effects, and propensity score-matching models) to control for time-invariant school and student characteristics.

Integrated Data Analysis

Case study is a research strategy employing triangulation with data, investigators, theories, and even methodologies (Miles & Huberman, 1994). Indeed, our analytic process is part of our fully integrated mixed-method research design where the quantitative and qualitative teams coordinate and collaborate at all stages of the study (design, collection, analysis, and dissemination). Specifically, the qualitative and quantitative teams meet monthly to review analytical findings from both study components, direct additional data collection, refine analysis plans, and prepare dissemination of findings to stakeholders. Within the qualitative team, we developed and use common protocols by role group and in our observation of tutoring sessions. We use a constant comparative method (both within and across method) to develop and refine our understanding of patterns and dissimilarities in tutoring practices across providers.

Further, the same data are analyzed and discussed simultaneously by different researchers in an effort to consider and develop multiple interpretations of events observed. Throughout the process, we seek to examine potential trends in the instructional setting that may help in understanding the shortcomings and challenges faced by the policy "in action." Analytic codes are developed from these patterns and in response to the research questions, and then reapplied to interview, observation, and archival data in order to establish findings. As with any qualitative study, data analysis occurs both concurrent to and after the data collection process.


Qualitative research is uniquely suited to help us understand why we are not seeing significant effects on achievement from supplemental services. From our mixed-method study, we identify two primary reasons for the lack of effects. There is a "treatment exposure" problem (low attendance and actual amount of instructional time) and a quality problem (instruction is not systematically enriching or innovative, the curriculum does not consistently align to the day school curriculum, instruction does not equally meet all students' instructional needs, and there is variation in program quality within providers).


If instructional capacity depends on the quality of interactions between teachers, students, and instructional materials, there first must be ample opportunity (i.e., time) for these interactions to occur (Cohen & Ball, 1999). In other words, limited instructional time leads to limited instructional capacity. This section therefore discusses issues related to treatment exposure (the amount of actual instructional time students receive) in supplemental services: attendance rates, advertised time versus the actual instructional time of sessions, and "attendance flux." As noted earlier, existing quantitative research on supplemental services consistently shows low take-up rates among eligible students of 12–17% (Heinrich et al., 2010; USDOE, 2009; Zimmer et al., 2010). The quantitative portion of this study builds on this research by separately analyzing the multiple stages of selection—registration, attendance, and the number of hours attended. For example, we found across multiple sites and years that Whites, Hispanics, and Asian Americans are significantly less likely to register for or attend supplemental services, but if they do attend, they are significantly more likely than African Americans to reach higher attendance rates. ELLs, in contrast, are more likely to register and to attend more hours than non-ELL students.

Forty contact hours (hours invoiced by provider for an individual student) appears to be a critical threshold for having a statistically significant effect on elementary school students' math and reading gains in test scores and middle school students' math test score gains.6 If the threshold for realizing any impact on academic achievement is at least 40 hours of attendance, our quantitative data offer important insight into why we may not be seeing effects. Looking at hours of tutoring across the five districts in our study, we find that Chicago is the only district where substantial numbers of students (56% of elementary students) receive at least 40 hours of supplemental services. This is compared to 11% in Milwaukee and 14% in Minneapolis.

The maximum number of hours students can attend a supplemental services program (after registering) is influenced by two primary factors: the amount of the per-pupil allocation for supplemental services and the hourly rates charged by providers. For example, in 2008–2009, Austin allocated $1,334 per student for supplemental services, while over 70% of the participating students received supplemental services from a provider charging $75 or more per hour. At this hourly rate and with a budget of $1,334, the maximum amount of tutoring a supplemental services provider could offer a student was approximately 18 hours. In contrast, about 15% of students in Chicago were served by the district provider that charged $13.25 per hour in 2008–2009, and about 50% of students received supplemental services from a provider charging $50 or less per hour. Our quantitative analysis illustrates how lower hourly rates in Chicago led to students receiving more treatment.
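The ceiling on hours follows directly from dividing the per-pupil allocation by the provider's hourly rate. The sketch below reproduces the figures cited above (the Austin allocation and both hourly rates come from the text; applying the Austin allocation to the Chicago rate is our own illustrative comparison, since Chicago's allocation is not given):

```python
# Maximum tutoring hours a student can receive, given the district's
# per-pupil allocation and the provider's hourly rate.

def max_tutoring_hours(per_pupil_allocation, hourly_rate):
    return per_pupil_allocation / hourly_rate

# Austin, 2008-2009: $1,334 allocation, $75/hour provider
print(round(max_tutoring_hours(1334, 75)))     # prints 18

# Same allocation at the Chicago district provider's $13.25/hour rate
# (illustrative only; Chicago's actual per-pupil allocation is not given)
print(round(max_tutoring_hours(1334, 13.25)))  # prints 101
```

At $75 per hour, a student cannot come close to the 40-hour threshold discussed above; at $13.25 per hour, the same budget would cover it more than twice over.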

Clearly, quantitative work goes a long way in explaining the lack of significant effects on student achievement, especially considering the findings that suggest a threshold of attendance for effects. But, in conjunction with the quantitative work, our qualitative research offers critical insight into a more complex picture of "treatment exposure" that goes further in explaining low levels of effects on student outcomes.

Advertised Versus Instructional Time

Throughout our 94 observations of full tutoring sessions across districts, we consistently observed a difference between the advertised time of a tutoring session and the actual instructional time. Providers are required to advertise the average length of their sessions, and districts are invoiced at an hourly rate based on the time students spend in tutoring. In our sample, advertised sessions ranged from 60 to 150 minutes. Irrespective of format, students received less instructional time than providers advertised, although the magnitude of the difference varied by format. As displayed in Table 3, digital organizations most closely matched instructional time to advertised time. In school and community settings, average instructional time fell considerably short of average advertised time: by approximately 19 minutes for school-based tutoring and approximately 38 minutes for community-based tutoring.

Table 3. Average Advertised Time Versus Average Instructional Time by Format, in Minutes

Format (N: Year 1 / Year 2 / Both)    Advertised Time           Instructional Time        Difference
                                      Yr 1    Yr 2    Both      Yr 1    Yr 2    Both      Yr 1    Yr 2    Both
Digital (6/6/12)                      67.5    55.0    61.3      57.8    54.0    55.9      -9.7    -1.0    -5.4
In-home (14/4/18)                     64.2    60.0    63.3      60.6    55.3    59.4      -3.6    -4.8    -3.9
In-school (27/25/52)                  95.6    91.2    93.5      76.3    71.8    74.2      -19.2   -19.4   -19.3
Community-based (9/3/12)              116.7   90.0    110.0     70.0    76.7    71.7      -46.7   -13.3   -38.3
All in-person formats (50/32/82)      90.6    87.2    89.3      70.8    70.2    70.6      -19.8   -19.0   -18.7

Our fieldwork also offers insight into possible reasons for these discrepancies. In school-based tutoring, the format necessitates administrative tasks (e.g., rosters, snacks, and transportation). In addition, tutoring sessions at school sites have to compete with other activities (such as athletics), there tend to be larger numbers of students, and time is needed for students to transition from school dismissal to the supplemental services session. A school-based supplemental services tutor gave an example of these challenges. "By the time you go pick up the students and bring 'em to your room, they lost about five minutes. You know? Then you pass out the materials. I probably have 'em for about 55 minutes."

Some community-based sessions also faced challenges with the logistics of transportation (e.g., handing out bus passes, making sure that students get outside to meet the bus on time, or checking in with families as the provider picked up and dropped off students), which could limit the actual instructional time available. In addition, school and community settings often include food, which is not the case in digital or in-home sessions. Regardless of the reason, participating students may not be getting the full instructional treatment in sessions where there are demands on tutors to conduct activities other than instruction.

Attendance Flux

Observation data indicated a large number of tutoring sessions had considerable student mobility or "attendance flux," as measured by comparing the number of students observed in Observation Point A (during the first half of the session) with the number of students observed in Observation Point B (during the second half of the same session). When these numbers differed, we counted this observation as having attendance flux. Of the 63 observations with two or more students, 26 (41.3%) had mobility. Six out of the 26 sessions with mobility took place in community-based settings (6 out of 12 total community-based observations), and 19 out of the 26 sessions with mobility took place in school-based settings (19 out of 52 total school-based observations). One out of two digital sessions had mobility, and zero out of one home-based session had mobility. As noted above, the higher proportion of school-based attendance flux may reflect competition with other school-based activities. Through observations, as well as interviews with both tutors and provider administrators, we know that school-based supplemental services programs often compete with other afterschool programs (e.g., athletics and clubs) for students' time. For example, in one school-based tutoring observation, we noted a quarter of the session's students leaving early to attend a school-sponsored club that meets weekly to improve students' self-esteem.
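The classification rule behind "attendance flux" can be expressed compactly. This is an illustrative sketch with hypothetical head counts, not the study's data or code; `has_attendance_flux` is our name for the rule described above.

```python
# Illustrative sketch (hypothetical data): a session has "attendance flux"
# when the student head count at Observation Point A (first half of the
# session) differs from the count at Observation Point B (second half).
def has_attendance_flux(count_a: int, count_b: int) -> bool:
    return count_a != count_b

# (Point A, Point B) head counts for three hypothetical sessions
sessions = [(8, 6), (4, 4), (10, 12)]
eligible = [s for s in sessions if max(s) >= 2]  # only sessions with 2+ students
flux = sum(has_attendance_flux(a, b) for a, b in eligible)
print(f"{flux}/{len(eligible)} sessions showed attendance flux")
```

Applied to the study's 63 eligible observations, this rule flagged 26 sessions (41.3%) as having flux.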


Although treatment exposure may help explain the lack of effects of supplemental services on student outcomes, we argue that the central challenge of supplemental services is rooted in the details of instructional capacity and practice—the instructional core. None of the other elements of the supplemental services program (attendance rates, costs, and organizational structures) can make up for low-quality instruction. Again, we believe that, in order to understand why and how interventions might contribute to improved student outcomes, we must closely examine the instructional core (Cohen & Ball, 1999). Through this frame, we illustrate how supplemental services instruction lacks innovation, the curriculum typically does not tie into that of the day school, programs do not meet all students' particular instructional needs, and there can be considerable variation in quality within the same provider.

Instruction Mirrors Traditional Formats

The theory of action behind parental choice programs such as supplemental services is that less regulation gives programs the freedom to innovate. Despite this, and despite the research on OST tutoring emphasizing the importance of varied and activity-based programming, the general model of supplemental services tutoring we observed tended to mirror traditional academic learning environments. In other words, rather than providing something innovative and active, supplemental services tutoring was based on teacher-directed instruction with reliance on worksheets and drilling. Our observation data illustrates these patterns: the "Active" cluster consistently received very low ratings (average = .14), and the "Varied" cluster, although higher at an average of .56, had significant room for improvement (see Table 2).

There also were encouraging patterns in our observations of tutoring sessions. In most instances, we did find a learning environment focused on academic tasks and student learning (average = .77), and positive interactions between tutors and students. Yet, although it may be possible for traditional instructional approaches to lead to quality OST programs, especially if other factors such as well-trained staff and small ratios are present, research on OST argues for differentiated programming that responds to students' different learning styles or needs. While there was some evidence of tutors responding to students' different learning styles (the "differentiated" cluster average was 0.57), these examples were mostly included as part of written structures such as workbooks and worksheets.

In addition, while students appeared relatively engaged in tutoring practice, the sessions themselves were largely teacher-directed. Students followed directions, but rarely had occasion to determine the nature of learning activities or engage with their tutors in thinking through the bigger questions behind the lessons (e.g., "What if the main character had not responded in the manner that she did to the central conflict?"). Lastly, while students often sat together at large tables or in pairs, there were very few opportunities for small group work or cooperative activities. We rarely saw students working cooperatively on a lesson or project.

Only two providers across the five districts incorporated project-based, student-directed activities as a regular part of their tutoring curriculum, although the degree to which tutors received training and/or followed through on multimodal instructional approaches varied. One of these two providers encouraged their tutors to devise lessons and activities that situated learning in the context of students' cultures, which may include particular content areas or artistic forms. However, in observed cases such as the example described in the following field notes, hands-on and multimodal activities could be undercut by a lack of classroom management skills.

The computer lab is packed with students and tutors. Two groups of girls and one group of boys work on their digital comic strips. Today's task is to edit digital pictures of students that were taken in poses for the comic strip that had been written by another student. Each group has a set of pictures to edit. Tutors are actively helping the two groups of girls, who are working relatively quietly, carefully, and intently, but no one is sitting with the group of three boys. Two of the boys are messing around on other computers, which the lead tutor of the lesson finally puts a stop to about halfway through [supplemental services] time. Then there is a bit of banter between the tutors and the boys about sharing the editing work. By this point, most of the other groups are nearing completion of the task.

Tutor 2, who has been leading the students throughout the activity, encourages the writer of the storyboard to get more involved in the editing. Apparently there had been some misunderstanding about his level of interest in the editing part of the process. Tutor 1 tries another approach, using logic: “How are you ever gonna learn if you don't do it?”

Tutor 2 is growing impatient with the boys' antics. He attempts to threaten them with the loss of free time: “Free time starts at when? 4:00. I will cancel free time.” However, free time is not cancelled; it actually starts around 3:55. A couple of the boys keep working on the project for a few more minutes when the other students begin playing computer games. The two boys in the group who have been off to the side most of the session never really get a chance to do any of the editing.

The second of these two providers developed a highly structured curriculum around the pedagogical approach of multiple intelligences. While these curricular materials are not rooted in a particular culture, they depend on tutors leading students to skill development via various methods, such as visual, kinesthetic, artistic, and aural, as well as traditional verbal instruction and discussion. The following excerpt from an observation illustrates this approach:

The tutor led the students in a vocabulary hunt. The students gathered together on the carpet in the back of the room while the tutor explained the activity. Each set of partners was to walk around the room and find three cards taped throughout. While looking through the room, they were to hold hands or lock arms at all times. Once a group had their three cards, they returned to the carpet. Some of the cards had pictures of a side of a coin or a symbol like $ or the cent symbol. The other cards had words on them such as “dollar,” “cent,” and “nickel.” The students all worked together to match the words with the appropriate symbol or picture. When they were done matching, the teacher asked a series of questions such as, “How much is a nickel worth?”

Instances of enriching, activity-based instruction, such as the one described above, were exceedingly rare and, as noted, poor classroom management or pedagogic skills on the part of tutors could mute attempts at innovative instruction.

Qualitative interviews with stakeholders (i.e., providers, district administrators, and parents) offer important perspectives on supplemental services, creating a more nuanced picture of instruction. In general, providers had a more positive view of supplemental services instruction. Some reported that the actual nature of their instruction was more innovative and enriching, for example, integrating arts and culture into instruction or drawing on their own experiences in areas such as creative writing or theater.

Many providers also cited the low ratio of tutors to students as enriching in and of itself. By tutoring a few students, the tutors feel they get to know the students, the academic areas in which they are struggling, and how to tailor their instruction to meet the unique needs of individual students. In fact, we found grouping patterns in most tutoring sessions to be in line with OST best practices research (<1:10).7 Grouping patterns of 1:3 or less were available in the majority of sessions (67/94 or 71%), and no tutoring session had a grouping pattern larger than 1:11. We found certain tutoring formats were more likely to have smaller grouping patterns. Namely, home-based tutoring almost always involved a 1:1 grouping. Slightly over half of all other observations (i.e., excluding home-based tutoring) involved some 1:1 grouping. Parents across districts expressed satisfaction with the grouping patterns in their programs. "In general, the tutoring programs have smaller teacher-to-student ratios. It's a positive thing!" Although we can say with confidence that students in the supplemental services tutoring context are getting opportunities to work in small groups, and small groups offer instructional advantages, we do not feel a low tutor-to-student ratio by itself constitutes innovative instruction.

In addition, providers talked about the connection a tutor can develop with a student's family and teachers. Providers report regularly calling students and their families to check in on student attendance and instructional needs. Many providers also mentioned building relationships with students' day school teachers to better understand areas in which students are struggling and how best to help them. One tutor described the important effects of positive relationships:

I cannot teach them everything they need to know in the amount of time that I have to work with them, but if I can make them realize that they're capable learners, that they're smart, that the world is interesting, that they're gonna raise their hand more in class, they're gonna pay more attention during the school day so that the school day teacher is more effective with that student.

When asked to describe a unique contribution of supplemental services to students' lives, many tutors first mentioned the opportunity it provides to have another adult who cares about a student's success.8 Our observation data corroborates much of this perspective in that tutors were consistently observed "engaging positively" with students (average = .89) across all districts and formats.9 In addition, tutoring sessions had high ratings on a variety of related indicators such as "provide constructive criticism," "encourage participation from disengaged students," and "listen actively and attentively to students" (see Table 2).

Lastly, some providers cite their experience and ability to connect with students and their families as enriching to instruction. For example, one tutor described the following instructional interaction:

So I'm like, “Read to a tune.” “Ah, Miss, I can't do that.” “Okay, well, rap to the tune. Whatever you have to do to read this.” And then when he gets stuck on a word, “Okay, sound it out. How would you say it if you slowed it up and you rapped it? How would you use it?” And usually that clicks in his head and he gets it. So he loves doing it that way. So we would do little assignments where, “Okay, next week I want you to go over what we just read, and I want you to give me a beat to it.” So when you come back, it's pretty much he has it memorized because it's like he made a beat to it, and now in his head he's like, “Ah, I can read. Like I can really read this and understand it.” Because most people listen to music, but you don't understand it. And that's the way he is. He could read stuff, but he couldn't really comprehend it. But when it comes to music, like he comprehends music, he understands it.

In this example, the tutor felt her ability to connect academic tasks to rap offered a more enriching instructional approach, and one that was better connected to the student's interests. Yet, as subsequent sections will discuss, this type of innovation is seldom systematized.

In contrast to the provider perspective, many district administrative staff felt supplemental services instruction tended to be traditional with a reliance on rote learning and drills. For example, one administrator commented, "Well it's just that what they're mostly doing is all generic. It's all, 'Oh, you're in seventh grade? Okay, here's your seventh grade packet. Here's seventh grade [state test preparation workbook]. Start with worksheet 1 and go to worksheet 12.' You know, and just go down the line." Some districts also noted that they do not have adequate staffing or authority to sufficiently evaluate the quality of the supplemental services providers' instruction. Similarly, parents felt the nature of supplemental services instruction was not innovative and, in fact, in many cases lacked the pedagogical structure that may take place in a traditional classroom setting. Parents witnessed cases where the supplemental services tutors or instructors were disengaged, which led them to question the efficacy and rigor of the instruction. In some cases, parents were unaware of the instructional practices and requested more information from the providers beyond the individualized student reports. One parent described his child's experience with a digital provider:

That's why I'm sayin' that with that [provider], she stayed after school and there was the promise of the laptop at the end. I wouldn't do it again. Because the instructor didn't seem to be engaged. She just seemed to kinda be waiting for it to be over. And the kids kinda seemed to be playin'. They were supposed to be doin' work on the computer. I couldn't see whether she had actually completed anything that was substantial.

When we did observe quality, enriching instruction in the supplemental services context, it was atomized—the result of individual tutors, not provider-level efforts. For example, one tutor described her process of integrating more enriching curriculum materials into her tutoring sessions:

I am not a big fan of [the provider's] materials. I like to pick and pull—I'm not a drill-and-kill person. Just from my experience in the classroom. So a lot of the stuff that I pull is from Investigations. You know, from hands-on type curricular material that was a curriculum that we had in the district several years ago. So that's the main thing that I use for fractions. So we worked with pattern blocks and they got to use the manipulatives, and then I moved them to the more concrete.

Similarly, another tutor described how he deviated from the provider's curriculum to further engage one of his students:

It's just whatever gets the job done, you know? A number of weeks this one child had been trying to bring his Transformers to the table where we were working. And his mom was always taking 'em away from him. And we would begin every session rather upset because he didn't have his Transformers. So one day I said, “Hey. Let's work with the Transformers. You think you can teach Megatron® here how to count to 20?” And we did.

In yet another example, a classroom was observed where students rotated between three tutors, each focused on a different skill set. One tutor worked on prereading strategies using a real telescope, a video clip on constellations, and finally, a short reading passage. A second tutor did a math activity on fractions using M&Ms to chart color distributions. A third tutor used discussion around a collection of pictures to focus on science, specifically species classification. The tutors explained that although they sometimes use the provider's formal curriculum as supplemental materials, they came up with this system and the activities on their own in response to the needs of their students. Their approach was enriching, but atomized. It was the only classroom at this school site, from this provider, that used a system of rotating stations.

When present, the high-quality and innovative instruction hoped for in parental choice programs such as supplemental services was atomized and was primarily a result of the motivation and abilities of individual tutors, as opposed to the systematic efforts of an entire provider. Previous studies in the area of market-based educational reforms also have shown competition to result in conformity, as opposed to diversity and innovation (Adnett & Davies, 2000).

Curriculum Is Seldom Aligned to the Day School

In addition to suggesting that instruction (the "how" of learning) be innovative and enriching, research on OST academic programs suggests that tutoring curricula (the "what" of learning) should align with those of students' day schools (Beckett et al., 2009; Vandell et al., 2007). We also know from research on the instructional core that the nature of instructional materials and teachers' interactions with the curriculum are critical to instructional capacity (Cohen & Ball, 1999). Although alignment to common curriculum standards is important, we argue this does not go far enough. For example, if a district uses Six Traits Writing as the structure of its language arts instruction, it is best practice for tutoring instruction to do the same. Yet, in our fieldwork, coherence between tutoring and day school curricula was typically a default connection resulting from tutors also being staff at the students' school. It was up to individual tutors to seek quality connections between students' schoolwork and the content of tutoring sessions.

In the best of these instances, a tutor took advantage of these connections by frequently touching base with students' teachers and, in some instances, accessing students' special education Individualized Education Programs (IEPs) if necessary. These connections were less likely to happen with digital and in-home formats, where tutors had less interaction with the school context. In these instances, curricular connections to the day school curriculum came through the student.

It should be noted that supplemental services sessions mirrored the day school in terms of content focus, with the majority of time spent on reading, language arts, and mathematics. In 49 of 94 observations (52%), reading and language arts was the focus of the intervention. Slightly more tutoring sessions included mathematics than reading: in 55 of 94 observations (59%), math enrichment was the focus of the tutoring session. However, this focus on reading and mathematics reflects the stated focus of the supplemental services legislation. In theory, supplemental services is an intervention designed to assist schools in making adequate yearly progress on state assessments, which focus on reading, language arts, and math. Across districts, providers followed the letter of the law in terms of the instructional focus of programming.

Interestingly, in 15 out of 56 sessions observed (27%) in Year 1, and in 6 out of 38 sessions observed (16%) in Year 2, the explicit focus of the tutoring was test preparation activities. In Texas—a state that has explicitly adapted and expanded its day school materials to be closely aligned with content standards and the state assessment (Texas Assessment of Knowledge and Skills or "TAKS")—students completed worksheets photocopied from the test preparation booklet for their grade level. Although alignment to curricular content (math, reading, and language arts) is important, it is not as nuanced as a provider aligning a curricular approach to that of the day school.

Stakeholder Perspectives on Curriculum

Again, our qualitative interviews with stakeholders offer a nuanced picture of the nature of supplemental services curriculum. Providers can develop proprietary curriculum, purchase packaged curriculum from educational publishers, or draw upon curriculum in use by a district. Some providers were critical of the district curriculum and felt providers should offer a better product. Others felt the policy's prohibition of homework help limits what nonschool staff tutors can do to align with day school curriculum. Yet, many saw the benefit of alignment. For example, one tutor explained, "You wanna hopefully give them activities that will review previous concepts that the teachers have already gone over, or, I don't know, present them in different modes, or in different ways."

Positive relationships between providers and principals led to greater coordination of tutoring programs. For example, one provider talked about the benefits of working in a school where the principal is supportive of supplemental services and is concerned with the success of their students. They felt a principal's mindset influences the teachers they hire and the way they run the school.10 Providers and district administration in Milwaukee reported an improved effort to align curricula, partly as a result of a district policy where principals can identify "preferred providers" for their schools, which increased communication between providers and school-level staff. In addition, providers in Austin and Dallas reported that alignment was enhanced because both the supplemental services curriculum and day school curriculum are highly focused on developing test skills for the state standardized assessment in Texas. They reported using student-level data the district provides (i.e., state test scores) and conversations with classroom teachers to determine student needs. However, this does not mean day school and tutoring curricula progress through particular skills in the same order or speed.

District staff indicated the supplemental services curriculum should be aligned with the day school curriculum, but lamented a lack of communication between providers and day schools. An administrator in one district described the importance of alignment, as well as what it would look like in an actual tutoring session:

If I'm a tutor, I would come in and say, “What'd you do in class today?” “Worked on polynomials,” whatever, or, “we worked on reading comprehension of a narrative story,” or something like that, and I would mine for some data. I wouldn't, you know, say, “Pull out your homework assignment and let's just work from that.” No, mine for data from the student, find out what's going on in the classroom, and then see how you can connect the instruction that's taking place during the tutoring session with the prior knowledge and experience that the student is getting on a regular class. I mean, that's just basic 101 teaching, but, with no highly qualified teacher provision in [supplemental services], maybe some of the tutors didn't get basic 101 teaching.

This same administrator pointed out that such alignment requires structure as well as commitment from both the district and providers:

I got a couple calls from [supplemental services] site coordinators this year who said, “I've got teachers that want to know what the students are doing in their tutoring sessions because they're improving in the classroom.” And we didn't have a systematic way to allow teachers to see what the individual learning plans were for students or to look at the progress notes that students were, that providers were writing up on students. So one of the things that we'd really like to do for next year is encourage providers to do a more rigorous job of providing feedback on the progress reports that they're required to do four times a year.

Indeed, administrators from three of our districts specifically described putting structures in place to better facilitate communication and alignment between schools and providers. For example, one administrator explained:

We've had conversations with providers about alignment in the sense of letting them know very clearly what's happening in the regular day. What the curricular side of our house has articulated as the goals, and, you know, needs of our students at large and encouraging them to try to align their practices with those things. And so those things have been very formal and very intentional. And other things have just been kind of informal conversations with providers on how you can, you know, navigate the school community and understand how to build those relationships. But it has all been toward the end of serving kids best.

Similarly, administrators in one district felt many providers were moving closer to alignment, partly as a result of the district’s actions, such as requiring that tutors’ learning plans for students match district learning targets and opening up district professional development around curriculum to tutoring staff.

Despite attempts on the part of providers and districts, the overarching theme among parents was that supplemental services provider curriculum is not fully synchronized with the day school curriculum. For example, one parent lamented, "Well, the services did a lot of good to my son. But at the same time I was finding it was not synching with the homework that he was getting from the school." In many cases, students would work on one of the two subjects (language arts or math) and it would not coincide with what students were learning in their day school classrooms. "[My son], he's tellin' me that they're not covering any—it's like totally different levels of what they're studying [at school]. Here at the tutoring it's math, but it has no connection." Many parents requested more cooperation between the providers and the school in order to bridge the content gap that occurs during supplemental services.

Quality Instruction Is Not Equally Accessible for All Students

Through observations, interviews, and focus groups we found that supplemental services instruction typically does not equally meet all students' instructional needs, especially for students with disabilities (both physical and developmental) and ELLs. Such data shed important light on the nature of instruction, as well as possible factors in the limited effects on student achievement. This is a particularly critical insight when considering how important these two student populations are to one of the mandates of NCLB, an increased spotlight on accountability for subgroups' performance.

Twenty-one of the 25 providers in our sample advertised that they could serve ELL students, at least in a limited way or for limited languages. Eighteen of the 25 (though not necessarily the same providers) advertised that they could serve students with disabilities, at least in a limited way or for limited categories of disabilities (see Table 1). A major obstacle for providers was identifying students with documented ELL or special education needs. Specifically, the majority of tutors we observed and interviewed did not have access to IEPs or district data on ELL identification. Most providers only knew of students with disabilities because their tutors were also teachers in the day school or parents notified them.

In addition to difficulties in identifying special needs, tutor capacity was an important limitation to the ability of providers to offer quality services to ELLs and students with disabilities. For example, our interview and observation data, as well as our review of provider staff training materials, suggest that, with few exceptions, tutors did not have specific training or certification in working with students with disabilities. We did observe many sessions with certified teachers as tutors, and most teachers have had training related to students with disabilities as part of their certification process. In addition, many classroom teachers have considerable experience working with students with disabilities in their day classrooms, but we did not find training and experience with students with disabilities as a structured part of tutor training.

Similarly, we found examples of tutors who had training, certification, and experience working with ELLs, but only as a result of their teaching position in the day school. We also observed a number of bilingual tutors who might use multiple languages to clarify terms, but bilingualism does not equate to having been trained in teaching ELL students. There were instances of monolingual, English-speaking tutors experiencing language barriers with ELLs, particularly in communicating with students who speak less common foreign languages (e.g., Somali or Vietnamese).

In addition to limited tutor capacity, we found the curriculum was rarely formatted to accommodate the particular needs of ELLs or students with disabilities. For example, one tutor described using the same curriculum with students with disabilities as with the other students, and the only instructional adaptation was she gave them more "one-on-one attention." In addition, existing curriculum was sometimes "slowed down" for students with disabilities or a lower grade level was used.

Stakeholder Perspectives on Students with Special Needs

In general, providers felt that districts did not have a systematic process for sharing information about the status of ELLs or students with disabilities. Yet providers felt they were still able to tailor services to students' individual needs. Many providers helped ELL students by employing bilingual staff. As one provider explained, "I have not had very many students so far who struggle with language to the degree that it's really been a barrier, and I take all comers. Um, there've been a few issues with Spanish-speaking families, I speak a smattering of Spanish, I've been able to work through family members, so again, it's just a matter of that pragmatic approach, I just do what needs to be done." Only in a few instances, however, did providers describe using a specific curriculum or instructional strategies for ELLs or students with disabilities. Instead, they discussed accommodations in broad terms. For example, when asked what is different in how he approaches a tutoring session with students with disabilities, one tutor responded, "I guess everything—I mean in terms of what I'm planning on achieving for that lesson, how I behave, the speed with which we conduct the session." Another tutor described her approach to working with a student with a learning disability: "With [student name] we started out doing the basic I do with all my kids, but it was just a little harder for him. So we changed it up. I simplified all of the questions that I was asking him and I broke down the concepts." Although the tutor slowed down the instruction, it was not specifically differentiated to the student's special needs.

In contrast, district staff did not think supplemental services providers were responding to all students' needs, especially those of ELLs and students with disabilities. Some district staff spoke about providing IEPs, state test scores, and information about students' ELL and disability status to providers, only to find that providers did not use this information to differentiate their instruction. One district administrator argued that it was the providers' job to find and train qualified staff who could meet students' and parents' language needs:

You want to provide services in [this district]? Get a multilingual staff, period. The people who are monolingual have no business in the business. This is who our district is, we are over one-fourth English language learners, we are almost a third [one-third] families who speak a language other than English in the home. You want to do business here, figure it out. I have no patience for that. That's where these folks are choosing to do business; they have to figure it out, just like we have to figure it out.

While district administrators acknowledged the challenges of disseminating information about students' language or disability status, many felt it was primarily the providers' responsibility to figure out how to deliver the right services to students with particular instructional needs.

Understanding the quality of instruction means looking at classroom practice, but also listening to how students and parents filter these experiences. In somewhat of a contrast to our observational data, many parents in our focus groups felt that their child's needs related to language or disability status were met. For example, one parent reported, "They told me things that they would do. Like give her time to talk when she needs to talk. If she needed to have a little more time for testing or something, that they would do that. They were pretty compensating about that." Many parents actually approached the provider to inform them of their child's disability: "I informed. But, the first day I went with [my daughter] and I just informed them of that." These parents felt that with smaller student-to-tutor ratios, supplemental services instructors were able to provide instruction that would serve students with disabilities.

For the most part, parents also felt that supplemental services providers did an adequate job of addressing the instructional needs of their ELL students, but felt that information about the nature of the programs could be more accessible. While it is promising that many parents felt their students' special needs were addressed by providers, our other data sources suggest considerable room for improvement in the quality of instruction for ELLs and students with disabilities.

Variation Within Providers

Lastly, we observed considerable variation in instructional quality within providers. For example, we observed sessions of very different instructional styles and quality from one provider that offers services in both schools and homes. In a school-based session, the tutor worked with three students together for one hour on a variety of math activities, all focused on the same concepts around long division. This tutor was also the math specialist at the school and incorporated a number of activities and strategies from her day school resources to engage students in active learning. In contrast, a tutor from the same provider worked with one student at home for two hours. She was not a certified teacher, although she had coursework and experience in tutoring. She relied exclusively on the printed worksheets from the provider and jumped from concept to concept, even from math to reading, depending on the worksheet. The student was not actively engaged.

As this example illustrates, there is intra-provider variation in both instruction and curriculum materials, which came from a variety of formal sources (the provider's website or materials directly from provider administrators) and informal sources (tutors' own resources or students' work from the day school). The "in-use" curriculum often included the formal materials but was supplemented by materials from the tutor, which at times were inconsistent with the formal curriculum.

The theory of action behind supplemental services is that variation between providers creates a competitive marketplace from which parents can choose the most appropriate program for their students' needs. Variation within providers, however, confounds the assumption that the axis of parental choice lies at the provider level, and it also complicates efforts to determine program effects at the provider level.

Discussion
States and districts are spending billions of Title I dollars to implement supplemental services, yet a growing body of research shows the intervention to have little to no effect on student achievement. Why is this the case, and how, if at all, does the lack of significant effects relate to the instructional characteristics of the intervention? We adopted Cohen and Ball's (1999) frame of instructional capacity, developed for day school classroom settings, and adapted it for OST settings, specifically supplemental services. This frame helped us examine the components of this particular afterschool instructional setting that may underlie broader patterns of limited effects, and it pointed to two primary sets of characteristics. First, there is a treatment exposure problem: attendance is low, and instructional time is not maximized. Second, there is a quality problem: instruction is not innovative, the curriculum does not align with the day school curriculum, instruction does not equally meet all students' instructional needs, and there is considerable variation in program quality within providers. Our findings are important not only for improving access to and the quality of this major instructional intervention, but also for a host of other parental choice and accountability-based reforms.


This paper identifies and highlights the interaction of multiple components within the instructional core of supplemental services: tutors, students, and instructional resources (Cohen & Ball, 1999). This allows us to evaluate supplemental services against the best practices associated with quality OST programs (e.g., instruction that is differentiated, active, varied, focused, and connected to the day school curriculum). We are developing more sensitive measures of instructional practice that go beyond test scores, laying the foundation not only for establishing best practices for supplemental services but also for offering ways to structure organizations and policy to facilitate those practices. Based on this work, we offer a number of policy recommendations to improve both the quality and the amount of tutoring students receive:

Establish minimum criteria (beyond simply referring to state standards) for aligning the tutoring curriculum to that of the day school (e.g., both providers and day school use Six Traits Writing for developing writing skills).

Assess instructional quality using standardized observation tools and curriculum analysis that encourage enrichment and differentiation.

Require that providers have tutors on staff with demonstrated knowledge of diagnosing and addressing the educational needs of students with disabilities and ELLs.

Structure hourly rates, per pupil allotments, and invoicing systems to ensure students receive at least 40 hours of instructional time.

Build coordinated systems, facilitated by policy structures, since access to and quality of OST programs improve when such systems are in place (Bodilly et al., 2010).

Classroom setting matters to student learning (Nye, Konstantopoulos, & Hedges, 2004; Pianta & Hamre, 2009). Therefore, in the current accountability-based policy context, where reforms emphasize evidence-based practice, it is critical that we conduct rigorous and nuanced fieldwork on the instructional setting. This qualitative focus on the instructional core does not replace quantitative analysis of outcomes, but instead enhances it by providing an in-depth examination of why we see (or do not see) certain effects across settings. Integration of rich qualitative data with quantitative analysis of outcomes must be part of the evidence considered in studies of large-scale interventions. When integrated, these two methodological lenses offer a far more rigorous assessment of an intervention, addressing not only whether an intervention impacts student learning, but how and why. Ultimately, we urge both policymakers and researchers to attend to the instructional setting—the core—so that we can truly understand the dynamics behind interventions such as supplemental services and provide the tools to structure, implement, and assess the instructional quality of large-scale reforms targeting historically underserved students.

Acknowledgments
We thank the funder of this research, the Institute of Education Sciences, PR/Award number R305A090301, Education Policy, Finance, and Systems Research Program, Goal 3.

Notes
1. The use of "supplemental services" in this paper is in specific reference to the portion of NCLB entitled "supplemental educational services," which also can be referred to as "SES."

2. However, obtaining a sample perfectly representative of these provider characteristics proved challenging. Limitations include reluctance on the part of providers, low numbers of providers with more than one year of service in smaller urban districts (i.e., districts that only recently had to start offering supplemental services), and a limited number of providers in some sites that target ELL students and students with disabilities.

3. A copy of the observation instrument is available at www.sesiq2.wceruw.org.

4. We conduct regular reliability training with the qualitative research team to ensure consistency in ratings. In each session, the research team rated the same video segment of an instructional session and went through each indicator to compare ratings. Validity of the instrument is supported by its development process: its structure and content are based on well-tested existing observation instruments for OST, the existing literature on best practices for OST, and the theory of action in the supplemental services policy. We continue to test and refine the data collection process as the study progresses.

5. Unobservable or not applicable indicators are omitted.

6. Springer et al. (2009) and Zimmer et al. (2007) likewise found more consistent, positive effects of supplemental services on students' math (vs. reading) gains in their studies of supplemental services in large, urban school districts.

7. By "grouping pattern," we mean the allocation of staff to students for the purpose of instruction. Based on our observations, grouping pattern is a more accurate indicator of staffing resources than tutor/pupil ratio. To illustrate, two tutors may be assigned to four students (ratio = 2:4), but only one of the tutors might be involved in instruction (grouping pattern = 1:4).

8. It should be noted that homework was not done in the vast majority of tutoring sessions observed. In approximately 13% of sessions (12 of 94), students attending supplemental services tutoring worked on homework assigned by their day classroom teachers.

9. "Engaging positively" means "staff have generally positive interactions with students. These interactions are constructive and supportive. Staff use affirming words and tone of voice, speaking in a manner that indicates respect, appreciation, and belief in the value and potential of students. Staff initiate informal conversations with students and respond to students' efforts to talk to them by showing interest and extending the conversation. Staff make an effort to build relationships with the students through a variety of means. Staff also move around to student work spaces, instead of staying in one place (i.e., their desk) the entire session."

10. Similarly, Koyama (2011) found principals to be powerful actors in the interpretation and implementation of supplemental services at the school level.

References
Adnett, N., & Davies, P. (2000). Competition and curriculum diversity in local schooling markets: Theory and evidence. Journal of Education Policy, 15(2), 157–167.

Barley, Z., & Wegner, S. (2010). Examining the provision of supplemental educational services in nine rural schools. Journal of Research in Rural Education, 25(5), 1–13.

Barnhart, M. (2011). The impact of participation in supplemental education services (SES) on student achievement: 2009–10 (Publication No. 379). Los Angeles, CA: Los Angeles Unified School District Research Unit.

Beckett, M., Borman, G., Capizzano, J., Parsley, D., Ross, S., Schirm, A., & Taylor, J. (2009). Structuring out-of-school time to improve academic achievement: A practice guide (NCEE #2009-012). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides

Bodilly, S., McCombs, J., Orr, N., Scherer, E., Constant, L., & Gershwin, D. (2010). Hours of opportunity: How cities can build systems to improve out-of-school-time programs. Santa Monica, CA: RAND Corporation.

Burch, P. (2007b). Supplemental educational services under NCLB: Emerging evidence and policy issues. East Lansing, MI: The Great Lakes Center for Education Research & Practice. Retrieved from http://epsl.asu.edu/epru/documents/EPSL-0705-232-EPRU.pdf

Burch, P. (2009). Hidden markets: The new education privatization. New York, NY: Routledge.

Burch, P., Heinrich, C. J., Good, A., & Stewart, M. (2011). Equal access to quality in federally mandated tutoring: Preliminary findings of a multisite study of supplemental educational services. Working paper presented at the 2011 Sociology of Education Conference, Monterey, CA.

Burch, P., Steinberg, M., & Donovan, J. (2007). Supplemental educational services and NCLB: Policy assumptions, market practices, emerging issues. Educational Evaluation and Policy Analysis, 29(2), 115–133.

Center on Education Policy. (2007). Behind the numbers: Interviews in 22 states about achievement data and No Child Left Behind Act policies. Washington, DC: Author.

Coburn, C. (2004). Beyond decoupling: Rethinking the relationship between the institutional environment and the classroom. Sociology of Education, 77, 211–226.

Cohen, D., & Ball, D. (1999). Instruction, capacity and improvement: CPRE Research Report Series RR-43. Philadelphia, PA: Consortium for Policy Research in Education.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: SAGE.

Deke, J., Dragoset, L., Bogen, K., & Gill, B. (2012). Impacts of Title I Supplemental Educational Services on student achievement (NCEE 2012-4053). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Durlak, J. A., & Weissberg, R. P. (2007). The impact of after-school programs that promote personal and social skills. Chicago, IL: CASEL.

Elmore, R. (2004). School reform from the inside out: Policy, practice, and performance. Cambridge, MA: Harvard University Press.

Farkas, G., & Durham, R. (2006, February). The role of tutoring in standards-based reform. Presented at the Standards-Based Reform Conference at University of Wisconsin-Madison.

Friedman, M. (1962). Capitalism and freedom. Chicago, IL: University of Chicago Press.

Fusarelli, L. (2007). Restricted choices, limited options: Implementing choice and supplemental educational services in No Child Left Behind. Educational Policy, 21(1).

Gill, B., McCombs, J. S., Naftel, S., Ross, K., Song, M., Harmon, J., & Vernez, G. (2008). State and local implementation of the No Child Left Behind Act, Volume IV-Title I school choice and supplemental educational services: Interim report. Washington, DC: Department of Education.

Government Accountability Office. (2006). No Child Left Behind Act: Education action needed to improve local implementation and state evaluation of supplemental educational services. Washington, DC: Author.

Heinrich, C., Meyer, R., & Whitten, G. (2010). Supplemental educational services under No Child Left Behind: Who signs up, and what do they gain? Educational Evaluation and Policy Analysis, 32, 273–298.

Heistad, D. (2007). Evaluation of supplemental education services in Minneapolis Public Schools: An application of matched sample statistical design. Minneapolis, MN: Office of Research, Evaluation and Assessment, Minneapolis Public Schools.

Hill, C. J., Bloom, H. S., Black, A. R., & Lipsey, M. W. (2008). Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives, 2(3), 172–177.

Koyama, J. (2011). Principals, power, and policy: Enacting 'Supplemental Educational Services' (SES). Anthropology and Education Quarterly, 42(1), 20–36.

Lauer, P., Akiba, M., Wilkerson, S., Apthorp, H., Snow, D., & Martin-Glenn, M. (2006). Out-of-School-Time programs: A meta-analysis of effects for at-risk students. Review of Educational Research, 76(2), 275–313.

Little, P., Wimer, C., & Weiss, H. (2008). After school programs in the 21st century: Their potential and what it takes to achieve it. Cambridge, MA: Harvard Family Research Project.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis. Thousand Oaks, CA: SAGE.

Nye, B., Konstantopoulos, S., & Hedges, L. V. (2004). How large are teacher effects? Educational Evaluation and Policy Analysis, 26, 237–257.

Pianta, R., & Hamre, B. (2009). Conceptualization, measurement, and improvement of classroom processes: Standardized observation can leverage capacity. Educational Researcher, 38(2), 109–119.

Potter, A., Ross, S., Paek, J., McKay, D., Ashton, J., & Sanders, A. (2007). Supplemental educational services in the state of Tennessee: 2005–2006. Memphis, TN: Center for Research in Educational Policy.

Spillane, J. (1996). School districts matter: Local educational authorities and state instructional policy. Educational Policy, 10(1), 63–87.

Springer, M. G., Pepper, M. J., & Ghosh-Dastidar, B. (2009). Supplemental educational services and student test score gains: Evidence from a large urban school district (Working paper). Nashville, TN: Vanderbilt University.

Sunderman, G. L. (2006). Do supplemental educational services increase opportunities for minority students? Phi Delta Kappan, 88(2), 117–122.

Sunderman, G. L., & Kim, J. (2004). Increasing bureaucracy or increasing opportunities? School district experience with supplemental educational services. Cambridge, MA: The Civil Rights Project at Harvard.

U.S. Department of Education. (2005, August 22). No Child Left Behind: Supplemental educational services non-regulatory guidance. (Final Guidance). Washington, DC: Author.

U.S. Department of Education. (2008). No Child Left Behind—2008: Summary of final Title I regulations. Washington, DC: Author.

U.S. Department of Education. (2009). State and local implementation of the No Child Left Behind Act volume VII—Title I school choice and supplemental educational services: Final report. Retrieved from http://www.ed.gov

U.S. Department of Education. (2012). 26 more states and D.C. seek flexibility from NCLB to drive education reforms in second round of requests. Retrieved from http://www.ed.gov

Vandell, D., Reisner, E., & Pierce, K. (2007). Outcomes linked to high-quality afterschool programs: Longitudinal findings from the study of promising practices. Irvine, CA: University of California and Washington, DC: Policy Studies Associates.

Yin, R. K. (1991). Applications of case study research. Washington, DC: Cosmos Corp.

Zimmer, R., Gill, B., Razquin, P., Booker, K., & Lockwood, J. R. (2007). State and local implementation of the No Child Left Behind Act: Volume I—Title I school choice, supplemental educational services, and student achievement. Washington, DC: RAND.

Zimmer, R., Hamilton, L., & Christina, R. (2010). After-school tutoring in the context of No Child Left Behind: Effectiveness of two programs in the Pittsburgh Public Schools (PPS). Economics of Education Review, 29(1), 18–28.

Cite This Article as: Teachers College Record, Volume 116, Number 3, 2014. https://www.tcrecord.org ID Number: 17351


About the Authors
  • Annalee Good
    University of Wisconsin, Madison
    ANNALEE G. GOOD is a Research Associate at the University of Wisconsin-Madison. She has published and presented numerous papers on the nature of the instructional landscape in federally funded tutoring programs, as well as the role of K–12 classroom teachers in the development of educational policy.
  • Patricia Burch
    University of Southern California
    PATRICIA BURCH is an Associate Professor at the Rossier School of Education at the University of Southern California, where her research examines the role of private firms as influences in the design and implementation of K–12 education policy and how public education is being transformed by new forms of privatization. Her book, Hidden Markets: The New Education Privatization, was published by Routledge in 2009.
  • Mary Stewart
    University of Wisconsin, Madison
    MARY S. STEWART is a doctoral candidate at the University of Wisconsin-Madison in the Educational Policy Studies Department. Her research focuses on educational policy implementation, legal issues in education, and program evaluation.
  • Rudy Acosta
    University of Southern California
    RUDY ACOSTA is a doctoral candidate at the University of Southern California in the Rossier School of Education. His research focuses on educational policy of market-based initiatives in K–12 public schooling, access of information of educational policies to under-resourced communities, and parent and community organizing around school reform.
  • Carolyn Heinrich
    University of Texas, Austin
    CAROLYN HEINRICH is the Sid Richardson professor of public affairs and affiliated professor of economics and the director of the Center for Health and Social Policy at the Lyndon B. Johnson School of Public Affairs, University of Texas at Austin. Her research focuses on education and social welfare policy, labor force development, public management, and econometric methods for program evaluation, working directly with governments at all levels. Recent publications include articles in journals such as the Journal of Policy Analysis and Management and Educational Evaluation and Policy Analysis and a book (2011), The Performance of Performance Standards.