
Beyond Academic Math: The Role of Applied STEM Course Taking in High School

by Michael A. Gottfried, Robert Bozick & Sinduja V. Srinivasan - 2014

Background/Context: Educational policymakers and researchers are concerned about the declining quantity and quality of U.S. students in line to pursue careers in science, technology, engineering, and mathematics (STEM) fields. As one policy response, a number of federal initiatives have been enacted to enhance STEM curriculum in schools. Part of this push has been to offer applied STEM courses in the K–12 curriculum to reinforce academic STEM material as well as motivate students to remain in these fields. Prior to this current study, no national-level research has evaluated the effectiveness of these courses.

Purpose: (a) What applied STEM courses are most commonly taken by high school students? (b) To what extent are high school students taking both academic math courses and applied STEM courses? (c) Do applied STEM courses in high school improve achievement in math?

Participants: To address the three research questions listed above, this study relies on a comprehensive longitudinal dataset: the Education Longitudinal Study of 2002 (ELS:2002). The present study is based on a sample of approximately 11,112 students who participated in the base-year (10th grade, 2002) and first follow-up (12th grade, 2004) interviews, who completed math assessments in both years, and for whom valid transcript information was collected.

Research Design: This study begins with a descriptive analysis to evaluate which students have taken applied STEM courses and at which ability level. From this, a common set of applied STEM courses is determined across this nationally representative dataset. Next, this study relies on a linear regression model of math achievement where the dependent variable is a standardized math score. Independent covariates include measures as to whether or not a student had taken applied STEM courses, academic math courses taken by the student, and a range of controls.

Findings: Students who took an applied STEM course had higher math scores than their peers who did not, all else equal. These courses may be particularly beneficial for students who are less oriented toward advanced math.

Conclusions/Recommendations: Applied STEM courses can be used to support learning in math instructed elsewhere in the curriculum, particularly for those students at the lower end of the math pipeline. In providing hands-on learning, often with technology and with direct application to concrete occupationally specific problems, applied STEM courses may serve as a critical means to support an understanding of concepts taught in lower level math pipeline courses.

In response to an economy that increasingly requires critical thinking, quantitative reasoning, and problem-solving skills to adapt to innovation, schools are experimenting with new ways to engender students’ interest and proficiency in science, technology, engineering, and mathematics (STEM) in order to boost achievement outcomes. One strategy is to supplement courses that teach math and science concepts with an applied or occupational framework, thereby connecting the varying components of STEM using job-specific applications as a context for instruction. These applied courses are intended to complement the skills taught in the traditional academic curriculum while demonstrating the utility of math and science in a real-world setting. The present study focuses on two types of applied STEM courses that are most directly relevant to STEM careers: scientific research and engineering (SRE) courses and information technology (IT) courses (Bradby & Hudson, 2007). Among high school graduates in 2005, 12% earned credit in an SRE course and 19% earned credit in an IT course (Hudson & Laird, 2009).

Despite their prevalence in American high schools, little is known about the efficacy of these courses. The present study contributes to the broader research base on curricular pathways in STEM by examining applied course taking in high school using transcripts and standardized test scores from a nationally representative cohort of high school students. Specifically, it details which applied courses are most commonly taken by high school students; the extent to which high school students take applied courses alongside their traditional math courses; and whether applied courses are an effective means to boost student proficiency in math.  


The importance of a well-trained scientific workforce to a nation’s economic vitality cannot be overstated. Strong economies rely on scientists and engineers to ensure national security, to solve a nation’s most critical problems, and to increase the general standard of living (Hira, 2010). At present, American policy makers, educators, and business leaders are deeply concerned about the quantity and quality of youth in line to pursue these positions. In terms of quantity, the percentage of undergraduates in the United States earning degrees in engineering, physical science, or math has remained stagnant, and the percentage pursuing computer science has declined slightly (National Science Board, 2010). In terms of quality, youth in the United States are less prepared in science and mathematics than their international counterparts. For example, 15-year-olds in the United States rank 16th out of 26 countries in science literacy and 19th out of 26 in mathematical literacy (National Science Board, 2010). These statistics do not bode well for a young U.S. workforce entering a highly competitive and global economy.

Part of the problem can be attributed to a lack of instructional resources and support in the elementary and secondary years. For example, the National Research Council (2011) notes that insufficient teacher training, ineffective instructional practices, and a lack of materials (e.g., lab equipment, textbooks, etc.) are critical factors that inhibit student learning in STEM. Beyond any single resource deficit, however, is that the overall curriculum in support of STEM learning tends to be fractured and disconnected (Stone & Lewis, 2012). For example, basic algebraic concepts (taught in traditional academic math courses) are essential in understanding the underpinnings of energy production, a concept taught in traditional academic science courses. When students learn the interconnectedness of math and science concepts, they are in a better position to develop higher order reasoning and logic skills required in STEM occupations (Stone & Lewis, 2012). However, most high school math and science courses are taught in silos by different instructors and hence are not organized to bridge concepts in a complementary integrated manner. Though specific interventions aimed at integrating subject areas to make a more coherent curriculum have proven to improve student learning (see for example, Stone, Alfeld, Pearson, Lewis, & Jensen, 2008), these remain the exception and not the norm.

As a policy response, a number of federal initiatives have been enacted with specific provisions for the enhancement of STEM curriculum in schools. For example, the Carl D. Perkins Career and Technical Education Improvement Act of 2006 provided funds for the development of integrated curricula in support of occupationally focused STEM training, and the America COMPETES Act of 2007 supported programs that increased the number of teachers qualified to teach STEM courses in high school. More recently, the push for an integrated curriculum has been accelerated by the Next Generation Science Standards (NGSS) and the Common Core State Standards (CCSS)—the latter, for example, being a set of academically focused benchmarks in math and English developed by the National Governors Association and the Council of Chief State School Officers and currently adopted by 45 states. The CCSS in math emphasize student proficiency in empirical analysis, decision making, and real-world applications as a means to ensure that students are both “college and career ready.”

These reform efforts have the following in common: the development, staffing, and delivery of courses that directly support STEM learning. By and large, there are two types of courses within the STEM curriculum: academic STEM courses and applied STEM courses. Academic STEM includes math and science courses that comprise the traditional academic curriculum, such as algebra, geometry, calculus, biology, chemistry, and physics. Academic courses are typically taught from a theoretical approach that stresses procedures, observation, identification, documentation, and computation. To augment and contextualize the material taught in the traditional academic curriculum, some schools offer applied STEM courses, which emphasize the application of academic concepts to real-world job experiences while incorporating quantitative reasoning, logic, and problem-solving skills. By design, applied courses impart skills and knowledge that have direct relevance to the daily challenges and problems students will face should they pursue a STEM career. These courses assume that skills are procedures learned within the tasks rather than being explicitly taught.

Though the contemporary policy climate creates implementation challenges at the nexus of school reform and labor market demands at the national level, decisions about how to provide quality STEM education to students are mostly made at the district and/or school level. Those in charge of designing the local math and science curricula typically have two options for promoting the acquisition of STEM skills: implement packaged programs that embed STEM instruction within existing academic courses or add applied STEM courses to complement the existing academic curriculum. The former approach tends to be more time and resource intensive and involves extensive planning, staff development, and oftentimes partnerships with local agencies, colleges, and employers (Brophy, Klein, Portsmore, & Rogers, 2008). Examples of the former include VIBES (Vanderbilt Instruction in Biomedical Engineering for Secondary Science), Project Lead the Way, the University of Nebraska-Lincoln Summer Engineering Institute, and RAISE (Revitalizing Achievement by Using Instrumentation in Science Education)—all of which rely substantially on teacher retraining and professional development to improve the content and instruction of existing math and science courses (Brophy et al., 2008; Iskander & Kapila, 2012; Nugent, Kunz, Rilett, & Jones, 2010).

More common, however, is the addition of applied STEM courses to the preexisting career and technical education curriculum. Per the taxonomy developed by the U.S. Department of Education, there are two predominant categories of applied STEM courses in the high school curriculum in the United States: SRE courses and IT courses. For ease of expression, the phrase “applied STEM courses” refers collectively to both SRE and IT courses. SRE courses integrate basic concepts in math and science to instruct students on the steps of the engineering process (i.e., identify the problem, design, build, test, and evaluate). These courses teach students how to solve problems within the context of planning, managing, and providing scientific research and professional and technical services, including laboratory and testing services and research and development services. Examples of SRE courses include surveying, electrical engineering, structural engineering, and computer-assisted design/drafting. IT courses, on the other hand, teach basic programming and systems functionality with a focus on practical problem solving. They involve the design, development, support, and management of hardware, software, multimedia, and systems integration services. Examples of IT courses include introduction to computer science, C++ programming, visual basic programming, and data processing.

Learning in core academic subjects such as math is most directly influenced by the concepts and skills taught in traditional academic courses; hence, the mechanisms through which applied STEM courses (SRE and IT) influence overall achievement are likely more indirect and heterogeneous. But by relying on the framework put forth in the research on academic STEM course taking (i.e., Tyson, Lee, Borman, & Hanson, 2007), it does seem plausible for there to be three conceptual mechanisms by which applied STEM courses improve achievement outcomes. The first mechanism is reinforcement: through additional classroom instruction in related fields, such as engineering, students can reapply concepts taught in academic math and science courses. The second mechanism is relevance and engagement. Applied STEM courses may translate theoretical and abstract concepts from academic subjects into relevant applications. In doing so, students might grasp traditional math and science concepts more effectively, given the exposure to the material from a more relevant and engaging perspective (Stone & Lewis, 2012). Finally, applied courses might promote new skill formation. Because applied courses stress the development of reasoning, logic, and problem solving, these skills may in turn influence a student’s achievement levels.

Given these mechanisms, it is the hope of most educators that applied STEM courses will improve academic learning by making theoretical math and science concepts more concrete and relatable, particularly to students who may be less academically inclined. However, it is possible that STEM courses could negatively influence learning by taking the place of academic courses in students’ schedules, depriving them of opportunities for direct academic instruction. Thus, the types of academic and applied courses that students take are key to understanding the potential efficacy of an integrated STEM curriculum.

The research to date has yet to clarify the direction and magnitude of such relationships using nationally representative or large-scale data. However, given the national-level importance of a declining quantity and quality of U.S. students in STEM, national-level data might serve the field most effectively in addressing these issues. Moreover, while a large number of studies demonstrate that academic STEM courses support math achievement (see for example, Newton, 2010), it is unknown whether the same holds true for applied STEM courses. Some research on occupational course taking more broadly—of which applied STEM is only one component—finds that these courses in the aggregate have little bearing on learning (Agodini, 2001; Rasinski & Pedlow, 1998). However, there is no research that explicitly examines SRE and IT courses apart from the heterogeneous range of occupational courses. In other words, applied courses may be beneficial, but their unique effects are obscured as they are examined in combination with an array of occupational courses that are less relevant for academic learning, such as agriculture, culinary arts, and consumer services. The present study focuses specifically on SRE and IT courses and therefore provides new knowledge on the distribution and efficacy of applied STEM course taking.  


This study addresses three research questions:

1. What applied STEM courses are most commonly taken by high school students, where applied STEM courses are defined as SRE and/or IT courses?

2. To what extent are high school students taking both academic math courses and applied STEM courses?

3. Do applied STEM courses in high school improve achievement in math?

Answering the first question will help to define the contours of the applied STEM course-taking landscape in American high schools. Though this information is central to any educational policy discussion of STEM preparation, there is surprisingly no research at the national level documenting which applied courses are most commonly taken.1 Unlike in the traditional math and science curriculum, where courses are sequential in their organization (e.g., Algebra I is required to progress to Algebra II), there is a wider range of applied courses that can be taught alongside or independently of related courses. Whether there is a common set of “core courses” that comprise the applied STEM curriculum or whether there is variability in the types of applied STEM courses that are taken is unknown. Hence, answering the first question will provide new information regarding STEM course taking as well as serve as a foundation for the rest of the analysis.

The second question further explores the landscape of applied STEM course taking by examining the extent to which these courses overlap with traditional academic courses in the math curriculum at all points in the math ability spectrum. Applied courses are intended to build upon and to contextualize the material delivered in academic courses. However, it is unclear if students reaching more advanced levels of math and who also take applied courses are enrolling in similar or different applied courses as their peers who only reach intermediate or remedial levels of math who also take applied courses. For example, there may be distinct pairings of academic and applied courses that are commonly taken. Addressing this second question is particularly timely, as states and districts continue to grapple with how best to structure their curricula to ensure students meet the provisions and requirements of the Perkins legislation, NGSS, and CCSS.

Finally, the third question examines the effect of taking applied courses on math achievement while taking into account location in the math pipeline. The focus on math achievement is key, as it correlates highly with persistence in and completion of college (Adelman, 1999) and is an important foundation on which to further build and develop STEM skills and knowledge. Moreover, math constitutes the primary component of most standardized tests, and so the efficacy of applied STEM courses will be evaluated by the extent to which they support or detract from student proficiency in academic math skills. This study is the first to examine the roles that academic and applied STEM course taking jointly play in supporting math achievement in high school using nationally representative data.



To address the three research questions listed above, this study relies on a comprehensive, longitudinal dataset: the Education Longitudinal Study of 2002 (ELS:2002). Collected by the National Center for Education Statistics (NCES), ELS:2002 monitors the academic and developmental experiences of a cohort of 10th graders as they proceed through high school and into young adulthood. ELS:2002 used a two-stage sampling procedure. In the first stage, a sample of 752 high schools, both public and private, was selected with probabilities proportional to size. In the second stage, approximately 26 students were randomly sampled from each school on the condition that they were in the 10th grade in the spring term of 2002. The original sample included 15,362 students in 752 public and private schools in all 50 states and the District of Columbia. The spring of 2002 served as the base year of the study, during which questionnaires were administered to students, parents, teachers, and school administrators in order to construct a comprehensive record of a student’s developmental and educational environment. Students were re-interviewed in the spring of 2004, when most sample members were seniors (the first follow-up), and again in 2006, when most sample members were two years out of high school (the second follow-up). Parent and teacher questionnaires were administered in the base year (2002) only, while the student and school administrator questionnaires were administered in the base year and first follow-up (2004).

The cornerstone of the present analysis is the student transcript course-level file, which contains complete course-taking histories—including course names, grades earned, and credits earned—of 91% of the original ELS:2002 base-year sample (Bozick et al., 2006). Official high school transcripts were collected and appended to the survey data in 2005 after students in the sample had finished high school and degree verification processes were complete. All course record files were calibrated to indicate Carnegie units as a standardized measure of credits earned. A Carnegie unit is equal to a course taken every day, one period per day, for a full school year. To prepare the data file for this analysis, a number of editing and consistency checks were performed. First, any discrepancies between credits earned and the course grade were resolved to ensure that course credit was awarded only when the student received a passing grade. Second, any duplicate course records resulting from school transfers were removed.2 Lastly, the number of credits assigned was inspected to ensure compatibility with the school’s calendar system (e.g., semester, trimester, etc.). After these data cleaning steps, the course-level file was converted into a student-level file that indicated the number of credits earned for each course in each year of high school for each student in the sample.
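The cleaning and reshaping steps described above can be sketched in a few lines of pandas. The column names and course codes here are hypothetical stand-ins, not the actual ELS:2002 transcript fields:

```python
import pandas as pd

# Hypothetical course-level records; the real ELS:2002 file uses
# different field names and course classification codes.
courses = pd.DataFrame({
    "student_id":     [1, 1, 1, 2, 2],
    "course_code":    ["ALG1", "IT_PROG", "ALG1", "GEOM", "SRE_CAD"],
    "grade_passing":  [True, True, True, True, False],
    "carnegie_units": [1.0, 0.5, 1.0, 1.0, 0.5],
})

# Step 1: award credit only when the student received a passing grade.
courses["credits"] = courses["carnegie_units"].where(courses["grade_passing"], 0.0)

# Step 2: drop duplicate course records (e.g., from school transfers).
courses = courses.drop_duplicates(subset=["student_id", "course_code"])

# Step 3: convert the course-level file into a student-level file with
# one row per student and the credits earned in each course.
student_level = courses.pivot_table(
    index="student_id", columns="course_code",
    values="credits", aggfunc="sum", fill_value=0.0,
)
print(student_level)
```

The `pivot_table` call performs the course-level to student-level conversion: each student becomes a single row, with zero credits filled in for courses that do not appear on that student's transcript.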

Before the conversion of the course-level file into a student-level file, information about students’ sociodemographic backgrounds and their investments in schooling, taken from survey questions asked of students and their parents, was merged in. The resulting data file provides a comprehensive picture of each student in the ELS:2002 sample, with survey data collected from the 10th and 12th grade interviews and course-taking information spanning all four years of high school. The present study is based on a sample of approximately 11,112 students who participated in the base-year (10th grade, 2002) and first follow-up (12th grade, 2004) interviews, completed math assessments in both years, and for whom valid transcript information was collected.


The dependent variable in this analysis is a test score reflecting performance on a standardized math assessment administered by NCES to sample members in the 10th and 12th grades along with the surveys. This score is a continuous measure indicating the total number of questions answered correctly on the assessment. The test was designed explicitly to measure the skills and concepts taught in the high school math curriculum (including intermediate and advanced material), and scoring was based on item response theory (IRT) to maximize measurement accuracy within a limited amount of testing time while minimizing floor and ceiling effects (Ingels, Pratt, Rogers, Siegel, & Stutts, 2005). During the 10th grade survey and testing administration, sample members were first given a short routing test. Based on their performance on the routing test, they were then directed to complete a low-, middle-, or high-difficulty test. The difficulty level of the 12th grade test was determined by both the original difficulty level assigned during the 10th grade test and the 10th grade test score. NCES administered the 12th grade test only to those students who participated in the base-year component of the study. Therefore, students who were not enrolled in the same school in both the 2002 and 2004 survey administrations due to school transfer, dropping out, or home schooling were not given the 12th grade test and thus are not included in the present study. By only including students who remained in their 10th grade schools during both test administrations, the results from this analysis will not directly generalize to students who transfer in and out of school(s). However, examining students who were continuously exposed to only one curriculum and school environment provides a more precise estimation of the relationship between coursework and achievement.


Table 1 presents the means and standard deviations for the dependent variable and key predictor variables (described below) used in this analysis. On average, students in the sample answered 39.3 questions correctly on the mathematics assessment at the end of their sophomore year and 49.8 questions correctly at the end of their senior year.


The key predictor variables in this analysis are measures of applied STEM course taking and measures of academic math course taking. This study focuses on academic math and not academic science for two reasons. First, most of the major changes in policy that directly bear upon STEM (including the Perkins legislation and the movement to adopt CCSS) emphasize improvements to the math curriculum. Second, and more empirically limiting, ELS:2002 contains only a math assessment and not a science assessment. Thus, it is not possible to directly ascertain the skills that are central to learning in the traditional science curriculum.

Courses were classified as applied STEM or academic math, per the Secondary School Taxonomy published by NCES (Bradby & Hudson, 2007). This taxonomy organizes all high school courses recorded on students’ transcripts into four distinct curricula: academic, career and technical education (CTE), enrichment/other, and special education. Math is a traditional subject that—along with English, science, social studies, fine arts, and non-English language—comprises the academic curriculum. Applied courses, on the other hand, have a distinct occupational focus and are part of the CTE curriculum. This taxonomy is mutually exclusive such that courses classified as academic cannot also be classified as CTE and vice versa.3

The CTE curriculum contains 16 distinct career clusters; this study focuses on the two clusters that contain applied STEM coursework: engineering technologies (SRE) and information technology (IT).4 Within these broader clusters, course titles are identified and assigned unique course classification codes to fit within the Secondary School Taxonomy. A student was categorized as participating in applied STEM coursework if the student received credit for the course and the course classification code fell within the SRE or IT designation.

To examine how applied STEM courses complement academic STEM courses, the analysis also includes a pipeline measure of math course taking. This pipeline measure was originally developed for the U.S. Department of Education by Burkam and Lee (2003) to distinguish between lower-level, less challenging courses and upper-level, more challenging courses across the broader universe of courses within the Secondary School Taxonomy. Building off Burkam and Lee’s analytic work, this study employs the mathematics course-taking pipeline measure designed and validated specifically for the ELS:2002 transcript data file (Bozick et al., 2006). The mathematics pipeline is an ordinal measure indicating the highest level of mathematics for which the student received (nonzero) credit and includes the following four categories: (1) below average math, which includes basic math up through prealgebra; (2) average math, which includes algebra and geometry; (3) above average math, which includes trigonometry, statistics, and precalculus; and (4) advanced math, which includes calculus. The pipeline measure is exhaustive of all courses classified as academic math. In the dataset, nearly all students take academic math classes in high school, with only 1% finishing 12th grade with no math credits earned. The majority of students finish the 12th grade having taken either an above average math course (36%) or an advanced math course (16%). Approximately 2.8% of students in the sample have no recorded math course on their transcript, which is most likely due to enrollment in special education courses as substitutes, earning credit in math outside of school (such as at a local college), or administrative recording error. To maintain the correct standard errors and national representation of the full survey sample, these students are included as a residual “no math” category in all estimations. However, as there is no relevant substantive interpretation of this category, its associated statistics are suppressed in all tables and figures.
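As a rough illustration, the ordinal pipeline measure (the highest level of math with nonzero credit, with 0 as the residual "no math" category) might be coded as follows. The course names and their level assignments are simplified stand-ins for the actual ELS:2002 course classification codes:

```python
# Hypothetical mapping from academic math courses to pipeline levels;
# the validated ELS:2002 measure (Bozick et al., 2006) works from
# transcript course classification codes, not plain names.
PIPELINE = {
    "basic_math": 1, "prealgebra": 1,                       # below average
    "algebra_1": 2, "geometry": 2, "algebra_2": 2,          # average
    "trigonometry": 3, "statistics": 3, "precalculus": 3,   # above average
    "calculus": 4,                                          # advanced
}

def highest_pipeline_level(courses_with_credit):
    """Return the highest math pipeline level among courses with
    nonzero credit; 0 marks the residual 'no math' category."""
    levels = [PIPELINE[c] for c in courses_with_credit if c in PIPELINE]
    return max(levels, default=0)

print(highest_pipeline_level(["geometry", "trigonometry"]))  # 3
print(highest_pipeline_level([]))                            # 0
```

Because the measure is ordinal, only the maximum level matters: a student who passed both geometry and trigonometry is coded at level 3 (above average math).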


In order to ensure that the estimated effects of applied STEM courses on math achievement are not confounded by other factors, a set of variables controlling for differences in sociodemographic background and investments in schooling are included in the multivariate analyses. Because this is the first large-scale study of the influence of applied STEM course taking, the control variables selected for this analysis are grounded in prior empirical research on the effect of academic STEM course taking on STEM outcomes (e.g., Adelman, 1999; Brody & Benbow, 1990; McClure, 1998; Riegle-Crumb, 2006; Tyson et al., 2007; Wimberly & Noeth, 2005). Based on this large body of literature, this study aggregated control variables into two categories: sociodemographic characteristics and measures pertaining to investments in schooling.

Sociodemographic background variables are taken from the 10th grade student and parent surveys and include gender, race/ethnicity, native language, household composition, mother’s and father’s education, mother’s and father’s occupation, and family income. Variables measuring investments in schooling are taken from the 10th grade student surveys, the 12th grade student surveys, and the transcripts. The variables include the importance that students place on education, their expectations for college, their self-efficacy in math, their parents’ involvement in schooling, their participation in extracurricular activities, their labor force participation, and the total number of credits they earned in all subjects. Because they are not central to the research questions posed in this analysis, and because of the volume of literature that examines their relationship to achievement, these variables are used simply as controls. The coding of these variables is described in Appendix A.

Table 2 presents partial correlation coefficients, and their significance levels, between the indicator that a student completed applied STEM coursework and the other independent variables in this analysis. Partial correlations were purposefully selected to describe the data, as they test the association between two variables while holding constant the influence of additional variables. Importantly, the values presented throughout the table suggest weak to null correlations in the sample between student sociodemographic characteristics and having taken applied STEM coursework. Thus, taking applied STEM coursework does not appear to be systematically related to other observable characteristics that might bias the estimated effects of course taking on math achievement. While the correlation coefficients of some sociodemographic characteristics are larger than others (e.g., gender), their magnitudes indicate minimal practical significance.


Table 2 also presents very small correlation values between characteristics pertaining to investments in schooling and students having taken applied STEM coursework. This suggests that students’ academic characteristics are unrelated or weakly related to the key independent variable—either aggregately (applied STEM) or within course type (SRE or IT). Again, this suggests that there is nothing systematic in the relationships between students having taken applied STEM coursework and the set of independent variables (i.e., motivation) that might bias the estimated effects of the key predictor variables.
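A partial correlation of the kind reported in Table 2 can be computed by residualizing both variables on the controls and correlating the residuals. The following is a generic numerical sketch with synthetic data, not the study's actual variables:

```python
import numpy as np

def partial_corr(x, y, Z):
    """Correlation of x and y after partialling out controls Z (n x k):
    regress each variable on Z, then correlate the residuals."""
    Z1 = np.column_stack([np.ones(len(x)), Z])  # add an intercept
    rx = x - Z1 @ np.linalg.lstsq(Z1, x, rcond=None)[0]
    ry = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic example: x and y are correlated only through a shared control.
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2))
x = Z[:, 0] + rng.normal(size=500)
y = 2 * Z[:, 0] + rng.normal(size=500)

print(np.corrcoef(x, y)[0, 1])   # raw correlation: substantial
print(partial_corr(x, y, Z))     # partial correlation: near zero
```

Here the raw correlation between `x` and `y` is sizable, but the partial correlation collapses toward zero once the shared control is held constant, which is exactly the property the study exploits to check for confounding.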


Descriptive statistics will be used to address Research Questions 1 and 2. For Research Question 1, the distribution of students earning credit in specific courses within the SRE and IT career clusters, as well as for the SRE and IT career clusters overall, will be tabulated for each of the four years of high school. For Research Question 2, the joint distribution of students reaching each level of the math pipeline and earning credits in the SRE and IT career clusters will be tabulated.

A series of progressively more stringent ordinary least squares (OLS) regression models will be estimated to address Research Question 3: a baseline model, a school-level fixed-effects model, and a school-by-year fixed-effects model. Each specification is discussed below, beginning with the baseline model and then adding adjustments for unobserved heterogeneity.

Research Question 3 Baseline Model

In order to gauge the effect of applied STEM courses on high school math achievement, this study first estimates a regression model for the full sample of observations. The baseline model specification is:

y_ist = β0 + β1(AppliedSTEM_ist) + β2(MathPipeline_ist) + β3(AppliedSTEM_ist × MathPipeline_ist) + β4(X_ist) + ε_ist (1)

In the model, y_ist represents the number of questions answered correctly on the math assessment by student i in school s in year t; AppliedSTEM_ist indicates if a student had taken applied STEM coursework, disaggregated in subsequent analyses by SRE and IT separately; MathPipeline_ist designates a student’s highest pipeline math level at time t; AppliedSTEM_ist × MathPipeline_ist allows for the interaction between applied STEM status and a student’s highest math pipeline level; and X_ist is a vector of control variables including students’ sociodemographic background and their investments in schooling. The coefficients of interest, β1 and β3, represent the effect of applied STEM course taking as well as the effect of applied STEM course taking as it varies across the academic math pipeline.

The error term ε includes all unobserved determinants of math achievement. Empirically, this component is estimated with Huber/White/sandwich robust standard errors, adjusted for school clustering. It is in this error term that the multilevel structure of the data is taken into account. Because students are nested within schools and hence are likely to share common but unobservable characteristics and experiences, clustering student data at the school level provides for a corrected estimate of the variance of the error term given this nonindependence of individual-level observations. As a result, standard error estimates are robust, as they have been corrected for this multilevel nature of the data (Primo, Jacobsmeier, & Milyo, 2007).
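A minimal sketch of this estimation strategy in statsmodels, using simulated data and hypothetical variable names (the actual ELS:2002 specification includes many more controls):

```python
# Baseline OLS in the spirit of Equation 1, with Huber/White/sandwich
# standard errors clustered at the school level; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_per = 40, 25
school = np.repeat(np.arange(n_schools), n_per)
applied_stem = rng.integers(0, 2, n_schools * n_per)
pipeline = rng.integers(1, 6, n_schools * n_per)        # math pipeline level
school_shock = rng.normal(0.0, 2.0, n_schools)[school]  # shared within a school
math_score = (30 + 0.7 * applied_stem + 4.0 * pipeline
              + 0.3 * applied_stem * pipeline
              + school_shock + rng.normal(0.0, 3.0, n_schools * n_per))
df = pd.DataFrame(dict(math_score=math_score, applied_stem=applied_stem,
                       pipeline=pipeline, school=school))

# Clustering corrects the standard errors for the nonindependence of
# students observed within the same school.
model = smf.ols("math_score ~ applied_stem * pipeline", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["school"]})
print(model.summary().tables[1])
```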

Research Question 3 School-Level Fixed Effects

The empirical approach thus far can provide evidence as to whether the observed variables, notably applied STEM coursework, are related to math achievement. However, estimates of β1 and β3, the coefficients on the applied STEM terms, may be biased in Equation 1, even with the inclusion of a large number of control variables. The bias arises because potential unobserved school factors may be influencing course taking as well as achievement in math. For instance, attending a school with an environment that emphasizes accountability and achievement test performance might simultaneously be correlated with the selection of academic STEM courses over applied STEM courses as well as with performance on the math assessment. Thus, the omission of school environment as a control in Equation 1 may cause an underestimation of the effect of applied STEM coursework. It is hence necessary to turn to an estimation approach that is more immune to this omitted variable bias. In this study, in conjunction with the previously defined baseline empirical model, a fixed-effects strategy acts as a more stringent attempt to remove potential unobserved school-level biases in the estimation of β1 and β3:

y_ist = β0 + β1(AppliedSTEM_ist) + β2(MathPipeline_ist) + β3(AppliedSTEM_ist × MathPipeline_ist) + β4(X_ist) + φ_s + ε_ist. (2)

This model is nearly equivalent to Equation 1; however, φ_s now represents school fixed effects for each school s in the sample. Empirically, this is accomplished by including k − 1 binary variables that indicate if a student had attended a particular school, where k = 752 high schools. School fixed effects φ_s control for the unobserved influences of schools by capturing systematic differences across each unique institution. By holding constant those omitted school-specific factors, such as school-specific practices, the effect of applied STEM coursework can be better identified apart from variation across schools.

Empirically, the school fixed-effect method averages the terms in Equation 2 over all of the observations for a given school and subtracts that school-level average from each observation. This eliminates any bias caused by correlation between the regressors and the unobserved school influences. Because the school fixed-effects parameter in Equation 2 controls for any correlation between time-invariant school characteristics and the applied STEM coursework variables, school fixed effects essentially hold constant any unobserved differences between schools. Hence, these models provide information about the extent and likely direction of the bias in the baseline models that do not control for these unobserved differences. Consequently, the use of school fixed effects is compelling not only in this present analysis but also in quasi-experimental educational research more broadly (Schneider, Carnoy, Kilpatrick, Schmidt, & Shavelson, 2007).
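The within (demeaning) transformation can be sketched on simulated data; the true within-school slope below is set to 1.5 by construction, and the variable names are hypothetical.

```python
# The within transformation behind school fixed effects: subtract each
# school's mean from every observation, then run OLS on the demeaned data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
school = np.repeat(np.arange(30), 20)
x = rng.normal(size=600)
school_shock = rng.normal(0.0, 2.0, 30)[school]  # time-invariant school factor
y = 1.5 * x + school_shock + rng.normal(0.0, 1.0, 600)
df = pd.DataFrame(dict(y=y, x=x, school=school))

# Subtract each school's mean from every observation in that school.
demeaned = df[["y", "x"]] - df.groupby("school")[["y", "x"]].transform("mean")

# OLS on the demeaned data recovers the slope purged of the school factor.
beta = (demeaned["x"] @ demeaned["y"]) / (demeaned["x"] @ demeaned["x"])
print(round(beta, 2))
```

Running OLS on the demeaned data is numerically equivalent to including the k − 1 school dummies directly.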

Research Question 3 School-by-Year Fixed Effects

Even with the inclusion of school fixed effects, it is possible that time-varying unobserved school-level factors may be influencing applied STEM course taking as well as math achievement. For example, if there was an increased emphasis on testing and accountability in a given school between 2002 and 2004 (e.g., as No Child Left Behind policies gained momentum), resulting from the implementation of a school reform effort and/or a change in school leadership over this time period, then the estimated effect of applied STEM coursework in Equations 1 and 2 may be biased. To account for these potential time-varying confounds, a final model will include school-by-year fixed effects:

y_ist = β0 + β1(AppliedSTEM_ist) + β2(MathPipeline_ist) + β3(AppliedSTEM_ist × MathPipeline_ist) + β4(X_ist) + φ_st + ε_ist, (3)

where φ_st represents school-by-year fixed effects. Equation 3 augments Equation 2 by controlling for any school-year effects that may affect applied STEM course taking. Empirically, this is accomplished by including (k × t) − 1 binary variables that indicate if a student had attended a particular school in a particular year, where k = 752 high schools and t = two time periods (the 10th grade interview in 2002 and the 12th grade interview in 2004). By implementing this more stringent fixed-effects strategy, the model further attenuates the threat of unobserved school-level confounds.
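School-by-year fixed effects can likewise be sketched as demeaning within each school-year cell (equivalent to the full set of school-by-year dummies). The data below are simulated with a true slope of 0.8 and a distinct unobserved shock for every school-year cell.

```python
# School-by-year fixed effects via demeaning within each school-year cell.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n_schools, n_per = 30, 15
rows = [(s, t) for s in range(n_schools) for t in (2002, 2004)
        for _ in range(n_per)]
df = pd.DataFrame(rows, columns=["school", "year"])
df["cell"] = df["school"].astype(str) + "-" + df["year"].astype(str)
df["x"] = rng.normal(size=len(df))

# Each school-year cell gets its own (time-varying) unobserved shock.
cell_shock = dict(zip(df["cell"].unique(),
                      rng.normal(0.0, 2.0, df["cell"].nunique())))
df["y"] = (0.8 * df["x"] + df["cell"].map(cell_shock)
           + rng.normal(0.0, 1.0, len(df)))

# Demean within each school-year cell, then run OLS on the residuals;
# the slope is recovered net of any school-by-year confound.
dm = df[["y", "x"]] - df.groupby("cell")[["y", "x"]].transform("mean")
beta = (dm["x"] @ dm["y"]) / (dm["x"] @ dm["x"])
print(round(beta, 2))
```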



Table 3 shows the distribution of the most commonly taken SRE courses overall and for each level of math. Table 4 follows the same layout but shows the distribution of IT courses. The first column in each of these tables shows the percentage of students who have ever taken SRE or IT courses as well as the percentage of students who have ever taken specific SRE or IT courses. To highlight the more prevalent courses, only those taken by more than 1% of the sample are displayed. In terms of applied STEM course taking overall, IT courses are more prevalent than SRE courses: 25.5% of sample members had taken an IT course (Table 4) compared with 14.4% who had taken an SRE course (Table 3). This is not surprising, as nine states count computer science as a core graduation requirement (Wilson, Sudol, Stephenson, & Stehlik, 2010), while no states count SRE as a core graduation requirement.



With respect to SRE course taking, there are 48 courses classified as SRE in the SST. Thirty-two of these 48 SRE courses were offered by schools in the ELS:2002 sample, and only two of these courses were taken by more than 1% of sample members. Drafting Fundamentals was taken by 3.7% of the sample and Computer Assisted Design/Drafting (CAD) was taken by 3.5% of the sample. This indicates that there is not a single SRE course or a single set of SRE courses that dominate the applied STEM curriculum, but rather there is considerable variability in the types of SRE courses to which students are exposed.

With respect to IT course taking, there are 38 courses classified as IT in the SST. Twenty-eight of these 38 IT courses were offered by schools in the ELS:2002 sample, and six of these courses were taken by more than 1% of sample members. Unlike the heterogeneity in SRE course taking, there appears to be at least one IT course that is common to a sizeable swath of students: Computer Science 1. This introductory course was taken by approximately 11% of students, with the next most common IT course being Website Design, taken by 4.1% of students. This suggests that courses that introduce computer application skills and concepts are more common in schools (compared with courses that introduce engineering skills and concepts) and in turn are more accessible to students.


Columns 2 through 8 in Tables 3 and 4 help to elucidate the degree to which students are taking applied STEM courses in conjunction with traditional academic math courses. Table 3 shows the percentage of students who reached a specific level of math who have also taken an SRE course. The distributions reveal that SRE course taking is most concentrated among those not progressing to the more advanced courses. For example, SRE course taking is most common among those whose highest math course is below average on the pipeline (25%) and least common among those whose highest math course is advanced math (13.1%). Thus, within the broader STEM curriculum, it appears that SRE courses may be acting in some cases as supplements to more advanced courses rather than as complements.

The picture is different when looking at the distribution of IT courses in Table 4. In contrast to SRE, IT course taking appears to be common regardless of progress through the math curriculum. Across the different levels, the percentage of students taking IT courses is between 25% and 28%. Therefore, IT courses appear to serve as complements to traditional math courses rather than as supplements to any particular course level.


Table 5 provides the results from estimating the baseline model (Equation 1), school fixed-effects model (Equation 2), and school-by-year fixed effects model (Equation 3). The table provides coefficient estimates and Huber/White/sandwich robust standard errors adjusted for school clustering. All models control for students’ sociodemographic background characteristics and investments in schooling as well as for grade level.

Each section presents estimates from two baseline models (Equation 1), in which only observable variables are included. The first model, (a), includes only main effect terms and tests if there is a direct effect of having taken applied STEM coursework. The second model, (b), includes interaction terms between having taken applied STEM coursework and each pipeline level. This way, it is possible to test the extent to which the effect of applied STEM course taking varies across each additional level in the mathematics pipeline.

While the models are analogously constructed from observable variables across each section, they become more stringent in Section 2 and Section 3. In these models, school and school-by-year fixed effects are incorporated, respectively (as specified in Equation 2 and Equation 3). This process, by which more complex models build upon previous models, demonstrates that the inclusion of school and then school-by-year fixed effects improves the ability of the model to explain the variance in math achievement. As a result, accounting for unobserved school-level factors proves to be significant in explaining the variance in math achievement. Although the explained portion of the variance increases only slightly, the Likelihood Ratio test nonetheless supports the use of models in Section 3 over those in Section 2 (and those in Section 1). Hence, the school-by-year fixed effects model is the most favored and most rigorous specification. As such, all discussion going forward is based on the school-by-year fixed effects model. This model takes into account the potential confounding effects of observed sociodemographic background characteristics and observed student investments in schooling, as well as the unobserved school-year environment across both waves of the survey data, thus providing the least biased estimate of the effect of course taking.
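A likelihood-ratio comparison between nested OLS models of this kind can be sketched as follows; the data and group labels are simulated, not the ELS:2002 schools.

```python
# Likelihood-ratio test of a baseline model against one with group
# (fixed-effect) dummies, using statsmodels' compare_lr_test.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
group = np.repeat(np.arange(20), 30)
x = rng.normal(size=600)
y = 0.5 * x + rng.normal(0.0, 2.0, 20)[group] + rng.normal(0.0, 1.0, 600)
df = pd.DataFrame(dict(y=y, x=x, group=group))

restricted = smf.ols("y ~ x", data=df).fit()
full = smf.ols("y ~ x + C(group)", data=df).fit()  # adds 19 group dummies

# LR statistic = 2 * (llf_full - llf_restricted), chi-square distributed
# with degrees of freedom equal to the number of added parameters.
lr_stat, p_value, df_diff = full.compare_lr_test(restricted)
print(round(lr_stat, 1), int(df_diff), p_value < 0.05)
```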


Beginning with the main effects models, i.e., column (a), the indicator of having taken an applied STEM course is statistically significant once school fixed effects are incorporated, beginning in Section 2. The size and direction of the main applied STEM term imply that, on average, students who take applied STEM coursework tend to perform somewhat better than students who do not. Specifically, in column (a) of Section 3, the coefficient for applied STEM (β = 0.73) indicates that when controlling for sociodemographic background, investments in schooling, and changes in the school environment across the study waves, students who take an applied STEM course answer more questions correctly than their peers who did not. The fact that the main effect coefficient for applied STEM course taking becomes statistically significant with the inclusion of more robust estimation strategies provides evidence that the relationship is underestimated in the baseline specification. The significance of the applied STEM course-taking parameter after the inclusion of more rigorous controls via the fixed-effects specifications in Section 2 and in Section 3 supports the hypothesis that taking applied STEM courses is associated with greater achievement in math.

While overall the effect of applied STEM courses appears to be small, the results in column (b) of Section 3 reveal a more nuanced interpretation. In these models, the indicator for having taken applied STEM coursework is interacted with three pipeline levels: below average, above average, and advanced. The average pipeline level is excluded and serves as the reference category for all other pipeline levels. The parameter estimates in Section 3, column (b), show that the effects of applied STEM coursework vary depending on how far a student has progressed in math. Specifically, applied STEM course taking appears to give a greater boost to performance for those who do not progress far in the traditional academic math curriculum. For example, students whose highest math course falls at the below average pipeline level achieve a positive net gain from taking applied STEM courses compared to students at the same pipeline level who do not (calculated by summing the main term for applied STEM coursework and the interaction term between applied STEM coursework and the below average pipeline level). This is equivalent to an increase of 13% of a standard deviation in math test performance. However, at higher than average math pipeline levels, taking applied STEM courses does not provide any substantive boost in math achievement. Indeed, under the same analysis, there are much smaller gains for students at higher pipeline levels, exhibited empirically by the fact that only the main term, and not the interaction, is statistically significant.

To more clearly understand the patterning indicated by the interaction terms, the coefficients from the school-by-year fixed-effects models are used to calculate predicted values of math achievement at all pipeline levels. Consistent with the previous paragraph, each predicted gain in testing performance is determined by summing the coefficients on the main term for applied STEM coursework and on the interaction between applied STEM and pipeline level. This was done with all other covariates set at their mean values. Hence, the comparison at each pipeline level is between all students at a particular level who have taken applied STEM coursework and those who have not. The results are shown in Figure 1.

Figure 1. Predicted gains in math test scores from applied STEM courses by highest level of academic math taken by student


The trends in this figure indicate that the largest gain in math achievement among those taking applied STEM courses occurs for students taking below average math courses. This is depicted by the highest values in the figure along the y axis. The gains in math achievement resulting from taking applied STEM coursework taper off along the math pipeline as students progress to more advanced courses in the academic math curriculum, indicating that applied STEM coursework is less beneficial for more academically focused students.
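The gain calculation described above can be sketched in a few lines. The main applied STEM coefficient (0.73) is taken from the text; the interaction values below are invented placeholders, not the estimates in Table 5.

```python
# How a predicted gain at each pipeline level is formed: the main applied
# STEM coefficient plus the relevant interaction term, with all other
# covariates held at their means. Interaction values are hypothetical.
coefs = {
    "applied_stem": 0.73,             # main effect, reported in the text
    "applied_stem:below_avg": 1.20,   # hypothetical
    "applied_stem:above_avg": -0.40,  # hypothetical
    "applied_stem:advanced": -0.60,   # hypothetical
}

def predicted_gain(level):
    """Gain in questions answered correctly for applied STEM takers at a
    given math pipeline level ('average' is the reference category)."""
    return coefs["applied_stem"] + coefs.get(f"applied_stem:{level}", 0.0)

for level in ("below_avg", "average", "above_avg", "advanced"):
    print(level, round(predicted_gain(level), 2))
```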

When applied STEM courses are subsequently separated into more specific CTE categories, SRE and IT, the results in Table 6 reveal a similar pattern. Again, the school-by-year fixed effects models are supported as the most stringent of all models presented in this study, as determined by the Likelihood Ratio test. Hence, the discussion of the results focuses on the third column of the table. Each column is analogous to column (b) in Table 5.


The effects of SRE and IT separately mirror those of the aggregated applied STEM measure for students at various math pipeline levels: Students at the lower end of the academic math pipeline receive a greater boost in math achievement from SRE and IT courses. Conversely, the effect of SRE or IT coursework is less beneficial for students reaching the advanced levels of the math curriculum. Predicted gains in math achievement for SRE and IT are calculated analogously to those in Figure 1 and are shown in Figure 2:

Figure 2. Predicted gains in math test scores from specific applied STEM courses—SRE and IT—by highest level of academic math taken by student


The figure portrays achievement gains separately for SRE and for IT courses. SRE courses seem to be particularly beneficial at the lower pipeline levels: The math achievement gains are higher for students taking below average math and SRE courses compared to students taking below average math and IT courses. Though the reasons for these differences cannot be ascertained directly with the ELS:2002 data, it is likely that SRE courses contain more material that complements math learning than do IT courses. Recall that the most prevalent IT courses taken by students were Computer Science 1 and Website Design (Table 4). As introductory courses geared toward basic computer usability concepts and web page construction, their emphasis on quantitative reasoning, problem solving, and logic skills is likely limited in contrast to the material taught in Drafting Fundamentals and Computer-Assisted Design/Drafting (CAD), the most popular SRE courses.


In order to ensure that the results were not driven by a particular type of school (e.g., vocational schools), the analysis controls for two school classifications in the baseline OLS models: school sector and school focus. With respect to school sector, 76% of students are in public schools, 14.3% of students are in Catholic schools, and 9.2% are in other schools (such as Native American reservation schools or schools that do not have a sector associated with their official records). With respect to school focus, 6.4% of students attended a school with a vocational focus and 93.6% of students attended a school with an academic focus. When the baseline models are re-run with school sector and school focus as controls, the results do not differ compared to what is presented in this study.

It is also possible to conduct a test of validity to ensure that it is participation in applied STEM courses, rather than participation in any applied course, that is driving the observed results. In this context, the treatment is participation in non-STEM applied courses. As anticipated, the test score gains at lower pipeline levels from participation in non-STEM applied courses, when significant, are much smaller than those from participation in applied STEM courses. At higher math levels, students appear to be at a disadvantage when taking non-STEM applied courses, as reflected in lower test scores. It appears that the content of non-STEM applied courses does not reinforce math concepts the way applied STEM courses do, resulting in a null or even negative effect on math achievement. These results are available upon request.


Research on the effectiveness of the high school STEM curriculum has predominantly focused on the effects of academic STEM courses on achievement, and prior to the present study, little work has quantified the potential effects of applied STEM courses on achievement. This study has hence contributed new insight by examining SRE and IT courses—two sets of applied STEM courses that are designed to complement the traditional math and science curricula and to support the development of skills that are central to STEM careers. The relationship between applied STEM coursework and mathematics achievement was evaluated using a nationally representative and comprehensive dataset of high school students in the United States. As such, the results presented in the study can generalize to students across a range of schools and student populations in all 50 states. This research has important policy implications, not only for how the STEM curriculum is implemented in high schools but also for how these courses may serve as a foundation to support postsecondary STEM opportunities and future employment in STEM fields.

Before testing the effects of applied STEM course taking on math achievement, the distributions of SRE and IT (i.e., applied STEM) courses were tabulated to identify which courses are most commonly taken by students and the extent to which these courses are taken alongside traditional academic math courses. There was substantial variation in the types of SRE courses taken, with only two courses, Drafting Fundamentals and Computer Assisted Design/Drafting, taken by more than 1% of the sample. There was less heterogeneity among IT courses, with a larger proportion of students taking Computer Science 1 and Website Design. SRE courses were most common among students taking less rigorous math courses, suggesting that they may be acting as supplements to more advanced level courses, while IT courses were taken by students across all levels of math.

As schools and districts work to provide classes that impart practical occupational skills while simultaneously attempting to streamline their curricula to align with NGSS and CCSS, the extent to which applied STEM courses can support academic achievement will likely remain at the forefront of curricular programming decisions. To help inform such decisions, this study estimated the effect of applied STEM courses on math achievement using three modeling approaches. The first approach relied on a baseline assessment, employing a linear regression specification where mathematics achievement was modeled strictly based on observable characteristics in the data. The second approach extended the baseline model by incorporating school fixed effects to account for unobservable school-level factors that may be influencing enrollment in applied STEM coursework as well as mathematics achievement. The third and most rigorous approach incorporated school-by-year fixed effects to account for any unobserved variance over time. Empirical models with fixed effects are supported as appropriate tools in quantitative educational research on large-scale datasets (e.g., Schneider et al., 2007).

On average, the findings demonstrate a positive effect of taking applied STEM courses: Students who take an applied STEM course answer more questions correctly on a standardized math assessment than their peers who did not, all else equal. This provides new evidence that applied STEM courses can be used to support math learning that takes place elsewhere in the curriculum, particularly for those students at the lower end of the math pipeline. That said, the results do suggest that traditional academic math courses remain much more effective at supporting math testing performance in high school. For example, the models show that students who progress to advanced-level math answer about 18 more questions correctly than their peers who progress to average-level math, and students who progress to above average-level math answer about eight more questions correctly than their peers who progress to average-level math. This should not be too surprising, as traditional math courses are almost exclusively geared toward academic skills and concepts, whereas applied courses incorporate a range of materials and instructional approaches in order to emphasize occupational applications. It may also be that traditional math tests do not tap into the applications of math found in applied STEM courses.

Hence, the analyses indicate that these courses may be particularly beneficial for those students who are less oriented toward advanced math and thus less likely to continue on to college. For example, students reaching below average math courses (e.g., basic math and prealgebra) can gain approximately 13% of a standard deviation in math achievement as a result of taking applied STEM courses. For these students, applied STEM courses may be more accessible and practical in that they stress the application of academic concepts to real-world experiences while incorporating quantitative skills, logic, and problem solving. Many students become disengaged from traditional academic courses as they see little direct relevance to their occupational goals, either in the present or in the future (Stone & Lewis, 2012). Hence, in providing hands-on learning, often with technology and with direct application to concrete occupationally specific problems, applied STEM courses may serve as a critical means to support students’ understanding of concepts taught in their lower level math pipeline courses.

Despite the many strengths of the ELS:2002 data and the robustness of the findings presented here, there are three limitations to keep in mind. First, as a nationally representative study of schooling processes and educational attainment more generally, ELS:2002 lacks specific information on the substantive content of and the instructional practices used in the applied STEM courses. The findings from the analyses generalize to SRE and IT courses in the aggregate but should not be used to evaluate the efficacy of any particular course. Hence, future research could pair the findings from this study with results from smaller scale empirical studies or qualitative studies that can delve into the content of the coursework, how it relates to the content of the test, and the type of instruction best suited for these courses to have greatest impact.

Second, as a study of high school students, ELS:2002 lacks data on math ability prior to high school entry—a key predictor of course selection/course placement and achievement trajectories in high school. Finally, the study is limited to using only a single, and potentially limiting, measure of achievement: scores on a standardized math test. While this is a particularly potent indicator of academic learning and readiness for college, assessments such as the one used in ELS:2002 are not designed to measure the breadth of skills and concepts that are typically taught in an occupationally focused STEM course. As a result, the findings from the present study may understate the cognitive gains that accrue to students who take SRE or IT courses.

These limitations notwithstanding, the findings have important implications for policy. These implications, however, depend largely on how schools and districts view the role of SRE and IT courses in their broader STEM curriculum. If the goal in providing these courses is to help bolster the learning of students who will likely forgo the most advanced math courses, then the provision of SRE and IT courses may be an effective means at helping to strengthen their basic competencies in math.

In closing, as the economy becomes increasingly reliant on strong quantitative and analytical skills, there is an increasing need for policy makers to identify the most efficient ways to prepare all youth for careers in STEM—not simply those at the advanced level. This study shows that applied STEM courses may have the ability to boost achievement gains in math for lower ability students. It is true that students make the largest gains in mathematics when they take more advanced academic math courses. However, SRE and IT courses may serve an important role in the broader STEM curriculum and may ultimately contribute to STEM success, particularly for those students at the lower ends of the math pipeline. Future research is needed to evaluate whether differences in the content and delivery of these courses are associated with learning gains and to evaluate whether these courses have longer term effects with respect to postsecondary STEM participation and labor force success.


1. A number of government reports tabulate the percentage of students who enroll in SRE and IT courses (see, for example, Hudson & Laird, 2009). However, these analyses do not report the frequency of the specific courses taken by students within these broader categories.

2. In collecting transcripts, NCES went back to the last two schools attended by students in the sample. This was done to enhance coverage, particularly for students who transferred schools during high school. There were cases where courses taken at the student’s first school were conveyed to the transfer school and thus appeared twice on the student’s record.

3. It should be noted that the academic and CTE distinction is one that the Department of Education makes based on content and focus of the course material and is not tied to college-bound/noncollege-bound skills and concepts. Historically, CTE was synonymous with vocational education, which was considered a course-taking track that led to jobs for noncollege-bound youth. This is no longer the case as many CTE courses include a rigorous content that complements the skills and concepts taught in more traditional academic courses.

4. The other 14 clusters include Agriculture, Food, and Natural Resources; Architecture and Construction; Arts, Audio–Visual Technology, and Communication; Business, Management, and Administration; Education and Training; Finance; Government and Public Administration; Health Science; Hospitality and Tourism; Human Services; Law, Public Safety, Corrections, and Security; Manufacturing; Marketing, Sales, and Service; and Transportation, Distribution, and Logistics.


Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor’s degree attainment. Washington, DC: U.S. Department of Education.

Agodini, R. (2001). Achievement effects of vocational and integrated studies. Princeton, NJ: Mathematica Policy Research.

Bozick, R., Lytle, T., Siegel, P. H., Ingels, S. J., Rogers, J. E., Lauff, E., & Planty, M. (2006). Education Longitudinal Study of 2002: First follow-up transcript component data file documentation (NCES 2006–338). Washington, DC: National Center for Education Statistics, U.S. Department of Education.

Bradby, D., & Hudson, L. (2007). The 2007 revision of the career/technical education portion of the secondary school taxonomy (NCES 2008-030). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Brody, L. E., & Benbow, C. P. (1990). Effects of high school coursework and time on SAT scores. Journal of Educational Psychology, 82, 866–875.

Brophy, S., Klein, S., Portsmore, M., & Rogers, C. (2008). Advancing engineering education in P-12 classrooms. Journal of Engineering Education, 97, 369–387.

Burkam, D. T., & Lee, V. E. (2003). Mathematics, foreign language, and science coursetaking and the NELS:88 transcript data (Working Paper, NCES 2003–01, 2003). Washington, DC: National Center for Education Statistics, U.S. Department of Education.

Hira, R. (2010). U.S. policy and the STEM workforce system. American Behavioral Scientist, 53, 949–961.

Hudson, L., & Laird, J. (2009). New indicators of high school career/technical education coursetaking: Class of 2005 (NCES 2009–038). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Ingels, S. J., Pratt, D. J., Rogers, J. E., Siegel, P. H., & Stutts, E. S. (2005). Education longitudinal study of 2002: Base-year to first follow-up data file documentation (NCES 2006–344). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Iskander, M., & Kapila, V. (2012). Revitalizing achievement by using instrumentation in science education (RAISE), a GK-12 fellows project. Journal of Professional Issues in Engineering Education and Practice, 138, 62–72.

McClure, G. T. (1998). High school mathematics course taking and achievement among college-bound students: 1987-1996. NASSP Bulletin, 82, 110–118.

National Research Council. (2011). Successful K-12 STEM education: Identifying effective approaches in science, technology, engineering, and mathematics. Washington, DC: Author.

National Science Board. (2010). Science and engineering indicators 2010. Arlington, VA: Author.

Newton, X. (2010). End of high school mathematics attainment: How did students get there? Teachers College Record, 112, 1064–1095.

Nugent, G., Kunz, G., Rilett, L., & Jones, E. (2010). Extending engineering education to K-12. The Technology Teacher, 69, 14–19.

Primo, D. M., Jacobsmeier, M. L., & Milyo, J. (2007). Estimating the impact of state policies and institutions with mixed-level data. State Politics and Policy Quarterly, 7, 446–459.

Rasinski, K. A., & Pedlow, S. (1998). The effect of high school vocational education on academic achievement gain and high school persistence: Evidence from NELS:88. In A. Gamoran (Ed.), The quality of vocational education. Washington DC: National Institute on Postsecondary Education, Libraries, and Lifelong Learning.

Riegle-Crumb, C. (2006). The path through math: Course sequences and academic performance at the intersection of race-ethnicity and gender. American Journal of Education, 113, 101–122.

Schneider, B., Carnoy, M., Kilpatrick, J., Schmidt, W. H., & Shavelson, R. J. (2007). Estimating causal effects using experimental and observational designs. Washington, DC: American Educational Research Association.

Stone, J. R., III, Alfeld, C., Pearson, D., Lewis, M. V., & Jensen, S. (2008). Rigor and relevance: Enhancing high school students’ math skills through career and technical education. American Educational Research Journal, 45, 767–795.

Stone, J. R., III, & Lewis, M. V. (2012). College and career ready in the 21st century: Making high school matter. New York: Teachers College Press.

Tyson, W., Lee, R., Borman, K. M., & Hanson, M. A. (2007). Science, technology, engineering and mathematics (STEM) pathways: High school science and math coursework and postsecondary degree attainment. Journal of Education for Students Placed at Risk, 12, 243–270.

Wilson, C., Sudol, L. A., Stephenson, C., & Stehlik, M. (2010). Running on empty: The failure to teach K–12 computer science in the digital age. New York: Association for Computing Machinery.

Wimberly, G. L., & Noeth, R. J. (2005). College readiness begins in middle school. Iowa City, IA: American College Testing.

Appendix A: Construction of the Student Sociodemographic and Investment in Schooling Variables




Cite This Article as: Teachers College Record, Volume 116, Number 7, 2014, p. 1–35
https://www.tcrecord.org ID Number: 17496


About the Author
  • Michael Gottfried
    University of California
    E-mail Author
    MICHAEL A. GOTTFRIED, PhD, is an assistant professor in the Gevirtz Graduate School of Education at the University of California, Santa Barbara. His research interests include school quality and effectiveness, classroom peer effects, and attendance and truancy. Recent articles include: Retained Students and Classmates’ Absences in Urban Schools (American Educational Research Journal) and Classmates with Disabilities and Students’ Non-Cognitive Outcomes (Educational Evaluation and Policy Analysis).
  • Robert Bozick
    RAND Corporation
    E-mail Author
    ROBERT BOZICK is a sociologist at the RAND Corporation and a faculty affiliate of the Pardee RAND Graduate School of Public Policy. His research focuses on the linkages between school and work over the life course, youth employment and youth labor markets, inequality in higher education, and the transition to adulthood for disadvantaged populations. At RAND, he is currently coleading a national evaluation of correctional education for the Department of Justice’s Bureau of Justice Assistance.
  • Sinduja Srinivasan
    RAND Corporation
    E-mail Author
    SINDUJA V. SRINIVASAN is an assistant policy analyst at the RAND Corporation. Her overarching research interest is in economic development and growth. She focuses on poverty alleviation, entrepreneurship, labor market reforms, and education reforms. Recent publications include: “The Impact of Labour Market Regulation on Employment in Low-Income Countries: A Systematic Review” (prepared for the Department of International Development, 2012) and “Indonesia Urban Poverty Analysis and Program Review” (prepared for the World Bank, 2012).