Abstract

One explanation for negative or null findings in prior research on postsecondary remediation is that college may be too late to address issues of academic underpreparedness. This study evaluates the impact on student outcomes when college math remediation is offered in the senior year of high school. The Seamless Alignment and Integrated Learning Support (SAILS) program in Tennessee targets students with low eleventh-grade ACT math scores. Students who pass SAILS in twelfth grade can enroll directly in college-level math courses at any Tennessee community college. Using a triple-difference design, we exploit variation in students’ treatment status based on ACT math scores (remediation-eligible versus remediation-ineligible), high school adoption of SAILS (first cohort versus later cohort), and senior year (before versus during first SAILS year). We find that SAILS-eligible students in the first cohort were significantly less likely to enroll in remedial math courses in college, and more likely to enroll in and pass college-level math overall. These students also earned 2.8 additional credits by their second year. We detect no significant differences in high school graduation rates, college enrollment, or postsecondary credential attainment within two years. The program advanced progress toward several, but not all, of the potential goals examined.

1.  Introduction

Recent estimates suggest that over two thirds (68 percent) of students starting at a community college eventually enroll in at least one remedial course (Barry and Dannenberg 2016; Chen 2016). Remedial courses increase students’ time to degree, which is associated with a lower likelihood of completion (Attewell, Heil, and Reisel 2012). Among community college students nationwide, 15 percent of those who enroll in a developmental education course complete an associate's degree within six years (BPS 2009). Therefore, a nearly 70 percent placement rate into remedial courses is concerning, especially given that community colleges educate nine million undergraduates each year (Ginder and Kelly-Reid 2018). Furthermore, the number of students needing remediation has grown over the past two decades, increasing both institutional spending and the tuition students pay for remedial courses (Jenkins and Boswell 2002; Tierney and Garcia 2011; Martinez and Bain 2014; Belfield, Jenkins, and Lahr 2016). A recent study in Tennessee found that traditional remediation costs the public colleges $830 per student above the typical cost of instruction (Belfield, Jenkins, and Lahr 2016). In addition, large demographic and socioeconomic gaps exist in who is placed into remediation, with black and Hispanic students, low-income students, and female students most likely to enroll in remedial math courses (Long, Iatarola, and Conger 2009; Chen 2016).

Many of the larger-scale studies on the effects of remedial and developmental courses on student persistence and degree completion outcomes find negative or null effects for college students at the margin of passing out of remediation (see Jaggars and Stacey 2014). A likely explanation for these discouraging findings is that college is too late to address issues of underpreparedness. As such, American high schools and colleges have been experimenting with a variety of interventions designed to identify students’ remedial needs earlier in high school. Early warning systems, information campaigns, and high school/college collaborations are all becoming increasingly popular strategies designed to help students avoid remedial courses in college (Rutschow and Mayer 2018). Recent state-level policy discussions across the United States have focused on the need to better align K–12 systems with higher education to ensure a seamless transition from high school to college (Kurlaender 2014).

Our paper examines the impacts of one of the country's first statewide high school/college partnerships focused on academic readiness. In 2013–14, Tennessee launched a statewide effort to shift math remediation from college campuses back to high schools, both to save costs and to reduce the amount of time students spend remediating their math skills once in college. The program, known as Seamless Alignment and Integrated Learning Support (SAILS), is a blended learning math course with students working at their own pace on an online curriculum, facilitated by licensed math teachers in a computer-lab setting. For students whose eleventh-grade ACT math sub-score suggests that they may be placed into remedial math once in college, the SAILS program allows them to instead take that remedial course during their senior year of high school. Students passing the SAILS course in twelfth grade are then exempt from math remediation, and can begin college in college-level math courses if they attend one of the state's thirteen community colleges. The program served 121 of the state's roughly 400 high schools in 2013–14, and scaled up over time to include 268 schools as of 2017–18.

The SAILS program has been recognized nationally for its efforts to address issues of academic underpreparedness earlier in the college pipeline. In 2018, the Education Commission of the States awarded the program its Frank Newman Award for State Innovation, which recognizes states for “excellence in shaping education policy” (Tennessee Office of the Governor 2018). Descriptive data suggest the share of first-time freshmen requiring remediation in math at Tennessee's community colleges has decreased by 12.8 percentage points since the adoption of SAILS (Tennessee Office of the Governor 2018). This decline, however, cannot be attributed to the SAILS program alone, and could instead be the result of other statewide postsecondary reforms adopted over the past several years. For example, beginning in 2014–15, graduating high school seniors were eligible for the Tennessee Promise, a last-dollar scholarship covering tuition and fees for recent high school graduates attending a Tennessee community college. Once in college, these same students were placed only into corequisite remediation, which allowed them to enroll in college-level math while simultaneously enrolling in remedial math. As such, it is not clear whether the decline in the number of students taking remedial courses is a result of SAILS or another statewide policy.

Our study takes into consideration the changing statewide policy landscape in Tennessee over time as we isolate the impact of the high school SAILS course on students’ postsecondary outcomes. By focusing on the first year of large-scale adoption of the program (2013–14), we are able to track student progress for the 2013–14 cohort of high school students into their third year of college, as well as isolate the within-cohort effects of SAILS in a prerequisite policy context, similar to that of most other U.S. states. While the free community college and corequisite remediation policies that Tennessee later adopted are increasingly popular college access and success strategies, studying the effects of SAILS in an environment without these additional policies is particularly beneficial for those states that have yet to adopt such policies. As of 24 April 2019, states offered some version of a statewide college Promise Program, although with considerable variation across states in the eligibility requirements and benefits for students (College Promise Campaign 2019). A 2018 report by the Education Commission of the States cited sixteen states that recommended or mandated corequisite courses in their public colleges (Whinnery and Pompelia 2018; Rutschow et al. 2019). Twenty states have not yet adopted either type of policy, making the policy context in Tennessee in 2013–14 relevant to the context in these states today (as well as other systems and individual colleges in similar contexts).

Specifically, we address the following questions:

  1. Did access to SAILS increase the rate of college attendance for potentially remediation-eligible high school students in Tennessee?

  2. Among students who attended community college in Tennessee, did access to SAILS increase the number of credits earned, rates of persistence, and on-time associate's degree completion rates?

  3. How did access to SAILS affect enrollment in remedial and college-level math courses at community colleges and subsequent grades in these courses?

  4. Do these results differ by gender, race, high school urbanicity,1 and incoming level of preparation, as measured by the ACT math score?

For our primary analyses, we answer these questions using a difference-in-difference-in-difference (DDD) design, which capitalizes on three sources of variation: a student's junior-year ACT math sub-score (less than 19 or at least 19), SAILS availability at their high school (available in 2013–14 or not available until later), and their senior year (before or during the 2013–14 academic year). The first difference is based on the fact that SAILS implementation was specifically targeted for students with ACT scores below the remediation threshold of 19. The second difference distinguishes whether the student attended a high school that offered SAILS in 2013–14, the first full-scale year, or in a later year (either 2014–15 or 2015–16). Restricting our sample to the subset of high schools that offered SAILS by 2015–16 enables us to focus on schools with a similar interest in adopting the program, as well as the administrative capacity required to meet its technology requirements. The third difference identifies whether the student was a senior in 2013–14, the first full-scale year of SAILS, or an earlier year, when they could not have benefited from SAILS. Through this approach, we identify the effect of SAILS access on outcomes for students with remediation-eligible ACT math scores who attended high schools that offered SAILS during their senior year in 2013–14. We also use an alternative specification that checks for differing trends, by year, for low-scoring students and for particular high schools, and find consistent results. We discuss the full analytic strategy in the Methods section.

The purpose of this study is to understand how a high school course that exempts students from remediation in college impacts their subsequent college academic outcomes. We isolate our sample to the first cohort of SAILS so as to track students for three years into college, and to examine the effects in a policy context that does not include “free college,” a model that few states have adopted at scale. While several states have implemented initiatives to improve students’ pathways from K–12 into and through higher education, evaluating the evidence from existing efforts is critical for other states and districts considering moving to such a model.

Results from this study indicate that the SAILS program in Tennessee helped advance progress toward several, though not all, of the potential goals examined. We find SAILS-eligible students were significantly less likely to enroll in remedial math courses once in college, a pattern that holds across sex, race, and high school urbanicity. SAILS-eligible students were also more likely to enroll in college-level math, and more likely to pass college-level math overall—findings that were observed for all sex, race, and high school urbanicity subgroups (with the exception of black students, whose point estimates were also in a positive direction). Additionally, we find evidence that SAILS-eligible students earned significantly more credits within two years of high school, with magnitudes comparable to their elevated rates of taking and passing math overall. We do not detect significant differences in the high school completion and college enrollment rates for SAILS-eligible students. We also do not observe significant changes in the passage rates for college-level math courses or the proportion of students completing an on-time associate's degree. Taken together, these findings suggest that the SAILS program generally led to improvements on dimensions more immediately tied to the program design (e.g., reductions in math remediation, increases in taking and passing college-level math overall, increases in credits), with no detectable effects evident for measures less directly linked to math course-taking patterns (e.g., high school graduation, college enrollment, and associate's degree completion).

2.  SAILS Program Design

SAILS is a math course developed for public high school students in Tennessee who score below a 19 on the ACT math subtest. It is offered during a student's senior year in a high school classroom equipped with desktop or laptop computers, with a math teacher in the classroom at all times. Developed in partnership with the Tennessee Higher Education Commission, the Tennessee Board of Regents, and Chattanooga State Community College, the program was initially funded by the Governor's Online Innovation fund for eLearning initiatives.

SAILS schools are selected in a two-part process. First, school principals express interest in offering the course by completing a short application. SAILS program staff then select schools from this pool based on available funds and the space and teaching capacity for offering the course. Each school is allotted a certain number of seats in SAILS that is determined by the number of computers in a classroom and the cost per student for using the software. Because of these technology constraints, not all students with ACT math scores below 19 enroll in a SAILS course, even at schools that offer SAILS. According to program staff, when a school faces capacity constraints, teachers invite to SAILS the students who they feel are most comfortable working independently in a computer-driven instructional environment. Students may also be encouraged to enroll in SAILS if they express interest in attending college, or if their ACT math scores are closer to 19. The reasons students are either invited to enroll or choose to enroll in SAILS are not fully observable, making selection into SAILS classes an analytic issue we discuss further below. On average, in 2013–14, approximately 44 percent of students attending a SAILS school with an ACT math score below 19 enrolled in SAILS.

Figure 1 illustrates the likelihood of SAILS participation at high schools offering SAILS in 2013–14. The figure presents the proportion of students at each ACT math score who enrolled in SAILS at the 121 participating high schools, with circle size used to illustrate the number of students with each ACT math score. SAILS students are primarily those with ACT math scores below 19, although there are a few exceptions to the right of the vertical cutoff.
Figure 1.
Likelihood of Seamless Alignment and Integrated Learning Support (SAILS) Participation at High Schools Offering SAILS in 2013–14, by Junior-Year ACT Math Score

Notes: The size of each circle represents the number of students at each ACT math score. The dashed line denotes the ACT math cutoff for assignment to remedial math (i.e., ACT math < 19).


The curriculum for SAILS is identical to the math remediation course students would take in community college. Unlike traditional twelfth-grade math curricula, it relies on a Web-based interface and software program created by Pearson Higher Education. The curriculum comprises five modules and is self-paced and interactive. The five modules align with the remediation standards set forth by the Tennessee Board of Regents, and include: real number sense and operations; operations with algebraic expressions; analysis of graphs; solving equations; and modeling and critical thinking. Each module contains a narrative about the concept(s) presented, sample problems, and multiple-choice and free-response questions. Upon completing each module, students take a supervised post-test. Students who fail the test are allowed to complete the module again. A student who passes all five post-tests has completed SAILS and has passed out of remedial coursework in college. For the 2013–14 cohort of SAILS students, 72 percent completed all five modules. Those students can then enroll in college-level math courses at any Tennessee public college.

Students with ACT math scores below 19 who do not enroll in SAILS take a course called Bridge Math, which is also designed for students likely to need remediation prior to college-level math. Bridge Math covers many of the same math concepts as SAILS math, but is taught in a more traditional, instructor-led manner that emphasizes real-world applications. Significantly, unlike SAILS, students who complete Bridge Math are not exempted from remedial coursework at the state's community colleges.

3.  Theory of Action Behind High School Transition Courses

There are several reasons why a high school remediation intervention may influence students’ postsecondary outcomes. Most obviously, allowing students to complete their remedial requirements while still in high school saves them time and money once enrolled in college, both strong predictors of college completion (Edgecombe 2011). Research overwhelmingly supports the concept of academic momentum, with first-year credit accumulation strongly influencing degree completion (DesJardins, Ahlburg, and McCall 2006; Attewell, Heil, and Reisel 2012; Davidson 2014; Monaghan and Attewell 2015; Belfield, Jenkins, and Lahr 2016). This concept of academic momentum relates to remedial courses as well. Accelerated remedial education (compared to traditional semester-long courses) is associated with increased enrollment and completion of gatekeeper math and English courses (Jaggars, Edgecombe, and Stacey 2014). Conversely, enrollment in extended math courses (e.g., an extra semester of remedial math) is associated with reduced completion of remedial course sequences and graduation (Ngo and Kosiewicz 2017).

Second, remedial courses can be effective at resolving students’ math deficiencies, which allows for academic success in college-level courses (Bahr 2008; Bettinger and Long 2009). A modularized, computer-based math curriculum can help identify students’ problem areas, allowing them to skip portions of the curriculum they already know (Bickerstaff, Fay, and Trimble 2016). Hybrid courses that combine online and face-to-face interaction result in positive learning outcomes compared to fully online or fully face-to-face courses (Means et al. 2010, 2013). The self-paced but supervised structure of a high school remediation course helps students develop academic self-regulation skills that are needed in college coursework (Karp and Bork 2012).

This exposure to an independent style of college course may, in turn, allow students who are performing well to feel a greater sense of self-efficacy toward math. Students who report higher levels of self-efficacy are more likely to graduate from high school (Alivernini and Lucidi 2011), report a higher undergraduate college grade point average (Lotkowski, Robbins, and Noeth 2004), and report higher rates of student retention from the first to second semester and in subsequent years (Chemers, Hu, and Garcia 2001; Lotkowski, Robbins, and Noeth 2004; Zajacova, Lynch, and Espenshade 2005).

Finally, the SAILS program may work through its explicit alignment with college coursework. Given that the SAILS curriculum was originally developed within the community colleges themselves, SAILS is better aligned with a college-level math course than a typical high school math course might otherwise be. Curricular alignment between K–12 and higher education has long been a topic of research (Jackson and Kurlaender 2016). Misaligned systems often result in students feeling discouraged by college placement exams and the additional remedial coursework necessary to catch up to college-level courses (Deil-Amen and Rosenbaum 2003). High school-to-college linking activities have a positive impact on college enrollment at both two- and four-year institutions (Engberg and Wolniak 2010). Exposure to college-type coursework in high school, such as dual enrollment or Advanced Placement courses, increases college enrollment, preparedness, persistence, and completion (Karp et al. 2007; Speroni 2011; Karp, Hughes, and Cormier 2012; Klopfenstein and Lively 2012; Struhl and Vargas 2012; An 2015; Grubb, Scott, and Good 2017; Smith, Hurwitz, and Avery 2017). Although a SAILS course does not count for college credit in the same way that dual enrollment courses do, it does help students to complete prerequisites for college material, while educating them on academic expectations for college.

4.  Research on High School-to-College Transition Courses

SAILS is a hybrid of early intervention programs that serve underprepared students, similar to summer bridge programs and dual enrollment courses that provide students with the opportunity to take college classes (or in this case, remedial classes) in high school. Similar programs exist across the country, and more are adopted every year, although the research on these specific types of programs is still limited.

Early assessment programs in high school have been found to reduce the need for college remediation in North Carolina and California (Howell, Kurlaender, and Grodsky 2010; Hilgoe et al. 2016; Jackson and Kurlaender 2016), but assessment information about college-readiness alone may not be sufficient to remove the need for remedial education (Tierney and Garcia 2011; Venezia and Voloch 2012). Precollege math transition courses are a more targeted effort to address these needs. The City University of New York's At Home in College program assigns students to a math transition course in the semester prior to college. A recent study found that the program had no impact on college readiness, but a small positive impact on passing the college math gatekeeper course (Trimble et al. 2017). A similar program in California, Early Start, requires students in need of remedial math and English courses to complete these requirements in the summer before their freshman year. Research found no significant improvements in performance or persistence under Early Start for students in need of remediation (Kurlaender, Lusher, and Case 2017). Taken together, these studies suggest mixed findings on the impact of post-high school, precollege transition courses on performance in subsequent math classes.

Few studies examine the effects of college remediation programs taking place during high school. Research from seven college and career readiness partnerships in Illinois found that 70 percent of the participating college and career readiness students completed their remedial math or English course prior to enrolling in college (Bragg and Taylor 2014). In Florida, completion of a remedial Mathematics for College Readiness course among high school seniors significantly reduced students’ probability of taking remedial education courses in college, particularly for African American and female students (Alexander 2013).

The study most similar to ours takes place in West Virginia and examines the effect of participation in a mathematics transition course on college-level math outcomes for the 2011–12 and 2012–13 high school senior cohorts. The authors found that the intervention did not improve academic outcomes for underprepared students who were near the assessment cutoff for being placed into transition courses (Pheatt, Trimble, and Barnett 2016). Given the relative recency of these types of high school/college partnerships, there is limited evidence in the field on the effectiveness of these programs. Our study is among the first to examine the effects of enrolling in a remedial course in high school that, if completed successfully, exempts a student from needing to enroll in a remedial course in college.

5.  Empirical Strategy

Data

The school- and student-level data used for this analysis originate from multiple sources: SAILS program data, the Tennessee Longitudinal Data System, the Tennessee Department of Education (TDOE) data system, and the Tennessee Higher Education Commission student information system. Data from the TDOE include student demographic characteristics, student ACT test scores, public high school characteristics, and high school enrollment patterns for each senior who attended a public high school in Tennessee from the 2009–10 through 2013–14 school years. As a supplement to these data, we also merged selected attributes about public high schools in Tennessee from the Common Core of Data, a program of the U.S. Department of Education's National Center for Education Statistics.

Data from the SAILS program include a student-level indicator of participation in SAILS, along with measures of whether a student completed each of the five SAILS modules. Because some students attend multiple high schools during their senior year, we define a student's primary high school as the school attended for the greatest number of days during senior year. We identify SAILS schools as those high schools that were the primary high school for five or more SAILS students. This approach identifies 121 SAILS high schools in 2013–14, the first academic year that the SAILS program was fully implemented. We exclude 19 high schools that participated in a small-scale pilot in 2012–13,2 resulting in an analytic sample with 102 high schools that offered SAILS in 2013–14. Our comparison group includes 122 additional high schools that offered SAILS in 2014–15 or 2015–16 (but not 2012–13). At the student level, we restrict the sample to students for whom a junior-year ACT math score is available, given that the ACT math is used for assignment into SAILS classes. Over the time period examined, we observe a junior-year ACT math score for 82 percent of high school seniors.3
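
To make this school-identification rule concrete, the following is a minimal sketch of how it could be implemented; the file and column names (e.g., hs_enrollment.csv, days_enrolled, sails_2013_14) are hypothetical placeholders rather than fields from the study's data systems.

```python
import pandas as pd

# Hypothetical inputs: one row per student-school enrollment spell, plus a
# student-level file flagging SAILS participation in 2013-14.
enroll = pd.read_csv("hs_enrollment.csv")      # student_id, school_id, days_enrolled
sails = pd.read_csv("sails_participants.csv")  # student_id, sails_2013_14

# Primary high school: the school attended for the most days during senior year.
primary = (enroll.sort_values("days_enrolled", ascending=False)
                 .drop_duplicates("student_id")[["student_id", "school_id"]])

# Count SAILS participants whose primary school is each high school.
participants = sails.loc[sails["sails_2013_14"] == 1, ["student_id"]]
counts = primary.merge(participants, on="student_id").groupby("school_id").size()

# SAILS high schools: the primary school of five or more SAILS students.
sails_schools = counts[counts >= 5].index.tolist()
```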

At the postsecondary level, we use National Student Clearinghouse data for the term-by-term college enrollment of each student. From the Tennessee Higher Education Commission's student information system, we received information about awards of postsecondary degrees and certificates. We obtain math course enrollment information and grades for students who attended a community college in Tennessee. Finally, we add several postsecondary institution-level characteristics using the Integrated Postsecondary Education Data System from the National Center for Education Statistics.

Sample

SAILS classes enrolled approximately 7,000 high school seniors in 2013–14. In this study, we examine the 102 high schools that initially offered SAILS as part of the first large-scale cohort of 2013–14 (but did not participate in the pilot year) in order to track multiyear student outcomes. Focusing our study on this year also allows us to isolate the effect of SAILS separate from the Tennessee Promise and statewide corequisite remediation, which occurred in subsequent years. This allows us to more readily apply our findings to the policy contexts in most other states, few of which have adopted tuition-free community college while simultaneously eliminating stand-alone remedial courses.

Table 1 provides descriptive statistics about the demographic characteristics of Tennessee high school seniors from 2009–10 through 2013–14, for all students (the top panel) and for students with an ACT math score below 19 (the bottom panel). This table focuses on students who attended high schools in the analytic sample (i.e., 102 high schools that first offered SAILS in 2013–14 and 122 high schools that did not offer SAILS until the 2014–15 or 2015–16 academic years). Columns 1–4 provide an overview for all high school seniors at schools in the analysis sample separately by each pre-SAILS year. Columns 5 and 6 show the descriptive statistics for SAILS participants and SAILS nonparticipants in the first full year of the program. The years before SAILS implementation are included to capture trends in the state prior to the adoption of SAILS. In these pretreatment years (columns 1–4), we see little variation in the characteristics of high school students over time.

Table 1.

High School Student Characteristics, by High School Senior Year and Seamless Alignment and Integrated Learning Support (SAILS) Participation Status

                           2009–10    2010–11    2011–12    2012–13    2013–14      2013–14
                                                                       SAILS        Non-SAILS
                                                                       Students     Students
                             (1)        (2)        (3)        (4)        (5)          (6)
Panel A: All Students
White                       0.764      0.775      0.772      0.779      0.778        0.778
                           (0.425)    (0.417)    (0.420)    (0.415)    (0.416)      (0.416)
Black                       0.194      0.203      0.205      0.198      0.202        0.195
                           (0.395)    (0.402)    (0.404)    (0.398)    (0.402)      (0.397)
Hispanic                    0.029      0.034      0.039      0.042      0.042        0.049
                           (0.167)    (0.181)    (0.194)    (0.201)    (0.200)      (0.217)
Female                      0.513      0.508      0.502      0.508      0.533        0.500
                           (0.500)    (0.500)    (0.500)    (0.500)    (0.499)      (0.500)
ACT math                   17.910     17.935     17.974     18.015     16.172       18.770
                           (3.991)    (3.995)    (4.069)    (4.122)    (1.770)      (4.242)
ACT math < 19               0.687      0.670      0.681      0.663      0.934        0.576
                           (0.464)    (0.470)    (0.466)    (0.473)    (0.248)      (0.494)
High schools                  209        213        217        222        102          223
Students                   32,723     35,147     34,760     35,184      4,670       28,077
Panel B: Students with ACT Math Scores Below 19
White                       0.708      0.719      0.720      0.723      0.771        0.700
                           (0.455)    (0.450)    (0.449)    (0.447)    (0.420)      (0.458)
Black                       0.249      0.263      0.261      0.258      0.210        0.279
                           (0.432)    (0.440)    (0.439)    (0.437)    (0.408)      (0.448)
Hispanic                    0.033      0.039      0.045      0.048      0.043        0.058
                           (0.178)    (0.194)    (0.208)    (0.213)    (0.202)      (0.235)
Female                      0.533      0.522      0.511      0.518      0.542        0.516
                           (0.499)    (0.500)    (0.500)    (0.500)    (0.498)      (0.500)
ACT math                   15.661     15.592     15.647     15.548     15.856       15.740
                           (1.477)    (1.436)    (1.445)    (1.482)    (1.266)      (1.423)
High schools                  209        213        217        222        102          223
Students                   22,317     23,465     23,574     23,254      4,354       16,263

Notes: Table displays means, with standard deviations in parentheses. This table categorizes students based on their (most recent) senior year of high school, based on administrative data from Tennessee. The analytic sample includes 102 high schools that first offered SAILS in 2013—14 and 122 additional high schools that first offered SAILS in 2014—15 or 2015—16. The number of high schools represents the number of high schools in the analytic sample at which students in each column were enrolled (e.g., in 2013—14, there were 223 high schools with seniors not enrolled in SAILS).

Overall, as shown in panel A of table 1, there are also no notable differences in the racial and ethnic backgrounds of SAILS and non-SAILS students in 2013–14. In this unrestricted sample, a higher proportion of SAILS students than non-SAILS students were female (53 percent versus 50 percent, respectively). Predictably, the ACT math score for SAILS students was notably lower than that of their non-SAILS peers. In total, 93 percent of SAILS students had an ACT math score below the remediation threshold of 19, compared with 58 percent of non-SAILS students. Panel B, which focuses specifically on students with ACT math scores below 19, indicates that white students accounted for 77 percent of SAILS students, compared with just 70 percent of non-SAILS students. Female students also accounted for a larger share of SAILS students than non-SAILS students among those with ACT math scores below the remediation threshold (54 percent versus 52 percent, respectively).

Table 2 provides a descriptive summary of the 102 high schools that first offered SAILS in 2013–14 and the 122 high schools that did not offer SAILS until 2014–15 or 2015–16 (columns 1 and 2, respectively). The high schools that volunteered and were selected for SAILS in the first full year of the program (column 1) did not significantly differ from the later adopters (column 2) in terms of the share that were located in a rural area, the percent of the student body who were members of racial and ethnic minority groups, or the percent of students eligible for free or reduced-price lunch. Nearly half of all SAILS schools are located in rural areas. Just below 30 percent of students at these SAILS schools were members of racial/ethnic minority groups. Below we explore the impacts of SAILS for students attending rural, suburban, and urban high schools separately in an effort to determine if SAILS may be more or less effective in certain contexts.

Table 2.

Descriptive Summary of Tennessee High Schools that Adopted Seamless Alignment and Integrated Learning Support (SAILS)

                           Early SAILS      Later SAILS
                           High Schools     High Schools    t Statistic
                               (1)              (2)
Rural                         0.422            0.475           0.788
                             (0.496)          (0.501)
Racial minority students      0.273            0.297           0.552
                             (0.282)          (0.339)
FRPL-eligible students        0.539            0.561           0.924
                             (0.176)          (0.180)
Number of high schools          102              122

Notes: Table displays means, with standard deviations in parentheses. Early SAILS high schools refers to 102 high schools that first adopted SAILS in the 2013—14 academic year. Later SAILS high schools refers to 122 high schools that first adopted SAILS in the 2014—15 or 2015—16 academic year. FRPL = free or reduced-price lunch.

A descriptive summary of key outcome measures for high school seniors in the analytic sample is presented graphically in figure 2. Additionally, for reference, online appendix table A.2 compares the means for students at early SAILS high schools and later SAILS high schools across all outcomes for the 2013–14 year. These include, for all Tennessee high school seniors in the analytic sample from 2009–10 to 2013–14, graduation from high school, enrollment in college, and enrollment in a Tennessee community college (the left panel of figure 2); among students who enrolled in community college, the right panel of figure 2 displays rates of remedial and college-level math course-taking and passing. In the left panel, we observe consistent high school graduation rates and college enrollment rates over the time period of this study, with a subtle increase in the percent of students enrolling in a Tennessee community college in the 2013–14 academic year. In the right panel of figure 2, we observe that the percent of students taking remedial math declined sharply in the first year of full-scale SAILS adoption. In years prior to SAILS, approximately three quarters (73–78 percent) of low-scoring students who attended Tennessee community colleges in the first year after high school enrolled in remedial math coursework; for 2013–14, the first SAILS year, the share of SAILS treatment students who took remedial math courses dropped to 57 percent. Along with the decline in remedial coursework within the first year after high school was a concomitant increase in the share of these students who enrolled in college-level math courses, from 23 percent for 2009–10 seniors to 40 percent among 2013–14 seniors.
Figure 2.

Trends for College Enrollment Outcomes and Math Coursetaking Outcomes, 2009—10 through 2013—14 Senior Cohorts

Notes: TN CC = Tennessee Community College; HS = high school.


Methods

There are several methodological challenges resulting from the enrollment process into SAILS, as well as the scale-up of the program over time. Because computer capacity constraints prevented the program from being adopted across entire high schools, not all students who might have been eligible for participation had equal access to the program. This introduces an element of nonrandom selection into the process. We know from complementary qualitative research about SAILS schools that teachers may have selected not only students with higher ACT math scores for the program, but also those they believed would be the most inclined toward a self-paced, technology-assisted math course. This makes enrollment in SAILS a within-school selection issue. For these reasons, an ordinary least squares regression analysis, even with high school and year fixed effects, would still not address these important issues of selection into the program.

Another possible analytic strategy is the use of a regression discontinuity (RD) design. A RD design is used when there is a known assignment mechanism with a discontinuity in eligibility at a cut-point, and assignment is as good as random near the cutoff. Although there is a clear remediation eligibility threshold (ACT math < 19), concerns arise when using a RD approach on a 36-point ACT scale. The What Works Clearinghouse guidelines for RD design require the analysis to use at least four unique values of the forcing variable above and below the cutoff (US ED 2017). Based on the threshold at 19, the What Works Clearinghouse guidance would mean comparing students with junior year ACT math scores of 15–18 and 19–22 when using a 36-point scale. It is difficult to argue convincingly, however, that eligibility within four scaled ACT points of the cutoff is “as good as random,” given the difference in correct answers required for scores in this range. For example, a score of 15 typically requires 15–18 correct answers, while a score of 22 typically requires 34–35 correct answers on the 60-question ACT math section. A RD design would be more persuasive if we had access to the underlying raw scores (i.e., prior to conversion to a 36-point scale), which would facilitate comparisons based on a narrower bandwidth around the eligibility cutoff.4

Instead, we adopt a DDD model, accounting for differences in students’ ACT scores (above/below 19), availability of SAILS at high schools (early SAILS or later SAILS), and senior year timing (before/during 2013–14). As such, we compare changes in postsecondary outcomes for remediation-eligible students attending SAILS high schools in the program's first full-scale year to changes for similarly remediation-eligible students at high schools that did not offer SAILS during the same time period (but did eventually offer SAILS). Therefore, our estimates capture the within-cohort effects of offering SAILS in a high school in 2013–14 relative to the secular trends we infer from remediation-eligible and remediation-ineligible students at the high schools that did not adopt SAILS in this year (but did in the following two academic years). These later-adopting high schools provide the counterfactual of the outcomes that remediation-eligible students at early SAILS high schools would have experienced had their high schools not offered the program during their senior year. We estimate the following model:
$$Y = \alpha + \beta_1\,\mathrm{Below19} + \beta_2\,\mathrm{SAILSAvail} + \beta_3\,(\mathrm{Below19} \times \mathrm{SAILSAvail}) + X_i\beta + \delta_s + \lambda_t + \varepsilon, \tag{1}$$
where Below19 is equal to one if a student's junior-year ACT math score is below 19 (and zero otherwise), and SAILSAvail equals one if a high school offered SAILS in 2013–14. X_iβ is a vector of student covariates potentially related to SAILS placement, including gender, race, and ethnicity. For outcomes that are contingent on college enrollment, this term also includes Pell Grant eligibility status. δ_s and λ_t are high school and year fixed effects, respectively, with the year fixed effect capturing the difference related to a student's senior year. ε is an idiosyncratic error term clustered at the high school level. Y represents our outcomes of interest, which include high school graduation, college enrollment, number of credits earned, persistence to years 2 and 3, associate's degree and certificate completion, and enrolling in and passing college-level math courses. For simplicity, we report our main outcomes after two years for college enrollment, persistence, and degree completion (tables 3 and 4), and outcomes after one year for math course-taking (table 5).5
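
As a minimal sketch of how the reduced-form model in equation 1 might be estimated, the code below assumes a student-level data frame with hypothetical variable names (act_math, early_sails_hs, senior_2013_14, hs_id, senior_year, took_remedial_math); in particular, treating SAILSAvail as an indicator for attending an early SAILS high school during the 2013–14 senior year is an assumption of the sketch, not a line-by-line replication of the estimation code used for the paper.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level analytic file; one row per high school senior.
# Assumes no missing values, so the cluster groups align with the estimation sample.
df = pd.read_csv("analytic_sample.csv")
df["below19"] = (df["act_math"] < 19).astype(int)
# One reading of SAILSAvail: the student's high school offered SAILS during
# their senior year (i.e., an early SAILS school and a 2013-14 senior).
df["sails_avail"] = (df["early_sails_hs"] * df["senior_2013_14"]).astype(int)

# Reduced-form DDD with high school and senior-year fixed effects;
# standard errors clustered at the high school level.
fit = smf.ols(
    "took_remedial_math ~ below19 * sails_avail + female + C(race)"
    " + C(hs_id) + C(senior_year)",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["hs_id"]})

print(fit.params["below19:sails_avail"])  # beta_3: the intent-to-treat estimate
```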
Table 3.

Estimates of Seamless Alignment and Integrated Learning Support (SAILS) on High School Completion and College Enrollment Outcomes, 2009—10 through 2013—14 High School Cohorts

                                                          Reduced     Instrumental     Mean (ACT math < 19 at
                                                           Form       Variables (IV)   later-adopting SAILS HS)
                                                            (1)            (2)
SAILS participant                                         0.389***         —                   0.000
                                                         (0.027)
Completed high school                                     0.001          −0.002                0.934
                                                         (0.004)        (0.010)
Enrolled in college by spring of second academic
  year after high school graduation                      −0.001           0.000                0.541
                                                         (0.010)        (0.020)
Enrolled in TBR CC by spring of second academic
  year after high school graduation                      −0.005          −0.026                0.267
                                                         (0.013)        (0.020)
Enrolled in 4-year college by spring of second
  academic year after high school graduation              0.002           0.015                0.254
                                                         (0.009)        (0.020)
Observations                                                                                 170,573

Notes: Standard errors in parentheses; errors are clustered at the high school level. Regressions include the following covariates: race, ethnicity, and sex. High school completion includes diploma and other alternative certifications and is limited to students observed as seniors. Excludes college enrollment while in high school. The analytic sample includes 102 high schools that first offered SAILS in 2013—14 and 122 additional high schools that first offered SAILS in 2014—15 or 2015—16. TBR CC = Tennessee Board of Regents Community College.

***p < .001.

Table 4.

Coefficient Estimates of Seamless Alignment and Integrated Learning Support (SAILS) on Community College Performance Outcomes for Students Who Attended a Tennessee Community College (TN CC), 2009—10 through 2013—14 High School Cohorts

                                                          Reduced     Instrumental     Mean (ACT math < 19 at
Dependent Variable                                         Form       Variables (IV)   later-adopting SAILS high school)
                                                            (1)            (2)
Persistence to year 3 of college (if attended TN CC)      0.012           0.022                0.611
                                                          (0.020)        (0.045)
                                                          [24,875]       [24,875]
Total credits earned by year 2 after high school
  (if attended TN CC)                                      2.828***       5.856***            23.951
                                                          (0.832)        (1.647)
                                                          [33,448]       [33,448]
Earned AA by year 2 after high school
  (if attended TN CC)                                      0.009           0.011               0.060
                                                          (0.010)        (0.024)
                                                          [37,740]       [37,740]
Earned certificate by year 2 after high school
  (if attended TN CC)                                     −0.004           0.005               0.050
                                                          (0.010)        (0.020)
                                                          [37,740]       [37,740]
Earned credential (AA/certification) by year 2 after
  high school (if attended TN CC)                         −0.000           0.002               0.105
                                                          (0.012)        (0.028)
                                                          [37,740]       [37,740]

Notes: Standard errors in parentheses; errors are clustered at the high school level. Associate's degree (AA) completion limited to students who attended associate's-granting institutions. Regressions include the following covariates: race, ethnicity, sex, and Pell Grant eligibility status. The analytic sample includes 102 high schools that first offered SAILS in 2013—14 and 122 additional high schools that first offered SAILS in 2014—15 or 2015—16.

***p < .001.

Table 5.

Coefficient Estimates of Seamless Alignment and Integrated Learning Support (SAILS) on Math Course Outcomes for Students Who Attended a Tennessee Community College, 2009–10 through 2013–14 High School (HS) Cohorts

                                               DDD        Instrumental    Difference-in-Differences, by ACT Math Score
                                             Reduced       Variables     ACT Math      ACT Math      ACT Math    Mean (ACT math < 19 at
Dependent Variable                            Form           (IV)          ≤ 16       = 17 or 18       ≥ 19      later-adopting SAILS high school)
                                               (1)           (2)           (3)           (4)            (5)         (6)
Took remedial math by year 1 after HS       −0.312***     −0.723***     −0.320***     −0.244***        0.002        0.766
                                            (0.026)       (0.030)       (0.032)       (0.033)         (0.004)
                                            [37,740]      [37,740]      [25,913]      [13,889]        [24,862]
Took college math by year 1 after HS         0.217***      0.512***      0.189***      0.105***        0.026        0.260
                                            (0.022)       (0.037)       (0.024)       (0.031)         (0.026)
                                            [37,740]      [37,740]      [25,913]      [13,889]        [24,862]
Passed college math by year 1 after HS
  (if took math)                            −0.062**      −0.136***     −0.099***     −0.052*         −0.016        0.700
                                            (0.019)       (0.038)       (0.038)       (0.031)         (0.022)
                                            [15,507]      [15,507]      [9,262]       [8,187]         [18,682]
Passed college math by year 1 after HS       0.120***      0.281***      0.082***      0.053**         0.009        0.182
                                            (0.020)       (0.035)       (0.018)       (0.026)         (0.026)
                                            [37,740]      [37,740]      [25,913]      [13,889]        [24,862]

Notes: Standard errors in parentheses; errors are clustered at the high school level. Math course data is available for community colleges in Tennessee. Excludes college-level math courses taken while in high school. Regressions include the following covariates: race, ethnicity, sex, and Pell Grant eligibility status. The analytic sample includes 102 high schools that first offered SAILS in 2013—14 and 122 additional high schools that first offered SAILS in 2014—15 or 2015—16.

*p < .05; **p < .01; ***p < .001.

Equation 1 provides intent-to-treat (ITT) estimates of the within-cohort effects of being remediation-eligible in SAILS-offering high schools across Tennessee. One advantage of the reduced-form DDD equation is that it offers policy-relevant estimates of the effects of offering SAILS for all targeted students, not just the effects for SAILS participants themselves. Given the capacity constraints of the program (e.g., approximately 40 percent of eligible students at participating schools took SAILS), the reduced-form DDD estimates may suggest realistic impacts of the SAILS program at scale.

To recover treatment-on-the-treated estimates for the effects of SAILS, we also use an instrumental variables (IV) approach in which we instrument for SAILS participation based on whether a student had a junior-year ACT math score below 19 and whether the student attended a high school that offered SAILS in their senior year, using school and year fixed effects to account for temporal trends and school-level variation. In the case of the IV approach, the treatment group includes students who were induced into taking SAILS through the combination of their attendance at an early SAILS high school and having a junior-year ACT math score below 19, while the counterfactual is based on students who did not enroll in SAILS. Assuming that the necessary conditions are satisfied, the IV approach yields a local average treatment effect, or the impact of actually enrolling in the SAILS program. Given the strong assumptions required for this approach, it is likely that the IV results presented here represent upper-bound estimates. See online appendix table A.6 for a summary of first-stage results by subgroup.

A critical condition for a valid IV analysis is that the exclusion restriction must hold for our instrument (i.e., having an ACT math score below 19 at an early SAILS high school). This means that the only way that scoring below a 19 on the ACT math exam at an early SAILS high school should affect college outcomes is through participation in SAILS. It seems plausible that a student's likelihood of taking remedial math in college would not change unless the student completed SAILS, given that completion of the program provides an exemption from remediation. One might hypothesize that teachers’ selection of SAILS participants could lead to some students feeling accepted/encouraged and others feeling excluded/discouraged, which might affect their subsequent college and math enrollment behaviors. However, if one considers all of these messages as part of the SAILS program, the exclusion restriction seems reasonably likely to be upheld. Relative to the reduced-form DDD approach, the IV model produces estimates with coefficients that are rescaled proportional to the rate of participation in the program. The reduced-form equation resembles the second stage of the two-stage IV model, except that the reduced-form equation directly substitutes the instrument in place of the predicted SAILS participation value from the first-stage equation. We discuss both the reduced-form ITT and the IV local average treatment effect estimates in our Results section.
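
The sketch below illustrates one way this IV model could be set up, instrumenting SAILS participation with the interaction of remediation eligibility and SAILS availability; the linearmodels package and the variable names continue the hypothetical conventions of the earlier sketch and are not drawn from the study itself.

```python
import pandas as pd
from linearmodels.iv import IV2SLS

df = pd.read_csv("analytic_sample.csv")
df["below19"] = (df["act_math"] < 19).astype(int)
df["sails_avail"] = (df["early_sails_hs"] * df["senior_2013_14"]).astype(int)
df["z"] = df["below19"] * df["sails_avail"]  # instrument: eligible at an early SAILS school in 2013-14

# 2SLS: SAILS participation (endogenous) instrumented by z, with exogenous
# controls and high school / senior-year fixed effects; clustered errors.
iv = IV2SLS.from_formula(
    "passed_college_math ~ 1 + below19 + sails_avail + female + C(race)"
    " + C(hs_id) + C(senior_year) + [sails_participant ~ z]",
    data=df,
).fit(cov_type="clustered", clusters=df["hs_id"])

print(iv.params["sails_participant"])  # local average treatment effect estimate
```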

We also consider two alternative estimation strategies in addition to the model described in equation 1. First, we use an alternative DDD model that adds an interaction between having an ACT math score below 19 and the year, as well as an interaction between having an ACT math score below 19 and the high school. As in equation 1, a year fixed effect helps account for variation based on a student's senior year. Such a specification allows for the possibility that students with junior-year ACT math scores below 19 followed different year- and school-specific trends than their counterparts with higher ACT math scores. This model is captured in equation 2:
$$Y = \alpha + \beta_1\,\mathrm{Below19} + \beta_2\,\mathrm{SAILSAvail} + \beta_3\,(\mathrm{Below19} \times \mathrm{SAILSAvail}) + X_i\beta + \delta_s + (\delta_s \times \mathrm{Below19}) + \lambda_t + (\lambda_t \times \mathrm{Below19}) + \varepsilon. \tag{2}$$

Second, relying on the base model in equation 1, we also explore the results if we restrict the sample only to students within three points of the 19 ACT math threshold (i.e., ACT math score of 16–21) to ensure that students with test scores at the very low or high ends of the distribution are not driving the results. Prior research on college remediation suggests that students who barely miss college readiness cut scores may require a different intervention than students with much lower test scores (Boatman and Long 2018). It may be that the SAILS program varies in its impact for different student groups. The results for these two alternative estimation strategies are presented in online appendix table A.3. The two alternative estimation strategies yield estimates that are qualitatively similar to the main reduced-form model, with the exception of credit attainment in the approach that restricts the sample to students with ACT math scores within three points of the cutoff. Because of the broad similarities across these specifications, we focus on the ITT results of the main reduced-form model described in equation 1.
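
Under the same hypothetical variable names used in the earlier sketches, the code below illustrates both checks: the interacted specification in equation 2 and the restriction to ACT math scores of 16–21.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_sample.csv")
df["below19"] = (df["act_math"] < 19).astype(int)
df["sails_avail"] = (df["early_sails_hs"] * df["senior_2013_14"]).astype(int)

# Equation 2: allow below-19 students their own school- and year-specific patterns
# by interacting the below-19 indicator with the fixed effects.
eq2 = smf.ols(
    "took_remedial_math ~ below19 * sails_avail + female + C(race)"
    " + C(hs_id) * below19 + C(senior_year) * below19",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["hs_id"]})

# Bandwidth check: re-estimate equation 1 for ACT math scores of 16-21 only.
near = df[df["act_math"].between(16, 21)].copy()
band = smf.ols(
    "took_remedial_math ~ below19 * sails_avail + female + C(race)"
    " + C(hs_id) + C(senior_year)",
    data=near,
).fit(cov_type="cluster", cov_kwds={"groups": near["hs_id"]})
```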

As is common for DDD approaches, we also test for common trends and perform other robustness checks, as described below. In figure 3, we present trends in credit attainment and math course-taking outcomes for the students in our analytic sample over time. Each graph compares high school seniors attending a SAILS high school that adopted SAILS in 2013–14 (dark gray lines) to students attending a SAILS high school that adopted SAILS in 2014–15 or 2015–16 (light gray lines), with separate groups based on whether the student's junior-year ACT math score was below 19 (solid lines) or at least 19 (dashed lines). Although the parallel trends assumption is not directly testable, the figures provide visual evidence of the similar pretreatment fluctuations for measures on which we detect statistically significant effects for 2013–14 seniors who were SAILS-eligible and primarily attended a high school that offered SAILS. The individual graphs represent several outcomes for students who enrolled in a Tennessee community college within one year of high school: the total number of credits accumulated within the first year of college, the percent of students taking remedial math and college math within the first year, the rate at which students passed college math in their first year (conditional on taking math), and the overall percent of students passing college math by the first year. Consistent with the notion of parallel trends, students at early SAILS high schools (dark gray lines) generally exhibit similar trends on these measures compared to students at later SAILS high schools (light gray lines) in the years prior to the adoption of SAILS in 2013–14, for both students below and above the remediation threshold. These lines begin to diverge in 2013–14 for students with ACT math scores below 19 with the adoption of the SAILS program.
Figure 3.

Trends in Credit Attainment and Math Coursetaking Outcomes, by High School (HS) Status as Early/Later Adopter of Seamless Alignment and Integrated Learning Support (SAILS) and Junior-Year ACT Math Score, 2009—10 through 2013—14 Senior Cohorts

Notes: CC = community college.
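
For readers who want to assemble a figure of this kind, the sketch below computes group-by-cohort means for one outcome and plots the four comparison series; the variable names remain the hypothetical ones used in the earlier sketches.

```python
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("analytic_sample.csv")
df["below19"] = df["act_math"] < 19

# Mean outcome by senior-year cohort for the four comparison groups.
trends = (df.groupby(["senior_year", "early_sails_hs", "below19"])
            ["took_remedial_math"].mean()
            .unstack(["early_sails_hs", "below19"]))

fig, ax = plt.subplots()
for (early, below), series in trends.items():
    ax.plot(series.index, series.values,
            linestyle="-" if below else "--",   # solid: ACT math below 19
            color="0.2" if early else "0.6",    # dark: early SAILS school
            label=f"{'Early' if early else 'Later'} SAILS HS, ACT math {'< 19' if below else '>= 19'}")
ax.set_xlabel("High school senior cohort")
ax.set_ylabel("Share taking remedial math in year 1")
ax.legend()
plt.show()
```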


6.  Results

Table 3 presents the effects of attending a high school that offered SAILS during the senior year on high school graduation and college attendance. From the reduced-form model in column 1, SAILS eligibility (i.e., having a junior-year ACT math score below 19 and attending a high school offering SAILS in 2013–14) increased the rate of participation in SAILS by 39 percentage points at the high schools in the analytic sample.6 Although this is a sizable share of eligible students, it also illustrates the computer capacity constraints that many high schools experienced within the first year of the SAILS program. We interpret our reduced-form DDD model (column 1) as our primary model. All else equal, column 1 suggests that eligibility for SAILS was not associated with statistically significant differences in the rate of enrollment into a community college or a four-year college. Fifty-four percent of high school seniors with ACT math scores below 19 in our sample enrolled in college by the spring of the second year after high school. There also appears to be no effect of SAILS eligibility on high school degree completion. Similarly, among SAILS participants themselves, our IV model (column 2) does not detect any statistically significant effects for high school graduation or college enrollment rates. For reduced-form DDD estimates for outcomes at a variety of different time points, see online appendix table A.3.

Table 4 focuses on several college outcomes for students who attended a Tennessee community college. Credits earned are expressed in the original units, while the remaining outcomes are represented as proportions. Among students who enrolled in community college, we do not observe a statistically significant effect on retention to the third year of college or credential attainment rates within two years. For all of these outcomes, we find fairly precise null effects. We do, however, observe differences in the number of total credits earned by the second year after high school. For SAILS-eligible students who enroll in community college (column 1), we detect 2.8 additional credits earned within two years. Among students who actually participated in SAILS (column 2), the estimated impact of SAILS is an increase of 5.9 additional credits within two years. Based on the descriptive statistics available in online appendix table A.2, these increases in credits represent approximately 11 percent of a standard deviation for SAILS-eligible students, and 23 percent of a standard deviation for SAILS participants.
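The standardized magnitudes follow directly from dividing the credit estimates by the standard deviation of credits earned reported in appendix table A.2 (not reproduced here); the ratios reported in the text imply a standard deviation of roughly 25 to 26 credits:

\[
\frac{2.8}{\sigma_{\text{credits}}} \approx 0.11
\quad\text{and}\quad
\frac{5.9}{\sigma_{\text{credits}}} \approx 0.23,
\qquad\text{which implies}\quad \sigma_{\text{credits}} \approx 25\text{–}26 \text{ credits}.
\]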

The outcomes in table 5 focus on enrollment and performance in math courses for students who attended one of the thirteen Tennessee community colleges. Consistent with the goals and structure of the SAILS program, SAILS-eligible students experienced clear reductions in the rate of remedial math course-taking, amounting to a 31.2 percentage point decline by the spring after their senior year of high school. For SAILS participants, this corresponded to a 72.3 percentage point decline in remedial course-taking, a magnitude nearly equivalent to the 76.6 percent of students with comparable junior-year ACT math scores at later-adopting high schools who took remedial math. SAILS-eligible students also experienced increased rates of college-level math course-taking, a rate 21.7 percentage points higher by the spring of their first year of college (51.2 percentage points higher for SAILS participants). While the share of SAILS students who took college-level math increased, our reduced-form estimates indicate a statistically significant 6.2 percentage point decline in the rate at which SAILS-eligible students passed college-level math (among students who took college-level math) within the first year after high school (13.6 percentage points lower for SAILS participants). Although the pass rate declined among students who took college-level math, we observe that the overall share of community college students passing college-level math within one year of high school was 12.0 percentage points higher for SAILS-eligible students (28.1 percentage points higher for SAILS participants). This is consistent with what we would expect, given the selection into college math-taking prior to SAILS.

Table 5 also provides difference-in-difference estimates (omitting the third difference based on junior-year ACT math scores) to explore variation by ACT score subgroups. It could be the case that the effects of SAILS on college math performance differ for students with very low ACT math scores compared with students with scores closer to the ACT eligibility cutoff of 19. Columns 3–5 present estimates for students with ACT math scores at or below 16, for students with ACT math scores of 17 or 18, and for students with ACT math scores at or above 19, respectively. Across all college math outcomes, estimates for students with ACT math scores of 16 or lower are larger in magnitude than those for students with a 17 or 18, with, as we would expect, no significant differences detected for students with scores of 19 or higher.

Finally, table 6 presents the reduced-form DDD estimates by subgroup, including sex, race, and high school urbanicity. Overall, consistent with program goals, we observe a statistically significant decline in enrollment in remedial math courses among SAILS-eligible students, a pattern that holds across sex, race, and high school urbanicity. We were also able to detect increases in college-level math-taking and overall increases in the share of SAILS-eligible students who passed college-level math for all sex, race, and high school urbanicity subgroups, with the exception of black students (whose point estimates were also in a positive direction). We also find some evidence of variation in outcomes by subgroups. For high school completion, SAILS eligibility was associated with a 1.2 percentage point increase for students at rural schools, but a 1.8 percentage point decrease for students at urban schools.7

Table 6.

Reduced-Form (RF) Coefficient Estimates of Seamless Alignment and Integrated Learning Support (SAILS) on College Performance Outcomes, 2009—10 through 2013—14 High School (HS) Cohorts

Dependent Variable | Overall (RF) | Sex: Women | Sex: Men | Race: White | Race: Black | Urbanicity: Rural | Urbanicity: Suburban | Urbanicity: Urban
SAILS participant 0.389*** 0.410*** 0.367*** 0.392*** 0.369*** 0.359*** 0.472*** 0.322*** 
 (0.027) (0.029) (0.028) (0.028) (0.046) (0.046) (0.047) (0.044) 
Completed high school 0.001 0.002 −0.000 0.003 0.003 0.012* 0.002 −0.018* 
 (0.004) (0.005) (0.006) (0.005) (0.007) (0.006) (0.007) (0.008) 
Enrolled in college by spring of second academic year after HS graduation −0.000 −0.001 0.000 0.003 0.002 0.011 −0.022 0.007 
 (0.010) (0.013) (0.013) (0.011) (0.025) (0.014) (0.018) (0.018) 
Enrolled in TN CC by spring of second academic year after HS graduation −0.004 0.008 −0.015 −0.001 0.012 0.010 −0.013 −0.017 
 (0.013) (0.015) (0.015) (0.013) (0.030) (0.021) (0.023) (0.021) 
Enrolled in 4-year college by spring of second academic year after HS graduation 0.001 −0.016 0.016 −0.001 0.034 −0.003 −0.012 0.021 
 (0.009) (0.013) (0.014) (0.009) (0.024) (0.015) (0.014) (0.019) 
 
Persistence to year 3 of college (if attended TN CC) 0.012 0.041 −0.024 0.006 0.068 0.032 −0.011 0.031 
 (0.020) (0.030) (0.032) (0.022) (0.111) (0.031) (0.034) (0.043) 
Total credits earned by year 2 after HS (if attended TN CC) 2.828*** 4.055** 0.809 2.809*** 3.498 4.027* 1.642 3.033** 
 (0.832) (1.257) (1.134) (0.815) (3.671) (1.536) (1.443) (1.112) 
Earned associate's by year 2 after HS (if attended TN CC) 0.009 0.014 0.003 0.007 0.011 −0.013 0.017 0.023 
 (0.010) (0.017) (0.013) (0.011) (0.038) (0.019) (0.013) (0.016) 
Earned certificate by year 2 after HS (if attended TN CC) −0.004 −0.008 −0.004 −0.005 0.016 −0.005 0.001 −0.006 
 (0.010) (0.014) (0.014) (0.011) (0.034) (0.021) (0.016) (0.009) 
Earned credential by year 2 after HS (if attended TN CC) −0.000 −0.003 −0.002 −0.001 0.014 −0.029 0.015 0.016 
 (0.012) (0.017) (0.019) (0.014) (0.047) (0.024) (0.018) (0.016) 
Took remedial math by year 1 after high school (if attended TN CC) −0.312*** −0.338*** −0.265*** −0.315*** −0.307*** −0.267*** −0.377*** −0.289*** 
 (0.026) (0.026) (0.032) (0.027) (0.057) (0.038) (0.045) (0.044) 
Took college math by year 1 after high school (if attended TN CC) 0.217*** 0.243*** 0.188*** 0.214*** 0.117 0.191*** 0.265*** 0.205*** 
 (0.022) (0.027) (0.028) (0.024) (0.063) (0.043) (0.030) (0.025) 
Passed college math by year 1 after high school (if attended TN CC & took math) −0.062** −0.009 −0.136*** −0.054* −0.020 −0.049 −0.067 −0.053 
 (0.019) (0.027) (0.038) (0.021) (0.091) (0.034) (0.034) (0.031) 
 
Passed college math by year 1 after high school (if attended TN CC) 0.120*** 0.173*** 0.056* 0.125*** 0.054 0.115* 0.153*** 0.093*** 
 (0.020) (0.028) (0.027) (0.022) (0.068) (0.044) (0.027) (0.024) 

Notes: Standard errors in parentheses; errors are clustered at the high school level. Excludes college-level math courses taken while in high school. High school completion includes diploma and other alternative certifications and is limited to students observed as seniors. Associate's degree completion limited to students who attended associate's-granting institutions. Regressions include the following covariates: race, ethnicity, sex, and (for outcomes contingent on college enrollment) Pell Grant eligibility status. The analytic sample includes 102 high schools that first offered SAILS in 2013—14 and 122 additional high schools that first offered SAILS in 2014—15 or 2015—16. TN CC = Tennessee Community College.

*p < .05; **p < .01; ***p < .001.

7.  Robustness Checks

We conducted three robustness checks. First, we conducted a falsification test in which we dropped data from the actual SAILS year (2013–14) and assigned placebo SAILS treatment status in earlier years (table 7). Using our reduced-form DDD model to estimate the effect of this artificial SAILS adoption, we shifted each school's actual adoption year earlier by one, two, and three years. For example, in the falsification test with placebo treatment two years early, schools that adopted SAILS in 2013–14 were assigned a false adoption year of 2011–12. Because this test uses artificial adoption years and excludes data from years in which SAILS was actually in effect, we would not expect to find significant results unless factors unrelated to SAILS led to changes in the outcomes. As shown in table 7, we do not observe any statistically significant effects in this falsification test, which buttresses our confidence in the findings from the main analyses.
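The sketch below illustrates the placebo logic under the same hypothetical column names used in the earlier sketch, plus an assumed `sails_adoption_year` column; it is an illustration of the idea, not the authors' code.

```python
# Sketch of the falsification (placebo) exercise: drop the true treatment year
# and pretend each school adopted SAILS k years earlier than it actually did.
# Column names (cohort, sails_adoption_year, etc.) are placeholders; cohort is
# assumed to be coded by fall year (e.g., 2012 for the 2012-13 senior class).
import statsmodels.formula.api as smf

def placebo_estimate(df, outcome: str, k: int) -> float:
    placebo = df[df["cohort"] <= 2012].copy()   # keep 2009-10 through 2012-13 seniors
    placebo["post"] = (
        placebo["cohort"] >= placebo["sails_adoption_year"] - k
    ).astype(int)
    fit = smf.ols(
        f"{outcome} ~ below_19 * early_sails_hs * post + C(cohort)",
        data=placebo,
    ).fit(cov_type="cluster", cov_kwds={"groups": placebo["hs_id"]})
    return fit.params["below_19:early_sails_hs:post"]
```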

Table 7.

Falsification Test Coefficient Estimates of Seamless Alignment and Integrated Learning Support (SAILS) on Selected Outcomes, 2009—10 through 2012—13 High School Cohorts

Dependent Variable | (1) Reduced Form: Placebo 3 Years Early | (2) Reduced Form: Placebo 2 Years Early | (3) Reduced Form: Placebo 1 Year Early
SAILS participant 0.000 0.000 0.000 
 (0.000) (0.000) (0.000) 
 [137,818] [137,818] [137,818] 
Completed high school −0.004 0.003 0.008 
 (0.005) (0.005) (0.007) 
 [137,818] [137,818] [137,818] 
Enrolled in college by fall of first academic year after high school graduation 0.000 −0.006 0.003 
 (0.010) (0.012) (0.013) 
 [137,818] [137,818] [137,818] 
Enrolled in college by spring of second academic year after high school graduation −0.009 −0.004 0.010 
 (0.010) (0.011) (0.012) 
 [137,818] [137,818] [137,818] 
Enrolled in TN CC by fall of first academic year after high school graduation −0.002 −0.005 0.005 
 (0.007) (0.009) (0.011) 
 [137,818] [137,818] [137,818] 
Enrolled in TN CC by spring of second academic year after high school graduation −0.016 −0.008 0.010 
 (0.009) (0.010) (0.013) 
 [137,818] [137,818] [137,818] 
Enrolled in 4-year college by fall of first academic year after high school graduation 0.005 −0.010 −0.017 
 (0.010) (0.010) (0.013) 
 [137,818] [137,818] [137,818] 
Enrolled in 4-year college by spring of second academic year after high school graduation 0.006 −0.004 −0.017 
 (0.010) (0.010) (0.012) 
 [137,818] [137,818] [137,818] 
Persistence to year 2 of college (if attended TN CC) −0.001 0.009 0.016 
 (0.021) (0.023) (0.027) 
 [30,140] [30,140] [30,140] 
Persistence to year 3 of college (if attended TN CC) 0.044 −0.007 0.018 
 (0.024) (0.030) (0.036) 
 [19,983] [19,983] [19,983] 
Total credits earned by year 1 after high school (if attended TN CC) −0.324 −0.080 −0.025 
 (0.567) (0.571) (0.633) 
 [26,754] [26,754] [26,754] 
Total credits earned by year 2 after high school (if attended TN CC) −0.129 −0.173 0.048 
 (1.064) (1.041) (1.118) 
 [26,754] [26,754] [26,754] 
Total credits earned by year 3 after high school (if attended TN CC) 0.527 −0.127 −0.200 
 (1.519) (1.502) (1.573) 
 [26,754] [26,754] [26,754] 
Enrolled full-time in first semester (if attended TN CC) 0.012 −0.001 −0.027 
 (0.026) (0.024) (0.028) 
 [29,881] [29,881] [29,881] 
Earned associate's by year 2 after high school (if attended TN CC) 0.022 −0.012 −0.013 
 (0.015) (0.019) (0.022) 
 [30,140] [30,140] [30,140] 
Earned certificate by year 2 after high school (if attended TN CC) 0.012 −0.011 −0.010 
 (0.014) (0.014) (0.017) 
 [30,140] [30,140] [30,140] 
Earned credential by year 2 after high school (if attended TN CC) 0.025 −0.028 −0.010 
 (0.020) (0.020) (0.024) 
 [30,140] [30,140] [30,140] 
Took remedial math by year 1 after high school (if attended TN CC) 0.014 0.014 0.003 
 (0.014) (0.014) (0.019) 
 [30,140] [30,140] [30,140] 
Took college math by year 1 after high school (if attended TN CC) 0.025 −0.001 0.001 
 (0.020) (0.028) (0.031) 
 [30,140] [30,140] [30,140] 
Passed college math by year 1 after high school (if took math & attended TN CC) 0.014 −0.006 −0.019 
 (0.032) (0.034) (0.038) 
 [11,731] [11,731] [11,731] 
Passed college math by year 1 after high school (if attended TN CC) 0.022 0.007 0.006 
 (0.022) (0.024) (0.030) 
 [30,140] [30,140] [30,140] 

Notes: Standard errors in parentheses; observations in brackets; errors are clustered at the high school level. Falsification tests exclude data from 2013–14 and assign placebo treatment by shifting each school's actual adoption year earlier by 1, 2, or 3 years. Estimates exclude college enrollment during high school. The analytic sample includes 102 high schools that first offered SAILS in 2013–14 and 122 additional high schools that first offered SAILS in 2014–15 or 2015–16. TN CC = Tennessee Community College.

In online appendix table A.4, we test for the presence of time trends in the years prior to SAILS adoption at schools that adopted SAILS in 2013–14. Detecting significant trends for the first cohort of SAILS high schools in the years prior to SAILS would suggest that the significant effects observed in the actual SAILS year may reflect chance variation or the continuation of long-running trends. Across all outcomes examined, we observe only one statistically significant pretreatment trend: the rate of remedial math-taking in 2011–12 at high schools that were early adopters of SAILS (i.e., offered SAILS in 2013–14). Given the fifty-seven tests conducted in pretreatment years (three pretreatment years for nineteen outcomes), a single significant coefficient is fewer than we would expect by chance alone at the 5 percent level.
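The expectation under the null is straightforward: with fifty-seven pretreatment coefficients tested at the 5 percent level, roughly three would be significant by chance alone, so a single significant coefficient is unremarkable:

\[
3 \text{ pretreatment years} \times 19 \text{ outcomes} = 57 \text{ tests},
\qquad
57 \times 0.05 \approx 2.9 \text{ expected false positives, versus } 1 \text{ observed}.
\]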

Table A.5 in the online appendix provides a covariate balance check, in which we treat the control variables as outcomes in our reduced-form DDD analysis. As in prior models, the coefficient of interest comes from the interaction that identifies SAILS-eligible students (i.e., students with a junior-year ACT math score below 19 who attended a high school that offered SAILS in their senior year). The goal of this covariate balance check is to assess whether the observed effects may be driven, in part, by changes in the control variables over time (Duflo 2004). Examining four racial categories, ethnicity, sex, and junior-year ACT math score, we find no statistically significant coefficients for any key covariate, which provides additional evidence for the validity of our findings.
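A balance check of this kind amounts to re-running the reduced-form DDD with each control variable on the left-hand side. The sketch below shows the idea under the same placeholder column names used earlier; the covariate names are hypothetical.

```python
# Sketch of the covariate balance check: treat each control variable as an
# outcome in the reduced-form DDD. Covariate names here are hypothetical.
import statsmodels.formula.api as smf

covariates = ["female", "white", "black", "hispanic", "asian", "act_math"]
for cov in covariates:
    fit = smf.ols(
        f"{cov} ~ below_19 * early_sails_hs * post + C(cohort)",
        data=df,
    ).fit(cov_type="cluster", cov_kwds={"groups": df["hs_id"]})
    term = "below_19:early_sails_hs:post"
    print(f"{cov}: {fit.params[term]:.3f} (SE {fit.bse[term]:.3f})")
```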

8.  Discussion

These results capture the enrollment and academic effects of being eligible for a high school math transition course that directly exempts a student from math remediation once in college. We do not observe any impacts of SAILS eligibility on college enrollment rates at either two-year or four-year institutions. We do find that, among students who enrolled in community colleges, SAILS-eligible students were significantly less likely to enroll in remedial math courses and more likely to enroll in college-level math. These findings support the notion that allowing students to complete their remedial requirements in high school saves students time and money in college; they no longer need to take a remedial math course and can begin in a college-level math course straightaway. However, we find no significant relationship between SAILS and college retention or on-time associate's degree attainment within two years.

These results also speak directly to the primary goal of most high school-to-college transition programs: to exempt students from enrolling in remedial math once in college. In this sense the SAILS program appears to be accomplishing its goals. However, among SAILS-eligible students who took college-level math, there was a 6.2 percentage point decline in the pass rate of this course within the first year after high school. In other words, the pass rate for the marginal student who was able to take a college-level math course as a result of SAILS was lower than the pass rate for students who took college-level math without the SAILS intervention. Because SAILS induced more students with ACT math scores below 19 into college-level math than in previous years, however, the overall share of community college students passing college-level math within one year of high school was 12 percentage points higher for SAILS-eligible students. SAILS provided an avenue for more students to take college-level math, which resulted in a higher overall share of students passing the course. Among only those students who enrolled in college-level math, SAILS-eligible students were less likely to pass the course, but this comparison pits SAILS-eligible students (some of whom enrolled directly in college-level math) against peers with similar ACT scores, some of whom enrolled in the course only after first completing a remedial math course in college. This is akin to comparing students in their first college math class to students who passed their first and enrolled in their second; for this reason, a negative impact on passing the course is not surprising.
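A stylized numerical example (with hypothetical figures chosen only to illustrate the mechanism) shows how the overall pass share can rise even as the conditional pass rate falls: if math-taking rises from 30 percent to 50 percent of entrants while the pass rate among takers falls from 70 percent to 64 percent, the share of all entrants passing still increases.

\[
0.30 \times 0.70 = 0.21
\quad\longrightarrow\quad
0.50 \times 0.64 = 0.32 .
\]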

Perhaps more important for students' long-term success, we observe that students flagged for remediation at SAILS schools completed more total credits after years one, two, and three of college, with the total number of credits completed growing over time. By year three, SAILS-eligible students had completed 4.5 more total credits than their peers, the equivalent of almost two courses. Furthermore, these effects were most pronounced for female students and students from rural high schools. The positive effects observed among these subgroups suggest that a transitional math course such as SAILS may be particularly beneficial for some groups. Continuing to focus on race and income gaps among students eligible for SAILS participation could ensure a more equitable approach to college remediation overall.

However, the difference in credit accumulation does not appear to influence degree attainment, at least not in the short term. We do not observe any differences in associate's degree or certificate completion within two years of the program, regardless of exposure to SAILS. It could be the case that two years is not a long enough window in which to observe degree attainment differences, particularly given the suggestive evidence that SAILS-eligible students accumulate more credits by their third year of college than students attending non-SAILS high schools. It could also be the case that the SAILS program helps reduce the amount of time students spend in remedial math classes, but that this benefit does not extend beyond college math courses to persistence and degree completion. Incorporating additional years of degree completion data would illuminate the longer-term impacts of the program.

The instructional method of the SAILS course deserves consideration, as well. Although not strictly an online course, the technology-driven instructional model of the SAILS program is nevertheless a significant departure from traditional teacher-led high school math instruction. In a meta-analysis of the use of technology in K–12 classrooms, the U.S. Department of Education concluded that blended or hybrid courses, similar to the SAILS model, were the most effective at improving high school course grades (Means et al. 2010). Other research in college contexts found that students who took online introductory English and math courses had lower persistence rates and course grades compared to students who took face-to-face courses, and that these effects differed by student subgroup (Xu and Jaggars 2011, 2014). The context in which students are completing the course matters in understanding the different ways they may be affected by this new instructional method, which may be even more important when considering an intervention targeted at low-scoring students.

It is important to remember in interpreting these outcomes that the SAILS program was, and continues to be, constrained by the number of computers available to students. As a result, roughly three fifths of eligible students attending a SAILS school in 2013–14 did not actually enroll in a SAILS course. The IV approach offers one way of estimating the effect of SAILS on program participants, but the scale-up factor from the reduced-form to the IV estimates is substantial in this case. Also, in the first year of the SAILS program, just over 70 percent of SAILS participants completed all of the modules. This means that roughly 30 percent of SAILS participants did not receive the full theoretical benefits of SAILS, although our ITT estimates are not affected by this fact.

9.  Conclusion

The large number of students who require remediation upon entering college suggests that the K–12 system does not align well with the expectations of colleges and universities (Hoffman et al. 2007). The Tennessee SAILS program is a unique college readiness partnership between secondary and postsecondary institutions that aligns math curricula in an effort to maximize gains for students. Such programs are novel in their desire to bridge the K–12/higher education divide and to address issues of academic readiness before students enroll in college. Despite their increasing popularity, however, there is very little rigorous evidence to date on the effects of such programs. The closest example of a study of this kind comes from Pheatt, Trimble, and Barnett (2016), who examined a high school math transition course in West Virginia. Similar to our study, they find that for students who scored close to the assignment cutoff there was a negative impact on the likelihood of passing a college gatekeeper math course. The West Virginia policy, however, did not allow students completing the high school transition course to be exempted from their remedial course in college. For this reason, they also observe that the transition course had no statistically significant effect on exemption from remedial education, making the SAILS program unique in this regard. The Tennessee SAILS program thus offers a unique opportunity to study the effects of removing the remediation "barrier" once in college, an important and unanswered question in the literature to date.

In this study, we observe a decline in the rate of passing college-level math (i.e., proportion of students in college-level math who received a passing grade), but an increase in the overall share of community college entrants who earned credit for college-level math (i.e., among all students, not just those who took college-level math). This indicates that the SAILS approach resulted in a greater overall number of students successfully completing a college-level math course, although schools would have observed a slightly higher share of students in college math courses receiving a failing grade. The lower rates of passing college-level math among those who took the course could be the result of lingering gaps in math knowledge even after enrolling in SAILS, or they could be related to peer effects, instructional changes, or differences in expectations from high school to college (Fay 2017). However, we find no significant relationship between SAILS and college retention or on-time associate's degree completion, leading to questions about the long-term impacts of the program.

There are several reasons why a transitional remedial math course may lead to greater credit attainment and higher rates of persistence in college, if not degree attainment. These programs, particularly ones like SAILS that remove barriers to college-level course enrollment, undoubtedly save students time and money, both of which are strong predictors of college completion. We observed SAILS-eligible students making faster progress toward a degree (in terms of credit attainment) within the first two years of college. However, we are not able to attribute this outcome to whether students actually learned more math content as a result of the transition course. For a more recent cohort of SAILS students, though, we collaborated with colleagues to examine the program's impacts on students' math achievement, as well as outcomes for SAILS students in the more complex policy environment that included the Tennessee Promise and the introduction of corequisite remediation (Kane et al. 2021). That work finds that the SAILS program did not significantly improve students' math achievement, and the positive effects of SAILS on passing college math that we observe in the state policy context here largely disappear in a context with corequisite remediation and free community college. Findings from the later cohorts of SAILS suggest that corequisite remediation policies or other real-time interventions may be overriding the positive effects on passing college-level math observed from implementing the SAILS program alone. For states considering ways to increase success in college-level math courses, SAILS appears most effective in the absence of other statewide college remediation policies.

Academic preparation and the responsibility of college readiness is a critical statewide issue, as it affects both the K–12 and higher education sectors. The popularity of high school/college partnerships is growing, with increasing numbers of states looking to combine early college readiness assessments with structured interventions for students who are not college ready (Barnett et al. 2013). However, state systems need more rigorous research evidence as to the impacts of these programs on student outcomes. Our results suggest that the SAILS model holds promise in getting more students, particularly those with low levels of math ability as measured by the ACT, into and through college-level math, a critical early step toward degree attainment.

Acknowledgments

This project received funding through the Bill & Melinda Gates Foundation. We thank the SAILS program, Tennessee Board of Regents, Tennessee Department of Education, and Tennessee Higher Education Commission for providing data access and invaluable feedback about the context of the SAILS program. We also thank Dale Ballou, Will Doyle, Emily House, Michal Kurlaender, Stefani Relles, and Kevin Stange for helpful comments on early drafts of this paper, as well as participants at the 2017 annual meetings of the Association for the Study of Higher Education and the Association of Public Policy and Management. All errors, omissions, and conclusions are our own.

Notes

1. 

The measure of high school urbanicity comes from the U.S. Department of Education's Common Core of Data (CCD); we use the classifications “urban,” “suburban” (a combination of both “suburban” and “town” from CCD), and “rural.”

2. 

SAILS was piloted in spring 2012 in one high school math classroom and expanded the following year (2012–13) to classes in nineteen high schools, most of which were located in or near Chattanooga, Tennessee.

3. 

Although Tennessee encourages all high school juniors to take the ACT during a free school-based test administration day, student absences and a small number of high schools opting out of the free test day explain the 82 percent ACT-taking rate. Table A.1 (available in a separate online appendix on Education Finance and Policy's Web site at https://doi.org/10.1162/edfp_a_00312) provides a descriptive summary of students with and without ACT math scores. Black and Hispanic students make up a larger proportion of the students missing ACT scores than of our sample of students with reported ACT math scores. In our data we observe only ACT scores; students may also take the SAT, which we do not observe. Therefore, our findings generalize only to students with ACT scores who attended, in their senior year, a high school that offered SAILS in one of its first three full-scale years (2013–14, 2014–15, or 2015–16).

4. 

Were we to have more confidence in an RD analysis, we could also consider a difference-in-RD analysis, in which we would separately conduct an RD within early SAILS high schools and an RD within later SAILS high schools and then identify the difference between the two estimates. This method, by definition, relies on the assumptions of an RD analysis being met. Given our concerns with the 36-point ACT exam as the assignment instrument, we do not present these results.

5. 

All reduced-form estimates for both the first and second year are presented in table A.3 in the online appendix. The trends observed after one or two years are consistent across time periods. We further examined the results after one semester (not shown), and they are similar in terms of significance, direction, and magnitude to the results after one year.

6. 

Roughly 5 percent of students at early SAILS high schools with ACT math scores of 19 or higher participated in SAILS, as shown in figure 1. Forty-four percent of students at early SAILS high schools with scores less than 19 participated in SAILS.

7. 

As a further specification check, we also restrict the main subgroups in table 6 to students with ACT scores ±3 points from the cutoff and find that all of the same patterns hold. Results available upon request.

REFERENCES

Alexander, Julie. 2013. Aligning high school and college instruction: Preparing students for success in college-level mathematics. Doctoral thesis, Florida State University, Tallahassee, FL.
Alivernini, Fabio, and Fabio Lucidi. 2011. Relationship between social context, self-efficacy, motivation, academic achievement, and intention to drop out of high school: A longitudinal study. Journal of Educational Research 104(4): 241–252.
An, Brian P. 2015. The role of academic motivation and engagement on the relationship between dual enrollment and academic performance. Journal of Higher Education 86(1): 98–126.
Attewell, Paul, Scott Heil, and Liza Reisel. 2012. What is academic momentum? And does it matter? Educational Evaluation and Policy Analysis 34(1): 27–44.
Bahr, Peter R. 2008. Does mathematics remediation work? A comparative analysis of academic attainment among community college students. Research in Higher Education 49(5): 420–450.
Barnett, Elisabeth A., Margaret P. Fay, Rachel Hare Bork, and Madeline Joy Trimble. 2013. Reshaping the college transition: States that offer early college readiness assessments and transition curricula. New York: Community College Research Center, Columbia University.
Barry, Mary Nguyan, and Michael Dannenberg. 2016. Out of pocket: The high cost of inadequate high schools and high school student achievement on college affordability. Washington, DC: Education Reform Now.
Beginning Postsecondary Students Longitudinal Study (BPS). 2009. QuickStats: Beginning college students in 2003-04, followed through to 2009. Available https://nces.ed.gov/Datalab/QuickStats/Workspace/Index/53. Accessed 11 February 2021.
Belfield, Clive, Davis Jenkins, and Hanna Lahr. 2016. Momentum: The academic and economic value of a 15-credit first-semester course load for college students in Tennessee. New York: Community College Research Center Working Paper No. 88, Columbia University.
Bettinger, Eric P., and Bridget T. Long. 2009. Addressing the needs of underprepared students in higher education: Does college remediation work? Journal of Human Resources 44(3): 736–771.
Bickerstaff, Susan, Margaret P. Fay, and Madeline Joy Trimble. 2016. Modularization in developmental mathematics in two states: Implementation and early outcomes. New York: Community College Research Center Working Paper No. 87, Columbia University.
Boatman, Angela R., and Bridget T. Long. 2018. Does remediation work for all students? How the effects of postsecondary remedial and developmental courses vary by level of academic preparation. Educational Evaluation and Policy Analysis 40(1): 29–58.
Bragg, Debra D., and Jason L. Taylor. 2014. Toward college and career readiness: How different models produce similar short-term outcomes. American Behavioral Scientist 58(8): 994–1017.
Chemers, Martin M., Li-tze Hu, and Ben F. Garcia. 2001. Academic self-efficacy and first year college student performance and adjustment. Journal of Educational Psychology 93(1): 55–64.
Chen, Xianglei. 2016. Remedial coursetaking at U.S. public 2- and 4-year institutions: Scope, experiences, and outcomes; Statistical analysis report. Available https://nces.ed.gov/pubs2016/2016405.pdf. Accessed 28 January 2021.
College Promise Campaign. 2019. State momentum grows for free college movement. Available https://www.newswire.com/news/state-momentum-grows-for-free-college-movement-20876515. Accessed 11 February 2021.
Davidson, Jeffrey C. 2014. Leading indicators: Increasing statewide bachelor's degree completion rates at 4-year public institutions. Higher Education Policy 27(1): 85–109.
Deil-Amen, Regina, and James E. Rosenbaum. 2003. The social prerequisites of success: Can college structure reduce the need for social know-how? Annals of the American Academy of Political and Social Science 586(1): 120–143.
DesJardins, Stephen L., Dennis A. Ahlburg, and Brian P. McCall. 2006. The effects of interrupted enrollment on graduation from college: Racial, income, and ability differences. Economics of Education Review 25(6): 575–590.
Duflo, Esther. 2004. Empirical methods. Unpublished paper, Harvard University.
Edgecombe, Nicole D. 2011. Accelerating the academic achievement of students referred to developmental education. New York: Community College Research Center Working Paper No. 30, Columbia University.
Engberg, Mark E., and Gregory C. Wolniak. 2010. Examining the effects of high school contexts on postsecondary enrollment. Research in Higher Education 51(2): 132–153.
Fay, Margaret P. 2017. Computer-mediated developmental math courses in Tennessee high schools and community colleges: An exploration of the consequences of institutional context. New York: Community College Research Center Working Paper No. 91, Columbia University.
Ginder, Scott A., and Janice E. Kelly-Reid. 2018. Postsecondary institutions and cost of attendance in 2016-17; degrees and other awards conferred: 2015-16; and 12-month enrollment: 2015-16: First look (provisional data). Available https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2017075rev. Accessed 11 February 2021.
Grubb, John M., Pamela H. Scott, and Donald W. Good. 2017. The answer is yes: Dual enrollment benefits students at the community college. Community College Review 45(2): 79–98.
Hilgoe, Ellen, Jason Brinkley, Johannes Hattingh, and Robert Bernhardt. 2016. The effectiveness of the North Carolina early mathematics placement test in preparing high school students for college-level introductory mathematics courses. College Student Journal 50(3): 369–377.
Hoffman, Nancy, Joel Vargas, Andrea Venezia, and Marc Miller, eds. 2007. Minding the gap: Why integrating high school with college makes sense and how to do it. Cambridge, MA: Harvard Education Press.
Howell, Jessica S., Michal Kurlaender, and Eric Grodsky. 2010. Postsecondary preparation and remediation: Examining the effect of the early assessment program at California State University. Journal of Policy Analysis and Management 29(4): 726–748.
Jackson, Jacob, and Michal Kurlaender. 2016. K-12 postsecondary alignment and school accountability: Investigating high school responses to California's Early Assessment Program. American Journal of Education 122(4): 477–503.
Jaggars, Shanna S., Nikki Edgecombe, and Georgia W. Stacey. 2014. What we know about accelerated developmental education. Research overview. Available https://ccrc.tc.columbia.edu/media/k2/attachments/accelerated-developmental-education_1.pdf. Accessed 11 February 2021.
Jaggars, Shanna S., and Georgia W. Stacey. 2014. What we know about developmental education outcomes. Research overview. New York: Community College Research Center, Columbia University.
Jenkins, Davis, and Katherine Boswell. 2002. State policies on community college remedial education: Findings from a national survey. Denver, CO: Education Commission of the States.
Kane, Thomas J., Angela R. Boatman, Whitney Kozakowski, Christopher T. Bennett, Rachel Hitch, and Dana Weisenfeld. 2021. Is college remediation a barrier or a boost? Evidence from the Tennessee SAILS program. Journal of Policy Analysis and Management. In press. https://doi.org/10.1002/pam.22306.
Karp, Melinda Mechur, and Rachel Hare Bork. 2012. "They never told me what to expect, so I didn't know what to do": Defining and clarifying the role of a community college student. New York: Community College Research Center Working Paper No. 47, Columbia University.
Karp, Melinda Mechur, Juan Carlos Calcagno, Katherine L. Hughes, Dong Wook Jeong, and Thomas R. Bailey. 2007. The postsecondary achievement of participants in dual enrollment: An analysis of student outcomes in two states. Available https://ccrc.tc.columbia.edu/media/k2/attachments/dual-enrollment-student-outcomes.pdf. Accessed 28 January 2021.
Karp, Melinda Mechur, Katherine L. Hughes, and Maria Cormier. 2012. Dual enrollment for college completion: Findings from Tennessee and peer states. New York: Community College Research Center, Columbia University.
Klopfenstein, Kristin, and Kit Lively. 2012. Dual enrollment in the broader context of college-level high school programs. New Directions for Higher Education 2012(158): 59–68.
Kurlaender, Michal. 2014. Assessing the promise of California's Early Assessment Program for community colleges. Annals of the American Academy of Political and Social Science 655(1): 36–55.
Kurlaender, Michal, Lester Lusher, and Matthew Case. 2017. Evaluating remediation reforms at the California State University. Presentation to Policy Analysis for California Education, California State University, April.
Long, Mark C., Patrice Iatarola, and Dylan Conger. 2009. Explaining gaps in readiness for college-level math: The role of high school courses. Education Finance and Policy 4(1): 1–33.
Lotkowski, Veronica A., Steven B. Robbins, and Richard J. Noeth. 2004. The role of academic and non-academic factors in improving college retention. Available https://www.act.org/content/dam/act/unsecured/documents/college_retention.pdf. Accessed 28 January 2021.
Martinez, Maria Emilia, and Steve Frank Bain. 2014. The costs of remedial and developmental education in postsecondary education. Research in Higher Education Journal 22: 1–12.
Means, Barbara, Yukie Toyama, Robert Murphy, and Marianne Baki. 2013. The effectiveness of online and blended learning: A meta-analysis of the empirical literature. Teachers College Record 115(3): 1–47.
Means, Barbara, Yukie Toyama, Robert Murphy, Marianne Baki, and Karla Jones. 2010. Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: U.S. Department of Education.
Monaghan, David B., and Paul Attewell. 2015. The community college route to the bachelor's degree. Educational Evaluation and Policy Analysis 37(1): 70–91.
Ngo, Federick, and Holly Kosiewicz. 2017. How extending time in developmental math impacts student persistence and success: Evidence from a regression discontinuity in community colleges. Review of Higher Education 40(2): 267–306.
Pheatt, Lara E., Madeline J. Trimble, and Elisabeth Barnett. 2016. Improving the transition to college: Estimating the impact of high school transition courses on short-term college outcomes. New York: Community College Research Center Working Paper No. 86, Columbia University.
Rutschow, Elizabeth Zachary, Maria Scott Cormier, Dominique Dukes, and Diana E. Cruz Zamora. 2019. The changing landscape of developmental education practices: Findings from a national survey and interviews with postsecondary institutions. New York: Center for the Analysis of Postsecondary Readiness, Columbia University.
Rutschow, Elizabeth Zachary, and Alexander K. Mayer. 2018. Early findings from a national survey of developmental education practices. Research brief. New York: Center for the Analysis of Postsecondary Readiness, Columbia University.
Smith, Johnathan, Michael Hurwitz, and Christopher Avery. 2017. Giving college credit where it is due: Advanced Placement exam scores and college outcomes. Journal of Labor Economics 35(1): 67–147.
Speroni, Cecilia. 2011. Determinants of students' success: The role of Advanced Placement and dual enrollment programs. New York: National Center for Postsecondary Research, Columbia University.
Struhl, Ben, and Joel Vargas. 2012. Taking college courses in high school: A strategy guide for college readiness. The college outcomes of dual enrollment in Texas. Boston, MA: Jobs for the Future.
Tennessee Office of the Governor. 2018. Haslam, Tennessee's education innovation recognized with two prestigious national awards: The Education Commission of the States recognizes Gov. Bill Haslam and the SAILS program. Available https://www.tn.gov/former-governor-haslam/news/2018/6/29/haslam–tennessee-s-education-innovation-recognized-with-two-prestigious-national-awards.html. Accessed 1 February 2021.
Tierney, William G., and Lisa D. Garcia. 2011. Remediation in higher education: The role of information. American Behavioral Scientist 55(2): 102–120.
Trimble, Madeline Joy, Lara Pheatt, Tatev Papikyan, and Elisabeth A. Barnett. 2017. Can high school transition courses help students avoid college remediation? Estimating the impact of a transition program in a large urban district. New York: Community College Research Center Working Paper No. 99, Columbia University.
U.S. Department of Education (US ED). 2017. What Works Clearinghouse standards handbook, Version 4.0. Available https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_standards_handbook_v4.pdf. Accessed 1 February 2021.
Venezia, Andrea, and Daniel Voloch. 2012. Using college placement exams as early signals of college readiness: An examination of California's Early Assessment Program and New York's At Home in College Program. New Directions for Higher Education 2012(158): 71–79.
Whinnery, Erin, and Sarah Pompelia. 2018. 50-state comparison: Developmental education policies. Available https://www.ecs.org/50-state-comparison-developmental-education-policies/. Accessed 1 February 2021.
Xu, Di, and Shanna S. Jaggars. 2011. The effectiveness of distance education across Virginia's community colleges: Evidence from introductory college-level math and English courses. Educational Evaluation and Policy Analysis 33(3): 360–377.
Xu, Di, and Shanna S. Jaggars. 2014. Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. Journal of Higher Education 85(5): 633–659.
Zajacova, Anna, Scott M. Lynch, and Thomas J. Espenshade. 2005. Self-efficacy, stress, and academic success in college. Research in Higher Education 46(6): 677–706.
