Abstract

Place-based promise scholarships are a relatively recent innovation in the space of college access and success. Although evidence on the impact of some of the earliest place-based scholarships has begun to emerge, the rapid proliferation of promise programs largely has preceded empirical evidence of their impact. We utilize regression discontinuity and difference-in-differences analyses to investigate the causal effect of the Pittsburgh Promise on students’ immediate postsecondary attainment and early college persistence outcomes. Both analytic approaches yield similar conclusions. As a result of Promise eligibility, Pittsburgh Public School graduates are approximately 5 percentage points more likely to enroll in college, particularly four-year institutions; 10 percentage points more likely to select a Pennsylvania institution; and 4 to 7 percentage points more likely to enroll and persist into a second year of postsecondary education. Impacts vary with changes over time in the program structure and opportunities, and are larger for those responsive to the Promise opportunity, as instrumental variable-adjusted results reveal. Although the Pittsburgh Promise represents a sizeable investment, conservative cost–benefit calculations indicate positive returns. Even so, an important question is whether locally funded programs such as the Pittsburgh Promise are economically sustainable in the long run.

1.  Introduction

The monetary and nonmonetary benefits of a college education, net of the costs to attend, continue to grow. Yet, costs of attendance have outpaced inflation, making college less affordable for middle- and lower-income families (Baum and Ma 2014). Disparities in college access and success by salient characteristics like race/ethnicity, gender, and socioeconomic status (SES) also have grown over time (Goldin, Katz, and Kuziemko 2006; Bailey and Dynarski 2011). In addition to national and state-level responses, local community and economic revitalization efforts—spearheaded by local governments, businesses, and philanthropists—have focused on college access and success, most notably through the development and proliferation of place-based postsecondary scholarship initiatives, collectively referred to as “promise” programs.

Promise scholarships are a relatively recent innovation and differ from standard scholarship programs in a variety of ways. Promise scholarships are part of a holistic approach to regional economic development that leverages local, often private-sector investment to support workforce education and development (Swinburn, Goga, and Murphy 2006). Place-based scholarships represent concentrated efforts to increase college-going within distinct communities by stimulating system-wide changes in student support and resourcing.

This place-based approach has gained popularity since the 2005 announcement of the Kalamazoo (Michigan) Promise, which offers full in-state college tuition to graduates of Kalamazoo Public Schools (KPS). Since the Kalamazoo Promise launch, over one hundred communities have announced or implemented place-based scholarships, although with significant variation in specifics such as scholarship generosity and eligibility requirements.1 Evidence on the impact of some of the earliest place-based scholarships has begun to emerge over the past several years. Yet, the rapid proliferation of promise programs largely has preceded empirical evidence of their impact. We contribute to this literature by examining the impact of the Pittsburgh Promise on the college-going outcomes of Promise-qualified high school graduates.

The Pittsburgh Promise (hereafter, the Promise), launched in 2008, provides postsecondary scholarship funds to qualifying Pittsburgh Public School (PPS) graduates. Specific eligibility requirements have changed over time, with students currently eligible for full Promise funds if they complete high school with a final grade point average (GPA) of 2.5 or above, a high school attendance rate of at least 90 percent, and have attended PPS since kindergarten. Students enrolled since at least the ninth grade are eligible for a partial scholarship. Students earning a GPA between 2.0 and 2.49 are eligible for an “Extension” scholarship to attend community college. Provided extension scholars achieve a certain minimum GPA during their first year of college, they may subsequently gain access to the full Promise scholarship to be used at the full range of higher education institutions.

We use two analytic strategies and data on the PPS graduating classes of 2007–12 to investigate the Promise's impact on college attainment and early persistence. First, we utilize a regression-discontinuity strategy to examine the impacts of Promise eligibility for students at the margin of their cohort-relevant qualifying threshold. We then use an instrumental variables strategy to account for imperfect take-up and, in doing so, estimate the effect of Promise receipt. Second, we use a difference-in-differences (DID) strategy to compare college-going outcomes of eligible and ineligible students before and after the program's inception. With this approach, we estimate the causal effects of the Promise opportunity for students beyond the eligibility margin. Our results paint a consistent picture regarding the Promise's impact. As a result of Promise eligibility, PPS graduates are approximately 5 percentage points more likely to enroll in college, particularly four-year institutions; 10 percentage points more likely to attend college in state; and 4 to 7 percentage points more likely to enroll and persist into a second year of college. Impacts vary with changes in the program structure and opportunities over time. As would be expected, our instrumental variables (IV) estimates indicate impacts that are larger still for the subset of students responsive to the Promise opportunity. Although the Pittsburgh Promise represents a sizeable investment, our conservative back-of-the-envelope cost–benefit calculations indicate positive returns.

Despite the existing literature on promise and other need- or merit-aid programs, the Pittsburgh Promise specifically is worthy of investigation. The success of a promise-type program in one locality does not guarantee success in another, because of the setting-specific nature of each program's design, implementation, and educational context. Therefore, there is value in assessing and documenting impacts of promise programs in their distinct forms and localities. For example, during the time of our study, the Pittsburgh Promise was unusual among place-based programs in that the funds could be used for tuition and fees and other costs of attendance, such as room and board. In addition, students could use funds for qualifying expenses at any accredited institution in state, public or private. Given the range of higher education options within Pennsylvania (PA), we can examine a program that does not place potentially detrimental constraints on students’ institutional choices.

In addition, compared with state merit aid programs, which can drive increases in tuition due to the volume of eligible students (e.g., Long 2004), promise programs may be less susceptible to such unintended consequences, as recipients tend to make up a small fraction of the undergraduates even at the institutions most popular among them. Given the extent of residential and educational segregation in the United States, geographically targeted Promise efforts may provide a means of facilitating college access where it is needed most. Finally, over the period we examine, the Pittsburgh Promise made several policy changes—for example, the implementation of the Extension Scholars program with the class of 2010 and the doubling of the maximum value of the annual award from $5,000 to $10,000 with the class of 2012. We capitalize on these changes in requirements, opportunities, and funding generosity to inform financial aid policy and college access efforts more generally.

2.  Background

Pittsburgh Promise eligibility depends on attendance and GPA criteria. Therefore, we highlight literature on broad-based merit aid programs as well as place-based promise programs.

Literature Review

Over the last three decades, broad, state-level merit aid programs have been a popular policy approach to address the rising costs of higher education. In fact, such programs represent the largest increase in financial aid expenditures over the last twenty years (Baum and Ma 2014).2 Several rigorous studies point to the positive impacts of merit aid programs on a variety of college-readiness and college-going outcomes. The inception of merit aid programs has been linked to increases in GPA and SAT and ACT scores (Henry, Rubenstein, and Bugler 2004; Pallais 2009); increases in college enrollment and sectoral shifts toward four-year institutions (Dynarski 2004, 2008; Cornwell, Mustard, and Sridhar 2006; Bruce and Carruthers 2014); and shifts toward in-state and public institutions (Goodman 2008; Cohodes and Goodman 2014). Depending on the institutions students would have otherwise attended, in some cases, students experience a decline in institutional quality (e.g., Cohodes and Goodman 2014) and in others, an increase (e.g., Bruce and Carruthers 2014). Cohodes and Goodman (2014) link a decline in average institutional quality experienced by Massachusetts Adams Scholarship awardees to a subsequent decline in on-time graduation. In contrast, in other states, the relevant merit-aid programs have been linked to positive and significant impacts on degree attainment (Dynarski 2008; Scott-Clayton 2011). As the Cohodes and Goodman (2014) results suggest, these differences may be due to the counterfactual postsecondary experience that students would have in the absence of the scholarship opportunity.

Comparatively few studies examine place-based promise programs, primarily because of their more recent inception. To date, the most rigorous evidence focuses on programs in Kalamazoo, Michigan, and El Dorado, Arkansas, begun in 2005 and 2007, respectively. The Kalamazoo Promise is a universal scholarship available to KPS graduates who have been residents of and continuously enrolled in KPS since at least ninth grade. Anonymous donors established the scholarship in perpetuity, and the award provides up to 100 percent of tuition and fees (or up to 130 credit hours) at any Michigan public institution. The exact scholarship amount is a function of the student's length of enrollment in the district.3 To retain the scholarship, students must be enrolled full-time (or part-time at the local community college) and maintain a 2.0 GPA.

The Murphy Oil Corporation established the El Dorado Promise for residents of El Dorado, Arkansas, a small city with a population of approximately 20,000. The El Dorado Promise is modeled after the Kalamazoo program and is available to graduates of the El Dorado School District who were continuously enrolled since at least the ninth grade. The award amount is a function of students’ duration of enrollment in the district. El Dorado's maximum award is equal to the highest annual in-state tuition and fees at a public four-year university in Arkansas for a student taking 30 credit hours. Students can apply the award to any college or university, public or private, two-year or four-year, in the country for up to five years. To maintain eligibility, students must attend college full-time and make satisfactory academic progress as defined by their institution.

Promise program research has explored outcomes at the systems and individual levels. At the system level, after years of out-migration, district enrollments have stabilized in Kalamazoo and El Dorado since the inception of their programs (Bartik, Eberts, and Huang 2010; Ash and Ritter 2014). More broadly, LeGower and Walsh (2017) find that place-based promise programs lead to increases in district enrollment, with larger effects where programs provide universal scholarships that can be used at a range of institutions and smaller effects where programs have merit-based criteria. Quasi-experimental studies reveal positive effects of promise programs at both the secondary and postsecondary levels. In El Dorado, Promise-eligible high school students outperformed matched counterparts from other districts in math and literacy, with mixed impacts on high school completion (Ash and Ritter 2014). At the postsecondary level, Kalamazoo's program shifted student preferences toward public and, especially, selective public institutions and low-income students’ preferences away from community college (Andrews, DesJardins, and Ranchhod 2010). In addition, the Kalamazoo Promise increased enrollment in four-year institutions by 10 percentage points and led to improvements in postsecondary course-taking and bachelor's degree attainment (Bartik, Hershbein, and Lachowska 2015).

Pittsburgh Public Schools and the Pittsburgh Promise

Pittsburgh's economic history is one of decline and renewal. Between 1950 and today, the city's population dropped by half, largely due to the collapse of the steel industry and high rates of regional unemployment that peaked in the early 1980s. Since then, targeted investments in industries such as higher education and health care have contributed to an economic resurgence, making Pittsburgh a common counterpoint to other rust-belt cities (Krugman 2013). As the city's population decreased, so too did its public school enrollments. In 1968, PPS served approximately 68,000 students; today, enrollment is just under 25,000. Like other urban districts, PPS lags behind PA districts academically. In the 2015–16 school year, state standardized test results indicated that 48 percent of PPS high school students scored proficient or advanced in mathematics and 63 percent in reading, compared with statewide rates of 68 and 72 percent, respectively. The racial makeup of PPS students does not mirror that of the city: fifty-three percent of PPS students are black, more than double the share of black residents in Pittsburgh's population overall.

Within this context, the Promise is one of the city's signature educational reforms. In December 2006, then PPS Superintendent Mark Roosevelt and Mayor Luke Ravenstahl announced the Promise: “If you play by the rules, and you do what you're supposed to do, and you do your work, and you graduate … there will be education after high school in your future, and money will not be what holds you back.”4 The program began with the PPS class of 2008 with support from local industry and philanthropy, including $100 million from University of Pittsburgh Medical Center and over $10 million each from the Grable Foundation, the Heinz Endowments, and The Pittsburgh Foundation.

The Promise was established with ambitious long-term goals: to (1) support student success in PPS; (2) reform educational systems; (3) help stabilize city and school populations; and (4) act as an engine for an invigorated workforce and volunteer corps. Within the timeframe we examine, the program's signature feature was the offer of up to $10,000 per year (for four years) to attend any PA postsecondary institution (public or private) that awards a degree, license, or diploma for PPS graduates who meet specific academic, attendance, and residency requirements. However, in a cautionary tale regarding the sustainability of such efforts when continual fundraising is required, the Pittsburgh Promise has since scaled back the maximum award amount and made other changes that took effect beginning with the class of 2017.5 These changes do not impact the analyses we present here, but we return to them in our Discussion.

The Promise phased in eligibility criteria and scholarship generosity over time, as outlined in figure 1. Substantial changes over time drive our decision to assess and discuss three distinct phases of the program. First, the program now includes two award types. Core Scholars are those who have met all the eligibility criteria and are able to use the full Promise award as described. Extension Scholars are those with a GPA between 2.0 and 2.49 and can use funds to attend the local community college for one year. Upon attaining certain performance goals, Extension Scholars can then access Promise funds to attend any PA institution. Second, the maximum Promise award increased from $5,000 to $10,000 in 2012. We therefore investigate the program in terms of Phase I (before the Extension Scholars Program), Phase II (after the Extension Scholars Program), and Phase III (after the increase in maximum award to $10,000).

Figure 1.

Pittsburgh Promise Eligibility Criteria by Pittsburgh Public School Graduating Cohort


Promise funds are a last-dollar resource: students must first apply for federal, state, and institutional aid. Even when these sources covered the full costs of college attendance (including tuition, fees, books, and room and board), students remained eligible for an award of up to $1,000 per year. Students first apply for Promise funds in the spring of their senior year and are notified of their final award in the summer, after final grades have been posted. To receive funds in subsequent years, students must earn at least a 2.0 GPA and attend college full time.

Grant-based support of $10,000 annually is substantial, particularly given that the cohorts we examine could use Promise funds for all costs of attendance, including room and board. During the 2014–15 academic year, the maximum federal Pell award was $5,730,6 and the PA State Grant program awarded a maximum need-based grant of $4,011.7 In sum, the lowest-income students from the PPS graduating class of 2015 who met the Promise eligibility criteria could access grant-based financial support of nearly $20,000 annually. Even in the context of PA's high sticker price postsecondary market, this combination of benefits implies that low-income, Promise-eligible students could attend a state-owned, four-year institution at little to no cost when the Promise was at its most generous. Although we are mindful of the information-related barriers that hinder students and families from understanding and therefore capitalizing on the extent of this benefit (Page and Scott-Clayton 2016) as well as the substantial barrier that the FAFSA (Free Application for Federal Student Aid) creates for accessing need-based aid (Dynarski and Scott-Clayton 2006; Bettinger et al. 2012; Page, Castleman and Meyer 2017), if postsecondary costs are a driver of students’ postsecondary access and institutional choices, we should expect to see higher rates of postsecondary enrollment and shifts in the distribution of institutional choices in the years in which the Promise was available.
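The stacked grant aid described above can be checked with simple arithmetic (all dollar figures come from the text; the variable names are ours, and the total is illustrative of the maximum possible combination, not any particular student's package):

```python
# Illustrative arithmetic: maximum grant aid a low-income, Promise-eligible
# student could stack in 2014-15, using the figures cited in the text.
promise_max = 10_000    # maximum annual Pittsburgh Promise award (Phase III)
pell_max = 5_730        # maximum federal Pell grant, 2014-15
pa_state_grant = 4_011  # maximum need-based PA State Grant, 2014-15

total = promise_max + pell_max + pa_state_grant
print(total)  # 19741 -- the "nearly $20,000 annually" cited in the text
```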

Before discussing our research design, we highlight prior investigations of the Pittsburgh Promise. As with the other localities, the program's implementation is associated with stabilization of district enrollment and parents reporting the Promise as an important factor in their decision to enroll their children in PPS (Gonzalez et al. 2011). Likewise, an early study found that district educators believed the Promise increased students’ likelihood of pursuing postsecondary education, and schools provided additional supports to help students navigate the college-going process (Iriti et al. 2009). Students reported the Promise's GPA requirement to be an academic motivator (Iriti et al. 2009; Gonzalez et al. 2011). Using a DID strategy, Gonzalez and colleagues (2011) found a positive but not significant impact on timely college enrollment and a 5 percentage point increase in second year college persistence for Promise-eligible students in the program's first few years. Bozick, Gonzalez, and Engberg (2015) found the Pittsburgh Promise contributed to enrollment in four-year institutions, in particular.8

We build on these initial studies in two important ways. First, we capitalize on additional years of data with which we improve statistical power and precision to estimate impacts of interest and explore how impacts may shift with the structural details of the scholarship offer. Second, we utilize two quasi-experimental strategies to identify causal impacts of Promise eligibility and scholarship use. The first, regression discontinuity (RD), compares students within the same graduating cohort on either side of the year-relevant GPA eligibility threshold. The second, DID, relies on comparison of consistently eligible and ineligible students before and after the program's inception. Both strategies attend carefully to the program's structure and changes in its eligibility rules over time. Combining strategies allows us to consider several potential mechanisms that may affect students’ college-going outcomes.
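The DID logic described above can be illustrated with simulated data. This is a minimal sketch, not the paper's estimation: the group sizes, baseline rates, and the assumed 5 percentage point effect are hypothetical, and all variable names are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: consistently eligible vs. ineligible students,
# observed in cohorts before vs. after the program's inception.
n = 40_000
eligible = rng.integers(0, 2, n)  # meets GPA/attendance criteria in all years
post = rng.integers(0, 2, n)      # graduated after the Promise launched
true_effect = 0.05                # assumed enrollment impact (illustrative)

# Enrollment probability: a stable group gap, a secular time trend shared
# by both groups, and the DID effect only for eligible post-launch students.
p = 0.40 + 0.15 * eligible + 0.03 * post + true_effect * eligible * post
enrolled = rng.random(n) < p

def cell_mean(e, t):
    """Mean enrollment rate for the (eligible, post) cell."""
    mask = (eligible == e) & (post == t)
    return enrolled[mask].mean()

# DID estimator: the before/after change for eligible students, net of the
# before/after change for ineligible students (which absorbs the trend).
did = (cell_mean(1, 1) - cell_mean(1, 0)) - (cell_mean(0, 1) - cell_mean(0, 0))
print(round(did, 3))  # close to the assumed 0.05
```

The key identifying assumption, here true by construction, is that absent the program both groups would have followed parallel trends.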

Potential Mechanisms

The Pittsburgh Promise may improve college-going outcomes for PPS graduates through several potential channels. First, the Promise may change the financial constraints students and families face in affording college. In conversations, students shared that the Promise “put college on the table” where they previously thought it out of their grasp financially (Page and Iriti 2016). This may be especially true for students from lower income backgrounds. If so, we may expect impact variation such that socioeconomic gaps in college access are narrowed. A related possibility is that the Promise increased perceptions about college affordability and awareness of college financial aid more generally. For example, the Promise may motivate families to complete the FAFSA, thereby accessing other available financial aid while not necessarily needing to take up the maximum allowable Promise award amount. Beyond these affordability mechanisms, the program may motivate students to increase their own academic engagement and effort to qualify for the Promise, thereby increasing the likelihood of being “college-ready” at the time of high school graduation. Relatedly, the Promise's introduction may encourage focus on college-going and college preparation systemwide. For example, PPS implemented a push toward increased advanced placement (AP) course taking (discussed below), and teachers in some PPS schools were encouraged to decorate their classroom doors with their own college paraphernalia (Iriti, Page, and Bickel 2017). Finally, the scholarship opportunity may have incentivized college-motivated students to move into the PPS either through household relocation or shifting from private to public schooling. In this case, impacts may be a function of a distributional shift in the students served by PPS rather than a more direct impact of the program itself.

We hypothesize that the Promise improves rates of college access and persistence for eligible students. Positive college-going effects coupled with award amounts that are large and close to the maximum award value will serve as evidence that students face real challenges with college affordability. Alternatively, relatively small awards coupled with impacts on enrollment and persistence will suggest the program alleviates misperceptions related to affordability and available financial aid. Regarding student motivation to enter the district to take advantage of Promise funding, we reason that this potential mechanism is not relevant at least for the first several Promise-eligible cohorts. With the Promise announced in December 2006 and implemented first with the graduating cohort of 2008, such a shift in the district's high school population could impact college-going outcomes at the earliest with the graduating class of 2011, as attendance from at least the ninth grade was an eligibility requirement. Shifts in the district's population may lead to larger impacts for the 2011 and 2012 cohorts, especially in our DID analysis. For such shifts to influence our RD estimates in these later cohorts, an influx of students would need to drive discontinuities in students’ college motivation at the relevant eligibility threshold.

Increases in academic effort, engagement, and college interest in response to the program may influence results similarly. Promise eligibility is based on cumulative measures of school attendance and course performance. Early cohorts had limited opportunity to improve these cumulative measures, and later cohorts had more time to do so. Because our RD results are based on a within-cohort comparison, they are less likely to be driven by shifts in student motivation over time, unless such shifts lead to student “pile up” just above the threshold for Promise eligibility. As we detail below, we draw on student-level data to inform these potential channels and to estimate the Promise's impact on college access and early college persistence.

3.  Research Design

The implementation and administration of the Pittsburgh Promise allow for two analytic strategies to estimate its impact on college access and persistence. First, we use an RD strategy to examine the Promise's impact within each cohort among those students at the margin of the GPA eligibility requirement. Second, we use a DID strategy to examine cross-cohort changes resulting from its implementation.
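The RD comparison can be sketched with simulated data. The 2.5 GPA cutoff mirrors the program's Core threshold, but the sample size, bandwidth, outcome scale, and assumed effect below are all hypothetical; this is the generic local linear estimator, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: final GPA is the running variable; eligibility
# switches on at the 2.5 cutoff.
n = 20_000
gpa = rng.uniform(1.5, 4.0, n)
eligible = (gpa >= 2.5).astype(float)
true_jump = 0.05  # assumed effect at the cutoff (illustrative)

# Continuous stand-in for an enrollment index: smooth in GPA except for
# the discontinuity at the threshold.
y = 0.30 + 0.10 * (gpa - 2.5) + true_jump * eligible + rng.normal(0, 0.10, n)

# Local linear RD: keep observations within a bandwidth of the cutoff and
# fit y on eligibility, the centered running variable, and their
# interaction (separate slopes on each side). The eligibility coefficient
# estimates the jump at the threshold.
h = 0.5
mask = np.abs(gpa - 2.5) <= h
c = gpa[mask] - 2.5
X = np.column_stack([np.ones(mask.sum()), eligible[mask], c, eligible[mask] * c])
beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
print(round(beta[1], 3))  # estimated discontinuity, near the assumed 0.05
```

With imperfect take-up, as in the paper's fuzzy RD setting, the analogous move is to instrument scholarship receipt with threshold-crossing; the sketch above shows only the reduced-form jump.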

Data

Our data come from four primary sources. We have student-level administrative records for PPS graduates from the classes of 2007 through 2012. This allows us one cohort prior to the Promise's implementation (class of 2007) and five cohorts who graduated across its phase-in (figure 1). These data provide student-level demographics: high school achievement (e.g., GPA, SAT scores, number of AP courses taken); school attendance; and each student's graduating high school. We observe each student's home ZIP Code, which we linked to community-level socioeconomic metrics from the American Community Survey Five-Year estimates from 2011 and 2012.9 These data provide measures, such as the unemployment rate, share of population with a high school or college diploma, and racial makeup at the ZIP Code level, thus providing a more fine-grained lens into students’ SES (at the neighborhood level) than a measure such as free- or reduced-price meal eligibility.10 Promise administrative records provide information on the level of funding each student received by year. To examine college-going, we use National Student Clearinghouse (NSC) enrollment records.11 From these data, we focus on several binary outcomes assessed during the first two years after high school graduation: immediate enrollment in any college, in a two-year college, in a four-year college, in an in-state institution, and the joint outcome of immediate enrollment and persistence into the second year of college.

We applied several restrictions relating to the program's eligibility criteria to arrive at our analytic sample. As noted, we focus exclusively on PPS graduates, as we did not have access to college enrollment data for nongraduates. Further, the Promise defines eligibility along three dimensions: students must have attended PPS beginning in at least the ninth grade and must meet their cohort-specific school attendance and GPA requirements. Although each of these criteria introduces a threshold that could anchor a discontinuity analysis, we lack the power to account for all of these discontinuities (e.g., Papay, Willett, and Murnane 2011; Reardon and Robinson 2012). Therefore, we restricted our sample to those enrolled in PPS since at least the ninth grade. This decision rule resulted in dropping 1,139 student records. Next, during the timeframe of our study, PPS closed one high school and reconstituted another. Our data were missing college enrollment outcomes for these schools’ final graduating classes. Therefore, we dropped these high schools’ class of 2011 graduates (N = 278). We also dropped 487 students who were nonresident district attendees and, as such, never had the potential to be Promise eligible. Finally, we were missing valid observations of the variables used to determine eligibility for a small number of students (N = 196): for these students, we were missing school attendance, high school GPA, or both. Where we observed one of these variables but not the other, the existing information largely indicated that the student was neither Promise eligible nor close to reaching eligibility. Our resulting analytic sample includes 8,261 students from the graduating classes of 2007 through 2012.
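The sequence of restrictions can be expressed as a filtering pipeline. The administrative data are not public, so the records and column names below are invented for illustration; the real rules (e.g., the closed-school restriction applying only to the class of 2011) are simplified here to single flags.

```python
import pandas as pd

# Toy records standing in for PPS graduate files; all columns are hypothetical.
grads = pd.DataFrame({
    "student_id":     [1, 2, 3, 4, 5],
    "entry_grade":    [9, 11, 9, 8, 9],   # grade at first PPS enrollment
    "school_dropped": [False, False, True, False, False],  # closed/reconstituted school, class of 2011
    "resident":       [True, True, True, False, True],     # district resident
    "gpa":            [3.1, 2.8, 2.4, 3.5, None],          # missing -> eligibility unverifiable
})

sample = grads[
    (grads["entry_grade"] <= 9)   # enrolled in PPS since at least ninth grade
    & ~grads["school_dropped"]    # drop graduates of closed/reconstituted schools
    & grads["resident"]           # drop nonresident district attendees
    & grads["gpa"].notna()        # drop records missing eligibility variables
]
print(len(sample))  # 1 -- only student 1 survives every restriction
```

In the actual data this cascade removes 1,139 + 278 + 487 + 196 records, yielding the 8,261-student analytic sample.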

In table 1, we present descriptive statistics by graduation year for students and the communities in which they reside. Across years, the average PPS graduate in our sample meets the most stringent Promise requirements in terms of both GPA (2.5 or above) and high school attendance (90 percent or above). Graduating cohorts included slightly more female than male students and primarily included students who were either black or white. Few graduates are English Language Learners, and across the years, approximately one third of students have an Individual Education Plan. Approximately three of every five students take the SAT. For those missing SAT scores, we imputed values.12 Across all students, the average scores presented in table 1 correspond to approximately the 30th percentile of SAT performance. Finally, we present socioeconomic metrics associated with the location of students’ homes. The average PPS graduate lives in a community with a 9 to 10 percent unemployment rate; where 12 to 13 percent of the population does not have a high school diploma; where approximately one in five adults has a bachelor's degree or higher; and where about 28 percent of the community is black and about 65 percent is white. These measures vary substantially, reflecting wide socioeconomic diversity across the neighborhoods in which PPS students reside. Despite the potential for the Promise to motivate a shift in the distribution of student characteristics and academic achievement over time, SAT scores, school performance, and demographic characteristics are largely consistent over the years examined. Assessing trends with models that include high school fixed effects, we find that most reveal no meaningful change in average characteristics over time. Average high school GPA was modestly higher in 2011 and 2012 compared with earlier years, and we observe a substantial increase in AP course-taking.13 We attribute this increase to a district-level effort over this period to offer more AP courses, particularly in the lower-performing PPS schools. Because this increase in AP course-taking is not accompanied by similarly dramatic shifts in measures of student academic performance (e.g., SAT, GPA), we rule out the possibility that the shift in AP course-taking signals large changes in the makeup of the student body or in average student achievement.

Table 1.
Descriptive Statistics
Variable2007 N = 1,5072008 N = 1,5812009 N = 1,4192010 N = 1,4712011 N = 1,0042012 N = 1,279
Student characteristics 
GPA 2.622 2.635 2.633 2.581 2.721 2.693 
 (0.742) (0.755) (0.713) (0.741) (0.714) (0.737) 
Attendance rate 0.924 0.914 0.940 0.935 0.942 0.936 
 (0.087) (0.103) (0.055) (0.061) (0.054) (0.065) 
Female 0.534 0.547 0.548 0.528 0.528 0.542 
 (0.499) (0.498) (0.498) (0.499) (0.500) (0.498) 
Black 0.500 0.488 0.508 0.546 0.450 0.531 
 (0.500) (0.500) (0.500) (0.498) (0.498) (0.499) 
White 0.460 0.473 0.444 0.405 0.496 0.400 
 (0.499) (0.499) (0.497) (0.491) (0.500) (0.490) 
Multi-racial 0.016 0.022 0.0290 0.033 0.025 0.035 
 (0.125) (0.147) (0.168) (0.178) (0.156) (0.184) 
Asian 0.017 0.011 0.009 0.010 0.013 0.021 
 (0.130) (0.103) (0.092) (0.097) (0.113) (0.144) 
Hispanic 0.005 0.006 0.0100 0.008 0.015 0.011 
 (0.068) (0.079) (0.099) (0.086) (0.121) (0.104) 
ELL 0.004 0.004 0.009 0.008 0.011 0.006 
 (0.063) (0.066) (0.095) (0.090) (0.104) (0.079) 
IEP 0.321 0.316 0.3370 0.315 0.383 0.313 
 (0.467) (0.465) (0.473) (0.465) (0.486) (0.464) 
SAT math 445.306 447.236 452.161 446.353 443.419 429.176 
 (96.397) (111.945) (110.438) (107.090) (119.413) (112.956) 
SAT reading 441.151 442.021 441.223 432.148 437.114 420.167 
 (97.775) (106.012) (113.616) (109.903) (122.014) (116.687) 
# AP courses 0.360 0.400 0.483 0.512 0.831 0.812 
 (0.791) (0.831) (0.927) (0.990) (1.252) (1.206) 
Community characteristics 
% Unemployed 0.094 0.094 0.094 0.095 0.092 0.097 
 (0.025) (0.026) (0.025) (0.024) (0.026) (0.030) 
% No high school degree 0.125 0.125 0.124 0.128 0.129 0.122 
 (0.072) (0.071) (0.072) (0.070) (0.072) (0.073) 
% BA or higher 0.204 0.206 0.207 0.200 0.202 0.219 
 (0.115) (0.115) (0.116) (0.114) (0.112) (0.122) 
% Below poverty 0.209 0.204 0.208 0.205 0.191 0.206 
 (0.083) (0.081) (0.084) (0.081) (0.072) (0.081) 
% Black 0.293 0.281 0.287 0.293 0.262 0.282 
 (0.185) (0.185) (0.185) (0.185) (0.182) (0.184) 
% White 0.642 0.656 0.648 0.642 0.675 0.651 
 (0.179) (0.180) (0.180) (0.180) (0.175) (0.178) 

Source: Pittsburgh Public Schools, the Pittsburgh Promise, and the American Community Survey.

Notes: Table cells present means and standard deviations (in parentheses). GPA = grade point average; ELL = English Language Learners; IEP = Individual Education Plan; AP = advanced placement; BA = bachelor's degree.

Table 2.
t-statistics, McCrary Density Tests for Smoothness of Grade Point Average (GPA) Distribution at Selected GPA Thresholds, by Cohort
GPA threshold
Cohort | 2.0 | 2.25 | 2.5
2007 1.385 −0.310 0.998 
2008 −1.085 −0.581 −1.363 
2009 0.022 0.544 0.027 
2010 1.468 −0.614 1.648 
2011 0.320 1.265 1.250 
2012 −0.797 1.252 0.969 

Source: Pittsburgh Public Schools and the Pittsburgh Promise.

Notes: Each cell reports a t-statistic from a separate McCrary (2008) density test assessing smoothness of the GPA distribution at the given GPA threshold. Cells shaded in gray pertain to the GPA threshold for full Promise eligibility for each given cohort. Unshaded cells are provided as points of comparison. In years 2010 through 2012, the 2.0 threshold pertains to the cutoff for Extension Scholar eligibility.

Analytic Strategies

We now detail our analytic strategies and discuss both their benefits and limitations. While especially cognizant of the potential threats to the validity of our findings, we argue, as did Scott-Clayton (2011) in her analysis of the West Virginia PROMISE scholarship, that our two analytic strategies are stronger together than either would be separately.

Regression Discontinuity

We first utilize an RD analytic strategy to investigate the causal impact of Promise eligibility and receipt of funds in the first year after high school graduation. We capitalize on the fact that strict criteria govern eligibility, and zero in on differences in outcomes for students who fall just on either side of relevant eligibility thresholds. In preliminary analyses, we found that the GPA criterion was a more important binding constraint than the attendance criterion. In all years, the preponderance of students met the designated attendance criterion, but the same is not true with regard to GPA. We focus our RD analyses on students who met their graduating cohort's attendance criterion. Thus, our results pertain to those who met their year-relevant attendance criterion but who just met or just missed their year-relevant GPA threshold.14

Several conditions must be met for an RD strategy to yield valid causal inference (Schochet et al. 2010). First, the assignment rule must be clear and followed with a high degree of fidelity. Second, the variable determining “treatment” and “control” status must be measured on an ordinal scale and have sufficient density around the cutoff. Third, the relevant threshold must relate to assignment of the focal “treatment” and not other interventions or opportunities that might serve as potential mechanisms for the impacts that we observe. Finally, and most critically in the Pittsburgh Promise context, students must not be able to manipulate their own value of the forcing variable, particularly in terms of their position directly above or below the eligibility threshold.

Across the first five years of the program, 52 percent of PPS graduates in our analytic sample met their cohort's eligibility criteria, and of eligible students, approximately three quarters received Promise funds in the year after high school completion.15 In figure 2, we provide graphical evidence that the eligibility rules are followed with a high degree of fidelity.16 Across all years, at the GPA threshold associated with Core Scholars eligibility, students were nearly 38 percentage points more likely to take up Promise funds in their first year after high school. This is a sizeable and statistically significant jump in the probability of Promise receipt (table 3, column 1). Although the magnitude of this jump varies across the phases of the program, the jump in probability remains high even when students just below the threshold were able to access the Extension Scholars program to attend community college (table 4, column 1). These graphical and model-based results provide evidence that at relevant margins, those just above the threshold were substantially more likely to access Promise funds.17

Figure 2.

Relationship between Probability of Receiving Promise Funds and High School Grade Point Average (GPA), by Year

Notes: Vertical line represents year-specific GPA threshold for Core Promise eligibility. From 2010 through 2012, students with GPAs between 2 and 2.5 were eligible for the Extension Scholars opportunity, and students with GPAs below 2.0 were not eligible for any postsecondary support. Therefore, in these years, we restrict our comparison to these two different scholarship opportunities and exclude those students who fall below the 2.0 GPA margin.


Table 3.
Pooled Regression Discontinuity Impacts of Promise Eligibility (Ordinary Least Squares) And Promise Take Up (Instrumental Variable) at Grade Point Average (GPA) Threshold for Those Students Who Meet the Attendance Criterion Relevant in Their Year of High School Graduation
Receive Year 1 Promise Funding | Year 1 Promise Funding ($) | Enroll in College | Enroll in Two-Year College | Enroll in Four-Year College | Enroll in PA Institution | Enroll & Persist into Year 2
Promise eligibility 0.379*** 1,558.496*** 0.046 −0.030 0.077*** 0.050 0.061* 
 (0.024) (106.951) (0.029) (0.026) (0.022) (0.029) (0.026) 
Average below threshold [0.172] [404.471] [0.404] [0.240] [0.163] [0.381] [0.242] 
R2 0.312 0.411 0.249 0.079 0.358 0.168 0.272 
Promise usage — 3,934.845*** 0.137* −0.041 0.179** 0.147* 0.165** 
 — (197.505) (0.069) (0.065) (0.054) (0.069) (0.062) 
N 4,853 4,853 4,853 4,853 4,853 4,853 4,853 

Source: Pittsburgh Public Schools, the Pittsburgh Promise, the American Community Survey, and the National Student Clearinghouse.

Notes: Coefficients presented from linear probability models predicting enrollment outcomes as a function of GPA and Promise eligibility as determined by GPA thresholds. Coefficients in the top panel represent the impact of Promise eligibility and those in the bottom panel represent the impact of Promise receipt at the year-relevant eligibility threshold. Data are restricted to those students who meet the year-relevant attendance requirements for Promise eligibility. All models include the covariates listed in table 1. Models include school-by-year fixed effects. Regressions are fitted within a bandwidth of ±1.35 GPA points above and below the year-specific threshold. In years 2010 through 2012, the lower threshold extends only to −0.50 GPA points, due to the introduction of the Extension Scholars program. Robust standard errors in parentheses. Average value of the outcome within 0.25 GPA points below the threshold in brackets.

∼p < 0.10; *p < 0.05; **p < 0.01; ***p < 0.001.

Table 4.
By-Phase Regression Discontinuity Impacts of Promise Eligibility (Ordinary Least Squares) and Promise Take Up (Instrumental Variable) at Grade Point Average (GPA) Threshold for Those Students Who Meet the Attendance Criterion Relevant in Their Year of High School Graduation
Receive Year 1 Promise Funding | Year 1 Promise Funding ($) | Enroll in College | Enroll in Two-Year College | Enroll in Four-Year College | Enroll in PA Institution | Enroll & Persist into Year 2
Promise eligibility, Phase I 0.425*** 1,179.427*** 0.066∼ 0.018 0.048 0.058 0.070* 
 (0.027) (106.785) (0.036) (0.032) (0.027) (0.036) (0.031) 
Average below threshold [0.056] [176.862] [0.312] [0.200] [0.112] [0.288] [0.168] 
Promise eligibility, Phase II 0.257*** 1,528.121*** −0.112* 0.112* 0.023 −0.001 
 (0.053) (187.376) (0.055) (0.051) (0.044) (0.056) (0.054) 
Average below threshold [0.293] [537.972] [0.492] [0.301] [0.191] [0.473] [0.328] 
Promise eligibility, Phase III 0.450*** 3,507.224*** 0.064 −0.068 0.132 0.080 0.172* 
 (0.078) (536.738) (0.084) (0.074) (0.075) (0.085) (0.075) 
Average below threshold [0.188] [635.242] [0.429] [0.211] [0.218] [0.406] [0.233] 
R2 0.313 0.414 0.249 0.080 0.358 0.168 0.272 
Promise usage, Phase I — 3,592.099** 0.155 0.039 0.116 0.132 0.160* 
 — (262.189) (0.083) (0.075) (0.063) (0.083) (0.073) 
Promise usage, Phase II — 4,151.229*** 0.052 −0.293 0.345* 0.180 −0.032 
 — (227.192) (0.191) (0.206) (0.149) (0.183) (0.193) 
Promise usage, Phase III — 7,127.419*** 0.143 −0.149 0.292 0.178 0.379* 
 — (541.446) (0.175) (0.173) (0.166) (0.174) (0.155) 
N 4,853 4,853 4,853 4,853 4,853 4,853 4,853 

Source: Pittsburgh Public Schools, the Pittsburgh Promise, the American Community Survey, and the National Student Clearinghouse.

Notes: Coefficients presented from linear probability models predicting enrollment outcomes as a function of GPA and Promise eligibility as determined by GPA thresholds. Coefficients in the top panel represent the impact of Promise eligibility and those in the bottom panel represent the impact of Promise receipt at the year-relevant eligibility threshold. Data are restricted to those students who meet the year-relevant attendance requirements for Promise eligibility. All models include the covariates listed in table 1. Models include school-by-year fixed effects. Regressions are fitted within a bandwidth of ±1.35 GPA points above and below the year-specific threshold. In years 2010 through 2012, the lower threshold extends only to −0.50 GPA points, due to the introduction of the Extension Scholars program. Robust standard errors in parentheses. Average value of the outcome within 0.25 GPA points below the threshold in brackets.

∼p < 0.10; *p < 0.05; **p < 0.01; ***p < 0.001.

The second and third conditions are easily met, given that our forcing variable is GPA and we know of no other opportunities for which students were simultaneously eligible specifically at their year-relevant threshold. The final condition—that students are not able to manipulate their placement with regard to their year-relevant GPA threshold—requires greater attention, given that the guiding Promise eligibility criteria were well known and communicated to students. We assess possible manipulation at these thresholds with several strategies. First, if students did respond strategically to the Promise criteria, we might expect to see a "pile-up" of students just above their cohort-relevant GPA thresholds. We examine evidence of such pile-up in GPA distributions by year. In figure A.1 (which is available in a separate online appendix that can be accessed on Education Finance and Policy's Web site at www.mitpressjournals.org/doi/suppl/10.1162/edfp_a_00257), each panel includes vertical lines to demarcate GPAs of 2.0, 2.25, and 2.5. In no case do these histograms suggest pile-up. This is especially sensible for the first few cohorts, which had less time to react to the eligibility requirements. Next, we utilize the McCrary (2008) test to assess the smoothness of the GPA densities, by year, across the year-relevant thresholds. In all years, we generate statistics associated with three GPA values, 2.0, 2.25, and 2.5, to provide additional context. We present t-statistics associated with these tests in table 2. In no case do the associated p-values fall below the 0.05 threshold.

Next, to assess the GPA distribution while accounting for the nesting of students within schools, we utilize a strategy similar to Smith, Hurwitz, and Avery (2015). We collapse students into bins defined by graduating cohort, high school, and GPA rounded to the nearest 0.10 point. We then regress counts of the number of students in each cell on the distance from the year-specific threshold, an indicator for whether the GPA value renders students eligible for the Promise, an interaction between this indicator and the year-specific distance, and high school fixed effects. The results are marginally significant in only one year (table A.1, available in the online appendix). We compare these results with those from similar analyses of the 2007 (pre-Promise) cohort, which serves as a falsification test. For the class of 2007, we should not expect to see strategic behavior in response to GPA thresholds, given that the Promise was not yet active. Nevertheless, we estimate threshold effects on par with, and even larger than, those for the Promise-eligible cohorts.
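As a concrete illustration, the binned-count check described above can be sketched as follows. This is a minimal sketch, not the authors' code; the 0.10-point bin width and the regressors follow the description in the text, while the column names (`gpa`, `school`) and all other details are assumptions.

```python
import numpy as np
import pandas as pd

def binned_count_test(df, cutoff, bin_width=0.10):
    """Collapse students into (school, GPA-bin) cells and regress cell
    counts on distance from the cutoff, an eligibility indicator, their
    interaction, and high school fixed effects. A large coefficient on
    the eligibility indicator would signal bunching at the threshold.
    Column names `gpa` and `school` are hypothetical.
    """
    df = df.copy()
    df["bin"] = (df["gpa"] / bin_width).round() * bin_width
    cells = (df.groupby(["school", "bin"]).size()
               .rename("count").reset_index())
    cells["dist"] = cells["bin"] - cutoff
    cells["elig"] = (cells["bin"] >= cutoff).astype(float)
    # school fixed effects via dummy variables (one school omitted)
    X = pd.get_dummies(cells["school"], prefix="sch", drop_first=True).astype(float)
    X["dist"] = cells["dist"]
    X["elig"] = cells["elig"]
    X["elig_x_dist"] = cells["elig"] * cells["dist"]
    X["const"] = 1.0
    beta, *_ = np.linalg.lstsq(X.values, cells["count"].values, rcond=None)
    return dict(zip(X.columns, beta))
```

Under a smooth GPA density, the `elig` coefficient should be small relative to its sampling variability; in the paper's implementation, inference is conducted separately by cohort.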

As a final test, we examine whether student characteristics change discontinuously across the Promise-eligibility thresholds. Such patterns would be suggestive of manipulation if students with certain characteristics were more likely to situate themselves directly above their year-relevant threshold. We fit our primary RD specification (outlined below) to a set of models using student- and community-level covariates as outcomes. In online appendix table A.2, we present results. Each coefficient measures the presence of a discontinuous jump in the average value of the covariate at the threshold for full Promise eligibility. In certain years, just eligible students score somewhat lower on the Math SAT, take a modestly lower number of AP courses, and are more likely to have an Individual Education Plan, but these patterns are not consistent over time. Interestingly, these results would actually be suggestive of poorer postsecondary outcomes, if anything. Collectively, the results across these multiple strategies for assessing smoothness in the GPA distribution provide little evidence of significant manipulation relative to the Promise thresholds. In addition, because we control for the covariates indicated in table 1 in our preferred RD specifications, any remaining selection bias would need to be due to omitted covariates unrelated to those for which we control.

These common tests for smoothness around the year-relevant GPA thresholds, however, may not be enough to determine whether students strategically manipulate their GPAs to meet Promise requirements. Barreca, Lindo, and Waddell (2016) show that RD estimates are biased when observations heap nonrandomly at the threshold or elsewhere within the analysis bandwidth. To address nonrandom heaping, the authors suggest utilizing a "donut" RD design that drops observations at heaped points along the running variable. Recent studies using GPA as a running variable have implemented various donut RD strategies.18 We adopt Dee and Penner's (2017) method and remove students at year-relevant cutoffs and at half and whole integers of GPA. Because GPA values in PPS data are rounded to the nearest 0.001 point, we implement the donut by removing observations within 0.1-point intervals around year-relevant GPA cutoffs and half/whole integers. The corresponding results (presented in online appendix tables A.4–A.7) differ only modestly from our main findings and do not change our substantive conclusions.
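The donut-style sample restriction can be sketched as below. This is a hypothetical implementation of the rule described in the text (dropping observations within 0.1 GPA points of the year-relevant cutoff and of half/whole GPA integers); the `gpa` column name is an assumption.

```python
import numpy as np
import pandas as pd

def donut_sample(df, cutoff, radius=0.10):
    """Drop observations within `radius` GPA points of the year-relevant
    cutoff and of every half/whole GPA integer, in the spirit of the
    Dee and Penner (2017) donut-RD approach described in the text.
    """
    gpa = df["gpa"].to_numpy()
    # heap points: 0.0, 0.5, 1.0, ..., 4.0, plus the cutoff itself
    heaps = np.append(np.arange(0.0, 4.01, 0.5), cutoff)
    near_heap = np.any(np.abs(gpa[:, None] - heaps[None, :]) < radius, axis=1)
    return df.loc[~near_heap]
```

The RD models are then refit on the restricted sample; comparing the donut estimates with the full-sample estimates gauges sensitivity to heaping.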

To examine the impact of Promise receipt on students’ postsecondary outcomes, we utilize a two-stage IV or “fuzzy” RD approach (e.g., Jacob and Lefgren 2004; Imbens and Lemieux 2008). In our first stage, we utilize a linear probability specification to model receipt of Promise funding in the first year after high school graduation:
PROMISEit = β0 + β1 ELIGit + β2 CGPAit + β3 (ELIG × CGPA)it + γ Xit + ɛit,
(1)
where PROMISEit is an indicator for student i in cohort t receiving year 1 Promise funds, ELIGit is an indicator for student i’s GPA exceeding the year t threshold, CGPAit is student i’s GPA re-centered at the year t threshold, and Xit is a vector of student-level characteristics. The coefficient β1 represents the difference in the probability of receiving Promise funds in the first year after high school between students who are just above and just below the year-specific threshold. Our specification allows the slope of the relationship between Promise receipt and GPA to vary above and below the threshold. The second stage model uses a linear functional form as follows:
Yit = π0 + π1 PROMISEit + π2 CGPAit + π3 (CGPA × ELIG)it + θ Xit + τit,
(2)
where Yit represents an outcome such as college enrollment. We instrument for each student's Promise status as described in equation 1. The coefficient π1 indicates the impact of receiving Promise funds. In addition to this IV specification, we report on the impact of Promise eligibility, obtained by utilizing equation 1 to model Yit directly. In fitting models to data pooled across years, we include school-by-year fixed effects to capitalize only on variation among students who graduate from the same high school in the same year. Finally, we include year-specific interactions with the effects of GPA above and below the thresholds to allow these relationships to vary by year.
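The two-stage estimation in equations 1 and 2 can be sketched as follows. This is a bare-bones illustration assuming simple arrays in place of the authors' data; covariates, school-by-year fixed effects, year-specific slopes, and proper IV standard errors are all omitted.

```python
import numpy as np

def fuzzy_rd_2sls(y, promise, gpa, cutoff):
    """Two-stage least squares sketch of equations 1 and 2: instrument
    Promise receipt with the eligibility indicator at the GPA cutoff.
    Returns the second-stage coefficient on Promise receipt (pi_1).
    """
    cgpa = gpa - cutoff                      # running variable re-centered at the cutoff
    elig = (gpa >= cutoff).astype(float)     # instrument: above-threshold indicator
    # First stage: PROMISE = b0 + b1*ELIG + b2*CGPA + b3*(ELIG x CGPA)
    X1 = np.column_stack([np.ones_like(cgpa), elig, cgpa, elig * cgpa])
    b, *_ = np.linalg.lstsq(X1, promise, rcond=None)
    promise_hat = X1 @ b
    # Second stage: Y = p0 + p1*PROMISE_hat + p2*CGPA + p3*(ELIG x CGPA)
    X2 = np.column_stack([np.ones_like(cgpa), promise_hat, cgpa, elig * cgpa])
    p, *_ = np.linalg.lstsq(X2, y, rcond=None)
    return p[1]
```

Note that the naive second-stage standard errors from this two-step procedure are incorrect; dedicated IV routines adjust them appropriately.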

Difference-in-Differences

A limitation of RD is that causal estimates pertain only to students local to their year-relevant eligibility margin, and the magnitude of impacts for marginal and inframarginal students may differ. To estimate impacts for eligible students overall, we utilize a DID identification strategy in which we compare Promise-eligible and -ineligible students before and after the program's implementation.19 We utilize the following basic model:
Yit = θ0 + θ1 (POST × ELIG)it + θ2 ELIGit + θ3 POSTit + ϑ Xit + ɛit,
(3)
where Yit pertains to college-going outcomes for student i in cohort t, ELIGit is a binary variable equal to 1 if a student is Promise-eligible, and POSTit is a binary variable equal to 1 if the student graduates after the Promise was introduced (i.e., in years 2008 through 2012). Our data include only one pre-Promise cohort (2007), and therefore POSTit is equal to zero for class of 2007 graduates. All models include fixed effects for high school. In addition, we include fixed effects for the specific post-Promise years when we utilize multiple post-Promise cohorts in estimation. In our DID specification, the vector Xit includes covariates measured prior to the start of high school, such as gender and race/ethnicity.20 The coefficient θ1 represents the causal effect of Promise eligibility on college-going outcomes and is, therefore, our key parameter of interest.
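In the simplest two-group, two-period case without fixed effects or covariates, equation 3 reduces to the familiar difference of differences in group means, which the interaction coefficient recovers. A minimal sketch:

```python
import numpy as np

def did_effect(y, post, elig):
    """Minimal difference-in-differences corresponding to equation 3
    (high school fixed effects and covariates omitted): returns the
    coefficient on POST x ELIG, i.e., theta_1.
    """
    X = np.column_stack([np.ones_like(y), elig, post, post * elig])
    theta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta[3]
```

With four group-by-period means, the regression reproduces (post-eligible − pre-eligible) − (post-ineligible − pre-ineligible) exactly.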

Across the Promise's first five years, the eligibility criteria changed (figure 1). A resulting challenge in implementing our DID strategy is identifying those who consistently would or would not be Promise-eligible across these first five years. At their most stringent, the Promise criteria required that students achieve a cumulative GPA of 2.5 and a four-year average attendance rate of 90 percent. Although requirements were less stringent in the program's early years, in this analysis we define eligible students as those who met these two criteria. To filter out students who would be eligible in some years but not in others (either for the Core or the Extension scholarship), we then define as ineligible those students graduating with a GPA of less than 2.0 or an attendance rate of less than 85 percent.21 In restricting our analysis to those who would be consistently eligible or ineligible, we eliminate 1,722 students whose eligibility would differ according to the year of their high school graduation. In online appendix table A.3, we present descriptive statistics for students defined as Promise eligible and ineligible. Although student characteristics are relatively consistent over time within these groups, they are quite different across the groups, as we would expect.
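The consistent-eligibility filter described above can be sketched as follows. The thresholds mirror the text, but the column names and implementation are hypothetical.

```python
import numpy as np
import pandas as pd

def classify_did_groups(df):
    """Classify graduates for the DID comparison as described in the text:
    'eligible' students meet the most stringent criteria (GPA >= 2.5 and
    attendance >= 90 percent); 'ineligible' students fall below every
    year's criteria (GPA < 2.0 or attendance < 85 percent); students whose
    eligibility would vary by graduation year are dropped.
    Column names `gpa` and `attendance` are hypothetical.
    """
    eligible = (df["gpa"] >= 2.5) & (df["attendance"] >= 0.90)
    ineligible = (df["gpa"] < 2.0) | (df["attendance"] < 0.85)
    out = df.copy()
    out["group"] = np.select([eligible, ineligible],
                             ["eligible", "ineligible"], default="drop")
    return out[out["group"] != "drop"]
```

In the paper's sample, this restriction removes the 1,722 students whose status would depend on graduation year.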

There are two key limitations to our DID approach. First, we derive our second difference from students who differ substantially from those who were consistently eligible for the Promise. Unlike previous investigations of the Promise, however, our treatment effect estimate is less likely to be biased downward by the misclassification of a large number of recipients as ineligible.22 Second, we consider cohorts of students who graduated several years after the policy shock of the Promise's introduction. To test the sensitivity of our results to the inclusion of later cohorts, we present effects estimated from the addition of successive cohorts one at a time.

4.  Results

We present results from our RD analyses in tables 3 and 4.23 In table 3 we present impacts of Core Scholar eligibility and receipt overall, and in table 4, we disaggregate effects across the three program phases, as the incentives, opportunities, and signals that students received differed between them. In each table, the top panel provides reduced-form impacts of Promise eligibility, and the bottom panel provides IV estimates of the impact of actual Promise receipt. The top panel of table 3 reveals positive effects of Promise eligibility. Across the cohorts, those just above their year-relevant threshold were 38 percentage points more likely to receive Promise funding, compared with a rate of 17 percent just below the threshold. As the panels of figure 2 reveal, the probability of Promise take-up was positively related to GPA and thus higher farther above the selection threshold. Regarding college-going, those just eligible were more likely to enroll in college, particularly in a four-year institution. The nearly 8-percentage-point effect of eligibility on four-year college enrollment is a combination of new postsecondary enrollment in four-year schools and sectoral switching from two-year to four-year institutions. Eligible students were 5 percentage points more likely to enroll in college in PA and 6 percentage points more likely to enroll and persist into a second year of college. When we scale these effects to account for the imperfect relationship between eligibility and take-up at the selection threshold, we observe large impacts. Marginal students who utilized the Promise were nearly 14 percentage points more likely to pursue postsecondary education than they would have been in the absence of the Promise opportunity and nearly 17 percentage points more likely to enroll and persist into a second year of college.
That the persistence coefficients are larger, qualitatively, than the enrollment coefficients suggests that the Promise opportunity has important impacts on both college access and success.

Table 4 provides evidence that the three phases of the Promise contributed differentially to the overall impacts that we observe for the first five cohorts of students who graduated under the Promise. In the first two years of the Promise (Phase I), when the eligibility criteria were least stringent but use of funds was geographically limited to institutions within Allegheny County, the Promise opportunity induced marginally eligible students to enroll in college, and in four-year colleges in particular, with a more modest impact on two-year college enrollment. Phase I instrumental variable estimates indicate that at the margin of eligibility, the average recipient received a Promise award of nearly $3,600. Students who would take up the Promise opportunity were approximately 16 percentage points more likely to enroll in college as a result of the funding and similarly more likely to maintain enrollment into the second year of college.24

The Promise's second phase brought several changes. First, the eligibility criteria reached their most stringent level, with a minimum GPA of 2.5 and an attendance rate of at least 90 percent. Second, fully eligible students could now use funds at any public or private in-state institution. Third, the Promise implemented the Extension Scholars program, which guided students below the Core Scholars threshold to attend the local community college. Just below the threshold for Core Scholar eligibility, nearly 30 percent of students received Promise support, with those just above the threshold 26 percentage points more likely to do so. Descriptively, recipients just below the threshold received approximately $1,900 in year 1 funding, whereas those just above the threshold received close to the maximum level of funding available. Given the opportunities available to students below the full Promise selection threshold in Phase II, it is not surprising that we see essentially no impact on overall college enrollment and instead a greater impact on the sector of enrollment. Specifically, at the margin of full Promise eligibility, the program increased four-year college enrollment by 11 percentage points and decreased two-year college enrollment by a similar margin. During Phase II, we find no impact on persistence through the second year of college. In sum, during the second phase of the Promise, students at the selection threshold responded strongly to the directive that the program created in terms of the sector of institution in which to enroll.

The Pittsburgh Promise made another critical change with the graduating class of 2012: the program doubled the maximum Promise award to $10,000 annually. We refer to this as Phase III. At the margin of selection, the average recipient now received nearly $8,000 more in funding for postsecondary education as a result of the Promise, and this corresponds to a fairly large impact on enrollment overall, and enrollment in a four-year institution specifically. By this third phase of the program, we also observe a sizeable impact on enrollment and persistence into the second year of college. Compared with Phase II, students below the threshold are less likely to take up Promise funds and are also less likely to enroll in postsecondary education, perhaps signaling declining interest in the Extension Scholars program. Nevertheless, the strong impacts on enrollment and persistence for those who just made the Promise-eligibility threshold may indicate that the increased Promise generosity helped students to manage costs that otherwise threaten persistence and/or further opened the doors to institutions where students have a higher probability of postsecondary persistence and success.

In sum, at the margin of selection, the Promise opportunity led to sizeable impacts on college access and, particularly once the Extension Scholars program was implemented, on the sector of postsecondary institution chosen. We also observe overall impacts on persistence into the second year of college, particularly when the Promise was at its most generous. In all phases of the program, the awards that students received were large and approached the maximum level of funding available. Although the Promise may have motivated students to access other sources of financial aid through FAFSA completion, the size of the grants that students received nevertheless suggests that recipients were constrained by real, not merely perceived, college costs: even after accessing other sources of grant-based financial aid, Promise recipients had sufficient unmet costs that they took advantage of the Promise's full generosity.

We now turn to our DID analyses. Recall that we focused on impacts of the full Promise for the subset of students who would be consistently eligible across all years by achieving a cumulative high school GPA of at least 2.5 and an attendance rate of at least 90 percent. We construct our second difference from those students who are consistently ineligible for any Promise support, including the Extension Scholars program, as these opportunities differed across the years. Therefore, this analysis focuses on the impact of eligibility for the Core Scholars support. In table 5, we present our DID results. Each row pertains to successive inclusion of an additional cohort. For example, the first row compares class of 2007 graduates with those from the class of 2008, and in the second row we also include the class of 2009. The final row provides estimates for our full panel of data. Recall that the program changed in two key ways for these fully eligible students over the first five years of the program. First, for the graduating cohorts of 2008 and 2009, Promise funds could only be used to attend institutions within Allegheny County. Beginning with the class of 2010, this was expanded to institutions throughout PA. Second, in the final cohort that we examine, the maximum award was doubled to $10,000 annually.
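The core of this design is a two-group comparison across cohorts: the first difference tracks consistently eligible graduates before versus after the Promise, and the second difference nets out the contemporaneous trend for consistently ineligible graduates. The regression version reported in table 5 adds covariates and fixed effects, but the raw estimator reduces to a difference of mean differences, sketched here with hypothetical enrollment indicators (illustrative numbers, not the paper's microdata):

```python
# Minimal two-group difference-in-differences sketch.
# First difference: consistently eligible graduates, pre- vs. post-Promise.
# Second difference: the same contrast for consistently ineligible graduates.
from statistics import mean

# Hypothetical 0/1 college-enrollment indicators:
eligible_pre    = [1, 1, 0, 1, 1, 0, 1, 1]   # e.g., class of 2007
eligible_post   = [1, 1, 1, 1, 1, 0, 1, 1]   # Promise-era classes
ineligible_pre  = [0, 1, 0, 0, 1, 0, 0, 1]
ineligible_post = [0, 1, 0, 1, 1, 0, 0, 0]

did = (mean(eligible_post) - mean(eligible_pre)) - (
      mean(ineligible_post) - mean(ineligible_pre))
print(f"DID estimate: {did:.3f}")
```

In the paper's implementation, this comparison is embedded in a linear probability model so that high school and cohort fixed effects and student covariates can absorb other differences between the groups.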

Table 5.
Difference-in-Differences Promise Effects on College Outcomes
| Cohorts | Receive Year 1 Promise Funding | Year 1 Promise Funding ($) | Enroll in College | Enroll in Two-Year College | Enroll in Four-Year College | Enroll in PA Institution | Enroll & Persist into Year 2 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 2007–08 (N = 2,449) | 0.595*** (0.021) | 2,180.357*** (84.825) | 0.051 (0.034) | 0.022 (0.028) | 0.029 (0.031) | 0.095** (0.035) | 0.069* (0.031) |
| R² | 0.539 | 0.454 | 0.332 | 0.033 | 0.325 | 0.207 | 0.349 |
| 2007–09 (N = 3,556) | 0.653*** (0.014) | 2,460.611*** (60.071) | 0.051 (0.030) | 0.009 (0.025) | 0.041 (0.027) | 0.091** (0.032) | 0.061* (0.028) |
| R² | 0.546 | 0.459 | 0.322 | 0.032 | 0.323 | 0.201 | 0.332 |
| 2007–10 (N = 4,726) | 0.661*** (0.011) | 2,537.351*** (48.735) | 0.047 (0.029) | −0.005 (0.023) | 0.052* (0.025) | 0.094** (0.030) | 0.055* (0.026) |
| R² | 0.517 | 0.433 | 0.319 | 0.029 | 0.334 | 0.204 | 0.327 |
| 2007–11 (N = 5,515) | 0.665*** (0.010) | 2,581.134*** (44.171) | 0.041 (0.028) | −0.008 (0.023) | 0.049* (0.025) | 0.092** (0.029) | 0.050* (0.025) |
| R² | 0.501 | 0.422 | 0.313 | 0.029 | 0.336 | 0.198 | 0.321 |
| 2007–12 (N = 6,540) | 0.673*** (0.010) | 3,071.840*** (53.860) | 0.045 (0.028) | −0.003 (0.022) | 0.048* (0.024) | 0.105*** (0.029) | 0.044 (0.025) |
| R² | 0.498 | 0.436 | 0.318 | 0.027 | 0.338 | 0.205 | 0.319 |
| Average for eligible students in 2007 | — | — | 0.741 | 0.126 | 0.615 | 0.575 | 0.637 |

Source: Pittsburgh Public Schools, the Pittsburgh Promise, and the National Student Clearinghouse.

Notes: Coefficients presented from linear probability models predicting enrollment outcomes as a function of Promise eligibility and high school completion in a year in which Promise was available. Coefficients represent the impact of Promise eligibility. All models include covariates included in table 4. Models include high school fixed effects and fixed effects for the years in which Promise was available. Robust standard errors in parentheses.

†p < 0.10; *p < 0.05; **p < 0.01; ***p < 0.001.

The results in table 5 tell a story similar to that derived from the RD analyses. Regardless of the Promise cohorts included in our analysis, we estimate that Promise eligibility increased direct-to-college enrollment by approximately 5 percentage points, with this effect driven by enrollment in four-year institutions. The Promise opportunity increased direct-to-college enrollment in PA institutions by approximately 10 percentage points, an effect twice as large as the overall DID enrollment impact and twice as large as the effect on enrollment in PA at the margin of Promise eligibility (table 3, top panel). This indicates that inframarginal (e.g., higher-achieving) students were particularly induced to remain in-state for postsecondary education as a result of the Promise opportunity. Lastly, impacts on enrollment and persistence into the second year of college range from 4 to 7 percentage points, again on par with our RD impact estimates. We can also scale these intent-to-treat estimates by the rate of take-up to estimate the effect of actually receiving Promise funds. Focusing on the results pooling across all years, we estimate that actually receiving Promise funds improved college enrollment and enrollment in a four-year institution each by 7 percentage points, enrollment in an in-state institution by nearly 16 percentage points, and enrollment and persistence into the second year of college by 7 percentage points.
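The scaling in the last step is a simple Wald adjustment: each intent-to-treat coefficient is divided by the first-stage take-up rate. A minimal sketch using the pooled 2007–12 estimates reported in table 5:

```python
# Wald/IV scaling: treatment-on-the-treated effect =
# intent-to-treat estimate / first-stage take-up rate.
take_up = 0.673  # pooled 2007-12 "Receive Year 1 Promise Funding" coefficient

itt = {  # pooled intent-to-treat estimates from table 5
    "enroll_in_college": 0.045,
    "enroll_four_year": 0.048,
    "enroll_in_PA": 0.105,
    "persist_year_2": 0.044,
}

tot = {outcome: est / take_up for outcome, est in itt.items()}
for outcome, est in tot.items():
    print(f"{outcome}: {est:.3f}")
```

This recovers the roughly 7, 7, 16, and 7 percentage point effects of actually receiving Promise funds cited in the text.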

Investigating Potential Impact Heterogeneity

Prior to the Promise's implementation, rates of college access for PPS graduates differed by student sociodemographic characteristics. Among 2007 graduates, for example, 44 percent of black students and 64 percent of white students enrolled in college immediately after high school. The analogous rates are 48 percent and 67 percent for 2007 graduates from the lowest- and highest-SES neighborhoods, respectively. Given these simple differences, we might speculate that the Promise would yield heterogeneous impacts across these groups. However, when we conduct these same RD and DID analyses to compare impacts for black students and white students, and to compare impacts for students residing in high-SES neighborhoods with those in low-SES neighborhoods, we largely fail to detect meaningful differences. This is partially an issue of statistical precision, and for this reason, we refrain from presenting subgroup results in tabular form.

Nevertheless, we note that even point estimates are generally similar across subgroups. We reason that this similarity in impacts is likely related to Promise eligibility being a function of high school GPA. The raw difference in college enrollment that we observe between black students and white students disappears and actually reverses direction after controlling for high school GPA. For example, among 2007 PPS graduates, after controlling for GPA, black students are 9 percentage points more likely to matriculate to college than their white counterparts. We observe a similar pattern when comparing students residing in high-SES and low-SES neighborhoods. In short, at the margin of eligibility, impacts on college-going outcomes are generally similar in magnitude. Black students and white students, for example, have very different average GPAs (2.27 and 2.98, respectively, for the graduating class of 2007), but conditional on GPA, as the Promise is, the program has a similar impact across these subgroups. The one exception to this lack of heterogeneity pertains to the Promise's impact on continuing to college within PA, with impacts essentially twice as large for students from high-SES neighborhoods as for those residing in lower-income areas of the city. For students from high-SES neighborhoods, the Promise has a much larger impact on the decision of where, rather than whether, to go to college.

5.  Discussion

Our analytic strategies yield similar results regarding the impact of the Promise in its first five years. As a result of the Promise, eligible PPS graduates are approximately 5 percentage points more likely to enroll in college (particularly four-year institutions), 10 percentage points more likely to attend college in PA, and 4 to 7 percentage points more likely to enroll and persist into a second year of college. Using an IV adjustment, we estimate impacts on those who received Promise funds that are larger still. These results point plainly to the conclusion that the Promise is positively and significantly improving college-going outcomes for PPS students.

We outlined five potential mechanisms through which the Promise may have driven these impacts: improving actual college affordability; improving perceptions of college affordability; increasing academic engagement and college readiness; motivating more college-interested students to enter and remain in PPS; and motivating systemwide attention to college going. Regarding the third and fourth possible mechanisms, we argue that these could only influence later Promise cohorts, given that students needed to have attended PPS from the ninth grade on to be eligible and that eligibility is based on cumulative GPA and attendance. In addition, we observe little movement in average student performance over time. For those enrolling in PPS for the first time in the ninth grade, we do not see a difference in cumulative GPA before and after Promise implementation, and we observe only a modest increase in this group's attendance rate after the implementation of the Promise. Further, the program led to positive impacts even in its first and second years, with Promise recipients receiving a several thousand–dollar benefit, on average, toward defraying the costs of college attendance. Although systemwide efforts in response to the Promise were at play, they would not necessarily be primary drivers of our impact estimates if these efforts influenced college-going behavior for all PPS students, eligible and ineligible alike. Taken together, we judge that the Promise worked primarily through helping to mitigate the high cost of college attendance in PA by providing additional scholarship dollars. Of course, it may also have motivated families to file the FAFSA and access other sources of financial aid for which they were eligible. We do not have the data to quantify the extent to which this may be the case.

It is useful to contextualize our results by comparing them to other place-based efforts. Using a DID strategy, Bartik, Hershbein, and Lachowska (2015) estimate that Kalamazoo Promise eligibility increased enrollment in four-year institutions on the order of 10 to 13 percentage points. This is similar to the effect that we estimate at the margin of Pittsburgh Promise eligibility and double the effect on four-year enrollment for eligible students overall. That the Pittsburgh Promise effect on four-year enrollment is larger for the marginal student is sensible, as the inframarginal students (e.g., those with higher GPAs) are more likely to continue to college even without Promise support. It is also sensible that the overall Kalamazoo impacts on four-year enrollment would exceed those in Pittsburgh. Broad-based aid programs with simple and transparent designs tend to have larger effects (Deming and Dynarski 2010). The Kalamazoo Promise is not dependent on GPA and attendance criteria and therefore is accessible to a broader set of students. During the college application process, Kalamazoo students can be certain of their eligibility and can weigh it in decisions about whether and where to apply to college. Pittsburgh students may be less certain, as the governing eligibility measures are not finalized until after high school graduation. In addition, the Pittsburgh criteria changed over the program's first several years, and these shifts may have contributed to student uncertainty. A second feature that may lead to smaller impacts is the Pittsburgh Promise's status as a “last-dollar” opportunity that requires students to complete the FAFSA. The FAFSA process itself is a well-known barrier to college enrollment, and merit aid programs that require it for application have been found to be less effective at improving college enrollment (Bruce and Carruthers 2014). In short, given the comparative ease with which students can determine their eligibility and apply for the Kalamazoo Promise, we might expect its impacts to be larger than those for the Pittsburgh Promise.

Another difference is that the Kalamazoo Promise covers tuition and fees at selected institutions, whereas the Pittsburgh Promise offers a maximum dollar amount not necessarily tied to tuition levels. In practice, when the Pittsburgh Promise was at its most generous, recipients received a similar level of financial support under the two programs. Nevertheless, Kalamazoo may be perceived as more generous given its presentation as a full-tuition benefit. From the student perspective, a comparative advantage of the Pittsburgh Promise is that it provides support for both public and private in-state institutions. Thus, whereas efforts such as Kalamazoo or the Adams Scholarship in Massachusetts (e.g., Cohodes and Goodman 2014) may constrain institutional choices, the Pittsburgh Promise does not do so to the same degree.

A final point that may relate to both programs is the risk of aid displacement whereby institutions try to capture some of the Promise dollars awarded to students. In the case of broad-based analogues to place-based scholarship programs, such as Georgia HOPE, this might take the form of institutions increasing tuition levels (e.g., Long 2004). Such tuition increases may be less likely in the case of narrowly targeted place-based efforts where Promise recipients make up a smaller fraction of the total student body. Nevertheless, institutions may still respond strategically by reducing other financial aid in response to Promise dollars (e.g., Cellini and Goldin 2014; Turner 2014; Goldrick-Rab et al. 2016), thereby decreasing the potential impact of the Promise. At least in Pittsburgh, the evidence that we have gathered to date does not suggest such strategic responding by PA institutions (Lowry et al. 2017).

Of course, the Promise is an expensive endeavor. The Pittsburgh Promise awarded students in our sample over $25 million in their first two years after high school. Given the magnitude of this expenditure, as well as the proliferation of Promise programs, it is critical to ask whether the results are worth the investment. To shed light on this question, we perform a back-of-the-envelope calculation in the spirit of Deming (2009), Hurwitz and colleagues (2017), and Pallais (2015). We focus on Promise expenditures on recipients in their first two years of college, as we can examine outcomes of enrollment and persistence over this period. For now, we leave aside additional scholarship expenditures beyond these two years as well as continued impacts on persistence and ultimate degree attainment. In our estimates, we also consider returns only in terms of earnings gains resulting from additional years of higher education. Both limitations make for conservative estimates of the return to the Promise investment. For example, Scott-Clayton (2011) provides evidence that a similarly structured Promise program in West Virginia has longer-run impacts in terms of bachelor's degree attainment. Further, additional benefits, both public and private, accrue as a result of scholarship support and exposure to higher education (Baum, Ma, and Payea 2013; Scott-Clayton and Zafar 2019).

Carnevale, Rose, and Cheah (2011) report median lifetime earnings for high school graduates of $1,304,000 (in 2009 dollars). We couple this with Card's (1999) estimate that each year of postsecondary education increases earnings by 10 percent and equate two years of higher education to the following average improvement in lifetime earnings: $1,304,000 × 20% = $260,800. From the graduating classes of 2008 through 2012, we observe 3,073 students to have enrolled immediately in college and persisted into their second year. Given our pooled estimate of a 4.4 percentage point increase in this outcome, we reason that an additional 130 students enrolled and persisted to their second year of college due to Promise support, leading to an increase of $260,800 × 130 = $33,904,000 in lifetime earnings.25 Scaling this by the $25 million in scholarship expenditure, we estimate a $1.35 return for each Promise dollar invested in students through their second year after high school. Given other positive benefits of education, we might reasonably treat this rate of return as a lower bound, although we recognize that not all students may expect the same earnings premium from higher education (see, e.g., Bartik and Hershbein 2016). Nevertheless, based on our back-of-the-envelope calculation, we judge the Pittsburgh Promise to meet the requirement for social investment that expected benefits exceed incurred costs. And although we have focused on private financial benefit, given the local nature of higher education, such an earnings premium would likely translate to eventual returns to the region overall in the form of higher tax revenues and increased consumer spending, among other benefits of a more highly educated citizenry.26 On the whole, therefore, our results point to the benefit of investment in higher education spending through a place-based scholarship, perhaps particularly in the context of a relatively expensive higher education market like PA (Baum and Ma 2014).
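The back-of-the-envelope arithmetic above, including the persistence counterfactual spelled out in note 25, can be reproduced directly from the figures reported in the text:

```python
# Back-of-the-envelope return on Promise expenditure, using figures from the text.
median_lifetime_earnings_hs = 1_304_000   # Carnevale, Rose, and Cheah (2011), 2009 dollars
earnings_gain_two_years = median_lifetime_earnings_hs * 0.20  # 10% per year of college x 2

persisters_observed = 3_073               # 2008-2012 graduates enrolling and persisting to year 2
effect = 0.044                            # pooled DID estimate on enroll-and-persist
# Without the Promise, persistence would have been 4.4 percentage points lower (note 25):
persisters_counterfactual = round(persisters_observed / (1 + effect))
additional_persisters = persisters_observed - persisters_counterfactual   # 130 students

total_earnings_gain = earnings_gain_two_years * additional_persisters     # $33,904,000
promise_expenditure = 25_000_000          # text reports expenditure of "over $25 million"
return_per_dollar = total_earnings_gain / promise_expenditure
# Dividing by exactly $25 million gives about $1.36; the paper reports roughly
# $1.35, consistent with actual expenditure slightly above $25 million.
print(f"Additional persisters: {additional_persisters}")
print(f"Lifetime earnings gain: ${total_earnings_gain:,.0f}")
print(f"Return per Promise dollar: ${return_per_dollar:.3f}")
```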

Even with positive economic returns, an important question is whether programs such as the Promise are sustainable over time. The Promise depends on a singular, long-term investment from local business and philanthropy. Even lottery-funded programs such as Georgia HOPE have scaled back their generosity over time, and the Pittsburgh Promise recently announced reductions of its own. One programmatic change is that, beginning with the class of 2017, Promise funds are limited to the costs of tuition and fees. This may have implications for the Promise's impact overall and especially for its impact on four-year college enrollment. Recall that the Promise improved rates of four-year college enrollment in particular. A unique feature of the Promise during the period we examine is that funds could be used for all costs, including room and board, whereas under other programs, such as Kalamazoo's, only tuition and fees were covered. In future work we will investigate how this change in award structure affects the level of funding and enrollment outcomes for Promise-eligible students. For example, across the 2009 through 2012 cohorts, approximately 19 percent of Promise recipients received other grants in excess of tuition and fees; it is an open question how the new restrictions will influence such students. Encouragingly, the Promise is forging relationships with area colleges to try to offset negative repercussions of this award reduction.27 At least based on our examination of the early enrollment and persistence outcomes associated with the first five years of the Pittsburgh Promise, our results suggest that investment in the Promise, if it can be sustained, is well worth it, especially given the ever-growing economic requirement of a college-educated workforce in Pittsburgh and in the nation overall.

Notes

1. 

For a detailed summary of place-based scholarship programs across the United States, see www.upjohn.org/sites/default/files/promise/Lumina/Promisescholarshipprograms.pdf.

2. 

See Page and Scott-Clayton (2016) for a recent review. Since 1991, several states have implemented merit-aid programs requiring students to meet achievement thresholds to qualify; states include Arkansas, Florida, Georgia, Kentucky, Louisiana, Maryland, Mississippi, Nevada, New Mexico, Oklahoma, South Carolina, Tennessee, and West Virginia. The Arkansas and Maryland programs are no longer operating. The existing state programs vary in terms of eligibility requirements and generosity. For example, the Georgia HOPE program covers tuition and fees for any in-state public institution for high school graduates achieving a 3.0 GPA; West Virginia's PROMISE Scholarship also covers in-state tuition and fees for students earning an ACT score of 21 and a GPA of 3.0; the Tennessee Education Lottery Scholarship (TELS) provides students who meet either a GPA or an ACT score criterion with up to $6,000 per year in scholarship funds; and the Massachusetts Adams Scholarship covers tuition at in-state public institutions for the top quarter of students in each school district, as ranked by performance on the state standardized assessments.

3. 

KPS students enrolled since kindergarten (ninth grade) are eligible for 100 percent (65 percent) of the award.

4. 

Lord, R. 2006. Tuition grants a lure for city schools. The Pittsburgh Post-Gazette, 14 December. www.post-gazette.com/news/education/2006/12/14/Tuition-grants-a-lure-for-city-schools/stories/200612140353.

5. 

For more information, see http://pittsburghpromise.org/.

8. 

A weakness of this analysis, however, is that the authors defined eligible students as those continuously enrolled in the district since the ninth grade with a graduating GPA of at least 2.5 and an attendance rate of at least 90 percent; therefore, the authors did not take the many phases of Promise eligibility (as described in figure 1) into account in the analysis.

10. 

PPS data include subsidized meals status, but this variable appeared inconsistently reported across cohorts.

11. 

The NSC is a nonprofit organization that maintains postsecondary enrollment records for the majority of U.S. colleges and universities. NSC data provide student semester-level enrollment information, and these records represent the best, most comprehensive source of college enrollment information for U.S. students. Nevertheless, coverage is imperfect, and coverage rates vary across states. In the years we considered, NSC records included over 90 percent of colleges and universities in PA (Dynarski, Hemelt, and Hyman 2015). Promise funding can be utilized toward costs not only at colleges and universities but also at trade and vocational schools. Institutions such as ITT Technical Institute are included in the NSC data, although we note that these types of institutions are less well represented.

12. 

Because observing SAT scores was important for related analyses, we impute SAT scores where missing (see Page and Iriti 2016 for details). We use multiple imputation with chained equations to impute missing math and verbal SAT scores. Variables informing the imputation include scores on eighth- and eleventh-grade reading and mathematics state standardized tests, limited English proficient status, gifted status, gender, race/ethnicity, and high school attended. We generate five draws for each missing observation and, in the analyses here, use the average of the five draws. As a result, the average SAT values we report include both actual and imputed scores. Although not shown, among those who sat for the exam, the corresponding average scores are about half a standard deviation (50 points) higher than for those who did not.

13. 

We unfortunately do not have access to data on AP test-taking or performance.

14. 

Although not shown, we note that the inclusion or exclusion of students who fail to meet the relevant attendance thresholds does not substantively change conclusions.

15. 

This is somewhat lower than the take-up rate for universal-eligibility efforts like the Kalamazoo Promise, which had a student take-up rate above 80 percent in its first several years (Bartik, Hershbein, and Lachowska 2016). Compared with eligible students who did not take up Promise funds, eligible students who received Promise funds have modestly higher GPAs, took a modestly higher number of AP courses, on average, and were more likely to be female. These two groups were largely similar on all other characteristics. Of eligible students who did not receive funds, 42 percent nevertheless continued directly to college, but only about one quarter enrolled in a PA institution. Taken together, certain students eligible for the Promise did not take it up for a variety of reasons, which may include: decision to pursue a non-college option after high school, enrollment in a non-PA college, failure to successfully apply for the Promise (which may include failure to complete the FAFSA), or failure to follow through on college plans despite being awarded Promise funds (Castleman and Page 2014a, b; 2017). We lack the data to estimate the prevalence of these potential reasons.

16. 

In figure 2, we extend the upper bound of the graphic to 1.35 points above the year-relevant threshold. This is so that we are able to utilize a linear function to model the relationship between GPA and probability of Promise receipt. Very high achieving students are less likely to take up Promise funds, primarily because these students are more likely to enroll out of state. For years 2008 and 2009, we extend the graphic to 1.35 points below the year-relevant cutoff. For year 2010 and beyond, we extend the lower bound only to 0.5 points below the Promise eligibility threshold. Students within this range are still eligible for the Extension Scholar opportunity, with a GPA of 2.0 representing the cutoff for eligibility. At this lower-bound margin, we detect no impacts of being selected as an Extension Scholar.

17. 

In examining these results across years, the graphical discontinuity in Promise take-up appears least clear for the graduating cohort of 2010. Several things occurred in that year that may explain this lack of a large discontinuity. First, this was the first cohort eligible for the Extension Scholar program. Therefore, students just below the 2.5 GPA threshold of eligibility may have been particularly encouraged to take up Promise funds to attend the local community college. Second, the GPA eligibility threshold for a full Promise scholarship increased to 2.5. If students were not well informed about this shift, those with GPAs between 2.25 and 2.5 may have still developed college plans and been particularly willing to begin their postsecondary education at the community college with support from the Promise.

18. 

Strategies include: removing observations only at the treatment cutoff (Ost, Pan, and Webber 2018), removing observations at whole number heaps (Schudde and Scott-Clayton 2016), removing observations at whole and half number heaps (Dee and Penner 2017), and removing observations at 0.25 multiples (Carruthers and Ozek 2016).

19. 

In implementing this strategy, we retain our focus on those students who attended PPS schools from at least the ninth grade on and exclude those students who were high-school entrants to the district.

20. 

Other covariates, such as SAT scores or the number of AP courses taken, could be considered preliminary outcomes associated with the advent of the Promise and therefore are excluded from the covariates in these models.

21. 

Note that by utilizing this attendance threshold, we do misclassify a small number of students who were eligible for the Promise in 2008, the only year that Promise eligibility did not have an attendance criterion.

22. 

Bozick, Gonzalez, and Engberg (2015) also utilize a DID strategy to estimate the impact of the Pittsburgh Promise. Their approach is largely consistent with ours but differs on three key dimensions. First, although our pre-Promise period includes only one cohort (class of 2007), Bozick, Gonzalez, and Engberg additionally had access to data on the class of 2006. We note that we applied their methodological approach with our more limited data and obtained similar results, demonstrating that our analysis is not sensitive to the exclusion of the 2006 cohort. Second, the post-Promise period they considered extended only through 2010; we examine impacts through the class of 2012. Third, Bozick and colleagues defined eligible students as those who achieved a minimum GPA of 2.5 and a minimum attendance rate of 90 percent and defined all others as ineligible. By doing so, the authors included among ineligible students those who would have been eligible in some but not all years. We exclude those students whose eligibility would vary over time.

23. 

The primary specifications that we present utilize a local-linear regression, a rectangular kernel, and a bandwidth of ±1.35 GPA points. An implication of the introduction of the Extension Scholars program is that in Phases II and III, we drop all students with GPAs below the 2.0 threshold (i.e., those students who did not meet the GPA criterion to be at least Extension Scholars). Therefore, the bandwidth below the threshold of −1.35 is relevant for the 2008 and 2009 cohorts, and the effective bandwidth below the threshold from 2010 on is −0.50. As is standard in the RD context, we face a trade-off between statistical power and estimating treatment effects local to the relevant thresholds (e.g., Ludwig and Miller 2007). In online Appendix A, we report additional sensitivity checks, including the donut RD specifications described above (tables A.4–A.7) as well as varying the bandwidth and investigating sensitivity to the inclusion and exclusion of covariates (tables A.8 and A.9). Conclusions are similar across specifications, although statistical significance differs in some instances, as we would expect, due to a loss of precision when restricting the sample.
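To make this specification concrete, the sketch below implements a local-linear RD with a rectangular kernel on simulated data, using only the Python standard library. The data-generating process (a 5 percentage point enrollment jump at a 2.5 GPA cutoff) and all variable names are illustrative assumptions, not the paper's data; the estimator itself is the one described in this note: within the bandwidth, regress the outcome on an intercept, a treatment indicator, the centered running variable, and their interaction, and read the discontinuity off the treatment indicator.

```python
import random

random.seed(0)
CUTOFF, BW = 2.5, 1.35

def simulate(n=5000):
    # Illustrative data: enrollment probability rises with GPA and
    # jumps by 0.05 at the eligibility cutoff.
    data = []
    for _ in range(n):
        gpa = random.uniform(1.0, 4.0)
        p = 0.4 + 0.15 * (gpa - CUTOFF) + (0.05 if gpa >= CUTOFF else 0.0)
        data.append((gpa, 1 if random.random() < p else 0))
    return data

def rd_estimate(data, cutoff=CUTOFF, bw=BW):
    # Local-linear RD, rectangular kernel: OLS of y on
    # [1, D, (x - c), D * (x - c)] for |x - c| <= bw.
    rows = [(x - cutoff, y) for x, y in data if abs(x - cutoff) <= bw]
    X = [[1.0, 1.0 if r >= 0 else 0.0, r, r * (1.0 if r >= 0 else 0.0)]
         for r, _ in rows]
    Y = [y for _, y in rows]
    k = 4
    # Normal equations X'X b = X'Y, solved by Gaussian elimination.
    A = [[sum(X[i][a] * X[i][b] for i in range(len(X))) for b in range(k)]
         for a in range(k)]
    v = [sum(X[i][a] * Y[i] for i in range(len(X))) for a in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r2: abs(A[r2][c]))
        A[c], A[piv] = A[piv], A[c]
        v[c], v[piv] = v[piv], v[c]
        for r2 in range(c + 1, k):
            f = A[r2][c] / A[c][c]
            for c2 in range(c, k):
                A[r2][c2] -= f * A[c][c2]
            v[r2] -= f * v[c]
    b = [0.0] * k
    for c in range(k - 1, -1, -1):
        b[c] = (v[c] - sum(A[c][c2] * b[c2] for c2 in range(c + 1, k))) / A[c][c]
    return b[1]  # coefficient on D: the jump at the cutoff (true value is 0.05 here)

effect = rd_estimate(simulate())
print(f"Estimated discontinuity: {effect:.3f}")
```

An asymmetric bandwidth, as used from 2010 on, amounts to changing the filter to −0.50 ≤ x − c ≤ 1.35.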

24. 

Even in this first phase of the Promise, we observe a few students below the threshold receiving Promise funds. This may be a result of some noise in our data and/or the availability of a process whereby students could appeal their eligibility status. Early in Promise implementation, program leaders made the decision to be more liberal in granting appeals because students had little time to adjust their behaviors to respond to the GPA and attendance criteria.

25. 

Without the Promise funds, the persistence figure would be 4.4 percentage points lower. That is, 3,073 ÷ 1.044, or 2,943 students, would have persisted into their second year of college; 3,073 − 2,943 = 130 students.

26. 

Of course, this depends on Promise recipients residing in the Pittsburgh area after attending higher education. For several reasons, it is reasonable to assume that many will do so. First, of Promise recipients, the average distance between Pittsburgh and their college campus is 47 miles. Second, we examine the extent of geographic mobility among bachelor's degree recipients in PA using the 2010 Current Population Survey March Supplement. Among bachelor's degree-holding PA residents between the ages of 22 and 30 years, 70 to 85 percent did not leave the state within five years at the time of the survey and 60 to 75 percent did not leave the county. Together, we reason that the monetary and nonmonetary benefits of Promise-induced college attendance are likely to positively impact the Pittsburgh region.

27. 

For example, nineteen partner institutions have agreed to provide proactive advising supports to Promise recipients. Some will also offer financial support for room and board and/or book expenses.

Acknowledgments

We are grateful to the Pittsburgh Promise and to the Pittsburgh Public Schools for providing data to make this research possible. From the Pittsburgh Promise, we especially thank Saleem Ghubril, Shawn Kinter Nelson, Shelley Scherer, and Steve Kroser. The interpretations and views presented here do not reflect those of either the Pittsburgh Promise or the Pittsburgh Public Schools. All errors are our own.

REFERENCES

Andrews, Rodney J., Stephen DesJardins, and Vimal Ranchhod. 2010. The effects of the Kalamazoo Promise on college choice. Economics of Education Review 29(5): 722–737.

Ash, Jennifer W., and Gary W. Ritter. 2014. Early impacts of the El Dorado Promise on enrollment and achievement. Available http://www.officeforeducationpolicy.org/wp-content/uploads/El-Dorado-Promise-AER.pdf. Accessed 24 March 2019.

Bailey, Martha J., and Susan M. Dynarski. 2011. Inequality in postsecondary education. In Whither opportunity? Rising inequality, schools, and children's life chances, edited by Greg J. Duncan and Richard J. Murnane, pp. 117–132. New York: Russell Sage Foundation.

Barreca, Alan I., Jason M. Lindo, and Glen R. Waddell. 2016. Heaping-induced bias in regression-discontinuity designs. Economic Inquiry 54(1): 268–293.

Bartik, Timothy J., Randall Eberts, and Wei-Jang Huang. 2010. The Kalamazoo Promise, and enrollment and achievement trends in Kalamazoo Public Schools. Presented at the PromiseNet 2010 Conference, Kalamazoo, MI, June.

Bartik, Timothy J., and Brad J. Hershbein. 2006. College grads earn less if they grew up poor. Kalamazoo, MI: W.E. Upjohn Institute for Employment Research.

Bartik, Timothy J., Brad J. Hershbein, and Marta Lachowska. 2015. The effects of the Kalamazoo Promise Scholarship on college enrollment, persistence, and completion. Upjohn Institute Working Paper No. 15-229. Kalamazoo, MI: W.E. Upjohn Institute for Employment Research.

Bartik, Timothy J., Brad J. Hershbein, and Marta Lachowska. 2016. The merits of universal scholarships: Benefit-cost evidence from the Kalamazoo Promise. Journal of Benefit-Cost Analysis 7(3): 400–433.

Baum, Sandy, and Jennifer Ma. 2014. Trends in college pricing 2014. Available https://secure-media.collegeboard.org/digitalServices/misc/trends/2014-trends-college-pricing-report-final.pdf. Accessed 25 March 2019.

Baum, Sandy, Jennifer Ma, and Kathleen Payea. 2013. Education pays 2013: The benefits of higher education for individuals and society. Available https://trends.collegeboard.org/sites/default/files/education-pays-2013-full-report.pdf. Accessed 25 March 2019.

Bettinger, Eric P., Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu. 2012. The role of application assistance and information in college decisions: Results from the H&R Block FAFSA experiment. Quarterly Journal of Economics 127(3): 1205–1242.

Bozick, Robert, Gabriella Gonzalez, and John Engberg. 2015. Using a merit-based scholarship program to increase rates of college enrollment in an urban school district: The case of the Pittsburgh Promise. Journal of Student Financial Aid 45(2): Article 2.

Bruce, Donald J., and Celeste K. Carruthers. 2014. Jackpot? The impact of lottery scholarships on enrollment in Tennessee. Journal of Urban Economics 81: 30–44.

Card, David. 1999. The causal effect of education on earnings. In Handbook of labor economics, volume 3, edited by Orley Ashenfelter and David Card, pp. 1801–1863. Amsterdam: Elsevier Science B.V.

Carnevale, Anthony P., Stephen J. Rose, and Ban Cheah. 2011. The college payoff: Education, occupations, lifetime earnings. Washington, DC: Georgetown University Center on Education and the Workforce.

Carruthers, Celeste K., and Umut Özek. 2016. Losing HOPE: Financial aid and the line between college and work. Economics of Education Review 53: 1–15.

Castleman, Benjamin L., and Lindsay C. Page. 2014a. Summer melt: Supporting low-income students in the transition from high school to college. Cambridge, MA: Harvard Education Press.

Castleman, Benjamin L., and Lindsay C. Page. 2014b. A trickle or a torrent? Understanding the extent of summer "melt" among college-intending high school graduates. Social Science Quarterly 95(1): 202–220.

Castleman, Benjamin L., and Lindsay C. Page. 2017. Parental influences on postsecondary decision making: Evidence from a text messaging experiment. Educational Evaluation and Policy Analysis 39(2): 361–377.

Cellini, Stephanie Riegg, and Claudia Goldin. 2014. Does federal student aid raise tuition? New evidence on for-profit colleges. American Economic Journal: Economic Policy 6(4): 174–206.

Cohodes, Sarah R., and Joshua S. Goodman. 2014. Merit aid, college quality, and college completion: Massachusetts' Adams Scholarship as an in-kind subsidy. American Economic Journal: Applied Economics 6(4): 251–285.

Cornwell, Christopher, David B. Mustard, and Deepa J. Sridhar. 2006. The enrollment effects of merit-based financial aid: Evidence from Georgia's HOPE Scholarship. Journal of Labor Economics 24(4): 761–786.

Dee, Thomas, and Emily Penner. 2017. The causal effects of cultural relevance: Evidence from an ethnic studies curriculum. American Educational Research Journal 54(1): 127–166.

Deming, David. 2009. Early childhood intervention and life-cycle skill development: Evidence from Head Start. American Economic Journal: Applied Economics 1(3): 111–134.

Deming, David, and Susan Dynarski. 2010. College aid. In Targeting investments in children: Fighting poverty when resources are limited, edited by Philip B. Levine and David J. Zimmerman, pp. 283–302. Chicago: University of Chicago Press.

Dynarski, Susan. 2004. The new merit aid. In College choices: The economics of where to go, when to go, and how to pay for it, edited by Caroline M. Hoxby, pp. 63–100. Chicago: University of Chicago Press.

Dynarski, Susan. 2008. Building the stock of college-educated labor. Journal of Human Resources 43(3): 576–610.

Dynarski, Susan M., Steven W. Hemelt, and Joshua M. Hyman. 2015. The missing manual: Using National Student Clearinghouse data to track postsecondary outcomes. Educational Evaluation and Policy Analysis 37(1 suppl): 53S–79S.

Dynarski, Susan M., and Judith E. Scott-Clayton. 2006. The cost of complexity in federal student aid: Lessons from optimal tax theory and behavioral economics. National Tax Journal 59(2): 319–356.

Goldin, Claudia, Lawrence F. Katz, and Ilyana Kuziemko. 2006. The homecoming of American college women: The reversal of the college gender gap. Journal of Economic Perspectives 20(4): 133–156.

Goldrick-Rab, Sarah, Robert Kelchen, Douglas N. Harris, and James Benson. 2016. Reducing income inequality in educational attainment: Experimental evidence on the impact of financial aid on college completion. American Journal of Sociology 121(6): 1762–1817.

Gonzalez, Gabriella C., Robert Bozick, Shannah Tharp-Taylor, and Andrea Phillips. 2011. Fulfilling the Pittsburgh Promise: Early progress of Pittsburgh's postsecondary scholarship program. Santa Monica, CA: RAND Corporation.

Goodman, Joshua S. 2008. Who merits financial aid? Massachusetts' Adams Scholarship. Journal of Public Economics 92(10): 2121–2131.

Henry, Gary T., Ross Rubenstein, and Daniel T. Bugler. 2004. Is HOPE enough? Impacts of receiving and losing merit-based financial aid. Educational Policy 18(5): 686–709.

Hurwitz, Michael, Preeya P. Mbekeani, Margaret Nipson, and Lindsay C. Page. 2017. Surprising ripple effects: How changing the SAT score-sending policy for low-income students impacts college access and success. Educational Evaluation and Policy Analysis 39(1): 77–103.

Imbens, Guido W., and Thomas Lemieux. 2008. Regression discontinuity designs: A guide to practice. Journal of Econometrics 142(2): 615–635.

Iriti, Jennifer, William E. Bickel, Julie Meredith, Megan Walker, and Catherine Nelson. 2009. Looking inward to keep The Promise: What do Pittsburgh charter high schools do to prepare students for post-secondary education? Pittsburgh, PA: Learning Research & Development Center, University of Pittsburgh.

Iriti, Jennifer, Lindsay C. Page, and William E. Bickel. 2017. Place-based scholarships: Catalysts for systems reform to improve postsecondary attainment. International Journal of Educational Development 58: 137–148.

Jacob, Brian A., and Lars Lefgren. 2004. Remedial education and student achievement: A regression-discontinuity analysis. Review of Economics and Statistics 86(1): 226–244.

Krugman, Paul. 2013. A tale of two rust-belt cities. The New York Times, 21 July.

LeGower, Michael, and Randall Walsh. 2017. Promise scholarship programs as place-making policy: Evidence from school enrollment and housing prices. Journal of Urban Economics 101: 74–89.

Long, Bridget Terry. 2004. How do financial aid policies affect colleges? The institutional impact of the Georgia HOPE Scholarship. Journal of Human Resources 39(4): 1045–1066.

Lowry, Danielle, Lindsay C. Page, Aaron M. Anthony, and Jennifer Iriti. 2017. To supplement or to supplant? Institutional responses in financial aid to the Pittsburgh Promise. Paper presented at the 42nd Annual Conference of the Association for Education Finance and Policy, Washington, DC, March.

Ludwig, Jens, and Douglas L. Miller. 2007. Does Head Start improve children's life chances? Evidence from a regression discontinuity design. Quarterly Journal of Economics 122(1): 159–208.

McCrary, Justin. 2008. Manipulation of the running variable in the regression discontinuity design: A density test. Journal of Econometrics 142(2): 698–714.

Ost, Ben, Weixiang Pan, and Douglas A. Webber. 2018. The returns to college persistence for marginal students: Regression discontinuity evidence from university dismissal policies. Journal of Labor Economics 36(3): 779–805.

Page, Lindsay C., Benjamin L. Castleman, and Katharine Meyer. 2017. Customized nudging to improve FAFSA completion and income verification. Available https://ssrn.com/abstract=2854345. Accessed 25 March 2019.

Page, Lindsay C., and Jennifer E. Iriti. 2016. On undermatch and college cost: A case study of the Pittsburgh Promise. In Matching students to opportunity: Expanding college choice, access, and quality, edited by Andrew P. Kelly, Jessica S. Howell, and Carolyn Satin-Bajaj, pp. 135–160. Cambridge, MA: Harvard Education Press.

Page, Lindsay C., and Judith Scott-Clayton. 2016. Improving college access in the United States: Barriers and policy responses. Economics of Education Review 51: 4–22.

Pallais, Amanda. 2009. Taking a chance on college: Is the Tennessee Education Lottery Scholarship Program a winner? Journal of Human Resources 44(1): 199–222.

Pallais, Amanda. 2015. Small differences that matter: Mistakes in applying to college. Journal of Labor Economics 33(2): 493–520.

Papay, John P., John B. Willett, and Richard J. Murnane. 2011. Extending the regression-discontinuity approach to multiple assignment variables. Journal of Econometrics 161(2): 203–207.

Reardon, Sean F., and Joseph P. Robinson. 2012. Regression discontinuity designs with multiple rating-score variables. Journal of Research on Educational Effectiveness 5(1): 83–104.

Schochet, Peter, Thomas Cook, Jonathan Deke, Guido Imbens, J. R. Lockwood, Jack Porter, and Jeffrey Smith. 2010. Standards for regression discontinuity designs. Available https://ies.ed.gov/ncee/wwc/Docs/ReferenceResources/wwc_rd.pdf. Accessed 19 March 2019.

Schudde, Lauren, and Judith Scott-Clayton. 2016. Pell Grants as performance-based scholarships? An examination of satisfactory academic progress requirements in the nation's largest need-based aid program. Research in Higher Education 57(8): 943–967.

Scott-Clayton, Judith. 2011. On money and motivation: A quasi-experimental analysis of financial incentives for college achievement. Journal of Human Resources 46(3): 614–646.

Scott-Clayton, Judith, and Basit Zafar. 2019. Financial aid, debt management, and socioeconomic outcomes: Post-college effects of merit-based aid. Journal of Public Economics 170: 68–82.

Smith, Jonathan, Michael Hurwitz, and Christopher Avery. 2015. Giving college credit where it is due: Advanced Placement exam scores and college outcomes. Journal of Labor Economics 35(1): 67–147.

Swinburn, Gwen, Soraya Goga, and Fergus Murphy. 2006. Local economic development: A primer developing and implementing local economic development strategies and action plans. Washington, DC: World Bank.

Turner, Lesley J. 2014. The road to Pell is paved with good intentions: The economic incidence of federal student grant aid. Unpublished paper, University of Maryland.