Abstract

The 2007 Public Education Reform Amendment Act led to 39 percent of the principals in District of Columbia Public Schools (DCPS) being replaced before the start of the 2008–09 school year, with additional principal exits over the next few years. We measure the impact of replacing these principals on schoolwide student achievement by analyzing the changes in achievement that occurred when principals were replaced and comparing these changes to achievement in comparison schools within DCPS that kept the same principal. We find that after a new principal's third year in a school, average schoolwide achievement increased by 4 percentile points (0.09 standard deviations) compared with how students in the school would have achieved had DCPS not replaced the previous principal. For students in grades 6 to 8, the gains were larger and statistically significant in both math and reading.

1.  Introduction

Overview

In 2007, the District of Columbia (DC) began a process of school reform with the Public Education Reform Amendment Act (PERAA). PERAA led to numerous reforms that changed nearly every aspect of the DC Public Schools (DCPS), including school governance structures, human capital policies, and leadership. PERAA placed DCPS under mayoral control; Adrian Fenty, the mayor of DC, used his authority to appoint Michelle Rhee as the first chancellor of DCPS. In an effort to boost student achievement, Chancellor Rhee replaced many school principals as one of her first reforms.

Although DCPS made annual renewal decisions about school principals both before and after PERAA, Chancellor Rhee changed the implementation of the principal retention policy to pursue a conscious strategy of replacing poor-performing principals. For the 2008–09 school year, 39 percent of the principals in the school district—fifty-one individuals—did not return. Fewer than 30 percent of principals left DCPS in any other school year between 2003–04 and 2010–11. Because as many as half of the exits that occurred at the end of the 2007–08 school year were intentional dismissals by Rhee (Turque 2008), these circumstances provide a unique opportunity to understand the impact of targeted principal dismissals on student achievement. Whereas most previous research on the impact of principal transitions has focused on typical principal turnover or rotations across schools, the replacements in DCPS provide evidence on the effectiveness of a policy of principal dismissals.

In this paper, we measure whether students in a school with a new principal performed better on standardized tests than they would have if the original principal had been retained. To do so, we analyze the changes in schoolwide achievement that occurred when exiting DCPS principals were replaced. We examine principal exits that occurred at the end of each of the school years from 2007–08 through 2010–11.

The primary challenge in this analysis is to distinguish between changes in schoolwide student achievement caused by the new principal and those that might have occurred even if the dismissed principal had continued to lead the school. Achievement gains could have occurred if other factors besides the principal also changed in DCPS schools and affected student achievement. For example, PERAA also led to changes in human capital policies for teachers, which may have affected achievement trends in all DCPS schools. To address this issue, our analysis uses a comparison group of DCPS schools that did not experience transitions in school leadership. Doing so allows us to focus on how achievement trends differ in schools with and without principal replacements.

We use a difference-in-differences design in which the first difference is between the achievement of students in DCPS schools before and after a principal's replacement. The second difference compares this change to the change in the achievement of students in a sample of comparison schools within DCPS in which the principal was not replaced. Under this approach, we estimate how students in the schools with new principals would have performed in the absence of a leadership transition. Our analysis also accounts for students' prior achievement and background characteristics to address changes in the composition of students in a school over time.
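In its simplest two-period form, this design estimates the impact as

$$\hat{\delta} = \left(\bar{Y}^{\,\text{new principal}}_{\text{post}} - \bar{Y}^{\,\text{new principal}}_{\text{pre}}\right) - \left(\bar{Y}^{\,\text{same principal}}_{\text{post}} - \bar{Y}^{\,\text{same principal}}_{\text{pre}}\right),$$

where the second term nets out district-wide changes that affected all DCPS schools. Section 2 presents the full regression version with student-level controls.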

We also address other challenges in our analysis. For example, some of the fifty-one principal exits following the 2007–08 school year coincided with school closures and combinations. As a result, changes in achievement after replacements may reflect not only the impact of the change in leadership but also the impact of combining schools. In addition, we address challenges that arise from the possibility that the schools in our comparison sample differ importantly from schools with exiting principals. For example, DCPS may have selected principals for dismissal from schools with declining achievement trends. If so, then achievement trends in comparison schools may not reflect how achievement would have evolved in schools with replacements had DCPS not replaced any principals, and our difference-in-differences design would produce estimates that are confounded by the differences between the two groups of schools.

We found that new principals in DCPS schools led to significantly higher schoolwide achievement in reading. The average student's reading achievement in schools led by new principals increased by 4 percentile points, or 0.10 standard deviations, compared with how students would have achieved had DCPS not replaced their previous principals. New principals did not have immediate impacts on achievement—we found statistically significant impacts on reading achievement following new principals’ third year—but we found no evidence that student achievement declined after replacements, even temporarily. Although not as strong, the pattern was similar for math. For students in grades 6 to 8, the gains were larger and statistically significant in both subjects after two years; new principals improved achievement of the average sixth- to eighth-grade student by 8 percentile points (0.20 standard deviations) in math and 8 percentiles (0.21 standard deviations) in reading.

Previous Research

Several previous studies have attempted to measure the contributions of principals to student achievement. For this review, we focus on studies like ours that use individual student data and account for students’ prior test scores and other background characteristics when measuring these contributions. We omit studies that analyze school-level data without student-level statistical controls because these studies can misattribute to a new principal a change in student achievement that is caused by a change in student composition.

Most recent studies of principals’ impact on achievement do account for student background, and many do so by calculating school value added. School value added isolates the school's contribution to student achievement from the contributions of factors that are outside the control of the school, including the background characteristics of students. In addition to principal effectiveness, school value added may also measure the effectiveness of teachers in the school, contributions of school resources and facilities to achievement, and other school-level factors. By comparing a school's value added before and after a principal was replaced, this approach can isolate a principal's impact on achievement from other school-level factors (Chiang, Lipscomb, and Gill 2016).

Recent studies that compare a school's value added in the years before and after a principal transition have found that principals account for well under half of the differences in the level of student achievement across schools, with other school-level factors responsible for the remaining differences. Using data from Pennsylvania, Chiang, Lipscomb, and Gill (2016) found that principals are responsible for at most 25 percent of the school's contribution to student achievement. Results from studies in Miami-Dade County Public Schools (Grissom, Kalogrides, and Loeb 2015) and Texas (Branch, Hanushek, and Rivkin 2012) are consistent with a figure that is less than 15 percent.1 Based on our estimates of school value added in DCPS, if DCPS principals were responsible for 15 percent of the school's contribution to student achievement, then replacing a principal who is at the 16th percentile of effectiveness with an average principal—an improvement of one standard deviation—would improve the average student's achievement by 1 percentile point.2
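To see the arithmetic behind this figure, consider a rough reconstruction; the exact inputs in note 2 may differ. Table 3 reports a standard deviation of school value added of roughly 0.2 student-level standard deviations. If principals account for 15 percent of that dispersion, a one standard deviation improvement in principal effectiveness yields

$$0.15 \times 0.21 \approx 0.03 \text{ SD}, \qquad 100\,[\Phi(0.03) - \Phi(0)] \approx 1 \text{ percentile},$$

where $\Phi$ is the standard normal distribution function.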

Several studies have found that it may take a few years for a new principal to reach full impact in a new school. Using data from New York City, Clark, Martorell, and Rockoff (2009) found that new principals' contributions to student achievement improve by approximately 0.01 standard deviations between the principals' first and third years of experience. Coelli and Green (2012) studied principal transitions in British Columbia, Canada, and found that it may take three or more years for a new principal to reach full impact in a school—and this impact can be much larger than the average impact over the first few years.3 Two studies examined cumulative impacts of new principals on achievement over time. Dhuey and Smith (2014), who also studied British Columbia principals, found that three years of cumulative exposure to a new principal who is one standard deviation more effective can boost student scores by 0.4 standard deviations. Gates et al. (2014) studied outcomes of students in ten districts that recruited principals from New Leaders—a program designed to recruit, train, and support highly effective principals—and found that cumulative exposure to these principals over three years improved scores by approximately 0.03 standard deviations.4

Finally, Miller (2013) suggests that too much credit may be given to new principals if they were hired after a drop in the school's achievement under the previous principal. Using data from North Carolina, she found that although new principals improve over their first few years in a new school, after five years the new principal is only as effective as the previous principal's highest level of performance. Miller (2013) warns against attributing all of the post-transition gains to the new principal. Had the original principal instead been retained, the pre-transition drop in performance may have proven to be only temporary. Thus, some or all of the gains associated with the new principal might also have been achieved had no transition occurred.5

Our Contribution

This study makes two contributions to the previous research. First, all of the principal transitions we study in DCPS involved principals who left the district, many of whom DCPS targeted for replacement. In contrast, previous studies have focused on rotations between schools or other typical nonretention. Thus, the exiting principals in our study may be more likely to be low performers than those in previous studies. The DCPS replacements were also more likely to be new hires or promotions, although some replacement principals transferred from other schools that closed or were combined. Consequently, the impact of the new DCPS principals may differ from the impact of transitions in previously studied states and districts. Furthermore, to our knowledge this is the first study to examine the impact of a strategy of replacing poor-performing principals similar to the one precipitated by PERAA in DCPS.

Second, we provide evidence of the impact of a new principal on student achievement in each year up to four years following the previous principal's exit. We are also able to observe possible trends in achievement prior to the replacements, such as the declines that Miller (2013) warns could lead to overstating the impact of a new principal. Our eight-year panel of student achievement data allows us to investigate these patterns to understand whether post-transition impacts can be fully attributed to the impact of the new principal. We are not aware of any previous study using longitudinal student-level data that obtains such rich information about the timing of student achievement impacts from new principals.

2.  Empirical Approach

Difference-in-Differences Design

We use a difference-in-differences approach to estimate the impacts of the new principals on achievement. The change in achievement before and after the change in school leadership is the first difference in our design; the second difference is between this achievement trend and the trend over the same time period in a set of comparison schools that kept the same principal. In doing so, we also account for changes in the composition of students in schools with and without new principals.

Our approach uses a regression analysis to compare trends in math and reading achievement for schools with and without transitions. In addition to prior achievement and other characteristics of students, we use fixed effects to account for differences in achievement between schools that do not change over time, such as those that may be caused by differences in school resources or other school-level factors. We also account for changes in the overall average student achievement levels over time and across grades that may have arisen from other district-wide changes or DCPS policies.

Specifically, we estimate the following regression of students’ post-test scores on student background characteristics and variables that identify achievement trends for schools with new principals and comparison schools:
$$Y_{igst} = \sum_{j=-5}^{-1} \delta_j R_{st}^{j} + \sum_{j=1}^{4} \delta_j R_{st}^{j} + \mu_{gt} + \lambda_{1gt} S_{igt} + \lambda_{2gt} O_{igt} + \beta' X_{it} + \upsilon_{gs} + \varepsilon_{igst}. \qquad (1)$$

In this regression, Y is the post-test for student i in grade g, school s, and year t. The summation terms represent a set of indicators R for each year from five years before a replacement to four years after it, with associated coefficients δ. Whereas year t is a school year from 2005–06 through 2011–12, the index j represents a year relative to a replacement that may have occurred after any one of the 2007–08 through 2010–11 school years. The two summations exclude j = 0, the last school year the exiting principal led the school, so the coefficients on the remaining indicators measure changes relative to achievement in exiting principals' final year in DCPS.

The next term, μgt, is a set of indicators for each grade–year combination to account for differences in achievement levels over time and across grades that arise because of which students are included in the analysis sample. The variables S and O represent the same- and opposite-subject pre-tests with associated coefficient vectors λ1 and λ2. We estimated a separate pre-test coefficient for each grade and year. The vector X includes the other individual student background characteristics, and the coefficient vector β provides relationships between each characteristic and achievement that are constrained to be the same in every grade and year.

To account for characteristics of schools that do not change over time, including fixed differences between schools with new principals and comparison schools, we included indicators for each school-grade combination in υgs.6 The error term ɛ represents any other student-, school-, or year-specific factors. We account for heteroskedasticity and correlation of regression errors within schools. Finally, we weighted each record in the regression based on a dosage variable that gives less weight to each record for students who attended multiple schools during the year.
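As a concrete illustration, a specification like equation 1 could be estimated as follows. This is a minimal sketch: the data frame df and every column name in it are hypothetical stand-ins for the administrative data described in section 3, not the actual variable names or code used in our analysis.

```python
# A minimal sketch of estimating a specification like equation (1);
# all column names are hypothetical placeholders.
import statsmodels.formula.api as smf

# rel_year holds j, the year relative to a replacement (-5 through 4);
# it is coded 0 for comparison-school records, so they fall in the
# omitted baseline category along with exiting principals' final year.
formula = (
    "post_test ~ C(rel_year, Treatment(reference=0))"  # delta_j indicators
    " + C(grade):C(year)"                              # mu_gt grade-by-year effects
    " + C(grade):C(year):pretest_same"                 # lambda_1gt pre-test slopes
    " + C(grade):C(year):pretest_opposite"             # lambda_2gt
    " + black + hispanic + frl + ell + sped + female + midyear_transfer"  # beta'X
    " + C(school):C(grade)"                            # upsilon_gs fixed effects
)

# Dosage weights down-weight records of students who attended multiple schools.
model = smf.wls(formula, data=df, weights=df["dosage"])

# Cluster-robust standard errors at the school level.
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school"]})
print(result.params.filter(like="rel_year"))  # the delta_j impact estimates
```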

In our main specification, we compare all eligible schools with replacements to all eligible schools that kept the same principal. However, we also estimated a version of regression 1 using propensity score weights to construct a comparison group that was more similar to the group of schools with replacements based on value added from the 2005–06 and 2006–07 school years and demographics of students in the schools.
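The reweighting step might look like the following sketch, which assumes a school-level frame with one row per school and hypothetical column names for the baseline value-added and demographic measures; comparison schools receive standard odds weights so that they resemble the schools with replacements.

```python
# A sketch of propensity score weights for the comparison group;
# "schools" and its columns are hypothetical placeholders.
import numpy as np
import statsmodels.formula.api as smf

ps_fit = smf.logit(
    "replaced ~ va_math_0506 + va_math_0607 + va_read_0506 + va_read_0607"
    " + frac_black + frac_hispanic + frac_frl",
    data=schools,
).fit()
schools["pscore"] = ps_fit.predict(schools)

# Schools with replacements keep a weight of one; comparison schools get
# odds weights p/(1 - p), which target the effect on the treated schools.
schools["ps_weight"] = np.where(
    schools["replaced"] == 1,
    1.0,
    schools["pscore"] / (1.0 - schools["pscore"]),
)
```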

Impact estimates δ from the regression in equation 1 give the change in the gap in achievement between schools with and without new principals relative to a baseline year, the last school year the exiting principal led the school. Using this approach, we estimate the change in the gap for each of the four years following and the five years prior to a change in school leadership. For our primary method of estimating impacts, however, we use a simpler specification that excludes the pre-transition indicators, so the impacts measure the change in the gap relative to the average gap across the full pre-transition period. This approach has the potential to increase precision, although the choice is not consequential for our findings.

Limitations

Although our study makes important contributions to understanding the impact of new principals on student achievement that ultimately resulted from PERAA reforms, we acknowledge four limitations. First, our estimates of the impact of new principals could be confounded over time with other changes within schools that are not caused by the exits. In other words, our difference-in-differences strategy requires that we assume any unobserved determinants of student achievement that vary across schools do not also vary over time in a way that is related to whether schools did or did not have a principal replaced.

Although we account for changes in the composition of students within schools, DCPS may have implemented other changes in schools at the same time they were replacing principals. Our analysis accounts for these changes if they have the same impact on achievement in schools with and without new principals. However, some changes may have differentially affected achievement in these groups. Most notably, DCPS closed or combined many schools, with some of these school closings and combinations occurring simultaneously with the changes in school leadership. Özek, Hansen, and Gonzalez (2012) show that school closures in DC led to a temporary decline in achievement for affected students. Other policies that DCPS implemented in the years after PERAA may have affected student achievement in the schools with principal replacements. Changes to school resources that coincided with a change in school leadership may have also occurred. For example, DCPS may have provided new principals with additional resources to support the transition.

To address this first limitation, we conduct alternative versions of our primary analysis designed to address some of these threats to the interpretation of our findings. We conduct an analysis to address simultaneous school closures and exits, which is available in a separate online appendix.7 We also address the possibility that DCPS policies implemented in the years immediately following the enactment of PERAA had a differential impact in schools with replacements by conducting a version of our analysis that accounts for year effects in the gap between test scores in schools with and without replacements; this analysis is possible because the timing of the replacements in our data varies. Although these analyses address some threats to the interpretation of our findings, we cannot rule out the possibility that DCPS implemented other policies or provided resources that directly coincided with the timing of replacements, whenever they occurred. If the resources positively influenced achievement in these schools, our impact estimates would be too large, as they would conflate the impact of the new principal with the impact of the resources.

Second, although we examine principal turnover that in many cases is the direct result of an intentional policy of replacing ineffective principals with highly effective new principals, we cannot distinguish principals who left voluntarily from those who left DCPS involuntarily. Thus, we examine the impact of replacing principals who left DCPS voluntarily or otherwise. For example, Dee and Wyckoff (2015) found that the DCPS IMPACT evaluation system led more lower-performing teachers to exit the district even though they were not subject to dismissal under IMPACT. If DCPS principals voluntarily exit, our results provide an estimate of the impact on student achievement that DCPS achieved from replacements that occurred as a result of targeted dismissals, combined with the impact of more typical nonretention.

Third, our approach to estimating the impact of new principals requires that schools in the comparison group are unaffected by the policy of selectively replacing principals, but this may not be the case. Some replacement principals were drawn from comparison schools that were closed or combined. Movement of students out of closed comparison schools will necessarily lead to changes in the composition of students in other comparison and treatment schools. To help address concerns that the composition of students in comparison schools may change over time, we account for student background characteristics when measuring trends in the contributions of comparison school principals, just as we do for schools with changes in school leadership. However, there may be other ways in which comparison schools are affected by this human capital policy that we cannot address. For example, the threat of dismissals in DCPS may have incentivized principals to bring about higher achievement in their schools. If so, principals in comparison schools may have been retained in part because they responded to those incentives by improving student achievement. In this case, our impact estimates would be lower than they would be in the absence of any incentives. Alternatively, if the incentives affected both groups of principals similarly, our impact estimates would reflect only the effects of replacing principals and not the full impact of the policy including incentive effects. Consequently, the full impact of the principal dismissal strategy could be larger than our estimates suggest.

Finally, we note that the results of this paper may have limited external validity and may not generalize to a typical U.S. school district given the unique context of public schools in DC. Differences between DCPS and other districts include a large charter sector and a unique demographic student profile. Nearly 40 percent of DC public school students were enrolled in a charter school during the years examined in this study, which affects the composition of students attending the noncharter DCPS schools in our study. As shown in table A.1 in the online appendix, the composition of students in DCPS schools differed markedly from that of other urban Title I schools, including a higher proportion of black students and a lower proportion of students who are Hispanic. These and other differences may have affected the impact of new principals we measure. For example, as a result of these or other differences, DCPS may have been more or less able to identify and hire highly effective replacement principals.

3.  Data

We use administrative data provided by DCPS and the Office of the State Superintendent of Education of DC. The data include (1) a list of DCPS principals’ school assignments for each school year from 2000–01 through 2011–12, (2) student background characteristics, including math and reading test scores in grades 3 through 8 and 10 for the 2002–03 through 2011–12 school years, and (3) information on students’ school enrollment. Although our main analysis focuses on student outcomes in the seven school years from 2005–06 through 2011–12, we use data from 2000–01 through 2004–05 to construct a measure of principal experience and in some sensitivity analyses. We describe these data in more detail in the online appendix.

Our analysis divides schools into those in which DCPS replaced principals between the 2007–08 and 2010–11 school years and those in which it did not. The percentage of principals who left DCPS varied substantially over time, and some principals were forced out of jobs due to school combinations or closings. The annual turnover rate of DCPS principals varied between 14 and 39 percent from 2003–04 through 2010–11 (last row of table 1). The largest percentage of principals leaving DCPS occurred at the end of the 2007–08 school year, Michelle Rhee's first year as chancellor, when 39 percent of principals—fifty-one individuals—did not return to DCPS. Figure 1 shows the locations of each school included in our analysis, whether and when a principal was replaced, and the number of students in each school.

Table 1.
Principal Transitions in District of Columbia Public Schools (DCPS) by School Year and Status
Principal and School Status          2003–04  2004–05  2005–06  2006–07  2007–08  2008–09  2009–10  2010–11
Left DCPS
  School remained open                    17       32       24       18       33       22       29       21
  School combined                                                               9
  School closed                                                                 9
Stayed in DCPS
  School remained open                   102       87       95      112       71       85       77       84
  School combined                                                               3
  School closed                                                                 6
Total                                    119      119      122      132      131      109      107      106
Left DCPS (%)                             14       27       21       14       39       22       28       21

Notes: The table includes principals in schools with at least fifty tested students in grades 4 through 8 or in grade 10. The table describes transitions that occurred at the end of each school year.

Figure 1.

Principal Transitions in District of Columbia Public Schools (DCPS) by School Year and Size

Notes: The figure shows the location of each DCPS school that we included in our analysis sample, the number of students with test scores from each school, and the timing of the school's principal replacement (if any).

School restructuring in DCPS creates challenges for tracking student achievement over time in schools with and without changes in school leadership. For example, schools that closed do not have new principals. In most cases, the school of a departing principal remained open; however, as shown in rows 2 and 3 of table 1, these schools were sometimes closed or combined. The principal exits from 2007–08 coincided with substantial restructuring of the schools: nine of the schools with a departing principal combined with another school, and nine other schools with departing principals closed. The next rows of table 1 show that school closures and combinations also affected some returning principals. Again taking the 2007–08 school year as an example, the principals in six schools that closed transferred to a different school in DCPS, and three principals continued leading their school after it was combined with one of the nine combining schools led by a departing principal.8

We measure the impacts of replacing principals in DCPS using student test scores in math and reading. The test scores are from SAT-9 tests administered from spring 2003 through spring 2005 and from the DC Comprehensive Assessment System (DC CAS) administered in each of the seven subsequent springs. We standardized the test scores to have a mean of zero and standard deviation of one within each combination of grade, year, and subject. This step translated math and reading test scores in every grade and year into a common metric; the DC CAS scores otherwise would not be comparable across these groups (i.e., they are not “vertically aligned”). Standardizing the scores means that we cannot track DC-wide changes in achievement levels over time, but that is not a goal of our analyses. Instead, we compare trends in achievement between students in schools with and without principal transitions.9
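Concretely, the standardization is a z-score within each grade-year-subject cell; a minimal pandas sketch, with hypothetical column names, is:

```python
import pandas as pd

def standardize_scores(scores: pd.DataFrame) -> pd.DataFrame:
    # z-score each test score within its grade-by-year-by-subject cell
    cell = scores.groupby(["grade", "year", "subject"])["score"]
    return scores.assign(
        z_score=(scores["score"] - cell.transform("mean")) / cell.transform("std")
    )
```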

Because of concerns with the accuracy and completeness of the SAT-9 test scores, we did not use these scores as outcomes for our main analyses. For analysis that did include the SAT-9 test scores, we excluded scores from grades 4, 6, and 7 in the 2004–05 school year because we obtained relatively few test scores for students in those grades. Because we account for pre-test scores from the previous year in our analysis, excluding these SAT-9 scores also meant that we excluded from our analysis all students in grades 5, 7, and 8 in the 2005–06 school year.

To account for student background, we used indicators for race/ethnicity categories, subsidized meals eligibility, English language learner status, receipt of special education services, gender, and whether a student transferred between schools during the year. Individual student data on subsidized meals eligibility are lacking for students attending a community-eligible school because these schools do not collect annual information about individual student poverty status.10 Beginning in the 2005–06 school year, for students who attended community-eligible schools, we used a subsidized meals status for the student from another school or year, when available, and otherwise marked students in those schools as eligible for free lunch.11
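The imputation rule can be sketched as follows; the column names, and the choice to use a student's most recent observed status, are illustrative assumptions rather than our exact procedure.

```python
# A sketch of the subsidized-meals imputation described above;
# column names and the "most recent status" rule are assumptions.
import pandas as pd

def impute_meals_status(df: pd.DataFrame) -> pd.DataFrame:
    # Most recent observed status from school-years that collect individual data
    observed = (
        df.loc[~df["community_eligible"]]
        .sort_values("year")
        .groupby("student_id")["frl"]
        .last()
    )
    # For student-years in community-eligible schools, borrow that status;
    # students with no observed status default to free-lunch eligible (1).
    mask = df["community_eligible"]
    df.loc[mask, "frl"] = df.loc[mask, "student_id"].map(observed).fillna(1)
    return df
```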

We impose several restrictions on the students and schools included in the analysis. We include in our main analysis students in each of the seven school years from 2005–06 through 2011–12 who are linked to at least one DCPS school for which we have identified a principal in that year. We also require that students have both a post-test and a pre-test in the same subject. For students in grades 4 through 8, the pre-test was from the previous grade and year. For students in grade 10, the pre-test was from grade 8, two years prior to the post-test. We then excluded twelve schools with new principals and six schools without changes in school leadership because of possibly compromised test scores in those schools. These schools were identified in a USA Today report as ones where at least half of tested classrooms showed evidence of cheating in at least one of the 2007–08, 2008–09, or 2009–10 school years. Tests were flagged by the DC test score publisher if they had high rates of incorrect answers that were erased and replaced with correct answers (USA Today 2011).12 As a final step, we excluded schools that were missing from any of the seven years in the panel.13,14,15 Although we include closed schools in some of our analyses, this restriction to our primary analysis sample removes all twenty schools that closed before the 2011–12 school year. The final analysis sample retains 88 percent of students who have post-tests from one or more of the seven school years.

Our analysis focuses on the principals who left DCPS in the years following the enactment of the PERAA school reform legislation and on their replacements. In table 2, we provide counts of the new principals included in our analysis in each of the four school years following PERAA and identify whether the new principals had previously led a DCPS school. Of the thirty-two new principals who replaced a 2007–08 principal, twenty-three were not leading a DCPS school in the previous year. The remaining nine new principals either led a different DCPS school during the 2007–08 school year or their previous school was combined with an exiting principal's school. Prior to assuming leadership of a school, new principals may have been teaching or in administration within DCPS, or may have been recruited from outside DCPS. The counts of new principals in the last row of table 2 are lower than the total number of transitions in table 1 because of the restrictions we made to the analysis sample and because some schools had multiple post-PERAA changes in school leadership but are counted only once in table 2. For the twenty-one schools with multiple transitions, we include only the first new principal following PERAA. In doing so, we treat the subsequent replacements as a consequence of the first post-PERAA replacement.

Table 2.
New Principals After Public Education Reform Amendment Act (PERAA) by School Year and Status
                                     Last School Year Departing Principal Led School
New Principal Status                 2007–08   2008–09   2009–10   2010–11
New principal is:
  Not previously a DCPS principal         23
  From another DCPS school                 9
New principals in analysis sample         32

Notes: The table includes schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined, so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Counts of new principals include only the first replacement year for the twenty-one schools with multiple post-PERAA replacements. Of the thirty-two schools with replacements for departing principals from the 2007–08 school year, seventeen have had at least one subsequent replacement. One new principal from the 2010–11 school year began leading two schools, so there are fifty-three new principals in our analysis but the total number of schools with new principals is fifty-four. DCPS = District of Columbia Public Schools.

The schools in our analysis sample have slightly higher average achievement than DCPS schools generally. Table 3 provides averages and standard deviations of school characteristics. The average levels of student achievement in the analysis sample can differ from zero because we standardized test scores using all students with test scores, not only those used in our analysis. As rows 1 and 2 show, schools in our analysis sample have achievement that is higher than the DCPS average by 0.04 standard deviations in both math and reading.

Table 3.
Characteristics of District of Columbia Public Schools (DCPS)
School Characteristic                                                          Average   Standard Deviation
(1) Average math achievement (standard deviations of student achievement)        0.04                 0.52
(2) Average reading achievement (standard deviations of student achievement)     0.04                 0.50
(3) Math value added (standard deviations of student achievement)                0.00                 0.21
(4) Reading value added (standard deviations of student achievement)             0.00                 0.18
(5) Fraction of students eligible for free or reduced-price lunch                0.68                 0.25
(6) Fraction of students that are English language learners                      0.08                 0.14
(7) Fraction of students that receive special education services                 0.17                 0.08
(8) Fraction of students that are black                                          0.80                 0.27
(9) Fraction of students that are Hispanic                                       0.11                 0.18

Notes: The table includes schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined, so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Averages and standard deviations were calculated with one observation per school-year combination and are not weighted. Math and reading achievement is standardized to have an average of zero and a standard deviation of one within each grade, year, and subject among all DCPS students. Value added is measured in student-level standard deviations of math or reading achievement and has an average of zero within each year and subject among all DCPS schools. The table treats schools that are combined as distinct schools prior to being combined, and as a single school after being combined. The sample includes 82 schools, of which 12 were involved in six combinations. The averages include a total of 543 school-year records from the seven school years.

Although students in included schools have slightly higher test scores than students in schools overall, the included schools were no more effective at raising student achievement than excluded schools. To measure school effectiveness, we used value added to student achievement, a measure of the contribution of school-level factors that includes but is not limited to principals.16 The average math and reading value-added estimates for schools in DCPS are zero by definition. Thus, rows 3 and 4 of table 3 indicate that the schools included in the analysis sample are representative of the average value added of all schools in DCPS. Finally, rows 5 through 9 of table 3 show average student characteristics. For example, in the average school, 68 percent of students in the sample are eligible for free or reduced-price lunch and 80 percent are black.

Compared with principals who did not leave, those who left DCPS at the end of the 2007–08 school year had lower-achieving students in math and reading in the year of the exit and had lower school value-added estimates, indicating that, on balance, the schools in which principals exited (voluntarily or otherwise) were not as effective at raising student achievement as schools in which principals were retained. Additionally, returning principals led schools with a smaller proportion of students eligible for free or reduced-price lunch than exiting principals. Other characteristics of principals' students did not significantly differ for principals returning versus exiting after the 2007–08 school year, and principals who left did not have significantly more or less experience leading schools in DCPS than returning principals. When pooling the 2008–09, 2009–10, and 2010–11 school years, we found no statistically significant differences between student characteristics in schools with principals returning versus exiting after these years, but exiting principals were less likely to have two to five years of experience. We present these differences in table 4.

Table 4.
Average Characteristics of Returning and Exiting Principals by Time Period
                                               Principals from 2007–08     Principals from 2008–09 through 2010–11
School or Principal Characteristic             Returning    Exiting         Returning    Exiting
Average achievement (standard deviations of student achievement)
  Math                                              0.31     −0.12*             0.22       0.02
  Reading                                           0.28     −0.12*             0.17       0.05
Value added (standard deviations of student achievement)
  Math                                              0.08     −0.04*             0.06      −0.04
  Reading                                           0.07     −0.06*             0.02      −0.01
Experience leading a DCPS school
  One year                                          0.18      0.19              0.02       0.09
  Two to five years                                 0.55      0.56              0.56       0.23*
  Six or more years                                 0.27      0.25              0.42       0.68
Fraction of students who are:
  Eligible for free or reduced-price lunch          0.58      0.71*             0.66       0.67
  English Language Learners                         0.06      0.09              0.07       0.07
  Special education                                 0.14      0.18              0.15       0.16
  Black                                             0.85      0.80              0.77       0.79
  Hispanic                                          0.08      0.12              0.10       0.09
Number of schools                                     22        32                22         22

Notes: The table includes schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined, so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Averages and standard deviations were calculated with one observation per school-year combination and are not weighted. Math and reading achievement is standardized to have an average of zero and a standard deviation of one within each grade, year, and subject among all District of Columbia Public Schools (DCPS) students. Value added is measured in student-level standard deviations of math or reading achievement and has an average of zero within each year and subject among all DCPS schools. Principals in schools with multiple replacements between the 2007–08 and 2010–11 school years are counted as exiting only for the first of these replacements, and are not included in the averages in the subsequent years. Returning principals include only principals who were not replaced between the 2007–08 and 2010–11 school years.

*Statistically significant at the 5 percent level.

Although the schools with returning principals—the twenty-two comparison schools in the analysis—are higher-achieving than schools with new principals, this is not necessarily a concern. The difference-in-differences research design accounts for differences in average characteristics between the two groups of schools so long as a key assumption holds. We assume that the difference in achievement between schools with and without new principals before the replacements occur would be the same as the difference in the years following the replacements in the hypothetical case that no replacements actually occurred. In other words, we allow a pre-transition gap in achievement between schools with and without new principals, but we assume that the trend in schools without new principals represents how achievement would have evolved in schools with changes in school leadership had DCPS not replaced any principals. As with similar assumptions for all research designs that rely on nonexperimental methods, this assumption is not directly testable because we do not observe the hypothetical case of no changes in school leadership. However, we provide some important evidence in support of this assumption by testing for differences in achievement trends in the two groups of schools prior to the changes in school leadership.

4.  Results

Impact of Post-PERAA New Principals

Impact on Math and Reading Achievement

New principals produced higher reading achievement after three years in the school compared with the level of achievement prior to the change in school leadership. We found positive but insignificant impacts on math achievement. Figure 2 shows the estimates of impacts on math achievement for replacements that occurred between the 2007–08 and 2010–11 school years. To implement our difference-in-differences research design, we analyze changes in the gap in student achievement between schools with and without new principals. All impact estimates in the figure are measured as changes in the gap in math achievement between schools with and without replacements from the gap that was present during the year of the replacement. This baseline gap in achievement is shown as a single dot at 0.0 standard deviations in the final school year before the transition occurred (called “year 0”). We measure changes in the gap using student-level standard deviations of achievement. The gap in the year immediately following the replacement (year 1) is nearly identical to the baseline gap, indicating that new principals had no impact on math achievement after one year. However, the point estimate is larger in year 2, indicating that math achievement in schools with new principals improved relative to schools that kept the same principal after two years with the new principal. This positive impact of 0.04 standard deviations in year 2 is not statistically significant (the confidence interval crosses the dashed line at 0.0). Although also not statistically significant in years 3 and 4, the impact estimate is 0.07 standard deviations in both of these years, suggesting that a higher level of achievement may have been sustained in these schools through the fourth year with the new principal. An impact of 0.07 standard deviations is equivalent to improving the average student's performance by 3 percentiles.

Figure 2.

Impact of New Principals on Math Achievement by Year Relative to Replacement

Notes: The figure includes 76 schools, of which 54 had post-PERAA (Public Education Reform Amendment Act) new principals. The figure includes schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes 18 schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Impacts are measured relative to outcomes in year zero, the last year exiting principals led their schools. Outcomes for the seven school years are District of Columbia Comprehensive Assessment System scores. Confidence intervals are based on standard errors that are clustered at the school level.

Figure 3 shows the same impact estimates for reading. As with math, we find no impact of the new principals on reading achievement in the first year after a change in school leadership. The impact on reading achievement is 0.05 standard deviations in year 2, 0.09 in year 3, and 0.10 in year 4, and each of these estimates is statistically significant when using the specification shown in panel A of table 5. The specification used in table 5 obtains a small increase in precision by measuring the impacts relative to the gap in outcomes from zero to five years prior to exiting principals' last year in DCPS. However, the figure uses only year 0 as the reference point to allow us to estimate the gaps in each of the remaining pre-transition years.

Figure 3.

Impact of New Principals on Reading Achievement by Year Relative to Replacement

Notes: The figure includes 76 schools, of which 54 had post-PERAA (Public Education Reform Amendment Act) new principals. The figure includes schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes 18 schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Impacts are measured relative to outcomes in year zero, the last year exiting principals led their schools. Outcomes for the seven school years are District of Columbia Comprehensive Assessment System scores. Confidence intervals are based on standard errors that are clustered at the school level.

Table 5.
Impact of New Principals on Math and Reading Achievement Overall and by Subgroup
                      Impact by Year Since Replacement (standard deviations of student achievement)
Subject               Year 1     Year 2     Year 3     Year 4

Panel A: All Schools (54 schools with new principals, 22 comparison schools)
Math                    0.00       0.04       0.07       0.07
                       (0.03)     (0.03)     (0.04)     (0.05)
Reading                 0.01       0.05*      0.09*      0.10*
                       (0.02)     (0.03)     (0.03)     (0.04)

Panel B: Grades 4 and 5 (41 schools with new principals, 17 comparison schools)
Math                    0.04       0.04       0.02       0.05
                       (0.04)     (0.05)     (0.05)     (0.06)
Reading                 0.04       0.04       0.05       0.08*
                       (0.03)     (0.04)     (0.04)     (0.05)

Panel C: Grades 6 to 8 (45 schools with new principals, 13 comparison schools)
Math                    0.02       0.13*      0.23*      0.20*
                       (0.05)     (0.06)     (0.08)     (0.08)
Reading                 0.01       0.12*      0.20*      0.21*
                       (0.03)     (0.03)     (0.04)     (0.04)

Panel D: Higher-Achieving Students (54 schools with new principals, 22 comparison schools)
Math                   −0.01       0.04       0.07       0.10
                       (0.03)     (0.04)     (0.04)     (0.05)
Reading                 0.00       0.02       0.09*      0.09*
                       (0.02)     (0.02)     (0.03)     (0.03)

Panel E: Lower-Achieving Students (54 schools with new principals, 22 comparison schools)
Math                    0.02       0.06       0.08       0.04
                       (0.03)     (0.04)     (0.06)     (0.06)
Reading                 0.02       0.09*      0.10*      0.12*
                       (0.03)     (0.03)     (0.05)     (0.05)

Panel F: Three or Fewer Years of Experience in DCPS (20 schools with new principals, 22 comparison schools)
Math                    0.04       0.07       0.07       0.07
                       (0.05)     (0.05)     (0.06)     (0.07)
Reading                 0.06       0.09*      0.12*      0.11*
                       (0.03)     (0.03)     (0.04)     (0.05)

Panel G: Four or More Years of Experience in DCPS (34 schools with new principals, 22 comparison schools)
Math                   −0.01       0.02       0.05       0.06
                       (0.03)     (0.04)     (0.05)     (0.06)
Reading                −0.01       0.03       0.06       0.10*
                       (0.03)     (0.04)     (0.03)     (0.04)

Notes: The table includes only schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes 18 schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined, so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Impacts are measured relative to outcomes from zero to five years prior to exiting principals' last year in District of Columbia Public Schools, where year 0 is the last year exiting principals led their schools. Outcomes for the seven school years are District of Columbia Comprehensive Assessment System scores. Standard errors are clustered at the school level.

*Statistically significant at the 5 percent level.

Impacts of 0.07 to 0.10 standard deviations of student-level achievement are equivalent to an increase in student achievement of between 3 and 4 percentiles for an average student. Impacts of this magnitude are consistent with new principals who are about two to three standard deviations more effective than the principals they replaced (Branch, Hanushek, and Rivkin 2012; Grissom, Kalogrides, and Loeb 2015; Chiang, Lipscomb, and Gill 2016).17 Gains of these magnitudes would be expected when replacing a principal in the bottom 5 percent of the distribution of DCPS principals with one who is in the middle of the distribution. Furthermore, the improvement in reading was large enough to have increased the proficiency rate in affected schools during the 2006–07 school year from 36 to 43 percent.18
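These percentile conversions follow from treating achievement as approximately standard normal: an impact of $k$ student-level standard deviations moves the average student from the 50th percentile to about $100\,\Phi(k)$, so

$$100\,[\Phi(0.07) - \Phi(0)] \approx 3 \text{ percentiles}, \qquad 100\,[\Phi(0.10) - \Phi(0)] \approx 4 \text{ percentiles}.$$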

Because we observe outcomes only through the 2011–12 school year, the year 4 impact estimates are based only on principal replacements from the 2007–08 school year. However, the pattern of impacts appears similar for each cohort of exits. For example, when we examined impacts only for the first cohort of replacement principals, we obtained estimates that were larger in the first year but otherwise very similar. In the online appendix we report impact estimates separately by cohort of new principals.

Pre-Transition Gaps in Achievement

A key assumption underlying the difference-in-differences research design is that outcomes for the two groups of schools would have trended similarly had DCPS not replaced any principals. This assumption cannot be tested directly because we cannot know how achievement in schools with new principals would have evolved had the original principals remained in their schools. However, differences in the trends before the transition—such as a widening or narrowing of the gap leading up to the transition—would be evidence that trends for the two groups of schools would also have differed after the replacements had DCPS not replaced any principals.

We do not find strong evidence of a pre-transition trend in the gaps in achievement between schools with and without replacements, suggesting that our analysis adequately accounts for the selection of principals for replacement. None of the pre-transition impacts in math or reading is statistically significant—the confidence intervals for pre-transition years in figures 2 and 3 all overlap zero. Furthermore, there is no evidence of a systematic widening or narrowing of the gaps in math or reading leading up to the year in which a principal replacement occurred, in contrast to Miller's (2013) findings for principal transitions in North Carolina. The gap in the year immediately prior to the replacement is nearly identical to the baseline gap. Although the “impacts” on achievement in the second and third years prior to the transitions are larger than those in the years closest to the transition (though not statistically significantly so), they are not part of an overall downward trend; the point estimates in the earliest pre-transition years are lower.

Even so, because the pre-intervention impacts are imprecise, it is possible that achievement declined in schools leading up to a change in school leadership, so we consider the possibility that principals who were replaced were simply unlucky in riding a downward trend in test scores in the years leading up to their departure. We conduct a conservative analysis to account for this possibility, described in section 5.

Impact of New Principals for Subgroups of Students and Schools

We also estimate the impact of new principals for subgroups of students and schools, including grade spans, higher- and lower-achieving students, and more- and less-experienced principals. Impacts could be larger for students in higher grades if discipline and school culture policies give principals more influence on student achievement in those grades. Impacts could be larger for lower-achieving students if these students are more sensitive to changes in leadership. Principals with less experience may be less effective, which would lead to larger impact estimates for that subgroup. However, there are many reasons besides these that results could differ between subgroups, and because of the number of subgroups we examine, we might obtain different results for one or more of them by chance alone. Consequently, the differences should be considered only suggestive of which groups might benefit more from new principals. In any case, none of the differences in impact estimates between these groups is statistically significant.

Results by Grade Span

We found larger impacts of new principals for students in grades 6 to 8, compared with students in grades 4 and 5.19 In the third year after a replacement—the first year in which we found significant impacts for the full sample—we found statistically insignificant impacts for students in grades 4 or 5 of 0.02 standard deviations in math and 0.06 standard deviations in reading (panel B of table 5). For students in grades 6 to 8, the impact after three years with the new principal was 0.23 standard deviations in math and 0.20 standard deviations in reading (panel C of table 5). These impacts for students in grades 6 to 8 are equivalent to an increase in student achievement of between 8 and 9 percentiles for an average student. The improvements after three years were large enough to have increased the proficiency rate in affected middle schools during the 2006–07 school year from 36 to 51 percent in reading, and from 28 to 40 percent in math.
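The proficiency calculation (see note 18) applies the estimated impact to baseline scores and recomputes the share of students clearing the proficiency cutoff. A minimal sketch, assuming hypothetical data in which scores and grade-specific cutoffs are both expressed in student-level standard deviation units:

    import pandas as pd

    # Hypothetical 2006-07 student file (names are ours): each row has a score
    # and the proficiency cutoff for the student's grade, both expressed in
    # student-level standard deviation units.
    df = pd.read_csv("dccas_2007_grades6to8_reading.csv")

    def proficiency_rate(scores, cutoffs):
        """Percent of students at or above their grade's proficiency cutoff."""
        return 100 * (scores >= cutoffs).mean()

    baseline = proficiency_rate(df["score"], df["cutoff"])

    # Shift every score by the year-3 reading impact for grades 6-8 (0.20 SD)
    # and recompute the share of students clearing the cutoff.
    counterfactual = proficiency_rate(df["score"] + 0.20, df["cutoff"])

    print(f"reading proficiency: {baseline:.0f}% -> {counterfactual:.0f}%")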

Results for Higher- and Lower-Achieving Students

We did not find strong evidence that the impact of new principals differed for higher- and lower-achieving students. We define higher-achieving students as those who scored above the district-wide average score within a grade and year, and lower-achieving students as those who scored below that same average. The impact three years after a replacement for higher-achieving students was 0.07 standard deviations for math and 0.09 for reading (panel D of table 5). For lower-achieving students, these impacts after three years with a new principal were 0.08 standard deviations for math and 0.10 for reading (panel E of table 5). For both groups, the estimate was statistically significant for reading but not for math.
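A sketch of this subgroup split, assuming a hypothetical student-level file and column names of our choosing:

    import pandas as pd

    # Hypothetical student-level file (names are ours): one row per
    # student-year with grade, year, and a standardized test score.
    df = pd.read_csv("dcps_students.csv")

    # Classify each student relative to the district-wide average score within
    # the same grade and year, as in the subgroup definition above.
    grade_year_mean = df.groupby(["grade", "year"])["score"].transform("mean")
    df["subgroup"] = (df["score"] > grade_year_mean).map(
        {True: "higher-achieving", False: "lower-achieving"}
    )

    # Impacts are then estimated separately on each subgroup's rows.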

Results for Principals with More and Less Experience

Finally, we estimated the impact of replacing more- and less-experienced principals. We define less-experienced principals as those with three or fewer years of experience leading schools in DCPS at the time they were replaced.20 By this definition, schools with less-experienced principals had recently experienced a previous transition. Consequently, a higher impact of replacing a less-experienced principal compared to a more-experienced principal could result from the less-experienced principal having been less effective, from lower achievement in the school as a result of a recent transition in leadership, or from both. Our analysis cannot distinguish these possibilities.

Three years after a replacement, we found larger impact estimates from replacing the less-experienced principals. The impact three years after a replacement for less-experienced principals was 0.07 standard deviations for math and a statistically significant 0.12 for reading (panel F of table 5). For more-experienced principals, this year-3 impact was a statistically insignificant 0.05 standard deviations for math and 0.06 for reading (panel G of table 5). However, four years after a replacement, the impacts were more similar for the two principal experience subgroups.

5.  Specification and Robustness Checks

We conducted four sets of sensitivity analyses to understand how our results could be affected by (1) school closures and combinations, (2) the possibility that other DCPS policies led to impacts on student achievement in the schools that received new principals, (3) choices we made about how to conduct the analysis and select the sample, and (4) a transitory decrease in student performance in schools that received new principals. We discuss the results of our analyses to address (1) and (2) in the online appendix, and address (3) and (4) in the following discussions. In short, we do not find evidence that any of these factors substantially alter the interpretation of our findings.

Accounting for a Possible Pre-Transition Decrease in Student Performance

Using pre-transition data, our analyses can address the possibility that some of the change in the gap between schools with and without new principals following the transition may have occurred even if the exiting principal had remained in the school (Miller 2013). Because we compare the gap in achievement gains each year to the gap in the last year exiting principals led their schools, our difference-in-differences impact estimates from section 4 could attribute the entire decrease in the gap between 2008 and 2009 to a new principal. However, the gaps in 2009 through 2012 could have simply reverted to the mean level that prevailed prior to 2008. Thus, our difference-in-differences estimates cannot distinguish between two possible explanations: (1) the original principal became worse over time leading up to the replacement, or (2) the original principal was “unlucky” to experience a temporary downward trend in performance that caused DCPS to remove the principal, even though this trend was unrelated to the principal's effectiveness. The second possibility might occur because achievement based on standardized tests includes some measurement error; consequently, standardized test scores can fluctuate even in a school with no actual changes in the skills of students over time. Under this explanation, the downward pre-transition trend in performance is not due to the quality of the original principal and would have rebounded had the principal remained in the school. In that case, unlike under the first explanation, the post-transition impacts should not be attributed to the new principal, since the exiting principal would have achieved the same result had he or she remained in the school.21

Our panel of student achievement data allows us to investigate these patterns to understand whether post-transition impacts can be fully attributed to the impact of the new principal. We address this concern by comparing post-transition impacts to the baseline level of achievement from before a possible decrease in achievement. Table 6 contrasts the results from using the last year exiting principals led their schools (year 0) as the baseline, as in our main analysis (panel A), with an alternative approach that uses achievement from two to five years prior to the exiting principals’ last year in DCPS (panel B). The results in the year 0 column of panel B indicate that the gap in achievement between schools with and without new principals decreased by 0.03 standard deviations in math and reading before the transition occurred. This decrease is not statistically significant. As a consequence of this pre-transition decrease in achievement from the baseline years, the post-transition impacts that account for the decrease are smaller and lose statistical significance.
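Mechanically, the comparison in table 6 amounts to re-centering the estimated gaps on a different baseline period. A minimal sketch with illustrative gap values of our choosing (not the paper's estimates):

    # Illustrative event-time gaps (in student-level SDs) relative to year 0,
    # the exiting principal's last year. These values are ours, chosen only to
    # mimic the pattern in figures 2 and 3; they are not the paper's estimates.
    gaps = {-5: 0.01, -4: 0.02, -3: 0.05, -2: 0.05, -1: 0.00,
            0: 0.00, 1: 0.01, 2: 0.05, 3: 0.09, 4: 0.10}

    # Panel A measures impacts against the mean gap over years 0 to -5;
    # panel B uses only years -2 to -5, netting out a dip concentrated in the
    # final pre-transition years.
    baseline_a = sum(gaps[t] for t in range(-5, 1)) / 6
    baseline_b = sum(gaps[t] for t in range(-5, -1)) / 4

    for year in range(1, 5):
        print(f"year {year}: panel A {gaps[year] - baseline_a:+.2f}, "
              f"panel B {gaps[year] - baseline_b:+.2f}")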

Table 6.
Impact of New Principals With and Without Accounting for Possible Pre-Transition Decline in Achievement
Impact by Year Since Replacement (standard deviations of student achievement)

Subject     Year 0    Year 1    Year 2    Year 3    Year 4

Panel A: Not Accounting for Pre-Transition Decline
(impact relative to 0 to 5 years prior to exiting principals’ last year in DCPS)
Math        NA        0.00      0.04      0.07      0.07
                      (0.03)    (0.03)    (0.04)    (0.05)
Reading     NA        0.01      0.05*     0.09*     0.10*
                      (0.02)    (0.03)    (0.03)    (0.04)

Panel B: Accounting for Pre-Transition Decline
(impact relative to 2 to 5 years prior to exiting principals’ last year in DCPS)
Math        −0.03     −0.03     0.01      0.03      0.03
            (0.03)    (0.04)    (0.05)    (0.06)    (0.06)
Reading     −0.03     −0.02     0.02      0.06      0.07
            (0.02)    (0.03)    (0.03)    (0.04)    (0.04)

Notes: The table includes 76 schools, of which 54 had post-PERAA (Public Education Reform Amendment Act) principal transitions. The table includes only schools observed in each of the seven school years from 2005–06 through 2011–12, but excludes 18 schools where likely cheating occurred. Schools that combined are treated as the same school before and after they combined, so they can be included in this sample. Schools that closed prior to the 2011–12 school year are not included. Standard errors are clustered at the school level. NA = not applicable; DCPS = District of Columbia Public Schools.

*Statistically significant at the 5 percent level.

These lower alternative impact estimates likely understate the true impact. The evidence of a pre-transition decrease in achievement is based largely on positive gaps observed in two pre-transition years (the second and third years prior to the exiting principals’ last year), rather than a systematic trend downwards (figures 2 and 3). If, instead, we were to estimate the impact using achievement from three to five years prior to the exiting principals’ last year in DCPS, the impact estimates would have changed less from those in panel A of table 6 and might even have increased. Indeed, when we include achievement from additional pre-transition years, accounting for a pre-transition decrease leads to results that are more similar to those that do not account for the possible pre-transition decrease (this specification is reported in the online appendix). However, we caution that the quality of test scores from these additional pre-transition years may not be as high as those from the later years.

Alternative Estimates of the Impact of New Principals

In this section, we provide estimates of the impact of new principals based on (1) a nine-year panel of schools instead of the seven-year panel used for the estimates in section 4, (2) weights to make the schools with and without new principals in our analysis more similar, and (3) including schools that were identified by a report in USA Today as having incidences of cheating on assessments (USA Today 2011).

Results Based on a Nine-Year Panel

We excluded outcomes from the 2003–04 and 2004–05 school years from our main analysis because our data on the SAT-9 scores from these years are incomplete. In the nine-year panel, we included scores from these years. Doing so could provide a more complete picture of how impacts evolved over time, including possible trends in pre-transition impacts. One main difference between the seven- and nine-year panels is that the latter excludes grade 10 students because we did not observe a pre-test for them in the 2003–04 school year. The results using the nine-year panel are similar to our main results from the seven-year panel (shown in panels A and B of table 7).

Table 7.
Alternative Estimates of the Impact of New Principals on Math and Reading Achievement
Impact by Year Since Replacement (standard deviations of student achievement)

Subject     Year 1    Year 2    Year 3    Year 4

Panel A: Main Results Using the Seven-Year Panel
(54 schools with new principals, 22 comparison schools)
Math        0.00      0.04      0.07      0.07
            (0.03)    (0.03)    (0.04)    (0.05)
Reading     0.01      0.05*     0.09*     0.10*
            (0.02)    (0.03)    (0.03)    (0.04)

Panel B: Nine-Year Panel
(41 schools with new principals, 16 comparison schools)
Math        0.01      0.04      0.06      0.11*
            (0.04)    (0.04)    (0.05)    (0.05)
Reading     0.04      0.06      0.10*     0.15*
            (0.03)    (0.03)    (0.03)    (0.04)

Panel C: Propensity Score Weights
(54 schools with new principals, 22 comparison schools)
Math        0.01      0.04      0.05      0.03
            (0.03)    (0.05)    (0.05)    (0.04)
Reading     0.01      0.04      0.08*     0.10*
            (0.02)    (0.03)    (0.03)    (0.03)

Panel D: Including Schools with Possible Cheating
(66 schools with new principals, 28 comparison schools)
Math        −0.02     0.05      0.05      0.05
            (0.03)    (0.03)    (0.04)    (0.05)
Reading     −0.01     0.05      0.07*     0.09*
            (0.03)    (0.03)    (0.03)    (0.04)

Notes: Panel A includes schools observed in each of the seven school years from 2005–06 through 2011–12. Panel B includes schools observed in each of the nine school years from 2003–04 through 2011–12. Schools that combined are treated as the same school before and after they combined so they can be included in these samples. Schools where likely cheating occurred are excluded except in panel D. The propensity score specification places more weight on comparison schools that are more similar to those with new principals based on value added from the 2005–06 and 2006–07 school years and demographics of students in the schools. Impacts are measured relative to outcomes from 0 to 5 years prior to exiting principals’ last year in District of Columbia Public Schools (DCPS), where year 0 is the last year exiting principals led their schools, except for panel B in which impacts are measured relative to outcomes from 0 to 7 years prior to exiting principals’ last year in DCPS. Outcomes for the nine-year panel are SAT-9 scores for the 2003–04 and 2004–05 school years and District of Columbia Comprehensive Assessment System scores for 2005–06 through 2011–12. Standard errors are clustered at the school level.

* Statistically significant at the 5 percent level.

Results Based on Propensity Score Weights

Using matching or weighting to construct groups that are more similar has the potential to reduce bias in our impact estimates (St.Clair, Hallberg, and Cook 2016). We estimated impacts using propensity score weights to construct a comparison group more similar to the group of schools with new principals, based on value added from the 2005–06 and 2006–07 school years and the demographics of students in the schools. Comparison schools that more closely resembled the schools with new principals received more weight in the analysis, as did schools with new principals that more closely resembled the comparison schools. Results from these weighted specifications are similar to our main results for reading; for math they are smaller and not statistically significant (panel C of table 7).
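The description above is consistent with overlap-style weights, in which treated schools are weighted by one minus the propensity score and comparison schools by the propensity score; because the paper does not spell out the exact construction, the following sketch, with hypothetical column names, is only one plausible implementation.

    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Hypothetical school-level file (names are ours): treated = 1 if the
    # school received a new principal, plus pre-period value added and
    # student demographics used to model the probability of treatment.
    schools = pd.read_csv("dcps_schools.csv")
    covariates = ["va_math_2006", "va_reading_2006", "va_math_2007",
                  "va_reading_2007", "pct_free_lunch", "pct_special_ed"]

    # Estimate each school's propensity to receive a new principal.
    logit = LogisticRegression(max_iter=1000)
    logit.fit(schools[covariates], schools["treated"])
    p = logit.predict_proba(schools[covariates])[:, 1]

    # Overlap-style weights: comparison schools that look like treated schools
    # (high p) and treated schools that look like comparison schools (low p)
    # receive more weight, matching the description in the text.
    schools["weight"] = np.where(schools["treated"] == 1, 1 - p, p)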

Results Based on Including Schools with Potentially Compromised Test Scores

Finally, we estimated impacts including the eighteen excluded schools (twelve with new principals and six that kept the same principal) that a USA Today report identified as showing evidence of cheating in at least half of tested classrooms in at least one of the 2007–08, 2008–09, or 2009–10 school years. Tests were flagged by the DC test score publisher if they had high rates of incorrect answers that were erased and replaced with correct answers (USA Today 2011). Including these schools leads to results that are similar in magnitude to those that exclude them (panel D of table 7).

6.  Conclusion

Our analysis suggests that the DCPS principal replacement strategy led to higher reading achievement and to positive but statistically insignificant impacts in math. We followed schools with new principals for at most four years. In the first year, new principals had no measurable impact on achievement. Statistically significant achievement gains began in the third year the new principal led the school and persisted through the fourth year, the last year for which we were able to estimate impacts. The impact estimates are consistent with new principals who are two to three standard deviations more effective than the principals they replaced, or an increase in student achievement of 3 to 4 percentiles for an average student. The gains for students in grades 6 to 8 were consistent with an increase in student achievement of between 8 and 9 percentiles for an average student.

We are able to address several potential concerns with the interpretation of our findings, including the possibility that some of the change in the gap between schools with and without new principals following the transition may have occurred even if the exiting principal had remained in the school, or that DCPS policies implemented in the years after PERAA, rather than the new principals, were responsible for the changes. One important concern remains, however: we are not able to distinguish the impact of the new principals from the impact of any resources provided in a way that coincided with the principal replacements. In that case, our impacts would reflect the combined effects of the replacement and those resources.

Notes

1. However, Dhuey and Smith (2018), studying principals in North Carolina, found that the same principal may have a larger contribution to student achievement if he or she is placed in a different school where the principal is a better “match” for the specific challenges that school faces. Also, Branch, Hanushek, and Rivkin (2012) found that the impact of individual principals may vary more in schools with more low-income students, suggesting larger variation in match quality in these schools, although this finding could instead result from differences in the principals who lead these schools compared with schools with higher-income students.

2. We estimate that a DCPS school that is one standard deviation more effective improves student achievement by 0.20 student-level standard deviations, equivalent to improving the average student's achievement by 8 percentiles. If principals are responsible for 15 percent of that improvement, a principal who is one standard deviation more effective would contribute 0.03 standard deviations, or a 1 percentile point improvement.

3. Branch, Hanushek, and Rivkin (2012) also examine estimates of principals’ impacts on student achievement that are allowed to change with tenure in the school, but conclude that the measures are too imprecise to be useful in their data.

4. Estimates in Gates et al. (2014) are not directly comparable to those in the other studies. Whereas the other studies presented impacts of improving principal effectiveness by one standard deviation, the New Leaders principals in Gates et al. may differ by more or less than one standard deviation of principal effectiveness from the principals they replaced.

5. Although Miller's study uses school-level rather than student-level data, leaving open the possibility that the impact estimates are partly caused by changes in school composition and not only principal transitions, she attempts to address concerns related to changes in student composition that arise from using school-level data.

6. Our main results treat combined schools as the same school when constructing fixed effects, but we also obtain similar results when treating combined schools and precombined schools as distinct units. The latter approach implicitly excludes simultaneous transitions and school combinations from the group of schools with replacements.

7. The online appendix can be accessed on Education Finance and Policy’s Web site at www.mitpressjournals.org/doi/suppl/10.1162/edfp_a_00279.

8. The twelve schools that were combined at the end of 2007–08 formed six combinations of two schools each, so in the 2008–09 school year these twelve schools were led by six of their original twelve principals.

9. In standardizing across years, we also assumed that the dispersion in student ability is the same in each year. This would not be the case if the reforms following PERAA had a larger impact on the achievement of low-performing than of high-performing students, narrowing the gap in achievement between these two groups. Although we cannot rule out this possibility, we examined impacts for lower- and higher-achieving students separately and found no substantive differences.

10. Schools become community eligible if at least 40 percent of their students have an identified need for free lunch based on direct certification, under which students qualify based on their families’ participation in state welfare or food stamp programs. These schools provide free breakfasts and lunches to all enrolled students and save on administrative costs by forgoing the collection of individual students’ subsidized-meal applications.

11. We marked 3.5 percent of students in these years as eligible for free lunch for this reason.

12. We present results in section 5 that instead include these eighteen schools with compromised test scores.

13. As a sensitivity analysis, we also included students from the previous two school years to form a nine-year panel. The nine-year panel excludes grade 10 because we do not have test scores from the 2001–02 school year.

14. For this step, to avoid excluding these schools from the analysis, we treated two schools that were combined at some point into a single school as having been the same school in all years.

15. Prior to restricting to the seven-year panel, we excluded from our analysis school–year combinations with fewer than fifty remaining students in any grade and subject.

16. We use school value added for these descriptive statistics, but our main analysis is conducted using student-level achievement data. The value-added model includes the same test scores and background characteristics included in equation 1. The value-added model follows the approach described in Isenberg and Hock (2012); some of the methods from that description have been updated as in the description of the teacher value-added model in Isenberg and Walsh (2014).

17. For this calculation, we assume that principals are responsible for 15 percent of schools’ contributions to student achievement, and that a one standard deviation improvement in school effectiveness leads to an improvement in student achievement of 0.20 student-level standard deviations.

18. For this calculation, we applied the impact of new principals in the third year after the replacements to the 2007 test scores of students who were enrolled in the schools with post-PERAA new principals during the 2006–07 school year. We obtained the proficiency levels for each grade and subject from technical documentation of the 2007 DC CAS (CTB/McGraw Hill 2008).

19. Although we include grade 10 in the full sample results, we do not report separate results for grade 10 because they were very imprecise; only eleven schools had grade 10 students.

20. We limited the sample of schools with replacements to those with more- or less-experienced principals, but did not similarly limit the sample of comparison schools. Doing so was necessary to obtain precise results. Consequently, the comparison group of schools for both principal experience subgroups includes the same twenty-two schools.

21. This is an example of an “Ashenfelter dip,” which originally referred to falsely attributing wage gains to a training program that may only have returned participants to the wage rate they would have obtained without the program. The apparent gains arose because workers who had experienced a dip in wages were the ones who chose to participate in the program (Ashenfelter 1978; Jacobson, LaLonde, and Sullivan 1993).

Acknowledgments

We would like to thank the Walton Family Foundation for funding the work. We would also like to thank Kelly Linker and Alden Wells at the District of Columbia Public Schools and Jeffrey Noel at the Office of the State Superintendent of Education of the District of Columbia for providing the data used in this study. At Mathematica Policy Research, John Hotchkiss and Lisa McCusker, assisted by Mark Timms, Adele Costigan, and Carolyn Chuong, processed the data and provided expert programming. Eric Isenberg and Steven Glazerman provided valuable comments. Any opinions expressed herein are those of the authors and do not necessarily represent the views of the Walton Family Foundation or any other organization.

REFERENCES

Ashenfelter, Orley. 1978. Estimating the effect of training programs on earnings. Review of Economics and Statistics 60(1): 47–57.
Branch, Gregory F., Eric A. Hanushek, and Steven G. Rivkin. 2012. Estimating the effect of leaders on public sector productivity: The case of school principals. NBER Working Paper No. 17803.
Chiang, Hanley, Stephen Lipscomb, and Brian Gill. 2016. Is school value added indicative of principal quality? Education Finance and Policy 11(3): 283–309.
Clark, Damon, Paco Martorell, and Jonah Rockoff. 2009. School principals and school performance. CALDER Working Paper No. 38, Urban Institute.
Coelli, Michael, and David A. Green. 2012. Leadership effects: School principals and student outcomes. Economics of Education Review 31(1): 92–109.
CTB/McGraw-Hill. 2008. Technical report for the Washington, DC Comprehensive Assessment System (DC-CAS) Spring 2007. Monterey, CA: CTB/McGraw-Hill.
Dee, Thomas, and James Wyckoff. 2015. Incentives, selection, and teacher performance: Evidence from IMPACT. Journal of Policy Analysis and Management 34(2): 267–297.
Dhuey, Elizabeth, and Justin Smith. 2014. How important are school principals in the production of student achievement? Canadian Journal of Economics 47(2): 634–663.
Dhuey, Elizabeth, and Justin Smith. 2018. How school principals influence student learning. Empirical Economics 54(2): 851–882.
Gates, Susan M., Laura S. Hamilton, Paco Martorell, Susan Burkhauser, Paul Heaton, Ashley Pierson, Matthew Baird, Mirka Vuollo, Jennifer J. Li, Diana Catherine Lavery, Melody Harvey, and Kun Gu. 2014. Preparing principals to raise student achievement: Implementation and effects of the New Leaders program in ten districts. Santa Monica, CA: RAND Corporation.
Grissom, Jason A., Demetra Kalogrides, and Susanna Loeb. 2015. Using student test scores to measure principal performance. Educational Evaluation and Policy Analysis 37(1): 3–28.
Isenberg, Eric, and Heinrich Hock. 2012. Measuring school and teacher value added in DC, 2011–2012 school year. Washington, DC: Mathematica Policy Research.
Isenberg, Eric, and Elias Walsh. 2014. Measuring teacher value added in DC, 2013–2014 school year. Washington, DC: Mathematica Policy Research.
Jacobson, Louis S., Robert J. LaLonde, and Daniel G. Sullivan. 1993. Earnings losses of displaced workers. American Economic Review 83(4): 685–709.
Miller, Ashley. 2013. Principal turnover and student achievement. Economics of Education Review 36(1): 60–72.
Özek, Umut, Michael Hansen, and Thomas Gonzalez. 2012. A leg up or a boot out? Student achievement and mobility under school restructuring. CALDER Working Paper No. 78, American Institutes for Research.
St.Clair, Travis, Kelly Hallberg, and Thomas D. Cook. 2016. The validity and precision of the comparative interrupted time-series design: Three within-study comparisons. Journal of Educational and Behavioral Statistics 41(3): 269–299.
Turque, Bill. 2008. Rhee has dismissed 24 principals. Washington Post, 16 May.
USA Today. 2011. Hundreds of classes in D.C. public schools flagged for erasures. USA Today, 8 April.
