## Abstract

Over 2,000 teachers in the state of Washington received reduction-in-force (RIF) notices across the 2008–09 and 2009–10 school years. We link data on these RIF notices to an administrative data set that includes student, teacher, school, and district variables to determine the factors that predict the likelihood of a teacher receiving a RIF notice. Not surprisingly, we find that a teacher's seniority is the strongest predictor, but we also find (all else equal) that teachers with master's degrees and those credentialed in the high-need areas of math, science, and special education were less likely to receive a RIF notice. Value-added measures of teacher effectiveness, which can be calculated for a subset of the teachers, were not correlated with the probability of receiving a RIF notice. Finally, simulations suggest that a very different group of teachers would be targeted for layoffs under an effectiveness-based layoff scenario than under the seniority-driven system that exists today.

Budget cuts have resulted in “the most rampant layoffs of teachers and other government employees in decades,” as “72,700 education jobs were eliminated in September [2010] on a seasonally adjusted basis” (Aversa and Rugaber 2010).

## 1. Background

Tough economic times portend tight school budgets, possibly for years to come. Stories abound about the staffing cuts that will have to be made today and in the next couple of years, as the ripple effects of the economic crisis impact local and state education budgets (Westneat 2009). Because education is a labor-intensive industry—most districts devote well over 50 percent of expenditures toward teacher compensation—major budget cuts have resulted in the first significant teacher layoffs in recent times.^{1} The number of layoffs likely would have been significantly larger over the past few years in the absence of federal stimulus money (Khadaroo 2009). But although the Education Jobs and Medicaid Assistance Act (widely known as the “Edujobs bill”) may have saved hundreds of thousands of teacher jobs (Klein 2010), education budgets do not look like they will be improving in the near term (Adams 2010; Hess and Downs 2010).

For districts facing the prospect of layoffs, the decision about which teachers are laid off has profound implications for student achievement. A growing body of literature that uses value-added models (VAMs) to identify the contribution individual teachers make toward student learning gains (e.g., Aaronson, Barrow, and Sanders 2007; Chetty, Friedman, and Rockoff 2011; Goldhaber and Hansen 2010b; McCaffrey et al. 2009) demonstrates that teacher quality is the most important *schooling* factor influencing student achievement.^{2} Estimates suggest, for example, that a one standard deviation increase in teacher quality raises student achievement in reading and math by 10 (Rivkin, Hanushek, and Kain 2005; Rockoff 2004) to 26 (Koedel and Betts 2010) percent of a standard deviation.^{3} This fact has not gone unnoticed: numerous pundits and policy organizations have called for reforms to teacher layoff policies (e.g., Daly and Ramanathan 2010; NCTQ 2010; New Teacher Project 2010).

In this paper we present findings on two distinct questions about the determinants and implications of teacher layoffs: (1) Which teacher and school characteristics predict the likelihood that a teacher receives a layoff notice in Washington state? (2) What are the implications of the current layoff system for student achievement? This is the first paper to address either question using data on actual layoff notices.^{4} We utilize a unique data set that links teachers to schools and students and includes information on those teachers who received layoff notices in the 2008–09 and 2009–10 school years.

Predictably, we find good evidence that seniority plays an outsized role in determining which teachers are targeted for layoffs. Seniority, however, is not the sole factor: teachers in high-need subjects are significantly less likely to receive a layoff notice than those in other areas. By contrast, value-added measures of teacher effectiveness appear to be uncorrelated with which teachers receive layoff notices. We also run simulations of effectiveness-based layoffs; given that effectiveness is uncorrelated with actual layoffs, it is not surprising that these simulations target a very different group of teachers than the current seniority-driven system.

The remainder of the paper proceeds as follows. In section 2, we review the literature on layoffs in education and other sectors and present arguments from the literature about how seniority-based layoffs might affect student achievement. In section 3, we describe our data and outline our analytic approach for investigating our two research questions. In section 4, we present findings on the determinants of teacher layoffs, and in the final sections we offer policy implications and conclusions.

## 2. Seniority-Based Layoff Policies: Background and Implications

Teachers are rarely laid off because of school budget shortages, given that per-pupil spending has risen every year since 1993 (USDOE 2010).^{5} But layoffs do occasionally happen, and thus the layoff process is typically addressed in collective bargaining agreements (CBAs). In the overwhelming majority of these agreements, “last hired, first fired” provisions make seniority the determining factor for which teachers are laid off first. For example, all of the 75 largest school districts in the nation use seniority as *a* factor in layoff decisions, and seniority is the *sole* factor that determines the order of layoffs in over 70 percent of these districts.^{6}

There are notable examples of districts that do not rely solely on seniority. Chicago Public Schools, for instance, recently changed its policies to allow performance-based layoffs of untenured teachers (with performance judged by principals’ evaluations; Rossi 2010). Such performance-based exceptions to the seniority rule are relatively rare, however, occurring in only 20 percent of the 75 largest districts. The situation in the state of Washington, the focus of this study, looks similar. A review of the CBAs currently operating in the ten largest school districts in Washington shows that all use seniority as a basis for determining layoffs within certificated categories,^{7} and eight of these districts use seniority as the only determinant of which teachers get laid off.

The use of seniority as a determining factor in layoffs is certainly not unique to public education. In a study of the history of layoffs beginning with railroad employees after the American Civil War and continuing with the auto industry in the early 1900s, Lee (2004) finds (as of 1994) that 88 percent of union contracts specified seniority as a factor in the layoff decision. The Bureau of National Affairs (1989) also reports that 46 percent of industries (as of 1989) used seniority as the sole factor for layoffs. The majority of both manufacturing and nonmanufacturing union contracts include seniority layoff provisions, but there is also significant variation across industries.^{8} Nor are seniority rules limited to unionized sectors of the economy; in a survey of 429 nonconstruction and nonagricultural firms, Abraham and Medoff (1984) found that 24 percent of nonunion hourly workers were subject to some sort of written seniority policy regarding layoffs.

Much of the empirical focus of economic analyses of employment termination has been on its effect on future labor market outcomes (e.g., Heckman and Borjas 1980; Gibbons and Katz 1991). There are relatively few studies of the predictors of employment separation in the case of downsizing. Wolpin (1992), for instance, focuses on the effects of job discrimination, Moser (1986) on macroeconomic cyclicality, Parsons (1972) on institutional structures, such as firm-specific human capital, and Topel (1983) on unemployment insurance.^{9}

To our knowledge, this is the first paper to investigate the determinants of layoffs in education, though there is a small body of literature on teacher dismissals. In a recent working paper, Jacob (2010) explores the relationships between various teacher, school, and student characteristics and teacher dismissal probabilities among nontenured teachers in the Chicago Public Schools. He finds an increased likelihood of dismissal for teachers with frequent absences, poorer credentials (measured by competitiveness of the college attended, failure on previous certification exams, and highest degree completed), and certain demographic characteristics, particularly older and male teachers. Jacob considers several measures of teacher effectiveness as well, including an estimate of the teacher's value added. He finds a statistically significant negative relationship between a teacher's value added and the probability of being dismissed.

While the literature has paid little attention to the determinants of teacher layoffs, there has been some recent attention on the implications of these layoffs. Although proponents of seniority-based layoffs point to the fairness and ease of use of the current system,^{10} there are reasons that “first in, last out” policies may have negative consequences for student achievement. The first is obvious: To achieve a given budget reduction target, school districts would need to lay off relatively more junior teachers than senior teachers (as junior teachers have lower salaries).^{11} Washington state provides an interesting case study for this particular consequence of seniority-based layoffs, because districts in Washington have somewhat less incentive to lay off more expensive teachers. The state's school funding formula allocates district funding for personnel based on the position of the district's teachers in the state salary schedule, which in turn is based on education credits and experience. Thus, districts hiring (or retaining) more expensive teachers receive more money from the state to pay them, so individual districts receive no monetary benefit from laying off higher- rather than lower-salary teachers.^{12}

The second potential negative consequence of seniority-based layoffs is linked to the fact that there is far more variation in teacher effectiveness within an experience category than between categories (e.g., Gordon, Kane, and Staiger 2006). Thus, the most senior teachers, protected under a seniority-based system, may not be the most effective teachers. Boyd et al. (2011) explore this issue using data from elementary school teachers in New York City. In particular, they simulate seniority-based layoffs and compare the resulting teacher workforce to the workforce that results from simulated effectiveness-based (based on VAMs) layoffs.^{13} Their findings suggest that the typical teacher laid off under an effectiveness-based system is 26 percent of a standard deviation in student achievement less effective than the typical layoff under a seniority-based system.^{14}
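The logic of such a simulation can be sketched as follows, with fabricated numbers standing in for the New York (or Washington) data: lay off the same number of teachers under each rule and compare the average effectiveness of those selected.

```python
# Toy comparison of seniority-based vs. effectiveness-based layoffs.
# All numbers are fabricated for illustration.

# (teacher, years_of_seniority, value_added_in_student_SDs)
teachers = [
    ("A", 1, 0.10), ("B", 2, -0.05), ("C", 3, 0.20),
    ("D", 8, -0.15), ("E", 12, 0.05), ("F", 20, -0.10),
]
n_layoffs = 2

# Seniority rule: least senior teachers go first.
by_seniority = sorted(teachers, key=lambda t: t[1])[:n_layoffs]
# Effectiveness rule: least effective teachers go first.
by_effectiveness = sorted(teachers, key=lambda t: t[2])[:n_layoffs]

def mean_va(group):
    """Average value-added of a group of laid-off teachers."""
    return sum(t[2] for t in group) / len(group)

print([t[0] for t in by_seniority], round(mean_va(by_seniority), 3))        # ['A', 'B'] 0.025
print([t[0] for t in by_effectiveness], round(mean_va(by_effectiveness), 3))  # ['D', 'F'] -0.125
```

The two rules select disjoint groups here, and the effectiveness-based rule removes teachers with substantially lower average value-added, which is the comparison the simulations in the literature formalize.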

A final reason that seniority-based systems may have consequential impacts on student achievement is that strict adherence to seniority would require at least some districts to lay off teachers in high-demand subject areas, like math and special education. Table 1, which reports seniority by most recent endorsement area, demonstrates that teachers who are endorsed on their teaching credential to teach in these high-demand areas tend to be less senior than other teachers, and there may well be greater costs associated with recruiting teachers in these areas in the future.

Table 1. Seniority by most recent endorsement area

| Most Recent Endorsement Area | Mean | 10th Percentile | 25th Percentile | Median |
|---|---|---|---|---|
| High-need area | 8.55 | 0.00 | 2.00 | 6.00 |
| Math | 7.05 | 0.00 | 1.00 | 4.00 |
| Science | 9.00 | 1.00 | 2.00 | 6.00 |
| English/LA | 9.56 | 1.00 | 3.00 | 8.00 |
| Social Studies | 12.60 | 1.00 | 4.00 | 10.00 |
| Elementary Ed | 7.86 | 1.00 | 2.00 | 6.00 |
| Special Ed | 8.89 | 0.00 | 2.00 | 6.00 |
| Health/PE | 12.56 | 1.00 | 5.00 | 11.00 |
| Agriculture/Tech/Other | 11.79 | 1.00 | 4.00 | 10.00 |
| Arts | 10.42 | 1.00 | 3.00 | 8.00 |
| Foreign Languages | 9.83 | 1.00 | 3.00 | 8.00 |


Beyond the effects of seniority-based layoffs on the teacher workforce overall is the potential for distributional consequences. For instance, a recent report on the Los Angeles school district (UCLA/IDEA 2009) concluded that seniority-based layoffs will disproportionately affect schools with high proportions of at-risk students because these schools employ more first- and second-year teachers than other schools in the district. This was not borne out in the case of the Boyd et al. work from New York, nor in work by Plecki, Elfers, and Finster (2010, p. iii), who present summary statistics for the Washington State teachers laid off in the first cycle of layoffs in 2008–09 and find “little variation by student poverty level or race/ethnicity” at the district level. Ultimately the distributional effects depend on which teachers (in terms of seniority and effectiveness) are teaching in which schools; this may differ from one district to the next.

## 3. Data and Analytic Approach

### Data

The analyses here are based on an administrative data set from Washington State that links teachers to their schools and, in some cases, their students; the data set also includes information on those teachers who received reduction-in-force (RIF) notices in the 2008–09 and 2009–10 school years (we use the terms *RIF notice* and *layoff notice* interchangeably). As it turns out, the great majority of teachers who received RIF notices in 2008–09 were still employed in Washington's public schools in the next school year.^{15} Some were not actually laid off, perhaps due to the infusion of federal monies, and others found jobs in different districts. Still, RIF notices offer a clear picture of the teachers whose jobs are imminently at risk.

Information on RIF notices was collected in each year by Washington State's Professional Education Standards Board (PESB). In the 2008–09 school year, 2,144 employees received a layoff notice and in 2009–10, 450 employees received a notice. Employees receiving these notices can be linked to data about their credentials, school assignments, degrees, and compensation that are collected from four administrative databases: the Washington State S-275, the PESB credentials database, the Washington State Report Card, and the Common Core of Data.^{16} A subset of these data can be linked to students through the Core Student Record System (CSRS) and Washington Assessment of Student Learning (WASL) databases.

The S-275 administrative database provides a record of certificated and classified employees working in Washington State's school districts including information such as their places of employment, experience and degree, gender and race, and annual compensation levels.^{17} We restrict our analysis sample to employees appearing in the S-275 (meaning they were hired by 1 October of the year they received a layoff notice) and whose assignment identification indicates they were in a teaching position that year. These restrictions leave a sample of 1,717 teachers who received a layoff notice in 2008–09 and 407 teachers who received a layoff notice in 2009–10, with 130 teachers who received a layoff notice in both school years. Overall, about 2 percent of teachers in the state received a layoff notice in either year.

The S-275 includes a measure of cumulative teaching experience in the state but does not include a direct measure of seniority in a teacher's current district—an important variable to consider because some collective bargaining agreements specify seniority, rather than experience, as a factor in determining which teachers are dismissed in the event of necessary downsizing. Given this, we calculate a rough measure of seniority by tracking each teacher through the S-275 databases going back to 1994–95 and counting how many years (up to 14 in 2008–09 and 15 in 2009–10) the teacher has been employed by his or her current school district. We impute seniority values for teachers who have taught in the same district since 1994–95 to reflect the expected spread of values beyond 14 or 15 years of seniority.^{18} Figure 1 shows the distribution of teachers who received layoff notices in 2008–09 or 2009–10 by experience and seniority. The figure clearly shows that most of the teachers receiving RIF notices are junior (approximately 60 percent of RIFed teachers have two or fewer years of experience, and approximately 80 percent have two or fewer years of seniority). It is notable, however, that some of these teachers are well into their careers, implying that some districts in the state base layoff decisions on criteria other than seniority.
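The seniority construction just described can be sketched as follows; the records and field layout are hypothetical, not the actual S-275 schema:

```python
# Sketch of the seniority measure: count consecutive years (walking backward
# from the RIF year) in which a teacher appears in the same district,
# censored at the earliest S-275 year available (1994-95).
# The records below are fabricated; the real S-275 has many more fields.

def district_seniority(records, teacher, district, rif_year, first_year=1995):
    """Consecutive years in `district` up to `rif_year`, capped by the data window."""
    years = 0
    for year in range(rif_year, first_year - 1, -1):
        if (teacher, district, year) in records:
            years += 1
        else:
            break
    return years

records = {("T1", "D1", y) for y in range(2003, 2010)}   # hired in 2002-03
records |= {("T2", "D1", y) for y in range(1995, 2010)}  # present in all observed years

print(district_seniority(records, "T1", "D1", 2009))  # 7
print(district_seniority(records, "T2", "D1", 2009))  # 15 (censored at the data window)
```

Teachers like T2, whose count hits the censoring cap, are the ones whose seniority the paper imputes beyond 14 or 15 years.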

We link teacher data from the S-275 to the PESB credentials database, which includes information about teacher credentials, such as where each teacher was trained and in what areas each teacher holds endorsements.^{19} The database includes an entry for each new and renewed credential a teacher has received in his/her career. We use these data in two different ways to calculate measures of teacher training and endorsements. First, we restrict the database to include each teacher's oldest credential that came from a college or university, and use the institution that issued the credential as the teacher's college.^{20} We code each teacher's college using the 2008 NCES/Barrons Admissions Competitiveness Index^{21} (1 = most competitive, 7 = least competitive) to create a measure of the selectivity of each teacher's college. The database also includes the endorsement area of each credential, so we include indicators for the number of endorsements and whether teachers hold endorsements in any one of ten areas^{22}: math, science, English/reading, social studies, elementary education, special education, health/PE, arts, languages, and other (including agriculture/technology, office staff, administration, etc.).^{23}

These teacher data are linked to school data from two more databases: the Washington State Report Card, which includes school-level achievement data on the WASL, as well as student and teacher demographics; and the Common Core of Data, which provides the location and level of each public elementary and secondary school in the United States. We use the Washington State Report Card to measure the racial composition, student–teacher ratio, percent of students enrolled in the free or reduced-price meal program, total enrollment, and percent of students who passed the reading and math WASL exams in each teacher's school. We use the Common Core of Data to create a dummy variable for whether each teacher teaches in an urban area^{24} and the level of each teacher's school.

Panel 1 of table 2 provides sample statistics for teachers who either did or did not receive a RIF notice in 2008–09. We report a single year because some teachers received a layoff notice in both years of the data, but the results are very similar whether the averages are computed for 2008–09 (as in the table), for 2009–10, or aggregated across years.^{25} Not surprisingly, in each year, teachers receiving RIF notices (RIFed teachers) are less experienced and less senior, by about 10 and 8 years, respectively. RIFed teachers are also far less likely to hold an advanced degree. Consequently, there is a difference of about $15,000 between the average salaries of RIFed and non-RIFed teachers.

Table 2. Sample statistics for teachers who did and did not receive a RIF notice, 2008–09

| | RIF | No RIF |
|---|---|---|
| *Panel 1: All Teachers in Sample* | N = 1,717 | N = 53,939 |
| *Teacher Characteristics* | Mean (SD) | Mean (SD) |
| Final salary | $42,826 ($11,194) | $57,898 ($14,705) |
| Years of experience in state | 3.43 (5.16) | 13.66 (9.65) |
| Years of seniority in district | 1.68 (3.26) | 9.93 (8.44) |
| Master's degree or higher (%) | 44.32 | 64.15 |
| *Most Recent Endorsement Area (%)* | | |
| High-need area | 13.33 | 15.10 |
| Math | 3.55 | 3.09 |
| Science | 4.95 | 5.12 |
| English/LA | 13.45 | 12.40 |
| Social Studies | 7.63 | 10.70 |
| Elementary Ed | 45.60 | 37.46 |
| Special Ed | 4.83 | 6.89 |
| Health/PE | 4.02 | 4.42 |
| Agriculture/Tech/Other | 6.23 | 8.87 |
| Arts | 6.99 | 5.25 |
| Foreign Languages | 1.69 | 2.42 |
| *School Characteristics* | | |
| % urban | 11.07 | 14.96 |
| School size (N) | 707.00 (469.22) | 747.30 (519.07) |
| Student/teacher ratio | 17.01 (3.28) | 17.11 (4.22) |
| % free/reduced lunch | 39.64 | 42.08 |
| % white students | 68.30 | 64.00 |
| % black students | 5.81 | 5.60 |
| % Hispanic students | 10.45 | 15.84 |
| % other races | 15.44 | 14.56 |
| % special education | 12.53 | 12.40 |
| % passed English WASL | 72.59 | 72.54 |
| % passed Math WASL | 51.70 | 51.75 |
| *Panel 2: VAM Subsample* | N = 145 | N = 6,400 |
| *Classroom Characteristics* | | |
| % free/reduced lunch | 45.91 | 44.20 |
| % white students | 64.81 | 64.41 |
| % black students | 8.00 | 5.52 |
| % Hispanic students | 12.48 | 16.30 |
| % other races | 14.71 | 13.77 |
| % special education | 11.57 | 11.76 |
| % passed English WASL | 68.41 | 72.00 |
| % passed Math WASL | 48.19 | 54.88 |
| *Average Teacher Value Added* | | |
| Value added in reading (in student SDs) | −0.048 (0.17) | 0.0013 (0.18) |
| Value added in math (in student SDs) | −0.054 (0.17) | 0.00049 (0.19) |


*Notes:* For ease of comparison, we present statistics only for a teacher's most recent endorsement area. Classroom statistics are from the most recent year in which we can calculate that teacher's value added. The VAM estimates are pooled across all years of data and shrunken using empirical Bayes (EB) methods.

Had all 1,717 teachers who received RIF notices in 2008–09 actually been laid off, the direct salary savings in the state would have been $5,521,238. As noted earlier, one of the prevailing critiques of seniority-based layoffs is that more teachers must be laid off to attain a specified budget objective than would be necessary under alternative criteria. Based on the actual salaries of teachers in each school district, we calculate the number of layoffs that would be necessary to achieve the same (or slightly greater) budgetary savings in each district if the teachers laid off were earning the district-average salary. Aggregating to the state level, we estimate that only 1,349 layoffs would be necessary to attain the same (or greater) budgetary savings, approximately 20 percent fewer than the 1,717 teachers who actually received layoff notices.^{26} In the final section we explore more explicitly the budgetary implications of layoffs based on teacher effectiveness.
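The district-level calculation described above can be sketched with made-up salaries (the paper's figures come from actual district salary data): compute the savings from the actual, seniority-ordered layoffs, then ask how many layoffs at the district-average salary would achieve at least the same savings.

```python
import math

# Hypothetical district: the five most junior (lowest-paid) teachers are
# laid off under seniority rules. All salaries are fabricated.
salaries = [38_000, 40_000, 41_000, 43_000, 45_000,
            52_000, 60_000, 65_000, 70_000, 75_000]

actual_layoffs = salaries[:5]            # seniority-based: cheapest teachers first
savings_target = sum(actual_layoffs)     # budget objective to match

avg_salary = sum(salaries) / len(salaries)
# Layoffs at the district-average salary needed to reach the same
# (or slightly greater) savings.
equivalent_layoffs = math.ceil(savings_target / avg_salary)

print(savings_target, avg_salary, equivalent_layoffs)  # 207000 52900.0 4
```

Because junior teachers earn less than the district average, fewer average-salary layoffs (4 here, versus 5 actual) reach the same budget target, which is the pattern the state-level 1,349 versus 1,717 comparison reflects.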

According to the 2006 report “Educator Supply and Demand in Washington State” (Lashway et al. 2007), there are 14 endorsement areas with high degrees of shortage, all of which fall into math, science, or special education. Thus, we classify any teacher with an endorsement in one of these areas as *high need*. Based on this aggregation across endorsements, there is some evidence that school districts are protecting teachers in high-need subjects: 13.33 percent of RIFed teachers fell into a high-need category, compared with 15.10 percent of non-RIFed teachers.^{27}

Teachers receiving a notice tended to be in smaller schools, but contrary to existing research (UCLA/IDEA 2009; Sepe and Roza 2010), RIFed teachers were not, in general, more likely to be teaching in schools with high proportions of minority students or lower WASL achievement levels.

School-level measures can mask a significant degree of teacher sorting across classrooms within schools. Fortunately, a subset of teachers and students can be linked together through the proctor listed on each student's state assessment.^{28} This link allows student data from the CSRS and WASL databases to be aggregated to the classroom level, and it permits the estimation of VAMs of teacher effectiveness. The CSRS database includes information on each student's race/ethnicity, free and/or reduced-price meal eligibility status, and status in the following programs: Learning Assistance Program reading/math, Title I reading/math, Title I Migrant, Gifted/Highly Capable, State Transitional Bilingual Program, and Special Education. The WASL database includes information on each student's achievement on the WASL, an annual statewide assessment administered in grades 3 through 8 and grade 10.

There are over 10,000 teachers in the 2008–09 school year who can be directly linked to student-level information, but because we cannot calculate VAM estimates in third grade (since there is no base-year test score that can be used in estimating VAMs) our VAM subsample is restricted to about 6,600 teachers in grades 4, 5, and 6 (this is approximately 12 percent of all teachers employed in the state).^{29}
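As a rough illustration of how a value-added estimate can be formed, the sketch below regresses current scores on prior scores and averages each teacher's residuals. It is deliberately stripped down, with fabricated scores, and omits the student covariates and empirical Bayes shrinkage used for the paper's estimates.

```python
# Minimal value-added sketch: OLS of current score on prior score, then
# average each teacher's residuals. All scores are fabricated.
from collections import defaultdict

# (teacher, prior_score, current_score), scores in student SD units
data = [
    ("A", 0.0, 0.3), ("A", 1.0, 1.2), ("A", -1.0, -0.6),
    ("B", 0.0, -0.3), ("B", 1.0, 0.6), ("B", -1.0, -1.2),
]

# OLS slope and intercept of current on prior score
n = len(data)
mx = sum(p for _, p, _ in data) / n
my = sum(c for _, _, c in data) / n
b = sum((p - mx) * (c - my) for _, p, c in data) / sum((p - mx) ** 2 for _, p, _ in data)
a = my - b * mx

# Collect residuals by teacher and average them
resid = defaultdict(list)
for t, p, c in data:
    resid[t].append(c - (a + b * p))
vam = {t: sum(r) / len(r) for t, r in resid.items()}

print(vam)  # teacher A ~ +0.3 SD, teacher B ~ -0.3 SD
```

Teacher A's students systematically beat their predicted scores and teacher B's fall short, so A gets a positive and B a negative value-added estimate.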

In our VAM subsample, 145 teachers received a RIF notice in the 2008–09 school year, and panel 2 of table 2 reports classroom-level sample statistics for this subset.^{30} These classroom-level characteristics present a slightly different picture of the differences between RIFed and non-RIFed teachers than do the school-level characteristics. For instance, compared with the school-level aggregates, the classroom-level percentage of poor students is relatively higher for RIFed teachers and the percentage of white students is relatively lower. Also, average student achievement in classrooms with RIFed teachers is lower than in other classrooms. This dichotomy between the school- and classroom-level findings suggests within-school inequities in the distribution of teacher experience and seniority.

Panel 2 of table 2 also includes the average value-added estimate of teacher effectiveness of RIFed and non-RIFed teachers, and the results demonstrate that the average teacher who received a RIF notice was about 5 percent of a standard deviation less effective (in both reading and math) than the average teacher who did not receive a RIF notice.^{31} This result is not surprising given that teachers who received RIF notices included many first- and second-year teachers, and many studies (Clotfelter, Ladd, and Vigdor 2007; Goldhaber and Hansen 2010a; Rivkin, Hanushek, and Kain 2005; Rockoff 2004) have shown that, on average, effectiveness improves substantially over a teacher's first few years of teaching.

Because all models that include VAM estimates are necessarily restricted to the subset of teachers for whom we can calculate those estimates, an important question is whether the teachers in the VAM subsample are representative of teachers in similar placements. Unfortunately, we do not have data on the grades taught by teachers outside this subsample, but we can compare the characteristics of teachers in the subsample with those of other teachers in the state who hold an endorsement in elementary education. We do this in table 3. The table shows some statistically significant differences between the VAM subsample and the excluded teachers who hold an elementary school endorsement. These differences are small, and they could be driven by differences between teachers who teach grades 4–6 and teachers in other grades, or by the fact that there are no teachers from K–2 schools in the VAM sample. Nevertheless, the fact that differences exist means our results using the VAM subsample come with an important caveat: they may apply only to the specific subset of teachers we were able to match to student-level data.

Table 3. Teachers in the VAM subsample versus other teachers holding an elementary education endorsement

| | Non-VAM | VAM |
|---|---|---|
| | N = 24,308 | N = 5,729 |
| *Teacher Characteristics* | Mean (SD) | Mean (SD) |
| Final salary | $54,865 ($14,763) | $56,858** ($12,611) |
| Years of experience in state | 11.20 (7.98) | 11.23 (7.72) |
| Years of seniority in district | 8.17 (6.98) | 8.74** (6.92) |
| Master's degree or higher (%) | 63.64 | 66.92** |
| *School Characteristics* | | |
| % urban | 14.95 | 13.07** |
| School size (N) | 573.98 (336.39) | 509.24** (170.15) |
| Student/teacher ratio | 16.70 (4.48) | 16.67 (2.69) |
| % free/reduced lunch | 45.24 | 45.59 |
| % white students | 61.90 | 62.90* |
| % black students | 5.79 | 5.25** |
| % Hispanic students | 17.13 | 16.72 |
| % special education | 13.17 | 13.51** |
| % passed English WASL | 71.07 | 72.36** |
| % passed Math WASL | 55.29 | 58.52** |


*Notes:* VAM teachers are teachers who appear in the sample for specification 3, defined in table 5. Significance levels are from a two-sample *t*-test of the null hypothesis that the mean from the VAM sample is equal to the mean from the non-VAM sample.

*p < .01; **p < .001.

### Analytic Approach

In the baseline, we model the probability of receiving a RIF notice as a function of individual teacher characteristics, *T_{it}*, including: seniority in district,^{32} degree level (MA or higher), gender, race, college selectivity, and endorsement area; the characteristics of the teacher's school, *S_{jt}*, including urbanicity, grade level (e.g., high school), enrollment, student–teacher ratio, the percentage of students who are eligible for the free/reduced-price lunch program, the percentage of minority students, and measures of student achievement in reading and math; a district fixed effect, δ_{kt}; and a year effect, γ_t:^{33}

$$\Pr(Y_{ijkt} = 1) = \Lambda\left(\beta_1 T_{it} + \beta_2 S_{jt} + \delta_{kt} + \gamma_t\right) \qquad (1)$$

The outcome, *Y_{ijkt}*, is an indicator of whether or not teacher *i* in school *j* and district *k* in year *t* received a RIF notice.

The relationship between seniority and the likelihood of receiving a RIF notice is nonlinear (see figure 1), so we include dichotomous indicator variables for whether each teacher has 0–1, 1–2, 2–3, 3–4, 4–6, or 7–11 years of seniority (teachers with more than 11 years of seniority are the reference category). In controlling for endorsements, we are concerned with both endorsement area and the number of endorsements teachers hold. Thus, *T _{it}* includes indicators for each area in which the teacher is endorsed to teach (with elementary education as the reference category), the number of endorsements the teacher holds, and interactions between each endorsement area and the number of endorsements the teacher holds. Because of these interactions, the coefficient on each individual endorsement area can be interpreted as the marginal effect of holding a single endorsement in that area on the likelihood of receiving a RIF notice, relative to teachers who hold a single endorsement in elementary education.
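The variable construction described above can be sketched in a few lines of pandas; the data frame, column names, and the exact centering of the interaction term are our own illustrative choices, not the paper's actual variables:

```python
import pandas as pd

# Hypothetical teacher records (names and values are illustrative only).
teachers = pd.DataFrame({
    "seniority": [0.5, 3.2, 8.0, 15.0],
    "endorsements": [["elementary"], ["math"], ["math", "science"], ["health_pe"]],
})

# Dichotomous seniority indicators; teachers with more than 11 years of
# seniority are the omitted reference category.
for lo, hi in [(0, 1), (1, 2), (2, 3), (3, 4), (4, 6), (7, 11)]:
    teachers[f"sen_{lo}_{hi}"] = ((teachers["seniority"] >= lo)
                                  & (teachers["seniority"] < hi)).astype(int)

# Endorsement-area dummies, the endorsement count, and area-by-count
# interactions. Centering the count at 1 is one way to make the area dummy
# read as the effect of holding a *single* endorsement in that area,
# matching the interpretation described in the text.
teachers["n_endorse"] = teachers["endorsements"].apply(len)
for area in ["math", "science", "health_pe"]:  # elementary is the reference
    teachers[f"end_{area}"] = teachers["endorsements"].apply(lambda e: int(area in e))
    teachers[f"end_{area}_x_n"] = teachers[f"end_{area}"] * (teachers["n_endorse"] - 1)
```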

Whether student characteristics predict a teacher being RIFed is also of interest. Which teachers lose their jobs as a result of budget constraints is likely to be a politically contentious issue; teacher layoffs in large districts have sparked fierce public debate about the distribution of layoffs across schools and students (e.g., Daly and Ramanathan 2010; Frazier 2009). To capture the association between school covariates and the probability of RIF notices, we include observable school (*S_{jt}*) variables in the model. We parameterize this model to treat school demographics as either continuous (percent of each race within the school) or categorical (quintiles of minority or free or reduced-price lunch composition) to account for possible nonlinear effects of these variables, and we also include the level (elementary, middle, or high) of each teacher's school. Further, because layoff decisions are made at the district level, all models include a district fixed effect.^{34} Thus, the coefficients on the teacher covariates in this model should be interpreted as the marginal effects of these teacher characteristics relative to teachers at comparable schools within the same district.
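A minimal numpy sketch of this kind of estimation, fitting a logit with district indicators by Newton-Raphson and computing an average marginal effect; the simulated data, coefficient values, and variable names are all invented for illustration and do not reproduce the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: the design matrix stacks a teacher covariate with district
# dummies (the fixed effects). Sizes and coefficients are illustrative only.
n, n_districts = 2000, 5
district = rng.integers(0, n_districts, n)
novice = rng.integers(0, 2, n)            # stand-in for a 0-1 seniority dummy
X = np.column_stack([novice,
                     (district[:, None] == np.arange(n_districts)).astype(float)])
true_beta = np.array([2.0, -4.0, -4.5, -3.5, -4.0, -4.2])
p = 1 / (1 + np.exp(-X @ true_beta))
y = rng.binomial(1, p)                    # 1 = received a RIF notice

def fit_logit(X, y, iters=50):
    """Maximum-likelihood logit via Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = 1 / (1 + np.exp(-X @ beta))
        w = mu * (1 - mu)
        grad = X.T @ (y - mu)
        hess = X.T @ (X * w[:, None])
        beta = beta + np.linalg.solve(hess, grad)
    return beta

beta_hat = fit_logit(X, y)

# Average marginal effect (derivative approximation) of the novice dummy
# on Pr(RIF), holding district effects at their fitted values.
mu_hat = 1 / (1 + np.exp(-X @ beta_hat))
ame = np.mean(mu_hat * (1 - mu_hat)) * beta_hat[0]
```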

Although CBAs suggest that seniority will be a driving factor in determining which teachers are laid off, a few CBAs in the state allow administrators to consider teacher effectiveness in the layoff process. Moreover, when there is managerial discretion over layoffs, we might expect effectiveness to be a factor whether it is mentioned in a CBA or not. To explore this, we utilize the VAM subsample, including value-added measures of teacher effectiveness in model 1 to see whether job performance (at least this measure of it) influences which teachers receive layoff notices.

We estimate teacher value added from a student-level model (equation 2):

$$A_{ijkst} = \beta_1 A_{i(t-1)} + \beta_2 X_{it} + \tau_{jt} + \varepsilon_{ijkst} \qquad (2)$$

Here *i* represents students, *j* represents teachers, *k* represents schools, *s* represents subject area (math or reading), and *t* represents the school year. Student achievement normed within grade and year, *A_{ijkst}*, is regressed against prior student achievement in math and reading, *A_{i(t−1)}*, and a vector of student and family background characteristics (e.g., race and ethnicity, special education status, gifted status, and free or reduced-price lunch status), *X_{it}*. The remaining teacher fixed effect, τ_{jt}, is the VAM estimate for teacher *j* in year *t*. The error term, ε_{ijkst}, is assumed to be normally distributed with a mean of zero.^{35}

A third specification (model 3) adds classroom characteristics (*C_{jt}*), grade effects (*G*), and year effects to equation 2, and calculates teacher effectiveness estimates that are pooled across all three years of student performance data (τ_{j} instead of τ_{jt}).^{36} We also experiment with variants of model 3 to test the robustness of our results, including a model that controls for teacher experience, a model that adds a school fixed effect, and a model that substitutes a student fixed effect for student-level covariates.^{37}
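To make the mechanics concrete, here is a toy version of recovering teacher fixed effects from equation-2-style data with ordinary least squares; all data and effect sizes below are simulated for illustration and bear no relation to the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy student-level data: scores are standardized within grade and year.
n_students, n_teachers = 3000, 30
teacher = rng.integers(0, n_teachers, n_students)
prior = rng.normal(0, 1, n_students)            # prior achievement A_{i,t-1}
frl = rng.integers(0, 2, n_students)            # a student covariate in X_{it}
tau = rng.normal(0, 0.2, n_teachers)            # true (simulated) teacher effects
score = 0.7 * prior - 0.1 * frl + tau[teacher] + rng.normal(0, 0.5, n_students)

# OLS with a dummy for every teacher; with no intercept, each dummy's
# coefficient is that teacher's fixed effect.
D = (teacher[:, None] == np.arange(n_teachers)).astype(float)
X = np.column_stack([prior, frl, D])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
tau_hat = coef[2:]                               # estimated teacher effects

# With ~100 students per teacher, the estimates track the true effects closely.
corr = np.corrcoef(tau, tau_hat)[0, 1]
```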

We adjust all teacher effect estimates using empirical Bayes (EB) shrinkage methods.^{38} We find that the unadjusted teacher effect is 0.24 (i.e., a one standard deviation change in teacher effectiveness corresponds with a 0.24 standard deviation change in student performance) in both math and reading, and the EB shrunken estimates suggest effect sizes of 0.19 in math and 0.18 in reading.
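A sketch of the standard EB shrinkage calculation, which pulls each noisy estimate toward the grand mean in proportion to its reliability; this is the textbook version, and the paper's exact estimator may differ:

```python
import numpy as np

# Raw teacher effect estimates (in SD units of student achievement) and their
# sampling standard errors. All numbers here are invented for illustration.
raw = np.array([0.30, -0.25, 0.05, 0.50, -0.40])
se = np.array([0.10, 0.20, 0.05, 0.30, 0.15])

grand_mean = raw.mean()
# Variance of true effects: observed variance minus average sampling noise.
sigma2_true = max(raw.var() - np.mean(se**2), 0.0)
# Reliability is in [0, 1] and is lower for noisier estimates, so those are
# shrunk more aggressively toward the grand mean.
reliability = sigma2_true / (sigma2_true + se**2)
shrunken = grand_mean + reliability * (raw - grand_mean)
```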

## 4. Results

### Which Teacher Characteristics Predict the Likelihood of Receiving a Layoff Notice?

Table 4 provides the estimated marginal coefficients from logit models identifying if teachers received a RIF notice in 2008–09 or 2009–10.^{39} The three columns present the marginal effects from specifications of model 1 that include: linear school covariates (column 1); quintiles of the percent of minority students in the school (column 2); and quintiles of the percent of students in the school eligible for free or reduced-price lunch (column 3).^{40} As discussed in section 3, all models in table 4 include school covariates and a district fixed effect, so coefficients should be interpreted as relative to teachers at similar schools in the same district.

| | 1 | 2 | 3 |
|---|---|---|---|
| 0–1 years seniority | 0.1160*** (0.0108) | 0.1160*** (0.0108) | 0.1161*** (0.0108) |
| 1–2 years seniority | 0.1108*** (0.0104) | 0.1107*** (0.0104) | 0.1109*** (0.0104) |
| 2–3 years seniority | 0.0924*** (0.0090) | 0.0924*** (0.0090) | 0.0925*** (0.0090) |
| 3–4 years seniority | 0.0812*** (0.0083) | 0.0813*** (0.0083) | 0.0813*** (0.0083) |
| 4–6 years seniority | 0.0551*** (0.0064) | 0.0551*** (0.0064) | 0.0552*** (0.0064) |
| 7–11 years seniority | 0.0263*** (0.0052) | 0.0263*** (0.0052) | 0.0264*** (0.0052) |
| Master's degree or higher | −0.0084*** (0.0014) | −0.0085*** (0.0014) | −0.0084*** (0.0014) |
| Teacher Endorsements (reference category is Elementary Education) | | | |
| Math | −0.0083* (0.0041) | −0.0083* (0.0042) | −0.0083* (0.0041) |
| Science | −0.0088* (0.0038) | −0.0088* (0.0038) | −0.0088* (0.0038) |
| English/LA | 0.0080* (0.0031) | 0.0082** (0.0031) | 0.0081* (0.0031) |
| Social Studies | 0.0098** (0.0035) | 0.0097** (0.0035) | 0.0097** (0.0035) |
| Special Ed | −0.0165*** (0.0040) | −0.0163*** (0.0040) | −0.0166*** (0.0040) |
| Health/PE | 0.0149*** (0.0038) | 0.0150*** (0.0038) | 0.0148*** (0.0038) |
| Agriculture/Tech/Other | −0.0007 (0.0037) | −0.0006 (0.0037) | −0.0006 (0.0037) |
| Arts | 0.0098*** (0.0030) | 0.0098*** (0.0030) | 0.0097*** (0.0030) |
| Languages | 0.0008 (0.0059) | 0.0008 (0.0059) | 0.0010 (0.0059) |
| School Percent Minority Students (by quintile in state; 1st (lowest) quintile is reference) | | | |
| 2nd Quintile | | −0.0001 (0.0023) | |
| 3rd Quintile | | 0.0031 (0.0030) | |
| 4th Quintile | | 0.0031 (0.0035) | |
| 5th Quintile (highest) | | −0.0041 (0.0045) | |
| School Percent FRPL Students (by quintile in state; 1st (lowest) quintile is reference) | | | |
| 2nd Quintile | | | −0.0011 (0.0027) |
| 3rd Quintile | | | −0.0015 (0.0031) |
| 4th Quintile | | | −0.0061^{+} (0.0035) |
| 5th Quintile | | | −0.0070^{+} (0.0042) |


*Notes:* Models combine both years of RIFs with year fixed-effect (55,656 teachers in 2008–09 with 1,717 RIFs, 54,996 teachers in 2009–10 with 407 RIFs). All columns include a district fixed effect, teacher variables (college selectivity, female dummy, and non-white dummy), interactions between each endorsement area and the number of endorsements the teacher holds (see figure 3), and school covariates (size, level, student/teacher ratio, and percent passing state exams).

Significance levels are based on a Wald test: ^{+}p < .10; *p < .05; **p < .01; ***p < .001.

As expected, seniority plays an important role in determining whether teachers receive a layoff notice. As the marginal effects on the seniority dummies indicate, the likelihood of receiving a RIF notice decreases at each additional threshold of seniority.^{41} For example, a first-year teacher is over 75 times more likely to receive a layoff notice, all else equal, than a teacher with 12 or more years of seniority.^{42} This finding is not at all surprising.

As it turns out, a number of other variables are also significant, controlling for seniority. For instance, teachers with a graduate degree are almost one percentage point less likely to receive a RIF notice, all else equal, and endorsements also appear to matter.^{43} In particular, teachers holding a single endorsement in math, science, or special education—precisely the three high-need areas identified by the state of Washington—are all significantly less likely to receive a RIF notice, all else equal, than teachers holding a single endorsement in elementary education.

The finding that endorsements predict the receipt of a RIF notice would seem to suggest that districts are protecting teachers credentialed in high-need areas from the layoff process, but there is a question of the mechanism through which this takes place. Many of the CBAs we reviewed stipulate that seniority-based layoffs should take place within credentialing areas, and with good reason; as table 1 shows, there is significant variability in seniority across credentialing areas. In particular, teachers endorsed in math, elementary education, and special education are, on average, considerably less senior than teachers in other areas.^{44} Because of this variability, it is conceivable that the negative marginal effects of teachers being credentialed in math, science, or special education are solely a product of the relative seniority of teachers within those areas rather than any affirmative managerial decision to protect teachers in high-demand subject areas.^{45}

Table 4 also demonstrates that teachers with a single endorsement in English/language arts, social studies, health/PE, or arts are all significantly more likely to receive a RIF notice than teachers holding a single endorsement in elementary education. Table 1 shows that teachers endorsed in any of these areas are, on average, more senior than teachers in elementary education or any of the high-need areas, so there is still a question of whether these positive marginal effects are the result of districts disproportionately identifying teachers in these areas for layoffs or are simply the product of teachers in these areas being more senior than teachers in other areas.

Figure 2 presents the marginal effects of holding a single endorsement in each area—taken directly from the marginal effects in table 4—and the marginal effect of holding an endorsement in each area in addition to endorsements in one or more additional areas.^{46} The clear trend in figure 2 is that the likelihood of receiving a RIF notice decreases markedly with the number of endorsements a teacher holds. In fact, the marginal effect of holding one additional endorsement, −0.6 percentage points, is roughly the change in probability associated with one additional year of early-career experience. Further, teachers with two or more additional endorsements are less likely to receive a layoff notice than teachers with a single endorsement in elementary education, regardless of the areas in which they have those endorsements. Interestingly, although teachers with a single endorsement in health/PE are the most likely, all else equal, to receive a RIF notice, teachers with an endorsement in health/PE and at least two other areas are the least likely to receive a RIF notice, all else equal. This lends credence to the notion that school districts are strategically protecting teachers with endorsements in multiple areas because of the flexibility in the classes they can teach.

Columns 2 and 3 of table 4 also present the marginal effects of teaching in schools with different levels of minority and high-poverty student composition.^{47} One might imagine that when facing a difficult personnel decision, districts may have chosen to lay off teachers in schools with more vulnerable and less politically active student populations.^{48} Contrary to at least some of the rhetoric around layoffs (UCLA/IDEA 2009; Sepe and Roza 2010), but consistent with Boyd et al. (2011) and Plecki, Elfers, and Finster (2010), we find little evidence that teachers in high-minority or high-poverty schools are disproportionately identified for layoffs. In fact, teachers in schools with a high percent of students in the free or reduced-price meal program are marginally less likely to receive a RIF notice, all else equal. Note that this does not necessarily mean that low-income or minority students are not disproportionately affected by teacher layoffs, because schools with low-income or high-minority enrollments might also share other characteristics (e.g., the seniority level of teachers) that influence layoffs. We explore the implications of this in greater detail in the next section.

### Is Teacher Effectiveness Considered in RIF Decisions?

In table 5 we report selected findings from models estimating the probability of receiving a RIF notice in 2008–09 that include estimates of teacher effectiveness.^{49} We include a variety of different teacher effectiveness measures in the RIF models in order to test the robustness of our findings. First, it is conceivable that judgments about teachers may be made based on a teacher's performance in the year in which layoffs occur, so column 1 of the table includes teacher-year value-added effect estimates.^{50} Of course, it is also possible that administrators’ judgments are based on a teacher's prior year of performance, given that value-added estimates (or simpler means of calculating the academic growth of students in a teacher's classroom) are unlikely to be available at the time that layoff notices are sent out. So, in column 2 we include teacher-year effect estimates based on the year prior to the year in which teachers received a layoff notice.^{51} Because teacher layoffs were not nearly as great a concern in 2007–08, this specification also addresses any concern that a RIF anticipation effect may bias our estimates.

| | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| 0–1 years seniority | 0.2337* (0.0268) | | 0.2121* (0.0245) | 0.2111* (0.0242) | 0.2109* (0.0241) | 0.2074* (0.0239) |
| 1–2 years seniority | 0.1849* (0.0254) | 0.2072* (0.0195) | 0.1930* (0.0225) | 0.1917* (0.0222) | 0.1904* (0.0219) | 0.1889* (0.0217) |
| 2–3 years seniority | 0.1190* (0.0267) | 0.1418* (0.0190) | 0.1284* (0.0204) | 0.1274* (0.0203) | 0.1271* (0.0202) | 0.1262* (0.0200) |
| 3–4 years seniority | 0.1347* (0.0273) | 0.1308* (0.0217) | 0.1262* (0.0227) | 0.1259* (0.0226) | 0.1246* (0.0224) | 0.1241* (0.0223) |
| Math value added | −0.0021 (0.0384) | 0.0226 (0.0419) | −0.0099 (0.0355) | −0.0230 (0.0351) | 0.0038 (0.0264) | −0.0108 (0.0133) |
| Reading value added | 0.0565 (0.0496) | 0.0682^{+} (0.0401) | 0.0297 (0.0367) | 0.0203 (0.0373) | −0.0098 (0.0244) | −0.0046 (0.0212) |


*Notes:* Models include district fixed effects, teacher covariates (college selectivity, master's degree, gender, and race), and school covariates (size, student/teacher ratio, and racial composition). The regressions in each column use a different specification of the VAM model, described here:

(1) Single-year VAM estimate from year of RIF notices (2008–09) (*N* = 2,611; 88 RIF teachers).

(2) Single-year VAM estimate from year before RIF notices (2007–08) (*N* = 4,728; 86 RIF teachers). Note that this sample necessarily omits first-year teachers, so we drop all teachers with 0–1 years seniority for consistency.

(3) Pooled-year VAM estimate across all three years of data (2006–09) (*N* = 6,545; 145 RIF teachers).

(4) Pooled-year VAM estimate, controlling for experience (2006–09) (*N* = 6,545; 145 RIF teachers).

(5) Pooled-year VAM estimate, including school fixed effect (2006–09) (*N* = 6,545; 145 RIF teachers).

(6) Pooled-year VAM estimate, including student fixed effect (2006–09) (*N* = 6,545; 145 RIF teachers).

*p < .001.

The final four columns of table 5 contain checks for robustness. Research shows a degree of intertemporal instability in VAM estimates of teacher effectiveness, but this instability decreases when multiple years of teacher-student data are used to inform the effectiveness estimates (Aaronson, Barrow, and Sander 2007; Goldhaber and Hansen 2010a; McCaffrey et al. 2009). So, in column 3 we report results for models that include as much matched student-teacher data as are available for each teacher.^{52} In column 4 we report the results of a specification that includes teacher effect estimates that are adjusted for teacher experience to account for the possibility that administrators may wish to protect the jobs of teachers who show a great deal of potential, even if they are less effective than more senior teachers. Finally, there is disagreement in the literature about whether it is appropriate to include school or student fixed-effects in VAMs designed to estimate teacher performance, so we report the results for models using effect estimates based on these specifications in columns 5 and 6, respectively.^{53}

There is little point in going into great detail about the magnitude of the coefficient estimates on estimated teacher effectiveness. In all cases we find that the relationship between teacher effectiveness and the probability of receiving a layoff notice is very weak, implying that effectiveness plays little or no role in determining which teachers are targeted for layoff.^{54}

## 5. Policy Implications

To get a more concrete sense of the extent to which various factors play into the targeting of teachers for layoffs, in table 6 we report the expected probability of receiving a layoff notice for teachers with various combinations of endorsement area and seniority level.^{55} Although the results from table 6 illustrate that a teacher's endorsement area affects the likelihood of being laid off, the effect is far smaller than the influence of seniority. For instance, we estimate the probability that a first-year special education teacher receives a layoff notice is 6.2 percent, compared with 17 percent for a first-year health/PE teacher. This difference is significant, but it pales in comparison to the difference in probabilities for a first-year teacher compared with a teacher with 12 or more years of seniority; the estimated probability of a teacher with 12 or more years of seniority receiving a layoff notice is less than a quarter of a percent for every endorsement area.

| Endorsement Area | 0–1 yrs | 1–2 yrs | 2–3 yrs | 3–4 yrs | 4–6 yrs | 6–12 yrs | 12+ yrs |
|---|---|---|---|---|---|---|---|
| Math | 9.3 | 7.0 | 3.7 | 2.4 | 0.95 | 0.31 | 0.11 |
| Science | 9.1 | 6.9 | 3.6 | 2.4 | 0.93 | 0.31 | 0.11 |
| English/LA | 13 | 10 | 5.4 | 3.6 | 1.4 | 0.47 | 0.17 |
| Social Studies | 13 | 9.7 | 5.2 | 3.4 | 1.4 | 0.44 | 0.16 |
| Elementary Ed | 10 | 7.8 | 4.2 | 2.7 | 1.1 | 0.36 | 0.13 |
| Special Ed | 6.2 | 4.7 | 2.4 | 1.6 | 0.62 | 0.20 | 0.074 |
| Health/PE | 17 | 13 | 7.4 | 4.9 | 2.0 | 0.65 | 0.24 |
| Agriculture/Tech/Other | 11 | 8.1 | 4.3 | 2.8 | 1.1 | 0.37 | 0.13 |
| Arts | 15 | 12 | 6.4 | 4.3 | 1.7 | 0.56 | 0.21 |
| Foreign Languages | 12 | 9.2 | 4.9 | 3.2 | 1.3 | 0.42 | 0.15 |


*Note:* Overall probability of receiving a RIF notice was 1.9%.

Next we examine the implications of using an effectiveness-based layoff system by comparing simulated effectiveness-based layoffs to the actual layoff notices that were largely driven by seniority.^{56} First, we estimate teacher effectiveness in both math and reading across all years of data up to and including 2008–09 (the year of the actual layoff notices). Next, for simplicity, the VAMs are averaged across the two subjects in order to obtain a single estimate of effectiveness for each teacher.^{57} Teachers in each school district are then ranked according to their value added. Finally, starting with the least effective teachers in each district and moving up the effectiveness distribution, enough teachers are assigned to a hypothetical layoff pool to achieve a budgetary savings for each district that is at least as great as the savings that would have resulted were all the teachers who received a RIF notice in 2008–09 actually laid off.^{58}
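The simulation logic just described (rank teachers by value added within district, then lay off from the bottom until salary savings match the savings implied by the actual RIF notices) can be sketched as follows; the toy data frame and all its values are invented:

```python
import pandas as pd

# Hypothetical teacher records: district, pooled VAM estimate, salary, and
# whether the teacher actually received a RIF notice. Values are made up.
teachers = pd.DataFrame({
    "district":   ["A"] * 4 + ["B"] * 3,
    "vam":        [0.10, -0.30, -0.05, 0.20, -0.15, 0.05, -0.40],
    "salary":     [60_000, 42_000, 50_000, 70_000, 45_000, 55_000, 48_000],
    "actual_rif": [0, 1, 0, 0, 1, 0, 0],
})

def simulate(df):
    """Lay off the lowest-VAM teachers in a district until the salary
    savings at least match the savings from the actual RIF notices."""
    target = df.loc[df["actual_rif"] == 1, "salary"].sum()
    simulated, saved = [], 0
    for idx, row in df.sort_values("vam").iterrows():
        if saved >= target:
            break
        simulated.append(idx)
        saved += row["salary"]
    return simulated

laid_off = []
for _, group in teachers.groupby("district"):
    laid_off.extend(simulate(group))
laid_off.sort()
```

Because low-VAM teachers tend to be more senior (and higher paid) than the teachers actually RIFed, fewer simulated layoffs are needed to reach the same savings, mirroring the aggregate pattern the paper reports.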

The overlap between the subgroup of teachers who actually received a RIF notice and the subgroup of teachers laid off in our value-added effectiveness-based simulation is relatively small: only 23 teachers (or 16 percent of teachers who received a RIF notice). Because the teachers who received RIF notices in our simulation were more senior (and have higher salaries) than the teachers who actually received RIF notices by an average of approximately eight years,^{59} the simulation results in far fewer layoffs. We conservatively calculate that districts would only have to lay off 132 teachers under an effectiveness-based system in order to achieve the same budgetary savings they achieved with 145 RIF notices under today's seniority-driven system, a difference of about 10 percent of the RIFed workforce.^{60}

Figure 3 shows the kernel density distribution of teacher effectiveness under the simulated effectiveness-based RIF versus the actual effectiveness of teachers who received RIF notices. Clearly, the setup of our simulation guarantees that there will be a difference between the average effectiveness of the two groups, so the true question is how great the difference is. Like Boyd et al. (2011), we find that the difference in average effectiveness is substantial: 20 percent of a standard deviation of student achievement in math and 19 percent of a standard deviation of student achievement in reading.^{61} These magnitudes are striking, roughly equivalent to the differential between students having a teacher who is at the 16th percentile of effectiveness rather than the 50th percentile. And, given estimates that students in the upper elementary grades typically gain a standard deviation from one grade to the next (Schochet and Chiang 2010), the differences we are detecting between RIF systems are on the order of 2.5 to 3.5 months of student learning.
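The percentile equivalence cited here follows directly from the normal CDF: under normality, a teacher one standard deviation below the mean of the effectiveness distribution sits near the 16th percentile. A quick check:

```python
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# A teacher 1 SD below the mean sits at roughly the 16th percentile...
pct = norm_cdf(-1.0)

# ...and with an estimated effect size of roughly 0.20 (the paper's math
# figure), a 1-SD drop in teacher effectiveness maps to about 0.20 SD of
# student achievement.
achievement_gap = 1.0 * 0.20
```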

Given that there is little overlap between the samples under different layoff systems, we consider the characteristics of the students whose teachers received a RIF notice under the actual system or in our effectiveness-based simulation (for the VAM subsample of teachers who can be linked to student-level data). Columns 1 and 2 of table 7 present the probability that a student in various subgroups would have their teacher laid off under the two systems. Column 1 presents the probabilities under the actual RIF system, and demonstrates that the probability of being in a classroom with a teacher who receives a layoff notice varies considerably from one subgroup to the next.

| | Probability of RIF (actual) | Probability of RIF (simulated) | Average VAM (actual RIF teachers) | Average VAM (simulated RIF teachers) |
|---|---|---|---|---|
| All students | 2.41 | 2.20 | −0.0487 (0.136) | −0.241 (0.0757) |
| White students | 2.40 | 2.31 | −0.0410 (0.130) | −0.238 (0.0769) |
| Non-white students | 2.42 | 2.00 | −0.0632 (0.145) | −0.248 (0.0727) |
| Black students | 3.77 | 2.81 | −0.0559 (0.141) | −0.244 (0.0654) |
| Hispanic students | 1.86 | 1.54 | −0.0800 (0.161) | −0.263 (0.0760) |
| Low-income students | 2.59 | 2.28 | −0.0526 (0.144) | −0.250 (0.0740) |


*Notes:* Standard deviations in parentheses. All VAM estimates are the mean pooled-year VAM estimate used for the VAM simulations. *N* = 143,005 students; 93,116 white, 7,958 black, 22,462 Hispanic, 62,114 low-income.

Black students are far more likely than other students to have been in a classroom of a teacher who received a RIF notice.^{62} In column 2, we repeat the calculations for the simulated effectiveness-based layoffs. The effectiveness-based layoffs result in fewer RIFs across all student subgroups. They also result in a more equitable distribution of RIFs; black students in particular are only marginally more likely to have been in a classroom with a teacher who received a RIF notice under this system. Finally, we explore the student achievement consequences of the two layoff systems by calculating the average effectiveness of the RIFed teachers of different subgroups of students under the actual layoffs (column 3) and under the effectiveness-based layoffs (column 4). The differences are similar across all subgroups, as teachers RIFed in our simulation are approximately 20 percent of a standard deviation in student performance less effective than teachers RIFed in reality.

Our effectiveness measures are Empirical Bayes shrunken estimates, but as Boyd et al. (2011) show, the findings may still overstate the achievement differential in a subsequent year given that performance in year *t* + 1 is likely to differ from that estimated in year *t*. Specifically, Boyd et al. find that the 26 percent of a standard deviation difference is reduced to 12 percent when teachers are judged by their value added in the year following the simulated layoffs. When we perform a similar exercise, considering the performance in the next school year of the subset of teachers who were hypothetically laid off under our simulation, the performance of the average teacher laid off in our simulation was still 19 percent of a standard deviation of student performance lower in math than the average teacher in the state, and 9 percent of a standard deviation of student performance lower in reading.

Of course, the effectiveness-based simulation we describe here comes with a number of caveats. First, we used a particular VAM specification (column 3 of table 5) in estimating effectiveness, and although the aggregated simulation results are similar regardless of the specification we use, the specific teachers laid off under each specification varies. For example, 82 percent of the teachers who receive a layoff notice when we use the pooled-year estimates also receive a layoff notice when we use the experience-adjusted estimates (column 4, table 5), but this number drops to 63 percent when we use school fixed-effect estimates (column 5, table 5) and 41 percent when we use the student fixed-effect estimates. So, although our aggregate results are robust to the VAM specification we use, the specific population of teachers receiving a layoff notice under an effectiveness-based system is in fact sensitive to the choice of specification.

Second, our effectiveness-based measure is a somewhat narrow estimate of the contribution of teachers to students in that it is limited to what is captured by Washington's state assessment. Perhaps more importantly, we can only perform a partial equilibrium analysis that does not account for any behavioral changes that might have far-reaching consequences for who opts to enter or remain in the teacher labor force and how teachers in the workforce perform.

Finally, our simulations are somewhat simplistic, describing a dichotomous choice between today's seniority-driven system and one that relies on value added. Policy makers might well consider a more nuanced approach that weighs measures of teacher performance in conjunction with seniority or other factors, and there is no reason that performance should be restricted to value-added measures. Having said this, one of the great limiting factors in considering any measure of teacher performance in making layoff determinations (or any other personnel decision) is that performance evaluations typically exhibit almost no variation (Weisberg et al. 2009). In Washington, for instance, 99.1 percent of teachers received a satisfactory rating on the binary evaluation instrument used by most districts in the state.^{63} This obviously limits the capacity to use any measure of teacher performance in making a high-stakes personnel decision like selecting teachers for layoffs.

## 6. Conclusions

Our findings are not terribly surprising to anyone who is familiar with seniority provisions in collective bargaining agreements: Seniority clearly matters for teacher job security. As we noted herein, all of the collective bargaining agreements governing policy in the ten largest school districts in Washington (during the years of our study) specify that seniority must be used as a factor in determining layoffs, and none mentions teacher effectiveness at all. Moreover, districts in Washington State have little incentive to consider the cost of their personnel due to the state funding formula. Given this, it is perhaps surprising that we find evidence that factors other than seniority appear to influence which teachers are targeted for layoffs. In particular, it is clear that teachers in high-need areas are less likely to be laid off, as are teachers with multiple endorsements. But, as the simulations illustrate, having more seniority in a district is far more important than a teacher's endorsement areas.

We cannot infer the general equilibrium consequences of a shift in the method by which teachers are targeted for layoffs, but the simulations comparing layoffs under an effectiveness-based system to the actual layoff notices under the existing seniority-driven system are intriguing. In short, this somewhat speculative exercise suggests the two systems would result in a very different distribution of teachers targeted for layoffs. Moreover, the choice of layoff system is estimated to have meaningful effects on both student achievement and the distribution of layoffs across student subgroups. An important caveat here is that these findings are based on the subsample of teachers for whom we can estimate value added (i.e., mostly fourth- and fifth-grade teachers who are responsible for students in math and reading).

School systems have not had to face the prospect of widespread teacher layoffs until recently, so the fact that seniority layoff provisions exist in most CBAs was historically largely irrelevant. This is clearly not the case today, nor is it likely to be the case in the near future as school systems wrestle with tighter budgets. Districts and states across the country are now rethinking layoff strategies. Our findings suggest this is sensible: although the simplicity and transparency of a seniority-based layoff system certainly have advantages, it is hard to argue that such a system is in the best interest of student achievement.

## Notes

According to the U.S. Department of Education, for instance, in 2005–06 about 54 percent of aggregate educational expenditures were devoted to instructional salaries and benefits (USDOE 2010).

We use the term “teacher quality” to mean the ability of teachers to contribute in measurable ways to student gains on standardized tests, and treat this as synonymous with the terms “teacher performance” and “teacher effectiveness.”

Moreover, other estimates (Hanushek 1992) show the difference between having a very effective versus a very ineffective teacher can be as much as a full year's learning growth.

See Kraft (2012) for a subsequent analysis of teacher layoffs in the Charlotte-Mecklenberg School District that builds on the work in this paper.

In fact, per-pupil spending in the United States has increased every year since 1970 with the exceptions of 1979–80, 1980–81, 1991–92, and 1992–93, and has more than doubled in that time ($10,041 in 2006–07 compared with $4,489 in 1970–71, as measured in 2007–08 dollars).

These figures are compiled based on information from the National Council on Teacher Quality (NCTQ) TR3 database (NCTQ 2009). Notably, however, some of these districts are trying to move away from seniority-based layoffs despite the language in CBAs. In addition to Chicago, Los Angeles Unified recently settled a lawsuit brought by the ACLU and agreed to limit the use of seniority in teacher layoffs (Felch and Song 2011).

Certificated categories include teachers (grouped by subject area taught), nurses, speech therapists, and any other credentialed employees.

For instance, over 90 percent of contracts in the manufacture of transportation equipment and communications industries include these seniority provisions, but only about 10 percent of contracts in construction specify seniority as a factor in layoffs (Bureau of National Affairs 1989).

Farber, Haltiwanger, and Abraham (1997) provide a comprehensive look at differences in job loss rates by gender, age, and education, as well as across different occupations and industries for U.S. workers.

For instance, as the California Teachers Association Handbook states: “The seniority system should be encouraged. The seniority system has demonstrated its equity and validity in protecting the rights of all employees. All personnel begin vesting in the system from the first day of service, and modification of the seniority system imperils job security for all employees.” (CTA 2010, p. 153)

Roza (2009) calculates that if layoffs are done solely on the basis of seniority, a district needing to reduce salary expenditures by 5 percent must lay off 7.5 percent of its workforce. And, consistent with this, Boyd et al. (2011) find that reducing expenditures on teacher salaries by 5 percent requires terminating about 5 percent of teachers under an effectiveness-based system and about 7 percent under a seniority-based system.
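The arithmetic behind these figures can be sketched with a toy example. The salary schedule below is an illustrative assumption (not actual Washington data), but it reproduces the qualitative point: because seniority-based layoffs remove the most junior, and therefore lowest-paid, teachers first, more of them must be cut to reach the same savings target.

```python
import math

# Sketch only: a hypothetical district of 100 teachers whose salaries
# rise with seniority (made-up schedule, not Washington data).
salaries = [40_000 + 2_000 * min(seniority, 15) for seniority in range(100)]
payroll = sum(salaries)
target = payroll // 20  # goal: cut salary expenditures by 5 percent

def n_layoffs(ordered_salaries, target):
    """Cut teachers in the given order until cumulative savings meet target."""
    saved, n = 0, 0
    for s in ordered_salaries:
        if saved >= target:
            break
        saved += s
        n += 1
    return n

# Seniority-based layoffs: the cheapest (most junior) teachers go first.
cut_by_seniority = n_layoffs(sorted(salaries), target)

# Salary-neutral benchmark: assume laid-off teachers earn the average salary.
cut_at_average = math.ceil(target / (payroll / len(salaries)))

print(cut_by_seniority, cut_at_average)  # seniority-based requires more cuts
```

With this made-up schedule, the seniority rule must cut 8 of 100 teachers versus 5 under the average-salary benchmark, the same direction as the 7.5 versus 5 percent figures above.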

We thank Harvey Erickson, Annie Pennucci, and an anonymous reviewer for their insight on this topic. For more information, see www.leg.wa.gov/Senate/Committees/WM/Documents/Publications/BudgetGuides/2011/FINALK-12Guide2011.pdf.

A key distinction between our analysis and that of Boyd et al. is that we use data on actual teacher layoff notices, whereas Boyd et al. rely on simulations of seniority-based layoffs.

This somewhat overstates the true impact of the difference in teacher layoff systems because some portion of measured teacher effectiveness is sampling error (Goldhaber and Hansen 2010a; McCaffrey et al. 2009), but the authors account for this by comparing teachers under the two regimes in the future. In these estimates they still find a differential of 12 percent of a standard deviation of student achievement.

Of the 1,717 teachers who received a RIF notice in the 2008–09 school year, 1,457 were still teaching in 2009–10.

We also utilize college selectivity data supplied by the College Board.

Technically, the database contains the amount apportioned to the district to pay each teacher. However, most (if not all) districts pay teachers the apportioned amount in the database.

The number of years of in-district experience is, on average, 75 percent of the number of years of in-state experience (for teachers who have taught fewer than 15 years). So, for teachers with the maximum number of years of in-district experience, the in-district experience value was imputed using the formula: *seniority (imputed) = 15 + [(experience – seniority) × 0.75]*
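The imputation rule can be sketched in code; `impute_seniority` is our hypothetical name for it, and we assume the 15 in the formula reflects top-coding of in-district experience at fifteen years.

```python
def impute_seniority(experience, seniority, cap=15, ratio=0.75):
    """Impute in-district seniority for teachers at the top-coded cap.

    Below the cap the recorded value is kept; at the cap, the in-state
    experience beyond the recorded seniority is scaled by the observed
    average ratio of in-district to in-state experience (0.75).
    """
    if seniority < cap:
        return seniority
    return cap + (experience - seniority) * ratio

# A teacher with 25 years of in-state experience whose in-district
# seniority is top-coded at 15 is imputed 15 + (25 - 15) * 0.75 = 22.5.
print(impute_seniority(25, 15))
```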

This database has missing data for both very old teachers—who began teaching before the state started to require and track teaching credentials—and very new teachers who have either not finalized their credentials or whose paperwork has not been processed yet. We discuss our approach to missing data in the Analytic Approach section.

A large proportion of teachers have received all their credentials directly from the state's Office of Superintendent of Public Instruction, meaning they either received their credential from out of state or earned it through an alternative certification route. This means we have college selectivity data for 39,261 of the 55,656 teachers in the 2008–09 sample. We discuss our approach to missing data in the Analytic Approach section.

For more information, see http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2010331.

Though we focus on a teacher's endorsement areas, we recognize that a teacher's endorsements do not necessarily align with his or her actual placement. For example, Ingersoll (2003) reports that in 1999–2000, 29 percent of secondary math teachers did not have a credential to teach mathematics. Nonetheless, our objective is to assess the determinants of teacher layoffs, and we believe a teacher's endorsements may be as important as, or more important than, his or her actual placement in the layoff process.

For the summary statistics presented in the next section, we restrict the database to include only each teacher's most recent credential, so each teacher also has a single dummy variable for the area in which he or she is most recently endorsed.

For the purposes of this paper, we define “urban” as an urbanized area within a city of 100,000 people or more.

Teachers who received layoff notices in 2009–10 were more experienced on average by about a year and a half than teachers who received layoff notices in 2008–09, and there is less evidence that high-need subjects (defined subsequently) were protected.

The number of layoffs necessary to meet a budgetary target, calculated under the assumption that teachers earning the district's average salary are laid off, is somewhat overstated because the number of teachers needed to achieve each district's budgetary target is rounded up whenever the estimate is not a whole number.

An alternative measure of high-need subjects, based on the federal Teacher Education Assistance for College and Higher Education grant program's definition of high-need subjects, presents a similar picture.

The proctor of the state assessment was used as the teacher–student link for at least some of the data used for analysis. The “proctor” variable was not intended to be a link between students and their classroom teachers, so this link may not accurately identify those classroom teachers. However, for the 2009–10 school year, we are able to check the accuracy of these proctor matches using the state's new Comprehensive Education Data and Research System (CEDARS), which matches students to teachers through a unique course identifier. Our proctor match agrees with the student's teacher in the CEDARS system for about 95 percent of students in math and 94 percent of students in reading. Further, fitting a teacher production function to these data produces similar results to those found elsewhere in the literature (e.g., Jackson and Bruegmann 2009; Clotfelter, Ladd, and Vigdor 2007; Goldhaber et al. 2012).

We restrict the sample of teachers to cases where the proctor is a teacher reported to be teaching in a “self-contained” classroom; the proctor's certificate issued by the Washington State PESB is for a “Teacher”; at least 50 percent of the proctor's time is spent in the school where they proctored a test; the proctor taught no more than one grade in a given year; and the proctor's class size is within reasonable limits (i.e., not smaller than 10 nor larger than 29). Relatively few sixth-grade teachers are in the sample given that many sixth grades are in middle schools so teachers are not in self-contained classrooms.

The classroom-level statistics include all the students in the teacher's class, not just the students whose test scores are used to generate the teacher's VAM estimate.

These estimates of effectiveness are based on the models described subsequently (equations 2 and 3) and are “shrunken” using Empirical Bayes (EB) techniques.

Experience and seniority are highly correlated (a bivariate correlation of about 0.85) so we only include seniority in the models. The results are very similar if experience is used in place of seniority.

In each model, we also include missing value dummies to test whether teachers with missing values in each area differ significantly from other teachers. The only variable for which the missing value dummy was ever significant was endorsement area, which is not surprising because these data are missing mostly for very experienced teachers who were unlikely to receive a RIF notice.

In a district fixed-effect model, teachers in districts without layoff notices do not contribute to the estimates, so the “effective sample size” in these models is 66,431 teacher/year observations from districts that delivered at least one layoff notice in 2008–09 or 2009–10. For results from other parameterizations of model (1), see Goldhaber and Theobald (2011).

We focus on self-contained classrooms so that subject area does not vary by teacher, class, or school. The annual class grouping of students, however, implies shared (and potentially unobservable) environmental factors that influence the performance of the entire class, contributing to positive intra-class correlation among students in the same classroom; we account for this by clustering students at the classroom level.

Teachers who are in the workforce in all of those years have effectiveness estimates based on the full panel; the estimates for many junior teachers are, of course, based on fewer years of data.

All VAMs were run in Stata using the user-written programs FESE (Nichols 2008) and a2reg (Ouazad 2007).

The standard EB method shrinks estimates back to the grand mean of the population. Note, however, that the standard EB adjustment does not properly account for the uncertainty in the grand mean, suggesting the estimates are shrunk too much (McCaffrey et al. 2009). We use the standard approach commonly employed in the literature (an appendix on EB shrinkage is available from the authors by request).
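A generic sketch of standard EB shrinkage (not the authors' appendix formulas; the signal variance here is a simple method-of-moments estimate): each raw teacher effect is pulled toward the grand mean in proportion to its sampling noise.

```python
def eb_shrink(estimates, std_errors):
    """Shrink raw teacher-effect estimates toward the grand mean.

    The shrinkage weight for teacher j is
        reliability_j = signal_var / (signal_var + SE_j**2),
    where signal_var (the variance of true effects) is estimated as the
    total variance of the raw estimates minus the mean sampling variance.
    """
    n = len(estimates)
    grand_mean = sum(estimates) / n
    total_var = sum((e - grand_mean) ** 2 for e in estimates) / (n - 1)
    signal_var = max(total_var - sum(se ** 2 for se in std_errors) / n, 0.0)
    return [
        grand_mean + signal_var / (signal_var + se ** 2) * (e - grand_mean)
        for e, se in zip(estimates, std_errors)
    ]

# Noisier estimates (larger standard errors) are shrunk harder toward the mean.
print(eb_shrink([0.5, -0.5, 0.2, -0.2], [0.1, 0.1, 0.4, 0.4]))
```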

The results are found to be qualitatively similar when the models are estimated separately for each school year. The only major difference is that fewer of the variables are statistically significant in 2009–10, which is not surprising given the smaller sample of RIFed teachers in that year.

The logit coefficients represent the expected change in the log odds of a teacher receiving a RIF notice per one-unit change in the predictor. For a more straightforward interpretation, however, we transform the coefficients into marginal effects and present the results in terms of the marginal probability of receiving a RIF notice. We calculate the marginal effects at the individual level (as recommended in Greene 2000, p. 816) and then take the sample mean of those effects to calculate the overall marginal effect of each variable.
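For a logit, the individual-level marginal effect of a continuous predictor with coefficient beta_k is beta_k × p_i × (1 − p_i), and averaging over the sample gives the overall marginal effect described above. The sketch below is illustrative only; the function name and coefficient values are hypothetical, not estimates from the paper.

```python
import math

def avg_marginal_effect(beta_k, X, betas, intercept=0.0):
    """Average over individuals of beta_k * p_i * (1 - p_i) for a logit."""
    effects = []
    for x in X:
        z = intercept + sum(b * v for b, v in zip(betas, x))
        p = 1.0 / (1.0 + math.exp(-z))  # predicted probability for person i
        effects.append(beta_k * p * (1.0 - p))
    return sum(effects) / len(effects)

# Two hypothetical teachers with a single predictor and coefficient 2.0;
# the individual effect is largest where the predicted probability is near 0.5.
print(avg_marginal_effect(2.0, [[0.0], [1.0]], [2.0]))
```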

Twelve or more years of seniority is used as the reference category for each seniority indicator, so all coefficients on the seniority indicators should be interpreted relative to teachers who have taught twelve or more years in the district. Separate specifications that use other levels as the reference category demonstrate that these marginal effects are all significantly different from each other as well as from the twelve-years-and-over group.

This is calculated from table 6. Note that the marginal probability of receiving a layoff notice for a teacher with twelve years or more seniority is less than 0.25 percent across all endorsement areas.

As discussed in section 3, many teachers in the state hold endorsements in more than one area, so all models include interactions between individual teacher endorsement areas and indicators for the number of additional endorsements the teacher holds. Table 4 includes the coefficient for each individual endorsement area, and these coefficients should be interpreted as the marginal effect of holding a single endorsement in each area relative to holding a single endorsement in elementary education.

The tenth percentile of seniority for both math and special education is 0, meaning that over 10 percent of math and special education teachers in the state are in their first year in the district.

Specifically, our logit models estimate the probability of a teacher receiving a layoff notice relative to other teachers of the same seniority. Because math teachers are significantly less senior than PE teachers, for example, in a system that gives layoff notices by seniority within credentialing areas, a third-year math teacher may be less likely to receive a layoff notice solely because she is relatively more senior within her credentialing area. We test this possibility by running a model that does not control for seniority, and find that the coefficients on several endorsement areas are still negatively correlated with the probability of receiving a RIF notice: special education, science, and social studies are all significantly negative, and arts is significantly positive. This suggests that our results reflect a combination of administrators protecting hard-to-staff areas, such as special education, and giving RIF notices by seniority within endorsement areas.

The marginal effect of holding an endorsement in one area as well as one additional endorsement in another area (for example) is calculated by adding the marginal effect of holding a single endorsement in that area, the marginal effect of holding one additional endorsement, and the marginal effect on the interaction terms between these two indicators.

The reference category for these variables is the first quintile, so coefficients should be interpreted as relative to teachers who teach in a school whose percentage of minority students or students receiving free or reduced-price meals falls in the lowest 20 percent of the state.

We do not observe political involvement directly, but there is well-documented concern that minority and poor students are less likely to receive adequate school resources (e.g., Orfield and Lee 2005).

Although we do not report them, the inclusion of the teacher effectiveness variables does little to change the estimated effects of the district- and school-level covariates.

Note that the teacher-year effect estimates are from model specifications that do not necessarily include classroom or school covariates.

One drawback of this model is that we cannot include any first-year teachers.

This model is also limited in that most teachers receiving a layoff notice are quite junior and so have few years of matched data informing the estimates of effectiveness.

The estimates in the models that include school or student fixed effects were very imprecise, so we opted to use teacher effect estimates for these specifications that are not adjusted using EB methods.

This is not surprising given that the correlations among the effect estimates are generally high: The correlations between the various VAM specifications we use are all greater than 0.75, with the exception of the estimates for 2008–09 (the year of the RIF notices) and 2007–08 (the year before the RIF notices), which have a bivariate (intertemporal) correlation of 0.37. This correlation is consistent with the year-to-year correlations of 0.3 to 0.5 found elsewhere in the value-added literature (Goldhaber and Hansen 2010b).

These simulations are based on the marginal effects reported in column 1 of table 4. The choice of specification does have an impact on the estimated effects of schooling variables on the likelihood of receiving a layoff notice, but very little impact on the magnitude of the effect of teacher seniority. For each combination of endorsement and seniority level, we use the true covariate values for the nonendorsement and nonseniority variables, together with the marginal effects from column 1 of table 4, to predict the likelihood that each teacher would receive a layoff notice if he or she had that endorsement/seniority level. The average of these probabilities gives the simulated probability that a teacher with that combination of endorsement and seniority level would receive a layoff notice.

We note that using current-year VAM estimates for layoff decisions is not possible because test scores are generally not available until the late summer. The results presented here, then, should be interpreted as speculative evidence about the potential differences between layoffs based on seniority and those based on effectiveness.

This is not necessarily the most sophisticated way of generating a cross-subject estimate of teacher effectiveness—for example, it is possible to use both math and reading scores in the same model to generate a single measure of effectiveness—but it represents one simple way that a district might choose to use these scores.

In doing this calculation we are restricted to the sample of teachers for whom we can estimate value added. We do not control for the changes in class size that result from the layoffs but our estimates suggest the effects of class size changes are quite small. In particular, a one student increase in class size is associated with a drop of only 0.0014 standard deviations of student performance, meaning that the expected change in student achievement resulting from any change in average class size is negligible.

The average seniority of teachers who actually received RIF notices was 1.78 years, whereas the average seniority of RIFed teachers in our simulation was 9.69 years.

As in our first simulation, this 10 percent estimate is very conservative because we required that districts achieve at least the same savings in our simulation as they did in reality. Had we ignored districts and simulated effectiveness-based RIF notices across the state, only 117 teachers (20 percent fewer) would have needed to receive a RIF notice. Because 10 percent is very conservative, and 20 percent ignores the reality that layoffs occur within districts, the true reduction in RIFs under an effectiveness-based system is likely to be between 10 percent and 20 percent.

These estimates are somewhat smaller than the estimate in Boyd et al. (2011) that the difference is 0.26 standard deviations in student achievement. This is likely for two reasons. First, our simulations are based on actual teacher layoff notices rather than the simulated seniority-based layoffs in Boyd et al., so our results reflect all the factors that appear to drive teacher layoff decisions. Second, we are interested in simulating a statewide policy/practice that operates within districts. Our findings on differences in effectiveness between the two systems would have been even larger had we ignored district boundaries (because some teachers who would not be RIFed based only on within-district comparisons would be RIFed based on teacher comparisons between districts).

This result is at least partly attributable to the fact that these students are more likely to be taught by more junior teachers; specifically, the average white student in Washington has a teacher who is one year more senior than the teacher of an average black student. We show in section 4 that teachers in schools with a high percent of minority students are not more likely to receive a RIF notice after controlling for seniority.

Authors’ calculation from data supplied by the Office of Superintendent of Public Instruction.

## Acknowledgments

The research presented here utilizes confidential data from Washington State supplied by the Office of Superintendent of Public Instruction. We gratefully acknowledge the receipt of these data. We wish to thank the Gates Foundation for support for the Urban Institute Capital Project. We also wish to thank the American Enterprise Institute and the National Center for the Analysis of Longitudinal Data in Education Research for supporting this research through grant R305A060018 to the Urban Institute from the Institute of Education Sciences. This paper has benefited from helpful comments from Joe Koski, John Krieg, Katharine Strunk, Annie Pennucci, Harvey Erickson, and two anonymous reviewers. Finally, we wish to thank Steve Dieterle, Stephanie Liddle, and Scott DeBurgomaster for research assistance, and Jordan Chamberlain for editorial assistance. The views expressed in this paper do not necessarily reflect those of the University of Washington, Washington State, or the study's sponsor. Responsibility for any and all errors rests solely with the authors.

## REFERENCES

*Brookings Papers on Economic Activity.*