Abstract

Strategic Performance Indicators (SPIs) are summary measures derived from parallel, descriptive analyses conducted across educational agencies. The SPIs are designed to inform agency management and efforts to improve student outcomes. We developed the SPIs to reveal patterns common across partner agencies, to highlight exceptions to those patterns, and to provide tools for educational agencies to understand their own successes and challenges. We present two examples of SPI briefs and highlight specific steps partner agencies have taken in response to such analyses. Our goal is that the SPIs will catalyze educational agencies to engage in deep, data-driven investigation. If educational systems must rely on academic researchers to make progress on core management challenges, progress will be slow. The ability to use and analyze administrative data must be integrated into the everyday management of educational systems, much as it has been in many other sectors of the modern economy.

The Strategic Data Project's Strategic Performance Indicators

The mission of the Strategic Data Project (SDP) is to transform the use of data in education to improve student achievement. To date, SDP, which was developed at Harvard University's Center for Education Policy Research, has partnered with over thirty state and local education agencies to increase their capacity to utilize administrative data in order to guide strategy and decision making.

With many of our partner agencies, we conduct a set of performance diagnostics, descriptive analyses intended to distill important and actionable information from their administrative data systems. These diagnostics serve two purposes. First, they provide our partners with systematic evidence on their performance in key areas, such as the college-going outcomes of their students after high school graduation and the recruitment, development, and retention of effective teachers. Second, they illustrate the types of knowledge agencies can glean by synthesizing and analyzing their existing data.

The Strategic Performance Indicators (SPIs) emerged from this work. Strategic Performance Indicators are summary measures derived from parallel, descriptive analyses of a common set of issues conducted with each partner agency's existing data to inform agency management and focus efforts to improve student outcomes. We developed the SPIs to reveal patterns common across many of our partners, highlight exceptions to those patterns, and provide tools to agencies both within and outside the SDP network to understand their own successes and challenges with regard to each SPI's focal issue.

Many policy briefs attempt to summarize the latest academic research findings and provide policy makers with an easily understood overview of what we do and do not know based on that research. Others go a step further and argue for particular policy solutions on the basis of existing research findings. In both cases, the writer brings generalizable evidence to bear on policy issues of widespread concern. Where the actual research driving policy conclusions was conducted is of secondary importance, and the ability of policy makers and organizations to replicate the analysis typically is not a goal. The briefs presenting each SPI, however, explicitly differ from this model in three ways.

First, the SPI briefs gain their credibility, at least in part, through their specificity. The data presented act as a mirror that allows leaders of participating agencies to see and act on relevant trends. Policy briefs relying exclusively on research results from other settings may be less persuasive and compelling than results associated with one's own organization. Whereas findings generated in other educational entities can be shrugged off as irrelevant, results from one's own agency can reveal pressing management problems that necessitate response.

Second, the SPI briefs aim to provoke further analysis and discussion within our partner agencies. We avoid providing answers regarding appropriate policy responses for two primary reasons. First, our partners face very different academic, policy, and funding environments. To present policy responses presumes that we have a recipe that would work for all (or at least many) agencies. Second, and more importantly, the SPIs are not derived from “root-cause” analyses of the patterns they reveal in any one district. If the patterns revealed are of concern, more analysis needs to be done to guide strategy and policy. Our aim, therefore, is that the SPIs will catalyze the agencies to engage in this deeper investigation, in turn developing and honing their own analytical skills. If educational systems must rely on academic researchers to make progress on core management challenges, progress will be slow. The ability to use and analyze administrative data must be integrated into the everyday management of educational systems, much as it has been in many other sectors of the modern economy.

Third, we intend for these analyses to be replicable by many other agencies. Indeed, we are developing and disseminating tools to support their proliferation and widespread adoption. The cross-organization benchmarking made possible by the SPI briefs will enable agencies to learn from one another and enhance their ability to understand how performance on various dimensions ranges across systems. We expect that the SPI results will vary across agencies (otherwise, why conduct them?) and will change over time as a result of management actions taken in response to earlier findings. In this way, the replication of the SPIs as a strategy for tracking improvement in a given agency is at least as important as that agency's initial findings.

In sum, the SPI briefs are not simply arguments about what we as researchers know and how to communicate this information to system leaders and policy makers. They are instead illustrations of what is possible for agency leaders to know about their own systems and the critical importance of having such knowledge for driving agency change.

The next two sections of this article provide examples of the SPI briefs that the Strategic Data Project has developed. College Going by High School addresses variation in college-going outcomes across high schools within partner agencies, and Novice Teacher Placements examines which students are assigned to novice teachers both across and within each agency's elementary and middle schools. (Additional SPI briefs are available at www.strategicdataproject.org.) We conclude by discussing examples of specific steps partner agencies have taken in response to the patterns they observed in their own administrative data through their participation in the SDP diagnostic process.

The SDP College-Going Diagnostic Strategic Performance Indicators

College Going by High School: Do College Enrollment Rates Differ Across High Schools?

Summary of Findings

Although prior academic achievement is a strong predictor of whether a student graduates from high school and enrolls in college, college-going rates for students with similar levels of prior academic achievement vary dramatically across high schools within a school district. This brief presents results for Albuquerque Public Schools (NM), Boston Public Schools (MA), Charlotte-Mecklenburg Schools (NC), Fort Worth Independent School District (TX), Fulton County Schools (GA), Gwinnett County Public Schools (GA), and The School District of Philadelphia (PA). Although descriptive, these results suggest that a high school can have a considerable impact on the college enrollment patterns of its students.

Introduction

In today's knowledge-based economy, earning a college degree is more important than ever. For far too many students, however, the dream of going to college remains unfulfilled. In this brief, we examine the college-going rates of high school graduates from seven school districts across the country, using a set of SPIs developed by the SDP at Harvard University's Center for Education Policy Research.1

To examine college enrollment rates for graduates of the SDP partner districts, we link student-level high school records to college attendance information available from the National Student Clearinghouse.2 After studying variation in college-going rates across the high schools within each district, we delve more deeply into the relationship between college enrollment rates and prior academic achievement. Finally, we compare college-going rates across high schools among students with similar levels of prior achievement. At the end of the brief, we pose questions for school district leaders to consider and suggest action steps to increase the share of students who complete high school and continue on to college.
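For agencies that want to replicate this first step, the sketch below shows one way to link graduation records to NSC match results and compute school-level college-going rates, using Python with pandas. The file names and column names (student_id, hs_name, enrolled_any, enrolled_4yr) are illustrative assumptions, not the actual SDP data schema.

```python
import pandas as pd

# Illustrative sketch: link district graduation records to National Student
# Clearinghouse (NSC) match results and compute college-going rates by high
# school. File and column names are hypothetical placeholders.
grads = pd.read_csv("hs_graduates.csv")   # one row per graduate: student_id, hs_name, grad_year
nsc = pd.read_csv("nsc_matches.csv")      # student_id, enrolled_any, enrolled_4yr (fall after graduation)

linked = grads.merge(nsc, on="student_id", how="left")
# Graduates with no NSC match are treated as not enrolled.
linked[["enrolled_any", "enrolled_4yr"]] = linked[["enrolled_any", "enrolled_4yr"]].fillna(0)

rates = linked.groupby("hs_name").agg(
    n_grads=("student_id", "size"),
    pct_any=("enrolled_any", "mean"),
    pct_4yr=("enrolled_4yr", "mean"),
)
rates[["pct_any", "pct_4yr"]] *= 100      # convert proportions to percentages
print(rates.sort_values("pct_4yr", ascending=False))
```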

Findings

There can be sizable differences in college-going rates across high schools within a district. In figure 1, for every high school within each district, we illustrate the percentage of graduates who enroll in a four-year college as well as the percentage who enroll in any (i.e., either a two- or four-year) postsecondary institution. Each circle in the figure represents a high school, and the size of the circle reflects the size of that school's graduating class. The vertical placement of the circle reflects the percent of the school's graduates who matriculate to college the fall after graduation. The horizontal line indicates the overall rate of college going within each district.

Figure 1.

College Enrollment Rates among High School Graduates, by High School. Notes: Results for each district and college type include the range of college-going rates. pp refers to “percentage points.” For example, four-year college-going rates among high school graduates range by 44 percentage points across high schools in the Albuquerque Public Schools.
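A chart in this style can be generated with standard plotting tools. The sketch below is one possibility; it assumes the hypothetical rates table from the previous snippet and plots one district's high schools as circles sized by graduating-class size, with a dashed line at the graduate-weighted district rate.

```python
import matplotlib.pyplot as plt

# Sketch of a figure-1-style chart for a single district, assuming the
# `rates` table computed above (one row per high school).
fig, ax = plt.subplots()
x = range(len(rates))                     # one slot per high school
ax.scatter(x, rates["pct_4yr"], s=rates["n_grads"], alpha=0.6)

# District-wide rate: graduate-weighted average across schools.
district_rate = (rates["pct_4yr"] * rates["n_grads"]).sum() / rates["n_grads"].sum()
ax.axhline(district_rate, linestyle="--")

ax.set_ylabel("Percent enrolling in a four-year college")
ax.set_xticks([])                         # school identities suppressed
plt.show()
```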

Figure 1 shows there is substantial variation in college-going rates across high schools within each district. In Philadelphia, for example, there is an 89-percentage-point spread between the high school with the highest rate of four-year college enrollment and the high school with the lowest rate; for some high schools, the vast majority of graduates enroll in four-year colleges, whereas for others, relatively few do. In Fort Worth, TX, on the other hand, the spread is much smaller (28 percentage points), but no high school in that district has a four-year college enrollment rate above 40 percent.

What might be driving these within-district patterns? One likely explanation is that schools serve different populations of students. If some high schools serve students who arrive better prepared academically, for example, we would expect these schools to have higher college-going rates.

The Strategic Performance Indicators: How Do College Enrollment Rates Vary for Similarly Achieving Students Attending Different High Schools Within a District?

We use two SPIs to explore college-going outcomes across high schools. The first indicator examines the extent to which college-going rates for each school relate to prior student achievement. In figure 2, we plot, by district, high school-specific rates of college going by the average achievement of that school's students in the eighth grade.3

Figure 2.

Relationship between College Enrollment Rates and Average Prior Academic Achievement among High School Graduates

The strength of the relationship between college-going rates and average prior achievement can be seen by examining the spread of the points around the trend line. Points clustered tightly around the trend line suggest that the prior achievement of a school's incoming students relates strongly to overall rates of college going. If points are scattered more widely, it suggests high schools themselves are having different levels of success with students of similar academic achievement. The correlation coefficient (a statistical measure that summarizes how tightly the school-level results cluster around the trend line) formalizes the strength of this relationship. No correlation (0) means that the prior achievement of students is not related to the percentage of a school's graduates going to college, and a perfect correlation (1.0) indicates that average assessment scores across incoming students are a perfect predictor of the percentage of graduates going to college. Across districts, the results confirm that high schools serve students with varying levels of academic preparation, on average, and that those with better-prepared students tend to have higher college enrollment rates. The strength of this relationship varies, with district-level correlations ranging from 0.57 to 0.96.
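As a concrete illustration of how this indicator can be computed, the sketch below aggregates hypothetical student-level data to the school level and takes the Pearson correlation; the input file and column names are assumptions.

```python
import pandas as pd

# Sketch: correlate each school's mean incoming (8th-grade) math z-score with
# its four-year college-going rate. Columns are hypothetical placeholders.
students = pd.read_csv("graduates_with_scores.csv")   # hs_name, grade8_math_z, enrolled_4yr

school = students.groupby("hs_name").agg(
    mean_g8=("grade8_math_z", "mean"),
    pct_4yr=("enrolled_4yr", "mean"),
)
r = school["mean_g8"].corr(school["pct_4yr"])         # Pearson correlation
print(f"School-level correlation: {r:.2f}")           # SDP districts ranged from 0.57 to 0.96
```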

Although this indicator illustrates that school-level college-going rates can be predicted by students’ average academic achievement at the beginning of high school, whether an individual student will enroll in college is hardly a foregone conclusion when they start ninth grade. Some high schools have higher college enrollment rates than predicted based on the prior achievement of their students (these schools are represented by dots above each district's trend line), whereas others have lower rates than predicted (dots below the trend).

In fact, most high schools serve a variety of students, some with low levels of prior achievement and others with high levels of prior achievement. Thus, looking only at the overall college-going rate for each high school may hide important variation within schools. Accordingly, our second indicator examines variation within and across high schools by comparing college-going rates among students with similar prior achievement.4

In figure 3, each circle now represents students in a given high school whose eighth-grade math test scores ranked them in a particular quartile of all incoming high school students district-wide. The size of each circle reflects the relative number of graduates in a given school and prior achievement quartile, and the vertical placement of the circle indicates the percentage of graduates in that group who matriculate to college.5 The horizontal lines now represent quartile-specific district averages.

Figure 3.

College Enrollment Rates among High School Graduates Within Quartile of Prior Achievement, by High School
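The quartile construction behind figure 3 (see note 4) can be sketched as follows; column names again are illustrative assumptions.

```python
import pandas as pd

# Sketch of the second indicator: sort students into district-wide quartiles of
# 8th-grade math achievement, then compute college-going rates by school and
# quartile, along with the quartile-specific district averages (the horizontal
# lines in figure 3).
students = pd.read_csv("graduates_with_scores.csv")
students["quartile"] = pd.qcut(students["grade8_math_z"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

by_cell = students.groupby(["hs_name", "quartile"], observed=True).agg(
    n_grads=("enrolled_4yr", "size"),
    pct_4yr=("enrolled_4yr", "mean"),
)
district_avgs = students.groupby("quartile", observed=True)["enrolled_4yr"].mean()
print(by_cell, district_avgs, sep="\n\n")
```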

Figure 3 reveals several noteworthy patterns. First, the upward trend in the quartile-specific district averages provides further evidence that students who enter high school at a higher achievement level continue on to college at higher rates than their lower-performing peers. Nevertheless, even within prior achievement quartiles, substantial differences exist across high schools. Further, although each school typically sends students from all four achievement quartiles to college, schools do so at different rates.

Consider the college-going rates of graduates who were in the top quartile of eighth-grade achievement. In Boston, MA, for example, college enrollment rates for these students range by about 50 percentage points across high schools; in Gwinnett County, GA, on the other hand, the spread is much smaller (approximately 20 percentage points). It is also important to note that in some districts, college-going rates for students with strong achievement histories are quite low in some high schools. This pattern raises the question of whether certain schools—and school districts—are adequately supporting promising students to help them reach their full potential.

Finally, within each district, there is substantial overlap in the distributions of college-going rates across quartiles. This reveals that students entering with substantially lower academic achievement in some schools transition to college at higher rates than do higher-performing students in other schools.

Implications from Findings

Patterns shown here reveal that, across all seven SDP partner districts considered, students’ achievement in eighth grade is a strong predictor of whether they ultimately enroll in college. Students with lower levels of prior achievement are, in general, less likely to matriculate than peers with higher levels of achievement. This is not the end of the story, however. Even when comparing students with similar levels of prior achievement, college-going outcomes can differ dramatically for students who attend different schools.

In light of these patterns, agencies should work to understand the sources of variation in college-going rates across high schools, especially for students with similar prior academic achievement. In particular, it may prove enlightening for agencies to compare the practices of high schools that achieve uniformly high rates of college going, even for students who enter high school with low eighth-grade achievement, with those of schools that exhibit greater variation in outcomes across levels of prior achievement. Such an examination may help to uncover college-preparation and college-going strategies that have the potential to improve outcomes for low-performing students throughout the district.

Ask Yourself, Take Action

Why do these college enrollment patterns exist? These analyses on their own are not designed to determine the causes for these findings. Rather, they aim to prompt a series of questions that will help education leaders uncover causes and be positioned to make informed changes in management and policy. Asking and answering these questions should lead to a better understanding of differences in outcomes. Our ultimate hope is that such a cycle of deep inquiry will lead to improved strategies and solutions.

Ask Yourself: How are we using data to improve college going within our agency? How can we use data more effectively? What is preventing us from doing so?

Take Action: View data as a source of insights for improvement and long-term impact.

  • Build and maintain a longitudinal student data system for tracking student achievement over time. Include data from the National Student Clearinghouse to examine the college-going outcomes of the students in your system.

  • Cultivate the analytic capacity within your agency to examine college-going outcomes for your high schools as well as for students grouped by salient characteristics, such as prior academic achievement.

  • Conduct these SPI analyses annually to take stock and to investigate whether changes in policy, strategy, management, or practice are having the desired impact.

  • Share these results broadly with teachers, counselors, and school leaders as a means of encouraging focus on college-going outcomes.

  • Use the results of these analyses as a springboard for follow-up questions and analyses to more fully illuminate key challenges or needs specific to the agency.

Ask Yourself: What are our students’ postsecondary aspirations? What are the major barriers that prevent them from fulfilling these aspirations?

Take Action: Understand students’ postsecondary intentions and tackle barriers to achieving them.

Ask Yourself: What might explain differences in college-going rates across high schools among students with similar incoming achievement? What are we doing (and what more should we be doing) to ensure that all students are academically prepared for success in high school and college?

Take Action: Develop, evaluate, and improve strategies and interventions for boosting college-going rates.

  • Comprehensively assess your agency's current strategies for promoting college enrollment. Gather evidence regarding the implementation and effectiveness of these strategies.

  • Evaluate the extent to which current strategies align with your students’ needs.

  • Investigate other agencies’ strategies for promoting students’ successful transition to postsecondary education. Identify potential strategies that might be effective given the context of your particular agency and student population.

  • Investigate outside resources and partnerships for supporting students’ progress toward, and enrollment in, college. Work with colleges to increase their outreach efforts in your high schools. Collaborate with local community-based organizations, nonprofits, and businesses on programs and practices that boost students’ college enrollment and success.

Ask Yourself: How can high schools help lower-performing students overcome the limitations of their past achievement and beat the odds by enrolling in college?

Take Action: Invest in strategies for keeping students on track for college attainment.

  • Choose an indicator of college readiness that aligns with local college admissions requirements and map backward to a set of elementary and middle school benchmarks that align with this indicator. As an example, investigate Montgomery County (MD) Public Schools' Seven Keys to College Readiness.6

  • Track students’ attainment of these benchmarks and communicate information about students’ progress toward the benchmarks to students, their families, teachers, and counselors.

  • Help students make explicit connections between their academic work, the attainment of these benchmarks, and the attainment of their long-term educational goals.

  • Study and learn from schools inside and outside the agency that have successful systems and processes for getting less prepared students ready for and into college.

Ask Yourself: What can individual schools do to foster a college-going culture?

Take Action: Influence students’ postsecondary intentions by creating a strong college-going culture.

  • Explore effective ways to foster a culture and set of expectations around college going, particularly in high schools with low postsecondary matriculation rates.

  • Study and learn from schools inside and outside the agency that have strong college-going cultures.

The SDP Human Capital Diagnostic Strategic Performance Indicators

Novice Teacher Placements: Do Low-Performing Students Get Placed with Novice Teachers?

Summary of Findings

Although novice teachers tend to be less effective than those who have been teaching for at least a few years, lower-performing students are more likely than their higher-performing peers to be assigned to the classrooms of first-year teachers. This pattern is evident in each school district we examine, and is generally found not only across schools within a district but also within individual schools. Such systematic placement patterns can exacerbate the challenges faced by students who are furthest behind academically and can perpetuate existing achievement gaps. They may also make the first years of teaching more difficult for novices, in turn impacting their retention. In this brief, we present novice-teacher placement results for the Albuquerque Public Schools (NM), Boston Public Schools (MA), Charlotte-Mecklenburg Schools (NC), Fort Worth Independent School District (TX), Fulton County Schools (GA), Gwinnett County Public Schools (GA), Los Angeles Unified School District (CA), The School District of Philadelphia (PA), and the District of Columbia Public Schools (Washington, DC).

Introduction

Research indicates that being a novice teacher is one of the few easily measurable factors that regularly relates to teachers’ impact on student achievement (Rockoff 2004; Clotfelter, Ladd, and Vigdor 2007). Consistent with existing research, SDP analyses find that first-year teachers are, on average, less effective than those with several years of classroom experience. Specifically, novice teachers tend to have lower value-added scores than their more experienced counterparts.
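SDP's value-added estimates come from its own model, which we do not reproduce here. Purely as a hedged illustration of the general idea, the sketch below uses one simple variant: predict current scores from prior scores and average the residuals by teacher. The input file and columns are hypothetical.

```python
import numpy as np
import pandas as pd

# Minimal value-added-style sketch (not SDP's actual specification): regress
# current scores on prior scores, then average residuals by teacher.
d = pd.read_csv("student_scores.csv")     # teacher_id, prior_z, current_z

slope, intercept = np.polyfit(d["prior_z"], d["current_z"], 1)   # simple OLS fit
d["residual"] = d["current_z"] - (intercept + slope * d["prior_z"])

value_added = d.groupby("teacher_id")["residual"].mean()
print(value_added.sort_values().head())   # teachers with lowest estimated growth
```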

Given the evidence that novice teachers tend to be less effective than those who have had more years in the classroom, it is important to investigate how novice teachers are being deployed. Are they more likely to be teaching students who are behind academically? Or students who are ahead? Or is there no systematic placement pattern? To answer these questions, SDP created an SPI to compare the prior-year achievement of students placed with novice teachers (those in their first year of teaching) to that of students placed with experienced teachers (those in the profession for four years or more).

Differences in the average prior achievement of students placed with novice teachers and those placed with experienced teachers could exist for at least two reasons. First, schools with relatively low average achievement may also have higher turnover than other schools. High-turnover schools must hire more teachers each year and thus may tend to be staffed disproportionately by novice teachers. This pattern would lead to between-school differences in novice teacher exposure to students with low prior achievement, but does not necessarily imply that within any individual school novice teachers receive students with lower prior achievement. Second, even within individual schools, staffing policies and practices may intentionally or unintentionally assign novice teachers to classrooms with lower-achieving students. For example, within a given middle school, higher-level math courses may typically be staffed by more experienced teachers. Such practices would lead to within-school differences in students’ exposure to novice teachers.

Differential exposure to novice teachers for a district's students may therefore be the product of between-school differences, within-school differences, or a combination of the two. Because appropriate policy actions will vary depending on the levels at which these differences exist, for each district we examine differences in exposure to novice teachers overall and differences that exist among students who attend the same schools.

Strategic Performance Indicator: Which Students Are Placed with First-Year Teachers?

For this SPI, we compare prior mathematics achievement, as measured by performance on the prior year's state standardized mathematics assessment, of students placed with novice teachers to that of their peers placed with experienced teachers. Figures 4 and 5 present results for nine SDP partner school districts for elementary and middle schools, respectively.7 The black bars represent the differences in prior student achievement overall in each district—differences that may exist because lower-performing schools have more novice teachers or because novice teachers tend to be assigned lower-performing students within schools, or both. The gray bars, on the other hand, capture only those differences attributable to placement patterns within individual schools.

Figure 4.

Differences in Average Prior Math Achievement of Students Assigned to Novice Teachers and Those Assigned to Experienced Teachers among Elementary School Teachers. Note: *p < 0.05.

Figure 5.

Differences in Average Prior Math Achievement of Students Assigned to Novice Teachers and Those Assigned to Experienced Teachers among Middle School Teachers. Note: *p < 0.05.
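One hedged way to compute the quantities behind the black and gray bars is sketched below. The input table and its columns (school_id, prior_math_z, teacher_year) are illustrative assumptions, and demeaning prior achievement within schools stands in for a school fixed-effects comparison.

```python
import pandas as pd

# Sketch of the novice-placement indicator. One row per student-teacher link:
# school_id, prior_math_z (standardized prior math score), and teacher_year
# (1 = first year of teaching). Names are hypothetical.
df = pd.read_csv("student_teacher_links.csv")

# Compare first-year teachers with teachers in at least their fourth year.
df = df[(df["teacher_year"] == 1) | (df["teacher_year"] >= 4)]
df["novice"] = (df["teacher_year"] == 1).astype(int)

# Overall district-wide gap (black bars): reflects both between- and
# within-school placement patterns.
overall_gap = (df.loc[df["novice"] == 1, "prior_math_z"].mean()
               - df.loc[df["novice"] == 0, "prior_math_z"].mean())

# Within-school gap (gray bars): demean prior achievement within each school
# so that between-school staffing differences drop out.
df["prior_within"] = df["prior_math_z"] - df.groupby("school_id")["prior_math_z"].transform("mean")
within_gap = (df.loc[df["novice"] == 1, "prior_within"].mean()
              - df.loc[df["novice"] == 0, "prior_within"].mean())

print(f"Overall gap: {overall_gap:.2f} SD; within-school gap: {within_gap:.2f} SD")
```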

As figure 4 reveals, in six of the nine districts, the prior math scores of elementary school students assigned to the classrooms of novice teachers are significantly lower than those of peers assigned to more experienced teachers. Specifically, in these six districts, the students placed with first-year teachers are, on average, 0.09 to 0.31 standard deviations behind students placed with more experienced teachers on measures of prior achievement in mathematics. This is roughly equivalent to a gap of three to nine months of learning.8
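The months-of-learning interpretation rests on the benchmarks in Hill et al. (2008) (see note 8). A back-of-envelope version of the conversion is below; the assumed annual growth rate is illustrative, whereas Hill et al. report grade- and subject-specific values.

```python
# Convert standard-deviation gaps to approximate "months of learning."
# The assumed annual growth (~0.30 SD over a 9-month school year) is an
# illustrative stand-in for the Hill et al. (2008) benchmarks.
ANNUAL_GROWTH_SD = 0.30
MONTHS_PER_YEAR = 9.0

for gap_sd in (0.09, 0.31):
    months = gap_sd / ANNUAL_GROWTH_SD * MONTHS_PER_YEAR
    print(f"{gap_sd:.2f} SD is roughly {months:.0f} months of learning")
```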

Across the school districts, the gap in prior achievement within elementary schools tends to be smaller than the gap across schools. Nevertheless, in Charlotte, Fulton, Gwinnett, Los Angeles, and Philadelphia, significant differences in prior math achievement exist between the students of novice and experienced teachers working in the same elementary school. In other words, within specific elementary schools in these districts, students with lower levels of prior achievement are more likely to be assigned a novice teacher than are their peers with higher prior achievement.

Figure 5 reveals that gaps in prior math achievement between the students of novice teachers and the students of experienced teachers tend to be larger in middle schools than in elementary schools. In seven of the eight SDP partner districts examined here, middle school students assigned to first-year teachers were as much as 0.37 standard deviations (or roughly 10 months) behind peers assigned to experienced teachers, on average. Although the gaps again tend to be smaller within schools than they are across schools, the within-school differences in prior achievement between the two groups are statistically significant in six districts, and are almost as large as (or larger than) the overall gap in elementary schools in some districts. These larger within-school disparities in figure 5 may be the result of the greater prevalence of tracking in middle schools. For example, experienced teachers may be more likely to teach advanced or honors math sections, whereas novice teachers may be disproportionately assigned to teach basic or remedial math classes.

In additional analyses not reported here, SDP examined whether second- and third-year teachers are also more likely to be placed with students with lower prior achievement, either within schools or agency-wide. Although the gaps between elementary teachers in their second and third year and those in at least their fourth year were consistently smaller, they remained statistically significant and substantial in several districts. In other words, although the patterns documented by this SPI are most pronounced for first-year teachers, they are also evident for other teachers early in their careers.

Implications from Findings

The results of the SDP analyses show that, across all of the agencies examined, students with lower levels of prior achievement in at least one grade level (elementary or middle school) are more likely to be placed with novice teachers than are their higher-achieving peers. These patterns are consistent with findings from other school districts and states in which researchers have examined this relationship (Clotfelter, Ladd, and Vigdor 2005; Feng 2010; Kalogrides, Loeb, and Béteille 2013). They deserve attention, because both previous research and SDP analyses of administrative data from these same agencies (not reported here) show that students placed with novice teachers tend to achieve less academic growth (as measured by value-added scores) than do peers assigned to more experienced teachers. The systematic placement of novice teachers with lower-achieving students may therefore compound the academic difficulties these students face and exacerbate existing achievement gaps.

This SPI confirms that differential exposure to novice teachers can and often does occur as a function of both cross-school differences in teacher experience as well as classroom assignment practices within individual schools. In considering strategies to address students’ differential exposure to novice teachers, it is critical for agency leaders to understand the levels at which these differences are occurring. Without such information, potential policy responses, such as those that aim to distribute novice teachers more equitably, might not target the source of the differences appropriately.

In considering possible responses, it is also important not to diminish the efforts or potential of early-career teachers. Many first-year teachers in all SDP partner districts perform at levels comparable or even superior to those of their veteran colleagues, and all teachers were first-year teachers at some point. Instead, the aim is to highlight systemic differences in students’ access to experienced teachers and to raise questions regarding the impact of these differences on student achievement. Agency leaders should also consider the potential impact of these placement patterns on novice teachers. Given how difficult it is for school districts to find and retain effective educators, it seems unlikely that placing first-year teachers in highly challenging teaching situations at the very beginning of their careers—often without sufficient support and guidance—is the best way to encourage and retain them. Indeed, existing research suggests that new teachers assigned lower-achieving students with more discipline problems are more likely to leave their school, while the same is not true for more experienced teachers (Donaldson and Johnson 2010; Feng 2010).

A teacher's number of years of experience is one of the few easily measurable characteristics that consistently predicts classroom effectiveness. We therefore urge agencies to carefully examine how they assign their teachers at present. Changing practices concerning the placement of students with novice teachers may be an important mechanism to boost the achievement of students who are furthest behind.

Ask Yourself, Take Action

Why do these teacher placement patterns exist? The SPI analyses on their own are not designed to determine the causes for these findings. Rather, they aim to prompt a series of questions that will help education leaders uncover causes and be positioned to make informed changes in management and policy. Asking and answering these questions should lead to a better understanding of differences in outcomes. Our ultimate hope is that such a cycle of deep inquiry will lead to improved strategies and solutions.

Ask Yourself: How are we using data to improve human capital decisions within our agency? How can we use data more effectively? What is preventing us from doing so?

Take Action: View data as a source of insights for improvement.

  • Cultivate the analytical capacity to examine student and teacher data, both agency-wide and for individual schools. Explore how teacher placement patterns vary for students grouped by salient characteristics, such as prior academic achievement.

  • Conduct these SPI analyses annually to take stock and to investigate whether changes in policy, strategy, management, or practice are having the desired impact.

  • Share these results broadly with teachers and school leaders to build understanding of and support for data-driven placement decisions.

  • Use the results of these analyses as a springboard for follow-up questions and analyses to illuminate key challenges or needs specific to the agency.

Ask Yourself: What policies and politics influence how teachers are assigned to schools? How much authority do principals have over hiring, and to what extent do hiring practices vary across schools? To what extent are teacher placement patterns driven by the concentration of novice teachers and lower-performing students in certain schools?

Take Action: Take measures to level the playing field so that schools serving the most challenged students can attract and retain experienced and effective teachers.

  • Identify the drivers of high teacher turnover at schools serving more disadvantaged and lower-achieving students (Boyd et al. 2009, 2011), and develop policies and programs in response.

  • Address policy and other barriers that impede high-needs schools from competing successfully for experienced and effective teachers. For example, seniority clauses in union contracts give veteran teachers preference in transfers across schools; in the absence of incentives for choosing a high-needs school, teachers with seniority are more likely to choose “easier” (low-poverty) schools (Levin and Quinn 2003; Levin, Mulhern, and Schunck 2005).

  • To encourage more experienced and effective teachers to teach in hard-to-staff schools, consider providing monetary incentives as well as taking steps to improve the working conditions for teachers in those schools (Berry 2008).

Ask Yourself: Are within-school teacher placement gaps concentrated in certain schools? What internal policies or politics influence within-school teacher placement patterns?

Take Action: Examine within-school assignment mechanisms to ensure that students with lower levels of prior achievement have equal access to more effective teachers.

  • Provide principals with data on teacher assignment patterns within their schools and consider holding them accountable for assigning effective teachers to low-performing students.

  • Renegotiate seniority rules that allow more senior teachers to choose their classroom assignments.

  • Identify and revise “default” practices that place lower-performing students at a disadvantage. For example, ensure that the process of preparing classroom rosters gives students equal access to effective teachers, and encourage principals to reserve a few seats in each classroom for late-enrolling students to keep them from being placed disproportionately with novice teachers.

Ask Yourself: What are we doing to improve retention of effective teachers? How effective are we in attracting new teachers with the greatest potential and in filling existing vacancies with effective, experienced teachers?

Take Action: Recognize teacher retention and hiring as strategically related to student achievement outcomes. Champion efforts to reduce turnover and retain effective teachers. Strive to attract high-quality candidates in order to fill vacancies with effective teachers.

  • Conduct exit surveys to learn more about which teachers leave the agency and why they leave. Use survey results and other data sources to understand the agency's success in retaining effective teachers.

  • Examine the potential for incentives and other policy steps to encourage effective, experienced teachers to remain in the agency.

  • Use data to understand the sources of the agency's most effective teachers.

  • Use this information to inform efforts to recruit teacher candidates with strong potential for success. Implement or revise policies and practices to maximize the agency's ability to attract high-quality teachers.

Ask Yourself: What are the experiences of new teachers within your agency? What steps are you taking to proactively support novice teachers in having a successful first year in the classroom?

Take Action: Recognize and work to address the challenges typically experienced by first-year teachers.

  • Gather information from your first-year teachers on their experiences and challenges in the classroom.

  • Catalogue your agency's efforts to support beginning teachers, in particular, and take steps to investigate the effectiveness of these efforts.

  • Investigate what additional steps your agency can take to ensure a successful first year for novice teachers. For example, consider how careful student placement may contribute to beginning teachers’ success.

Conclusion: Spotlight on Taking Action

The SDP diagnostics and associated SPIs have been conducted with several partner agencies and are currently ongoing in others. In closing, we provide examples of concrete actions that partner agencies have taken in response to these analytical findings, thus illustrating the strategic use of data for improving educational outcomes.

Summer Counseling to Improve College Going in Fulton County Schools

In reviewing SDP analyses such as those presented here, counseling staff in the Fulton County Schools (FCS) found that postsecondary outcomes for their graduates were not as strong as had been previously assumed. Specifically, many high school seniors who said that they intended to go to college directly after high school did not actually enroll in the fall. In response, the Fulton team collaborated with researchers at the Center for Education Policy Research at Harvard University to learn more about this “summer melt” phenomenon and to develop an intervention.

Out of these conversations grew Summer PACE (Personalized Assistance for College Enrollment), a summer college counseling program launched in the summer of 2011. The district first utilized data from high school exit surveys to identify college-intending graduating seniors—students who had applied and been accepted to college and who reported intentions to enroll the following fall. Over the summer, high school counselors proactively reached out to the targeted students to provide guidance and assistance related to summer college-going tasks, such as securing additional financial aid, finding housing, and deciphering and completing college-related paperwork, in addition to providing support and encouragement in general.

To understand the impact of this intervention on college-going outcomes, the district implemented the program with a randomized controlled trial evaluation design. Results revealed Summer PACE to be highly effective. At an average cost of just over $100 per targeted student, on-time college enrollment for economically disadvantaged students increased by nearly 9 percentage points, reducing summer melt among these students by 26 percent. Based on these positive results, FCS has expanded the program district-wide, targeting college-intending, low-income high school graduates whom the data show to be at greater risk of faltering in the transition to college (see Jenkins, Wisdom, and Glover [2012] and Castleman, Page, and Schooley [2012] for more information).
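As a hedged illustration of the arithmetic linking these two figures, the sketch below uses hypothetical enrollment rates chosen only to be consistent with the reported effect; the actual control-group rate is not given in this brief.

```python
# Hypothetical rates consistent with a ~9-percentage-point enrollment effect
# that amounts to a ~26 percent reduction in summer melt.
control_enroll = 0.66      # assumed on-time enrollment without summer outreach
treated_enroll = 0.75      # assumed enrollment with Summer PACE counseling

effect_pp = (treated_enroll - control_enroll) * 100
control_melt = 1 - control_enroll          # college-intending students who never enrolled
melt_reduction = (effect_pp / 100) / control_melt
print(f"Effect: {effect_pp:.0f} pp; summer melt reduced by {melt_reduction:.0%}")
```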

A Framework and Information Campaign to Encourage More Strategic Classroom Staffing

In several partner agencies, the results of the SDP human capital diagnostics motivated district officials to investigate what steps they could take to influence classroom staffing patterns throughout the district, including students' equitable exposure to high-quality teachers. This involved practices such as providing school leaders with guidance and support related to hiring, placing, and retaining effective teachers. District leaders took steps to help principals understand district policies and the flexibility those policies afford regarding teacher hiring and placement. In some instances, district officials identified disparities between the actual and perceived latitude that principals had in staffing decisions, as well as opportunities for positive changes in standard district practice.

In Charlotte-Mecklenburg, for example, the district founded the Center for Human Capital Strategies to focus explicitly on students' exposure to highly effective teachers. The district devised a system to analyze the distribution of teaching talent both across and within schools. Principals were given access to those data and were held accountable for their choices related to student assignment. This provided greater motivation for principals to assign high-need students to the teachers most adept at providing the academic support they needed.

Collectively, the goal of these efforts is to approach teacher recruitment and staffing with a strategic focus. Because these efforts are still underway, it is too early to assess their impact on staffing patterns and student achievement outcomes.

Notes

1. These analyses draw on data from the 1998–99 school year through the 2009–10 school year. The specific dates vary by district based on data quality and availability. Students who attended more than one high school are classified according to the last high school they attended.

2. The National Student Clearinghouse is a national nonprofit that provides enrollment and degree verification to more than 3,400 colleges and universities (representing more than 96 percent of students enrolled in college nationwide).

3. For each district, eighth-grade achievement reflects performance on the state's standardized math assessment. Performance is represented in standard deviation units, with average eighth-grade achievement centered at 0.

4. To make these comparisons, we first sort all students with test score information into district-wide quartiles (i.e., four equal-sized groups) of prior achievement, based on their performance on the state's eighth-grade standardized math assessment. Quartile 1 (Q1) students are the lowest performing, and Quartile 4 (Q4) students are the highest. We then examine college-going rates by high school among the graduates within each quartile of prior achievement.

5. Readers should place less weight on observations that represent very small numbers of students in certain performance quartiles in certain schools (i.e., the very smallest circles).

6. For more information, see www.montgomeryschoolsmd.org/info/keys/.

7. The results presented in this brief include data from the following school years, by site: Albuquerque (2006–07 through 2010–11); Boston (2006–07 through 2009–10); Charlotte (2004–05 through 2008–09); Fulton (2007–08 through 2009–10); Fort Worth (2005–06 through 2010–11); Gwinnett (2005–06 through 2009–10); Los Angeles (2004–05 through 2010–11); Philadelphia (2006–07 through 2009–10); and Washington DC (2008–09 through 2009–10). We examine placement patterns for elementary schools and middle schools separately because placement patterns may differ for middle schools due to their greater reliance on formal student tracking systems. Because the analysis requires a measure of prior student achievement, the indicator is limited to math teachers in grades 4 through 8. (Most state assessment systems begin at grade 3 and provide complete coverage only for math and English language arts.)

8. The conversions of standard deviations of student achievement to months of learning are based on Hill et al. (2008).

Acknowledgments

This work benefited from the guidance and feedback of Chris Avery, Patty Diaz-Andrade, Rachel Hitch, and several other members of the Strategic Data Project team. We are particularly grateful for the contributions of members of our research team, including Ilya Faibushevich, Meg Nipson, Aaron Dow, Olivia Chi, and Amal Kumar. We thank Lynn Jenkins for excellent editorial contributions. We gratefully acknowledge the Bill & Melinda Gates Foundation for generous financial support. All errors and omissions are our own.

REFERENCES

Arnold, Karen, Shezwae Fleming, Mario DeAnda, Benjamin Castleman, and Katherine Lynk Wartman. 2009. The summer flood: The invisible gap among low-income students. Thought and Action 25: 23–34.

Avery, Christopher, and Thomas J. Kane. 2004. Student perceptions of college opportunities: The Boston COACH Program. In College choices: The economics of where to go, when to go, and how to pay for it, edited by Caroline M. Hoxby, pp. 355–94. Chicago: University of Chicago Press. doi:10.7208/chicago/9780226355375.003.0009

Berry, Barnett. 2008. Staffing high-needs schools: Insights from the nation's best teachers. Phi Delta Kappan 89(10): 766–71.

Bettinger, Eric, Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu. 2012. The role of application assistance and information in college decisions: Results from the H&R Block FAFSA experiment. Quarterly Journal of Economics 127(3): 1205–42. doi:10.1093/qje/qjs017

Boyd, Donald, Pam Grossman, Marsha Ing, Hamilton Lankford, Susanna Loeb, and James Wyckoff. 2011. The influence of school administrators on teacher retention decisions. American Educational Research Journal 48(2): 303–33. doi:10.3102/0002831210380788

Boyd, Donald, Pam Grossman, Hamilton Lankford, Susanna Loeb, and James Wyckoff. 2009. Who leaves? Teacher attrition and student achievement. CALDER Working Paper No. 23, Urban Institute.

Castleman, Benjamin L., Lindsay C. Page, and Korynn Schooley. 2012. The forgotten summer: The impact of college counseling the summer after high school on whether students enroll in college. Paper presented at the Annual Meeting of the Association for Public Policy Analysis and Management, Sheraton Baltimore City Center, November.

Clotfelter, Charles T., Helen F. Ladd, and Jacob Vigdor. 2005. Who teaches whom? Race and the distribution of novice teachers. Economics of Education Review 24(4): 377–92. doi:10.1016/j.econedurev.2004.06.008

Clotfelter, Charles T., Helen F. Ladd, and Jacob Vigdor. 2007. Teacher credentials and student achievement in high school: A cross-subject analysis with fixed effects. CALDER Working Paper No. 11, Urban Institute.

College Board. 2011. Complexity in college admission: The barriers between aspiration and enrollment for lower-income students. College Board Advocacy & Policy Center Report No. 11b-4062.

Donaldson, Morgaen L., and Susan Moore Johnson. 2010. The price of misassignment: The role of teaching assignments in Teach for America teachers’ exit from low-income schools and the teaching profession. Educational Evaluation and Policy Analysis 32(2): 299–323. doi:10.3102/0162373710367680

Dynarski, Susan M., and Judith Scott-Clayton. 2006. The cost of complexity in federal student aid: Lessons from optimal tax theory and behavioral economics. National Tax Journal 59(2): 319–56.

Feng, Li. 2010. Hire today, gone tomorrow: New teacher classroom assignments and teacher mobility. Education Finance and Policy 5(3): 278–316. doi:10.1162/EDFP_a_00002

Hill, Carolyn J., Howard Bloom, Alison R. Black, and Mark W. Lipsey. 2008. Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives 2(3): 172–77. doi:10.1111/j.1750-8606.2008.00061.x

Jenkins, Lynn, Michelle Wisdom, and Sarah Glover. 2012. Increasing college-going rates in Fulton County Schools: A summer intervention based on the strategic use of data. Cambridge, MA: Harvard Education Press.

Kalogrides, Demetra, Susanna Loeb, and Tara Béteille. 2013. Systematic sorting: Teacher characteristics and class assignments. Sociology of Education 86(2): 103–23. doi:10.1177/0038040712456555

Levin, Jessica, Jennifer Mulhern, and Joan Schunck. 2005. Unintended consequences: The case for reforming the staffing rules in urban teachers union contracts. New York: The New Teacher Project.

Levin, Jessica, and Meredith Quinn. 2003. Missed opportunities: How we keep high-quality teachers out of urban classrooms. New York: The New Teacher Project.

Rockoff, Jonah. 2004. The impact of individual teachers on student achievement: Evidence from panel data. American Economic Review 94(2): 247–52. doi:10.1257/0002828041302244

Roderick, Melissa, Jenny Nagaoka, Vanessa Coca, Eliza Moeller, Karen Roddie, Jamiliyah Gilliam, and Desmond Patton. 2008. From high school to the future: Potholes on the road to college. Consortium on Chicago School Research Report.

Rouse, Cecilia E. 2004. Low-income students and college attendance: An exploration of income expectations. Social Science Quarterly 85(5): 1299–1317. doi:10.1111/j.0038-4941.2004.00277.x