Abstract
School attendance zone boundary (AZB) data remain relatively underdocumented and understudied within the field of education, despite their critical implications for educational (in)equity. AZBs shape student outcomes and residential sorting patterns both by determining the public schools to which students are assigned and by signaling neighborhood characteristics to prospective homebuyers. To date, the limited access to, regulation of, and review of AZB data have left a gap in the knowledge base, one with the potential to leave intact (and even exacerbate) patterns of segregation that maintain inequities in educational opportunity. The lack of data also limits our ability to know whether and when AZBs may mitigate segregation. In this brief, we examine a novel effort to collect current and historical AZB data—the Longitudinal School Attendance Boundary System—to explore the contextual and political factors associated with data access and data quality. We aim to show how factors that hinder access to quality AZB data affect the study of educational equity, and we advocate for more comprehensive, top–down governmental efforts to create, maintain, and collect these data.
Introduction
Data increasingly measure, govern, and reform education systems (Mehta 2013; Schafft 2016). In recent years, calls for increased data and data transparency have been especially championed by civil rights advocates in an attempt to hold schools accountable to serve all children equitably, especially students of color, students from low-income households, and other historically marginalized student subgroups (e.g., Amerikaner 2020). Though many types of educational data are ubiquitous today, school attendance zone boundary (AZB) data remain relatively underdocumented and understudied (Saporito 2017; Geverdt 2018), which ultimately limits knowledge of whether and how students’ access to educational opportunity is shaped by where they live.
AZBs determine public school assignment for most students based on residence. As such, AZBs might also influence prospective homebuyers’ decisions, particularly if they have children. Longitudinal AZB data could help answer questions about how boundaries—and changes to boundaries—shape patterns of residential and school segregation over time. Despite the centrality of these questions to many contemporary educational and societal inequities, there are limited sources of comprehensive, longitudinal AZB data from school districts across the United States.
We draw from our experience conducting a novel collection of longitudinal school AZB data to examine factors limiting data access and quality. Our efforts thus far have been affected by a lack of federal- and state-level structures for AZB data and by a myriad of local political factors, leading to an unrepresentative sample to date that constrains understanding of how AZBs shape access to educational opportunities. In particular, we tend to lack data—especially historical data—in districts with small enrollments, districts in rural areas, and districts serving predominantly black, Hispanic, and/or economically disadvantaged students. In this brief, we critique the politics and infrastructure that govern AZB data. We suggest more comprehensive governmental efforts to create, maintain, and collect these data.
Limited AZB Data Hinder Systematic Knowledge
Public schools vary considerably in resources, quality, and educational outcomes, despite politicians’ frequent statements that children's zip codes should not affect their access to opportunity (e.g., DeVos 2017; Biden 2020). In the United States, 71 percent of students attend their zoned public school (National Center for Education Statistics 2019). Thus, AZBs play a major role in students’ access to educational quality (Wells and Crain 1994; Mickelson and Nkomo 2012). Furthermore, because homeseekers with and without children consider perceptions of local schools when choosing a home, AZBs also influence home values and shape residential sorting processes (Siegel-Hawley 2013; Lareau and Goyette 2014; Liebowitz and Page 2014). Individual decisions about where to live are in turn shaped by policy makers’ decisions about where to draw AZBs. Together, these decisions affect access to, and support for, public K–12 education. We therefore believe that patterns of AZB change over time deserve more attention.
To date, centralized sources of AZB data remain very limited. The National Science Foundation funded the School Attendance Boundary Information System (SABINS) as the first national AZB data collection effort. SABINS collected and digitized AZB data for approximately six hundred school districts in the 2009–10, 2010–11, and 2011–12 school years, focusing on some of the largest districts in thirteen regionally diverse metropolitan areas (Saporito, Van Riper, and Wakchaure 2013). Next, the U.S. Department of Education (ED) initiated its School Attendance Boundary Survey (SABS), which drew from SABINS data and collected additional AZB data in 2013–14 and 2015–16 (Geverdt 2018). The SABS project did not continue after 2015–16; further collection cycles would have required more long-term infrastructure planning, which ED had to balance against other data priorities (ED staff, personal communication, 23 February 2022). Beyond these two publicly available datasets, the most consistent source of AZB data is real estate search engines (e.g., Zillow, Trulia, Redfin). Private companies collect these data to sell to real estate Web sites, but their accuracy remains unclear. Finally, the availability of AZB data on individual school district Web sites varies widely, and many Web sites that do post boundaries include disclaimers about their accuracy. Other district Web sites provide the school assignment for a specific address if entered but do not provide AZB data for the entire district.
While the lack of systematic, longitudinal AZB data ultimately limits knowledge about the relationship between AZBs and educational equity, existing research suggests the critical nature of these boundaries. Using data from SABINS and SABS, several cross-sectional studies find that, on average, most AZBs reproduce patterns of residential segregation within schools (Richards 2014; Saporito and Van Riper 2016; Saporito 2017; Monarrez 2019). However, these averages can obscure important variation in which AZBs may be particularly segregative or integrative. For example, Saporito and Van Riper (2016) and Saporito (2017) find that irregularly shaped AZBs are more likely to be racially diverse, suggesting that they are better able to draw segregated neighborhoods into the same school. Richards (2014) finds that AZBs in districts with rapidly changing racial demographics are more segregative than those in districts experiencing less demographic change, suggesting the need for careful attention to diversifying places. Furthermore, case studies of AZB changes yield mixed findings and suggest that further analysis over time is needed. Some analysts highlight school districts where recent AZB changes have increased net racial school segregation (Wiley, Shircliffe, and Morley 2012; Siegel-Hawley 2013; Siegel-Hawley, Bridges, and Shields 2017), whereas others find places where changes have intentionally enhanced net desegregation (Eaton 2012; Saporito 2017).1
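To give a concrete sense of how boundary “irregularity” can be operationalized in analyses like these, the sketch below computes the Polsby–Popper compactness ratio, 4πA/P², for each zone polygon in a shapefile. This is a standard compactness measure rather than the specific metric used in the cited studies, and the file name and school_id column are placeholders.

```python
# A minimal sketch (not the cited authors' code): score each attendance zone's
# compactness with the Polsby-Popper ratio, 4*pi*area / perimeter^2.
# Values near 1 indicate compact, circle-like zones; values near 0 indicate
# highly irregular ones. File and column names are hypothetical.
import math

import geopandas as gpd

zones = gpd.read_file("elementary_azbs_2020.shp")

# Project to an equal-area CRS so area and length are measured in meters.
zones = zones.to_crs("EPSG:5070")  # CONUS Albers; adjust for the study area

zones["compactness"] = (
    4 * math.pi * zones.geometry.area / zones.geometry.length ** 2
)

# List the most irregularly shaped zones for closer inspection.
print(zones.nsmallest(10, "compactness")[["school_id", "compactness"]])
```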
Assembling longitudinal AZB data would facilitate additional research that could identify where AZBs have maintained segregation over time and where they have been altered to reduce segregation. This could provide insight into the extent to which districts can influence segregation by modifying their AZBs. Preliminary analyses of the districts currently in our longitudinal dataset demonstrate that 73 percent have experienced at least some AZB change since 1990, and our more detailed analysis of eight districts suggests AZB changes are frequent and widespread, especially in large districts (see, e.g., Frankenberg et al. 2023).
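As an illustration of how such change detection can work with boundary files from two decades (a sketch under assumed file and column names, and assuming stable school identifiers, rather than the LSABS pipeline itself), one can overlay the two zone layers and measure the share of district area reassigned to a different school:

```python
# Illustrative sketch: measure how much of a district's area was rezoned
# between 1990 and 2020 by overlaying the two elementary zone layers.
# File names and the school_id column are assumptions for this example,
# which also assumes school identifiers are stable across decades.
import geopandas as gpd

zones_1990 = gpd.read_file("district_x_elementary_1990.shp").to_crs("EPSG:5070")
zones_2020 = gpd.read_file("district_x_elementary_2020.shp").to_crs("EPSG:5070")

# Intersect the layers so each piece carries both a 1990 and a 2020 school ID.
pieces = gpd.overlay(
    zones_1990[["school_id", "geometry"]].rename(columns={"school_id": "id_1990"}),
    zones_2020[["school_id", "geometry"]].rename(columns={"school_id": "id_2020"}),
    how="intersection",
)

# Area assigned to a different school in 2020 than in 1990.
reassigned = pieces.loc[pieces["id_1990"] != pieces["id_2020"]]
share_changed = reassigned.geometry.area.sum() / pieces.geometry.area.sum()
print(f"Share of district area rezoned, 1990-2020: {share_changed:.1%}")
```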
AZB changes in recent decades are of particular interest given waning judicial enforcement of school desegregation. In the early 1990s, the U.S. Supreme Court substantially eased what it required of school districts to remedy prior segregation. Since then, hundreds of districts have ended court oversight (Orfield and Eaton 1996; Reardon et al. 2012), chilling policies aimed at explicitly dismantling the school-residential segregation link (Reardon and Yun 2003; Reardon et al. 2012; Frankenberg 2013). Most recently, the Supreme Court permitted, but did not require, consideration of an area's racial composition when redrawing AZBs (Parents Involved in Community Schools v. Seattle School District No. 1, 551 U.S. 701 [2007]), making AZBs one of the few remaining robust integration tools. Moreover, AZBs are part of a broader set of administrative boundaries that scholars use to study inequality; expanding their availability has the potential to benefit scholarship and policy across a broad range of related disciplines.
Factors Affecting the Creation of a Robust AZB Dataset
The Longitudinal School Attendance Boundary System
To help fulfill the need for AZB data, we began to create the Longitudinal School Attendance Boundary System (LSABS) in September 2019 with support from the National Science Foundation. We aim to collect and digitize 1990, 2000, 2010, and 2020 AZB data for a sample of 2,238 districts across all fifty states and Washington, DC (see table 1 regarding data collection to date). This sample includes the largest one hundred school districts in the country (National Center for Education Statistics 2017), those identified as having once been under a court-ordered desegregation plan (Brown University 2005; Reardon et al. 2012), those identified as having a voluntary integration plan (Qiu and Hannah-Jones 2014), and those for which ED provides 2009–10 data as part of SABS (National Center for Education Statistics 2010).
Table 1. Districts with Complete AZB Data, by Year

| Year | N Districts with Complete Data^a | % of Full Sample |
|---|---|---|
| 1990 | 129 | 5.8 |
| 2000 | 389 | 17.4 |
| 2010 | 919^b | 41.1 |
| 2020 | 1,384 | 61.8 |
| All years | 173 | 7.7 |
| Full sample | 2,238 | 100.0 |
Notes: The count of districts for which we have “all years” of data exceeds the count of districts for which we have 1990 data because several districts had only one elementary, middle, and high school in 1990 and thus did not have attendance zone boundaries (AZBs) distinct from their district boundaries. Those districts are not included in our counts of 1990 complete Longitudinal School Attendance Boundary System (LSABS) data. However, if we have complete AZB data for those districts from subsequent years when there were additional schools, the districts are included in the “all years” count. Moreover, our sampling mechanisms produced an additional 1,037 districts with only one school serving each grade level across all years of study. Those districts are not included in this table, since at no point in the study period did they have AZBs distinct from their district boundaries.
^a “Complete” signifies that we have AZB information for the district at every level that it serves: elementary, middle, and/or high.
^b 2010 data for 514 districts come from SABS; data for the remaining districts come from our own collection.
Limited Statewide Infrastructure for Longitudinal AZB Data
In the absence of federal-level efforts to track AZB data, few states have attempted to collect such information on their own. The Minnesota Department of Education provides a rare example, with AZB shapefiles available for download for each school year since 2002–03 (Minnesota Geospatial Commons n.d.). Delaware's Department of Education also provides current AZB shapefiles through the state's open data portal, though it does not provide boundaries from previous years (Delaware FirstMap n.d.). Finally, the Oregon Department of Human Services & Oregon Health Authority collected school boundaries for all districts in the state in 2010 and has intermittently updated the data, most recently in 2020 with special COVID-19 funding (Oregon Spatial Data Library n.d.).
The only other state in which we have encountered any sort of top–down capability for monitoring boundary data is North Carolina, but at present those data remain inaccessible to our collection efforts. In North Carolina, the Transportation Information Management System (TIMS) provides statewide software to help districts plan busing routes, and consequently the system contains data on school AZBs. However, these data exist in a proprietary structure tied to the software's functionality, and it would take a prohibitively costly amount of staff time for TIMS to retrieve the data, extract student-level information, and convert them to a shareable format. Furthermore, TIMS has not saved data prior to 2010, though the organization has existed since 1992. Other than these four exceptions, we have not found state-level boundary data, nor do most states monitor AZBs.
Limits on Gathering Boundaries within States
In the absence of state-level data, our most effective method of collecting boundary data has been to submit public information requests to individual school districts. All states have Freedom of Information (FOI) statutes, and, as public bodies, school districts are subject to such laws. The specifics of FOI laws, including who can request records, timelines for response, and fee schedules, vary by state. For example, five states—Arkansas, Delaware, Missouri, Tennessee, and Virginia—allow only state residents to make requests, and two—Georgia and Alabama—have interpreted their respective FOI laws as covering only state residents, though this is not explicit in their written laws (see Alabama Attorney General Opinion 2018-030 and Georgia HB 397 2012).
In most states, there is also a dearth of historical data available due to records retention laws that allow for the destruction of older public records. In many of the districts we contacted, administrative staff were unable to fulfill our data requests as they had no historical AZB data to provide. For example, Arizona's records retention law dictates that school districts are only required to maintain data for two years, and it specifically encourages that older records be destroyed, noting that “[k]eeping records longer than the retention period poses financial, legal, and investigative risks” (Arizona GS-1018 Rev 3 2019). Other districts have cited events such as malware attacks or floods that have destroyed their historical records. In some places, we have found alternative means—including library and Internet archives—to access older files (see appendix table A.1, available in a separate online appendix that can be accessed on Education Finance and Policy’s Web site at https://doi.org/10.1162/edfp_a_00388), but in other districts we have been unsuccessful, leaving us with a limited historical dataset.2
Another barrier shaping our collection effort has been the inconsistency with which local districts comply with their own open government laws. Sometimes this is due to a lack of training and education about the laws, especially as many are written vaguely, but other times it appears to be intentional (Hooper and Davis 2014; Marzen 2018; Wagner 2021). Several districts in South Carolina denied our request because we were not state residents, though when we cited the state FOI law, which contains no such provision, some of those districts retracted their earlier statements and searched for responsive records. Texas's state law specifies that a public entity may waive costs incurred for a records request if it “determines that waiver or reduction of the charge is in the public interest because providing the copy of the information primarily benefits the general public” (Texas Government Code §552.267 1997). This leaves district employees to decide whether they believe AZB data “benefits the general public,” though our response rate in Texas was slightly higher than average (63 percent of contacted Texas districts provided responsive records, compared with 56 percent of contacted districts overall). Statewide response rates ranged from 91 percent in New York and 83 percent in Maryland and Illinois to 17 percent in Arizona and 9 percent in Alabama (where the state-residency requirements of the FOI law are contested). Ultimately, our experiences demonstrate how street-level bureaucrats within districts shape the information we can access through their implementation of public records laws (Lipsky 1980). Table 2 illustrates the variety of responses we have received to our data requests thus far. Rejection rates are low, because explicit rejections must be based on a legal rationale under each state's FOI law. However, nonresponse rates are relatively high and may, in some cases, be a form of rejection. While we cannot know the specific factors behind any given nonresponse (e.g., lack of district capacity), nonresponses have certainly shaped our access to information.
Table 2. Responses to AZB Data Requests

| Response Type | N Districts |
|---|---|
| No response | 178 |
| Responded, rejected the request | 12 |
| Responded, unable to provide records | 140 |
| Responded, provided records | 599 |
| Provided 1990 data | 118 |
| Provided 2000 data | 213 |
| Provided 2010 data | 216 |
| Had SABS 2010 data | 212 |
| Provided 2020 data | 326 |
| Had 2020 data online | 180 |
| Total contacted districts | 929 |
Notes: We include here any district to which we have submitted a formal Freedom of Information request or made an informal request for data via phone or e-mail. We exclude districts to which we had open requests as of February 2022. We count a district as providing “no response” if it did not respond after at least two attempts at contact. A district “rejected the request” if it provided a legal rationale for not providing records (e.g., lack of state residence). A district was “unable to provide records” if it did not locate any responsive records on file, usually due to the destruction of old records. The first four rows sum to the total; the last six rows are not mutually exclusive and double count many districts.
Further, we find that the nuances of local politics affect the context in which district officials respond to data requests.3 Given the weight that families place on where their children attend school, AZBs are often highly contentious. Recent media attention has documented divisive conflicts over AZB rezoning in places like Richmond, Virginia, and Montgomery County, Maryland (Mattingly 2019; Peetz 2020). Case study research and media accounts have also documented instances in which diversifying suburban districts try to retain white and/or more affluent families and ultimately adjust boundaries in ways that further segregation (Eaton 2012; Wiley, Shircliffe, and Morley 2012; Siegel-Hawley 2013; McDermott, Frankenberg, and Diem 2015). Given this complex landscape, data that may be perceived as problematic—for example, as a threat to an institution's reputation (Jenkins 2020) or even as a basis for litigation (Siegel-Hawley 2013)—may be closely managed by educational stakeholders. Local education officials may deem it politically necessary to stave off “bad press” and obscure AZB data if they believe the data could raise questions about the quality and equity of educational provision (Jenkins 2020).
Limits on Spatial Data Quality
Finally, AZB data are a form of spatial data, which many educational systems lack the capacity to create and maintain. Though geographic information systems (GIS) have advanced significantly in recent decades and are increasingly common in educational research (Lubienski and Lee 2017; Yoon and Lubienski 2018), many districts simply do not have the resources to build or maintain their own spatial AZB data. Because very few districts can afford in-house GIS analysts, the availability and cooperation of local GIS institutions—including county GIS offices and private demographic firms—often influences which districts are able to provide us with high-quality, digitized AZB data. Static maps such as PDFs or scanned paper maps are typically less reliable than geospatial vector data received directly. For example, when digitizing PDF or paper maps, we must make assumptions about how exactly boundaries cut around specific city blocks, on which side of a street a boundary falls, and whether boundaries follow true roads or the simplified lines depicted. As we prepare all data for inclusion in our final dataset, this disparity in map quality introduces varying levels of uncertainty into the digitization process.
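One way to make that uncertainty explicit (shown here only as a sketch under assumed file names and a hypothetical 50-meter tolerance, not as our production workflow) is to flag census blocks whose centroids lie close enough to a hand-digitized boundary that they could plausibly fall on either side of it:

```python
# Sketch: flag census blocks whose school assignment is ambiguous because
# they sit within a digitization tolerance of a hand-traced zone boundary.
# File names, the school_id column, and the 50 m tolerance are assumptions.
import geopandas as gpd

zones = gpd.read_file("zones_digitized_from_pdf.shp").to_crs("EPSG:5070")
blocks = gpd.read_file("census_blocks.shp").to_crs("EPSG:5070")

# Assign each block to the zone containing its centroid (a common simplification).
block_pts = blocks.copy()
block_pts["geometry"] = block_pts.geometry.centroid
assigned = gpd.sjoin(
    block_pts, zones[["school_id", "geometry"]], how="left", predicate="within"
)

# Blocks within 50 meters of any digitized zone edge could fall on either side.
edges = zones.boundary.unary_union
assigned["ambiguous"] = assigned.geometry.distance(edges) < 50

# Share of blocks whose assignment depends on exactly where the line was traced.
print(assigned["ambiguous"].mean())
```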
Implications of an Uneven and Incomplete Dataset
Ultimately, this patchwork of state- and local-level factors has shaped our (in)access to quality data in nonrandom ways. In tabulating the complete AZB data we have collected thus far, we find that we tend to have more data from school districts with larger enrollments than from those with smaller enrollments, and from urban and suburban districts than from rural districts (see table 3). For example, we have collected 2000 AZB data from 34 percent of the districts in our sample with more than 25,000 students but from only 16 percent of the districts in our sample with fewer than 5,000 students. This may be because large and dense districts are the most likely to have the administrative capacity to employ dedicated personnel responsible for creating and sharing such data (Miller 2010; Schafft 2016). The data collected from suburban districts are promising, as we are particularly interested in studying how AZBs have shaped student populations in racially diversifying suburbs across the country (Diamond and Posey-Maddox 2020). However, the data gap in rural areas and towns limits our ability to identify segregative boundary changes in less-dense districts, which have also seen recent demographic shifts and diversification (Lichter, Parisi, and Taquino 2018). Research shows that small towns have patterns of racial segregation similar to those of larger cities, but segregation in these areas is often understudied (Lichter et al. 2007; Kebede et al. 2021). The lack of AZB data from rural districts means that patterns of segregation there may continue to be understudied and unacknowledged, which ultimately hinders practitioners who could address such segregation.
Table 3. Percent of Districts with Complete AZB Data, by District Characteristics

| | % with Complete 1990 Data | % with Complete 2000 Data | % with Complete 2010 Data | % with Complete 2020 Data | N Districts |
|---|---|---|---|---|---|
| Enrollment | | | | | |
| Fewer than 5,000 | 8 | 16 | 22 | 39 | 1,050 |
| 5,000 to 9,999 | 8 | 20 | 39 | 79 | 454 |
| 10,000 to 24,999 | 6 | 18 | 58 | 87 | 413 |
| 25,000 or more | 11 | 34 | 91 | 82 | 321 |
| Locale | | | | | |
| City | 9 | 20 | 67 | 81 | 486 |
| Suburb | 10 | 24 | 47 | 71 | 522 |
| Town | 6 | 16 | 25 | 49 | 897 |
| Rural | 4 | 14 | 21 | 38 | 318 |
| Ethnoracial composition | | | | | |
| Diverse | 9 | 20 | 47 | 69 | 1,302 |
| Mostly black and/or Hispanic | 5 | 14 | 48 | 56 | 336 |
| Mostly white | 7 | 21 | 27 | 50 | 600 |
| Economic composition | | | | | |
| Not economically disadvantaged | 8 | 21 | 42 | 65 | 1,680 |
| Economically disadvantaged | 8 | 15 | 40 | 52 | 558 |
| Ethnoracial and economic composition | | | | | |
| Not racially and economically isolated | 8 | 21 | 42 | 63 | 1,969 |
| Racially and economically isolated | 4 | 11 | 42 | 48 | 269 |
| Total | 8 | 20 | 42 | 62 | 2,238 |
Notes: In calculating each percentage, we include in the numerator all those districts with at least two schools serving some grade levels for which the Longitudinal School Attendance Boundary System (LSABS) contains complete data. The denominators reported in the right-most column are based on the full LSABS sample, as reported in table 1. District characteristics are based on 2018–19 NCES data; fifteen districts with missing locale data are excluded from these counts. “Mostly black and/or Hispanic” districts are those with more than 75 percent of students identifying as black and/or Hispanic, while “mostly white” districts are those with more than 75 percent of students identifying as white. All other districts are classified as “diverse.” “Economically disadvantaged” districts are those in which more than 75 percent of students receive free or reduced-price lunch (FRPL). “Racially and economically isolated” districts are those in which both more than 75 percent of students identify as black and/or Hispanic and more than 75 percent of students receive FRPL.
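The 75 percent threshold rules described in these notes are straightforward to implement; the sketch below shows one way to do so on a district-level file, with placeholder column names (pct_black_hispanic, pct_white, pct_frpl, has_2000_azb) that are illustrative rather than actual NCES variable names.

```python
# Sketch of the 75 percent threshold classifications described in the table
# notes. Column names are placeholders, not actual NCES variable names.
import pandas as pd

districts = pd.read_csv("district_characteristics_2018_19.csv")

def ethnoracial_group(row):
    if row["pct_black_hispanic"] > 75:
        return "Mostly black and/or Hispanic"
    if row["pct_white"] > 75:
        return "Mostly white"
    return "Diverse"

districts["ethnoracial"] = districts.apply(ethnoracial_group, axis=1)
districts["econ_disadvantaged"] = districts["pct_frpl"] > 75
districts["isolated"] = (
    (districts["pct_black_hispanic"] > 75) & (districts["pct_frpl"] > 75)
)

# Coverage rates like those in table 3: the share of districts in each group
# with complete 2000 AZB data (has_2000_azb is a placeholder 0/1 indicator).
print(districts.groupby("ethnoracial")["has_2000_azb"].mean().round(2))
```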
We also have more data from districts that are not racially and economically isolated (i.e., districts that do not have both more than 75 percent of students identifying as black and/or Hispanic and more than 75 percent of students receiving free or reduced-price lunch) than from racially and economically isolated districts (see table 3). Given how AZBs shape access to opportunity, this finding is particularly concerning, as it limits understanding of how boundaries have operated over time in districts with higher percentages of historically marginalized students. While we cannot know the reasons that individual districts did or did not provide data, districts with predominantly white and/or affluent populations are more likely to be better resourced than districts with high rates of minoritized students (EdBuild 2019), so they likely have greater staffing capacity both to create AZB data and to respond to requests for data.
In terms of access to quality digitized AZB data, we have similarly found that we are more likely to obtain geospatial vector data for districts with larger enrollments than for those with smaller enrollments, and for urban districts than for suburban districts (see table 4). Vector data are also slightly more common among mostly black and/or Hispanic districts than among diverse districts and mostly white districts. This means we will have to digitize more maps—and therefore introduce more uncertainty—for districts in suburban areas and those serving diverse populations. The lack of geospatial vector data for suburban areas is especially concerning because research has documented how suburban districts may respond to diversifying populations with minute, segregative boundary changes that they may try to keep under the radar (Eaton 2012; Wiley, Shircliffe, and Morley 2012; Siegel-Hawley 2013; McDermott, Frankenberg, and Diem 2015). Districts may tweak boundaries to zone individual homes or apartment buildings to specific schools (Dunn 2017). Such small changes can be difficult to depict on static maps, and our digitization process may not precisely capture such gerrymandering, leaving us with inaccurate estimates of segregation.
Table 4. Format of 2020 Elementary AZB Data Received, by District Characteristics

| | % Geospatial Vector Data | % Static Maps | % Textual Data | N Districts with 2020 Elementary AZBs |
|---|---|---|---|---|
| Enrollment | | | | |
| Fewer than 5,000 | 17 | 60 | 23 | 392 |
| 5,000 to 9,999 | 19 | 66 | 15 | 344 |
| 10,000 to 24,999 | 25 | 71 | 4 | 354 |
| 25,000 or more | 41 | 58 | 2 | 273 |
| Locale | | | | |
| City | 29 | 65 | 5 | 398 |
| Suburb | 19 | 65 | 16 | 616 |
| Town | 29 | 62 | 9 | 152 |
| Rural | 25 | 60 | 15 | 197 |
| Ethnoracial composition | | | | |
| Diverse | 24 | 64 | 12 | 886 |
| Mostly black and/or Hispanic | 31 | 58 | 11 | 188 |
| Mostly white | 20 | 67 | 13 | 289 |
| Economic composition | | | | |
| Not economically disadvantaged | 24 | 65 | 11 | 1,081 |
| Economically disadvantaged | 25 | 59 | 16 | 282 |
| Ethnoracial and economic composition | | | | |
| Not racially and economically isolated | 24 | 64 | 12 | 1,236 |
| Racially and economically isolated | 31 | 61 | 8 | 127 |
| Total | 24 | 64 | 12 | 1,363 |
Notes: We display data types for elementary attendance zone boundaries (AZBs) because districts are more likely to have more than one school serving elementary grades—and therefore to have AZBs—than to have more than one school serving middle or high school grades. District characteristics are based on 2018–19 NCES data. “Mostly black and/or Hispanic” districts are those with more than 75 percent of students identifying as black and/or Hispanic, while “mostly white” districts are those with more than 75 percent of students identifying as white. All other districts are classified as “diverse.” “Economically disadvantaged” districts are those in which more than 75 percent of students receive free or reduced-price lunch (FRPL). “Racially and economically isolated” districts are those in which both more than 75 percent of students identify as black and/or Hispanic and more than 75 percent of students receive FRPL.
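Because segregation estimates in this line of work are ultimately computed from demographic counts aggregated within each zone, imprecision in digitized boundaries feeds directly into the measures themselves. As a simple illustration (a sketch using one common measure in this literature and placeholder column names, not our analysis code), the dissimilarity index across a district's zones can be computed from zone-level counts:

```python
# Sketch: the dissimilarity index, a common segregation measure in this
# literature, computed across a district's attendance zones from zone-level
# counts. Column names (n_white, n_black_hispanic) are placeholders.
import pandas as pd

zones = pd.read_csv("zone_demographics_2020.csv")

def dissimilarity(df, col_a, col_b):
    """D = 0.5 * sum_i |a_i/A - b_i/B|; 0 = even spread, 1 = complete separation."""
    share_a = df[col_a] / df[col_a].sum()
    share_b = df[col_b] / df[col_b].sum()
    return 0.5 * (share_a - share_b).abs().sum()

d = dissimilarity(zones, "n_white", "n_black_hispanic")
print(f"White vs. black/Hispanic dissimilarity across zones: {d:.2f}")

# A misdrawn boundary that shifts a block of students to a different zone
# changes these counts, and hence the index, which is why boundary precision
# matters for the segregation estimates discussed above.
```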
Policy Recommendations
The LSABS data collection process demonstrates how the lack of top–down systems of AZB data collection and the varying capacity of individual school districts to collect and share these data ultimately shape the information we have access to and the kinds of research we can conduct. To support systematic study of AZBs over time, we need policies that both support the collection of AZB data and remove barriers to data access.
At the federal level, we urge ED to resume collecting AZB data and to do so on an ongoing basis, as these data are vitally important to the children, school districts, and general public that ED is meant to serve.4 Though education is primarily a state and local responsibility in the United States, the federal government is responsible for four basic functions in public education today, one of which is to “collect data” and oversee research on “most aspects of education” (U.S. Department of Education 2010). In particular, ED's Office for Civil Rights (OCR) is a means to collect more robust data to investigate various civil rights considerations (Kim 2020). One way to implement this recommendation is to reinstate the SABS collection, but it could also be done as part of ED's regular Civil Rights Data Collection, which would further support OCR's mission. This data collection effort would benefit from newly available browser-based GIS software, advances in crowdsourcing spatial data, and an overall increase in familiarity with digital mapping products. The existence of private AZB data products covering much of the country indicates that a federal-level database of this type should not be out of reach. Nevertheless, the complexities of design identified in our own collection efforts and in the SABINS and SABS efforts that preceded ours suggest that any collection effort should incorporate guidance from experts both within and outside of government.
States should also aid in the effort for top–down AZB data collection by providing infrastructure to help school districts create and maintain AZB shapefiles. Such tasks require a higher level of capacity and infrastructure than is reasonable or efficient to expect of individual school districts, and we have seen districts turn to outside entities like county GIS offices or private demographic consultants. Instead, state departments of education should house GIS resources and personnel to assist local districts. Capacity-building in this area would facilitate any federal collection efforts, reduce the burden on individual school districts, and help systematize the data for the benefit of district leaders, families, and researchers (Kelchen, Rosinger, and Ortagus 2019). In Minnesota, for example, statewide collection of AZBs proved quite useful to local districts that did not previously have a systematic way of tracking their own school boundaries (Minnesota Department of Education staff, personal communication, 2 June 2021).
In addition to incentivizing AZB transparency by providing infrastructure, states could also require transparency by mandating that school districts publish updated AZB maps on their Web sites. Though many school districts already publish boundary maps online, many others do not. Some districts post boundaries that appear to be several years old, and it is unclear whether the boundaries have remained unchanged or the Web site has simply not been updated. In the absence of easily accessible, up-to-date AZB data, many stakeholders turn to other sources, including Web sites like Zillow and HomeTownLocator. These sites purchase boundary data from a few private companies that collect and sell them. We cannot verify these companies’ claims that their boundary data are accurately updated each year. Requiring school districts to publish updated AZB data would provide more reliable information to homebuyers and families.
Finally, state legislatures should revise data retention laws that allow, or even require, districts to destroy historical versions of boundary data. AZB data do not contain protected student information that may be risky to store. Rather, historical maps should be preserved so district leaders can better understand both how the past has shaped current patterns of educational (in)equity and how to advocate for more integrative future boundaries (Kelly 2019).
Conclusion
Data availability and quality have important implications for the kinds of analyses researchers are able to conduct, and social scientists are rightly concerned with the bias that may arise when restrictions on data collection efforts are nonrandom (Clark, Rothstein, and Schanzenbach 2009; Rothstein 2009). Politicization of data collection, self-selection bias, and inconsistently applied data access regulations can impede research efforts and limit understanding of important social, health, and educational issues (Randall, Cooper, and Hite 1999; Krumholz 2014; Goos and Salomons 2017). Having considered the ways in which our data on—and, ultimately, our knowledge of—AZBs may be limited, we propose several policy changes that would better support systematic AZB data collection going forward.
Education research and accountability efforts, although flawed, are generally intended to bring hidden data to light and inform policies that ensure every student has access to equitable educational opportunities. School AZBs, in particular, are critical pieces of the equity puzzle, as they shape students’ access to schools, opportunities, and resources, and influence patterns of school and residential segregation (Hasan and Kumar 2019; Rooks 2020). We currently lack the necessary data and oversight to ensure AZBs and changes to AZBs promote equity in each district. As Gloria Ladson-Billings (2006) frames it, our country owes an “educational debt” to students of color who have been harmed by the institutional racism perpetuated by our systems of schooling. Inequitable AZBs remain one part of those racist systems, but without access to large-scale AZB data, it is difficult to identify where AZBs serve to segregate and where they are successfully helping to integrate students. We therefore believe that creating a stronger system of AZB data development, reporting, and analysis would support greater equity in education.
Acknowledgments
This material is based on work supported by the National Science Foundation under grant no. 1918277 and by The Pennsylvania State University's Social Science Research Institute, Population Research Institute, College of Earth and Mineral Sciences, and College of Education.
Notes
1. While AZBs contribute to segregation within public school districts—segregation that is more easily addressed within the existing legal and policy context—a large portion of the overall school segregation we see today is also caused by segregation between districts. Across U.S. metro areas, 35 percent of racial segregation and 57 percent of economic segregation exist within districts, while 54 percent of racial segregation and 43 percent of economic segregation exist between districts (Potter 2022). The remaining racial segregation is due to segregation between public and private schools (6 percent) and to segregation between private schools (6 percent).
2. There are almost certainly additional, untapped historical data to be collected outside of school districts; however, to date, we have prioritized contacting districts.
3. Two examples help demonstrate this finding. In the first, a large Southern school district requested a substantial down payment before searching its archives for records responsive to our FOI request. After a few months of back and forth with our team over the cost, the district eventually rescinded its invoice in June 2020, citing the “protests and violence” over police brutality and racial justice taking place at the time (phone call, 3 June 2020). The district ultimately provided AZB maps free of charge, perhaps because it had more pressing responsibilities to tend to and/or perhaps because current events highlighted the grave need for transparency in our public institutions. The second example comes from a large district in the West where we informally emailed the GIS department to request AZB shapefiles. The district appeared willing to provide data at first and asked several clarifying questions about our project. In particular, one employee wrote that they “noticed on [a blog post about our project] that [we] are looking at residential segregation based on boundaries and boundary changes” and asked us to “expand on that” (e-mail communication, 29 April 2020). Though we addressed the questions, the district stopped responding to our attempts at contact and never provided AZB data. Notably, the district was under investigation by the U.S. Department of Justice at the time for racial harassment, which may help explain its reluctance to assist a research project studying segregation.
4. The National Coalition on School Diversity has made similar policy recommendations in recent advocacy letters regarding ED's 2023 proposed budget (https://www.school-diversity.org/wp-content/uploads/supporting-school-integration-in-fy-23.pdf) and the Civil Rights Data Collection survey (https://www.school-diversity.org/wp-content/uploads/NCSD-CRDC-comment-letter-2-11-22.pdf).