Julie A. Marsh
Journal Articles
Publisher: Journals Gateway
Education Finance and Policy (2020) 15 (4): 761–774.
Published: 01 October 2020
Abstract
In this policy brief, we use the case of California's Local Control Funding Formula (LCFF) to provide policy makers and educators guidance on how to involve the public in goal setting and resource distribution decisions. We provide clarity around who is and is not participating, why, and what broader lessons we can draw for implementing federal and state education policies mandating public engagement. Our findings indicate tremendous room for improvement. LCFF's target populations (e.g., low-income students, English learners) are no more likely to be aware of or participate in decisions than nontargeted groups, which suggests weak accountability to the policy's target populations for the use of public funds. Although LCFF has defined a broad set of stakeholders, only a narrow segment of the public (i.e., individuals with stronger ties to and positive views of schools) is aware of and engaging with the policy. Finally, we find a substantial gap between actual participation in LCFF and interest in participation, which may relate to a lack of self-efficacy, time, trust, perceived appropriateness, and information. As states and districts respond to mandates for engagement, these results suggest the need for greater investments in (1) communication, (2) targeting a range of stakeholders, and (3) capacity building.
Includes: Supplementary data
Education Finance and Policy (2016) 11 (3): 251–282.
Published: 01 July 2016
Abstract
We examine the Los Angeles Unified School District's Public School Choice Initiative (PSCI), which sought to turn around the district's lowest-performing schools. We ask whether school turnaround affected student outcomes, and what explains variations in outcomes across reform cohorts. We use a Comparative Interrupted Time Series approach with administrative student-level data, following students in the first (1.0), second (2.0), and third (3.0) cohorts of PSCI schools. We find that students in 1.0 turnaround schools saw no significant improvements in outcomes, whereas students enrolled in 2.0 schools saw significant gains in English Language Arts in both years of the reform. Students in 3.0 schools experienced significant decreases in achievement. Qualitative and survey data suggest that increased support and assistance, along with the use of reconstitution and restart as the sole turnaround methods, contributed to gains in 2.0, whereas policy changes in 3.0 caused difficulties and confusion in implementation, leading to poor student performance.