## Abstract

As schools make significant investments in education technologies, it is important to assess whether various products are adopted by their end users and whether they are effective as used. This paper studies the adoption of one type of technology that is increasingly ubiquitous, school-to-parent communication technology, and whether its usage can be promoted. Analyzing usage data from a Learning Management System across several hundred schools, and then conducting a two-stage experiment across fifty-nine schools to nudge families' use of this technology, I find that 57 percent of families ever use it, and that adoption correlates strongly with measures of income and student achievement. Although a simple nudge increases usage and modestly improves student achievement, without more significant intervention to encourage usage by disadvantaged families, these technologies may exacerbate gaps in information access across income and performance levels.

## 1.  Introduction

New technologies in the public sector often aim to improve the quality of government-provided services. This is true in the education sector, where the purchase of technologies may improve curriculum delivery, data management, and school-to-parent communication. A number of papers have studied the educational impacts of information technologies, such as computers (Machin, McNally, and Silva 2007; Barrera-Osorio and Linden 2009; Malamud and Pop-Eleches 2011; Fairlie and Robinson 2013; Vigdor, Ladd, and Martinez 2014; Beuermann et al. 2015), access to the Internet (Goolsbee and Guryan 2006; Belo, Ferreira, and Telang 2013; Bulman and Fairlie 2016; Dettling, Goodman, and Smith 2018), computer-aided instruction (Angrist and Lavy 2002; Rouse and Krueger 2004; Barrow, Markman, and Rouse 2009; Banerjee et al. 2007; Linden 2008; Taylor 2015), teacher dashboards (Tyler 2013), and mobile devices (Fryer 2013; Bergman, forthcoming; Castleman and Page 2014; York, Loeb, and Doss 2019; Beland and Murphy 2016; Castleman and Page 2017; Bergman and Chan 2017; Page and Gehlbach 2017).

Similar to many other contexts, however, the end users of education technologies may be distinct from the administrators in control of procurement. Whereas the end users for local education agencies are often teachers, parents, and students, many purchasing decisions are made at the district or school level. For instance, New York City spent $95 million on its “Achievement Reporting and Innovation System,” which was subsequently ended as a result of high costs and low usage by teachers and parents, according to an agency spokesperson (Chapman 2014). Given the growing private-sector investments in new education technologies, from $600 million in 2009 to $2.5 billion in the first half of 2015 alone, plus an additional $11 billion spent by K–12 and higher-education institutions (Adkins 2016; McCarthy 2017), an important question is how the products purchased by local education agencies are adopted by their end users and whether promoting usage impacts outcomes.

This paper studies a technology that is increasingly ubiquitous in schools: school-to-family communication technologies. Unlike computer-aided instructional technologies, which can substitute for teacher instruction (Taylor 2015), communication technologies can complement instruction in the classroom by informing families about students’ academic progress. These technologies also have the potential to remedy the gap in communication quality that exists between low- and high-achieving schools and improve student outcomes (Bridgeland et al. 2008).

Many school districts leverage Learning Management Systems (LMS) to improve family access to student information at scale by placing students' academic data onto an online “portal.” This technology allows families to view performance indicators, such as their child's grades, attendance, and missing assignments, in real time as teachers update them. Figure A.1 shows an example of the portal studied in this paper. Families are provided a Web site address, a user name, and a password by teachers or other school employees. Once a family logs in, they see the student's classes, teachers, and the associated grades. Figure A.2 displays the screen seen once they click on a specific class their child is taking. Families can then view their child's assignments, assignment scores, the grading scale, and scoring codes. The same information can be viewed through a student account with a separate user name and password, also for the purpose of tracking assignments and grades. These systems are typically purchased at the school or district level. Despite their potential to improve outcomes, the adoption, usage, and effects of this technology are unknown.

This paper studies the adoption of LMS technology by families and whether a simple nudge can promote usage and improve student outcomes. To examine usage, I analyze data from a learning management company operating in fifteen school districts that tracks parent and student logins as well as student grades. To nudge additional usage, I selected a sample of low-intensity users across three school districts and conducted a two-stage experiment informing families about the portal and providing families their account information. The experimental design is similar to that used by Duflo and Saez (2003) to study the role of social interactions in retirement plan decisions. First, schools are randomized to either have a sample of families treated or to have no families treated. Second, families within treated schools are randomly selected to actually receive the intervention, which provides sign-up information to parents via phone calls and letters. This design permits analysis of the direct effects of the intervention on usage as well as potential spillover effects.

I find that families' adoption of LMS technology follows an S-shaped curve over the course of the school year, rising quickly and then leveling off. Usage is far from complete. Across several hundred schools, 25 percent of families have ever logged into their parent accounts by the end of the year, and roughly 4 percent of families log in to those accounts at least once per week. More families use the student accounts: 49 percent of student accounts have ever been used, and 57 percent of families have used either a parent or a student account. School-level adoption rates positively correlate with measures of family income, school-level test scores, and teacher usage. Families with higher-achieving students are also more likely to adopt the technology. Importantly, these patterns suggest that this technology, without intervention, may not address the disparities in student achievement or school-to-family communication that exist across income and performance groups.

The descriptive results highlight how the demand and supply of information are determined simultaneously. Teachers may not update information frequently if they perceive that few parents log in to view it. Alternatively, few parents may log in if teachers rarely update the information on the portal. This simultaneity makes it difficult to estimate the causal effect of usage on student performance. The randomized intervention, which aims to increase family usage of the portal, helps recover this causal effect.

The experimental intervention increased total family usage (of either student or parent accounts) by nearly two logins per month compared with families in schools where no one received the intervention. There are significant spillovers, similar to the findings in Duflo and Saez (2003). However, the study is underpowered to detect whether these spillovers differ significantly in magnitude from the direct effect of the intervention. Increasing usage modestly improved student grades: for both the treatment and spillover groups, grade point average (GPA) improved by 0.10 points. Though it is difficult to compute a treatment-on-the-treated effect given the effects on both adoption and usage through multiple channels, the results suggest that LMS technology is capable of inducing a modest improvement in student outcomes, but that usage is not widespread without significant intervention, especially among schools serving lower-income and lower-performing students.

The rest of this paper is organized as follows. Section 2 reviews the literature on school-to-parent communication interventions. Section 3 describes the data and patterns of usage. Section 4 details the experimental design to nudge additional usage. Section 5 presents the results of the experiment. Section 6 concludes and provides a basic cost analysis.

## 2.  Literature Review

Previous research suggests school-to-parent communication can address information asymmetries that exist between parents, their children, and schools (Bergman, forthcoming; Kraft and Rogers 2015; Bergman and Chan 2017; Bergman and Rogers 2017; Bergman, Edmond-Verley, and Notario-Risk 2018; York, Loeb, and Doss 2019). These asymmetries can impede human capital investments (Weinberg 2001; Akabayashi 2006; Hao, Hotz, and Jin 2008; Cosconati 2009; Bursztyn and Coffman 2012; Bergman, forthcoming). Recent experimental evidence shows that reducing these information problems can improve student achievement, and often at low cost. For instance, Bergman (forthcoming) and Bergman and Chan (2017) randomized the provision of text messages to parents detailing their child's missing assignments and grades and found that this communication increased student effort and achievement. Kraft and Rogers (2015) show that messages from teachers to parents significantly reduced dropout from a high school credit recovery program, and Rogers and Feller (2018) find that letters to parents about their child's absences designed using ideas from behavioral science reduced absenteeism.

This paper contributes to the literature by studying a technology that allows parents to view their child's academic progress via an online portal. This pull of information contrasts with the experimental evidence described above, which actively pushes information to families. Thus, there are several potential barriers to adoption and usage of the online portal: families must have Internet access, be aware that the system exists, keep track of their user name and password, and remember to log in. As with many school-to-parent communication systems, parent user names and passwords must be downloaded from the LMS and distributed to parents. This distribution can occur by mail, by email, or at school events. If more-advantaged families more readily overcome these barriers to usage and adoption than less-advantaged families, then LMS technology may exacerbate gaps in communication quality between low- and high-achieving schools.

Lastly, this paper studies how peers can influence the adoption and usage of this technology. In general, peer effects are difficult to estimate because of the reflection problem (Manski 1993). A number of papers overcome this problem by leveraging experimental or quasi-experimental variation to show how peer influence can either encourage or discourage the adoption of health and agricultural-related technologies, particularly in lower-income countries (Foster and Rosenzweig 1995, 2010; Kremer and Miguel 2007; Conley and Udry 2010; Duflo, Kremer, and Robinson 2011; Oster and Thornton 2012; Dupas 2014). Several other papers find that social norms can “nudge” the adoption of new behaviors in a variety of contexts (Cialdini et al. 2006; Goldstein, Griskevicius, and Cialdini 2007; Gerber and Rogers 2009; Allcott 2011; Allcott and Rogers 2014; Bhargava and Manoli 2015; Bird et al. 2017; Hallsworth et al. 2017). This paper contributes to this literature by studying whether peers influence the adoption of an education-related technology in the United States.

## 3.  Data and Descriptive Results

This study draws data from several sources. The first is deidentified data from an LMS company for the 2013–14 school year. This LMS provider hosts a parent portal, a teacher gradebook, and a student portal. The student portal shows the same academic information to students as the parent portal shows to parents, but the student user name and password are distinct from the parent user name and password.1

The LMS records logins into the parent, student, and teacher portals by date. During the 2013–14 school year, there are nearly 7,000,000 login-by-week observations across 149,107 students. The LMS also records student grades by marking period and course. Students in elementary school do not receive letter grades, so these marks are excluded from the analysis sample (9.75 percent of marks).

Although the data have the unique benefit of recording portal usage and student grades, there are several limitations as well. First, the LMS data have only a single demographic variable that is recorded across all schools: student gender. Second, grade levels for students are missing. Third, there are no standardized test scores in the data. However, GPA is a stronger predictor of college performance than SAT or ACT scores, even unadjusted for high school quality (Rothstein 2004; Bowen, Chingos, and McPherson 2009; Hiss and Franks 2014; Scott-Clayton, Crosta, and Belfield 2014).

I supplement these LMS data with information from the National Center for Education Statistics (NCES) Common Core Data, which records school-level characteristics for the universe of public schools in the United States. These data describe, at the school level, demographic shares by race, receipt of free and reduced-price lunch, as well as Title I status and location in an urban, suburban, town, or rural location.

Lastly, to obtain a unified measure of school performance across school districts, I draw on the performance ratings constructed by GreatSchools, a nonprofit organization. At the time of the data used here, GreatSchools formulated these ratings by calculating the average share of students who are proficient in subject-specific exams, and averaging these shares across the grades a school offers. GreatSchools then uses this measure to assign schools their state-wide decile on this metric. Thus, if a school receives a rating of 10, that school is in the top 10 percent of the state according to that state's proficiency standards.2
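As a sketch of this rating procedure, the decile assignment can be written as a short function. The function name, argument names, and data shapes here are illustrative assumptions, not GreatSchools' actual implementation:

```python
import numpy as np

def greatschools_rating(school_prof_by_grade, state_averages):
    """Hypothetical sketch of the rating described above: average the
    school's subject-proficiency shares across the grades it offers, then
    report the school's statewide decile of that average on a 1-10 scale."""
    avg = float(np.mean(list(school_prof_by_grade.values())))
    # Percentile: share of state schools at or below this school's average.
    pct = float(np.mean(np.asarray(state_averages) <= avg))
    return max(1, min(10, int(np.ceil(pct * 10))))
```

A school near the top of its state's distribution receives a 10; a school near the bottom receives a 1, matching the interpretation given in the text.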

Table 1 presents summary statistics of the data used to describe portal usage. There are 264 schools across fifteen school districts. These schools enroll 149,107 students. On average, schools are 78 percent white, 16 percent black, and 5 percent Hispanic. The majority of students (55 percent) receive free or reduced-price lunch. The plurality of the sample is rural (43 percent), with the remaining sample primarily urban and suburban. Although this geographic composition is not representative of the United States, it nonetheless provides enough variation to identify informative correlates of portal adoption and usage across a variety of contexts.

Table 1.
District Summary Statistics
| Variable | Mean | Observations |
| --- | --- | --- |
| Districts | N/A | 15 |
| Schools | N/A | 264 |
| Students | N/A | 149,107 |
| Female | 49% | 149,107 |
| Share Hispanic | 5.2% | 244 |
| Share black | 16.2% | 244 |
| Share white | 77.5% | 244 |
| Share free/reduced-price lunch | 54.5% | 244 |
| Urban | 21.5% | 244 |
| Suburb | 20.7% | 244 |
| Town | 15.1% | 244 |
| Rural | 42.6% | 244 |

Notes: This table describes school characteristics for the descriptive study. The upper four rows use data from the learning management system. The remaining rows use data from the National Center for Education Statistics Common Core Data. Each variable is averaged at the school level and then averaged across schools without enrollment weights.

Table 2 uses data from the LMS to describe basic usage patterns. Fifty-seven percent of families have logged into either the parent or the student portal, and 31 percent of families have logged in at least once per week. Most of this usage comes from the student portal: 49 percent of student accounts have ever been used, and 22 percent have been used at least once per week. The parent accounts are used much less frequently. During the 2013–14 school year, the share of families who had ever logged into that system was 25 percent. Overall, 8 percent of families had logged in at least once per week, and families logged in a total of thirteen times, on average, during the year. Interviews with school officials and parents found that parents use both student and parent accounts, so I focus on usage of either account from here on (figure 1).3 Lastly, figure 2 shows the distribution of teacher usage across schools. Teacher usage is much higher, with a median of 202 logins per teacher, and is more uniformly distributed than family usage. More teachers use the portal regularly; for instance, roughly 85 percent of teachers log in at least once per week during the school year.

Table 2.
Portal Usage Information: 2013–14

| Variable | Mean | Observations |
| --- | --- | --- |
| Share ever logged in to parent portal | 25% | 149,107 |
| Share ever logged in to student portal | 49% | 149,107 |
| Share ever logged in to either portal | 57% | 149,107 |
| Share of teachers ever logged in to portal | 100% | 7,376 |
| Average total teacher logins | 244 | 7,376 |

Notes: This table describes school characteristics for the descriptive study. These numbers are constructed using data from the learning management system.

Figure 1.

Total Portal Usage During the 2013–14 School Year, Conditional on Using at Least Once

Notes: The figure shows the distribution of total portal logins (either parent or student) during the 2013–14 school year conditional on logging in at least once. This figure is constructed using data from the learning management system and trims the top-most percentile from the data.


Figure 2.

Teacher Portal Usage During the 2013–14 School Year

Notes: The figure shows the distribution of portal logins during the 2013–14 school year. This figure is constructed using data from the learning management system and trims the top-most percentile from the data.


Figure 3 traces out the adoption curve for either account—the share of families using either the parent portal or the student portal by date over the course of the 2013–14 school year. Adoption takes on an “S” shape, similar to that found in the adoption of other types of products and technologies (Rogers 1995). There is a sharp rise at the start of the school year, but by late November the curve levels off. The share of families who have ever logged into the system reaches just under 60 percent by the end of the school year.

Figure 3.

Portal Adoption During the 2013–14 School Year

Notes: The figure shows the share of families who have ever logged into a portal during the 2013–14 school year (SY). This figure is constructed using data from the learning management system.


To study how usage correlates with achievement at the individual level, I estimate the following regression model:
$$\mathit{GPA}_i = \sum_{k=1}^{K} \beta_k \cdot \mathbf{1}\{\mathrm{logins}_i \in [a_k, b_k)\} + \varepsilon_i, \qquad (1)$$

where $\mathit{GPA}_i$ is the average grade of student $i$, and the $\beta_k$ are coefficients on indicator variables for whether a family has logged in to any account between $a_k$ and $b_k$ times, where the latter take on values such as twenty-five to fifty times or fifty to seventy-five times. I report the GPAs associated with each category of logins.
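Because the login bins are mutually exclusive and the regression has no intercept, each coefficient in equation 1 is simply the mean GPA of families in that bin. A minimal sketch, using simulated data rather than the paper's, illustrates this:

```python
import numpy as np

def login_bin_dummies(logins, edges):
    """One-hot indicators for which login bin [a_k, b_k) each family falls in."""
    bins = np.digitize(logins, edges)          # 0 .. len(edges)
    X = np.zeros((len(logins), len(edges) + 1))
    X[np.arange(len(logins)), bins] = 1.0
    return X

# Illustrative (simulated) data, not the paper's: GPA rises with usage.
rng = np.random.default_rng(0)
logins = np.concatenate([np.zeros(50, dtype=int), rng.integers(1, 200, 950)])
gpa = 2.0 + 0.005 * np.minimum(logins, 100) + rng.normal(0, 0.3, 1000)

edges = [1, 25, 50, 75, 100]                   # bins: 0, 1-24, 25-49, ...
X = login_bin_dummies(logins, edges)
beta, *_ = np.linalg.lstsq(X, gpa, rcond=None)
# With a full set of mutually exclusive indicators and no intercept,
# each beta_k is the mean GPA of families in the corresponding login bin.
```

Plotting the estimated `beta` against the bins reproduces the kind of profile shown in figure 4.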

Figure 4 plots these predicted GPAs based on the regression above. This graph shows the average grade of students whose family has never logged into the system, followed by those who have logged in between twenty-five and fifty times, and so on. There is a strong correlation between logins and GPA. The most substantial association, roughly half a GPA point, occurs between logging in zero times and logging in twenty-five times or more over the course of the year.

Figure 4.

Notes: This figure shows the grade-point averages associated with different levels of portal usage. This figure is constructed using data from the learning management system. Logins above 200 are grouped into a single category.


To study the correlates of adoption rates at the school level, I estimate the following:
$$\mathit{ShareAdopted}_s = \gamma + X_s'\theta + \psi_s,$$

where the dependent variable is the share of families who have ever logged into any account at school $s$. The independent variables, $X_s$, also measured at the school level, capture a variety of school characteristics. These include indicators for whether a school is a middle or high school; Title I status; urban, rural, or suburban location; as well as the shares of Hispanic students, black students, and students eligible for free or reduced-price lunch. The average student-to-teacher ratio and total teacher logins at school $s$ are included as well. $\psi_s$ is the residual term, and the regression weights each school observation by the number of students enrolled.
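Enrollment weighting can be implemented as weighted least squares. The sketch below uses simulated school-level data (not the paper's) with a single regressor, teacher logins, standing in for the full covariate vector:

```python
import numpy as np

def wls(X, y, w):
    """Weighted least squares: rescale rows by sqrt(weight), then solve OLS.
    Here the weights are school enrollments."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

# Illustrative school-level data (hypothetical, not the paper's).
rng = np.random.default_rng(1)
n = 264
enroll = rng.integers(200, 1500, n).astype(float)
teacher_logins = rng.normal(200, 50, n)
X = np.column_stack([np.ones(n), teacher_logins])   # intercept + one covariate
share_adopted = 0.1 + 0.002 * teacher_logins + rng.normal(0, 0.05, n)

theta = wls(X, share_adopted, enroll)
```

The rescaling trick works because minimizing the weighted sum of squared residuals is equivalent to ordinary least squares on the rescaled data.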

Table 3 presents the results of this regression for the 2013–14 school year. The share of black students at a school negatively correlates with adoption, whereas the coefficient on the share of Hispanic students is small and insignificant after controlling for the remaining covariates. Interestingly, adoption is highest at the middle-school level and statistically different from elementary and high school families' adoption. Though cross-sectional, this disparity is in line with other cross-sectional measures of parental monitoring, such as parent–teacher conference attendance, which drops sharply from middle to high school (Noel, Stark, and Redford 2013). The GreatSchools rating, a proxy for a school's test-score performance within its state, is a strong, positive predictor of adoption. For the highest-performing 10 percent of schools, roughly 75 percent of families have ever logged into the system; for the lowest-performing 10 percent, roughly 20 percent have (see figure A.3). Lastly, Title I schools and schools with a high share of students eligible for free or reduced-price lunch have lower adoption rates, though their coefficients may be negative or positive after conditioning on the other covariates of the model.4

Table 3.
School-Level Correlates of Families Ever Logging in to a Portal
| Dependent Variable: Ever Logged | Coefficient | Std. Error |
| --- | --- | --- |
| Black | −0.24*** | (0.09) |
| Hispanic | −0.04 | (0.18) |
| Middle school | 0.26*** | (0.04) |
| High school | −0.11** | (0.04) |
| Share free/reduced-price lunch | 0.22 | (0.17) |
| Suburban | 0.03 | (0.04) |
| Urban | −0.05 | (0.05) |
| GreatSchools rating | 0.020** | (0.01) |
| Rural | 0.01 | (0.04) |
| Title I | −0.08* | (0.05) |
| Student/teacher | −0.01*** | (0.00) |
| Observations | 264 schools | 145,139 students |
| R² | 0.69 | |

Notes: This table presents results from a student-weighted regression of the school-level share of families who have ever logged in to the portal on school-level demographic and performance indicators. Student/teacher ratios are coded as missing if larger than 100. Teacher logins are coded as missing if larger than the 99th percentile of all logins. Login data are from the learning management system on 264 schools representing 145,139 students linked to school-level data from the National Center for Education Statistics Common Core Data. Missing values are imputed and indicators for missing data are included in the regression. Robust standard errors in parentheses.

*p < 0.10; **p < 0.05; ***p < 0.01.

The final row of table 3 shows the logins-per-teacher variable, which proxies for the supply of information provided to families. This variable is calculated by dividing total teacher logins by the number of teachers at the school.5 This measure of how often teachers use the gradebook positively correlates with family adoption of the system.6 Higher student-to-teacher ratios, which may make it more difficult to keep grade information up to date, negatively correlate with adoption.

Overall, these variables can explain nearly 70 percent of the variation in the adoption shares. Much of this variation appears to be explained by Title I status, the grade levels served by the school, and teacher logins. The results also highlight how the supply and demand for information are likely determined simultaneously, making it difficult to recover the causal effects of the technology on student outcomes. The experiment discussed in section 4 identifies, through an encouragement design, the effects of usage, spillovers, and achievement impacts of this technology.

## 4.  Experimental Design and Implementation

### Experimental Design

The experimental intervention consisted of a mailer and a phone call targeted to parents. The mailer informed families about the parent portal, told them they would be called regarding the parent portal service, and provided the school phone number so parents could obtain their account information directly from the school. The subsequent phone call told families their user name, password, and the Web site URL for the parent portal if they had not already obtained them from the school.

The sample frame for the intervention comprised three districts operating fifty-nine elementary, middle, and high schools across two states. Within these districts, the sample was restricted to parents who had ever logged in to the parent portal five or fewer times.7 The latter restriction aims to target the intervention to low-usage parents while retaining 82 percent of all students’ parents.

Figure A.5 describes the treatment allocation. The intervention was randomized in two stages. First, twenty-nine schools were randomly selected to have a sample of families receive the intervention. The remaining thirty schools had access to the parent portal, but none of their parents received any form of the intervention. Within the twenty-nine selected schools, just under half of the parents in the sample frame were selected to receive the intervention. This allocation mechanism formed a treated group, assigned to receive a phone call and a mailer; a spillover group, in the same schools as the treated families but receiving neither a mailer nor a phone call; and a control group attending schools in which no one was treated. Spillovers may occur through word of mouth: parents learn about the intervention from another family and either obtain the account information for their parent account or use their child's account, as in the intervention designed by Duflo and Saez (2003).

School-level treatment assignment was stratified according to indicators for whether more than 25 percent of families had logged into the parent portal at baseline, whether more than 50 percent of students received free or reduced-price lunch, and indicators for each school's district. Importantly, all families and teachers were blinded to the study, and the intervention was presented as district-led outreach to parents.
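The two-stage, stratified assignment described above can be sketched as follows. The function and variable names are illustrative, and the exact half/half split within treated schools is an assumption for the sketch (the paper treats "just under half" of sampled parents):

```python
import random

def two_stage_assign(schools, strata, p_school=0.5, p_family=0.5, seed=7):
    """Two-stage randomization: treat schools within strata, then families
    within treated schools. `schools` maps school id -> list of family ids;
    `strata` maps school id -> stratum label."""
    rng = random.Random(seed)
    # Stage 1: within each stratum, randomly treat a share of schools.
    by_stratum = {}
    for s in schools:
        by_stratum.setdefault(strata[s], []).append(s)
    treated_schools = set()
    for members in by_stratum.values():
        members = sorted(members)
        rng.shuffle(members)
        treated_schools.update(members[: round(len(members) * p_school)])
    # Stage 2: within treated schools, randomly treat a share of families;
    # the untreated remainder form the spillover group. Families in
    # untreated schools are controls.
    assignment = {}
    for s, fams in schools.items():
        if s not in treated_schools:
            assignment.update({f: "control" for f in fams})
        else:
            fams = sorted(fams)
            rng.shuffle(fams)
            cut = round(len(fams) * p_family)
            assignment.update({f: "treated" for f in fams[:cut]})
            assignment.update({f: "spillover" for f in fams[cut:]})
    return treated_schools, assignment
```

Stratifying stage 1 on baseline usage, free or reduced-price lunch share, and district, as the paper does, simply means the `strata` labels combine those three indicators.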

### Data and Implementation

The data used for this experiment are similar to the descriptive data studied above. As above, baseline data from the LMS consist of portal login information and student course grades. NCES Common Core data were available for fifty-eight of fifty-nine schools in the sample. GreatSchools school quality ratings were available for fifty-four of the fifty-nine schools. Students’ GPAs are standardized by district according to the untreated schools’ means and standard deviations.

As described previously, 5,027 students' parents (4,557 unique phone numbers) were assigned to the treatment group. Mailers notifying parents about the parent portal, how to obtain their account information, and the impending phone call, arrived at the start of November 2013. A phone bank contacted families over the course of the second week of November 2013.

### Empirical Strategy

The random assignment of the phone and the mailer intervention across schools, and subsequently across individuals, means that families in the treatment, spillover, and control groups have similar potential outcomes with respect to the treatments. By comparing outcomes between each group it is possible to estimate the impacts on the treatment and spillover groups. I estimate intent-to-treat impacts as follows:8
$$y_{is} = \beta_0 + \beta_1 \mathit{TreatSchool}_{is} + \beta_2 \mathit{Spillover}_{is} + X_{is}'\Gamma + \eta_{is}.$$

Outcomes $y_{is}$ are login and academic outcomes at the individual level for students in school $s$.9 The $\mathit{TreatSchool}_{is}$ variable indicates whether a student is in a school in which anyone receives the treatment. The $\mathit{Spillover}_{is}$ variable indicates a student who was not assigned to the intervention but was in a treated school. This specification implies that the $\beta_1$ coefficient is the effect of the intervention on those families who were selected to receive the treatment. The coefficient on the spillover term, $\beta_2$, estimates the differential impact on the spillover group, that is, those who were in schools with families selected for treatment. The test of significance for this coefficient provides evidence on whether we can reject that the spillover group experienced an effect similar in magnitude to that of the treated group; this test is underpowered, however. To test whether statistically significant spillover effects exist, which is a better-powered test, I also show the p-value from a test of whether the spillover coefficient differs significantly from zero. The $X_{is}$ term is a vector of school- and individual-level controls as well as strata indicators: the shares of white and black students at the school, the GreatSchools rating of the school, the fraction of students receiving free or reduced-price lunch, baseline total logins, and an indicator for ever logging in. I test for heterogeneity by interacting these terms with treatment indicators; however, for variables measured at the school level, these tests of heterogeneous effects are underpowered. I impute any missing values with the mean value of the variable and include indicators for missing data for any schools or students lacking such data. All standard errors are clustered at the school level.
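Because the two indicators saturate the three groups, the intent-to-treat coefficients reduce to differences in group means. A minimal sketch with simulated data, omitting the controls and the school-level clustering, is:

```python
import numpy as np

def itt(y, treated, spillover):
    """ITT regression of the outcome on treated- and spillover-group
    indicators (controls omitted for brevity). Returns
    [control mean, treated - control, spillover - control]."""
    X = np.column_stack([np.ones(len(y)), treated, spillover])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Hypothetical outcomes, not the paper's data: both treated and spillover
# groups shifted up by 0.10 standard deviations relative to control.
rng = np.random.default_rng(2)
g = rng.integers(0, 3, 3000)                    # 0 control, 1 treated, 2 spillover
y = rng.normal(0, 1, 3000) + 0.10 * (g == 1) + 0.10 * (g == 2)
beta = itt(y, (g == 1).astype(float), (g == 2).astype(float))
```

In the saturated design, `beta[1]` and `beta[2]` equal the raw mean differences exactly; adding the paper's controls and strata indicators changes only precision, not the estimand, under random assignment.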

The histograms described previously show that the number of logins is a heavily skewed count variable (e.g., figure 1). As such, I model the login data using a negative binomial regression and report marginal effects at the means. Though not shown, results are similar in magnitude and precision under transformations of the data, such as the inverse hyperbolic sine;10 results are also quite similar, though marginally less precise, when using linear regression. When the outcome is an indicator for any usage, I model the data using a linear probability model, but average marginal effects are almost exactly the same when estimated using a probit or logit model.
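A minimal illustration of the inverse hyperbolic sine transformation referenced above: unlike the logarithm, it is defined at zero, which suits login counts with many zeros, while behaving like $\log(2x)$ for large counts:

```python
import numpy as np

# arcsinh(x) = log(x + sqrt(x^2 + 1)): zero at zero, approximately
# log-like for large skewed counts such as total logins.
logins = np.array([0, 1, 5, 25, 200], dtype=float)
transformed = np.arcsinh(logins)
```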

Random assignment also implies that background characteristics should be comparable across groups in expectation. Table 4 shows covariate balance across the three groups. The average GPA in the sample is 2.5; students miss 8 percent of their assignments, on average; and average total logins into the parent and student portals from the start of the school year until the second week of October are 0.6 and 22, respectively. As in the descriptive results, table 4 also shows that logins into the student account are much higher than logins into the parent account in the study sample.

Table 4.
Balance Table

| | Treatment (T) Mean | Control (C) Mean | T − C | p | Number of Schools | Number of Observations |
|---|---|---|---|---|---|---|
| **Treatment vs. control** | | | | | | |
| Grade point average | 2.43 | 2.48 | −0.05 | 0.50 | 59 | 15,192 |
| Fraction missing | 0.08 | 0.07 | 0.01 | 0.65 | 59 | 16,174 |
| Parent portal logins | 0.60 | 0.74 | −0.14 | 0.16 | 59 | 16,367 |
| Student portal logins | 23.3 | 20.6 | 2.68 | 0.24 | 59 | 16,367 |
| Total logins | 23.9 | 21.4 | 2.53 | 0.27 | 59 | 16,367 |
| **Spillover vs. control** | | | | | | |
| Grade point average | 2.44 | 2.48 | −0.04 | 0.53 | 59 | 15,680 |
| Fraction missing | 0.08 | 0.07 | 0.01 | 0.64 | 59 | 16,639 |
| Parent portal logins | 0.66 | 0.74 | −0.08 | 0.43 | 59 | 16,827 |
| Student portal logins | 22.7 | 20.6 | 2.71 | 0.21 | 59 | 16,827 |
| Total logins | 24.0 | 21.4 | 2.64 | 0.23 | 59 | 16,367 |
| **School level** | | | | | | |
| White | 0.63 | 0.64 | −0.01 | 0.64 | 58 | N/A |
| Black | 0.30 | 0.31 | 0.02 | 0.53 | 58 | N/A |
| Hispanic | 0.02 | 0.03 | −0.01 | 0.64 | 58 | N/A |
| Fraction FRPL | 0.60 | 0.61 | −0.01 | 0.90 | 58 | N/A |
| Rating | 4.5 | 5.0 | −0.49 | 0.34 | 54 | N/A |

Notes: All data are at the student level except the school-level comparisons, which are unweighted results at the school level. Data are from the learning management system, with the exception of variables under the “School Level” heading, which are from the National Center for Education Statistics Common Core Data and are school-level aggregate variables. FRPL = free or reduced-price lunch.

*p < 0.10; **p < 0.05; ***p < 0.01.

The schools are 63 percent white, 30 percent black, and 3 percent Hispanic. Sixty percent of students receive free or reduced-price lunch. At the individual and school levels there are no significant differences between the treatment, spillover, and control groups. The number of schools is small relative to the number of observations, and results will be shown with and without controls.

Differential attrition across treatment, spillover, and control groups could bias estimates of treatment effects. The login data do not indicate whether a student has left a participating district, but observing no final grades is an indicator of district attrition. Table A.1 tests for differential attrition across treatment and spillover groups by estimating equation 1, without controls, on an indicator for whether or not a student has a final grade. There is no evidence of differential attrition from the sample.

## 5.  Adoption, Spillovers, and Efficacy

### Usage Effects

Figure 5 plots the treatment effect on logins per month for the treatment schools compared with the control schools. The vertical dashed line in the figure indicates when the phone treatment occurred. Usage immediately increases by roughly 1.5 to 2.5 logins per month. The treatment effect persists through the remainder of the school year, with an upward spike in March. Figure 6 shows the same graph for the spillover group. Treatment effects on the spillover group exhibit a similar pattern to the treatment group: a swift rise at the outset that largely persists throughout the remainder of the academic year. The magnitudes of the effect are slightly smaller, and though the confidence intervals for a given month are wide, there is nonetheless an indication of positive spillovers. Note that each confidence interval in a figure represents the precision for the treatment (figure 5) or spillover (figure 6) effect on usage over the course of a particular month. The regression results discussed below test total post-intervention usage with much better power, as well as whether spillover effects are statistically distinguishable from the treatment effects.

Figure 5.

Usage: Treatment vs. Control

Notes: This figure shows the treatment effect on the number of times families logged in per month over the course of the school year (SY). The vertical dashed line indicates when the treatment began. The effects are marginal effects at mean usage from the negative-binomial regression described in the text with usage for each month as the outcome. 95 percent confidence intervals shown. Data come from the learning management system company.

Figure 6.

Usage: Spillover vs. Control

Notes: This figure shows the spillover effect on the number of times families logged in per month over the course of the school year (SY). The vertical dashed line indicates when the treatment began. The effects are marginal effects at mean usage from the negative-binomial regression described in the text with usage for each month as the outcome. 95 percent confidence intervals shown. Data come from the learning management system company.

Table 5 presents the regression results. The Treatschool variable indicates whether a school was treated, and the spillover differential indicates the extent to which the spillover group's effect differs from directly receiving the treatment. The significance of the latter tests whether the spillover effect is statistically different from the direct effect of the intervention, though this test is underpowered. The effect on the spillover group is the Treatschool coefficient plus the differential coefficient, and the p-values in the table reflect whether this sum differs statistically from zero. For each outcome, the first column shows the effects with no control variables (except strata indicators and baseline usage, to assess effects at the mean) along with the control mean. The adjacent column presents the same outcome with the additional controls described above.

Table 5.
Effects on Usage

| | Total Logins (1) | Total Logins (2) | Ever Logged In (3) | Ever Logged In (4) |
|---|---|---|---|---|
| Treated school | 12.07** | 10.58** | 0.07* | 0.04** |
| | (6.12) | (4.89) | (0.04) | (0.02) |
| Spillover differential | −0.78 | −1.01* | −0.04*** | −0.03** |
| | (0.54) | (0.59) | (0.01) | (0.01) |
| Control mean | 45.47 | | 0.68 | |
| p-value, spillovers different from 0 | 0.046 | 0.030 | 0.45 | 0.44 |
| Observations | 21,854 | 21,854 | 21,854 | 21,854 |
| Additional controls | No | Yes | No | Yes |

Notes: All data are at the student level and are constructed from the learning management system data. Total logins represents the total number of logins into the student or parent portal. Ever logged in is an indicator for whether there was any login to either the student or parent portal after the intervention. The Spillover differential variable shows the difference in effect between the treatment group and the spillover group. The first two columns are marginal effects from a negative-binomial regression. Columns 3 and 4 are marginal effects from a linear-probability model. Additional controls described in the text. Marginal effects reported at baseline-mean usage. The p-value shown below the control mean is the p-value for a test of whether the spillover coefficient is different from zero. Standard errors clustered at the school level are shown in parentheses.

*p < 0.10; **p < 0.05; ***p < 0.01.

The first two columns of table 5 show that total usage increased by roughly eleven logins, or nearly two logins per month, as a result of the intervention. The effects are smaller for the spillover group by one login in total, and this difference from the treatment group when controls are included is significant at the 10 percent level. This provides evidence that the spillover effect on logins is marginally smaller than the effect on the treatment group. The magnitude of the spillover effect is statistically different from zero at the 5 percent level, as indicated by the p-values.11

The remaining two columns show that, by the end of the year, just over two thirds of families had logged into either the student or parent portal. There is a 4 percentage point increase in the likelihood of ever logging into a portal. This effect is significantly smaller for the spillover group, which was not provided their account information. This smaller effect on the likelihood of ever logging in could be one reason the number of total logins is slightly smaller for the spillover group than the treatment group. In results not shown, the effects on take-up are significantly larger among those who had never logged in at baseline—8 percentage points—and remain smaller for the spillover group.

These results are consistent with a potential word-of-mouth effect from treated families to families within the same school. Whereas treated families received their account information, families in the spillover group would have had to seek out this information or perhaps just use the student portal. I cannot directly assess whether parents contacted the school to obtain their account information. However, there is suggestive evidence regarding whether families relied on their child's student account. This may be more likely to occur for parents in the spillover group who did not have or recall their own account information but whose children knew their student login information. These families may rely more on the student accounts than those who received their account information directly from treatment assignment. I can proxy for this by looking at the subgroup of families who had never logged in to the parent account but had logged in to their student account. In results not shown, I find this subgroup is significantly (2 percentage points) more likely to have relied exclusively on the student account than the treatment group.

The results on both adoption and usage also have implications for interpreting any treatment-on-the-treated effect for other outcomes. Viewing the effects as operating through a single channel violates the exclusion restriction: there are effects on both the intensive and extensive margins, and the intervention may also have affected usage by revising parents' views on the importance of monitoring their children. Overall, the reduced-form effects of the intervention show additional usage equivalent to between one and two logins per month for the treatment and spillover groups.

### Student Achievement Effects

This section examines the impact on student GPA from the nudge intervention described in section 4. Table 6 presents the results. The first column shows results without controls and the second column adds the controls described in the text, including baseline GPA. The latter improve the precision of the estimates significantly but the point estimate remains almost unchanged. Overall the effect size is 0.10 standard deviations and is significant at the 5 percent level.

Table 6.
Effects on Student Grade Point Average (GPA)

| Dependent Variable | GPA | Z-Score |
|---|---|---|
| Treatment | 0.11 | 0.10** |
| | (0.09) | (0.05) |
| Spillover differential | −0.06 | −0.01 |
| | (0.02) | (0.02) |
| p-value, spillovers different from 0 | 0.26 | 0.022 |
| Observations | 19,218 | 19,218 |

Notes: All data are at the student level and are constructed from the learning management system data. GPA standardized according to control group means. The Spillover differential variable shows the difference in effect between the treatment group and the spillover group. The p-value shown below the control mean is the p-value for a test of whether the spillover coefficient is different from zero. Additional controls variables described in the text. Standard errors clustered at the school level are shown in parentheses.

**p < 0.05.

The effect on student grades does not significantly differ by treatment or spillover group (though again this statistical test is underpowered). This result is consistent with the effects on total logins, which are similar for both treatment and spillover groups. The effect size is roughly half of the effect size found in Bergman (forthcoming), in which information was actively pushed to parents about their child's academic performance rather than pulled from a portal system.12 As stated above, it is difficult to scale this effect through an instrumental-variable strategy that uses the intervention as an instrument for usage; the exclusion restriction is not satisfied as adoption, usage, and awareness about the importance of monitoring and information may all have been affected by the intervention. Nonetheless, the results do highlight the potential for a low-cost intervention to leverage this technology to promote academic achievement.
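The z-score outcome in table 6 standardizes GPA against the control group's mean and standard deviation, so effect sizes are in control-group standard-deviation units. A minimal sketch of that standardization, using invented GPA values:

```python
import numpy as np

# Illustrative control-group GPAs (not data from the study).
control_gpa = np.array([2.1, 2.5, 2.8, 2.4, 2.7])
mu, sd = control_gpa.mean(), control_gpa.std(ddof=1)

def z_score(gpa):
    """Standardize GPA against the control group's mean and SD."""
    return (gpa - mu) / sd

# A raw-GPA gain of 0.10 control-group SDs is, by construction,
# a 0.10 effect on the z-score scale.
print(round(z_score(mu + 0.10 * sd), 2))
```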

Table 7 shows exploratory analyses of whether the effects on GPA vary by subgroup. These analyses are exploratory because the study is, in general, not well powered to detect interaction effects, and examining heterogeneity across several variables exacerbates the multiple-testing problem.

Table 7.
Subgroup Effects on Student Grade Point Average (GPA)

| | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) |
|---|---|---|---|---|---|---|---|---|
| Treatschool | 0.126 | 0.107** | 0.135* | −0.015 | 0.171 | 0.107** | 0.072 | −0.129* |
| | (0.116) | (0.045) | (0.068) | (0.141) | (0.121) | (0.046) | (0.046) | (0.065) |
| Treatschool × Base GPA | −0.01 | | | | | | | |
| | (0.043) | | | | | | | |
| Treatschool × Female | | −0.013 | | | | | | |
| | | (0.022) | | | | | | |
| Treatschool × Share Black | | | −0.113 | | | | | |
| | | | (0.114) | | | | | |
| Treatschool × Share FRPL | | | | 0.214 | | | | |
| | | | | (0.218) | | | | |
| Treatschool × GreatSchools Rating | | | | | −0.013 | | | |
| | | | | | (0.024) | | | |
| Treatschool × Base Usage | | | | | | −0.008* | | |
| | | | | | | (0.004) | | |
| Treatschool × Student Base Usage | | | | | | | 0.001** | |
| | | | | | | | (0.000) | |
| Treatschool × Teacher Base Usage | | | | | | | | 0.001*** |
| | | | | | | | | (0.000) |
| Observations | 19,218 | 19,218 | 19,218 | 19,218 | 19,218 | 19,218 | 19,218 | 19,218 |
| Controls | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes |
| Outliers excluded | No | No | No | No | No | No | No | No |

Notes: All data are at the student level and are constructed from the learning management system data. Standard errors clustered at the school level are shown in parentheses. FRPL = free or reduced-price lunch.

*p < 0.1; **p < 0.05; ***p < 0.01.

For ease of presentation, the analysis is conducted with a school-level treatment indicator, which combines treated and spillover groups. There are no differences in heterogeneity between the spillover and treatment groups (results available on request). The results show there are no differential effects by baseline GPA, gender, or school-level demographic and performance characteristics.

Heterogeneity does appear to occur by measures of baseline usage. Parents who used the system more at baseline saw smaller effects. Moreover, higher levels of student-portal usage are associated with larger effects, and students whose teachers use the system more frequently also experience larger gains in GPA. To benchmark the amount of heterogeneity, a half-standard deviation increase in student-portal usage leads to a 0.02 standard deviation gain in GPA, and a half-standard deviation increase in the average logins by a student's teachers leads to a 0.10 standard deviation increase in GPA. A half-standard deviation increase in parent usage reduces effects by 0.01 standard deviations. These results are consistent with, though do not necessarily imply, a complementarity between parent usage and teacher usage of the portal in promoting student outcomes.
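The benchmarking arithmetic above multiplies each interaction coefficient by half a standard deviation of the moderator. The sketch below reproduces the quoted magnitudes; the standard-deviation values are back-solved assumptions chosen for illustration, not statistics reported in the paper.

```python
def half_sd_effect(interaction_coef, moderator_sd):
    """GPA effect (in SD units) implied by a half-SD move in the moderator."""
    return interaction_coef * 0.5 * moderator_sd

# Assumed moderator SDs (hypothetical, back-solved from the text's benchmarks):
# student-portal usage SD = 40, teacher usage SD = 200, parent usage SD = 2.5.
print(round(half_sd_effect(0.001, 40), 2))    # student-portal usage
print(round(half_sd_effect(0.001, 200), 2))   # teacher usage
print(round(half_sd_effect(-0.008, 2.5), 2))  # parent usage
```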

## 6.  Discussion and Conclusion

Previous research has shown that school-to-parent communication can foster parental engagement and improve a range of student outcomes. This paper documents some of the first evidence on families’ adoption of a school communication technology that aims to scale school-to-family communication. Adoption is not universal; more than 40 percent of families have never logged in to the system. Logins to the student portal occur at a much higher rate than logins to the parent portal. Schools with higher login rates tend to serve higher-income and higher-performing students. This implies that without active efforts to promote usage, simply placing information online could still leave gaps in the receipt of timely, actionable information about student performance across advantaged and disadvantaged groups.

A simple intervention providing account information to parents increased families' adoption and usage by almost two logins per month. There was also an increase in usage among families who did not receive the intervention. This additional usage led to a modest increase in grades for both the treated and spillover groups of students. The results also suggest a possible complementarity between parent usage and teacher usage of the portal: both the usage and the GPA treatment effects are larger in schools where teachers used the system more frequently. One might hope the intervention could generate a demand shock large enough to increase the supply of information, as proxied by teacher logins, but the study was not powered to detect such effects.

Though these gains are small, the intervention has a low marginal cost to implement. The mailers cost $0.70 per student to print and mail; the phone calls cost $1.36 per student to manage and implement. These figures do not incorporate any potential costs incurred by teachers, which may arise if teachers update their grade books more often or communicate with families more frequently as a result of the intervention.
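The per-student marginal cost combines the two components reported above; a trivial sketch of the arithmetic:

```python
# Per-student marginal cost of the nudge, from the figures in the text.
mailer_cost = 0.70   # printing and mailing, per student
phone_cost = 1.36    # managing and placing the calls, per student
total_cost = mailer_cost + phone_cost
print(f"${total_cost:.2f} per student")
```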

Nonetheless, the effects on usage are far from sufficient to close the gaps between high- and low-test-score schools, or between schools serving a majority of students who receive free or reduced-price lunch and those that do not. Merely providing access to information online may not improve outcomes in schools in low-income areas and schools with low test scores.

However, it is important to not conflate a lack of usage with a lack of demand or low valuation of information; recent research has shown that seemingly trivial barriers to access, such as requiring families to opt in versus opt out of receiving notifications about their child's academic progress, can lead to dramatic differences in take-up (Bergman and Rogers 2017). Given that there are benefits to providing such information, school districts should monitor how many families access parent and student portals, check that usage is equitable across different student populations, and intervene to promote usage when it is low.

## Acknowledgments

I thank Josefa Aguirre, Eric Chan, and Susha Roy for excellent research assistance. I am particularly grateful to the Seminar for the Study of Development Strategies, and to Abhit Bhandari and Kolby Hanson for thoughtfully replicating this paper and providing comments. I also thank George Bulman, Sue Dynarski, Jay Greene, Macartan Humphreys, Scott Imberman, Brian Jacob, Isaac McFarlin, Richard Murphy, Kevin Stange, and seminar participants at the University of Arkansas, the University of Michigan, and the University of Connecticut for their comments and suggestions. All errors are my own.

## Notes

1. Students also may wish to use the portal to track their course grades and what teachers have marked on their latest assignments.

2. This variable is only used as a covariate and provides a readily available indicator of test scores across states. Currently, school-level test scores have no anchored measure to directly compare levels across different tests, though the Stanford Education Data Archive does provide the latter at the district level.

3. Figure 1 shows the distribution of total usage of either parent or student accounts for all users who have logged in at least once; it has a long right tail.

4. Figure A.4 shows the negative correlation between the share of students receiving free or reduced-price lunch and the share of families who have ever logged in.

5. Although teachers must use the portal to enter grades at the end of marking periods, teacher logins do not directly reflect how often teachers update their gradebooks. For instance, I cannot observe how many changes a teacher makes to the gradebook per login; a teacher could log in once to update one assignment grade for a particular student, and this would be observationally equivalent to logging in once to update the grades for an entire class or to update multiple assignments for a given student.

6. Similar measures of supply, such as the average number of teacher logins per student, also positively correlate with parent adoption.

7. Unfortunately, our interviews with school personnel, which informed us that families often log in to either the parent portal or the student portal, occurred after the experiment. This is why we targeted low-intensity users of the parent portals rather than low-intensity users of both the parent and student portals.

8. Treatment-on-the-treated impacts are confounded by the simultaneous impact on additional usage by existing users and adoption by new users through their parent or student account.

9. There are no effects on the number of logins by teachers (results available upon request).

10. This transformation is akin to a log transformation but does not treat zeros as missing.

11. Duflo and Saez (2003) also found large increases in enrollment in tax-deferred accounts in both the treatment and spillover groups.

12. For context, the Bergman (forthcoming) study was conducted at a middle school and high school near downtown Los Angeles. Both schools served more low-income and more Hispanic students relative to the current study.

## REFERENCES

,
Sam
.
2016
.
2015 international learning technology investment patterns
.
Monroe, WA
:
Ambient Insight
.
Akabayashi
,
Hideo
.
2006
.
An equilibrium model of child maltreatment
.
Journal of Economic Dynamics and Control
30
(
6
):
993
1025
.
Allcott
,
Hunt
.
2011
.
Social norms and energy conservation
.
Journal of Public Economics
95
(
9
):
1082
1095
.
Allcott
,
Hunt
, and
Todd
Rogers
.
2014
.
The short-run and long-run effects of behavioral interventions: Experimental evidence from energy conservation
.
American Economic Review
104
(
10
):
3003
3037
.
Angrist
,
Joshua
, and
Victor
Lavy
.
2002
.
New evidence on classroom computers and pupil learning
.
Economic Journal
112
(
482
):
735
765
.
Banerjee
,
Abhijit V.
,
Shawn
Cole
,
Esther
Duflo
, and
Leigh
Linden
.
2007
.
Remedying education: Evidence from two randomized experiments in India
.
Quarterly Journal of Economics
122
(
3
):
1235
1264
.
Barrera-Osorio
,
Felipe
, and
Leigh L.
Linden
.
2009
.
The use and misuse of computers in education: Evidence from a randomized experiment in Colombia
.
Available
http://documents.worldbank.org/curated/en/346301468022433230/pdf/WPS4836.pdf.
Accessed 9 April 2020
.
Barrow
,
Lisa
,
Lisa
Markman
, and
Cecilia Elena
Rouse
.
2009
.
Technology's edge: The educational benefits of computer-aided instruction
.
American Economic Journal: Economic Policy
1
(
1
):
52
74
.
Beland
,
Louis-Philippe
, and
Richard
Murphy
.
2016
.
Ill communication: Technology, distraction & student performance
.
Labour Economics
41
:
61
76
.
Belo
,
Rodrigo
,
Pedro
Ferreira
, and
Rahul
Telang
.
2013
.
Broadband in school: Impact on student performance
.
Management Science
60
(
2
):
265
282
.
Bergman
,
Peter
.
Forthcoming
.
Parent-child information frictions and human capital investment: Evidence from a field experiment
.
Journal of Political Economy
.
In press
.
Bergman
,
Peter
, and
Eric
Chan
.
2017
.
Leveraging technology to engage parents at scale: Evidence from a randomized controlled trial
.
CESifo Working Paper Series No. 6493.
Munich, Germany
.
Bergman
,
Peter
,
Chana
Edmond-Verley
, and
Nicole
Notario-Risk
.
2018
.
Parent skills and information asymmetries: Experimental evidence from home visits and text messages in middle and high schools
.
Economics of Education Review
66
:
92
103
.
Bergman
,
Peter
, and
Todd
Rogers
.
2017
.
The impact of defaults on technology adoption, and its underappreciation by policymakers
.
CESifo Working Paper Series No. 6721
.
Munich, Germany
.
Beuermann
,
Diether W.
,
Julian
Cristia
,
Santiago
Cueto
,
Ofer
Malamud
, and
Yyannu
Cruz-Aguayo
.
2015
.
One laptop per child at home: Short-term impacts from a randomized experiment in Peru
.
American Economic Journal: Applied Economics
7
(
2
):
53
80
.
Bhargava
,
Saurabh
, and
Dayanand
Manoli
.
2015
.
Psychological frictions and the incomplete take-up of social benefits: Evidence from an IRS field experiment
.
American Economic Review
105
(
11
):
3489
3529
.
Bird
,
Kelli A.
,
Benjamin L.
Castleman
,
Joshua
Goodman
, and
Cait
Lamberton
.
2017
.
Nudging at a national scale: Experimental evidence from a FAFSA completion campaign
.
NBER Working Paper No. 26158
.
Bowen
,
William G.
,
Matthew M.
Chingos
, and
Michael S.
McPherson
.
2009
.
Crossing the finish line: Completing college at America's public universities
.
Princeton, NJ
:
Princeton University Press
.
Bridgeland
,
John M.
,
John J.
DiIulio
,
Ryan T.
Streeter
, and
James R.
Mason
.
2008
.
One dream, two realities: Perspectives of parents on America's high schools
.
Washington, DC
:
Civic Enterprises
.
Bulman
,
George
, and
Robert W.
Fairlie
.
2016
.
Technology and education: Computers, software, and the Internet
. In
Handbook of the economics of education
,
Vol. 5, edited by
Eric A.
Hanushek
,
Stephen
Machin
, and
Ludger
Woessmann
, pp.
239
280
.
Oxford, UK
:
North Holland
.
Bursztyn
,
Leonardo
, and
Lucas C.
Coffman
.
2012
.
The schooling decision: Family preferences, intergenerational conflict, and moral hazard in the Brazilian favelas
.
Journal of Political Economy
120
(
3
):
359
397
.
Castleman
,
Benjamin L.
, and
Lindsay C.
Page
.
2014
.
Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates
?
Journal of Economic Behavior & Organization
115
:
144
160
.
Castleman
,
Benjamin
, and
Lindsay C.
Page
.
2017
.
Parental influences on postsecondary decision-making: Evidence from a text messaging experiment
.
Educational Evaluation and Policy Analysis
39
(
2
):
361
377
.
Chapman
,
Ben
.
2014
.
City schools dumping \$95 million computer system for tracking student data
.
New York Daily News
,
16 November
.
Cialdini
,
Robert B.
,
Linda J.
Demaine
,
Sagarin
,
Daniel W.
Barrett
,
Kelton
, and
Patricia L.
Winter
.
2006
.
Managing social norms for persuasive impact
.
Social Influence
1
(
1
):
3
15
.
Conley
,
Timothy G.
, and
Christopher R.
Udry
.
2010
.
Learning about a new technology: Pineapple in Ghana
.
American Economic Review
100
(
1
):
35
69
.
Cosconati
,
Marco
.
2009
.
Parenting style and the development of human capital in children
.
Available
https://economicdynamics.org/meetpapers/2011/paper_854.pdf.
Accessed 9 April 2020
.
Dettling
,
Lisa J.
,
Sarena
Goodman
, and
Jonathan
Smith
.
2018
.
Every little bit counts: The impact of high-speed internet on the transition to college
.
Review of Economics and Statistics
100
(
2
):
260
273
.
Duflo
,
Esther
, and
Emmanuel
Saez
.
2003
.
The role of information and social interactions in retirement plan decisions: Evidence from a randomized experiment
.
Quarterly Journal of Economics
118
(
3
):
815
842
.
Duflo
,
Esther
,
Michael
Kremer
, and
Jonathan
Robinson
.
2011
.
Nudging farmers to use fertilizer: Theory and experimental evidence from Kenya
.
American Economic Review
101
(
6
):
2350
2390
.
Dupas
,
Pascaline
.
2014
.
Short-run subsidies and long-run adoption of new health products: Evidence from a field experiment
.
Econometrica
82
(
1
):
197
228
.
Fairlie
,
Robert W.
, and
Jonathan
Robinson
.
2013
.
Experimental evidence on the effects of home computers on academic achievement among schoolchildren
.
American Economic Journal: Applied Economics
5
(
3
):
211
240
.
Foster
,
Andrew D.
, and
Mark R.
Rosenzweig
.
1995
.
Learning by doing and learning from others: Human capital and technical change in agriculture
.
Journal of Political Economy
103
(
6
):
1176
1209
.
Foster
,
Andrew D.
, and
Mark R.
Rosenzweig
.
2010
.
.
Annual Review of Economics
2
:
395
424
.
Fryer
,
Roland G.
2013
.
Information and student achievement: Evidence from a cellular phone experiment
.
NBER Working Paper No. 19113
.
Gerber
,
Alan S.
, and
Todd
Rogers
.
2009
.
Descriptive social norms and motivation to vote: Everybody's voting and so should you
.
Journal of Politics
71
(
1
):
178
191
.
Goldstein
,
Noah J.
,
Griskevicius
, and
Robert B.
Cialdini
.
2007
.
Invoking social norms: A social psychology perspective on improving hotels’ linen-reuse programs
.
Cornell Hotel and Restaurant Administration Quarterly
48
(
2
):
145
150
.
Goolsbee
,
Austan
, and
Jonathan
Guryan
.
2006
.
The impact of internet subsidies in public schools
.
Review of Economics and Statistics
88
(
2
):
336
347
.
Hallsworth
,
Michael
,
John A.
List
,
Robert D.
Metcalfe
, and
Ivo
Vlaev
.
2017
.
The behavioralist as tax collector: Using natural field experiments to enhance tax compliance
.
Journal of Public Economics
148
:
14
31
.
Hao
,
Lingxin
,
V.
Joseph Hotz
, and
Ginger Z.
Jin
.
2008
.
Games parents and adolescents play: Risky behaviour, parental reputation and strategic transfers
.
Economic Journal
118
(
528
):
515
555
.
Hiss
,
William C.
, and
Valerie W.
Franks
.
2014
.
Defining promise: Optional standardized testing policies in American college and university admissions
.
Available
Accessed 9 April 2020
.
Kraft, Matthew A., and Todd Rogers. 2015. The underutilized potential of teacher-to-parent communication: Evidence from a field experiment. Economics of Education Review 47: 49–63.
Kremer, Michael, and Edward Miguel. 2007. The illusion of sustainability. Quarterly Journal of Economics 122(3): 1007–1065.
Linden, Leigh L. 2008. Complement or substitute? The effect of technology on student achievement in India. Washington, DC: infoDev Working Paper No. 17.
Machin, Stephen, Sandra McNally, and Olmo Silva. 2007. New technology in schools: Is there a payoff? Economic Journal 117(522): 1145–1167.
Malamud, Ofer, and Christian Pop-Eleches. 2011. Home computer use and the development of human capital. Quarterly Journal of Economics 126(2): 987–1027.
Manski, Charles F. 1993. Identification of endogenous social effects: The reflection problem. Review of Economic Studies 60(3): 531–542.
McCarthy, Shawn P. 2017. Pivot table: U.S. education IT spending guide, version 1, 2015–2020. Available https://www.idc.com/getdoc.jsp?containerId=US42532616. Accessed 13 April 2020.
Noel, Amber, Patrick Stark, and Jeremy Redford. 2013. Parent and family involvement in education, from the National Household Education Surveys Program of 2012. Available https://nces.ed.gov/pubs2013/2013028rev.pdf. Accessed 9 April 2020.
Oster, Emily, and Rebecca Thornton. 2012. Determinants of technology adoption: Peer effects in menstrual cup take-up. Journal of the European Economic Association 10(6): 1263–1293.
Page, Lindsay C., and Hunter Gehlbach. 2017. How an artificially intelligent virtual assistant helps students navigate the road to college. AERA Open 3(4): 1–12.
Rogers, Everett M. 1995. Diffusion of innovations, 4th edition. New York: The Free Press.
Rogers, Todd, and Avi Feller. 2018. Reducing student absences at scale by targeting parents' misbeliefs. Nature Human Behaviour 2(5): 335–342.
Rothstein, Jesse M. 2004. College performance predictions and the SAT. Journal of Econometrics 121(1): 297–317.
Rouse, Cecilia Elena, and Alan B. Krueger. 2004. Putting computerized instruction to the test: A randomized evaluation of a "scientifically based" reading program. Economics of Education Review 23(4): 323–338.
Scott-Clayton, Judith, Peter M. Crosta, and Clive R. Belfield. 2014. Improving the targeting of treatment: Evidence from college remediation. Educational Evaluation and Policy Analysis 36(3): 371–393.
Taylor, Eric. 2015. New technology and teacher productivity. Harvard University Working Paper.
Tyler, John H. 2013. If you build it will they come? Teachers' online use of student performance data. Education Finance and Policy 8(2): 168–207.
Vigdor, Jacob L., Helen F. Ladd, and Erika Martinez. 2014. Scaling the digital divide: Home computer technology and student achievement. Economic Inquiry 52(3): 1103–1119.
Weinberg, Bruce A. 2001. An incentive model of the effect of parental income on children. Journal of Political Economy 109(2): 266–280.
York, Benjamin N., Susanna Loeb, and Christopher Doss. 2019. One step at a time: The effects of an early literacy text messaging program for parents of preschoolers. Journal of Human Resources 54: 900–925.

Table A.1. Attrition

Treat school      0.02
                  (0.020)
Spillover         0.00
                  (0.01)
Control mean      0.88
Observations      21,854

Notes: All data are at the student level and are constructed from the learning management system data. The outcome variable is an indicator for a student having a final grade in the system. Standard errors clustered at the school level are shown in parentheses.

Figure A.1.

Parent Portal: Main Screen

Notes: The figure shows an example of the type of academic information that can be found on the parent portal. All information in this figure is fictional.

Figure A.2.

Parent Portal: Specific Class Information

Notes: The figure shows an example of the type of academic information that can be found on the parent portal once a parent clicks on a specific class. All information in this figure is fictional.

Figure A.3.

Share of Families Who Ever Logged in by GreatSchools Rating

Notes: The figure shows the share of parents who have ever logged in to a portal, by schools' GreatSchools rating. This figure is constructed using data from the learning management system and GreatSchools ratings.

Figure A.4.

Share Ever Logged In by Share Free or Reduced-Price Lunch Eligibility

Notes: The figure shows the share of families who have ever logged in to a portal plotted against the share of students who receive free or reduced-price lunch in each school. This figure is constructed using data from the learning management system and National Center for Education Statistics Common Core.

Figure A.5.

Experimental Design

Notes: This figure shows the experimental design for the account-information intervention. Randomization occurs first at the school level and then at the student level.
