Net price calculators (NPCs) are online tools designed to increase transparency in college pricing by presenting students with individualized estimates of the net price to attend a given postsecondary institution. The federal template NPC predicts identical aid awards for similarly profiled students attending the same institution. Using data from the 2012 National Postsecondary Student Aid Study, we employ regression analysis to assess variation in actual financial aid awards among students predicted by the federal template NPC to receive identical awards. We find that estimated aid derived from the federal template NPC accounts for 70 percent of the variation in actual grant aid received by students. We then consider modifications to the federal template NPC that include an additional upper-income bracket option and indicators of both high school grade point average and Free Application for Federal Student Aid filing time. These modifications explain an additional 16 percentage points, or more than half, of the previously unexplained variation in actual grant aid awards across all institutional sectors. These findings are especially relevant as legislators consider policy efforts to bring greater transparency to college cost and pricing, including creating a universal NPC in which prospective students could enter their information once to receive net price estimates for any institution.

College net prices—the out-of-pocket prices students and families pay for college after accounting for grant aid—are the best indicators of a postsecondary institution's affordability. However, net prices are stubbornly opaque and often remain so until after students have had to make important college choices, such as where to apply and sometimes even where to enroll.

Financial aid can make higher education feasible for families who otherwise would not be able to afford it. However, financial aid also contributes to an unclear pricing system in which inclusive sticker prices are often far higher and more visible than the net prices students face after receiving grant aid. Lack of clarity in pricing can contribute to inequalities in enrollment and persistence in higher education. Though lower-income families would likely be eligible for more need-based financial aid than their wealthier counterparts, they are also less likely to successfully navigate complicated financial aid applications and to accurately estimate the college costs they would actually face (Avery and Kane 2004; Grodsky and Jones 2007; Hoxby and Turner 2015).

In response, policy makers have introduced a variety of efforts. Some, like regional Promise programs, aim to reduce real costs of college with clearly articulated “promises” of grant-based aid. Others, like Free Application for Federal Student Aid (FAFSA) simplification efforts, aim to reduce the complexity in the process of accessing aid. And still others, like net price calculators (NPCs), aim to make the real costs of college more transparent. NPCs are online tools to estimate the net price that a given student would pay to attend a given school. Because they provide students with estimates of net pricing prior to navigating complicated aid applications, NPCs are a primary way for colleges and universities to improve pricing transparency.

To be most useful to prospective students and their families, NPCs should be designed with a focus on providing reasonably accurate grant aid estimates while minimizing complicated inputs from users. All NPCs must allow for an estimate of how much a family would be expected to contribute toward the cost of college, or expected family contribution (EFC).1 However, colleges use different NPCs, and these different NPCs vary in terms of how much information they request from users to form this estimate. Here, we focus on the NPC template provided federally by the U.S. Department of Education because it is among the most common NPCs and its limited data inputs are relatively straightforward for student users to provide unassisted. Specifically, the federal template draws on the student's dependency status, approximate family income, residency status, and college housing arrangements to produce net price estimates.

More complicated NPC alternatives are also common. An NPC that requests highly detailed student and family financial information may generate more accurate aid estimates, but requiring fine-grained financial information risks introducing complexity and barriers to NPC use similar to those associated with federal aid applications (see Dynarski and Scott-Clayton 2013 and Dynarski and Wiederspan 2012). In contrast, a simple NPC requires only basic inputs that most users are readily able to provide, but may trade accuracy for ease of use.

We study this balance between simplicity and accuracy by measuring the variation in actual grant aid amounts among students predicted by the federal template NPC to receive identical awards. We then explore modifications to the federal template NPC to reduce unexplained variation in aid awards. We find that estimated grant aid derived from the simple inputs of the federal template NPC accounts for 70 percent of the variation in actual grant aid received by students overall. We consider modifications to the federal template NPC, including further differentiation of household income categories at the upper end of the distribution, incorporating an indicator of high school performance, and including an indicator for the timing of financial aid application. These modifications explain more than half of the remaining variation in awards. Yet students predicted to receive identical grant aid awards can still receive actual aid awards that vary by thousands of dollars. Discrepancies between estimated and actual net prices may lead families to deem certain postsecondary options more (or less) affordable than they actually are, potentially shifting students’ application choices. Information on simple enhancements to NPCs is especially timely as Congress considers changes to existing NPCs and the creation of a “Universal NPC,” a recurring topic of legislation since 2013, though such a bill has not yet been introduced in the current legislative session.

Net price calculators are a product of the Higher Education Opportunity Act of 2008 and have been federally mandated to appear on postsecondary institutions’ Web sites since October 2011. In an early review of NPCs, Cheng (2012) found the calculators challenging to find and use, inconsistently labeled, and difficult to compare across institutions. These critiques are the basis for many of the specific improvements cited in Net Price Calculator Improvement Act legislation. Previous versions of the legislation would require postsecondary institutions to consistently and prominently label their calculators as “Net Price Calculators” (as opposed to “Education Cost Calculator” or “Tuition Calculator,” for example) and to populate their calculators with data no more than two years old. Such legislation would also allow the Secretary of Education to create a universal NPC through which prospective students could complete one set of questions and receive net price estimates for any institution.

A more recent review of NPCs at eighty public and private four-year institutions found that about 40 percent of schools were using outdated data, and that other schools used NPCs that did not clearly differentiate loans from grants or scholarships (Perna, Wright-Kim, and Jiang 2019). In addition to the enhancements listed in previous NPC improvement legislation, Perna and colleagues recommend that the federal NPC template list grant awards by source, display variation in costs by major or academic discipline, and identify groups for whom estimates do not apply (e.g., non-U.S. citizens or part-time students).

A pilot study of NPC performance found that, on average, NPCs provide better estimates of out-of-pocket prices than sticker prices, yet actual grant aid awards may vary substantially from NPC predictions (Anthony, Page, and Seldin 2016). Especially for low-income families, even small disparities between predicted and actual aid may impact college decisions (Pallais 2009; Castleman and Page 2016). An NPC that severely overestimates grant aid may lead students to face unexpectedly large college net prices. Conversely, an NPC that substantially underestimates grant aid could tilt a school's applicant pool in favor of those students who are financially able to make up the predicted shortfall in grant aid, while less financially secure students may consider the school to be unaffordable and forgo even applying.

The complexity of the calculator is important because the very purpose of an NPC is to increase transparency in college pricing and financial aid. Substantial research points to the complexity of the financial aid application process as a primary cause of low take-up rates of student aid (Dynarski and Scott-Clayton 2006, 2008, 2013; Bettinger et al. 2012; Dynarski and Wiederspan 2012; Page and Scott-Clayton 2016). Overly complex calculators risk becoming an additional barrier to clear price information if the calculator tools themselves are too burdensome to use.

This study builds on prior research by Kane (1995), Stoll and Stedman (2004), and Dynarski and Scott-Clayton (2006) exploring the sensitivity of financial aid calculations to manipulations of their component inputs. Kane notes that most of the variation in Pell grants can be explained using just a few variables. Stoll and Stedman simulate the effect of excluding items from the calculation of EFC. Dynarski and Scott-Clayton show that federal need-based aid distribution can be reproduced using just a fraction of the information that is now collected on the FAFSA.

We expand on this line of research in two key ways. First, these prior studies focus on means-tested federal grant aid. We focus on all sources of grant aid, including institutional grant aid, which tends to be more variable, especially for private institutions. This is important to consider as more than 40 percent of all grant aid—the largest portion from any source—comes from the postsecondary institutions themselves (Ma et al. 2017). Second, the policy objective in these previous studies focused on strategically reducing financial aid data elements and maintaining aid distribution. Instead, we consider the possibility of strategically increasing or modifying data components to decrease variation in grant aid awards to similar students, while aiming to balance the benefit of decreased variation with the potential of increased complexity by requiring students to input additional information.

We use data from the 2012 National Postsecondary Student Aid Study (NPSAS:12) and fixed-effects regression analyses to assess the extent to which actual grant aid received varies for students with identical NPC grant aid estimates. Ideally, our analysis would compare the two grant aid figures central to this study: the grant aid students actually receive and the grant aid students are predicted to receive based on the information used in the federal template NPC. If these data were available, we could estimate for what share of students NPC estimates fall within x percent (or x dollars) of their actual aid awards. However, because NPSAS:12 data include actual grant aid information but not NPC-estimated grant aid, this approach is not possible. That is, while NPSAS provides us with information on actual aid received, we lack information on the aid estimate each student would have been shown by an NPC.

Although we do have the student- and institution-level information needed to estimate grant aid within the federal template NPC, we cannot do so directly because the data populating NPCs at the time of our analysis did not correspond to the year of our data. For this reason, we use a fixed-effects approach that focuses on variation in grant aid among students who would receive identical NPC estimates. Specifically, we generate a set of fixed effects for groups of students for whom the federal template NPC would estimate the same level of grant-based aid by virtue of attending the same institution and being identical on the full set of characteristics on which the federal template NPC relies. We then examine the variation in actual aid awards within these observationally similar groups of students (see Appendix A for sampling and methods details, and figure A.1 for how the federal template NPC operates; appendix materials are available in a separate online appendix that can be accessed on Education Finance and Policy’s Web site at https://doi.org/10.1162/edfp_a_00353).
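To make the fixed-effects logic concrete, the sketch below groups students by institution and federal-template NPC profile and measures how much of the variation in actual grant aid those groups absorb. It is a minimal illustration only: the column names are hypothetical rather than NPSAS:12 variable names, and it omits sampling weights, the exclusion of single-observation profiles, and other details of the published analysis.

```python
import numpy as np
import pandas as pd

# Hypothetical column names standing in for the inputs the federal template
# NPC collects, plus an institution identifier.
PROFILE_COLS = ["institution_id", "dependency_status",
                "income_bracket", "residency", "housing"]

def within_profile_fit(df: pd.DataFrame, outcome: str = "actual_grant_aid"):
    """Absorb institution-by-NPC-profile cells as fixed effects and report how
    much variation in actual grant aid they explain (R2) and the typical dollar
    deviation that remains (RMSE, no degrees-of-freedom adjustment)."""
    cell_mean = df.groupby(PROFILE_COLS)[outcome].transform("mean")
    resid = df[outcome] - cell_mean                      # within-cell deviation
    ss_res = float((resid ** 2).sum())
    ss_tot = float(((df[outcome] - df[outcome].mean()) ** 2).sum())
    r2 = 1.0 - ss_res / ss_tot                           # share of variation explained
    rmse = float(np.sqrt((resid ** 2).mean()))           # typical dollar deviation
    return r2, rmse
```

Because an ordinary least squares regression on group dummies simply predicts each group's mean, demeaning within cells as above yields the same R2 as the dummy-variable fixed-effects regression.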

NPC estimates pertain to first-time, full-time undergraduate students who applied for federal financial aid and enrolled in a single institution for the full year. We limit our sample to include only these students. In addition, as our analysis relies on measuring variation in grant aid awards for students with the same NPC profile attending the same institution, we exclude from the sample any student who has an NPC profile different from all other students observed for their institution. About half of the remaining observations are instances of two (33 percent) or three (17 percent) observations within a given institution-NPC profile.

Next, we consider three modifications that might be incorporated into the federal template NPC to decrease variation in aid awards for similar students attending the same school.

Modification 1: Additional Upper-Income Boundaries

The federal NPC template classifies students’ household income using brackets that begin with “less than $30,000,” proceed in $10,000 increments up to “$90,000 to $99,999,” and end with “above $100,000.” This binning means, for example, that a family earning $100,000 annually is categorically identical to a family earning ten times that amount. Twenty percent of families in our sample are clustered into the uppermost of these nine income brackets. We add a new uppermost income bracket, such that the resulting income levels retain the original brackets from “less than $30,000” through “$90,000 to $99,999,” plus “$100,000 to $150,000” and “above $150,000.”
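As an illustration of how this binning might be implemented, the sketch below maps a household income to the modified bracket scheme described above. The bracket edges follow the text; the function and its treatment of boundary incomes are assumptions, not the federal template's actual implementation.

```python
import pandas as pd

edges = [-float("inf"), 30_000] + list(range(40_000, 100_001, 10_000)) + [150_000, float("inf")]
labels = (["less than $30,000"]
          + [f"${lo:,} to ${lo + 9_999:,}" for lo in range(30_000, 100_000, 10_000)]
          + ["$100,000 to $150,000", "above $150,000"])

def income_bracket(household_income: float) -> str:
    """Map a household income to its bracket under the modified scheme (Modification 1)."""
    # right=False places an income of exactly $100,000 in the "$100,000 to $150,000" bin.
    return str(pd.cut([household_income], bins=edges, labels=labels, right=False)[0])

# Example with a hypothetical household: $120,000 now gets its own bracket
# rather than sharing one with all incomes above $100,000.
print(income_bracket(120_000))   # "$100,000 to $150,000"
```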

Modification 2: Indicator of Academic Merit

Many institutions distribute merit aid based on predictable and widely used merit2 metrics (e.g., SAT or ACT scores or high school grade point average [HSGPA]). We consider whether adding a threshold indicator for relatively high versus low GPA explains additional variation in aid received. We consider three potential GPA threshold values: 2.5, 3.0, and 3.5.
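A minimal sketch of the resulting indicator, assuming a hypothetical `hsgpa` column; the candidate cutoffs (2.5, 3.0, and 3.5) come from the text.

```python
import pandas as pd

def high_gpa_flag(hsgpa: pd.Series, cutoff: float = 3.5) -> pd.Series:
    """1 if the student's high school GPA meets or exceeds the cutoff, else 0."""
    return (hsgpa >= cutoff).astype(int)
```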

Modification 3: FAFSA Filing Timing

Certain types of financial aid are awarded on a first-come, first-served basis3 (McKinney and Novak 2015). As a result, grant aid awards may be determined not only by what information the student provides on the FAFSA itself, but also by when the student completes the FAFSA. As more than 60 percent of students in our sample filed within the first three months of the filing window, we use monthly intervals within this timeframe to test thresholds for early FAFSA filers. We consider three potential timeframes, measured in months from the opening of the FAFSA application filing window: within one month, within two months, and within three months.4
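A companion sketch for the filing-time indicator, assuming a hypothetical `fafsa_filed` date column and the 1 January opening of the 2012 filing window noted in footnote 4; the one-, two-, and three-month candidate windows come from the text.

```python
import pandas as pd

def early_fafsa_flag(filed: pd.Series, months_early: int = 2,
                     window_open: str = "2012-01-01") -> pd.Series:
    """1 if the FAFSA was filed within `months_early` months of the window opening, else 0."""
    cutoff = pd.Timestamp(window_open) + pd.DateOffset(months=months_early)
    return (pd.to_datetime(filed) < cutoff).astype(int)
```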

Our results focus on two key metrics derived from our regression models: R2 and root mean square error (RMSE). R2 communicates the share of variation in a given outcome that a model explains. An R2 of 0, for example, indicates that a model does not predict any of the variation in the outcome, while an R2 of 1 indicates that a model fully explains all of the variation in the outcome. In our context, R2 shows how much variation in actual award packages is explained by information requested by the federal template NPC. RMSE is a measure of the distance between actual data points and the model's predictions. For our NPC models, the RMSE is the typical distance between actual aid awards and average awards for observationally identical students. An RMSE of 0 (or an R2 of 1) is likely not possible with any amount of information because of modest noise intentionally introduced in the NPSAS data to preserve confidentiality. Additionally, we note that we only know financial aid information for the aid offers students actually accept. It is possible that there are grants a student could not or would not accept.
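For reference, and without claiming the exact estimation details of the published models, the two metrics take their standard forms. Here $y_i$ is student $i$'s actual grant aid, $\hat{y}_i$ is the model's prediction (in our fixed-effects setup, the mean award within student $i$'s institution-by-NPC-profile group), $\bar{y}$ is the overall mean award, and degrees-of-freedom adjustments are set aside:

$$
R^2 = 1 - \frac{\sum_i (y_i - \hat{y}_i)^2}{\sum_i (y_i - \bar{y})^2},
\qquad
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_i (y_i - \hat{y}_i)^2}.
$$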

In figure 1, we present the model modifications with the best performance. We first show the R2 statistic associated with the current federal template NPC (“Basic model”) compared to the most effective (“Best model”) of the proposed NPC modifications. Beneath the figure, we specify the most effective HSGPA and FAFSA filing thresholds for each sector, along with the percentage point improvement in R2 between the basic and “best” modification model.
Figure 1.
Overall Results

Notes: Overall results, comparing federal template Net Price Calculator (NPC) R2 and root mean square error (RMSE) values to those of the best (i.e., highest R2) NPC model. Modifications include additional upper-income brackets and indicators of high school grade point average (HSGPA) and Free Application for Federal Student Aid (FAFSA) filing time. The corresponding “best model” combinations of HSGPA and FAFSA filing dates along with the percentage point difference in R2 are listed below each institutional sector. RMSE indicates the typical dollar difference between actual awards and the amount anticipated by the federal template NPC. N = 7,560.


First consider the R2 value of 0.70 (figure 1, leftmost bar). This tells us the data elements gathered by the current federal template NPC explain 70 percent of the variation in actual grant aid awards. Continuing with the darker gray bars showing R2 values associated with the current federal template, or “basic” model, we see these rates differ somewhat by institutional sector: The federal template NPC data explain about 70 percent of variation in aid awards at public four-year institutions, 60 percent in private four-year and public two-year institutions, and 55 percent in for-profit institutions.

The lighter bars show the most effective modifications to the federal template NPC model. Within each institutional sector, the modifications explain more than half of the variation in grant aid awards that the current federal template NPC left unexplained. Each model includes additional upper income brackets, but optimal HSGPA and FAFSA filing thresholds vary by institutional sector. The combination of a 1 February FAFSA filing threshold and HSGPA indicator of 3.5 most effectively improves the federal template NPC's explanatory potential overall, increasing the R2 from 0.70 to 0.86. By sector, the best model combinations include a 3.0 (public two-year and for-profit) or 3.5 (public and private four-year) HSGPA, and a 1 February (private four-year) or 1 March (public four-year, two-year, and for-profit) FAFSA filing date. With improvements in R2 statistics ranging from 20 to 26 percentage points within sectors, the proposed modifications represent a sizeable increase in NPC explanatory potential over the current federal template model by adding easily reportable and simple metrics.

Next, we focus on the RMSE (shown in parentheses beneath the R2 statistic). The typical deviation of actual aid awards from what the current NPC inputs anticipate is $5,670. In other words, a typical student received an actual financial aid package that was nearly $5,700 more (or less) than what the NPC model would estimate. Because we do not have actual NPC estimates, we were not able to assess whether the estimates were over- or under-predictions. However, residual analyses reveal a narrower residual distribution for students from households with up to $30,000 in annual income and a wider distribution above that level, suggesting greater potential for inaccuracy for middle- and upper-income households. The RMSE may be a useful measure for prospective students to approximate high and low estimates of their expected grant awards.

We also see that typical deviations from NPC-predicted awards vary substantially by institutional sector. For example, within the sample of private four-year institutions, where students receive relatively more grant aid on average, the typical deviation between estimated and actual awards is nearly $11,000. By contrast, a typical community college student receives an award that is only $2,400 different from what the federal template NPC estimates. See table B.1 (in online Appendix B) for complete results for the different modification combinations, and figure B.1 for a chart summarizing changes in R2 from introducing one, two, and three additional NPC elements.

We demonstrate that with relatively simple modifications, the federal template NPC can explain up to 90 percent of the variation in actual grant aid awards, but that remaining variation in actual awards among similar students can remain quite large, especially within postsecondary sectors with higher sticker prices and more generous financial aid packages. Whether the remaining share of unexplained variation warrants the added complexity of alternative calculators is a subjective matter, which may vary depending on the student and the institution in question.

As the layout of the federal template NPC requires institutions to enter their data in terms of categorical variables, it is important for NPC developers to consider exactly what thresholds constitute “high” or “low” high school GPAs or “early” or “late” FAFSA filing. A one-size-fits-all model might adopt the thresholds that were most effective in our overall analysis—that is, a 3.5 HSGPA and comparatively early FAFSA filing. A more targeted approach, however, would likely be more effective. One strategy is to allow institutions to use the HSGPA and FAFSA filing information best suited to their own levels and methods of awarding financial aid. The user-facing side of the NPC would look the same, but the operational thresholds for high/low HSGPA and early/late FAFSA filing could be specific to individual institutions. For example, an institution could select a GPA threshold in line with the merit aid it provides or a FAFSA timing threshold in line with its priority deadlines for FAFSA filing.
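One hypothetical way to express this institution-specific configuration is sketched below; the cutoff values and class structure are illustrative, not part of the federal template.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class InstitutionThresholds:
    """Per-institution cutoffs behind an otherwise identical user-facing NPC."""
    high_gpa_cutoff: float        # e.g., aligned with the school's merit-aid criteria
    early_fafsa_deadline: date    # e.g., the school's priority filing deadline

    def classify(self, hsgpa: float, fafsa_date: date) -> tuple:
        """Return (high_gpa, early_filer) flags for one prospective student."""
        return hsgpa >= self.high_gpa_cutoff, fafsa_date <= self.early_fafsa_deadline

# Example with hypothetical values: merit aid begins at a 3.5 GPA and the
# priority FAFSA deadline is 1 March.
school = InstitutionThresholds(high_gpa_cutoff=3.5,
                               early_fafsa_deadline=date(2024, 3, 1))
print(school.classify(hsgpa=3.7, fafsa_date=date(2024, 2, 10)))  # (True, True)
```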

An alternative approach is to provide NPC options for students to use according to the information and time they have available. Some examples of this approach include the college search site College Raptor (collegeraptor.com) and the MyinTuition NPC (see myintuition.org). The college search function on College Raptor allows a user to indicate the amount of financial data they are prepared to provide—options include “no financial data,” “I know my EFC,” “limited family and financial data,” and “full family and financial data”—and the extent of financial information requested adjusts accordingly.

The MyinTuition NPC requests more detailed information, including remaining mortgage balance and assets in retirement and nonretirement accounts, which may pose challenges for some users but is still simpler than many popular NPC alternatives. The MyinTuition NPC also provides a range of net price estimates (labeled “low,” “best,” and “high”), along with a graphic illustration of grant and loan sources (as opposed to the single line-item federal NPC estimate).

A modified federal template NPC may adopt a similar feature to present users with a prediction interval within which grant aid estimates are likely to vary. Even though we decrease the share of unexplained variation in grant aid awards, the typical difference between estimated and actual awards still exceeds $5,000 overall and approaches $11,000 within private four-year institutions. Providing an institution-specific estimated range of likely grant aid, in addition to a specific dollar estimate, may help students make more informed college enrollment decisions.
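The sketch below shows one simple way an NPC front end could turn a point estimate and a sector-level RMSE into a displayed range. Treating plus or minus one RMSE as the band is a rough heuristic rather than a formal prediction interval, and the dollar figures are hypothetical.

```python
def aid_estimate_range(point_estimate: float, sector_rmse: float) -> dict:
    """Return low/best/high grant-aid figures for display, floored at zero."""
    return {
        "low": max(point_estimate - sector_rmse, 0.0),
        "best": point_estimate,
        "high": point_estimate + sector_rmse,
    }

# Example with hypothetical numbers: a $12,000 estimate at a private four-year
# school (sector RMSE of roughly $11,000) would display about $1,000 to $23,000.
print(aid_estimate_range(12_000, 11_000))
```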

Minimizing complexity in user-provided data is a key motivator in our NPC design recommendation. Our suggested modifications would require relatively simple changes to both the front- and back-facing sides of the federal template NPC. On the user-facing side, we suggest adding options for household income of “$100,000 to $150,000” and “Above $150,000” to better distinguish upper-middle and upper-income households. We also suggest introducing two questions to assess HSGPA and expected FAFSA filing time. Users could report their unweighted HSGPA on a scale of 1 to 4, and their expected FAFSA filing time as a specific date on a calendar. On the back end, these HSGPA and FAFSA filing time measures would be translated into relatively high and low GPA values and relatively early and late FAFSA filing dates.

These limited data inputs could operate as a framework for designing a “Universal Net Price Calculator” as outlined in the Net Price Calculator Improvement Act, which has been introduced in Congress repeatedly since 2013 but has not yet been reintroduced in the current legislative session. Additional data elements such as college-specific savings accounts, parents’ retirement savings, or remaining mortgage balances may be especially helpful for narrowing variation in aid among similar students attending schools where aid is relatively more plentiful—such as four-year private nonprofit schools—or where pricing varies substantially by field of study or academic major (as recommended by Perna, Wright-Kim, and Jiang 2019).

Previous policy recommendations for NPCs centered on their usefulness and usability (see Cheng 2012 and Perna, Wright-Kim, and Jiang 2019). These reviews emphasize improvements to many specific facets of NPCs, recommending consistent labeling of aid terminology, improved prominence on Web sites, and additional helpful information such as differing costs by academic major or notes on groups for whom the estimates do not apply. They do not, however, assess how NPCs use the information they request to estimate aid. Our study differs because it assesses potential additional data elements that are straightforward for students to provide and effective for improving college net price estimates. These recommendations represent an additional dimension of NPC improvement, potentially allowing policy makers and institutions to improve the form and function of their NPCs. The Consolidated Appropriations Act of 2021 included changes to simplify the FAFSA and make aid more predictable, but it did not create a Universal Net Price Calculator. However, postsecondary institutions need not wait for federal legislation to make such improvements to their NPCs and overall price transparency.

Funding for this brief was provided by a grant from the American Educational Research Association, which receives funds for its AERA Grants Program from the National Science Foundation under grant number DRL-091014.

1. As part of the Consolidated Appropriations Act of 2021, EFC is set to be replaced with the Student Aid Index (SAI) beginning in July 2023. The SAI will serve a similar function to the EFC, but at the time of writing it is not yet clear how the change from EFC to SAI will impact the formula for the federal template NPC, though a direct substitution of SAI for EFC is practical and plausible.

2. We use categorical variables for indicators of academic merit to be in line with the federal template NPC, which requires institutions to enter their data in terms of categorical variables.

3. We use categorical variables for indicators of FAFSA filing time to be in line with the federal template NPC, which requires institutions to enter their data in terms of categorical variables.

4. In 2012, when data for this study were collected, the FAFSA application window began on 1 January. It has since been moved to 1 October. For this reason, we discuss relative time periods (i.e., “within one month”) rather than specific dates and months.

References

Anthony, Aaron M., Lindsay C. Page, and Abigail Seldin. 2016. In the right ballpark? Assessing the accuracy of net price calculators. Journal of Student Financial Aid 46(2): 25–50.
Avery, Christopher, and Thomas J. Kane. 2004. Student perceptions of college opportunities: The Boston COACH program. In College choices: The economics of where to go, when to go, and how to pay for it, edited by Carolyn M. Hoxby, pp. 355–394. Chicago: University of Chicago Press.
Bettinger, Eric P., Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu. 2012. The role of application assistance and information in college decisions: Results from the H&R Block FAFSA experiment. Quarterly Journal of Economics 127(3): 1205–1242.
Castleman, Benjamin L., and Lindsay C. Page. 2016. Freshman year financial aid nudges: An experiment to increase FAFSA renewal and college persistence. Journal of Human Resources 51(2): 389–415.
Cheng, Diane. 2012. Adding it all up 2012: Are college net price calculators easy to find, use, and compare? Washington, DC: Institute for College Access & Success.
Dynarski, Susan, and Judith Scott-Clayton. 2006. The cost of complexity in federal student aid: Lessons from optimal tax theory and behavioral economics. National Tax Journal 59(2): 319–356.
Dynarski, Susan, and Judith Scott-Clayton. 2008. Complexity and targeting in federal student aid: A quantitative analysis. NBER Working Paper No. 13801.
Dynarski, Susan, and Judith Scott-Clayton. 2013. Financial aid policy: Lessons from research. NBER Working Paper No. 18710.
Dynarski, Susan, and Mark Wiederspan. 2012. Student aid simplification: Looking back and looking ahead. NBER Working Paper No. 17834.
Grodsky, Eric, and Melanie T. Jones. 2007. Real and imagined barriers to college entry: Perceptions of cost. Social Science Research 36(2): 745–766.
Hoxby, Caroline, and Sarah Turner. 2015. What high-achieving low-income students know about college. American Economic Review 105(5): 514–517.
Kane, Thomas J. 1995. Rising public college tuition and college entry: How well do public subsidies promote access to college? NBER Working Paper No. 5164.
Ma, Jennifer, Sandy Baum, Matea Pender, and Meredith Welch. 2017. Trends in college pricing. Available https://files.eric.ed.gov/fulltext/ED586395.pdf. Accessed 6 May 2021.
McKinney, Lyle, and Heather Novak. 2015. FAFSA filing among first-year college students: Who files on time, who doesn't, and why does it matter? Research in Higher Education 56(1): 1–28.
Page, Lindsay C., and Judith Scott-Clayton. 2016. Improving college access in the United States: Barriers and policy responses. Economics of Education Review 51: 4–22.
Pallais, Amanda. 2009. Taking a chance on college: Is the Tennessee Education Lottery Scholarship Program a winner? Journal of Human Resources 44(1): 199–222.
Perna, Laura W., Jeremy Wright-Kim, and Nathan Jiang. 2019. Questioning the calculations: Are colleges complying with federal and ethical mandates for providing students with estimated costs? Philadelphia, PA: The Alliance for Higher Education and Democracy.
Stoll, Adam, and James B. Stedman. 2004. Federal student aid need analysis: Background and selected simplification issues. Available https://digital.library.unt.edu/ark:/67531/metacrs7373/m1/1/high_res_d/RL32083_2004Oct04.pdf. Accessed 6 May 2021.
