Abstract

To demonstrate the sequential nature of the college application process, in this paper I analyze the evolution of applications among high-achieving low-income students through data on the exact timing of SAT score sends. I describe when students send scores to colleges and which score sends ultimately become applications, and I document three main facts. First, score sends are not synonymous with applications: only 62 percent of score sends in this sample turn into applications. Second, the conversion from score send to application is nonrandom with respect to college characteristics: score sends are more likely to convert into applications when they are to colleges with lower tuition, higher graduation rates, and locations relatively near a student's home. Third, the timing of a score send is related to the probability that it becomes an application, with score sends sent relatively early being the least likely to become applications. These facts imply that there is room for improvement when modeling the application process and that the timing of an intervention or policy may be critical to its success.

1.  Introduction

Researchers typically treat the college application portfolio as a simultaneous choice problem, whereby students make a single decision at one point in time—choose the utility-maximizing set of colleges from all possible sets, subject to a budget constraint. In reality, the application portfolio evolves over the course of students’ high school careers, if not earlier, as they gather information about colleges, their own abilities, and their preferences. For example, students may know from a young age they want to apply to their state flagship or a school with a prominent sports team (Pope and Pope 2009) and then they build a portfolio around that single college, only to change the portfolio once they learn their SAT scores (Bond et al. 2018).

This simple example raises two lines of inquiry regarding the sequence of applications and education research and policy. First, assuming some colleges enter the application portfolio earlier than others, how do policies and interventions impact later applications, conditional on there already being some applications in the portfolio? Theoretical and structural models of education policies, such as affirmative action (e.g., Arcidiacono 2005) or financial aid (e.g., Epple, Romano, and Sieg 2006), do not address this issue.1 Because students are often counseled to apply to a "balanced portfolio," policies may only be effective at changing application behavior if the early applications are of a certain type (e.g., low or high quality, safety or reach). Second, can policies and interventions administered at different times yield different results by virtue of changing the sequence of applications? Hoxby and Turner (2013) show that high-achieving low-income students' applications can be changed with a low-cost informational intervention. An earlier intervention may change the applications sent earliest, which, through the portfolio consideration, may indirectly impact the later applications. Despite this potentially valuable information for education policy, we know almost nothing about the sequence of applications and the portfolio-building process. This paper starts to fill that knowledge gap.

To demonstrate the sequential nature of the application process, this analysis makes use of SAT score sends. Students who apply to four-year colleges are typically required to send official documentation of their SAT or ACT scores, known as score sends, which have been argued to be good proxies for applications (Card and Krueger 2005; Pallais 2015); indeed, "about 90 percent of first-time, degree-seeking students enrolling at traditional BA/BS granting institutions are either required or recommended to submit official college entrance exam scores with college applications" (Hurwitz et al. 2017, p. 77). The data I use offer two novelties relative to the often-used score-send data. First, through College Board's administrative data, I observe the exact date each score send is requested—many are as early as junior year of high school, and many are close to application deadlines (well into senior year). Second, through a survey of high-achieving low-income students, I learn which score sends become applications. Compared with their high-income peers, high-achieving low-income students are less likely to send college applications or enroll in colleges that are commensurate with their academic credentials (Hoxby and Avery 2012; Smith, Pender, and Howell 2013). These students also show improvements in the application process when given an informational and financial intervention (Hoxby and Turner 2013), making them both a disadvantaged population and a population receptive to change (and worthy of further research).2 Combined, these data shed light on the colleges these academically qualified students consider, when they consider them, and which ones ultimately become applications.

There are three main findings from analyses of these data. First, score sends are not applications: only 62 percent of score sends in this sample turn into applications.3 The studies cited in note 3 have the advantage of seeing the entire set of score sends and applications to a given set of colleges but the disadvantage of not seeing a student's complete set of score sends (or their timing). Second, the conversion from score sends to applications is nonrandom as it relates to college characteristics. For example, I find that score sends to colleges with higher graduation rates and lower tuition—two desirable attributes—are more likely to convert into applications, as are those to colleges that are near home, private, and relatively large in enrollment. These last few attributes are not objectively desirable but rather demonstrate students' preferences, all else equal. These first two findings suggest that past and future research using score sends as outcomes should consider how treatments impact the types of score sends and whether those score sends are likely to convert to applications.

Third, I test whether the sequence of decision making relates to the probability of application conversion. To do so, I construct a theoretical framework for how students sequentially choose applications and estimate two reduced-form models (with different assumptions) that exploit the timing of score sends and College Board score-sending policies to trace the evolution of students' application portfolios.4 These data were previously unavailable and highlight student decision making over time. I find that score sends that are free upon SAT registration and those chosen relatively early are unlikely to convert into college applications, suggesting that students learn about themselves or colleges in ways that change their ultimate portfolio.

Finally, I investigate how these variables relate to the probability of enrollment. This is similar in spirit to B. Long (2004), who examines the student and college characteristics that relate to enrollment and how they have changed over the decades. This paper adds to her work with the novel timing variables. I find that free score sends are more likely to convert to enrollment than non-free score sends, despite being less likely to convert to applications. This result highlights the complicated nature of the application process and suggests that students' portfolios start with some colleges that are highly preferred and well thought out but, at the same time, include colleges that are not well thought out and never convert into applications.

Combined, these findings contribute to several strands of literature regarding college choice. First, as mentioned, theoretical and structural models of college applications and enrollment could be enhanced to include the sequencing of events to better mirror the behavior observed in this work. Data limitations often preclude this option, but the simplifying assumptions should be noted and considered. Second, this paper relates to the sizeable number of studies that use score sends as proxies for applications as outcomes in policy analysis.5 There is nothing inherently wrong with these studies or the use of score sends in this manner but, given the nonrandom conversion of score sends to applications by college characteristics and timing, my results imply that such research should consider which score sends a policy induced (and when) and whether those will ultimately translate into applications. For example, a policy that induces an additional free score send in a student's junior year of high school to a college with a low graduation rate that is far from home is unlikely to convert to an application. Consequently, score sends are more accurately described as elements of a choice set prior to applications. Finally, this paper follows a long line of research on the college application process. It illuminates the sequencing by quantifying what has previously been described in generalities, such as the "college destination" process (Hossler and Gallagher 1987; Radford 2013) or the "application gauntlet" (Klasik 2012). The majority of the vast literature on the application process concerns the determinants of applications, ranging from the impacts of rankings (e.g., Monks and Ehrenberg 1999; Luca and Smith 2013; Alter and Reback 2014; Bowman and Bastedo 2009), counseling (see Avery, Howell, and Page 2014 for an overview), geographic remoteness (Hoxby and Avery 2012), affirmative action-type policies (e.g., M. Long 2004; Black, Cortes, and Lincove 2015), and income (e.g., Toutkoushian 2001; Griffith and Rothstein 2009), to general sets of correlates (e.g., Weiler 1994; DesJardins, Dundar, and Hendel 1999) and a host of barriers (Page and Scott-Clayton 2015). The sequencing of application decisions is not central (or even peripheral) to any of these studies, but it does lie in the subtext. For example, many of these studies rely on a change in policy or information, which occurs at a point in time and consequently impacts students' applications. But which applications are they impacting—early or late, good or bad fits—and what if the policy occurred at a different time?

The paper proceeds as follows: Section 2 describes the data, which include College Board administrative data on SAT-takers and their score sends, along with the survey of a select sample to determine applications. Section 3 describes the theoretical framework and the corresponding estimation strategies. Section 4 presents the results and section 5 concludes.

2.  Data

The data come from three primary sources: College Board administrative data, a survey of high-achieving low-income students conducted by the College Board, and the Integrated Postsecondary Education Data System (IPEDS).

College Board Data

The main data come from SAT takers in the graduating high school class of 2014. The College Board administers the SAT, one of the two major college entrance exams in the United States, to approximately 1.7 million students in each high school cohort. In doing so, the College Board maintains a database of students' SAT scores, which range from 200 to 800 on each of the math and critical reading sections, with a composite score ranging from 400 to 1600.6 At the time of SAT registration, students complete a questionnaire that includes basic demographics, such as race, gender, and parental income and education, along with the student's home ZIP code.

Students who apply to colleges are frequently required to send official documentation of their SAT scores, known as score sends.7 Between the point of SAT registration and nine days after taking the exam, students can designate up to four colleges to receive their SAT scores at no additional cost. After that period, additional score sends can be requested at any time for $11.25 each. Low-income students who receive an SAT fee waiver from their guidance counselor are eligible for four additional "flexible" score sends that are free and can be used at any time during high school.8 The data contain the exact colleges to which students send their scores, along with the exact date they request each score send.9 If students take the SAT multiple times, they have the option to send only the most favorable score, but I cannot observe this choice, only whether at least one score was sent to a college. Finally, I exclude the handful of students who first take the SAT prior to high school. Approximately 98.5 percent of students take the SAT for the first time after their freshman year of high school, and excluding the remaining 1.5 percent does not affect the results.
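
To fix ideas, the pricing rules above can be expressed as a short function. This is a minimal sketch, not College Board code: the function and every field name are hypothetical, and actual eligibility rules have edge cases not modeled here.

```python
from datetime import date, timedelta

# Minimal sketch of the score-send pricing rules described above.
# All names are hypothetical; the administrative data may differ.
FREE_WINDOW_DAYS = 9      # free designations run through 9 days post-exam
FREE_SENDS_PER_TEST = 4   # up to four free sends per SAT registration
SEND_FEE_USD = 11.25      # price of each additional score send
FLEX_WAIVER_SENDS = 4     # extra "flexible" free sends with a fee waiver

def is_free_send(request_date: date, registration_date: date, test_date: date,
                 sends_this_test: int, has_fee_waiver: bool,
                 flex_sends_used: int) -> bool:
    """Return True if a score-send request incurs no charge."""
    in_window = (registration_date <= request_date
                 <= test_date + timedelta(days=FREE_WINDOW_DAYS))
    if in_window and sends_this_test < FREE_SENDS_PER_TEST:
        return True
    # Fee-waiver recipients get four flexible sends usable any time.
    return has_fee_waiver and flex_sends_used < FLEX_WAIVER_SENDS
```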

National Student Clearinghouse

The College Board merges its administrative data with the National Student Clearinghouse (NSC). As of 2015, over 3,600 postsecondary institutions participate in the NSC, which collects postsecondary enrollment information on more than 98 percent of students enrolled in public and private colleges in the United States. Because of data privacy laws and potential complications with student matching, the actual NSC coverage may be a bit lower than the advertised 98 percent rate (Dynarski, Hemelt, and Hyman 2015). These data provide a near-complete view of the colleges in which SAT takers ultimately enroll.

Application Data: Survey of High-Achieving Low-Income Students

The College Board conducted a survey of high-achieving low-income students (HALIs) at the end of their senior year to learn more about their application and enrollment process. High-achieving students were identified by either PSAT or SAT scores and generally have composite SAT scores of at least 1250.10 Potential low-income students were identified by a proprietary algorithm based on geocoded data that identify students with a high probability of living in families earning less than $40,000 annually. Students were asked to list up to ten colleges to which they applied, as well as the total number of applications if it exceeded ten.11

The survey completers serve as the analytic sample because I have complete information on their score sends, applications, and enrollment. Approximately 20 percent of HALIs were sent the survey, and approximately 30 percent of those invited completed it, which translates to 6 percent of all HALIs completing the survey.12 The survey design oversampled racial minorities, which is reflected in the summary statistics of survey takers and all HALIs in Appendix table A.1.13 However, survey takers and HALIs are extremely similar on academic credentials and aspirations, including high school grade point average (GPA), SAT scores, number of SAT attempts, and, perhaps most critically, number of score sends. It is possible that survey taking is correlated with some of the variables of primary interest, which can induce bias, but data limitations require a selection-on-observables assumption. In some analyses, I reweight the sample to reflect the true population of SAT-taking HALIs. It is important to note that these data do not represent all SAT takers, who differ substantially from HALIs and the analytic sample, as demonstrated in the rightmost panel of Appendix table A.1. However, the academic characteristics of the HALIs are not that different from those of all high-achieving students, not just low-income ones.

Also, approximately 20 percent of applications cannot be matched to a score send. This may occur because the student took the ACT and sent an ACT score instead of an SAT score, or because the student did not send any test scores, perhaps because the college is test-optional. The ACT issue is addressed in robustness checks on students who certainly submitted the SAT. The latter issue is likely small because the students in the sample are high achieving, as measured by exam scores, and only students with relatively low scores tend not to submit (Conlin, Dickert-Conlin, and Chapman 2013).

Integrated Postsecondary Education Data System

IPEDS includes annual data on over 7,000 colleges in the United States. I use the most recently reported IPEDS data available as of July 2015, which, depending on the variable, are usually lagged a few years.14 For each score send across all students, I append a variety of college-level variables from IPEDS, including: (1) average SAT of enrolled students, (2) six-year graduation rate, (3) in-state and out-of-state tuition and fees (listed tuition, not net tuition), (4) whether the college is public or a state flagship institution, and (5) the first-time full-time enrollment count. In addition, I determine in-state status and calculate the distance to each college among a student's score sends using the latitude and longitude of each college relative to the student's home ZIP code.
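
To illustrate the distance construction, the sketch below computes the great-circle (haversine) distance between a home ZIP code centroid and a college's coordinates. The specific formula is my assumption; the paper states only that distances are calculated from latitudes and longitudes.

```python
import math

# Sketch of the distance calculation: great-circle (haversine) distance
# from a student's home ZIP code centroid to a college's IPEDS
# latitude/longitude. The formula choice is an assumption.
EARTH_RADIUS_MILES = 3958.8

def distance_miles(home_lat, home_lon, college_lat, college_lon):
    """Great-circle distance in miles between two (lat, lon) points."""
    phi1, phi2 = math.radians(home_lat), math.radians(college_lat)
    dphi = math.radians(college_lat - home_lat)
    dlmb = math.radians(college_lon - home_lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * math.asin(math.sqrt(a))

# Example with illustrative coordinates: roughly Atlanta (33.75, -84.39)
# to Ann Arbor (42.28, -83.74), about 590 miles.
print(round(distance_miles(33.75, -84.39, 42.28, -83.74)))
```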

Summary Statistics

Summary statistics for the 1,441 survey respondents are presented in table 1. Almost 52 percent of the students are male and there are near equal proportions of white, black, Hispanic, and Asian students, reflecting the survey design. Because the College Board's survey was administered to high-achieving students, it is not surprising their average high school GPA is close to 4.0 and the average SAT is 1329. Students typically took the SAT twice, which results in an average of almost seven score sends. On average, students report submitting just over four applications, which demonstrates that not all score sends convert into applications. Finally, students send their SAT scores to colleges where the average SAT score of enrolled students is 1278 and where the colleges are an average of 567 miles from home.

Table 1.
Student Summary Statistics
| Variable | Mean | Std. Dev. | Min | Max |
| --- | --- | --- | --- | --- |
| Male | 0.518 | 0.500 | | |
| White | 0.232 | 0.423 | | |
| Black | 0.205 | 0.404 | | |
| Hispanic | 0.266 | 0.442 | | |
| Asian | 0.269 | 0.443 | | |
| Other race | 0.027 | 0.162 | | |
| Northeast | 0.248 | 0.432 | | |
| Midwest | 0.087 | 0.282 | | |
| South | 0.325 | 0.469 | | |
| West | 0.335 | 0.472 | | |
| High school GPA | 3.892 | 0.397 | 1.670 | 4.330 |
| SAT score | 1,329 | 93 | 830 | 1,600 |
| SAT attempts | 1.987 | 0.822 | | |
| Number of score sends | 6.994 | 4.646 | | 27 |
| Number of applications | 4.353 | 3.099 | | 20 |
| Average SAT of score sends (100s) | 1,278 | 109 | 925 | 1,508 |
| Average distance of score sends (100s of miles) | 5.666 | 5.322 | 0.006127 | 29.7044 |

N = 1,441

Note: Sample includes high-achieving low-income students who responded to a College Board survey about the application process and sent at least one SAT score send and application to a four-year college.

Summary statistics at the score send level appear in the left panel of table 2. Among the over 10,000 observed score sends, only 62 percent turn into applications. Students send SAT scores to very selective colleges, as evidenced by the high average SAT scores of enrolled students and the graduation rates of these colleges. Many of the score sends are to expensive institutions (at least based on sticker price), but in reality, many of these low-income students would be eligible for generous financial aid packages from these well-resourced institutions. On average, students are sending scores to colleges that are 668 miles from home. Finally, score sends go to in-state public flagships 6.7 percent of the time, in-state public non-flagships 17.6 percent of the time, in-state private colleges 15 percent of the time, and a disproportionate 47.2 percent of the time to out-of-state private colleges. The large fraction going to private colleges demonstrates the options these HALIs have that students with lower scores do not.

Table 2.
Score Send and Application Portfolio Summary Statistics
Left panel: observation is a score send (N = 10,073). Right panel: observation is a simulated or actual application portfolio (N = 20,503).

| Variable | Mean | Std. Dev. | Min | Max | Mean | Std. Dev. | Min | Max |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Student action | | | | | | | | |
| Applied | 0.623 | 0.485 | | | 0.054 | 0.225 | | |
| Enrolled | 0.122 | 0.327 | | | — | — | — | — |
| College attributes | | | | | | | | |
| Average SAT (100s) | 13.020 | 1.460 | 8.250 | 15.250 | 13.186 | 0.958 | 9.125 | 15.250 |
| Safety | 0.312 | 0.463 | | | 0.293 | 0.271 | | |
| Match | 0.506 | 0.500 | | | 0.526 | 0.259 | | |
| Reach | 0.183 | 0.387 | | | 0.181 | 0.246 | | |
| Six-year graduation rate | 80.975 | 14.978 | 12.000 | 100.000 | 82.596 | 9.503 | 26.000 | 97.000 |
| Tuition and fees ($1,000s) | 31.673 | 14.241 | 0.000 | 49.793 | 33.340 | 8.331 | 0.000 | 47.246 |
| Distance from home (100s of miles) | 6.682 | 7.918 | 0.001 | 33.389 | 7.264 | 5.271 | 0.011 | 31.915 |
| In-state public flagship | 0.067 | 0.250 | | | 0.057 | 0.097 | | |
| In-state public non-flagship | 0.176 | 0.381 | | | 0.147 | 0.212 | | |
| In-state private | 0.150 | 0.357 | | | 0.141 | 0.185 | | |
| Out-of-state public | 0.135 | 0.341 | | | 0.135 | 0.182 | | |
| Out-of-state private | 0.472 | 0.499 | | | 0.520 | 0.291 | | |
| First-time full-time enrollment (1,000s) | 2.582 | 1.896 | 0.000 | 8.393 | 2.506 | 1.024 | 0.096 | 8.393 |
| Score send attributes | | | | | | | | |
| Free score send | 0.334 | 0.472 | | | 0.327 | 0.363 | | |
| Sent prior to spring junior year (Apr 2013) | 0.117 | 0.321 | | | 0.123 | 0.244 | | |
| Sent spring junior year (Apr-Jun 2013) | 0.085 | 0.279 | | | 0.078 | 0.202 | | |
| Sent summer prior to senior year (Jul-Aug 2013) | 0.021 | 0.142 | | | 0.017 | 0.097 | | |
| Sent fall senior year (Sept-Nov 2013) | 0.411 | 0.492 | | | 0.403 | 0.346 | | |
| Sent winter senior year (Dec 2013-Jan 2014) | 0.327 | 0.469 | | | 0.345 | 0.341 | | |
| Sent late senior year (after Jan 2014) | 0.040 | 0.195 | | | 0.027 | 0.105 | | |

Notes: In the left panel, an observation is a score send by a high-achieving low-income student who responded to a College Board survey and sent at least one score send and application. In the right panel, an observation is a simulated or actual application portfolio, and each value is the average of the application characteristics within the portfolio. A simulated portfolio consists of the same number of applications as the observed portfolio but removes one application and substitutes a score send that did not become an application. This is done for all possible pairings of a removed application with a score send that was not an application. Simulated portfolios require at least one score send that was not an application.

Table 2 also presents information on the timing of score sends. Approximately one-third of score sends are "free," meaning they are chosen at the time of an SAT registration and therefore typically sent earlier than non-free score sends. Because 11.7 percent of score sends were sent prior to the spring of junior year, numerous students took the SAT relatively early and started considering colleges well in advance of application deadlines. Another 8.5 percent of score sends are sent in the spring of junior year—the most frequent season in which these students first took the SAT. Almost no score sends are sent in the following summer; in contrast, approximately 41 percent of score sends are sent during the fall of senior year, when almost all of these students take the SAT (often for the second time) and when some early application deadlines pass.

Determining whether these measures of sequencing matter is an important contribution of this paper, and preliminary evidence is presented in figure 1. Time is plotted on the horizontal axis, month by month, and counts of score sends and applications are on the vertical axis. For example, in March 2013, 369 score sends are sent but only 183 of those turn into applications (note that the applications themselves are not necessarily sent in March 2013). In contrast, in December 2013, when college application deadlines are looming, there are 2,234 score sends, of which 1,398 convert into applications. Regardless of when students send their scores, some score sends do not convert to applications. I explicitly examine the relationship between timing and conversion rates, conditional on a host of covariates, in the next section.
Figure 1. Score Sends and Applications by Month

Notes: Sample includes high-achieving low-income students from the graduating high school cohort of 2014 who responded to a College Board survey about the application process and sent at least one SAT score send and application to a four-year college. Score sends are sent in the listed months, but applications associated with those score sends can be sent at any time.

3.  Theoretical Framework and Empirical Strategy

I assume students choose a set of colleges to apply to that maximizes their expected utility subject to a budget constraint. Prior to that, students sequentially gather information on colleges and their own ability and preferences. This departure from typical static models is supported by the above descriptive statistics that show how score sends are sent at different time periods. These general ideas are formalized below.

Students start at time $t = 0$ with no information about the set of $S$ colleges in the marketplace. They begin to consider the college application process at time $t = 1$, long before applications are due at time $t = T$. At each time period $t$, student $i$ has an information set $\Omega_{it}$, where $\Omega_{it} \subseteq \Omega_{i,t+1}$. Let $\Omega_{it} = [X_{it}, Z_{it}]$, which includes a vector of details regarding the student's abilities and preferences, $X_{it}$, such as SAT scores, and characteristics of colleges and the marketplace, $Z_{it}$, such as the existence of a college.15 The latter is informed by the empirical evidence that students do not have full information on the thousands of colleges across the country (see, e.g., Dillon and Smith 2017).

With the information set $\Omega_{it}$, students form a set of colleges they are considering, $C_{it}$. Note that $C_{it}$ is a vector of colleges, and I assume that, as time progresses, the choice set can only grow, such that mathematically $C_{it} \subseteq C_{i,t+1} \subseteq \cdots \subseteq C_{iT} \subseteq S$.16 Let the colleges in the choice set at time $t$ be denoted $C_{it} = [s_{it1}, s_{it2}, \ldots, s_{itk}, \ldots, s_{itK}]$, such that $k$ indexes each of the $K$ colleges in the set. College $k$ can be described as $s_k = s(z_k, \tau_k)$, where $z_k$ are the characteristics of the college and $\tau_k$ describes when the college enters the choice set.

The student's optimization problem is to choose a portfolio of college applications, $P_{iT}$, at time $T$, conditional on the contemporaneous information set $\Omega_{iT}$ and the choice set $C_{iT}$, where $P_{iT} \subseteq C_{iT}$.

Below, I consider two different optimization problems a student may face, which also dictates the most suitable estimation strategy:

  • Model 1: Students independently choose whether to apply to each college in their choice set.

  • Model 2: Students jointly choose which colleges to apply to from among the colleges in the choice set.

Model 1 rests on the strong assumption that applications are chosen independently of one another. The model has the merits of an easily interpretable estimation strategy and has been used in previous research (e.g., Hoxby and Avery 2012; Black, Cortes, and Lincove 2015). Model 2 relaxes the strong assumption and allows for application decisions to depend on one another, despite being more complicated to estimate and interpret.

Next, I discuss the theoretical framework for each model and the corresponding assumptions, weaknesses, and estimation strategies.

Model 1: Independent Applications

For each college $k$ in the final choice set $C_{iT}$, student $i$ must decide whether to send an application. An application to the college yields utility:17
$$U(s_{ikT} \mid \Omega_{iT}) = U(z_{ikT} \mid X_{iT}, Z_{iT}, \tau_{ik}). \tag{1}$$
Assume a constant marginal cost of an application,18 denoted $\theta$, which includes time and financial costs. With a budget constraint of $\omega_i$, this student can only apply to $\bar{K} \leq K$ colleges such that:
$$\theta \cdot \bar{K} \leq \omega_i. \tag{2}$$
Therefore, a student applies to a college if it yields the most utility among all colleges in her choice set $C_{iT}$, mathematically:
$$U(s_{ikT} \mid \Omega_{iT}) \geq U(s_{ijT} \mid \Omega_{iT}) \quad \text{for all } j \neq k, \tag{3}$$
and if the utility is greater than the marginal cost:
$$U(s_{ikT} \mid \Omega_{iT}) \geq \theta. \tag{4}$$
If the college with the highest utility does not end in an application, the student does not consider applying to the less-preferred colleges. If the student does apply to the most-preferred college, then the student considers the next most-preferred college, say, $m$:
$$U(s_{ikT} \mid \Omega_{iT}) \geq U(s_{imT} \mid \Omega_{iT}) \geq U(s_{ijT} \mid \Omega_{iT}) \quad \text{for all } j \neq k, m, \tag{5}$$
and applies if equation 4 holds for college $m$. The process continues for all colleges in the choice set until the budget constraint binds, the utility of the next preferred application is outweighed by the marginal cost, or there are no more colleges in the choice set.

It should be emphasized that the decision to apply to one college is not dependent on the other colleges in the choice set. The primary advantage of this assumption is the simplification of the estimation strategy and interpretation that I describe below, which is similar to Hoxby and Avery (2012) and Black, Cortes, and Lincove (2015).
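
To make the decision rule in equations 3 through 5 concrete, the sketch below walks down the choice set in order of utility, applying until the marginal cost or budget constraint binds. The utilities, cost, and budget are hypothetical placeholders rather than estimates from the paper.

```python
# Sketch of the Model 1 decision rule (equations 3-5): rank the choice
# set by utility and apply down the list while utility exceeds the
# marginal cost and the budget constraint permits.

def choose_applications(utilities: dict, theta: float, budget: float) -> list:
    """Greedy application rule for one student.

    utilities: college -> U(s_k | Omega_T); theta: marginal cost per
    application; budget: omega_i, the total application budget.
    """
    applications = []
    for college, u in sorted(utilities.items(), key=lambda kv: -kv[1]):
        if u < theta:                 # eq. 4 fails: stop here and for all
            break                     # less-preferred colleges
        if theta * (len(applications) + 1) > budget:
            break                     # eq. 2: budget constraint binds
        applications.append(college)
    return applications

# Example: four colleges in the choice set, theta = 1, budget for 3 apps.
print(choose_applications({"A": 5.0, "B": 3.2, "C": 0.8, "D": 2.1},
                          theta=1.0, budget=3.0))  # ['A', 'B', 'D']
```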

Estimation

Assuming linearity, equation 1 can be rewritten as follows:
$$U(s_{ikT} \mid \Omega_{iT}) = \alpha + \beta z_{ikT} + \gamma X_{iT} + \varepsilon_{ik}. \tag{6}$$
I estimate whether the college gives sufficiently large utility to warrant an application. As is usually the case, I do not observe the latent variable but, rather, only whether the student applies to the college, denoted $A_{ik}$. Let $A_{ik} = 1$ if the student applies to the college and $A_{ik} = 0$ if not, such that I can estimate the probability of an application using ordinary least squares (OLS) with the following specification:
$$P(A_{ik} = 1 \mid \Omega_{iT}) = \alpha + \beta z_{ikT} + \gamma X_{iT} + \varepsilon_{ik}. \tag{7}$$

In practice, the main specification uses a student fixed effect, such that student-invariant characteristics in $X_{iT}$ are excluded, other than interactions of $z_{ikT}$ and $X_{iT}$, such as distance from home or whether the college is in the same state as the student.19 The fixed-effects models rely on variation in college characteristics within a student's score-sending portfolio.20 Because students send an average of seven score sends, there is substantial variation in colleges to exploit.
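
For readers replicating this setup, a minimal sketch of the within-student estimator in equation 7 follows; the data frame and column names are hypothetical, and demeaning within student is numerically equivalent to including a dummy for each student.

```python
import numpy as np
import pandas as pd

# Sketch of the student fixed-effects linear probability model in
# equation 7: demean the outcome and regressors within student, then run
# OLS on the demeaned data. Column names are hypothetical stand-ins.

def fe_lpm(df: pd.DataFrame, y: str, x_cols: list, group: str) -> pd.Series:
    """Within-group OLS coefficients for a linear probability model."""
    cols = [y] + x_cols
    demeaned = df[cols] - df.groupby(group)[cols].transform("mean")
    beta, *_ = np.linalg.lstsq(demeaned[x_cols].to_numpy(),
                               demeaned[y].to_numpy(), rcond=None)
    return pd.Series(beta, index=x_cols)

# Hypothetical usage, one row per score send:
# fe_lpm(sends, y="applied",
#        x_cols=["grad_rate", "tuition", "distance", "free_send"],
#        group="student_id")
```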

Model 2: Joint Applications

Student $i$ has $K$ colleges in her choice set and must choose the optimal portfolio of applications from among those colleges. Students must choose a portfolio of applications, $P_{iT}$, from the choice set, $C_{iT}$, that maximizes utility, subject to the budget constraint in equation 2. Let the utility of a portfolio be written as
$$U(P_{iT} \mid \Omega_{iT}) = U(P_{iT} \mid X_{iT}, Z_{iT}^P, \tau_{ik}^P), \tag{8}$$
where $Z_{iT}^P$ and $\tau_{ik}^P$ are the college and timing characteristics of the portfolio. Then the maximization problem implied by model 2 is for a student to choose
$$P_{iT}^* \text{ such that } U(P_{iT}^* \mid \Omega_{iT}) \geq U(P_{iT} \mid \Omega_{iT}) \text{ for all possible } P_{iT} \subseteq C_{iT} \text{ and } \theta \cdot \bar{K} \leq \omega_i. \tag{9}$$

Estimation

To estimate the optimal portfolio, I simulate marginally different portfolios of applications that were not chosen. I use the actual and simulated portfolios to estimate the utility of certain characteristics of a portfolio with a revealed preference strategy—students get more utility from the observed portfolio than from the simulated portfolios that were not chosen.21 I start by simulating portfolios of applications with the following algorithm:

  1. Start with the observed application portfolio $P_{iT}$, which consists of $\bar{K}$ colleges.

  2. Remove one of the observed applications from the application set $P_{iT}$.

  3. Take one score send from the choice set $C_{iT}$ that did not become an application and add it to the portfolio from step 2 (which has one fewer application) to create a simulated portfolio $P_{iT}'$ (with the same number of applications as $P_{iT}$).22

  4. Calculate key statistics on the simulated portfolio $P_{iT}'$, namely, the average characteristics of the colleges in the portfolio (e.g., graduation rate or fraction in-state public flagships).

  5. Repeat steps 3 and 4 for each observed application in the observed portfolio $P_{iT}$.

  6. Repeat steps 2 through 5 for each score send in the choice set $C_{iT}$ that did not become an application.

To illustrate the algorithm, Appendix table A.2 (available online) shows five hypothetical scenarios observed in the data. The first example shows a student who sent scores to colleges A, B, and C but only applied to college A. This example leads to two simulated portfolios, shown in the last column, whereby we replace college A with college B (the first simulated portfolio) and separately replace college A with college C (the second simulated portfolio). This produces two new portfolios to compare to the original portfolio (just college A). The second example walks through a scenario with three score sends but two applications, which also produces two simulated portfolios. The next three examples consider a student with four score sends and one, two, and three applications, respectively.
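
The algorithm reduces to enumerating one-for-one swaps. The sketch below implements it for lists of college identifiers; the example reproduces the first scenario from Appendix table A.2.

```python
from itertools import product

# Sketch of the simulation algorithm above: enumerate every one-for-one
# swap of an observed application for a score send that did not become an
# application. Colleges are assumed to have unique identifiers; in the
# paper, each college carries the IPEDS attributes averaged in step 4.

def simulate_portfolios(applications: list, nonapp_sends: list) -> list:
    """Return all simulated portfolios of the same size as the observed one."""
    simulated = []
    for removed, added in product(applications, nonapp_sends):
        portfolio = [c for c in applications if c != removed] + [added]
        simulated.append(sorted(portfolio))
    return simulated

# First example from Appendix table A.2: score sends to A, B, and C with
# one application (A) yield two simulated portfolios, [B] and [C].
print(simulate_portfolios(["A"], ["B", "C"]))  # [['B'], ['C']]
```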

There are several important things to note with this algorithm. First, I only simulate portfolios of the same size as the observed portfolio. This can be justified by assuming that the marginal utility from even the least preferred school far outweighs the marginal cost. This small assumption implies that the budget constraint is always as close to binding as integers will allow and there is no such portfolio with fewer applications that is preferred to any portfolio with more applications. Second, these simulated portfolios are only marginal changes to the observed portfolio, not larger changes that substitute multiple applications with different score sends.

Finally, in step 4, I create key statistics about the simulated (and observed) portfolios that mimic the attributes of a single application, as the right panel of table 2 demonstrates. Each observation is the (unweighted) average across colleges in the portfolio. Notice that these statistics mirror the left panel because the simulated portfolios are small perturbations from the observed portfolio. The big difference is that only 5.4 percent of the portfolios are applied to, which implies that each student is associated with approximately 20 portfolios (one observed and the rest simulated).

With the simulated and observed portfolios in hand and the maximization problem in equation 9, I once again assume linearity in equation 8, such that it can be rewritten as follows:
$$U(P_{iT} \mid X_{iT}, Z_{iT}^P, \tau_{ik}^P) = \alpha^P + \beta^P Z_{iT}^P + \gamma^P X_{iT} + \varepsilon_{ik}^P. \tag{10}$$
Let $A_{ik}^P = 1$ if the student applies to college $k$ in portfolio $P$ and $A_{ik}^P = 0$ if not, such that I can estimate the probability of applying to an application portfolio using OLS with the following specification:23
$$P(A_{ik}^P = 1 \mid \Omega_{iT}) = \alpha^P + \beta^P Z_{iT}^P + \gamma^P X_{iT} + \varepsilon_{ik}^P. \tag{11}$$

In practice, the main specification uses a student fixed effect, such that student-invariant characteristics in $X_{iT}$ are excluded. Again, with the fixed-effects models, identification comes from variation across potential portfolios within a student's set of portfolios.

Timing

To this point, the sequencing and timing of score sends have not played a role in the estimation. Students' preferences and college characteristics can change as time evolves, but at the application deadline $T$, these weights and variables are set. This implies that colleges that enter the choice set in the early periods may eventually be valued differently at time $T$ or, more formally, $U(s_{ikt} \mid \Omega_{it}) \neq U(s_{ikT} \mid \Omega_{iT})$, and similarly $U(P_{it} \mid \Omega_{it}) \neq U(P_{iT} \mid \Omega_{iT})$.24

I formally test the idea that score sends (or portfolios) sent at different times have different probabilities of becoming applications. By doing so, I provide evidence that students' preferences, information, and/or decision making change as time evolves. I take a simple reduced-form approach and include measures of timing ($\tau_{ik}$: a set of dummies for the month/season of the score send and an indicator for whether the score send was free) in the regressions. Specifically, for model 1, I estimate:
$$P(A_{ik} = 1 \mid \Omega_{iT}) = \alpha + \beta z_{ikT} + \gamma X_{iT} + \tau_{ik} + \varepsilon_{ik}. \tag{12}$$
And for model 2, I estimate:
$$P(A_{ik}^P = 1 \mid \Omega_{iT}) = \alpha^P + \beta^P Z_{iT}^P + \gamma^P X_{iT} + \tau_{ik}^P + \varepsilon_{ik}^P. \tag{13}$$

These specifications test whether the timing of score sends is related to the probability of application and, therefore, whether a student's application process is in part sequential. It is important to note that equations 12 and 13 are reduced-form estimates of the model and that the timing variables ($\tau_{ik}$) do not actually enter students' utility.
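
For concreteness, the month/season components of $\tau_{ik}$ can be built from the score-send request date using the calendar boundaries reported in table 2. The sketch below assumes the class of 2014 calendar; the function name and labels are mine.

```python
from datetime import date

# Sketch of how the season dummies in tau could be constructed from the
# score-send request date, using the boundaries in table 2 for the high
# school class of 2014.

def send_season(request: date) -> str:
    """Map a score-send request date to the season categories in table 2."""
    if request < date(2013, 4, 1):
        return "pre_spring_junior"   # prior to spring junior year
    if request < date(2013, 7, 1):
        return "spring_junior"       # April-June 2013
    if request < date(2013, 9, 1):
        return "summer_pre_senior"   # July-August 2013
    if request < date(2013, 12, 1):
        return "fall_senior"         # September-November 2013
    if request < date(2014, 2, 1):
        return "winter_senior"       # December 2013-January 2014
    return "late_senior"             # after January 2014

print(send_season(date(2013, 10, 15)))  # fall_senior
```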

4.  Results

Model 1: Independent Applications

The primary results from model 1 (independent applications) are in table 3. The outcome is whether the student applies to the college, and the first column uses OLS to estimate equation 7. The student characteristics ($X_{iT}$) are the same as those in table 1: gender, ethnicity, student SAT score and number of attempts, and high school GPA. The college characteristics ($z_{ikT}$) are the same as in table 2: average SAT score, six-year graduation rate, in-state tuition, distance from home (and its square), and mutually exclusive dummies for being an in-state public flagship, in-state public non-flagship, in-state private, or out-of-state public college (the omitted category is out-of-state private).25 Robust standard errors are used.

Table 3.
Score Sends to Application Conversion: Model 1 (Independent Applications)
Columns (1)-(4) are estimated by ordinary least squares; columns (5)-(8) include student fixed effects.

| Variable | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| College attributes | | | | | | | | |
| Average SAT (100s) | −0.0122 | −0.0073 | −0.0061 | −0.0053 | −0.0076 | −0.0052 | −0.0064 | −0.0057 |
| | (0.0076) | (0.0076) | (0.0076) | (0.0076) | (0.0089) | (0.0088) | (0.0088) | (0.0087) |
| Six-year graduation rate | 0.0030*** | 0.0029*** | 0.0028*** | 0.0027*** | 0.0028*** | 0.0029*** | 0.0030*** | 0.0030*** |
| | (0.0007) | (0.0007) | (0.0007) | (0.0007) | (0.0008) | (0.0008) | (0.0008) | (0.0008) |
| Tuition and fees ($1,000s) | −0.0029** | −0.0037*** | −0.0040*** | −0.0040*** | −0.0026* | −0.0028* | −0.0030** | −0.0030** |
| | (0.0012) | (0.0012) | (0.0012) | (0.0012) | (0.0014) | (0.0014) | (0.0014) | (0.0014) |
| Distance from home (100s of miles) | −0.0113*** | −0.0105*** | −0.0119*** | −0.0116*** | −0.0090** | −0.0096*** | −0.0103*** | −0.0102*** |
| | (0.0028) | (0.0028) | (0.0028) | (0.0028) | (0.0037) | (0.0037) | (0.0036) | (0.0036) |
| (Distance from home)² | 0.0026** | 0.0023** | 0.0028*** | 0.0027*** | 0.0021 | 0.0023* | 0.0026* | 0.0026* |
| | (0.0011) | (0.0011) | (0.0010) | (0.0010) | (0.0014) | (0.0013) | (0.0013) | (0.0013) |
| In-state public flagship | 0.0206 | 0.0148 | 0.0060 | 0.0059 | 0.0397 | 0.0412 | 0.0364 | 0.0364 |
| | (0.0441) | (0.0438) | (0.0437) | (0.0437) | (0.0523) | (0.0523) | (0.0521) | (0.0522) |
| In-state public non-flagship | −0.0082 | −0.0205 | −0.0306 | −0.0313 | 0.0327 | 0.0309 | 0.0260 | 0.0259 |
| | (0.0391) | (0.0388) | (0.0387) | (0.0387) | (0.0474) | (0.0474) | (0.0471) | (0.0472) |
| In-state private | 0.0234 | 0.0313* | 0.0303* | 0.0311* | 0.0500** | 0.0534*** | 0.0544*** | 0.0549*** |
| | (0.0173) | (0.0173) | (0.0172) | (0.0172) | (0.0199) | (0.0198) | (0.0198) | (0.0198) |
| Out-of-state public | −0.1084*** | −0.1136*** | −0.1171*** | −0.1179*** | −0.1225*** | −0.1226*** | −0.1289*** | −0.1292*** |
| | (0.0235) | (0.0234) | (0.0232) | (0.0232) | (0.0271) | (0.0271) | (0.0268) | (0.0269) |
| First-time full-time enrollment (1,000s) | 0.0145*** | 0.0148*** | 0.0153*** | 0.0150*** | 0.0172*** | 0.0177*** | 0.0177*** | 0.0176*** |
| | (0.0035) | (0.0035) | (0.0035) | (0.0035) | (0.0041) | (0.0041) | (0.0041) | (0.0041) |
| Score send attributes | | | | | | | | |
| Free score send | — | −0.1176*** | — | −0.0378*** | — | −0.1111*** | — | −0.0584** |
| | — | (0.0103) | — | (0.0137) | — | (0.0192) | — | (0.0234) |
| Sent prior to spring junior year (Apr 2013) | — | — | −0.2005*** | −0.1653*** | — | — | −0.1309*** | −0.0791* |
| | — | — | (0.0276) | (0.0304) | — | — | (0.0394) | (0.0447) |
| Sent spring junior year (Apr-Jun 2013) | — | — | −0.1958*** | −0.1620*** | — | — | −0.1219*** | −0.0747 |
| | — | — | (0.0289) | (0.0314) | — | — | (0.0429) | (0.0465) |
| Sent summer prior to senior year (Jul-Aug 2013) | — | — | −0.0472 | −0.0343 | — | — | −0.0002 | 0.0202 |
| | — | — | (0.0397) | (0.0400) | — | — | (0.0546) | (0.0551) |
| Sent fall senior year (Sept-Nov 2013) | — | — | 0.0024 | 0.0125 | — | — | 0.0598* | 0.0762** |
| | — | — | (0.0247) | (0.0249) | — | — | (0.0338) | (0.0343) |
| Sent winter senior year (Dec 2013-Jan 2014) | — | — | −0.0298 | −0.0281 | — | — | 0.0159 | 0.0197 |
| | — | — | (0.0252) | (0.0252) | — | — | (0.0347) | (0.0346) |
| Student characteristic controls | Yes | Yes | Yes | Yes | No | No | No | No |
| Observations | 10,073 | 10,073 | 10,073 | 10,073 | 10,073 | 10,073 | 10,073 | 10,073 |
| R² | 0.037 | 0.049 | 0.060 | 0.061 | 0.034 | 0.040 | 0.045 | 0.046 |

Notes: The unit of observation is a score send. Standard errors are presented in parentheses. Student characteristic controls include sex and ethnicity dummies, student SAT scores and number of SAT attempts, and high school grade point average. Additional controls in all regressions include dummies for missing values of college average SAT score, graduation rate, tuition and fees, and distance from home. Score sends are observed in administrative data, and applications come from survey data.

***p < 0.01; **p < 0.05; *p < 0.1.

A college's average SAT score is not a good predictor of whether a student will apply, but its six-year graduation rate is positively related: a 10 percentage point increase in graduation rate translates into a 3 percentage point increase in the probability of applying, conditional on sending a score to that college. Because average SAT scores are perhaps the most visible characteristic of a college, especially early in the decision-making process, students may only learn about graduation rates after a more thoughtful investigation into which colleges to apply to, conditional on sending a score. Restated, students initially know average SAT scores but not graduation rates. Alternatively, the impact of average SAT on conversion may not be linear, especially given the qualitative and nonlinear application advice around "safety," "match," and "reach" colleges, which I explore later in this section. As expected, higher tuition and greater distance translate into lower probabilities of applying to the college. The quadratic term on distance is positive, implying some nonlinearity: the distance penalty attenuates at longer distances. Relative to an out-of-state private college, out-of-state public colleges are 10.8 percentage points less likely to convert into applications. Approximately 45 percent of the students send scores to in-state flagships, a popular choice for these HALIs, but, given their high achievement, they are not more likely to apply to a flagship than to out-of-state private colleges, where they may receive generous financial aid offers. Alternatively, flagship status may not be appealing over and above graduation rates and average SAT scores. Also, larger colleges attract more applications.

The second column of table 3 adds the timing variable "free score send" and corresponds to equation 12. If the score send was free, which means it typically occurred earlier than many non-free score sends and before students knew their SAT scores, there is an 11.76 percentage point lower probability of applying to that college. This is consistent with Bond et al. (2018), who find that students adjust their score sends in response to new information about their academic ability, such as receiving SAT scores. Column 3 shows the results of a version of equation 12 that only includes the set of dummies for the month/season of the score send. Relative to score sends sent after January of senior year, the earliest score sends are approximately 20 percentage points less likely to convert to applications. However, conditional on all the other college characteristics, there is no difference in the probability of application between score sends sent from the summer prior to senior year onward and the latest score sends, after January of senior year.

Lastly, in column 4, I include both timing variables, which are correlated with one another. The coefficient on free score sends drops dramatically to −0.0378 but is still statistically different from zero. Even conditional on score sends being free, the earliest score sends are still very unlikely to convert to applications compared with later score sends.

The next four columns present the preferred specification for model 1, which includes a student fixed effect. Identification therefore comes from variation in college characteristics and timing within a student's choice set. There are few qualitative differences relative to the previously described results that simply control for student characteristics. In fact, for the college characteristics, there are also few quantitative differences, suggesting either that student unobservables are not correlated with the college characteristics of their score sends or that this subset of HALIs has similar preferences. However, the magnitudes of the coefficients on the timing of score sends do change when student fixed effects are included, which indicates that students send scores at times that differ from one another.

Joint Applications

Table 4 is analogous to table 3 but includes the estimates for model 2, where students jointly choose their applications. For estimation, the unit of observation is a portfolio of colleges and so the variables included are the average characteristics of the portfolio, not individual institutions. The outcome is whether the student applies to that portfolio, which occurs exactly once for each student.

Table 4.
Score Sends to Application Conversion: Model 2 (Joint Applications)
Columns (1)-(4) are estimated by ordinary least squares; columns (5)-(8) include student fixed effects.

| Variable | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| College attributes | | | | | | | | |
| Average SAT (100s) | 0.0016 | 0.0016 | 0.0026 | 0.0018 | −0.0345* | −0.0272 | −0.0263 | −0.0256 |
| | (0.0067) | (0.0067) | (0.0068) | (0.0068) | (0.0188) | (0.0189) | (0.0189) | (0.0189) |
| Six-year graduation rate | 0.0004 | 0.0004 | 0.0005 | 0.0005 | 0.0046** | 0.0048*** | 0.0050*** | 0.0050*** |
| | (0.0008) | (0.0008) | (0.0008) | (0.0008) | (0.0018) | (0.0018) | (0.0018) | (0.0018) |
| Tuition and fees ($1,000s) | −0.0034*** | −0.0034*** | −0.0037*** | −0.0037*** | −0.0040 | −0.0042 | −0.0048 | −0.0048 |
| | (0.0012) | (0.0012) | (0.0012) | (0.0012) | (0.0031) | (0.0031) | (0.0031) | (0.0031) |
| Distance from home (100s of miles) | −0.0050*** | −0.0050*** | −0.0052*** | −0.0055*** | −0.0193** | −0.0210*** | −0.0259*** | −0.0255*** |
| | (0.0014) | (0.0014) | (0.0014) | (0.0015) | (0.0081) | (0.0079) | (0.0080) | (0.0080) |
| (Distance from home)² | 0.0001*** | 0.0001*** | 0.0002*** | 0.0002*** | 0.0006** | 0.0006** | 0.0008*** | 0.0008*** |
| | (0.0001) | (0.0001) | (0.0001) | (0.0001) | (0.0003) | (0.0003) | (0.0003) | (0.0003) |
| In-state public flagship | 0.1315** | 0.1316** | 0.1266** | 0.1251** | 0.1097 | 0.1026 | 0.0953 | 0.0949 |
| | (0.0511) | (0.0511) | (0.0516) | (0.0516) | (0.1166) | (0.1184) | (0.1190) | (0.1192) |
| In-state public non-flagship | −0.0426 | −0.0428 | −0.0509 | −0.0501 | 0.0438 | 0.0323 | 0.0165 | 0.0167 |
| | (0.0373) | (0.0374) | (0.0376) | (0.0377) | (0.1012) | (0.1024) | (0.1036) | (0.1036) |
| In-state private | 0.0100 | 0.0101 | 0.0127 | 0.0116 | 0.0455 | 0.0492 | 0.0373 | 0.0389 |
| | (0.0143) | (0.0143) | (0.0144) | (0.0144) | (0.0458) | (0.0452) | (0.0444) | (0.0444) |
| Out-of-state public | −0.0904*** | −0.0904*** | −0.0931*** | −0.0930*** | −0.2615*** | −0.2682*** | −0.2736*** | −0.2748*** |
| | (0.0215) | (0.0215) | (0.0217) | (0.0217) | (0.0572) | (0.0587) | (0.0593) | (0.0594) |
| First-time full-time enrollment (1,000s) | 0.0121*** | 0.0120*** | 0.0123*** | 0.0127*** | 0.0273*** | 0.0294*** | 0.0315*** | 0.0314*** |
| | (0.0037) | (0.0038) | (0.0038) | (0.0038) | (0.0098) | (0.0099) | (0.0100) | (0.0100) |
| Score send attributes | | | | | | | | |
| Free score send | — | −0.0007 | — | 0.0153* | — | −0.2587*** | — | −0.0600 |
| | — | (0.0054) | — | (0.0078) | — | (0.0363) | — | (0.0453) |
| Sent prior to spring junior year (Apr 2013) | — | — | 0.0180 | 0.0153 | — | — | −0.0215 | −0.0197 |
| | — | — | (0.0231) | (0.0232) | — | — | (0.1846) | (0.1855) |
| Sent spring junior year (Apr-Jun 2013) | — | — | 0.0323 | 0.0292 | — | — | −0.0051 | −0.0104 |
| | — | — | (0.0251) | (0.0253) | — | — | (0.1883) | (0.1893) |
| Sent summer prior to senior year (Jul-Aug 2013) | — | — | 0.0408 | 0.0475 | — | — | 0.2572 | 0.2209 |
| | — | — | (0.0305) | (0.0306) | — | — | (0.2216) | (0.2260) |
| Sent fall senior year (Sept-Nov 2013) | — | — | 0.0389* | 0.0468** | — | — | 0.3425* | 0.3080 |
| | — | — | (0.0222) | (0.0225) | — | — | (0.1855) | (0.1893) |
| Sent winter senior year (Dec 2013-Jan 2014) | — | — | 0.0336 | 0.0446* | — | — | 0.3875** | 0.3384* |
| | — | — | (0.0227) | (0.0232) | — | — | (0.1856) | (0.1914) |
| Sent late senior year (after Jan 2014) | — | — | 0.1081*** | 0.1188*** | — | — | 0.4277** | 0.3758* |
| | — | — | (0.0313) | (0.0316) | — | — | (0.2042) | (0.2112) |
| Student characteristic controls | Yes | Yes | Yes | Yes | No | No | No | No |
| Observations | 20,503 | 20,503 | 20,503 | 20,503 | 20,503 | 20,503 | 20,503 | 20,503 |
| R² | 0.040 | 0.040 | 0.041 | 0.041 | 0.033 | 0.041 | 0.048 | 0.048 |

Notes: The unit of observation is a simulated application portfolio, which has the same number of applications as the actual portfolio but substitutes observed score sends that did not become applications. Standard errors are presented in parentheses. Student characteristic controls include sex and ethnicity dummies, student SAT scores and number of SAT attempts, and high school GPA. Additional controls in all regressions include dummies for missing values of college average SAT score, graduation rate, tuition and fees, and distance from home. Score sends are observed in administrative data, and applications come from survey data.

***p < 0.01; **p < 0.05; *p < 0.1.

In the first column of model 2 results, which corresponds to equation 11, there are mostly similarities and a few differences relative to the model 1 results. For example, as in model 1, there is no statistical association between the average SAT score of a portfolio and the probability of applying to that portfolio; unlike model 1, however, there is no statistical relationship with graduation rates when using OLS. The positive relationship between graduation rates and score sends converting to applications reappears in the preferred specification with student fixed effects (column 5). Also, if the average tuition of a portfolio increases by $1,000, students are 0.34 percentage points less likely to apply to that portfolio, all else equal. As in the previous model, there is also a strong negative relationship between distance from home and application conversion. Portfolios with public flagships are more likely to receive applications, although this result is not robust to including student fixed effects. Portfolios with relatively more out-of-state public colleges are less likely to be chosen, just as in model 1.

Keep in mind that the variables in this model are averages over the entire portfolio, not attributes of a single application. Most portfolios have about four applications, so one can translate the portfolio-level coefficients into how students value individual colleges. For example, increasing the portfolio's share of out-of-state public colleges by 25 percentage points corresponds to adding one out-of-state public college to an average-sized (four-college) portfolio. The coefficient of −0.09 implies that adding that college, and thereby raising the out-of-state public share by 25 percentage points, translates to a 2.25 percentage point decrease (−0.09 × 0.25 = −0.0225) in the probability of choosing that portfolio.
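
To make the back-of-the-envelope calculation explicit (the four-application portfolio size is the sample average noted above, so this illustration is an approximation):

$$
\Delta \widehat{\Pr}(\text{portfolio chosen}) \;\approx\; \hat{\beta}_{\text{OOS public share}} \times \Delta \text{share} \;=\; -0.09 \times 0.25 \;=\; -0.0225,
$$

that is, a 2.25 percentage point decline in the probability that the portfolio is chosen.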

In column 6, I introduce the timing variable "free score send" such that the specification corresponds to equation 13. Portfolios that consist of more free score sends are less likely to be chosen. At the extreme, a portfolio made up entirely of free score sends would be 26 percentage points less likely to be chosen than one with no free score sends, all else equal. Column 7 includes the set of month/season variables. In the portfolio model, these are not dummies but rather the share of the portfolio's score sends in each timing window, and hence they are not mutually exclusive. The coefficients become increasingly large as time goes on, which is consistent with the previous results that early score sends (and portfolios with early score sends) are unlikely to convert to applications (or be chosen). The last column of the table includes all the timing variables and, as before, the coefficient on free score sends drops in magnitude (and loses statistical significance), but there remain clear differences between the coefficients on early versus late score sends, which are statistically different from one another (even if not individually different from zero).
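
To make the construction of these portfolio-level timing shares concrete, here is a minimal pandas sketch; the column names (portfolio_id, send_window, free_send) are illustrative stand-ins, not the paper's actual variable names.

```python
import pandas as pd

# One row per score send; send_window is the timing bucket and
# portfolio_id identifies a (possibly simulated) application portfolio.
sends = pd.DataFrame({
    "portfolio_id": [1, 1, 1, 1, 2, 2],
    "send_window":  ["junior_spring", "fall_senior", "fall_senior",
                     "late_senior", "junior_spring", "winter_senior"],
    "free_send":    [1, 0, 0, 0, 1, 0],
})

# Portfolio-level regressors: the share of score sends in each timing
# window (and the share that were free). Because the shares come from
# one underlying categorical variable, they sum to 1 within a portfolio,
# so they are averages of dummies rather than mutually exclusive dummies.
shares = (pd.get_dummies(sends["send_window"])
            .join(sends[["portfolio_id", "free_send"]])
            .groupby("portfolio_id")
            .mean())
print(shares)
```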

Robustness Tests

I perform several robustness tests on each model's preferred specification from tables 3 and 4, which include student fixed effects and all measures of timing. The results of these tests are in table 5.

Table 5.
Score Sends to Application Conversion: Robustness Tests
Model 1: Score Sends
(1) At Least 2 Applications  (2) Has Score Send After Last SAT  (3) Fewer than 10 Applications on Survey  (4) No SAT Fee Waiver  (5) Using Sample Weights
College attributes      
Average SAT (100s) −0.0043 −0.0108 −0.0152 −0.0114 0.0038 
 (0.0091) (0.0092) (0.0098) (0.0093) (0.0107) 
Six-year graduation rate 0.0029*** 0.0035*** 0.0027*** 0.0035*** 0.0025** 
 (0.0009) (0.0009) (0.0009) (0.0009) (0.0010) 
Tuition and fees ($1,000s) −0.0030** −0.0033** −0.0025 −0.0033** −0.0036** 
 (0.0015) (0.0015) (0.0016) (0.0016) (0.0016) 
Distance from home (100s of miles) −0.0084** −0.0075** −0.0080* −0.0083** −0.0091** 
 (0.0038) (0.0038) (0.0041) (0.0039) (0.0042) 
(Distance from home)2 0.0020 0.0016 0.0018 0.0021 0.0022 
 (0.0014) (0.0014) (0.0015) (0.0015) (0.0016) 
In-state public flagship 0.0301 0.0200 0.0603 0.0133 0.0200 
 (0.0536) (0.0543) (0.0563) (0.0579) (0.0592) 
In-state public non-flagship 0.0267 0.0018 0.0315 0.0171 −0.0077 
 (0.0488) (0.0494) (0.0513) (0.0530) (0.0528) 
In-state private 0.0572*** 0.0548*** 0.0428* 0.0584*** 0.0573** 
 (0.0203) (0.0205) (0.0225) (0.0212) (0.0224) 
Out-of-state public −0.1255*** −0.1322*** −0.1406*** −0.1422*** −0.1232*** 
 (0.0279) (0.0277) (0.0295) (0.0295) (0.0304) 
First-time full-time enrollment (1,000s) 0.0175*** 0.0174*** 0.0153*** 0.0166*** 0.0199*** 
 (0.0042) (0.0042) (0.0046) (0.0044) (0.0047) 
Score send attributes      
Free score send −0.0594** −0.0621*** −0.0737*** −0.0607** −0.0590** 
 (0.0238) (0.0239) (0.0264) (0.0272) (0.0266) 
Sent prior to spring junior year (Apr 2013) −0.0888* −0.0848* −0.1012** −0.0894* −0.0848* 
 (0.0454) (0.0460) (0.0513) (0.0523) (0.0502) 
Sent spring junior year (Apr–Jun 2013) −0.0794* −0.0786 −0.1148** −0.1102** −0.0271 
 (0.0472) (0.0482) (0.0515) (0.0552) (0.0532) 
Sent summer prior to senior year (Jul–Aug 2013) 0.0114 0.0158 0.0427 0.0103 0.0103 
 (0.0546) (0.0573) (0.0619) (0.0608) (0.0671) 
Sent fall senior year (Sept–Nov 2013) 0.0690** 0.0770** 0.0418 0.0601 0.0788** 
 (0.0346) (0.0344) (0.0400) (0.0390) (0.0396) 
Sent winter senior year (Dec 2013–Jan 2014) 0.0128 0.0178 0.0036 −0.0019 0.0480 
 (0.0350) (0.0347) (0.0399) (0.0394) (0.0409) 
Sent late senior year (after Jan 2014) — — — — — 
 — — — — — 
Observations 9,377 9,181 7,799 8,772 10,073 
R2 0.044 0.045 0.052 0.045 1,441 
Model 2: Simulated Application Portfolios
(6) At Least 2 Applications  (7) Has Score Send After Last SAT  (8) Fewer than 10 Applications on Survey  (9) No SAT Fee Waiver  (10) Using Sample Weights
College attributes      
Average SAT (100s) −0.0301 −0.0416** −0.0323 −0.0365* −0.0161 
 (0.0236) (0.0191) (0.0208) (0.0209) (0.0238) 
Six-year graduation rate 0.0053** 0.0062*** 0.0048** 0.0054*** 0.0047** 
 (0.0023) (0.0020) (0.0020) (0.0020) (0.0023) 
Tuition and fees ($1,000s) −0.0053 −0.0047 −0.0044 −0.0069** −0.0038 
 (0.0037) (0.0033) (0.0033) (0.0035) (0.0036) 
Distance from home (100s of miles) −0.0121 −0.0167* −0.0255*** −0.0237*** −0.0174* 
 (0.0096) (0.0085) (0.0089) (0.0088) (0.0093) 
(Distance from home)2 0.0003 0.0005 0.0008*** 0.0008** 0.0005* 
 (0.0004) (0.0003) (0.0003) (0.0003) (0.0003) 
In-state public flagship 0.1136 0.1415 0.1191 −0.0043 0.1013 
 (0.1281) (0.1275) (0.1286) (0.1372) (0.1278) 
In-state public non-flagship 0.0456 −0.0192 0.0298 −0.0609 0.0099 
 (0.1171) (0.1104) (0.1123) (0.1170) (0.1126) 
In-state private 0.0834 0.0513 0.0311 0.0480 0.0461 
 (0.0519) (0.0467) (0.0498) (0.0496) (0.0510) 
Out-of-state public −0.3416*** −0.2817*** −0.2879*** −0.3419*** −0.2638*** 
 (0.0703) (0.0621) (0.0652) (0.0656) (0.0670) 
First-time full-time enrollment (1,000s) 0.0419*** 0.0306*** 0.0310*** 0.0340*** 0.0400*** 
 (0.0109) (0.0107) (0.0110) (0.0110) (0.0113) 
Score send attributes      
Free score send −0.0752* −0.0323 −0.0621 −0.0738 −0.0555 
 (0.0450) (0.0479) (0.0519) (0.0572) (0.0484) 
Sent prior to spring junior year (Apr 2013) 0.1282 0.0014 −0.0438 −0.0104 −0.0092 
 (0.2720) (0.1598) (0.1966) (0.1924) (0.1698) 
Sent spring junior year (Apr–Jun 2013) 0.1936 −0.0566 −0.0531 −0.0349 0.0567 
 (0.2716) (0.1591) (0.2013) (0.1984) (0.1753) 
Sent summer prior to senior year (Jul–Aug 2013) 0.3666 0.2491 0.2424 0.2188 0.2499 
 (0.2845) (0.1988) (0.2412) (0.2412) (0.2243) 
Sent fall senior year (Sept–Nov 2013) 0.5397** 0.3325** 0.2818 0.3042 0.3104* 
 (0.2733) (0.1596) (0.2019) (0.1984) (0.1731) 
Sent winter senior year (Dec 2013–Jan 2014) 0.6113** 0.3623** 0.2997 0.3177 0.3663** 
 (0.2740) (0.1620) (0.2051) (0.2013) (0.1764) 
Sent late senior year (after Jan 2014) 0.7704*** 0.4146** 0.3450 0.4733** 0.3713* 
 (0.2817) (0.1826) (0.2294) (0.2218) (0.2048) 
Observations 19,918 19,177 13,062 17,756 20,503 
R2 0.045 0.045 0.055 0.047 0.041 

Notes: Standard errors are presented in parentheses. All regressions include student fixed effects. Additional controls include dummies for missing values of college average SAT score, graduation rate, tuition and fees, and distance from home. Score sends are observed in administrative data, and applications come from survey data. Simulated application portfolios have the same number of applications as the actual portfolio but substitute observed score sends that did not become applications.

***p < 0.01; **p < 0.05; *p < 0.1.

The first concern is that students may apply to a college through early decision or early action and therefore only ever send one application, without thinking about a portfolio. I rerun the analyses including only students who had more than one application. Estimates (columns 1 and 6) are largely unchanged, in part because most of these students, perhaps unsurprisingly, did not receive offers of admission through early decision or early action plans.26

A second concern is that some students also take the ACT and choose to send their ACT scores but not their SAT scores, which would mean we observe a truncated choice set. I rerun the analyses on the subset of students who sent at least one score send after their last SAT attempt. This is a deliberate and costly action that is most likely to occur if the student is using her SAT score in the application process. Again, results are largely unchanged in both models (columns 2 and 7), primarily because this subset includes most SAT-takers.

Third, although the survey asks students for information on at most ten applications, a question about the total number of college applications permits me to rerun the analyses on the subset of SAT-takers who applied to fewer than ten colleges. The results are largely unchanged (columns 3 and 8).

Fourth, low-income students who register for the SAT with a fee waiver also have the option to receive four additional "flexible" score sends. These score sends are free and can be sent at any time, unlike the "free upon registration" score sends. I replicate my analyses on the subset of students who did not register for the SAT with a fee waiver, thereby omitting students who may have used the free nonregistration score sends. Results are largely unchanged (columns 4 and 9).

Finally, because the survey oversampled racial minorities, I weight the sample to reflect the racial composition of HALIs. Results are in columns 5 and 10 and are largely unchanged.
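
The following sketch shows how robustness columns of this kind can be generated from one estimator, using the within transformation for student fixed effects. It is a minimal illustration on synthetic data: every column name (applied, grad_rate, fee_waiver, and so on) is a hypothetical stand-in for the paper's variables, the subset definitions mirror the columns of table 5, and clustering standard errors by student is a choice of this sketch rather than something the paper states.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for the score-send data: one row per score send.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "student_id": rng.integers(0, 100, n),
    "grad_rate": rng.uniform(40, 95, n),
    "tuition_k": rng.uniform(5, 50, n),
    "distance_100mi": rng.uniform(0, 10, n),
    "free_send": rng.integers(0, 2, n),
    "sent_after_last_sat": rng.integers(0, 2, n),
    "n_apps_survey": rng.integers(1, 12, n),
    "fee_waiver": rng.integers(0, 2, n),
    "race_weight": rng.uniform(0.5, 2.0, n),
})
df["applied"] = (rng.random(n) < 0.62).astype(int)  # ~62% conversion

X_COLS = ["grad_rate", "tuition_k", "distance_100mi", "free_send"]

def within_fe_ols(data, y="applied", x_cols=X_COLS,
                  group="student_id", weights=None):
    """OLS with student fixed effects via the within transformation."""
    cols = [y] + x_cols
    demeaned = data[cols] - data.groupby(group)[cols].transform("mean")
    if weights is None:
        model = sm.OLS(demeaned[y], demeaned[x_cols])
    else:  # weighted analogue, e.g., survey weights
        model = sm.WLS(demeaned[y], demeaned[x_cols], weights=data[weights])
    # Clustering by student is this sketch's assumption.
    return model.fit(cov_type="cluster", cov_kwds={"groups": data[group]})

# Each robustness column reruns the same estimator on a different sample.
subsets = {
    "at_least_2_apps": df[df.groupby("student_id")["applied"]
                            .transform("sum") >= 2],
    "send_after_last_sat": df[df["sent_after_last_sat"] == 1],
    "fewer_than_10_apps": df[df["n_apps_survey"] < 10],
    "no_fee_waiver": df[df["fee_waiver"] == 0],
}
results = {name: within_fe_ols(sub) for name, sub in subsets.items()}
results["sample_weights"] = within_fe_ols(df, weights="race_weight")
```

The same subset-then-reestimate pattern extends to the Appendix exercises discussed below, such as progressively restricting the sample by the number of missing score sends.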

Combined, table 5 shows that the results are robust to a number of potential issues. Specifically, the coefficients on graduation rates and enrollment size are consistently positive, and those on tuition, distance from home, and out-of-state publics are almost always negative and statistically significant. As for timing, free score sends always have a negative relationship with applications, one that is always statistically significant in model 1 specifications (and sometimes in model 2 specifications).

I perform several more robustness tests that are available in the online Appendix tables. First, I change the timing variable from a set of month/season dummies to months relative to the sample mean, using specifications with just the linear term and, separately, adding the quadratic term. As Appendix table A.3 shows, in all models there is a positive coefficient on date, confirming that score sends sent later in time are more likely to convert to applications, even conditional on whether the score send is free. The quadratic term is not statistically different from zero.
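
A minimal sketch of this continuous timing variable follows; the dates and the 30.44 days-per-month conversion are illustrative choices, while the paper observes exact request dates in College Board administrative data.

```python
import pandas as pd

# Illustrative score-send request dates.
dates = pd.to_datetime(pd.Series(
    ["2013-01-15", "2013-05-20", "2013-10-01", "2014-02-10"]))

# Months relative to the sample mean date, plus the quadratic term.
months = (dates - dates.mean()) / pd.Timedelta(days=30.44)
timing = pd.DataFrame({"months_rel_mean": months,
                       "months_rel_mean_sq": months ** 2})
```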

Next, Appendix table A.4 deals explicitly with students who had applications with no corresponding score send. Those students' observed score sends are included in the primary analysis, but the nonexistent score sends clearly are not. I rerun the model 1 preferred specification with fixed effects, including only students with at most a certain number of missing score sends. I start by including only students who are missing fewer than six score sends given their stated applications, and I progressively tighten the restriction (fewer than five, fewer than four, and so on) until only students missing no score sends remain. As Appendix table A.4 shows, statistical power is an issue in the specifications that use the fewest students (only those with almost no missing score sends), but overall the qualitative results are unchanged.

Finally, I also estimate model 2 using a conditional logit, and the results are qualitatively similar.27 The logit estimates are in the first column of Appendix table A.5.28
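
For readers who want a starting point for the conditional logit, statsmodels provides an implementation that conditions on a grouping variable. The sketch below is illustrative only: the three-alternative group structure and all column names are assumptions, and the exponentiated coefficients correspond to the odds-ratio form reported in Appendix table A.5 (see note 28).

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

# Each student contributes a group of portfolios (the actual portfolio
# plus simulated alternatives); `chosen` flags the actual one.
rng = np.random.default_rng(1)
n_students, n_alts = 100, 3
df = pd.DataFrame({
    "student_id": np.repeat(np.arange(n_students), n_alts),
    "avg_grad_rate": rng.uniform(40, 95, n_students * n_alts),
    "avg_tuition_k": rng.uniform(5, 50, n_students * n_alts),
})
df["chosen"] = (df.groupby("student_id").cumcount() == 0).astype(int)

res = ConditionalLogit(df["chosen"],
                       df[["avg_grad_rate", "avg_tuition_k"]],
                       groups=df["student_id"]).fit()
print(np.exp(res.params))  # odds ratios, the form reported in table A.5
```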

Additional Portfolio Attributes

Whether score sends convert into applications is determined not only by college characteristics but also by how those characteristics intersect with student characteristics and with the characteristics of other score sends. Table 6 explores some alternative explanatory variables that address this issue.

Table 6.
Score Sends to Application Conversion: Alternative Explanatory Variables
Model 1: Score Sends  Model 2: Simulated Application Portfolios
(1)  (2)  (3)  (4)  (5)  (6)  (7)
Student SAT less college average SAT 0.0037 — — 2.5629  — — 
 (0.0085) — — (1.8883)  — — 
Safety — −0.0179 — — −0.0297 — — 
 — (0.0174) — — (0.0412) — — 
Reach — −0.0510*** — — −0.1055*** — — 
 — (0.0179) — — (0.0377) — — 
Average SAT score of college (100s) — — 0.1601** —  −0.3266** — 
 — — (0.0733) —  (0.1537) — 
(Average SAT score of college)2 — — −0.0065** —  0.0118** — 
 — — (0.0029) —  (0.0059) — 
Minimum average SAT score among portfolio — — — — — — 0.0387*** 
 — — — — — — (0.0056) 
Maximum average SAT score among portfolio — — — — — — −0.0777*** 
 — — — — — — (0.0094) 
Observations 10,073 10,073 10,073 20,503 20,503 20,503 20,503 
R2 0.046 0.047 0.047 0.048 0.049 0.049 0.066 

Notes: Standard errors are presented in parentheses. All regressions include student fixed effects. Additional controls include dummies for missing values of college average SAT score, graduation rate, tuition and fees, and distance from home. Score sends are observed in administrative data, and applications come from survey data. Simulated application portfolios have the same number of applications as the actual portfolio but substitute observed score sends that did not become applications.

***p < 0.01; **p < 0.05.

In columns 1 and 4, corresponding to models 1 and 2, respectively, the average SAT score of the college is replaced by the difference between the student's SAT score and the college's average SAT score. This measure is often referred to as "match" and quantifies how similar a student's academic ability is to that of the college's students, as measured by SAT scores. Students may be counseled to consider match in the application process. However, we do not see any statistical relationship between this measure of match and application conversion. This may be because the assumption of linearity is too strong, so columns 2 and 5 use the concepts of "safety" and "reach" colleges. Although there are negative coefficients on safety colleges, they are not statistically significant. On the other hand, the coefficients on reach colleges are negative and statistically significant, suggesting that students are less likely to apply to colleges that are less likely to accept them than to the omitted match colleges. This is consistent with the results in column 3, whereby students are more likely to apply to colleges with higher average SAT scores, but the quadratic term suggests this is less true at high SAT levels (i.e., reaches). However, the results in column 6 do not suggest this relationship.
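
A minimal construction of these alternative variables follows; the ±100-point cutoffs that define safety and reach here are assumptions of this sketch, not the paper's definitions.

```python
import pandas as pd

# One row per score send, with illustrative SAT values.
df = pd.DataFrame({
    "student_sat": [1350, 1350, 1420],
    "college_avg_sat": [1200, 1480, 1400],
})
df["match_gap"] = df["student_sat"] - df["college_avg_sat"]  # cols. 1 and 4
df["safety"] = (df["match_gap"] > 100).astype(int)   # student well above
df["reach"] = (df["match_gap"] < -100).astype(int)   # college well above
# Colleges with |match_gap| <= 100 play the role of the omitted
# "match" category in columns 2 and 5.
```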

Finally, using model 2, I assess how students value the portfolio as a whole, including the range of colleges in it. Column 7 shows that portfolios with higher minimum average SAT scores are more likely to be chosen than those with lower minimums. Also, portfolios with lower maximum average SAT scores are more likely to be chosen than those with higher maximums. This suggests that students do not have strong preferences for extremely disparate applications; rather, they prefer colleges that are somewhat similar to one another, not too much of a safety and not too much of a reach.
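
Computing the portfolio-level range measures is a simple grouped aggregation; again, the column names are illustrative rather than the paper's.

```python
import pandas as pd

sends = pd.DataFrame({
    "portfolio_id": [1, 1, 1, 2, 2],
    "college_avg_sat": [1250, 1380, 1450, 1300, 1320],
})
# Minimum and maximum average SAT score among colleges in each portfolio.
sat_range = (sends.groupby("portfolio_id")["college_avg_sat"]
                  .agg(min_avg_sat="min", max_avg_sat="max"))
```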

Overall, the preponderance of evidence from table 6 suggests students prefer score sends and portfolios that are not far from their own measured ability. This might relate to the advice they receive from parents or counselors and may also relate to their level of risk tolerance.

Heterogeneous Effects

Testing for heterogeneous effects is limited by the sample size. I divide the sample and conduct two analyses: male versus female, and underrepresented minority (black or Hispanic) versus not (white and Asian). There are very few differences across these subgroups, in part because of limited statistical power, so I only present results in table A.6.29 Perhaps the only thing of note is that male students are less likely than female students to apply to portfolios with flagships and out-of-state public colleges. The lack of differences between underrepresented minority students and non-underrepresented minority students is surprising at first glance. However, these students are very high-achieving, and underrepresented minorities at the highest end of the measured academic ability spectrum have different application patterns than those elsewhere in the distribution. This is in part because those students have excellent opportunities and tend to be sought after by colleges that value diversity. Additionally, many organizations aim to improve the college application and enrollment experience for these students (for examples, see Hoxby and Turner 2013 and College Board outreach programs).

Enrollment

Next, I consider the relationship between the characteristics of score sends (and applications) and eventual enrollment. This is not necessarily the same as the relationship between score sends and applications. I also consider the relationship between applications and enrollment.

This analysis is similar in spirit to B. Long (2004), who examines the determinants of where students enroll among the entire set of colleges in the United States. She is particularly interested in the dynamics of average SAT scores, tuition, and distance from home, and she shows their relative importance over several decades. This analysis differs in two ways. First, I take one step back by looking at score-send conversion to enrollment, along with application conversion to enrollment, which means I consider the determinants of enrollment from a refined choice set. Second, I am able to see how the timing of score sends relates to enrollment, which has never been examined.

Similar to B. Long (2004), one should expect a student's score sends to convert to enrollment at colleges with low tuition, high SAT scores, and locations close to home, although none of these is necessary if students underestimate or overestimate their probabilities of acceptance. As for the timing variables, there are two primary but disparate scenarios worth noting. First, score sends may be relatively uninformed decisions throughout the application process, so that when the time comes to enroll, there is no relationship between the timing of score sends and enrollment. Second, the timing of score sends may say a lot about student preferences for a college. On the early side, students may have always planned to attend college X if admitted, so sending their SAT scores early in the process was never in question. On the late side, students may scramble to meet a deadline or find a college late in the application process that is an exceptionally good fit. Under this scenario, the timing of score sends would be a good indicator of how likely a student is to enroll, but the direction is an empirical question. These are not the only potential scenarios that could dictate the relationship between score-send timing and enrollment but, based on previous literature suggesting students tend not to overthink the application process, they are the leading candidates driving the empirics.

I estimate the factors associated with a score send and a college application converting into a student's enrollment choice using a strategy similar to model 1 (equation 6). However, no assumptions are needed on whether students choose applications independently or jointly, because there is at most one enrollment choice. Results based on OLS with student fixed effects are presented in table 7. The first column estimates how score sends relate to enrollment; the second column estimates how applications relate to enrollment. Unlike the previous analyses, it is important to account for the role of the admissions process in enrollment through controls for measures of selectivity (e.g., average SAT scores and graduation rate).30
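
As a schematic, the estimating equation takes the same linear probability form as model 1, with the outcome redefined to enrollment (the notation here is illustrative rather than copied from equation 6):

$$
\text{Enroll}_{ij} = \alpha_i + X_j'\beta + T_{ij}'\gamma + \varepsilon_{ij},
$$

where $\alpha_i$ is a student fixed effect, $X_j$ collects college $j$'s attributes (average SAT score, graduation rate, tuition, distance from home, and sector), and $T_{ij}$ collects the free score send indicator and the timing-window dummies.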

Table 7.
Score Sends and Applications to Enrollment Conversion
Score Send to Enrollment Conversion  Application to Enrollment Conversion
College attributes   
Average SAT (100s) 0.0089 0.0196* 
 (0.0061) (0.0102) 
Six-year graduation rate −0.0015** −0.0031*** 
 (0.0006) (0.0010) 
Tuition and fees ($1,000s) −0.0007 −0.0014 
 (0.0009) (0.0016) 
Distance from home (100s of miles) −0.0085*** −0.0051 
 (0.0026) (0.0045) 
(Distance from home)2 0.0028*** 0.0015 
 (0.0010) (0.0016) 
In-state public flagship 0.1226*** 0.0612 
 (0.0379) (0.0620) 
In-state public non-flagship 0.0210 −0.0132 
 (0.0315) (0.0560) 
In-state private 0.0086 0.0048 
 (0.0154) (0.0245) 
Out-of-state public −0.0629*** −0.0659** 
 (0.0169) (0.0285) 
First-time full-time enrollment (1,000s) 0.0103*** 0.0130** 
 (0.0033) (0.0052) 
Score send attributes   
Free score send 0.0243* 0.0483** 
 (0.0146) (0.0234) 
Sent prior to spring junior year (Apr 2013) −0.0361 0.0443 
 (0.0327) (0.0543) 
Sent spring junior year (Apr–Jun 2013) −0.0582* 0.0053 
 (0.0325) (0.0547) 
Sent summer prior to senior year (Jul–Aug 2013) −0.0619 −0.0571 
 (0.0488) (0.0791) 
Sent fall senior year (Sept–Nov 2013) −0.0561** −0.0663 
 (0.0276) (0.0439) 
Sent winter senior year (Dec 2013–Jan 2014) −0.0611** −0.0532 
 (0.0273) (0.0439) 
Applied No Yes 
Observations 10,073 6,271 
R2 0.037 0.030 

Notes: Standard errors are presented in parentheses. All regressions include student fixed effects. Additional controls include dummies for missing values of college average SAT score, graduation rate, tuition and fees, and distance from home. Score sends and enrollment are observed in administrative data, and applications come from survey data.

***p < 0.01; **p < 0.05; *p < 0.1.

Similar to the application analyses (tables 3 and 4), in the first column of table 7 there is no statistical relationship between average SAT score and the probability of enrollment. Unlike the results from the application analyses, there is a slightly negative coefficient on six-year graduation rate. Taken together, these results imply that score sends are more likely to convert to applications at colleges with high graduation rates but less likely to convert to enrollment at those colleges. All else equal, average SAT score is not a major determinant of application or enrollment. Again, the relative importance of average SAT scores and graduation rates may have to do with students' prior knowledge of these statistics. The probability of enrollment is negatively (and nonlinearly) related to distance from home, as was the probability of application. The two columns have coefficients that are similar in magnitude to one another, suggesting that much of the negative enrollment effect is driven by students not applying to the college, which is consistent with recent research in the area (Hoxby and Turner 2013; Smith, Pender, and Howell 2013). The sign of the coefficient on out-of-state public colleges is the same as in the application regression. However, the coefficient on in-state public flagships is strongly positive, despite flagships not being robustly related to the probability of an application. It is difficult to pin down what drives these differences. Is it that students learn more about flagships after they have applied? Or that flagships give better (and unexpected) financial aid? Or perhaps in-state students have a much higher probability of acceptance at flagships than at the selective private colleges to which they also apply.

Moving to the bottom of the first column, the coefficient on free score sends is positive and marginally significant. This is in stark contrast to the negative coefficients at the application stage. Similarly, the coefficients on score sends sent well into senior year are negative, the opposite sign from the application stage.

The second column of table 7 finds some results consistent with B. Long (2004). In particular, conditional on applying, students are more likely to enroll in colleges with higher average SAT scores. I do not find that distance is a strong predictor, but again, these results are conditional on factors such as whether the college is in-state. Students are much less likely to enroll in out-of-state publics, conditional on an application. Finally, a free score send that turns into an application is 4.8 percentage points more likely to lead to enrollment than one that was not free, and again, we see negative (though not statistically significant) coefficients on late score sends. These relationships stand in stark contrast to those for the probability of applying.

Overall, the results of table 7 tell two stories. First, the same college characteristics that influence conversion to applications often, but not always, influence conversion to enrollment. Second, the timing variables relate to enrollment conversion in the opposite direction from the way they relate to application conversion. Specifically, score sends chosen relatively early often do not end in an application, but when they do, they tend to end in enrollment. What can explain this pattern? One explanation consistent with the data is that free score sends go to a range of colleges known to students: those that are highly preferred (and end in enrollment) and those that are not well thought out (and never get an application). This is not surprising given that students in the early stages of building a portfolio are likely to start with what they know (e.g., local colleges, colleges in the news, colleges their friends or teachers attended). As time passes and students learn about their abilities, their options, and those colleges, they may remove some from their choice set, but one or two remain and end in enrollment.

5.  Conclusion

The results of this paper are clear: students' application portfolios change over time. Some applications are decided upon early and others much later, but they are certainly not all chosen at the same time. This may not be surprising to those outside the research community, especially counselors and parents. The simplifying assumptions that most education researchers make are understandable, given their data constraints: most researchers observe only the final application portfolio, not how it was constructed. However, this means that the often-used simultaneous choice models lose valuable information on which applications are being impacted. Returning to the simple example, if a student decides early on to apply to a particular college, the optimization problem down the road is how to construct a portfolio that already has one college in it. There should be more thought and discussion about how data and model limitations differ from reality, and about how the assumptions used to overcome these issues may affect the information obtained and, potentially, the estimates.

From a policy perspective, these results shed light on several topical issues. There is substantial effort to provide students with information to make good choices, ranging from no-touch Web sites (e.g., College Scorecard, College Navigator, U.S. News and World Report, and Big Future), to low-touch interventions (e.g., Hoxby and Turner 2013), to high-touch counseling (e.g., Carrell and Sacerdote 2017). But which colleges in students' choice sets are these interventions impacting? Is it the ones chosen early, for which students have strong (or weak) preferences? Is it the ones that come late to round out the portfolio once more information is aggregated? More important, does this mean the impact of an intervention depends on its timing, in part because students' early choices are more or less malleable? These answers are especially important if the affected colleges differ in quality or fit. In addition, the results suggest that preferences evolve over time, so frequent discussions, interventions, or counseling sessions may be important. This would likely require research, funding, and the support of institutions.

This paper is also relevant to colleges, admissions officers, and enrollment managers. A lot of time and money is spent trying to identify and recruit prospective students who will become applicants and, eventually, matriculants. Colleges could harness information about the timing of student contact to evaluate whether to invest more energy in a student or divert resources elsewhere. Similarly, colleges have applicants of varying backgrounds, so the results of this paper may differ depending on the target population; using their own data to further investigate score-sending patterns and conversion to applications may prove informative. In a related vein, there may be information in the timing of applications that colleges can harness.

This research also sets the stage for future research. First, there is a need for interventions at different times in a student's decision process to see how applications differentially shift, if at all. Second, this analysis was performed on a nonrepresentative sample, and it would be worthwhile to run similar analyses on a more diverse pool of students. Lower-achieving students tend to send fewer score sends (and applications) than higher-achieving students, so the results of this paper may be less applicable to them. However, higher-income students who are comparably high-achieving tend to send the most score sends (and applications), and therefore likely exhibit some parallel patterns of nonconversion. Third, this paper makes no effort to understand how colleges get into a student's choice set or whether the conversion to applications is demand-driven (e.g., early decision) or supply-driven (e.g., student preferences). Is it more efficient (or even possible) to change the choice set, or to pursue the low-hanging fruit of converting score sends into applications? Finally, this paper merely describes the sequential nature of college applications; future theoretical and econometric models should be enhanced to account for this fact.

Acknowledgments

This paper does not reflect the views of The College Board. All errors are my own.

REFERENCES

Alter, Molly, and Randall Reback. 2014. True for your school? How changing reputations alter demand for selective U.S. colleges. Educational Evaluation and Policy Analysis 36(3): 346-370. doi:10.3102/0162373713517934.
Arcidiacono, Peter. 2005. Affirmative action in higher education: How do admission and financial aid rules affect future earnings? Econometrica 73(5): 1477-1524. doi:10.1111/j.1468-0262.2005.00627.x.
Avery, Christopher, Jessica Howell, and Lindsay Page. 2014. A review of the role of college counseling, coaching, and mentoring on students' postsecondary outcomes. The College Board Research Brief.
Black, Sandra, Kalena Cortes, and Jane Lincove. 2015. Apply yourself: Racial and ethnic differences in college applications. NBER Working Paper No. 21368.
Bond, Timothy, George Bulman, Xiao Li, and Jonathan Smith. 2018. Updating human capital decisions: Evidence from SAT score shocks and college applications. Journal of Labor Economics 36(3): 807-839.
Bowman, Nicholas, and Michael Bastedo. 2009. Getting on the front page: Organizational reputation, status signals, and the impact of U.S. News and World Report on student decisions. Research in Higher Education 50(5): 415-436. doi:10.1007/s11162-009-9129-8.
Card, David, and Alan Krueger. 2005. Would the elimination of Affirmative Action affect highly qualified minority applicants? Evidence from California and Texas. Industrial & Labor Relations Review 58(3): 414-434. doi:10.1177/001979390505800306.
Carrell, S., and B. Sacerdote. 2017. Why do college-going interventions work? American Economic Journal: Applied Economics 9(3): 124-151.
Chade, Hector, Gregory Lewis, and Lones Smith. 2014. Student portfolios and the college admissions problem. Review of Economic Studies 81(3): 971-1002. doi:10.1093/restud/rdu003.
Chade, Hector, and Lones Smith. 2006. Simultaneous search. Econometrica 74(5): 1293-1307. doi:10.1111/j.1468-0262.2006.00705.x.
Conlin, Michael, Stacy Dickert-Conlin, and Gabrielle Chapman. 2013. Voluntary disclosure and the strategic behavior of colleges. Journal of Economic Behavior & Organization 96: 48-64. doi:10.1016/j.jebo.2013.09.007.
DesJardins, Stephen, Halil Dundar, and Darwin Hendel. 1999. Modeling the college application decision process in a land-grant university. Economics of Education Review 18: 117-132. doi:10.1016/S0272-7757(98)00023-5.
Dillon, Eleanor, and Jeffrey Smith. 2017. The determinants of mismatch between student ability and colleges. Journal of Labor Economics 35(1): 44-66. doi:10.1086/687523.
Dynarski, Susan, Steven Hemelt, and Joshua Hyman. 2015. The missing manual: Using National Student Clearinghouse data to track postsecondary outcomes. Educational Evaluation and Policy Analysis 37: 53S-79S. doi:10.3102/0162373715576078.
Epple, Dennis, Richard Romano, and Holger Sieg. 2006. Admission, tuition, and financial aid policies in the market for higher education. Econometrica 74(4): 885-928. doi:10.1111/j.1468-0262.2006.00690.x.
Fu, Chao. 2014. Equilibrium tuition, applications, admissions and enrollment in the college market. Journal of Political Economy 122(2): 225-281. doi:10.1086/675503.
Griffith, Amanda, and Donna Rothstein. 2009. Can't get here from there: The decision to apply to a selective institution. Economics of Education Review 28(5): 620-628. doi:10.1016/j.econedurev.2009.01.004.
Hossler, Don, and K. Gallagher. 1987. Studying student college choice: A three-phase model and the implications for policymakers. College and University 62(3): 207-221.
Howell, Jessica. 2010. Assessing the impact of eliminating Affirmative Action in higher education. Journal of Labor Economics 28(1): 113-166. doi:10.1086/648415.
Hoxby, Caroline, and Christopher Avery. 2012. The missing "one-offs": The hidden supply of high-achieving, low income students. NBER Working Paper No. 18586.
Hoxby, Caroline, and Sarah Turner. 2013. Expanding college opportunities for high-achieving, low-income students. SIEPR Discussion Paper No. 12-014.
Hurwitz, Michael, Preeya Mbekeani, Margaret Nipson, and Lindsay Page. 2017. Surprising ripple effects: How changing the SAT score sending policy for low-income students impacts college access and success. Educational Evaluation and Policy Analysis 39(1): 77-103. doi:10.3102/0162373716665198.
Klasik, Daniel. 2012. The college application gauntlet: A systematic analysis of the steps to four-year college enrollment. Research in Higher Education 53(5): 506-549. doi:10.1007/s11162-011-9242-3.
Long, Bridget T. 2004. How have college decisions changed over time? An application of the conditional logistic choice model. Journal of Econometrics 121: 271-296. doi:10.1016/j.jeconom.2003.10.004.
Long, Mark. 2004. College applications and the effect of affirmative action. Journal of Econometrics 121(1-2): 319-342. doi:10.1016/j.jeconom.2003.10.001.
Luca, Michael, and Jonathan Smith. 2013. Salience in quality disclosure: Evidence from U.S. News college rankings. Journal of Economics & Management Strategy 22(1): 58-77. doi:10.1111/jems.12003.
Manski, Charles, and David Wise. 1983. College choice in America. Cambridge, MA: Harvard University Press. doi:10.4159/harvard.9780674422285.
Monks, James, and Ronald Ehrenberg. 1999. U.S. News and World Report's college rankings: Why do they matter? Change 31(6): 42-51. doi:10.1080/00091389909604232.
Page, Lindsay, and Judith Scott-Clayton. 2015. Improving college access in the United States: Barriers and policy responses. NBER Working Paper No. 21781.
Pallais, Amanda. 2015. Small differences that matter: Mistakes in applying to college. Journal of Labor Economics 33(2): 493-520. doi:10.1086/678520.
Pope, Devin, and Jaren Pope. 2009. The impact of college sports success on the quantity and quality of student applications. Southern Economic Journal 75(3): 750-780.
Radford, Alexandria W. 2013. Top student, top school? How social class shapes where valedictorians go to college. Chicago: University of Chicago Press. doi:10.7208/chicago/9780226041148.001.0001.
Smith, Jonathan, Matea Pender, and Jessica Howell. 2013. The full extent of academic undermatch. Economics of Education Review 32: 247-261. doi:10.1016/j.econedurev.2012.11.001.
Toutkoushian, Robert. 2001. Do parental income and educational attainment affect initial choices of New Hampshire's college-bound students? Economics of Education Review 20(3): 245-262. doi:10.1016/S0272-7757(99)00052-7.
Weiler, William. 1994. Transition from consideration of a college to the decision to apply. Research in Higher Education 35(6): 631-646. doi:10.1007/BF02497079.

Notes

1. 

These simplifying assumptions are for justifiable reasons related to data limitations and model tractability. For more examples, see Manski and Wise (1983), Arcidiacono (2005), Epple, Romano, and Sieg (2006), Chade and Smith (2006), Howell (2010), Chade, Lewis, and Smith (2014), and Fu (2014).

2. 

There are also numerous organizations that aim to serve this population, including the Jack Kent Cooke Foundation and the College Board's Access 2 Opportunity program, to name two.

3. 

Through personal communication with A. Orzech and S. Minicucci about analyses most similar to this one, they find that 72.4 percent of score sends convert into applications. They have the advantage of seeing the entire set of score sends and applications to a set of colleges but the disadvantage of not seeing students' complete set of score sends (or timing).

4. 

The two models make assumptions about whether each application decision is independent of one another or whether they are jointly determined. I discuss the weaknesses of each model but they provide the benefit of tackling one problem with two different estimation strategies.

5. 

Most of my papers fall under this category, as do Toutkoushian (2001), M. Long (2004), Card and Krueger (2005), Pallais (2015), and Hurwitz et al. (2017).

6. 

There is a writing section out of 800 points as well, but the writing section is now optional for students. I use math and critical reading for better comparability with older cohorts and future cohorts.

7. 

Score sends are infrequently sent to non-postsecondary institutions, such as scholarship organizations (e.g., National Merit) and athletic programs (e.g., NCAA). These score sends, which make up 6 percent of the total, are excluded from all analyses.

8. 

Hurwitz et al. (2017) show that this policy increases the number of score sends and improves enrollment and completion rates.

9. 

Free score sends are often requested at the time of registration but fulfilled several months later, after the SAT is taken and scored. I rely on the request date, not fulfill date.

10. 

Formally, the students had to either score a 125 on their PSAT (maximum of scores across attempts) or a 1250 on their SAT (using the sum of the maximum scores on each section).

11. 

Exceeding ten applications was rare and I test the sensitivity of my results to not having complete application data.

12. 

The exact percentage of students invited to participate in the survey was lost, as was the record of exactly which students were invited.

13. 

All Appendix tables are available in a separate online appendix that can be accessed on Education Finance and Policy's Web site at www.mitpressjournals.org/doi/suppl/10.1162/edfp_a_00235.

14. 

The variables are from the 2013 collection year, change very little from year to year, and are similar to what students would see when considering an application.

15. 

Alternatively, Z_it could include all colleges and attributes but assign zero weight (or infinitely negative utility) to the attributes or colleges of which the students are unaware, ensuring that those colleges will never be chosen.

16. 

Students can down-weight a college such that it is never going to become an application but I still assume it is in the choice set, just with less or even zero weight.

17. 

The utility from an application can be thought of as the expected utility, net of the probability of admission.

18. 

Applications likely do not have a constant marginal cost. In addition, there likely exist student-college specific application costs. I assume constant marginal costs for model simplicity and data limitations.

19. 

Logit models, as opposed to OLS, and conditional logit models, as opposed to student fixed effects, produce qualitatively similar results. Hoxby and Avery (2012) and Black, Cortes, and Lincove (2015) use the conditional logit.

20. 

Another consequence of the fixed effects model is that there is no within-student variation to exploit for the approximately 20 percent of students who send applications to all the colleges to which they send scores. These students typically send scores to very few colleges.

21. 

A similar strategy of simulating portfolios is used in Arcidiacono (2005).

22. 

This requires that one such score send exists. I exclude the small number of students who converted all score sends into applications.

23. 

Conditional logit models require a different assumption on the error term, produce qualitatively similar results, and are shown in online Appendix table A.5.

24. 

Note that we never observe P_it, only P_iT.

25. 

There are also dummies in the rare event that these variables are missing.

26. 

Early action and early decision enrollment skews heavily toward wealthy students.

27. 

A student can choose to apply to more than one college in the consideration set so the conditional logit is inappropriate for model 1.

28. 

The coefficients are presented as odds ratios; therefore, a coefficient larger than one indicates an increase in the probability of a score send converting to an application, and a coefficient less than one indicates a decrease.

29. 

The table is available in the online Appendix. The results are for model 2 but model 1 yields qualitatively similar results.

30. 

In results not shown, controlling for acceptance rate adds little to the model. The coefficient estimate on acceptance rate is zero and all other coefficients are unchanged.
