Abstract

To complement the existing literature on expanding access to high-quality college education, we investigate the understudied topic of university outreach. Using a field experiment at West Point, we estimate the effectiveness of four university admissions outreach efforts (an Admissions Office phone call, application encouragement from a role model, a recruiting visit by a university staff member, and an invitation to visit campus) on the probabilities of initiating an application and matriculating. Three of the four methods increase applications relative to the control group (a mass e-mail solicitation); the evidence on matriculation is positive but only suggestive. We observe a few differences in the relative effectiveness of the methods. We complete a simple cost-effectiveness analysis that suggests the admissions call is the preferred method. This evidence should inform researchers, policy makers, and institutions on optimal outreach efforts and program evaluation in student recruitment.

1.  Introduction

Given the returns to education and its potential to enable social mobility, economists and policy makers have extensively studied mechanisms for improving college access. But whereas expanding and enhancing access to quality education is one of the most visible and widely studied areas of public policy, decades of aid and other public policies have failed to eliminate the gap in college-going between low- and high-income students (Bettinger et al. 2012; Castleman, Page, and Schooley 2014). As a result, there remains significant interest in identifying barriers to and suitable methods for expanding college access, especially for academically qualified students (see, e.g., Bettinger et al. 2012; Hoxby and Turner 2013; Castleman, Owen, and Page 2016). Hoxby and Avery (2013) specifically highlight the importance of providing better information to low-income, high-achieving students.

Beyond simply increasing qualified applicants, policy and institutional efforts have also sought to expand college access to diverse groups that include minorities, first-generation students, and students from low socioeconomic status (SES) backgrounds. Alger et al. (2000) describe institutional motivations for seeking diverse student bodies, and Gudeman (2000) details the priority that elite liberal arts colleges in particular place on diversity and tolerance. In addition to traditional diversity efforts focused on gender and race/ethnicity, modern institutions are increasingly trying to attract other groups, including veterans and first-generation students (Williams 2013). To date, however, there is little evidence on the effectiveness of these efforts.

Together these studies suggest the need for continued assessment of college outreach efforts. Existing research demonstrates the impact of high school programs that require college entrance test-taking (Goodman 2013), host entrance exams (Bulman 2015), provide application assistance (Carrell and Sacerdote 2017), and provide mentoring and other enrollment support (Castleman and Page 2015; Castleman, Page, and Schooley 2014).1

Given their interests in generating diverse and capable student bodies and their unique institutional knowledge of requirements and opportunities, colleges and universities themselves might represent another important channel for advancing access to higher education. Despite these interests and advantages, there is surprisingly little experimental and empirical research on outreach by colleges and universities. This research aims to improve our understanding of these institutional outreach efforts.

We conducted a field experiment at the United States Military Academy (hereafter, West Point) to systematically improve its recruitment of college-ready individuals with recent active duty military experience. We then evaluated the effectiveness of four different university admissions outreach efforts: (1) Admissions Office phone call, (2) application encouragement from a role model, (3) recruiting visit by a university staff member, and (4) an invitation to visit campus. We assessed the probabilities of initiating an application and matriculating to West Point. To leverage existing research, all of the treatments provided more information about the university (Hoxby and Avery 2013), and they provided encouragement and reassurance about competitiveness to increase social belonging (Walton and Cohen 2007, 2011). As we will discuss, the treatments differed in their primary economic and social–psychological motivations.

Three of the four methods (the Admissions Office call, the outreach visit, and the campus visit invitation) outperformed the control group (a mass e-mail solicitation) in generating applications. The fourth method (the unit commander office call, our role-model encouragement treatment) generated positive but statistically insignificant gains. Together, the outreach methods increased matriculations, but their separate effects are measured imprecisely. We observe some differences between the methods with respect to initiating applications but not matriculations. We also complete a simple cost-effectiveness analysis that suggests the best method may be a phone call by university admissions officers.

Our work expands the existing college outreach literature in several ways. We separately analyze multiple proactive institutional outreach strategies that attempt to inform and motivate prospective students to both apply and matriculate. This differs from Castleman, Owen, and Page (2016), who analyze the effectiveness of a suite of outreach efforts by the University of New Mexico. They find that the college-based outreach effort improved the matriculation of an underrepresented group (i.e., Hispanic males) relative to a control group and a high school outreach effort. Our setting is more similar to Andrews, Imberman, and Lovenheim (2016), who analyze the unique outreach and support programs by Texas’ flagship universities. Using quasi-experimental methods, they find significant improvements in matriculation and graduation for the University of Texas at Austin program but not the Texas A&M University program. Our results also provide broader geographic evidence because the recruitment applied to students nationwide.

In addition, our setting enables us to study the effectiveness of various recruitment strategies for academically qualified students in one population of interest—individuals with previous military experience (basically, veterans), who likely increase the diversity in a typical student body. Junior enlisted service members in the Army, the focus of our study, represent a population of college applicants with potential to increase a university's economic and racial diversity. Although we lack direct data on the family income of our sample, there is reason to believe that a sizeable portion comes from low SES backgrounds—almost 30 percent of recent military enlistees come from the bottom two quintiles of neighborhood income (Watkins and Sherk 2008), and a substantial body of research (e.g., Cooper 1977; Eitelberg et al. 1984; Orvis and Gahart 1990) documents the negative relationship between SES and the probability of enlisting in the military. In addition, approximately 34 percent of Army enlistees identify as a minority, higher than the 27 percent representation in the Army officer corps (DOD 2014a) and close to the national average of 38 percent for post-secondary enrollment (Snyder, de Brey, and Dillow 2016, chapter 3).

However, because we study the question of expanding access to higher education at an especially unique undergraduate institution, our external validity has limits. West Point's uniqueness derives from a requirement to serve in the Army after graduation, a special admissions process, different financial considerations (no financial aid is required), federal management, and a national focus. Still, given the relative dearth of experimental research in this area and our rich administrative data, the study provides initial evidence on a number of potential institutional outreach methods that could expand college access for underrepresented groups.

Our paper proceeds as follows. In section 2 we review the institutional setting. Section 3 describes the experimental design and section 4 analyzes its validity. We present our results in section 5. Section 6 provides a cost-effectiveness analysis and section 7 discusses our findings. We summarize our results and potential applications in section 8.

2.  Institutional Setting

Each year West Point matriculates approximately 1,200 freshmen or “Plebes” as they are locally known. The Academy's positive reputation and high college rankings produce enough competitive applicants to earn a “most competitive” ranking in Barron's 2009 Profile of American Colleges. Yet despite its overall attractiveness, West Point faces significant challenges in recruiting individuals from selected groups, notably currently serving enlisted service members and minorities. Congress authorizes eighty-five admissions to active component soldiers and an additional eighty-five seats to Army Reserve and Army National Guard personnel each year. The purpose of these authorizations is to provide opportunities for high-potential soldiers to serve as officers and to increase the diversity of the student body in terms of military experience.

Military service academies are somewhat unique from other colleges in that they are fully funded by the federal government. In exchange for tuition, living expenses, and a small stipend, all graduates are required to serve in the active duty military upon graduation.2 Applicants to West Point must be legally eligible and receive a political nomination before they can receive an offer of admission. Legal eligibility requires that candidates must be U.S. citizens, have no dependents (spouse or children), and be under the age of 23 on the day they report for basic training. The majority of accepted candidates receive a nomination from their congressional representative or senator, but federal law provides the Secretary of the Army with 170 nominations (eighty-five for the active component and eighty-five for the National Guard/Reserve) per year for currently serving soldiers.3

Soldiers are not exempt from meeting other Academy qualification standards, but they are not required to compete against other candidates for the limited number of congressional nominations within their district. This advantage is substantial given that many highly qualified candidates do not receive offers of admission due to the inability to secure a nomination. Candidates who receive a nomination are evaluated based on their academic aptitude, leadership potential, medical screenings, and performance on a physical fitness assessment. Furthermore, West Point's Admissions Board heavily weights the character of enlisted service of an applicant when making admissions offers.4

Enlisted soldiers are an important demographic group for the academy and its Admissions Office to recruit and retain. However, in recent years (since at least 2010), West Point has consistently failed to attract enough qualified candidates to fill its seats reserved for soldiers.5 Although we cannot be sure, we suspect that many of the barriers that reduce college applications for youth nationwide (e.g., low SES, lack of good information, social network effects, and complex application processes) may also contribute to the consistently low application rates of qualified soldiers to West Point. A second factor may be that highly qualified soldiers have equal or better options at civilian institutions, especially given access to the post-9/11 GI Bill, though this seems unlikely given existing data.6 A final factor discouraging application may be the continued military service requirements of attending West Point. Taken together, we conclude that some but not all of the challenges faced by applicants and by the institution are similar to those of broader interest to education researchers, policy makers, and practitioners (see Hoxby 2004 and College Board 2011 for summaries).

For the cohort entering in the summer of 2014, only thirty-seven qualified active duty soldiers received an offer of admission despite more than 60,000 eligible soldiers in the active duty Army.7 Prior to this study, West Point's Directorate of Admissions tried different outreach methods with limited success and no systematic evaluation plan. These efforts generally included a combination of broadly distributed public announcements to the Army encouraging eligible soldiers to apply, Admissions Officer briefings to general audiences of soldiers at various Army bases, and different types of mass mailings (typically in e-mail). Two recent efforts show more promise in enabling more targeted recruiting. The first method concentrated its efforts on eligible candidates who have high test scores on the Armed Services Vocational Aptitude Battery General Technical (ASVAB-GT) test (cutoffs of 115–120, approximately the upper quartile).8 This screening process identified individuals with high academic aptitude and reduced the roughly 60,000 eligible soldiers to about 18,000 soldiers, an important reduction but one that still makes targeting difficult. Second, in 2013–14, West Point's Admissions Office worked with the Office of Economic and Manpower Analysis to obtain SAT scores for current soldiers to further narrow this population of high potential soldiers to those with the highest likelihood of receiving an offer of admission. Although these methods improved the Admissions Office targeting and encouraged some candidates to apply to West Point, significant barriers remained.

3.  Experimental Design

In our review of existing research, we found no experimental or empirical evaluations of different college outreach programs, college marketing efforts, or specific institutional program evaluations. Although we suspect that many institutions utilize data (e.g., SAT scores and demographic characteristics) to identify potential applicants, we are unaware of any published studies on the most effective targeting or outreach methods. We are unable to explain this lack of research, but hypothesize that institutions may be unable or unwilling to conduct deliberate evaluations; or they may have completed evaluations they hold privately, viewing the results as a source of comparative advantage in a competitive market for high-quality college students.

To improve West Point's recruiting of eligible enlisted service members, we conducted a field experiment in conjunction with the Admissions Office. After combining Army administrative data with College Board data on individual SAT scores from 2012 to 2014 to identify potential candidates, we designed multiple treatment arms based on economic, social, and psychological research (that we describe in the following sections) and institutional priorities. We evaluated the effectiveness of each outreach method using Admissions Office data on initiated applications and matriculations to the Academy (or its preparatory school) in the subsequent year.9 Finally, we combined the admissions data with estimated costs to complete a cost-effectiveness analysis.

Outreach Methods

Our intervention consisted of a control group and four treatment groups. The control group received the typical Admissions Office e-mail. The e-mail identified the opportunity to attend West Point, described the application process, and provided contact information for the Admissions Office. All treatment group individuals also received this e-mail. For each treatment arm described, we provided standardized guidance (e.g., scripts for Admissions Officer phone calls, talking points for unit commanders) for the interactions with soldiers, but we recognize that there is some heterogeneity in the implementation of each treatment by individual. Our estimates reflect the average of these differences.

The first and least resource-intensive treatment consisted of a phone call from a West Point Admissions representative. Chapman (1981) provides a theoretical model of student choice that hypothesizes a role for direct institutional efforts, but provides no evidence on its relative importance. However, existing research (Dynarski and Scott-Clayton 2006; Bettinger et al. 2012; Hoxby and Turner 2013) documents the deleterious effects of poor information about schools, eligibility, costs, and application processes on educational attainment. This method was designed to help potential applicants directly and efficiently overcome a lack of information about West Point. It is most analogous to an unsolicited contact from colleges to potential applicants. Admissions Office personnel scheduled times to talk with soldiers via e-mail and rescheduled if required. During the calls with candidates, the Admissions Officer described West Point, compared this opportunity to other college options, and discussed the prospect of becoming an officer in the Army. The Admissions Officer reviewed the soldier's eligibility and competitiveness given his or her existing SAT scores, and encouraged the soldier to complete his or her application. The content of each conversation differed as the Admissions Officer answered questions and discussed different aspects of the opportunity based on the soldier's level of interest.

The second outreach method utilized an office call with a soldier's battalion commander.10 This method aims to evaluate the impact of using role models in the institutional recruiting effort. Leadership theory (Bass 1990) and empirical evidence (Howell and Hall-Merenda 1999) suggest that spatial proximity in work and social relationships may facilitate relationship development and communication between leaders and subordinates. A variety of studies (Cowart 1988; Haveman and Wolfe 1995; Ainsworth 2002; Bettinger and Long 2005) document the importance of role models (e.g., parents, friends at college, high school counselors) in students’ college decisions. Additionally, previous military research establishes that high-ability military battalion commanders (the same level we exploit) have strong positive effects on those they mentor (Lyle and Smith 2014). As a result, we hypothesize that military leaders might have a large influence on soldiers’ decisions to apply for college. For the junior soldiers in our sample, battalion commanders are probably the most senior military leaders they may know. The closest corollary for many traditional college students would be a meeting with an authority figure such as a high school principal, superintendent, or perhaps an employer. We also note that the seniority of the officers involved could generate a chilling effect on the individual's willingness to ask questions during the meeting (thus lowering the probability of applying) or cause the individual to feel compelled to apply (increasing the probability of application but perhaps not matriculation). Admissions staff members prepared the battalion commanders for the office call by describing the motivation for the meeting, providing talking points about West Point and the application process, and answering commanders’ questions. Each commander was free to arrange a meeting with the junior soldier to discuss the opportunity at his or her convenience. Given normal military unit procedures, this likely occurred by the commander informing staff and/or subordinate leaders to arrange a time with the candidate soldier. The staff would likely have informed the soldier about the purpose of the meeting and the soldier would have then reported to the commander for the office call and discussion. We expect the content of each discussion varied based on the individuals.

In the third outreach method, a senior enlisted military staff member from West Point visited the soldier at her base. Descriptive evidence (Cowart 1988) suggests that visiting college representatives influence students’ decisions. The visiting non-commissioned officers (NCOs) contacted a soldier and arranged a meeting to discuss applying and attending West Point. As with the commander contact, this method evaluates the effects of role models, though it is more analogous to college representatives who visit high school students. The NCOs are closer in rank to the potential applicants, and so they may be able to facilitate a more open dialog about the application opportunity than the officers used in the first two methods, thereby increasing the likelihood of applying. The NCOs may also be viewed as more relatable role models, since they have pursued and succeeded in the same career path as the soldiers (enlisting in the Army). Finally, given their current assignment at West Point, they can also credibly discuss student and campus life. This method included visiting fourteen candidates at ten different Army bases nationwide (e.g., Fort Bliss near El Paso, Texas, and the Presidio near Monterey, California).11

The fourth outreach method offered individuals a free campus visit that included meeting with the Admissions Office, attending classes and activities with current cadets, staying in the cadet barracks (dorms), and touring the campus. This method sought to expose soldiers to students at West Point, whom they were likely to view as peers, and to expose them to Academy officials. Chapman (1981) hypothesizes, and empirical work (Cowart 1988; Zimmer and Toma 2000; Fryer and Austen-Smith 2007) suggests, an important role for peers in students’ college decisions. There is also direct evidence of the importance of peers in military settings including their effects on academic major selection at West Point and retention in the Army (Lyle 2007), undergraduate academic performance at West Point (Lyle 2009), and select financial decisions (Lieber and Skimmyhorn 2018). This method is most comparable to typical campus visits scheduled by colleges and universities nationwide, though in this case the visit was free for candidates. Unlike the other three treatment groups, the Admissions Office required the screened candidates to initiate their application prior to scheduling their trip to West Point. The estimated effects here thus reflect the offer of a visit and not necessarily the visit itself.

Implementation

Of the more than 8,000 eligible active duty soldiers who took the SAT, we limit our sample to 225 candidates with SAT scores that were competitive with typical West Point applicants (our sample mean and median for the SAT total are 1273 and 1300, respectively) based on Admissions Office guidance. Using a randomized block design at the gender-ethnic/racial group level, we assigned these 225 individuals to one of five groups: four treatment groups and a control group. Table 1 provides data on the assignments and outcomes by group.

Table 1.
Experimental Assignments and Outcomes

                           Control   Academy      Unit        Academy    Offer of   Full
                           Group     Admissions   Commander   Outreach   Campus     Sample
                                     Call         Contact     Visit      Visit
                           (1)       (2)          (3)         (4)        (5)        (6)
Number assigned       N    42        46           46          47         44         225
Passed screening      N    N/A       30           29          26         30         115
                      %    N/A       65           63          55         68         51
Opened an application N    1         12           3           9          10         35
                      %    2         26           7           19         23         16
Matriculated          N    1         5            2           2          3          13
                      %    2         11           4           4          7          6

Notes: Author compilations using Department of Defense (DOD) data. The percentages reported in the table use all members assigned to each group (Number Assigned row) as the denominator and not the number of candidates who passed the screening. Control group members were not screened as part of the pilot program and this results in our main analyses being intent-to-treat estimates.
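To make the blocked design concrete, the following minimal sketch illustrates the kind of gender-by-race/ethnicity blocked random assignment described above. The DataFrame and column names (female, race_ethnicity) are hypothetical placeholders; the Admissions Office's actual assignment procedure is not part of our data.

```python
# Minimal sketch of blocked random assignment to the five experimental
# groups. The DataFrame and its columns are hypothetical placeholders,
# not the actual administrative data.
import numpy as np
import pandas as pd

GROUPS = ["control", "admissions_call", "commander_contact",
          "outreach_visit", "campus_visit"]

def assign_blocked(df: pd.DataFrame, seed: int = 2014) -> pd.DataFrame:
    """Assign candidates to groups within gender x race/ethnicity blocks."""
    rng = np.random.default_rng(seed)
    df = df.copy()
    df["group"] = None
    for _, block in df.groupby(["female", "race_ethnicity"]):
        shuffled = rng.permutation(block.index.to_numpy())   # shuffle within block
        for i, row_id in enumerate(shuffled):
            df.loc[row_id, "group"] = GROUPS[i % len(GROUPS)]  # round-robin fill
    return df
```

Round-robin assignment over a shuffled within-block order keeps group sizes nearly equal within each block, which is consistent with the similar group counts in table 1.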

We did not complete formal power calculations ex ante and instead relied on common (but fallible) rules of thumb (i.e., at least N = 30 individuals in each group [Brock and Vasilaky 2016; List, Sadoff, and Wagner 2011]) in determining the group sizes. Although a larger sample was desired, limited institutional resources available to support the screening process and outreach methods governed the initial sample size. At the same time, the institution wanted to test multiple outreach strategies. Both factors worked against executing an ideally powered study and we discuss this in detail later.12

Admissions staff at the university conducted a screening process for eligible candidates in each of the four experimental groups prior to the administration of any treatment. The screening served to validate the candidate's demographic eligibility (e.g., ensure they were not married or too old), obtain the best contact information for the candidate, and gather recommendations from the candidate's supervisors as to their potential for success at college and as a leader in the Army (e.g., are they good performers, have they had any disciplinary issues). These recommendations are a critical part of the admissions committee review of a soldier's application. Screening took place via phone calls or e-mails to a candidate's company-level supervisor.13 The table 1 results reveal that screening success rates are comparable across groups, as expected with random assignment.14 After screening, staff members contacted the eligible individuals based on their assigned outreach method. In order to prioritize the pilot program's efforts and avoid generating applications with little chance of admission, individuals who did not pass the screening process were not contacted outside of the initial mass e-mail (similar to the control group).

Ideally, the screening results could be used to compute treatment-on-the-treated (TOT) estimates. However, in this case, individuals assigned to the control group were not subject to the screening process. Program administrators made this decision in order to maintain support for the admissions process among field unit commanders. Asking commanders and supervisors to spend time and effort with an Admissions Officer working through a screening process and then assigning the screened candidate to a control group might undermine support for the West Point efforts in future years. Such practical challenges are common in experimental policy work (see Cook 2002 for a discussion).

As a result, we rely on the random assignment of individuals to groups and we instead complete intent-to-treat (ITT) estimates for all members of the sample. Table 1 provides the actual success rates for applications and matriculations by treatment group using this method. The percentages throughout the table, and in our remaining analysis, rely on the total number of individuals assigned to each group as the denominator. So, for example, when we analyze the number of applications initiated for the Academy Admissions Call (column 2), we divide the number who opened an application (12) by the number assigned to the group (46) for a success rate of 26 percent. Similarly, when we analyze the matriculation rate for this group, we divide the number who matriculated in their freshman year (5) by the number in the group (46) for a success rate of 11 percent. We discuss the implications of using ITT estimates in the final section of the paper.
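To make the ITT bookkeeping concrete, a few lines reproduce the table 1 rates. The counts below are those reported or implied in table 1 (several single-digit counts are recovered from the reported percentages and the cost figures in table 4); the short group labels are ours.

```python
# ITT rates from Table 1: the denominator is everyone assigned to a group,
# not just those who passed screening. Counts follow Table 1.
assigned     = {"control": 42, "call": 46, "commander": 46, "visit": 47, "campus": 44}
applied      = {"control": 1,  "call": 12, "commander": 3,  "visit": 9,  "campus": 10}
matriculated = {"control": 1,  "call": 5,  "commander": 2,  "visit": 2,  "campus": 3}

for g in assigned:
    print(f"{g:>9}: applied {applied[g]/assigned[g]:5.0%}, "
          f"matriculated {matriculated[g]/assigned[g]:5.0%}")
# e.g., call: applied 26%, matriculated 11%, matching the rates in the text
```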

4.  Experiment Validity

In table 2 we provide summary statistics for our sample by their assigned treatment group. The typical sample member is 21 years old, has served in the Army for slightly longer than one year, is male, and is about equally likely to be white or a racial/ethnic minority. As a result of the underlying demographics of the Army, women are underrepresented when compared with the typical college-aged population. In addition, African American soldiers are more likely to be in our sample than in the Army given our oversampling of this group (26 percent of our sample compared with 22 percent of the eligible population).15 Sample members have relatively high SAT scores, with means of 641 and 632 on the critical reading and math sections, respectively. These individuals are comparable to West Point's most recent class (entered in the summer of 2014 and will graduate in 2018), which had mean scores of 628 and 642, respectively. More importantly, the sample compares very favorably in its average test scores relative to the soldiers offered admission in 2014, who had mean scores of 591 and 593.

Table 2.
Experimental Group Summary Statistics

Columns: (1) Control, mean (SD); (2) Admissions Phone Call, mean (SD); (3) diff. (2) − (1) [p-value]; (4) Unit Commander Contact, mean (SD); (5) diff. (4) − (1) [p-value]; (6) Academy Outreach Visit, mean (SD); (7) diff. (6) − (1) [p-value]; (8) Offer of Campus Visit, mean (SD); (9) diff. (8) − (1) [p-value]; (10) joint test of equality, p-value.

Panel A. Individual Characteristics
                    (1)       (2)       (3)       (4)       (5)       (6)       (7)       (8)       (9)       (10)
Age                 21.14     21.00     −0.14     21.11     −0.03     21.15     0.01      21.14     −0.01     0.9152
                    (0.75)    (0.84)    [0.40]    (0.77)    [0.83]    (0.86)    [0.97]    (0.77)    [0.97]
Female              0.12      0.13      0.01      0.11      −0.01     0.13      0.01      0.14      0.02      0.9946
                    (0.33)    (0.34)    [0.87]    (0.31)    [0.88]    (0.34)    [0.90]    (0.35)    [0.81]
Black               0.33      0.33      −0.01     0.33      −0.01     0.30      −0.04     0.32      −0.02     0.9974
                    (0.48)    (0.47)    [0.94]    (0.47)    [0.94]    (0.46)    [0.72]    (0.47)    [0.88]
Hispanic            0.12      0.15      0.03      0.15      0.03      0.11      −0.01     0.11      −0.01     0.9150
                    (0.33)    (0.36)    [0.65]    (0.36)    [0.65]    (0.31)    [0.85]    (0.32)    [0.94]
Other race          0.10      0.09      −0.01     0.09      −0.01     0.06      −0.03     0.09      0.00      0.9661
                    (0.30)    (0.28)    [0.89]    (0.28)    [0.89]    (0.25)    [0.59]    (0.29)    [0.95]
White               0.45      0.43      −0.02     0.43      −0.02     0.53      0.08      0.48      0.02      0.8786
                    (0.50)    (0.50)    [0.87]    (0.50)    [0.87]    (0.50)    [0.46]    (0.51)    [0.82]
Years of service    1.26      1.13      −0.13     1.17      −0.09     1.13      −0.13     1.14      −0.13     0.9273
                    (0.77)    (0.65)    [0.39]    (0.49)    [0.53]    (0.54)    [0.35]    (0.55)    [0.39]
ASVAB-GT score      126       124       −1.73     125       −0.52     125       −1.06     128       1.59      0.7572
                    (12.53)   (14.25)   [0.55]    (12.00)   [0.84]    (13.50)   [0.70]    (10.87)   [0.53]
SAT math score      633       628       −4.86     630       −3.77     631       −2.27     637       3.94      0.9859
                    (70.42)   (74.98)   [0.75]    (79.69)   [0.81]    (69.91)   [0.88]    (74.22)   [0.80]
SAT verbal score    650       642       −8.02     634       −16.07    641       −8.49     638       −12.03    0.8885
                    (74.19)   (76.11)   [0.62]    (78.67)   [0.33]    (67.10)   [0.57]    (72.91)   [0.45]
Observations        42        46                  46                  47                  44

Panel B. Covariate Regression Results
                                             (2)       (4)       (6)       (8)
R2 for indiv. characteristics                0.0209    0.0316    0.0461    0.0504
p-value for F-test of joint significance     0.6656    0.4347    0.2260    0.2727
Observations (control + outreach group)      88        88        89        86

Notes: Department of Defense data. N = 225. The table presents summary statistics using administrative data. Standard deviations of each variable appear in parentheses and p-values for t-tests of the differences in means appear in brackets. We describe the outreach groups (columns 2, 4, 6, and 8) in section 3. In panel B, the partial R2 and p-values at the bottom of the columns report the results from a regression of an indicator variable for each treatment group (relative to the control group) on the covariates in the rows. In each case, the observable characteristics are unrelated to the assigned outreach method (at the 5% level). Logistic regressions yield the same results. The final column depicts the p-values for a joint F-test of coefficient equality after completing a regression of the characteristic in each row on indicators for each of the five experimental groups. The p-values suggest that the coefficients are equal and therefore the characteristic is equal across groups. In all regressions we cluster the standard errors at the installation level (robust standard errors yield very similar results). ASVAB-GT = Armed Services Vocational Aptitude Battery General Technical.

To assess the extent to which randomization balanced groups with respect to observable characteristics, we return to table 2. First, in panel A, using t-tests we compare the control group (column 1) to the treatment groups (columns 2, 4, 6, and 8) by each of our individual characteristics. We report the mean differences, standard deviations (in parentheses), and p-values for these tests (in brackets). Encouragingly, none of the treatment groups differs from the control group on any of the individual characteristics, suggesting valid random assignment. Second, using regression estimates we report joint tests of equality for all five groups for each individual characteristic (column 10). In all cases we fail to reject the null hypothesis of equality across groups.

Third, in panel B we estimate the joint effects of the group differences in predicting treatment. To do so we regress a treatment indicator on all of the individual characteristics and then conduct a joint test of significance for all of these characteristics. The results show that for all four outreach methods (columns 2, 4, 6 and 8), the individual characteristics are jointly unrelated to treatment, further reassuring us of balance across assigned treatment groups. Finally, in tables 3 and 4, we also provide our main estimates with and without individual characteristics to demonstrate the stability of our results and the validity of our experimental design.
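The panel B checks can be reproduced with standard tools. The sketch below, with illustrative variable names rather than the authors' actual code, regresses a treatment indicator on the covariates and jointly tests their significance, clustering at the installation level as in the paper.

```python
# Balance check in the spirit of Table 2, panel B: regress an indicator for
# one treatment group (vs. control) on all observables, then jointly test
# that the covariate coefficients are zero. Variable names are illustrative.
import statsmodels.formula.api as smf

COVARS = ["age", "female", "black", "hispanic", "other_race",
          "years_service", "asvab_gt", "sat_math", "sat_verbal"]

def balance_test(df, treat_group):
    sub = df[df["group"].isin(["control", treat_group])].copy()
    sub["treated"] = (sub["group"] == treat_group).astype(float)
    fit = smf.ols("treated ~ " + " + ".join(COVARS), data=sub).fit(
        cov_type="cluster", cov_kwds={"groups": sub["installation"]})
    joint = fit.f_test(", ".join(f"{v} = 0" for v in COVARS))
    return fit.rsquared, joint.pvalue  # compare to the panel B entries
```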

Table 3.
Admissions Outreach Effort Treatment Effects

                                      Initiated Application           Matriculated
                                      (1)        (2)        (3)       (4)        (5)        (6)
Pooled treatment                      0.1620***                       0.0418**
                                      (0.0393)                        (0.0186)
Academy admissions phone call                    0.2371***  0.2207***            0.0849     0.0836
                                                 (0.0720)   (0.0757)             (0.0521)   (0.0621)
Unit commander contact                           0.0414     0.0297               0.0197     0.0233
                                                 (0.0478)   (0.0454)             (0.0412)   (0.0498)
Academy outreach visit                           0.1677***  0.1544***            0.0187     0.0225
                                                 (0.0471)   (0.0485)             (0.0174)   (0.0161)
Offer of campus visit                            0.2035***  0.1786***            0.0444     0.0360
                                                 (0.0642)   (0.0656)             (0.0333)   (0.0401)
Individual characteristics            No         No         Yes       No         No         Yes
Observations                          225        225        225       225        225        225
R2                                    0.0303     0.0643     0.1058    0.0049     0.0157     0.0604

Coefficient differences (from columns 3 and 6):
Phone call — Commander contact                              0.1910                          0.0603
  p-value                                                   0.0066                          0.2653
Phone call — Outreach visit                                 0.0663                          0.0611
  p-value                                                   0.3510                          0.3495
Phone call — Campus visit                                   0.0421                          0.0476
  p-value                                                   0.5973                          0.4407
Commander contact — Outreach visit                          −0.1247                         0.0008
  p-value                                                   0.0266                          0.9880
Commander contact — Campus visit                            −0.1489                         −0.0127
  p-value                                                   0.0474                          0.7891
Outreach visit — Campus visit                               −0.0242                         −0.0135
  p-value                                                   0.7750                          0.7673

Notes: Department of Defense data. The table depicts the results from linear probability model regressions of equation 1 for a pooled treatment indicator (columns 1 and 4) and for individual outreach method group indicators (columns 2, 3, 5, and 6). Standard errors clustered at the base (location) level are in parentheses. The results at the bottom of the table reflect the results (coefficient differences and p-values in italics) of post-estimation tests of coefficient equality for the outreach methods listed in each row.

*Statistically significant at the 10% level; **statistically significant at the 5% level; ***statistically significant at the 1% level.

Table 4.
Cost Effectiveness Analysis of Outreach Methods

                                  Academy      Unit        Academy    Offer of
                                  Admissions   Commander   Outreach   Campus
                                  Call         Contact     Visit      Visit
Number assigned                   46           46          47         44
Number treated (post-screening)   30           15          14         15
Travel costs per outreach         $0           $0          $744       $653
Personnel cost per outreach       $81          $166        $1,305     $767
Total cost per outreach           $81          $166        $2,049     $1,420
Applications initiated            12           3           9          10
Average cost per application      $202         $828        $3,187     $2,130
Students matriculated             5            2           2          3
Average cost per matriculation    $485         $1,242      $14,340    $7,101

Notes: Author calculations using Department of Defense (DOD) data. Average costs use DOD Comptroller 2014 estimates for reimbursement rates by pay grade. Academy Outreach Visit group assumes two days of total travel and meetings and one visit per trip unless noted with four visits per trip. Offer of Campus Visit group assumes a three-day long trip with candidates residing in cadet barracks and eating in cadet dining facility. See Appendix A for details.

5.  Results

As a result of this experiment, thirty-five soldiers (16 percent) submitted an initial application to West Point and thirteen (6 percent) matriculated to either West Point or West Point's preparatory school for the semester beginning in Fall 2015. To estimate the causal effects of each outreach method compared with the control group for these outcomes, we estimate equation 1:
Y_i = \alpha + \beta_1\,Call_i + \beta_2\,OfficeCall_i + \beta_3\,StaffVisit_i + \beta_4\,CampusTour_i + \delta_{gr} + X_i\gamma + \mu_i, \qquad (1)

where Y_i represents a binary outcome indicator (i.e., initiated an application or matriculated) for this linear probability model. The β coefficients are of primary interest and reflect the causal effects of each outreach method relative to the omitted control group. δ_{gr} represents the block-level fixed effects for the full interaction of gender and racial/ethnic groups; we omit reporting these coefficients for simplicity. X_i is a vector of additional individual characteristics (i.e., military experience, ASVAB-GT scores, and SAT math and SAT verbal scores) used as covariates in our full specifications, although we also omit reporting the γ coefficients for simplicity. μ_i is the error term. Note that we can also estimate the average treatment effect for the four methods using a single treatment indicator (=1 for those assigned to any outreach method) and report only one β coefficient. For all specifications, we cluster our standard errors at the military base level to account for unobserved correlations between individuals at each location.16
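A minimal sketch of estimating equation 1 follows, assuming a DataFrame df with indicator columns for each treatment, a block identifier, and a base identifier (all column names illustrative, not the authors' code):

```python
# Linear probability model of equation (1) with block fixed effects and
# base-level clustered standard errors. Column names are illustrative.
import statsmodels.formula.api as smf

formula = ("applied ~ call + office_call + staff_visit + campus_tour"
           " + C(gender_race_block)")  # control group is the omitted category

fit = smf.ols(formula, data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["base"]})
print(fit.params[["call", "office_call", "staff_visit", "campus_tour"]])

# Full specification adds the individual covariates (the gamma terms):
full = smf.ols(formula + " + years_service + asvab_gt + sat_math + sat_verbal",
               data=df).fit(cov_type="cluster", cov_kwds={"groups": df["base"]})
```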

We present our linear probability model regression results in table 3. The first three columns report the estimates for the initiated application outcome and the last three columns report the estimates for the matriculation outcome. For both outcomes, we present a combined treatment effect estimate (columns 1 and 4), and separate outreach method treatment effect estimates without covariates (columns 2 and 5) and with covariates (columns 3 and 6).

On average, the outreach methods increased applications by 16 percentage points (pp) and the effect is statistically significant (column 1, p < 0.01). In addition, three of the four methods had positive and statistically significant effects on applications (column 2): the admissions call by 24 pp (p < 0.01); the West Point staff visit by 17 pp (p < 0.01); and the offer of a West Point campus visit by 20 pp (p < 0.01). The battalion commander office call estimate is positive (4 pp), but not statistically significant (p = 0.516). The results are stable to the inclusion of the individual characteristics (column 3). We also estimate our treatment effects using logit regressions and find similar results (see Appendix table A.1).

At the completion of the admissions process and individual decisions, thirteen individuals matriculated to West Point or the Academy's one-year preparatory school; we analyze matriculation in table 3, columns 4 through 6. Pooling across the four groups (to confirm the overall benefit of the intervention), candidates were 4 pp more likely to matriculate than members of the control group (p = 0.029). However, although all of the outreach methods (columns 5 and 6) have positive point estimates that suggest an increase in the probability of matriculating, none of the results is statistically significant. Relative to the control group, the admissions call group increased the probability of matriculation (column 6) by 8 pp (p = 0.185), the battalion commander contact group by 2 pp (p = 0.642), the staff visit group by 2 pp (p = 0.170), and the offer of a free campus visit group by 4 pp (p = 0.374). While these results are suggestive, the lower levels of matriculation and the small sample size prevent firm conclusions.

Having estimated the main effects, we briefly evaluate potential differential treatment effects. In the lower half of table 3 (columns 3 and 6), we provide the results of post-estimation tests of coefficient equality for all of the outreach method pairs. The discussion above suggests relatively comparable effects by method, with the exception that the unit commander contact method appears somewhat less successful. The application results (column 3) suggest that the Admissions Office Call was 19 pp more successful than the Unit Commander Contact (p = 0.007), the Academy Outreach Visit was 12 pp more successful than the Unit Commander Contact (p = 0.027), and the Offer of a Campus Visit was 15 pp more successful than the Unit Commander Contact (p = 0.047). The Admissions Office Call appears to be more effective than the Academy Outreach Visit and the Offer of a Campus Visit (point estimates are positive), but the differences are imprecise and not statistically distinguishable from zero.

For matriculation, the outreach methods are similar in the ordering of their magnitudes, though none of the differences is statistically significant. The Admissions Office Call again appears to be the most successful, with the Offer of a Campus Visit second, but we cannot interpret these comparisons too strongly given their imprecision. Our main results suggest that sample members respond to being individually contacted by an authority figure rather than to any one type of contact in particular. Although we detect few differences in effectiveness, in the following section we explore the widely different resource requirements of each method.

6.  Cost Effectiveness Analysis

In a simple cost-effectiveness analysis, depicted in table 4, we estimate the costs associated with each outreach method and combine them with the numbers of soldiers who applied and matriculated. We forgo an analysis of the control group because, as the status quo, it has failed to deliver the desired number of soldier applications. Thus, even if it were the most cost effective (e-mails are inexpensive, once designed), its overall effectiveness is inadequate for West Point's purposes. We determine the total costs by multiplying the cost per outreach by the number of individuals contacted in each group (post-screening).

Of the new methods, the most expensive outreach was sending a West Point staff member to visit candidates. On average, this method cost $744 in travel expenses and an additional $1,305 in labor costs, resulting in $2,049 per outreach. When considering the number of applications (nine) and matriculations (two) from this method beyond the control group production, the average additional costs were $3,187 and $14,340, respectively.

The second most expensive effort was the candidate campus visit at $2,130 per application and $7,101 per matriculation. It was slightly more cost effective because it produced a greater number of new accessions and because the opportunity cost of the junior soldier's time to visit the campus was far less than that of more senior soldiers visiting various installations.

Not surprisingly, the direct admissions phone call and battalion leadership contact were far less expensive, both because they did not require travel and because they required less time to coordinate. Based on time spent corresponding with the prospective students and their unit leadership, the Unit Commander Contact cost approximately $828 per additional application and $1,242 per additional matriculation. The Academy Admissions Call cost approximately $202 per application and $485 per matriculation. The main differences in expenses between these groups were the additional coordination time, the labor costs for the senior officer, and the lower yield from the Unit Commander Contact method.

We briefly summarize the required productivity of each method to change our cost effectiveness rankings. Holding other factors constant, the battalion commander contact method would need to increase its effectiveness by four new matriculations (200 percent) to become the most cost-effective method. The soldier campus visit would need to produce fifteen additional matriculations (500 percent) to become more cost effective than the battalion commander contact. The Academy NCO visit would need to produce three more matriculations (150 percent) to become more cost effective than the soldier campus visit. For more details on our cost-effectiveness analysis, see Appendix table A.1.
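As a cross-check, the table 4 arithmetic can be reproduced in a few lines. Small discrepancies against the table (e.g., $486 vs. $485 for the admissions call) reflect rounding in the reported per-outreach costs; the short group labels are ours.

```python
# Table 4 arithmetic: total cost = per-outreach cost x number treated;
# average cost = total cost / outcomes produced. Figures from Table 4.
per_outreach = {"call": 81, "commander": 166, "visit": 2049, "campus": 1420}
treated      = {"call": 30, "commander": 15,  "visit": 14,   "campus": 15}
apps         = {"call": 12, "commander": 3,   "visit": 9,    "campus": 10}
matrics      = {"call": 5,  "commander": 2,   "visit": 2,    "campus": 3}

for g in per_outreach:
    total = per_outreach[g] * treated[g]
    print(f"{g:>9}: ${total/apps[g]:>6,.0f} per application, "
          f"${total/matrics[g]:>7,.0f} per matriculation")
```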

7.  Discussion

Using Army administrative data and standardized test score data enabled West Point admissions personnel to effectively focus their efforts on soldier candidates who are most qualified and likely to receive an offer of admissions. The average combined SAT math and critical reading score for the soldiers who matriculated under this program was 1303, ten points higher than that of the full class that entered.17 Because each of these candidates has completed initial entry training upon joining the Army, and demonstrated successful performance in assigned units, we expect these future cadets to have a better chance of success at West Point and in the Army than a typical nonmilitary applicant (e.g., high school seniors)—especially given their propensity for service and other noncognitive characteristics revealed in securing their recommendations. Put another way, the matriculating students from this program appear slightly better academically prepared—and more prepared overall—to succeed than the average entering student.

In addition, the soldiers who matriculated under this program had an average combined SAT math and critical reading score 114 points higher than that of the 37 soldiers who matriculated in 2014 (1302 vs. 1188). Therefore, these entering soldiers appear substantially more college-ready when compared with those with comparable military training and experience from previous cohorts. Thus, even if there is a desire to limit the number of soldier entries in each entering class, these methods identify and enable admission of especially qualified individuals.

Taken together, these two comparisons suggest that expansion of the program would enable West Point to target hundreds or even thousands of eligible and highly qualified soldiers each year. Given its federal mandate to recruit Army active duty, reserve, and National Guard soldiers, the pilot program demonstrated the potential for West Point to transform this requirement into an asset for increasing the quality and diversity of its student body.

We briefly consider why the Unit Commander Contact method generated smaller effects than the other contact methods. One potential reason is that a few of the commanders, given their own opinions, overruled the recommendations provided by the company level leadership and did not support the candidates’ applications. While this reduced the number of applications and offers, we are unsure of which leaders’ assessments (company or battalion level) are more accurate. An additional explanation that we cannot rule out is that the Admissions Office staff is simply more effective in marketing the school given their experience, expertise, and incentives. They may be better able to get potential candidates excited about the opportunity or they may be more effective in addressing soldiers’ concerns about the application process or attendance at West Point. However, it may also be the case that the additional time required to contact and prepare the commanders pushed candidates too close to the school's application deadline and discouraged some from applying. Despite the differences observed here, we think that the Unit Commander Contact method warrants further study, given that West Point's Directorate of Admissions and the Office of Economic and Manpower Analysis conducted a smaller pilot in 2013–14 and found that the Academy Admissions Call and Unit Commander Contact had similar effects for applications and offers of admission.18 Although some institutions do not leverage authority figures who know potential candidates, the existence of active alumni networks and regional Admissions Officers for many elite schools suggests this may be an effective outreach method.

We also highlight some potential concerns with the internal and external validity of our results. Given the limited availability of admissions staff, we conducted the outreach over a four-month time period. Individuals contacted during the later portion of this window had less time to decide if they wanted to apply or to complete an application. In addition, we provide ITT estimates because we did not complete the outreach methods for everyone in the sample, even if they were positively screened (e.g., we did not visit all locations with soldiers who passed the screening and not all Unit Commanders completed their office calls). In addition, recall that control group members were not screened, and so we compute application and matriculation rates based on the initial group assignments in table 1.

Even though we cannot reliably calculate TOT estimates, we take our estimates as lower bounds on the actual program effects. Based on West Point application data and subsequent communication with the candidates, we observe that a significant portion of company-level leadership had discussed the opportunity to attend West Point with its soldiers prior to applying the respective contact treatments. Many of the candidates had prepared questions for the West Point outreach team and mentioned that their chain of command had fully explained this opportunity in anticipation of their conversation. Although this contamination likely attenuates our empirical estimates, it also suggests that there are positive externalities to an outreach program in educating more influencers and increasing the potential for spillover applications. We also expect that some contamination across treatment groups may have occurred since certain Army locations (and units within these locations) housed multiple candidates. As a result, the effects of each method may have been inadvertently applied across groups if soldiers shared their experiences with one another.19 Another challenge is measuring the differences across methods that all provide improved information to soldiers. We test different methods of providing this information (e.g., peers vs. role models) but the channel may be less important than the information provision itself.

In terms of external validity for soldiers, although this paper focuses on Active Component soldiers, if effectively identified and recruited, enlisted soldiers from the Reserve and National Guard components could also become a valuable source of quality applicants rather than a mandated accessions goal. There are potentially thousands of eligible soldiers with proven records of strong military service and above average academic aptitude in the Army.

As to external validity with respect to other institutions, our results should be interpreted carefully. First, the Army (and hence West Point) may have access to more robust administrative data than typical colleges to support such a program. However, national testing service data can provide very similar characteristics and contact information to other institutions. Second, the role model effects examined here may be stronger than in most civilian high school and undergraduate contexts. However, we expect that they are roughly comparable to more typical mentors such as school counselors, principals, and coaches that colleges often contact. The Offer of a Campus Visit may be relatively generous in this case as it was free to the candidates once they initiated an application. We are not aware of any reasons that the Academy Admissions Call results are not generalizable and so this strategy may be a particularly promising approach for other higher education institutions.

Finally, we return to the issue of experimental power, given that we find few statistically significant differences in the treatment effects for our matriculation outcome and between our experimental groups for both outcomes. We follow broader calls in the literature (Cohen 1992; Brock and Vasilaky 2016) to report our power in support of replicability and our external validity. To detect a “medium” effect size of d = 0.5 (as proposed by Cohen 1988) between our control group and one experimental group, we would require a sample size of N = 128 (N = 64 in each group).20 For our study (five groups), this would require a total sample of N = 320. We were therefore underpowered to detect effect sizes of this magnitude. A planned second wave of military data matched to SAT data (to determine eligibility) would have provided an appropriately powered sample, but it was not received in time for screening and outreach in support of the West Point admissions cycle. With our smaller sample, we were powered to detect Cohen's (1988) “large” effect sizes (d = 0.8).21 However, because the differences between our experimental groups (e.g., the Admissions Office Call and the Invitation to visit West Point) were likely to be small, we remained underpowered to detect the likely effects. Specifically, we only have the power to detect effect sizes of d = 0.6041 for our ITT effects between groups.22
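The standard two-sample calculation behind these figures can be reproduced with statsmodels' power solver; exact values depend on the assumed group sizes.

```python
# Two-sample power calculations behind the sample size discussion.
from statsmodels.stats.power import TTestIndPower

power = TTestIndPower()

# Sample size per group to detect d = 0.5 at alpha = 0.05 with power 0.8
n_per_group = power.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(round(n_per_group))  # ~64, i.e., N = 128 across two groups

# Minimum detectable effect with roughly 45 per group, as in our design
mde = power.solve_power(nobs1=45, alpha=0.05, power=0.8)
print(round(mde, 3))  # ~0.60, close to the d = 0.6041 reported above
```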

8.  Conclusion

Our findings indicate that targeted outreach methods can produce substantial improvements over routine mailings and advertisements in undergraduate admissions programs for elite colleges and universities. In particular, we identify three specific methods that increase applications and may increase matriculations. These results should be particularly helpful for colleges and aid organizations attempting to contact and attract nontraditional students who have already graduated high school. Because West Point, like most colleges and universities, operates with a limited advertising and admissions budget, it is notable that relatively inexpensive contact methods, such as a direct phone call from an Admissions Officer, produce high-quality applicants from diverse groups. For institutions with more resources, our cost-effectiveness analysis provides additional evidence on optimal outreach policies. Equally important, this research demonstrates the value of experimental approaches to policy analysis in this previously unstudied aspect of higher education access.

West Point is undoubtedly a unique institution, yet our results provide the first causal evidence we know of on the effectiveness (and cost-effectiveness) of different institutional outreach strategies. In addition, they provide direct evidence on enlisted service member recruiting for national military academies, institutions of substantial policy interest given their cost (more than $632 million in 2015)23 and public interest concerns (e.g., the diversity goals highlighted in GAO 2015). Finally, they provide plausibly generalizable evidence for institutions nationwide interested in expanding their populations of veteran, nontraditional, and high-achieving underserved students. More research on these and other strategies can further enhance college-going among other groups of interest at other institutions.

Recruiting able and diverse students is one of many tasks universities undertake in an environment of competing budgets and admissions priorities that include diversity and quality (Hossler 2004). Although these goals may sometimes compete, some outreach strategies may dominate others on both cost and effectiveness. Empirical evidence on the effectiveness of outreach policies enables administrators in admissions offices to improve their performance for any given set of institutional priorities. This research demonstrates the potential for obtaining such evidence at one unique institution.

Acknowledgments

We thank Ben Castleman, Susan Carter, Mike Kofoed, and Mike Walker for their assistance in completing this research. A special thanks to Captain Jason Dupuis and Major Ryan Liebhaber of the West Point Admissions Office for their support in program design and execution, and to Staff Sergeant David Walik and Luke Gallagher of the Office of Economic and Manpower Analysis for assistance with the data. All errors are our own. The opinions expressed herein reflect the personal views of the authors and do not reflect the views of the United States Military Academy, the United States Army, or the United States Department of Defense.

REFERENCES

Ainsworth, James W. 2002. Why does it take a village? The mediation of neighborhood effects on educational achievement. Social Forces 81(1): 117–152. doi:10.1353/sof.2002.0038.
Alger, Jonathan, Jorge Chapa, Roxanne Gudeman, Patricia Marin, Geoffrey Maruyama, Jeffrey Milem, Jose Moreno, and Deborah Wilds. 2000. Does diversity make a difference? Three research studies on diversity in college classrooms. Washington, DC: American Council on Education and American Association of University Professors.
Andrews, Rodney J., Scott A. Imberman, and Michael F. Lovenheim. 2016. Recruiting and supporting low-income, high-achieving students at flagship universities. NBER Working Paper No. 22260.
Bass, Bernard M. 1990. Bass and Stogdill's handbook of leadership: Theory, research and management applications. 3rd ed. New York: Free Press.
Bettinger, Eric P., and Bridget T. Long. 2005. Do faculty serve as role models? The impact of instructor gender on female students. American Economic Review 95(2): 152–157. doi:10.1257/000282805774670149.
Bettinger, Eric P., Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu. 2012. The role of application assistance and information in college decisions: Results from the H&R Block FAFSA experiment. Quarterly Journal of Economics 127(3): 1205–1242. doi:10.1093/qje/qjs017.
Brock, J. Michelle, and Kathryn N. Vasilaky. 2016. Traversing the landscape of experimental power. Available http://dx.doi.org/10.7916/D8HD7VKF. Accessed 20 November 2017.
Bulman, George. 2015. The effect of access to college assessments on enrollment and attainment. American Economic Journal: Applied Economics 7(4): 1–36. doi:10.1257/app.20140062.
Carrell, Scott E., and Bruce Sacerdote. 2017. Why do college going interventions work? American Economic Journal: Applied Economics 9(3): 124–151. doi:10.1257/app.20150530.
Castleman, Benjamin L., Laura Owen, and Lindsay C. Page. 2016. Stay late or start early? Experimental evidence on the benefits of college matriculation support from high schools versus colleges. Economics of Education Review 47: 168–179. doi:10.1016/j.econedurev.2015.05.010.
Castleman, Benjamin L., and Lindsay C. Page. 2015. Summer nudging: Can personalized text messages and peer mentor outreach increase college going among low-income high school graduates? Journal of Economic Behavior & Organization 115: 144–160. doi:10.1016/j.jebo.2014.12.008.
Castleman, Benjamin L., Lindsay C. Page, and Korynn Schooley. 2014. The forgotten summer: Does the offer of college counseling after high school mitigate summer melt among college-intending, low-income high school graduates? Journal of Policy Analysis and Management 32(2): 320–344. doi:10.1002/pam.21743.
Chapman, Daniel W. 1981. A model of student choice. Journal of Higher Education 52(5): 490–505. doi:10.1080/00221546.1981.11778120.
Cohen, Jacob. 1988. Statistical power analysis for the behavioral sciences. 2nd ed. Hillsdale, NJ: Erlbaum.
Cohen, Jacob. 1992. A power primer. Psychological Bulletin 112(1): 155–159. doi:10.1037/0033-2909.112.1.155.
College Board. 2011. The College Keys Compact: Expanding options for low-income students. A review of barriers, research, and strategies. New York: The College Board.
Cook, Thomas D. 2002. Randomized experiments in educational policy research: A critical examination of the reasons the education evaluation community has offered for not doing them. Educational Evaluation and Policy Analysis 24(3): 175–199. doi:10.3102/01623737024003175.
Cooper, Richard N. 1977. Military manpower and the all-volunteer force. Santa Monica, CA: RAND Corporation.
Cowart, S. C. 1988. College choice and the student transition process. Iowa City, IA: Research Report of the American College Testing National Center for the Advancement of Educational Practices.
Department of Defense (DOD). 2014a. 2014 Demographics: Profile of the military community. Available http://download.militaryonesource.mil/12038/MOS/Reports/2014-Demographics-Report.pdf. Accessed 20 November 2017.
Department of Defense (DOD). 2014b. Memorandum: FY 2014 Department of Defense (DoD) military personnel composite standard pay and reimbursement rates. Available http://comptroller.defense.gov/Portals/45/documents/rates/fy2014/2014_k.pdf. Accessed 27 November 2017.
Department of Defense (DOD). 2016. DoD budget request: Military personnel programs (M-1); Operation and maintenance programs (O-1). Available http://comptroller.defense.gov/Budget-Materials/Budget2016/. Accessed 30 November 2017.
Dynarski, Susan, and Judith Scott-Clayton. 2006. The cost of complexity in federal student aid: Lessons from optimal tax theory and behavioral economics. National Tax Journal 59(2): 319–356. doi:10.17310/ntj.2006.2.07.
Eitelberg, Mark J., Janice H. Laurence, Brian K. Waters, and Linda S. Perelman. 1984. Screening for service: Aptitude and education criteria for military entry. Available www.dtic.mil/dtic/tr/fulltext/u2/a142167.pdf. Accessed 27 November 2017.
Fryer, Roland, and David Austen-Smith. 2007. An economic analysis of "acting white." Quarterly Journal of Economics 120(2): 551–583.
Goodman, Sarena. 2013. Learning from the test: Raising selective college enrollment by providing information. Review of Economics and Statistics 98(4): 671–684. doi:10.1162/REST_a_00600.
Government Accountability Office (GAO). 2015. Military personnel: Oversight framework and evaluations needed for DOD and the Coast Guard to help increase the number of female officer applicants. Available https://gao.gov/products/GAO-16-55. Accessed 30 November 2017.
Gudeman, Roxane Harvey. 2000. College missions, faculty teaching, and student outcomes in a context of low diversity. In Does diversity make a difference? Three research studies on diversity in college classrooms, edited by Jonathan Alger, Jorge Chapa, Roxanne Gudeman, Patricia Marin, Geoffrey Maruyama, Jeffrey Milem, Jose Moreno, and Deborah Wilds, pp. 37–61. Washington, DC: American Council on Education and American Association of University Professors.
Haveman, Robert, and Barbara Wolfe. 1995. The determinants of children's attainments: A review of methods and findings. Journal of Economic Literature 33(4): 1829–1878.
Hossler, Don. 2004. Refinancing public universities: Student enrollments, incentive-based budgeting, and incremental revenue. In Public funding of higher education: Changing contexts and new rationale, edited by Edward P. St. John and Michael D. Parsons, pp. 145–163. Baltimore, MD: The Johns Hopkins University Press.
Howell, Jane M., and Kathryn A. Hall-Merenda. 1999. The ties that bind: The impact of leader-member exchange, transformational and transactional leadership, and distance on predicting follower performance. Journal of Applied Psychology 84(5): 680–694. doi:10.1037/0021-9010.84.5.680.
Hoxby, Caroline, ed. 2004. College decisions: The economics of where to go, when to go, and how to pay for it. Chicago, IL: University of Chicago Press. doi:10.7208/chicago/9780226355375.001.0001.
Hoxby, Caroline, and Christopher Avery. 2013. The missing "one-offs": The hidden supply of high-achieving, low-income students. Available www.brookings.edu/wp-content/uploads/2016/07/2013a_hoxby.pdf. Accessed 20 November 2017.
Hoxby, Caroline, and Sarah Turner. 2013. Expanding college opportunities for high-achieving, low income students. Stanford Institute for Economic Policy Research Working Paper No. 12-014.
Lieber, Ethan M., and William Skimmyhorn. 2018. Peer effects in financial decision-making: Social spending but private saving. Journal of Public Economics, forthcoming.
List, John, Sally Sadoff, and Mathis Wagner. 2011. So you want to run an experiment, now what? Some simple rules of thumb for optimal experimental design. Experimental Economics 14(4): 439–457. doi:10.1007/s10683-011-9275-7.
Lyle, David. 2007. Estimating and interpreting peer and role model effects from randomly assigned social groups at West Point. Review of Economics and Statistics 89(2): 289–299. doi:10.1162/rest.89.2.289.
Lyle, David. 2009. The effects of peer group heterogeneity on the production of human capital at West Point. American Economic Journal: Applied Economics 1(4): 69–84. doi:10.1257/app.1.4.69.
Lyle, David S., and John Z. Smith. 2014. The effect of high-performing mentors on junior officer promotion in the US Army. Journal of Labor Economics 32(2): 229–258. doi:10.1086/673372.
Orvis, Bruce R., and Martin T. Gahart. 1990. Enlistment among applicants for military service. Santa Monica, CA: RAND Corporation Report No. R-3359-FMP.
Pallais, Amanda. 2015. Small differences that matter: Mistakes in applying to college. Journal of Labor Economics 33(2): 493–520. doi:10.1086/678520.
Snyder, Thomas D., Cristobal de Brey, and Sally A. Dillow. 2016. Digest of education statistics 2014, 50th edition. U.S. Department of Education. Available https://nces.ed.gov/pubs2016/2016006.pdf. Accessed 20 November 2017.
Walton, Gregory M., and Geoffrey L. Cohen. 2007. A question of belonging: Race, social fit, and achievement. Journal of Personality and Social Psychology 92(1): 82–96. doi:10.1037/0022-3514.92.1.82.
Walton, Gregory M., and Geoffrey L. Cohen. 2011. A brief social-belonging intervention improves academic and health outcomes of minority students. Science 331: 1447–1451. doi:10.1126/science.1198364.
Watkins, Shanea, and James Sherk. 2008. Who serves in the U.S. military? The demographics of enlisted troops and officers. Available www.heritage.org/defense/report/who-serves-the-us-military-the-demographics-enlisted-troops-and-officers. Accessed 1 December 2017.
Williams, Damon A. 2013. Strategic diversity leadership: Activating change and transformation in higher education. Sterling, VA: Stylus Publishing, LLC.
Zimmer, Ron W., and Eugenia F. Toma. 2000. Peer effects in private and public schools across countries. Journal of Policy Analysis and Management 19(1): 75–92. doi:10.1002/(SICI)1520-6688(200024)19:1<75::AID-PAM5>3.0.CO;2-W.

Notes

1. In addition to these secondary school efforts, other studies document the potential role of private or nonprofit organizations in improving college access. These methods include providing financial aid application assistance as part of annual tax return preparations (Bettinger et al. 2012), expanding standardized test score submission opportunities (Pallais 2015), and providing application assistance and fee waivers (Hoxby and Turner 2013).

2. By law, cadets are not obligated to serve in the military until beginning their third year at West Point. At the start of their junior year, students must sign a contract obligating them to five years of active duty service with an additional three years of service in the reserves. Cadets are commissioned as officers (2nd Lieutenants) upon graduation.

3. 10 U.S. Code § 4342 - Cadets: appointment; numbers, territorial distribution. The Secretary of the Army has 85 nominations per year for the active component and 85 nominations per year for reserve and guard components.

4. The Army maintains detailed information on each soldier's service record. Admission Board staff may consider promotions, awards, military schools, and letters of recommendation from military leadership when considering the overall potential of a military applicant.

5. Between 2010 and 2014, West Point averaged forty-eight active duty admissions (of eighty-five available) and forty-three reserve admissions (of eighty-five available) per year. The highest single-year total for enlisted applications was the class entering in 2013, when sixty-two active component and sixty-seven reserve soldiers matriculated.

6. In unpublished work, using Army administrative data combined with National Student Clearinghouse data on 2006–08 separating cohorts of Army enlistees roughly comparable to those under consideration here (no college degree and Armed Forces Qualification Test (AFQT) scores above the 65th percentile), we determined that only 11 percent earn bachelor's degrees within six years of separation (15 percent within eight years). The institutions where these individuals are most frequently enrolled for bachelor's degrees are: University of Phoenix, Central Texas College, University of Maryland University College, DeVry University, and Troy University. The institutions where these individuals most frequently graduated with bachelor's degrees are: University of Phoenix, University of Maryland University College, ITT Technical Institute, Embry Riddle Aeronautical University, and American Public University. Although individual circumstances and preferences may vary, these observed decisions do not appear superior to the possibility of a West Point degree.

7. Numbers of eligible active duty soldiers are from the Army's Total Army Personnel Data Base. This number reflects demographic eligibility (i.e., no legal dependents, single, U.S. citizen, and under the age of 23) and not necessarily academic readiness.

8. All military personnel must take the ASVAB prior to signing an enlistment contract with the military, and therefore all have a GT score. The GT score is very similar to the AFQT score with which many readers may be more familiar. The ASVAB consists of nine subtests that measure aptitudes in verbal, math, science, and spatial domains. The Army GT score is a combination of three subtests (Word Knowledge, Paragraph Comprehension, and Arithmetic Reasoning). The AFQT includes these same three subtests and one more (Mathematics Knowledge). Not surprisingly, in other work we find the correlation between the AFQT combination and the GT combination to be approximately 0.9963. The mean AFQT (percentile) in our sample is 88 and the mean ASVAB-GT is 126.

9. We include matriculation to the West Point preparatory school in our outcome since West Point manages the school and since the preparatory school was created specifically to improve the readiness of select applicants to attend West Point. During the preparatory year, cadet candidates improve their academic skills (especially in English and mathematics), orient themselves to a collegiate environment, and work to improve a variety of noncognitive skills and behaviors, including time management and study skills. Although not guaranteed, admission to the preparatory school is nearly always followed by admission to West Point. Using data from the Admissions Office, we compute the matriculation rate for non-athlete preparatory school entrants from 2005 to 2016 as 86 percent.

10. Battalions are commanded by Lieutenant Colonels and vary in size from 300 to 1,500 soldiers.

11. Some eligible soldiers (N = 4) in this group were serving overseas or otherwise unavailable for visits.

12. A second (approximately equally sized) wave of candidates was planned but not completed because of a delay in a contract for standardized test data.

13. Army companies typically consist of 50–175 soldiers, though this varies somewhat by type of unit. The screening process typically involved talking with the organization's officer (Company Commander) or NCO (First Sergeant) leaders. In some instances, these leaders solicited input from or transferred the screening responsibility to a soldier's direct supervisors (e.g., an academic instructor or section/platoon leader).

14. More formally, we regress an indicator for passing the screening process on the four treatment indicators and then jointly test the equality of the outreach method coefficients. We fail to reject that the coefficients are equal (p = 0.614).

15. Although this oversampling means our estimates may not reflect the full Army population, it represents the appropriate population of interest, given Army and West Point diversity goals.

16. Robust standard errors yield virtually identical results and have no effect on our levels of statistical significance.

17. The mean scores for the class of 2019 (year of this study) were 637 and 656, respectively.

18. We ran a smaller pilot with N = 93 soldiers in the spring of 2014 with just two outreach methods. The admissions call and battalion commander contact yielded admission offer rates of 10.5 percent and 8.1 percent, respectively, with both being statistically significant (p < 0.1). The differences in the results might be attributed to the experience of the Admissions Officer responsible for the pilots and/or the resourcing from the West Point Admissions Office.

19. The N = 225 candidates (including control) were distributed around the Army at 28 locations. A total of 189 candidates worked at the same installation as another sample member. Of these, 69 candidates worked in the same unit as another sample member. Although units are large (typically several hundred) and installations are very large (typically more than 5,000), we cannot rule out this potential contamination.

20. We use common assumptions: power = 0.8, equal standard deviations for different treatments, and a type I error rate α=0.05.

21. This power calculation assumes the same parameters: power = 0.8, equal standard deviations for different treatments, and a type I error rate α=0.05. This requires N = 26 individuals in each group, which we meet.

22. This is based on the sample sizes in our experimental groups (N = 44, 46, 46, and 47 for groups 1 through 4). For TOT effects, if we condition on those who passed screening in each group (i.e., N = 30, 29, 26, and 30 for our four groups), then we are only powered to detect effect sizes of approximately d=0.7769, which are very large.

23. Author calculations using DOD (2016) and DOD (2014b).

24. The $7/day estimate for food is based on unpublished internal West Point budget documents. Removing this cost (or doubling it) in the cost-effectiveness analysis leaves the rankings and conclusions unchanged.

Appendix A:  Additional Data

Cost Effectiveness Analysis

This analysis compares the cost of each outreach method against the number of additional students that matriculate to West Point or its preparatory school as a result of the method. Costs fit into two general categories: travel expenditures and the opportunity costs of all participants in the recruiting efforts. Evaluating the actual value of diversity at West Point and/or the returns to a West Point education and service are beyond the scope of this paper.

Two of the outreach methods (the recruiting visit by a West Point NCO and the soldier campus visit) required participant travel. We assume that all visits by Academy personnel take two days and all visits by soldiers to West Point take three days, similar to other ongoing recruiting visit programs that West Point's Directorate of Admissions sponsors. Travel expenses for West Point personnel match the DOD per diem rates for lodging, meals, and incidentals (DOD 2016), and we use an average airfare cost of $500, which approximates the actual airfare expenses of this study. Soldiers offered the opportunity to visit West Point stay in the cadet barracks and dine in the cadet mess hall. Travel expenses for these soldiers are the same as those for the Academy NCO visit, except that there are no lodging costs while at West Point and only $7 per day (West Point budget) is allocated for feeding the candidates at the West Point mess hall.24

All four outreach methods require multiple individuals to spend time in support of the experiment in lieu of completing other work. To capture this opportunity cost, we estimate the average amount of time each individual spends for a given outreach method and then multiply it by the hourly reimbursement pay rates established by the DOD Comptroller (DOD 2014b). As an example, for the battalion commander office visit, we estimate the following: a Lieutenant Colonel spends 30 minutes discussing this opportunity with the prospective student and West Point admissions staff ($112 per hour), a Major serving as an Admissions Officer spends 45 minutes supporting this process ($96 per hour), and the soldier (assumed to be a private first class) spends 30 minutes discussing this opportunity with his or her chain of command ($28 per hour).
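For concreteness, the following minimal sketch computes the labor component of the battalion commander contact from the figures just given; the rates and times are those stated in the text, while the function and its name are illustrative rather than part of any official costing tool.

```python
# Illustrative computation of the opportunity (labor) cost of one
# battalion commander contact, using the hourly composite rates
# (DOD 2014b) and time estimates stated in the text.
def labor_cost(hours_and_rates):
    """Sum time-valued participation across everyone involved."""
    return sum(hours * rate for hours, rate in hours_and_rates)

commander_contact = [
    (0.50, 112),  # Lieutenant Colonel: 30 minutes at $112/hour
    (0.75, 96),   # Major (Admissions Officer): 45 minutes at $96/hour
    (0.50, 28),   # Soldier (private first class): 30 minutes at $28/hour
]

print(labor_cost(commander_contact))  # $142 of labor cost per contact
```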

The sum of travel expenses and labor costs divided by the number of soldiers who matriculate as a result of this experiment (see table 4) is the average cost per matriculation, our primary measure of cost-effectiveness. Rank ordering the outreach methods by average matriculation cost, we find that the Academy admissions call was the most cost-effective technique ($485), followed by the battalion commander contact ($1,242), the offer of a campus visit ($7,101), and the Academy NCO outreach visit ($14,340). The admissions call proved to be the most cost-effective option because it required no travel, took the least total time, and yielded the most new students. The battalion commander contact was less cost-effective because of its lower number of matriculated students and the time required of the senior officer. The third most cost-effective method, the soldier's campus visit, required additional travel costs and a large time commitment from the junior soldier. The least cost-effective method, the Academy NCO outreach visit, was the most expensive and produced the fewest new students. For more details on the cost-effectiveness analysis, see table 4.
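The ranking logic is a simple ratio of total cost to yield. The sketch below illustrates the computation; the travel costs, labor costs, and matriculation counts here are placeholders for exposition, not the table 4 values.

```python
# Cost-effectiveness ranking: (travel + labor) / matriculations for each
# method. All numeric inputs below are hypothetical placeholders; only
# the computation mirrors the analysis in the text.
methods = {
    # method: (travel_cost, labor_cost, matriculations) -- hypothetical
    "admissions_call":    (0.0,     500.0, 4),
    "commander_contact":  (0.0,    2500.0, 2),
    "campus_visit":       (9000.0, 5000.0, 2),
    "nco_outreach_visit": (12000.0, 2500.0, 1),
}

cost_per_matriculation = {
    name: (travel + labor) / matriculated
    for name, (travel, labor, matriculated) in methods.items()
}

# Rank from most to least cost-effective (lowest cost first).
for name, cost in sorted(cost_per_matriculation.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost:,.0f} per matriculation")
```

Amortizing NCO travel over several candidates per trip, as in the sensitivity analysis below, amounts to dividing the travel input by the number of candidates visited.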

For a sensitivity analysis, we vary the number of soldiers that an NCO could visit on a recruiting trip and the actual yield of each outreach method. When we remove the assumption that NCOs visit only one candidate per trip, the cost per outreach decreases substantially because much of the total time and money spent was due to travel. Once an NCO can visit three soldiers per visit, the cost of an Academy outreach visit falls below the average cost of offering campus visits.

Second, we vary the effectiveness of each outreach method to determine the number of additional matriculations each technique would need in order to change the rankings, holding all other outcomes constant. The battalion commander contact would need four additional matriculations (a 200 percent increase) to become the most cost-effective method. The soldier campus visit would need 15 additional matriculations (a 400 percent increase) to become more cost-effective than the battalion commander contact. The Academy NCO visit becomes more cost-effective than the soldier campus visit if it adds three matriculations (a 150 percent increase).

Although this cost-effectiveness analysis makes many simplifying assumptions, it clearly demonstrates that the phone call from the Admissions Officer is the most cost-effective strategy for matriculating current soldiers into West Point or its preparatory school. The stability of this ranking under plausible changes in costs and yields is further reassuring. These results are unsurprising given that the phone call was both the most successful and the least expensive outreach method.

Table A.1. Admissions Outreach Effort Treatment Effects (Logit Regressions)

                                          Opened Application                  Matriculated
                                     (1)        (2)        (3)        (4)        (5)        (6)
Pooled treatment                  2.2360**                          1.0568
                                 (1.0298)                          (1.0553)
Academy admissions phone call               2.6721**   2.6155**               1.6094     1.5849
                                           (1.0664)   (1.0785)               (1.1175)   (1.1696)
Unit commander contact                      1.0510     0.9774                 0.6225     0.6966
                                           (1.1752)   (1.1858)               (1.2438)   (1.2911)
Academy outreach visit                      2.2732**   2.2301**               0.6001     0.6697
                                           (1.0779)   (1.0905)               (1.2436)   (1.2808)
Offer of campus visit                       2.4898**   2.3714**               1.0986     0.8930
                                           (1.0742)   (1.0881)               (1.1756)   (1.2242)
Constant                        −3.7136*** −3.7136***  1.8456    −3.7136*** −3.7136*** −13.7082
                                 (1.0121)  (1.0121)   (6.8202)    (1.0121)  (1.0121)   (9.9071)
Individual characteristics          No         No        Yes         No         No        Yes
Observations                        225        225       225         225        225       225

Tests of coefficient equality (difference; p-value below)
Phone call − Commander contact               1.6381                            0.8883
  p-value                                    0.0195                            0.3264
Phone call − Outreach visit                  0.3854                            0.9152
  p-value                                    0.4629                            0.3099
Phone call − Campus visit                    0.2441                            0.6919
  p-value                                    0.6356                            0.3951
Commander contact − Outreach visit          −1.2527                            0.0269
  p-value                                    0.0801                            0.9795
Commander contact − Campus visit            −1.3940                           −0.1964
  p-value                                    0.0509                            0.8402
Outreach visit − Campus visit               −0.1413                           −0.2233
  p-value                                    0.7933                            0.8193

Notes: Department of Defense data. The table depicts results from logit regressions of equation 1 for a pooled treatment indicator (columns 1 and 4) and for individual outreach method indicators (columns 2, 3, 5, and 6). The rows at the bottom of the table report post-estimation tests of coefficient equality for the outreach methods listed in each row (coefficient differences, with p-values below).

**Statistically significant at the 5% level; ***statistically significant at the 1% level.
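For readers who wish to replicate this style of analysis on their own admissions data, the following is a minimal sketch of the logit specification and the post-estimation equality tests reported in table A.1, using Python's statsmodels package; the data file and variable names are hypothetical.

```python
# Minimal sketch of the table's logit specification and the
# post-estimation tests of coefficient equality between outreach
# methods. The data frame and column names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per candidate, with binary treatment
# indicators and a binary application outcome.
df = pd.read_csv("outreach_sample.csv")

# ITT logit of application initiation on the four treatment indicators;
# the control group (mass e-mail solicitation) is the omitted category.
res = smf.logit(
    "opened_application ~ phone_call + commander_contact"
    " + outreach_visit + campus_visit",
    data=df,
).fit()
print(res.summary())

# Wald test of coefficient equality between two outreach methods,
# e.g., the admissions phone call vs. the commander contact.
print(res.wald_test("phone_call - commander_contact = 0", scalar=True))
```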