This paper uses Advanced Placement (AP) exams to examine how receiving college credit in high school alters students’ subsequent human capital investment. Using data from one large state, I link high school students to postsecondary transcripts from in-state, public institutions. I estimate causal impacts using a regression discontinuity that compares students with essentially identical AP performance but who receive different offers of college credit. I find that female students who earn credit from science, technology, engineering, and mathematics (STEM) exams take higher level STEM courses, significantly increasing their depth of study, with no observed impacts for male students. As a result, the male–female gap in STEM courses taken shrinks by roughly one third to two thirds, depending on the outcome studied. Earning non-STEM AP credit increases overall coursework in non-STEM courses and increases the breadth of study across departments. Early credit policies help colleges produce graduates whose skills align with commonly cited social or economic priorities, such as developing STEM graduates with stronger skills, particularly among traditionally underrepresented groups.

The typical United States citizen spends at least a dozen years in formal primary and secondary schooling, developing skills in preparation for college and the labor force. Because states and districts have considerable latitude in preparing students for adulthood, a natural question is how to improve the efficiency of the current educational system. One option is to provide more opportunities for rigorous, “college-level” work during high school. Exposing students to a college-level curriculum can have multiple benefits: preparing students for the academic rigor of college; increasing students’ knowledge of, or confidence about, college readiness; and helping students learn about potential college majors or career choices.

I examine how early college coursework alters students’ subsequent human capital investment in the context of Advanced Placement (AP) exams. AP courses are a nationally recognized college-level curriculum, and high performance on AP exams can impact college pathways by allowing students to receive college credit for introductory courses, as well as providing a signal of academic preparedness. Prior work has shown both effects, as earning college credit reduces time to degree, and higher integer scores can independently shift students’ college major through a signaling effect (Mattern, Shaw, and Ewing 2011; Smith, Hurwitz, and Avery 2017; Avery et al. 2018; Evans 2019).

This paper focuses on an alternate outcome, estimating the impact of AP performance on students’ postsecondary curricular choices, and how these impacts vary by student gender. I estimate the causal impact of higher AP integer scores using a regression discontinuity design that compares students in one large state with essentially identical academic performance but who receive different integer exam scores (e.g., a 3 versus a 2); as a result, some students receive different signals of their academic performance and are provided the opportunity to skip introductory college courses. In addition to graduating earlier or altering the choice of major, students can use AP credit to deepen their study by taking more courses within their preferred discipline or, conversely, to broaden the scope of their exposure to different disciplines. Knowing more about which courses students take is key to understanding the impact of early credit opportunities, as the types of skills developed in school may matter as much as the degree obtained (Altonji and Pierret 2001; Arteaga 2018; Goodman 2019). Postsecondary enrollment is inherently driven by tradeoffs in course-taking, as many colleges weigh the holistic benefits of a broad liberal arts curriculum against the need to delve deeply into coursework that will serve as the basis of a future career.1

I find that receiving college credit from an AP exam in science, technology, engineering, or mathematics (STEM) increases the depth of study within STEM departments, at the cost of slightly reducing the breadth of study across departments. Earning STEM credit leads students to take higher level courses within that same AP field and additional STEM courses outside of the AP field more generally. Receiving college credit from a non-STEM AP exam increases the number of non-STEM courses taken, though the strongest evidence points to this occurring outside of the discipline of the AP exam (i.e., earning English credit does not increase the number of English courses taken).

The results point to the actual credit received as the key mechanism, as achieving a higher integer score at a threshold that does not offer credit has no effect on courses taken.2 These results are not driven by changes in college major; prior work found that AP exams shift students across STEM fields but do not increase overall STEM completion (Avery et al. 2018), and the estimated magnitudes of those effects, which are corroborated in this paper, are not large enough to account for the observed differences in course-taking.

I analyze gender differences in course-taking patterns, given abundant evidence of lower female participation and success in STEM in the postsecondary sector. I find that female students are significantly more likely than male students to use the college credit to progress deeper within STEM disciplines. As a result, the male–female gap in STEM courses taken shrinks by roughly one third to two thirds, depending on the outcome studied. Previous work has repeatedly shown that women are less likely to earn STEM degrees or participate in STEM-related careers, though debate continues over the extent to which variation in preferences, academic preparation, discrimination, and other factors drives these differences (Niederle 2016). Although results based on individual AP exams are noisy, I find larger effects on higher level STEM courses when female students earn higher AP scores in traditional—and typically optional—science courses (biology, chemistry, physics) than on math exams, which are more frequently required for graduation. This work shows that providing female high school students the opportunity to delve more quickly into higher level STEM courses increases overall STEM participation, potentially improving later career decisions and performance.

This paper provides rigorous causal evidence that earning STEM credit increases the level of human capital investment within STEM disciplines, whereas earning non-STEM credit appears to free up students to broaden their curriculum. This result adds to our understanding of how students promote their own human capital development in secondary and postsecondary education. Although high school exposure to a college-level curriculum has been shown to improve college-going and completion (Edmunds et al. 2012; Haxton et al. 2016), this paper adds to our knowledge of how students use earned credit to shape postsecondary pathways. Shifts toward STEM courses may be particularly beneficial for students, as increasing human capital investment in even lower level math improves long-run labor force outcomes (Goodman 2019). The results also highlight one pathway that promotes equitable outcomes for female students in college-level STEM coursework.

The paper proceeds as follows. Section 2 describes the literature on AP and issues in STEM participation, section 3 provides context for the sample of AP participants, section 4 describes the methodology, section 5 provides results, and section 6 discusses implications.

The strongest correlate of college success is a student's level of academic preparation.3 Raising curricular standards or the amount of time spent in rigorous courses has been shown to increase short-term academic and long-term labor market outcomes (Edmunds et al. 2012; Taylor 2014; Cortes, Goodman, and Nomi 2015; Haxton et al. 2016; Goodman 2019). A number of papers have shown how AP exams in particular impact student outcomes. First, being induced to take an AP exam improved college preparatory behaviors, leading to increases in degree attainment and earnings (Jackson 2010, 2014). Receiving a higher AP score increases on-time bachelor's degree completion when the exam is linked to college credit (Smith, Hurwitz, and Avery 2017). Students who participate in AP STEM fields are more likely to major in those disciplines (Smith et al. 2018), and some portion of this shift in major is causal (Avery et al. 2018). Students are more likely to major in the AP discipline because of a behavioral response to the signal of receiving a higher AP integer score, whereas college credit has no independent impact on this shift in major. In addition, the signal was much stronger for students receiving a 5 (instead of a 4) than for students receiving a 3 (instead of a 2).

Evans (2019) uses the Beginning Postsecondary Survey to examine linkages between AP credits and a variety of postsecondary outcomes, and that paper serves as a good companion study to this one. He uses a nationally representative survey and a strong set of controls to minimize the bias typically found in correlational studies. This approach allows him to examine not just graduation and courses taken but a broader set of outcomes such as hours worked, debt, and graduate school enrollment. In contrast, my study uses a regression discontinuity design (described below) that makes a stronger causal claim by comparing essentially identical students, though at the cost of being unable to make broader comparisons, such as between students who took multiple APs and those who took few or none. As discussed below, both papers find strong evidence of increased math and science course-taking, though they differ along a few dimensions: my more narrowly tailored question shows no impacts on credits earned in the first year or overall, and prior national-level work using a similar methodology finds no impacts on enrollment in four-year colleges or bachelor's degree completion after six years (Smith, Hurwitz, and Avery 2017).4

Policy makers continue to focus on STEM education as vital for preparing the workforce of the future, given the importance of technological innovation for economic growth (Mokyr, Vickers, and Ziebarth 2015). Although many students enter college with a desire to major in a STEM discipline, attrition rates are high, particularly for female and minority students (Arcidiacono, Aucejo, and Spenner 2012; Morgan, Gelbgiser, and Weeden 2013; Stinebrickner and Stinebrickner 2014). The large returns to a STEM degree should serve as a natural motivation, but heterogeneous tastes dominate the preference for particular majors (Carnevale, Rose, and Cheah 2011; Carnevale, Cheah, and Hanson 2015; Wiswall and Zafar 2015). These results suggest that we need more information about which types of interventions can increase STEM degrees. For example, financial aid and incentives may help students overcome short-term credit constraints and invest more in their schooling, whereas tying aid eligibility to college GPA may increase the likelihood that students drop out of these more challenging courses (Dee and Jackson 1999; Denning and Turley 2017; Castleman, Long, and Mabel 2018).

Who might benefit most from the “intervention”—eliminating introductory courses and receiving a signal of higher skills—described in this paper? Among relatively higher-achieving students, research consistently finds lower STEM attainment among female students, in part due to differences in educational opportunities, how math or other science skills are socialized, or the importance of having same-gender role models (Ellison and Swanson 2010; Niederle and Vesterlund 2010; Lim and Meer 2017; Mansour et al. 2018; Kofoed and McGovney 2019).5 One explanation for male STEM success is overconfidence (relative to one's own skills), suggesting that the signaling value of higher AP scores (an external signal of their readiness) could be more valuable for female students (Huang and Kisgen 2013; Reuben, Wiswall, and Zafar 2017). Additionally, female students may gain more from skipping introductory “gateway” courses, which are typically large courses that—intentionally or not—create a broad distribution of grades through difficult examinations. Research has found that female students are more likely to exit STEM disciplines after receiving a poor grade (e.g., Buser and Yuan 2019), and the combination of removing a course requirement while eliminating the potential for a negative course grade “shock” may be particularly valuable for women. This is consistent with prior work finding that female students may be especially prone to benefiting from an academically rich environment at the high school or college level (Deming et al. 2014; Dynarski et al. 2018).

In sum, the AP “intervention” in this paper attempts to differentiate between two types of treatments. There is a signaling effect from receiving a higher score, analogous to an “informational” intervention, which occurs at all thresholds. At some thresholds there is an additional, more intensive intervention that provides the student college credit, eliminating an administrative hurdle toward graduation. Although both types of interventions have been shown to work in different contexts, interventions that eliminate an administrative or logistical barrier are typically found to be more effective. For example, providing students information about the Free Application for Federal Student Aid (FAFSA) had essentially no impact, whereas helping them complete the form increased college attendance; similar results were also found for elderly individuals eligible for Supplemental Nutrition Assistance Program benefits (Bettinger et al. 2012; Finkelstein and Notowidigdo 2019). In another context, providing students a signal of their high ability had little independent impact on college-going behaviors or outcomes, except when paired with college outreach and financial aid (Gurantz, Hurwitz, and Smith 2017). Comparing impacts across these two types of thresholds isolates the contribution of “informational” versus “administrative” support in pushing students into higher level courses.

Background Context

Data come from all AP exam takers who graduated from high school in one large state in 2004, 2005, and 2006. Students who enrolled in an AP course but did not attempt an exam are unobserved in my data. For privacy reasons, I cannot identify the state that shared the postsecondary transcripts. During this time period, this state was a proponent of “accelerated courses,” with the number of AP exams taken rising faster than in other large states. One way this occurred was through funding incentives, such as (1) covering AP exam fees for all students (whereas most states subsidized only low-income students); (2) offering small teacher bonuses for each student who passed an AP exam; and (3) providing extra funding for teachers in “low ranked” schools.

I focus on students who took one of the ten largest AP exams during this time period. I split these exams into two groups: STEM exams (Biology, Calculus AB/BC, Chemistry, Physics B, Statistics) and non-STEM exams (English Language, English Literature, Psychology, U.S. Government, World History). Students who take AP exams receive an integer score between 1 (the lowest score) and 5 (highest). Nationwide, many colleges offer college credit for specific AP exam scores—in some cases, colleges offer students the opportunity to skip a particular class (e.g., a student who scores a 3 on the Calculus AB exam can skip the first semester of introductory Calculus)—whereas in others a student could earn general credit that is not tied to a specific course. Earning a score of 3 or higher is often considered “passing” and frequently translates into one of these two benefits, though many schools offer additional credit for higher scores (e.g., a student who scores a 5 on the Calculus AB exam could skip a full year of introductory Calculus). In some cases, particularly at more selective schools, students might need to earn a 4 or 5 to earn basic credit, or may not be eligible for credit at all.

For the purposes of this study, and during the time period under review, the state's public colleges offered essentially uniform AP policies across postsecondary institutions. (These policies are summarized in Appendix table A.1, which is available in a separate online appendix that can be accessed on Education Finance and Policy’s Web site at https://doi.org/10.1162/edfp_a_00298.) First, each college offered AP credit—meaning the student was exempted from taking a specific introductory course—for earning a 3 on any of the ten subjects used in this study. In seven of the ten subjects (Biology, Calculus BC, Chemistry, Physics B, English Language, English Literature, World History), students who earned a 4 could earn credit explicitly tied to a second course. Only two STEM exams (Chemistry, Biology) offered students the right to skip an additional course for scoring a 5. In all cases the college credit was linked to the ability to skip an additional specified course. All of the courses also met general education requirements for earning a bachelor's degree. Biology and Chemistry differ from other disciplines in that students who want to major in these fields must take two semesters of “bundled” introductory coursework, where earning a 4 or 5 allows students to skip one or both semesters, respectively; students earning a 3 met the general education requirement for science but would have to take higher level courses to earn the major. From archival records it appears the state's two-year colleges followed guidelines similar to those of the in-state, four-year colleges.

College Board Data

As noted, students who take AP exams receive an integer score between 1 (the lowest score) and 5 (the highest), but underlying each AP exam is a continuous metric that generally runs between 0 and 150 points and is unobserved by the student. The continuous score is a composite that takes into account performance on both the multiple choice and free response sections, extends to four decimal places, and is collapsed into scaled integer scores using the modified Angoff method (Angoff 1971). Psychometricians determine the cut points that differentiate integer scores (i.e., a minimum score to earn a 2, 3, 4, or 5) before the exam is offered, and individual exam graders are unable to manipulate scores by awarding just enough points to push a student past a specific threshold.
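
To make the collapsing step concrete, here is a minimal sketch of how a continuous composite might map into an integer score; the cut points used are purely hypothetical illustrations, since actual cut points vary by exam and year.

```python
# Minimal sketch: collapsing a 0-150 composite into a 1-5 integer score.
# The cut points below are hypothetical, not actual AP values.
import bisect

def integer_score(composite: float, cuts=(40, 65, 90, 115)) -> int:
    """Return the 1-5 integer score for a continuous composite.

    `cuts` holds the (hypothetical) minimum composites needed to earn
    a 2, 3, 4, and 5, respectively.
    """
    return 1 + bisect.bisect_right(cuts, composite)

# Two essentially identical students can land on opposite sides of a cut:
assert integer_score(64.9) == 2 and integer_score(65.0) == 3
```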

My data contain both raw (unobserved) and integer (observed) AP scores, in addition to demographic characteristics (gender, ethnicity, high school attended) and information derived from other College Board services (e.g., SAT participation).

AP exam takers were linked to postsecondary transcript files from in-state public two- and four-year colleges. Two thirds of in-state, four-year attendees enrolled in the three largest public four-year colleges, but community college enrollment was much more broadly split across colleges. Transcript data are provided by the state and follow each cohort for four years of postsecondary attendance. Transcripts identify courses taken, their associated department and number of units, and the grade received. The transcript data do not contain fields on whether a student graduated, so I utilize National Student Clearinghouse data to identify those who earned a degree and their major.

Table 1 provides descriptive statistics for the full sample of AP exam takers. The first column shows that of the 70,770 unique students, 64.2 percent and 25.9 percent attended an in-state four-year or two-year college, respectively, with most of the remaining students attending in-state private or out of state schools (for which I do not have transcripts). Roughly 60 percent of the sample were female and 60 percent white. Socioeconomic status is significantly higher than national averages, with 57 percent of families having a college-educated parent and 19 percent earning $100,000 or more in income (both statistics are student self-reports). Roughly 90 percent of students attended an in-state four- or two-year college, with about 60 percent earning a bachelor's degree within six years and 8 percent earning a STEM degree.6 The second column of table 1 more closely matches the later analytic sample by restricting to students who attend an in-state public college and earned at least one 3 on an AP exam. Compared with the full sample, these students are more likely to be white and have higher socioeconomic status, earn higher SAT scores, and take, on average, almost five AP exams.

Table 1.

Summary Statistics, Advanced Placement (AP) Exam Takers from 2004 through 2006 High School Cohorts

                                                      Attended In-State Public School and
                                                        at Least One AP Exam Score ≥ 3
                                        All Students      All      Took STEM     No STEM
                                            (1)           (2)         (3)          (4)
N                                         70,770        30,887      21,448        9,439
Student demographics
  Female                                   59.8%         56.3%       51.7%        66.6%
  African American                         10.9%          5.7%        5.3%         6.5%
  Asian                                     6.5%          6.8%        8.3%         3.6%
  Hispanic                                 18.1%         15.0%       14.4%        16.4%
  White                                    60.9%         69.3%       69.1%        69.8%
  One parent has a bachelor's degree       57.3%         64.1%       66.2%        59.1%
  Student reported family income ≥ $100K   18.9%         22.4%       23.8%        19.4%
Exam scores
  Took SAT                                 91.8%         95.5%       96.9%        92.4%
  Final verbal score                         553           599         601          594
  Final math score                           555           597         617          549
  Total AP                                   3.6           4.6         5.4          3.0
  Total STEM                                 1.0           1.4         2.0          0.0
  Total non-STEM                             2.6           3.3         3.4          3.0
College outcomes
  Attend in-state four-year college        64.2%         84.3%       89.1%        73.2%
  Attend in-state two-year college         25.9%         15.7%       10.9%        26.8%
  Six-year bachelor's degree completion    59.4%         71.8%       75.9%        62.5%
  STEM degree                               8.1%         12.0%       15.6%         3.7%

Notes: All students took an AP exam and graduated from high school in one large state between 2004 and 2006. All values are based on unique student observations. Demographics are based on student self-reports, exam scores are based on College Board data, and college outcomes are based on National Student Clearinghouse data. STEM = science, technology, engineering, and math.

Columns 3 and 4 of table 1 describe large differences in characteristics between students who took at least one STEM exam and those who took none. STEM exam students are less likely to be African American or Hispanic and more likely to be Asian. STEM exam takers have SAT math scores that are almost 70 points higher, even though they have relatively similar SAT verbal scores. Students who took a STEM AP exam were roughly four times as likely to earn a STEM degree (15.6 percent versus 3.7 percent). This suggests that early exposure to college-level STEM courses is a key precursor to earning a STEM degree, even though this cannot be interpreted as a causal impact due to strong observable—and likely unobservable—differences between these two groups.

I create two sets of outcome measures to examine students’ curricular choices. The first is based on total courses attempted in three broad categories: (1) higher level courses within the AP exam field (e.g., does passing AP Biology lead students to take Biology courses that are a higher level than the AP-relevant course?); (2) total STEM units outside the AP field; and (3) total non-STEM units.7 The second set of outcome measures attempts to examine student tradeoffs in curricular breadth and depth. Curricular breadth is the count of each student's total number of departments in which they participate, and whether they took just one course or two or more courses within that department. An increase (decrease) in this number then indicates a broader (narrower) set of courses. For depth, I create a metric that identifies units that are most closely aligned with the relevant AP field, relying on college course transcripts as a guide. To do so, I use the following process: (1) identify all students who majored in a specific field; (2) aggregate their total units attempted within each of the many possible departments; and (3) identify the departments that have the highest average unit totals.8 This process then uses the transcript data, rather than human attribution, to identify key departments linked to a specific major.9 My primary outcome defines depth as taking more units within the top three departments related to the major.
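
As an illustration of this process, the sketch below ranks departments by average units among majors in a field; the data frames and column names (`transcripts`, `majors`, `units`) are hypothetical stand-ins for the state transcript files.

```python
# Sketch of the depth/breadth metrics; all names are hypothetical stand-ins.
import pandas as pd

def top_departments(transcripts: pd.DataFrame, majors: pd.DataFrame,
                    field: str, k: int = 3) -> list:
    """Departments with the highest average units among majors in `field`."""
    # (1) identify all students who majored in the given field
    ids = majors.loc[majors["major"] == field, "student_id"]
    sample = transcripts[transcripts["student_id"].isin(ids)]
    # (2) aggregate each student's total units attempted per department
    totals = (sample.groupby(["student_id", "department"])["units"]
              .sum().reset_index())
    # (3) keep the k departments with the highest average unit totals
    return (totals.groupby("department")["units"]
            .mean().nlargest(k).index.tolist())

def breadth(student_transcript: pd.DataFrame) -> int:
    """Breadth: number of distinct departments with at least one course."""
    return student_transcript["department"].nunique()
```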

I use a regression discontinuity design to compare students with essentially identical continuous AP scores who receive different AP exam integer scores. As a result of these minor differences in exam performances, students either do or do not receive AP credit that gives them the choice of skipping introductory courses.

As each AP exam contains multiple thresholds, the preferred estimation strategy is as follows. First, I stack the 2/3, 3/4, and 4/5 thresholds, thus estimating the combined effect of crossing any integer threshold. (I omit the 1/2 threshold—as no college offers credit at that point—but discuss these results below). In order to avoid using the same students multiple times (e.g., someone above a score of 3 could also be considered below a 4), I restrict to relatively short bandwidths around each discontinuity. In the sample, restricting to six-point bandwidths produces results without overlap; for comparison, using seven- and ten-point bandwidths results in overlap rates of 1.5 percent and 16.7 percent, respectively.10 Finally, estimated results include only thresholds where there is differential credit offered by in-state public colleges. This approach has two virtues. First, thresholds that offer no specific college credit serve as an alternative test of whether course-taking patterns are impacted just by signaling, and I later show there are no large impacts at these points. Second, this approach maximizes power by focusing results only on credit-granting thresholds and concentrating on students relatively close to these boundaries.
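
The stacking step can be summarized with a short sketch; the data frame `exams` and its columns (a continuous `raw_score` plus exam-by-year cut-point columns `cut_23`, `cut_34`, `cut_45`) are hypothetical stand-ins for the College Board files.

```python
# Sketch of the stacked-threshold sample; column names are hypothetical.
import pandas as pd

BANDWIDTH = 6  # six points: the widest window with no overlap across thresholds

def stack_thresholds(exams: pd.DataFrame) -> pd.DataFrame:
    stacked = []
    for label, cut_col in [("2/3", "cut_23"), ("3/4", "cut_34"), ("4/5", "cut_45")]:
        d = exams.copy()
        d["dist"] = d["raw_score"] - d[cut_col]    # running variable, centered at the cut
        d = d[d["dist"].abs() <= BANDWIDTH]        # keep observations near the cut
        d["above"] = (d["dist"] >= 0).astype(int)  # Threshold indicator in equation 1
        d["threshold"] = label
        stacked.append(d)
    return pd.concat(stacked, ignore_index=True)
```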

This leads to the following estimating equation:
\[
Y_{irty} = \beta_0 + \beta_1 f(\mathrm{score}_{irty}) + \beta_2\,\mathrm{Threshold}_{irty} + \beta_3\,\mathrm{Threshold}_{irty} \times f(\mathrm{score}_{irty}) + \theta_{rty} + X_{irty} + \varepsilon_{irty} \qquad (1)
\]
In this model, $Y_{irty}$ is the outcome of interest (e.g., courses attempted) for student $i$ taking exam $r$ at threshold $t$ in year $y$; $f(\mathrm{score}_{irty})$ is a function of an individual's distance from the year- and exam-specific threshold, centered at the eligibility cutoff; $\theta_{rty}$ is a set of year-exam-threshold fixed effects; and $X_{irty}$ is a vector of baseline observable characteristics that are included only as robustness checks. $\mathrm{Threshold}_{irty}$ is an indicator that equals one if a student scores above an AP threshold, so that $\beta_2$ identifies the causal reduced-form effect of being offered AP credit on later outcomes. Results rely on linear specifications with rectangular kernels and utilize robust standard errors.
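
As a sketch of how equation 1 could be estimated, the snippet below runs the stacked specification with statsmodels on the hypothetical data frame `df` from the previous sketch; `above*dist` expands into the threshold indicator, the running variable, and their interaction.

```python
# Sketch of equation 1; `df` is the stacked sample from the previous snippet.
import statsmodels.formula.api as smf

# Combined year-exam-threshold fixed effect (theta_rty in equation 1).
df["fe"] = df["exam"] + "_" + df["year"].astype(str) + "_" + df["threshold"]

model = smf.ols("outcome ~ above*dist + C(fe)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["student_id"]})  # clustered by student
print(model.params["above"])  # beta_2: reduced-form effect of the credit offer
```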

The validity of the research design depends on there being no sorting near the AP integer thresholds that would invalidate the assumption that students on either side are comparable. Theoretically, manipulation is implausible because students do not observe their continuous score and cannot precisely manipulate it to land above these benchmarks. Cut scores are determined prior to AP scoring, so they are not set by natural breaks that might idiosyncratically occur in a given year.

Empirically, I follow the standard approach of examining the continuity of the density of observations and of students’ baseline characteristics near the threshold. Given that I can only observe transcripts for students who attend in-state public colleges, any evidence that AP performance shifts students toward private or out-of-state colleges would invalidate the research design. The top panel of online Appendix table A.2 shows that students are not more likely to attend an in-state four- or two-year public college as a result of receiving a higher integer score. In addition, table A.2 shows that earning a higher integer score has no impact on a student persisting through all four years of the transcript data. Thus my reliance on students with transcripts does not invalidate the research design.

I restrict to only those students with transcripts and examine a variety of individual-level covariates theoretically unassociated with AP performance: gender, ethnicity, taking the SAT, SAT performance, total AP exams taken, parent education, and reported income. Online Appendix table A.3 shows no evidence that any of these values change discontinuously at the threshold, providing more evidence that the research design is valid.11 A standard McCrary test finds no evidence of a discontinuity in observations at the thresholds.12
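
These balance checks amount to re-estimating the same specification with each baseline characteristic as the outcome; a sketch, again using the hypothetical stacked sample `df` from the earlier snippets:

```python
# Covariate-balance sketch: a significant `above` coefficient would signal
# sorting at the threshold. Covariate names are hypothetical.
import statsmodels.formula.api as smf

for cov in ["female", "took_sat", "sat_math", "total_ap", "parent_ba"]:
    res = smf.ols(f"{cov} ~ above*dist + C(fe)", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["student_id"]})
    print(f"{cov}: est={res.params['above']:.4f}, p={res.pvalues['above']:.3f}")
```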

Impacts of AP Credit on Participation in Introductory Courses

Table 2 examines to what extent students decrease their participation in introductory courses when offered AP credit. (All tables in this section utilize a regression discontinuity design using student-by-exam observations within six points of the 2/3, 3/4, and 4/5 AP integer thresholds, adding AP test-by-year and cohort fixed effects; design decisions are discussed in section 4.) The first column of table 2 shows that students are 35, 23, and 26 percentage points less likely to take the corresponding course when offered credit at the 2/3, 3/4, and 4/5 thresholds, respectively. This corresponds to a decline of 73 percent, 64 percent, and 45 percent, respectively, over baseline rates of students who are below the threshold and choose to take the course. As expected, most students offered credit opt out of the course, but a nontrivial portion of students choose to retake the same material. Close to 20 percent of students offered STEM credit retake the course, with a little over 10 percent retake rate in non-STEM exams; scatter plots for the 2/3 threshold are shown in figure 1.
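
For reference, the percentage declines follow directly from dividing each point estimate by its below-threshold baseline take-up rate in table 2:

```latex
\[
\frac{0.346}{0.476} \approx 0.73, \qquad
\frac{0.229}{0.358} \approx 0.64, \qquad
\frac{0.258}{0.574} \approx 0.45 .
\]
```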

Table 2.

Impacts of Crossing Advanced Placement (AP) Threshold on Taking AP Relevant Course within Four Years

                                        Took College Course that      Took College Course that
Student Achieved an                     Corresponded to Credit        Corresponded to Credit Offered
AP Integer Score of:           N        Offered at AP Threshold       at Lower Level AP Threshold
3                           25,971            −0.346**
                                              (0.010)
  Baseline value                               47.6%
4                           11,344            −0.229**                       −0.041**
                                              (0.015)                        (0.009)
  Baseline value                               35.8%                          7.4%
5                              659            −0.258**                       −0.231**
                                              (0.074)                        (0.071)
  Baseline value                               57.4%                         47.8%

Notes: All students in the sample took an AP exam, graduated from high school in one large state between 2004 and 2006, and attended an in-state public college. Estimates derive from equation 1 in the text, which provides a regression discontinuity estimate using a linear specification over a bandwidth of six AP points, and includes test-by-year and cohort fixed effects. An observation is a unique student-by-test observation, with standard errors clustered by student. Baseline values are listed below the regression estimates and standard error, and include all students within two points below the threshold.

**p < 0.01.

Table 2 shows two potential reasons students might retake a course for which they have already earned credit.13 The first reason is that some students may desire a stronger signal of their academic performance before they feel ready to skip the college material. This can be seen in the second column of table 2, where some students who are eligible to skip a course by earning a 3 wait until they earn a 4 before doing so. At the 3/4 threshold, only 7.4 percent of students earning a 3 still retake the course, but this drops by 4.1 percentage points (55 percent) for essentially identical students earning an AP score of 4. (It bears noting that these are students with a very strong 3, even though they do not observe their exact raw score.) The second reason appears related to course “bundling,” where two courses are part of a multicourse sequence and students choose to skip only when they receive credit for both courses (e.g., Biology 1A and 1B). We can observe this at the 4/5 threshold, where students who earn a 4 or 5 can skip either one or both semesters, respectively, of the introductory courses required for Biology or Chemistry majors.14 The data show that students earning a 5 are roughly 25 percentage points less likely to take both the first and second semester of Biology/Chemistry when offered the opportunity. Thus, some students are making the choice to skip the entire sequence or not, rather than considering individual courses in isolation.15

Figure 1.
Impacts of College-Credit on Taking Introductory Courses Associated with Advanced Placement (AP) Exams

Notes: Figure includes all student-by-exam observations within six points of integer AP score thresholds that provide college credit. Sample includes all high school graduates from 2004 to 2006 in one large state who attended in-state public colleges. STEM = science, technology, engineering, and math.

A final finding, not shown in the tables, is that earning credit at the 2/3 threshold in either Calculus or Statistics reduces the likelihood that students take lower level, remedial math courses by roughly 15 percentage points.16 The most plausible reason is that students are often required to take math placement exams, and the presence of AP credit can serve as a substitute accountability method that reduces the likelihood that students perform poorly and are misplaced (Scott-Clayton and Rodriguez 2015). I find no statistically significant or meaningful results on taking lower level English courses for students at the margins of earning credit on the AP English Language or Literature exams, though this could be due to differences in the accuracy or utilization of the specific placement policies.

Impacts of AP Credit on Subsequent Course Taking

Table 3 examines the total number of courses taken by students after one and four years of postsecondary enrollment, with graphical results presented in figure 2. Overall, students take significantly more STEM (non-STEM) courses when offered credit for passing a STEM (non-STEM) AP exam. Subsequent tables show this is not due to signaling or other alternative explanations, and is driven solely by allowing students the opportunity to skip introductory courses.

Table 3.

Impacts of Crossing Advanced Placement (AP) Threshold on Courses Attempted, Credit-Granting Thresholds Only

                                                 STEM                      Non-STEM
                                         1 Year      4 Years        1 Year      4 Years
                                           (1)         (2)            (3)         (4)
Total courses                            0.077       −0.211          0.016       −0.161
                                        (0.105)      (0.399)        (0.067)      (0.257)
  Baseline value                          9.5         36.0            9.1         34.1
Total courses in AP subject
  above AP exam                          0.095**      0.141+        −0.003       −0.034
                                        (0.034)      (0.078)        (0.012)      (0.036)
  Baseline value                          0.5          1.6            0.3          0.9
Took zero courses (%)                   −0.067***    −0.036+         0.013        0.032**
                                        (0.019)      (0.019)        (0.010)      (0.011)
  Baseline value                         68.8%        42.2%          70.0%        48.6%
Took one course (%)                      0.047**      0.017         −0.022*      −0.010
                                        (0.016)      (0.015)        (0.010)      (0.011)
  Baseline value                         18.7%        20.2%          27.3%        32.0%
Took two or more courses (%)             0.020        0.019          0.010*      −0.022*
                                        (0.013)      (0.020)        (0.004)      (0.009)
  Baseline value                         12.5%        37.6%           2.7%        19.4%
STEM courses                             0.234**      0.244          0.039       −0.258
                                        (0.084)      (0.289)        (0.056)      (0.181)
  Baseline value                          2.9          9.5            3.0          9.0
Non-STEM courses                         0.015       −0.129          0.249***     0.477*
                                        (0.096)      (0.380)        (0.059)      (0.237)
  Baseline value                          5.5         23.9            5.4         23.8
N                                              9,801                      28,159

Notes: All students in the sample took an AP exam, graduated from high school in one large state between 2004 and 2006, and attended an in-state public college. Estimates derive from equation 1 in the text, which provides a regression discontinuity estimate using a linear specification over a bandwidth of six AP points, and includes test-by-year and cohort fixed effects. An observation is a unique student-by-test observation, with standard errors clustered by student. Baseline values are listed below the regression estimates and standard error, and include all students within two points below the threshold. STEM = science, technology, engineering, and math.

+p < 0.1, *p < 0.05, **p < 0.01.

The first two columns of table 3 focus on STEM involvement. First, students take an additional 0.10 (19 percent) higher level courses within the AP discipline during the first year (e.g., skipping introductory Biology allows students to take higher level and upper-division Biology courses). (All outcome measures are described in section 3.) Students also take 0.23 (8 percent) more STEM courses outside of the immediate AP exam field during the first year of study. After four years, the estimated impacts on courses in the AP exam subject and total STEM courses are similar in magnitude, but either retain marginal significance or lose significance given the increased dispersion of students’ course-taking choices. Graphical results presented in online Appendix figure A.2 show regression estimates across bandwidths and support the finding of consistent positive results on taking higher level AP field and STEM courses, with some estimates of four-year impacts reaching marginal levels of statistical significance. Receiving STEM credit has no impact on non-STEM units taken.

Figure 2.
Impact of College-Credit on One- and Four-Year Course-Taking Patterns

Notes: Figure includes all student-by-exam observations within six points of integer Advanced Placement (AP) score thresholds that provide college credit. Sample includes all high school graduates from 2004 to 2006 in one large state who attended in-state public colleges. STEM = science, technology, engineering, and math.

Students earning non-STEM credit use this opportunity to take more non-STEM courses, averaging an additional 0.25 (5 percent) and 0.48 (2 percent) courses after one and four years, respectively (table 3, columns 3 and 4). There are no impacts on STEM courses taken. In contrast to STEM, earning non-STEM credit leads students to take fewer courses within the AP discipline in the long run. After four years, students offered non-STEM credit are 3.2 percentage points (7 percent) more likely to have taken no courses in that discipline, indicating that they avoid this subject area entirely in favor of taking courses in alternative departments.

Impact estimates are not due to functional form or measurement issues. Online Appendix figure A.2 confirms results for STEM exam takers over bandwidths from three to nine points, with covariate-adjusted specifications producing similar results. Online Appendix figure A.3 does the same for non-STEM exams. Alternate measures of course-taking that examine total units attempted or passed, rather than courses, also yield similar results (omitted for brevity).

Distinguishing Between College Credit and Alternative Explanations

Although these course-taking shifts could indicate that students are finishing college with increased investment or specialization within specific disciplines, there may be alternative explanations. First, I eliminate the possibility that the increase in courses taken comes from students on either side of the threshold staying enrolled for shorter or longer periods of time due to early graduation (as in Smith, Hurwitz, and Avery 2017), dropout, or transfer. Early exits are not a concern, as there are no impacts of AP thresholds on remaining in an in-state public college for four years, as previously shown in the top panel of Appendix table A.2, and results are identical when using only students who attend for at least four years (results omitted for brevity). A separate concern is that AP credit induces on-time graduation, such that the increase in units attempted does not represent increased depth of study in the long run. The bottom panel of table A.2 confirms the results from Smith, Hurwitz, and Avery (2017), and does find small increases in four-year bachelor's degree completion. Nonetheless, these impacts are statistically insignificant at the credit thresholds, and too small in magnitude to explain the shifts in course-taking documented in this study.17 In particular, the estimated impact on earning a STEM degree is a statistically insignificant 0.3 percent, suggesting that increases in STEM courses represent real human capital investment gains and do not arise from shifts in college major.

An alternate explanation is that these gains arise from the signal that students internalize from receiving a higher AP score, rather than college credit. It first bears noting that this would contradict the two prior papers that use a similar methodology: (1) gains in on-time degree completion were driven by credit-granting thresholds; (2) signaling effects at non-credit thresholds are too small to explain the effects observed here (and in previous work did not produce additional STEM majors, instead shifting STEM majors across specific disciplines). Online Appendix table A.4 reproduces table 3 but uses only the AP thresholds where students are not offered any course credit. As the results show, there are no significant results for any AP STEM results and only one significant finding for non-STEM after one year, highlighting that these gains occur from the credit increases rather than the positive signal associated with the integer score itself.18

Finally, observed gains in course-taking might not necessarily be indicative of increased depth within the curriculum if the gains are relatively unconnected to the relevant discipline. As described in section 3, I test additional metrics for depth and breadth designed to proxy for these curricular values, using only students who persisted for four years. Online Appendix table A.5 finds that STEM credit increases depth as students take 0.24 more courses (11 percent) in the top three departments most directly related to that AP major. This comes at the expense of curricular breadth, with students taking courses in 0.43 fewer total departments. For non-STEM students the results are quite different, as students take 0.4 more courses in departments that are not considered essential to the AP-specific major, with no evidence that they focus more narrowly on the AP-related non-STEM major. Results are identical when using alternate thresholds (e.g., the top five departments) or metrics (e.g., departments as the total percentage of courses).

Gender Differences in the Impacts of AP Credit on Subsequent Course-Taking

Table 4 examines gender differences in the responsiveness to college credit, and shows that female students are more likely than male students to shift their course-taking patterns.19 After four years, STEM credit increases female student investment by 0.17 AP exam subject courses (12 percent) and 0.76 STEM courses (9 percent). Male students take essentially no additional higher level STEM courses, with female–male differences in taking more courses in the AP discipline or total STEM courses statistically different at the p < 0.1 level.20

Table 4.

Gender Differences in Impacts of Crossing Advanced Placement (AP) Threshold on Courses Attempted within Four Years

                                            STEM                               Non-STEM
                                Female      Male                   Female      Male
                                  (1)        (2)      p-value        (3)        (4)      p-value
Total courses                   −0.334      0.075       0.60       −0.089     −0.368       0.60
                                (0.535)    (0.582)                 (0.323)    (0.420)
  Baseline value                 37.3       34.8                    34.8       33.1
Total courses in AP subject
  above AP exam                  0.172+     0.032       0.36       −0.028     −0.042       0.84
                                (0.104)    (0.113)                 (0.049)    (0.050)
  Baseline value                  1.4        1.7                     1.0        0.8
Took zero courses (%)           −0.064*     0.004       0.07        0.031*     0.028+      0.88
                                (0.027)    (0.026)                 (0.014)    (0.017)
  Baseline value                 44.9%      39.2%                   46.5%      51.8%
Took one course (%)              0.039+     0.001       0.22       −0.007     −0.007       1.00
                                (0.023)    (0.021)                 (0.014)    (0.017)
  Baseline value                 22.1%      18.6%                   32.4%      31.4%
Took two or more courses (%)     0.025     −0.005       0.43       −0.024*    −0.021       0.86
                                (0.026)    (0.028)                 (0.012)    (0.013)
  Baseline value                 33.0%      42.2%                   21.1%      16.7%
STEM courses                     0.759+    −0.263       0.08       −0.105     −0.391       0.45
                                (0.421)    (0.394)                 (0.221)    (0.305)
  Baseline value                  8.6       10.5                     8.2       10.2
Non-STEM courses                −0.828      0.765       0.03        0.424      0.361       0.90
                                (0.537)    (0.510)                 (0.302)    (0.370)
  Baseline value                 26.3       21.6                    25.1       21.7
N                                4,809      4,991                  17,140     11,019

Notes: All students in the sample took an AP exam, graduated from high school in one large state between 2004 and 2006, and attended an in-state public college. Estimates derive from equation 1 in the text, which provides a regression discontinuity estimate using a linear specification over a bandwidth of six AP points and includes test-by-year and cohort fixed effects. An observation is a unique student-by-test observation, with standard errors clustered by student. Baseline values are listed below the regression estimates and standard error, and include all students within two points below the threshold. The p-values derive from a test of differences between the male and female regression estimates.

+p < 0.1; *p < 0.05.

The point estimates suggest that earning AP credit closes roughly one half of the male–female gap in STEM participation. This derives from the baseline values—shown underneath the point estimates in table 4—which indicate that female students take 1.4 AP exam subject courses and 8.6 STEM courses, whereas the male control means are 1.7 and 10.5, respectively. Thus, even the highest-performing high school students—those who are taking and passing AP STEM exams—continue to exhibit large differences in subsequent STEM participation, and receiving early college credit helps ameliorate this issue.21
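
As a back-of-the-envelope check, dividing the female point estimates in table 4 by the baseline male–female gaps gives the share of each gap closed, which brackets the "roughly one half" summary:

```latex
\[
\frac{0.172}{1.7 - 1.4} \approx 0.57 \quad \text{(AP-subject courses)}, \qquad
\frac{0.759}{10.5 - 8.6} \approx 0.40 \quad \text{(total STEM courses)} .
\]
```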

Differences Across Required and Optional College Courses

General education requirements for earning a bachelor's degree lead to a fundamental difference between AP exams. Math (i.e., Calculus and Statistics) and English AP exams eliminate a course requirement that applies to essentially all students, such that students who do not pass these exams must take the corresponding courses in college. In contrast, the requirements associated with the other STEM and non-STEM exams can be satisfied by many different general education courses, so a student who earns a 2 on AP Psychology, as one example, may take any number of other social science courses to earn her degree.22

Table 5 probes whether there are differences in outcomes across AP exams and whether these vary by gender. As each exam is individually underpowered, I combine exams into four categories that are “required” (English; Math) or “not required” (Biology, Chemistry, and Physics; U.S. Government, History, and Psychology). I find that the largest impacts on STEM courses taken are among female students passing these “not required” STEM exams. Thus, female students whose AP scores make them eligible to skip core science courses take additional STEM courses outside the AP exam field. The results point to the idea that offering and encouraging participation in more science-focused courses in high school can encourage increased depth of STEM study for female college students. Although female students take an additional 1.2 STEM courses, implying roughly a one-for-one substitution effect, all estimates are quite noisy at this level of disaggregation and should be considered general trends in behavior rather than precise estimates. The increase in higher level STEM courses appears to come from a shift out of non-STEM courses.

Table 5.

Gender Differences in Impacts of Crossing Advanced Placement (AP) Threshold on Courses Attempted within Four Years, by Exam Type

                             Calculus/           Biology/Chemistry/     English Language/     History/Government/
                             Statistics          Physics                Literature            Psychology
                           Female     Male      Female     Male        Female     Male       Female     Male
                             (1)       (2)        (3)       (4)          (5)       (6)         (7)       (8)
Total courses in AP subject
  above AP exam            0.212+    0.170      0.172     −0.016       −0.004    −0.037      −0.085*   −0.053
                          (0.108)   (0.141)    (0.182)    (0.176)     (0.068)   (0.077)     (0.042)   (0.045)
  Baseline value            1.08      1.76       1.67       1.71        1.28      1.19        0.32      0.30
STEM courses               0.294    −0.188      1.226*    −0.376       0.110    −0.473      −0.591    −0.251
                          (0.589)   (0.575)    (0.595)    (0.554)     (0.264)   (0.388)     (0.403)   (0.488)
  Baseline value            8.05      9.35       9.19      11.60        8.09     10.32        8.32      9.92
Non-STEM courses          −0.173     0.747     −1.492+     0.813       0.264     0.528       0.794     0.082
                          (0.744)   (0.745)    (0.767)    (0.717)     (0.361)   (0.475)     (0.546)   (0.594)
  Baseline value           26.75     21.95      25.81      21.20       24.82     21.24       25.88     22.35
N                          2,470     2,526      2,341      2,466      11,999     6,679       5,141     4,340

Notes: All students in the sample took an AP exam, graduated from high school in one large state between 2004 and 2006, and attended an in-state public college. Estimates derive from equation 1 in the text, which provides a regression discontinuity estimate using a linear specification over a bandwidth of six AP points and includes test-by-year and cohort fixed effects. An observation is a unique student-by-test observation, with standard errors clustered by student. Baseline values are listed below the regression estimates and standard error, and include all students within two points below the threshold.

+p < 0.1; *p < 0.05.

I find no strong positive effects when examining non-STEM exams, though again the results are statistically imprecise. Being offered college credit does not appear to lead to substantially different results when comparing the English exams with the social science exams (World History, U.S. Government, or Psychology), with all exhibiting positive but statistically insignificant impacts on additional non-STEM courses. There is some weak evidence that female students who are offered credit on these “not required” social science exams are less likely to take higher level courses in these disciplines.

This study uses AP exam takers to show the impact of early college credit on postsecondary pathways. For all students, I find that being offered course credit leads to large reductions in the likelihood that they take introductory college courses, though how they use their newfound freedom varies by exam type and gender. Female students who earn STEM credit increase their curricular depth by taking higher level science courses in areas both directly and indirectly related to the AP exam, with no observable impacts for male students. Students who earn AP credit in non-STEM disciplines diversify the breadth of their curriculum, taking more courses in departments not directly tied to their major. It bears repeating that the increases in courses taken do not arise from large changes in STEM or non-STEM majors, which are too small in magnitude to account for these shifts; thus the results should be considered a comparison of two identical STEM or non-STEM majors who make different decisions in their allocation of courses taken.

There are large differences in how often students major in the AP field that may explain the different patterns between STEM and non-STEM exams. For example, students who take AP Biology are much more likely to major in Biology than students who take AP History are to major in History (specific rates are given in Avery et al. 2018). Thus, STEM exam takers may be more focused on their field and shift more effort into STEM courses than might be observed in non-STEM areas. On the other hand, non-STEM disciplines may be more interdisciplinary in nature, with History majors benefitting more from insights in other fields (e.g., Economics or Sociology). Unfortunately, even when using one very large state with multiple years of data and a causal design that produces large changes in introductory course-taking, there is so much variation in curricular patterns that I am underpowered to investigate some of these possibilities.

Although there are some concerns that students do not use their scores to waive out of introductory courses, I observe large changes in student participation in introductory courses, with retake rates of roughly 10 percent in non-STEM and 20 percent in STEM disciplines. Retake rates in STEM disciplines are higher in part due to “bundled” courses, where students prefer to use their credit only when they can eliminate the entire first-year sequence of introductory science courses. If we would like to discourage retaking, one option may be to provide students a more nuanced picture of their true AP performance. This could include some details of their raw scores as additional information, given that consumers can be overly responsive to coarse grading schemes (Figlio and Lucas 2004; Anderson and Magruder 2012).

Earning college-level credit in high school leads female students to increase their human capital investment in STEM, producing stronger college graduates. Although this paper cannot answer why female students are more responsive to the credit offering, I offer a few thoughts. An important contextual point is that these shifts in STEM participation substantially shrink but do not eliminate female–male gaps in STEM participation, even though we are implicitly conditioning on the AP score. Thus, the impacts are in part explained by the fact that female students with equal academic performance on AP STEM exams take fewer STEM courses in college. Prior research has shown that women are more likely to avoid competitive environments, even when they have the requisite ability to perform well, and large introductory courses might be a particularly challenging environment for some female STEM students, leading them to shift some of their future coursework into other areas (Niederle and Vesterlund 2007; Niederle 2016). The results cannot be driven entirely by signaling effects—where women might be more responsive to external markers of their ability due to underrepresentation in STEM disciplines—because I find no impacts at thresholds that do not grant credit. Nonetheless, there might be some particular benefit to female students from the combination of the signal and credit, as an earlier start in higher level coursework might constitute a large personal boost in their momentum toward the degree, or could result in female students receiving extra mentoring from professors that encourages them to take additional coursework.

One implication of this study is that encouraging early credit policies helps colleges produce graduates whose skills align with commonly cited social or economic priorities. Yet there is an open question about how best to meet this goal, with one claim being that too much rigorous coursework at an early stage might negatively impact outcomes for some students (Pope, Brown, and Miles 2015). Some colleges have suggested that introductory courses serve an important purpose in socializing students to their new environment, and that students might benefit from the unique experiences of college faculty (Bettinger and Long 2005; Figlio, Schapiro, and Soter 2013). Although these are worthy goals, states subsidize much of the cost of postsecondary enrollment, and this paper shows that they might want to promote early credit programs to support broader societal goals, such as developing STEM graduates with stronger skills (Webber 2017). Although balancing these competing demands is a complex issue, states and districts are engaged in numerous efforts to increase early credit opportunities, through avenues such as Advanced Placement exams, International Baccalaureate programs, early college high schools, and traditional dual enrollment in public two- and four-year colleges. Whatever the potential benefits of these programs, their ultimate utility will be limited if students’ early college credit is not recognized when they enroll in college.

Thanks to the College Board for providing the data and the opportunity to work on this project. The paper was improved by feedback from participants at the Association for Public Policy Analysis and Management conference, as well as from the reviewers and editor of this journal. This research reflects the views of the author and not those of his affiliated institutions.

Notes

1. There are three other mechanisms through which AP scores could impact postsecondary investments. The first is that stronger academic preparation shifts students’ course decisions (Smith et al. 2018; Wyatt, Jagesic, and Godfrey 2018), but the use of a regression discontinuity (RD) design (described below) eliminates that issue in this context. The second is if AP exams lead students to attend more selective institutions, but this has been shown not to be the case in a narrowly tailored RD framework (Smith, Hurwitz, and Avery 2017). The third is if AP credits increase the likelihood of double majoring. Both Evans (2019) and Ewing, Jagesic, and Wyatt (2018) find positive impacts on double majors after comparing students who take or pass AP to those who do not, after applying a rich set of controls. In an RD framework similar to this paper’s, Avery et al. (2018) find no impacts on double majors, though this was not reported in published versions.

2. These results align with prior work that found time-to-degree effects predominately at credit-granting thresholds, with meaningful but much smaller signaling effects irrespective of credit offerings.

3. Obviously, academic preparation is correlated with other factors, such as family income or wealth, but these are often not well captured in traditional administrative datasets.

4. Evans (2019) also finds increases in first-year grade point average (GPA), but I show that students with AP credit generally skip the introductory course, so it is challenging to determine whether this is due to AP causally improving academic performance or simply shifting students into classes with different grade distributions. Wyatt, Jagesic, and Godfrey (2018) also find similar or stronger performance in higher level college courses when comparing AP students who skipped the introductory course with non-AP students who passed it.

5. Female academic attainment has increased significantly over the past few decades, and numerous studies find strong intervention impacts among male students, but these are often focused on students at the lower end of the ability distribution.

6. A STEM degree is defined as a major whose first two digits of the Classification of Instructional Programs (CIP) code are 11 (Computer and Information Sciences), 14 (Engineering), 15 (Engineering Technologies and Related Fields), 26 (Biological and Biomedical Sciences), 27 (Mathematics and Statistics), or 40 (Physical Sciences).

7. To account for different AP exam types, I alter the STEM and non-STEM definitions to exclude courses directly related to that field. For example, if the AP exam is Biology, then STEM courses include all STEM courses outside the Biology department; if the AP exam is English, then non-STEM courses include all non-STEM courses outside the English department.
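To make the definitions in notes 6 and 7 concrete, the following is a minimal sketch assuming each course record carries a CIP code and a department label; the function names and data layout are hypothetical illustrations, not the paper’s actual code. Only the two-digit CIP prefixes come from the text.

```python
# Sketch of the STEM definitions in notes 6 and 7; names are hypothetical.

STEM_CIP_PREFIXES = {"11", "14", "15", "26", "27", "40"}

def is_stem_major(cip_code: str) -> bool:
    """A major is STEM if the first two digits of its CIP code match note 6."""
    return cip_code[:2] in STEM_CIP_PREFIXES

def count_stem_courses(courses, ap_department):
    """Count STEM courses, excluding the department directly tied to the
    student's AP exam (note 7). `courses` is an iterable of
    (cip_code, department) pairs."""
    return sum(
        1
        for cip_code, department in courses
        if cip_code[:2] in STEM_CIP_PREFIXES and department != ap_department
    )
```

For instance, `is_stem_major("26.0101")` would flag a Biology major as STEM, while `count_stem_courses(transcript, "Biology")` would drop Biology-department courses for an AP Biology taker.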

8. College major is defined using the two-digit CIP code in the NSC data. Most AP exams link easily to a specific major (e.g., AP English matches to English), but there are three exceptions: (1) Chemistry and Physics both link to the CIP code for “Physical Sciences”; (2) Calculus and Statistics are linked to both Math and Engineering, given that Engineering is the most popular major choice beyond the relatively few Math majors; and (3) AP U.S. Government is linked to the broad Social Sciences CIP code.

9. As an illustrative example, students who take AP Chemistry are linked to the Chemistry college major, whose top three departments are (in order): Chemistry, Physics, and Process Biology Genetics (PBG). Chemistry majors take 20.5 percent of their courses in Chemistry, 12.9 percent in Physics, and 3.4 percent in PBG. I could alternately identify the departments that constitute 30 percent (or 40 percent, etc.) of all major-specific units, which would then include just Chemistry and Physics (as these two departments combine to 33.5 percent of all units). These two alternate formulations provide similar results, particularly as I vary the number of top departments or percentage totals in robustness checks. Online Appendix table A.6 shows the top ten departments and associated percentages for each AP exam major used in the paper.
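The two formulations in this note can be sketched in a few lines of pandas, using the shares reported above for Chemistry majors; the column and variable names are hypothetical.

```python
import pandas as pd

# Unit shares for Chemistry majors, as reported in note 9 (percent).
units = pd.DataFrame({
    "department": ["Chemistry", "Physics", "PBG"],
    "pct_of_units": [20.5, 12.9, 3.4],
})

# Formulation 1: the top-k departments by unit share (here k = 3).
top_three = units.nlargest(3, "pct_of_units")["department"].tolist()

# Formulation 2: the smallest set of departments whose cumulative share
# reaches a threshold such as 30 percent (here: Chemistry and Physics).
ranked = units.sort_values("pct_of_units", ascending=False)
running = ranked["pct_of_units"].cumsum()
top_by_share = ranked.loc[running.shift(fill_value=0) < 30, "department"].tolist()
```

With these inputs, `top_by_share` returns Chemistry and Physics, whose cumulative share (33.4 percent as rounded here) is the first to clear the 30 percent threshold.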

10. Among the tests used in this study, the average gap between thresholds is approximately twenty points, but it varies anywhere from twelve to thirty. In robustness tests that vary the bandwidth, I randomly assign students in these overlap regions to one side of the discontinuity, such that someone halfway between 3 and 4 will in some cases be considered “above three” and in other cases “below four.”
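As an illustration of this random assignment, here is a minimal numpy sketch; the raw scores and cutoff values are made up for the example and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical raw scores falling between the cutoffs for a 3 and a 4,
# with illustrative cutoff values.
raw = np.array([61.0, 64.5, 68.0])
cutoff_3, cutoff_4 = 55.0, 75.0

# Each overlap student is used at exactly one discontinuity: a fair coin
# decides whether they count as "above three" or "below four" (note 10).
use_lower = rng.random(raw.shape) < 0.5
assigned_cutoff = np.where(use_lower, cutoff_3, cutoff_4)
```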

11. I follow the standard approach of testing for discontinuities in baseline characteristics by putting covariates on the left-hand side of equation 1. Only 86 percent and 62 percent of students report any value for parental education or income, respectively, so these baseline values are not representative of the full population.
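For the mechanics of such a balance check, below is a hedged sketch in Python. Equation 1 is not reproduced in this section, so the specification shown is a generic local-linear RD with hypothetical variable names (`dist` for the centered running variable, `parent_educ` for the baseline covariate), not the paper’s exact estimating equation.

```python
import pandas as pd
import statsmodels.formula.api as smf

def balance_check(df: pd.DataFrame, bandwidth: float = 6.0):
    """Regress a baseline covariate on an above-threshold indicator plus
    linear trends in the centered running variable, within the bandwidth.
    A jump in the covariate at the threshold would signal imbalance."""
    sample = df.loc[df["dist"].abs() <= bandwidth].copy()
    sample["above"] = (sample["dist"] >= 0).astype(int)
    fit = smf.ols("parent_educ ~ above + dist + above:dist", data=sample).fit(
        cov_type="HC1"
    )
    return fit.params["above"], fit.bse["above"]
```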

12. The McCrary test is shown in online Appendix figure A.1. Regression estimates for the McCrary test using a six-point bandwidth are a statistically insignificant 0.032, with a standard error of 0.023. McCrary tests using longer bandwidths that randomly assign students to being above or below the threshold also produce statistically insignificant results.
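For intuition, the following is a deliberately simplified density check, not the McCrary local-linear density estimator itself: it bins the raw scores and tests for a jump in bin counts at the threshold. All names and parameters are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def crude_density_test(raw_scores, cutoff, bandwidth=6.0):
    """Bin the running variable into one-point bins, then test for a
    discontinuity in bin counts at the cutoff. A significant jump would
    suggest manipulation of scores around the threshold."""
    dist = np.asarray(raw_scores, dtype=float) - cutoff
    dist = dist[np.abs(dist) <= bandwidth]
    bins = np.floor(dist).astype(int)  # one-point bins
    counts = pd.Series(bins).value_counts().sort_index()
    data = pd.DataFrame({"bin": counts.index.astype(float), "n": counts.values})
    data["above"] = (data["bin"] >= 0).astype(int)
    fit = smf.ols("n ~ above + bin + above:bin", data=data).fit()
    return fit.params["above"], fit.bse["above"]
```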

13. In general, it is not recommended to compare the magnitude of the impact estimates across different thresholds, as there are a number of differences in which students are represented, the types of institutions attended, and other factors that could all weigh into the decision to skip or retake a course.

14. Students who earn a 3 on Biology or Chemistry are offered credit for a course that meets general education science graduation requirements but is not part of the major sequence.

15. Another way to see this is through the high retake rates: 47.8 percent of students who scored a 4 and are eligible to skip the first semester of Biology/Chemistry choose to retake this course anyway, and 57.4 percent of these students take the second semester of this course at some future point. Thus there is little difference in retake rates between these two courses, even among those earning a very strong score of 4.

16. I define remedial math courses as those with Pre-Calculus, Trigonometry, or Algebra in their title.

17. If we assume AP increases four-year degree completion by 2 percentage points, which is the upper limit of previously estimated impacts, and that STEM courses increase by 0.38 (coefficients 0.14 and 0.24 from rows 2 and 6 of table 3), then this impact could come entirely from “new” majors (rather than in-depth study among existing majors) only if each induced major took nineteen more STEM courses than they would have otherwise.
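Written out, the arithmetic behind this bound is:

```latex
\[
\underbrace{0.14 + 0.24}_{\text{STEM course effects (table 3)}} = 0.38,
\qquad
\frac{0.38\ \text{additional courses}}{0.02\ \text{additional degrees}}
  = 19\ \text{courses per induced major}.
\]
```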

18. Additional results at the 1/2 threshold, which offers no credit but could induce signaling, are also null and are omitted for brevity.

19. Although there are additional heterogeneity outcomes one might investigate, the results are weakly powered, favoring splits that divide the sample relatively equally. Analyses of ethnic differences are challenging given that relatively few students fall into the categories traditionally underrepresented in AP, such as African American and Hispanic students.

20. Male–female differences after one year also show larger increases in STEM participation for female students than for male students, but these are omitted for brevity.

21. Differences could arise if male students are more likely to score at higher or lower AP levels and these weighted differences drive the differences in effects or control means. Alternate analyses focused only on students at the 2/3 threshold are identical, suggesting that this is not the case.

22. Some colleges offer alternative math or English courses, but most students within a university retake the course associated with AP college credit. Many students who earn a 2 still choose to retake the same course for which they did not earn credit, likely in part because this is easier than passing an entirely new course.

References

Altonji, Joseph G., and Charles R. Pierret. 2001. Employer learning and statistical discrimination. Quarterly Journal of Economics 116(1): 313–350.

Anderson, Michael, and Jeremy Magruder. 2012. Learning from the crowd: Regression discontinuity estimates of the effects of an online review database. Economic Journal 122(563): 957–989.

Angoff, W. H. 1971. Scales, norms, and equivalent scores. In Educational measurement, edited by Robert Ladd Thorndike, pp. 508–600. Washington, DC: American Council on Education.

Arcidiacono, Peter, Esteban M. Aucejo, and Ken Spenner. 2012. What happens after enrollment? An analysis of the time path of racial differences in GPA and major choice. IZA Journal of Labor Economics 1(5): 1–24.

Arteaga, Carolina. 2018. The effect of human capital on earnings: Evidence from a reform at Colombia's top university. Journal of Public Economics 157: 212–225.

Avery, Christopher N., Oded Gurantz, Michael Hurwitz, and Jonathan Smith. 2018. Shifting college majors in response to Advanced Placement exam scores. Journal of Human Resources 53(4): 918–956.

Bettinger, Eric P., and Bridget Terry Long. 2005. Do faculty serve as role models? The impact of instructor gender on female students. American Economic Review 95(2): 152–157.

Bettinger, Eric P., Bridget Terry Long, Philip Oreopoulos, and Lisa Sanbonmatsu. 2012. The role of simplification and information in college decisions: Results from the H&R Block FAFSA experiment. Quarterly Journal of Economics 127(3): 1205–1242.

Buser, Thomas, and Huaiping Yuan. 2019. Do women give up competing more easily? Evidence from the lab and the Dutch Math Olympiad. American Economic Journal: Applied Economics 11(3): 225–252.

Carnevale, Anthony P., Ban Cheah, and Andrew Hanson. 2015. The economic value of college majors. Available https://1gyhoq479ufd3yna29x7ubjn-wpengine.netdna-ssl.com/wp-content/uploads/The-Economic-Value-of-College-Majors-Full-Report-web-FINAL.pdf. Accessed 17 July 2020.

Carnevale, Anthony P., Stephen J. Rose, and Ban Cheah. 2011. The college payoff: Education, occupations, and lifetime earnings. Available https://files.eric.ed.gov/fulltext/ED524299.pdf. Accessed 17 July 2020.

Castleman, Benjamin L., Bridget Terry Long, and Zachary Mabel. 2018. Can financial aid help to address the growing need for STEM education? The effects of need-based grants on the completion of Science, Technology, Engineering, and Math courses and degrees. Journal of Policy Analysis and Management 37(1): 136–166.

Cortes, Kalena E., Joshua S. Goodman, and Takako Nomi. 2015. Intensive math instruction and educational attainment: Long-run impacts of Double-Dose Algebra. Journal of Human Resources 50(1): 108–158.

Dee, Thomas S., and Linda A. Jackson. 1999. Who loses HOPE? Attrition from Georgia's college scholarship program. Southern Economic Journal 66(2): 379–390.

Deming, David J., Justine S. Hastings, Thomas J. Kane, and Douglas O. Staiger. 2014. School choice, school quality, and postsecondary attainment. American Economic Review 104(3): 991–1013.

Denning, Jeffrey T., and Patrick Turley. 2017. Was that SMART? Institutional financial incentives and field of study. Journal of Human Resources 52(1): 152–186.

Dynarski, Susan M., C. J. Libassi, Katherine Michelmore, and Stephanie Owen. 2018. Closing the gap: The effect of a targeted, tuition-free promise on college choices of high-achieving, low-income students. NBER Working Paper No. 25349.

Edmunds, Julie A., Lawrence Bernstein, Fatih Unlu, Elizabeth Glennie, John Willse, Arthur Smith, and Nina Arshavsky. 2012. Expanding the start of the college pipeline: Ninth-grade findings from an experimental study of the impact of the Early College High School Model. Journal of Research on Educational Effectiveness 5(2): 136–159.

Ellison, Glenn, and Ashley Swanson. 2010. The gender gap in secondary school mathematics at high achievement levels: Evidence from the American Mathematics Competitions. Journal of Economic Perspectives 24(2): 109–128.

Evans, Brent J. 2019. How college students use Advanced Placement credit. American Educational Research Journal 56(3): 925–954.

Ewing, Maureen, Sanja Jagesic, and Jeff Wyatt. 2018. Choosing double: The relationship between successful AP exams and college double major. New York: College Board.

Figlio, David N., and Maurice E. Lucas. 2004. What's in a grade? School report cards and the housing market. American Economic Review 94(3): 591–604.

Figlio, David N., Morton O. Schapiro, and Kevin B. Soter. 2013. Are tenure track professors better teachers? Review of Economics and Statistics 97(4): 715–724.

Finkelstein, Amy, and Matt Notowidigdo. 2019. Take-up and targeting: Experimental evidence from SNAP. Quarterly Journal of Economics 134(3): 1505–1556.

Goodman, Joshua S. 2019. The labor of division: Returns to compulsory high school math coursework. Journal of Labor Economics 37(4): 1141–1182.

Gurantz, Oded, Michael Hurwitz, and Jonathan Smith. 2017. College enrollment and completion among nationally recognized high-achieving Hispanic students. Journal of Policy Analysis and Management 36(1): 126–153.

Haxton, Clarisse, Mengli Song, Kristina Zeiser, Andrea Berger, Lori Turk-Bicakci, Michael S. Garet, Joel Knudson, and Gur Hoshen. 2016. Longitudinal findings from the Early College High School initiative impact study. Educational Evaluation and Policy Analysis 38(2): 410–430.

Huang, Jiekun, and Darren J. Kisgen. 2013. Gender and corporate finance: Are male executives overconfident relative to female executives? Journal of Financial Economics 108(3): 822–839.

Jackson, C. Kirabo. 2010. A little now for a lot later: A look at a Texas Advanced Placement incentive program. Journal of Human Resources 45(3): 591–639.

Jackson, C. Kirabo. 2014. Do college-preparatory programs improve long-term outcomes? Economic Inquiry 52(1): 72–99.

Kofoed, Michael S., and Elizabeth McGovney. 2019. The effect of same-gender and same-race role models on occupation choice: Evidence from randomly assigned mentors at West Point. Journal of Human Resources 54(2): 430–467.

Lim, Jaegeum, and Jonathan Meer. 2017. The impact of teacher–student gender matches: Random assignment evidence from South Korea. Journal of Human Resources 52(4): 979–997.

Mansour, Hani, Daniel I. Rees, Bryson M. Rintala, and Nathan N. Wozny. 2018. The effects of professor gender on the post-graduation outcomes of female students. IZA Discussion Paper No. 11820.

Mattern, Krista D., Emily J. Shaw, and Maureen Ewing. 2011. Advanced Placement exam participation: Is AP exam participation and performance related to choice of college major? College Board Research Report No. 2011-6.

Mokyr, Joel, Chris Vickers, and Nicolas L. Ziebarth. 2015. The history of technological anxiety and the future of economic growth: Is this time different? Journal of Economic Perspectives 29(3): 31–50.

Morgan, Stephen L., Dafna Gelbgiser, and Kim A. Weeden. 2013. Feeding the pipeline: Gender, occupational plans, and college major selection. Social Science Research 42(4): 989–1005.

Niederle, Muriel. 2016. Gender. In The handbook of experimental economics, volume 2, edited by John H. Kagel and Alvin E. Roth, pp. 481–562. Princeton, NJ: Princeton University Press.

Niederle, Muriel, and Lise Vesterlund. 2007. Do women shy away from competition? Do men compete too much? Quarterly Journal of Economics 122(3): 1067–1101.

Niederle, Muriel, and Lise Vesterlund. 2010. Explaining the gender gap in math test scores: The role of competition. Journal of Economic Perspectives 24(2): 129–144.

Pope, Denise, Maureen Brown, and Sarah Miles. 2015. Overloaded and underprepared: Strategies for stronger schools and healthy, successful kids. New York: John Wiley & Sons.

Reuben, Ernesto, Matthew Wiswall, and Basit Zafar. 2017. Preferences and biases in educational choices and labour market expectations: Shrinking the black box of gender. Economic Journal 127(604): 2153–2186.

Scott-Clayton, Judith, and Olga Rodriguez. 2015. Development, discouragement, or diversion? New evidence on the effects of college remediation policy. Education Finance and Policy 10(1): 4–45.

Smith, Jonathan, Michael Hurwitz, and Christopher Avery. 2017. Giving college credit where it is due: Advanced Placement exam scores and college outcomes. Journal of Labor Economics 35(1): 67–147.

Smith, Kara, Sanja Jagesic, Jeff Wyatt, and Maureen Ewing. 2018. AP STEM participation and postsecondary STEM outcomes: Focus on underrepresented minority, first-generation, and female students. New York: College Board.

Stinebrickner, Ralph, and Todd R. Stinebrickner. 2014. A major in Science? Initial beliefs and final outcomes for college major and dropout. Review of Economic Studies 81(1): 426–472.

Taylor, Eric. 2014. Spending more of the school day in math class: Evidence from a regression discontinuity in middle school. Journal of Public Economics 117: 162–181.

Webber, Douglas A. 2017. State divestment and tuition at public institutions. Economics of Education Review 60: 1–4.

Wiswall, Matt, and Basit Zafar. 2015. Determinants of college major choice: Identification using an information experiment. Review of Economic Studies 82(2): 791–824.

Wyatt, Jeff, Sanja Jagesic, and Kelly Godfrey. 2018. Postsecondary course performance of AP exam takers in subsequent coursework. New York: College Board.
