I use nationally representative data from the Education Longitudinal Survey (ELS) to update the literature on returns to community college education. I compare the experiences of the ELS cohort that graduated high school in 2004 with those of the National Education Longitudinal Survey (NELS) cohort that graduated high school more than a decade earlier, in 1992. I estimate that community college students from the ELS cohort were more likely to be employed, and that those who were earned about 21 percent more than comparable peers with only a high school education. This estimate is at least as large as that observed for the NELS cohort, though I find some evidence that the value of an associate's degree is smaller for the more recent cohort. I compare these results with those from the burgeoning body of research using state administrative data to answer similar questions.

Over the past several decades, the number and proportion of young Americans going to college have steadily increased. The proportion going to two-year (community) colleges has grown especially fast. In 1990, 20.1 percent of recent high school graduates enrolled in two-year colleges, and 40 percent enrolled in four-year colleges.1 By 2015, the proportion enrolling in two-year colleges increased to 25.2 percent, and rates of enrollment at four-year colleges increased to 44 percent.2 As enrollment has grown, so too has our understanding of the employment and earnings impacts of community college education. Beginning in the 1990s, nationally representative survey data permitted economists to measure the earnings effects of community college, improving on a literature that relied heavily on nonrepresentative, institution-based analyses. Over the past decade, research on the topic has shifted toward the use of rich state administrative data combining postsecondary enrollment and Unemployment Insurance (UI) wage records. Administrative data have many advantages, but for reasons I detail below, research using these data has important limitations and its findings can be difficult to generalize broadly.

In this paper, I revisit the use of nationally representative survey data to update the literature on returns to community college education. This update is useful for comparing the experiences of recent cohorts of high school graduates with those of a previous generation, and for comparing results from survey data with those from administrative records. I aim to give the reader a sense of the range of estimates, but also to illustrate the empirical challenges and limitations inherent in estimating returns to postsecondary education, a problem that by its nature cannot rely on experimental designs.

To update estimates from survey data, I study the experiences of students from the Education Longitudinal Survey (ELS) cohort, collected by the National Center for Education Statistics. Students in the ELS were high school seniors in 2004, and began their postsecondary and labor market careers at the doorstep of the Great Recession. The experiences of this cohort are important in their own right, because they provide insight into the experiences of American workers during and after one of the largest economic downturns in modern history.

I also compare the employment and earnings effects of community college enrollment for the ELS cohort with the cohort of students surveyed in the National Education Longitudinal Survey (NELS). The NELS cohort began their postsecondary education and their working careers in the early to mid-1990s—a very different labor market than the one young people in the ELS cohort entered. These different settings provide the opportunity to assess the merits of current policy proposals encouraging sub-baccalaureate education.

Central to the task of estimating the effects of postsecondary education is the omitted variables problem. Indeed, estimating earnings effects of education is often literally the textbook example of this common empirical problem. Studies that use survey or administrative data necessarily rely on quasi-experimental variation. Without random or as-good-as random variation in access to community college, researchers using survey and administrative data adopt different strategies to limit unmeasured heterogeneity between those with and without postsecondary education. Naturally, survey and administrative data have different strengths and weaknesses. Though these are well known, they are worth restating here: Although administrative records provide data on large samples or even the universe of relevant units, they seldom provide much information on relevant controls or include an obvious control group. And, though survey data typically provide richer sets of potential controls, sample size and power can be limited.

In this context, these characterizations are germane. Researchers using administrative data limit the potential impact of omitted variables by differencing or controlling for pre-enrollment outcomes. The survey data used in the literature cover young persons for whom no meaningful pre-enrollment outcomes are observed. This advantage for administrative data comes at a cost: Results generalize only to the population of students with a work history, which leaves out traditional college students. And, using pre-enrollment earnings requires researchers to rely on a common-trends assumption that is often violated in this context. Further, because these studies rely on records from community colleges, there is no natural control group, so treatment effects are estimated only at the intensive margin. Researchers utilizing survey data attempt to limit the omitted variables problem by saturating regression models. However, most survey data offer limited opportunity to employ other strategies to limit the potential impact of heterogeneity between treatment and control groups.

In this paper, I revisit the use of nationally representative survey data and, in doing so, address and attempt to assess the importance of these limitations. This is a useful update to the literature for at least two reasons. First, the ELS data provide estimates of employment and earnings outcomes of community college for young people studying and starting their careers in the 2000s and 2010s, and will serve as an important comparison to studies of previous cohorts. Understanding whether or how the economic value of community college study has changed for young Americans is vital for evaluating policy proposals that encourage sub-baccalaureate study as a foundation for improving college access and reducing costs. Second, comparing results from survey data to contemporaneous findings from studies using administrative data can help calibrate the findings from studies at the state level.

I estimate that community college students from the ELS cohort were more likely to be employed, and that those who were earned about 21 percent more than their high school–educated peers. Further, students accumulating one to two full-time equivalent (FTE) years’ worth of credits earn more than 30 percent more than their high school–educated peers. This is slightly larger than the earnings difference for students from the NELS cohort, and equivalent to results from research using cohorts graduating high school in the 1970s and 1980s (Kane and Rouse 1995). I view this as evidence that the returns to community college education are likely increasing over time, and at the least are holding steady.

As with research using other survey data in this literature, the capacity to develop convincing causal estimates is limited with the ELS. As a consequence, the main results here should not be viewed as causal estimates. Nonetheless, I show that the estimates obtained from regression adjustment and from inverse probability weighting are quite similar. Further, by comparing unconditional earnings with those conditional on observables, I find large differences between community college– and high school–educated workers under a wide variety of assumptions about the relative degree of selection on unobservables versus observables.

Community colleges have played a key role in access to postsecondary education among both recent high school graduates, and older workers attempting to upgrade their skills.3 More than 43 percent of all students enrolled in public postsecondary education in 2014 were at two-year institutions—up from approximately 27 percent in 1970.4 Community colleges have a mission that includes providing open-access education to adults, as well as lifelong learning and training for nontraditional learners. But a central mission is to provide a low-cost, open-admission opportunity for students to take college coursework, earn sub-baccalaureate degrees, and potentially transfer to four-year colleges.

Estimating the success of community colleges is made complicated by the variety of educational objectives of their students. Some students are full-time, degree-seeking, and right out of high school. Others are mid-career workers seeking specific skills with no intent of earning an associate's degree or pursuing continuing education. A further source of heterogeneity is that degree-seeking students include both those intending to earn an associate's degree as a terminal degree and those who intend to transfer into colleges providing bachelor's degrees. A different group of degree-seeking students enroll in career technical education with the aim of earning vocational certificates and degrees that lead to employment in fields such as information technology and health or protective services.

The task of estimating earnings differences by education level has been central to the study of human capital. Because it often relied on cross-sectional survey data, like the Current Population Surveys, early work on the topic measured education using years of completed schooling reported by respondents, and estimated the earnings effects of college by defining college graduates as those reporting at least four years of education beyond twelfth grade (Levy and Murnane 1992; Murphy and Welch 1992).

Evidence from Panel Data in the 1980s and 1990s

Improvements in this measurement strategy were made possible with the advent of several panel datasets surveying young people in the years following high school. These included the National Longitudinal Survey of Youth, High School and Beyond, and NELS. Early work using panel data and detailed enrollment information found that an additional year of enrollment in postsecondary education increased earnings by about 5 to 8 percent for traditional-age college students (Kane and Rouse 1995) and for older adults returning to school (Gill and Leigh 1997). Kane and Rouse also estimate that women who receive an associate's degree earned about 29 percent more than their comparable peers with only a high school diploma. For men, they estimate the earnings premium for a community college degree at about 8 percent.

Belfield and Bailey (2011) review the literature on the effects of community college on earnings, highlighting a number of studies using the NLS and National Longitudinal Survey of Youth data for this purpose. Despite differences across studies (due in part to definitions, timing of outcome measures, specification differences, or sample inclusion or exclusion restrictions), these studies report earnings premia for associate's degrees over high school diplomas of 10 to 15 percent for men, and 20 to 25 percent for women.

Marcotte et al. (2005) and Marcotte (2010) updated this early work on the earnings effects of community colleges. These studies used data from NELS. This cohort matriculated into college and started working in the 1990s, whereas previous work focused mainly on students graduating high school in the 1970s. Despite the fact that the relative earnings of college-educated workers rose over the period, the authors’ estimates of earnings premia for young workers with community college educations in the 1990s were similar to those in earlier decades: They estimated that full-time enrollment in a community college increases earnings between 5 and 8 percent for each year enrolled, even if no degree was received—and that earning an associate's degree increases earnings by about 15 to 30 percent.

Evidence from State Administrative Data

The recent literature on the earnings effects of community college education has focused heavily on the use of state administrative data that provide the opportunity to link students attending community college to UI wage records. Among the earliest papers of this type is the work of Jacobson, LaLonde, and Sullivan (2005), who examine the impact of community college coursework for workers dislocated from jobs in the early 1990s in Washington state. They estimate that an academic year's worth of community college education increased displaced workers’ earnings by about 9 percent for men, and 13 percent for women. They also found that returns varied substantially, with those taking technical, vocational coursework earning substantially more than those taking nonvocational coursework.

A more recent and burgeoning literature using administrative records has studied the effects of community college course taking on students who are not (necessarily) displaced workers. This includes studies using data from Kentucky (Jepsen, Troske, and Coomes 2014), North Carolina (Liu, Belfield, and Trimble 2015; Xu and Trimble 2016), Virginia (Xu and Trimble 2016), Arkansas (Belfield 2015), and Washington (Dadgar and Trimble 2015). The relative earnings differences estimated in these studies vary a good bit. For example, using data from Kentucky, Jepsen, Troske, and Coomes (2014) report substantial earnings returns for students completing associate's degrees from 2002 to 2004, with an increase in earnings for degree holding women of more than 50 percent, and smaller increases (less than 10 percent) for men. Xu and Trimble (2016) use data from North Carolina and Virginia and find significant but smaller earnings effects for students receiving certificates at community colleges. They report adjusted earnings differences of associate's degree recipients to be about 30 percent higher for women and 18 percent higher for men. In other states, researchers using administrative data report substantively smaller relative earnings gains for associate's degree holders (Belfield 2015; Dadgar and Trimble 2015).

Although research using state administrative data has many advantages for estimating employment and earnings effects of many aspects of education at community colleges,5 the task of estimating causal effects of community college education on employment and earnings remains empirically challenging. The principal challenge is the evaluation problem inherent in establishing the counterfactual in a setting where access to treatment cannot be feasibly assigned at random. A second (and related) problem is identifying the treatment of interest when some students enroll with no intent to earn a diploma. Rather, they might be taking a class or two to learn a skill they perceive important in the labor market. One approach to dealing with the latter problem is to estimate the effects of any enrollment, separate from credits earned or diplomas received.

However, this does not resolve the central problem: The choices of whether to enroll in community college and what to study once enrolled are surely affected by factors that cannot be controlled by the researcher but nonetheless shape anticipated outcomes. Researchers using survey data have attempted to deal with this problem primarily by using the relatively rich sets of control variables those data afford. These include measures of student and family socioeconomic and demographic attributes, as well as measures of academic preparation and ability measured prior to postsecondary enrollment. Such approaches provide causal estimates insofar as selection is on observables.

Researchers using administrative data have access to much more limited sets of observable attributes for students and their families. But, because they have access to quarterly wage records from state UI systems, these researchers typically can use pre-post earnings differences to approximate causal effects. However, relying on within-student earnings differences requires knowledge or assumptions about pre-enrollment earnings trends for treatment and control groups. The well-known (Ashenfelter's) earnings dip prior to enrollment for adults in job training or vocational programs is relevant here, because all administrative data studies focus on adults with a work history prior to community college enrollment. Jacobson, LaLonde, and Sullivan (2005) illustrate several empirical specification issues important for estimating earnings effects of community college courses for these students, including the need to measure earnings after a sufficient job-search period following enrollment.

Although the use of within-student variation in studies using administrative records potentially strengthens internal validity for estimates of employment and earnings effects, this comes at the cost of limiting external validity. These studies rely on data on community college students who have previous work experience in UI-covered jobs. Hence, these results cannot generalize to students with no work history or who are employed in part-time or contract work—as is the case for most teenagers or those right out of high school. Further, studies using administrative data typically do not have a control group of comparable adults with no postsecondary education. Rather, they focus on variation in credits or degrees among persons enrolling in community colleges. So, the treatment-control comparison is between those with degrees or many credits and those dropping out. For most policy debates, the counterfactual of greatest interest is not enrolling in postsecondary education at all.

Finally, studies that rely on state administrative data can only infer employment effects: If a wage record is found in a state for a person observed attending community college, then the person is assumed to be employed, and if no wage record is found, they are assumed to be unemployed. This is often a safe assumption, but certainly not always. Prior research suggests interstate mobility increases with education (Wozniak 2010; Malamud and Wozniak 2012). If those with the most postsecondary education are more likely to be lost to interstate migration, this may result in an underestimate of the impact of community college on employment and earnings.6

Although these external validity problems are less relevant for nationally representative panel survey data, the lack of information on earnings over time is often a limitation in those data. In the section below, I describe the survey data used in this paper, and describe the estimation problems and models used here.

Survey Data

The ELS is a school-clustered random sample of students who were in the tenth grade in 2002. Student respondents were interviewed (along with school administrators, teachers, and parents) in the initial year, and again in 2004, 2006, and 2012. The overwhelming majority of high school graduates in the ELS cohort received their diplomas in 2004.7 I restrict my analysis to the students who graduated high school on time (in 2004) and either enrolled in a community or four-year college by the time of the 2006 interview (i.e., within about two years of graduation), or did not enroll in postsecondary education at all.8

I exclude those who delay enrollment in postsecondary education because there is less information about postsecondary education for those enrolling after 2006. After 2006, the next (and last) follow-up is 2012. So, unlike those who start postsecondary study by 2006, for those who start after 2006 the data on postsecondary study would come from the same follow-up survey year (2012) as the outcome data, and there would be insufficient opportunity to observe postsecondary outcomes for this group.9 Further, this would require a much longer recall period than for those enrolling between the 2004 follow-up during the high school graduation year and the 2006 follow-up two years later. Therefore, the comparison of interest is between those who enroll in community college within two years of completing high school, and those with only a high school degree.10 For those who enrolled in college, I distinguish between those enrolling only in community college and those who enrolled first in a community college and later in a four-year college.
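As a rough illustration of this sample restriction (not the actual estimation code), the sketch below filters a hypothetical ELS extract to the analytic comparison groups; every column name is a placeholder rather than an ELS codebook identifier.

```python
import pandas as pd

# Hypothetical extract of the ELS panel; all column names are illustrative.
els_raw = pd.read_csv("els_extract.csv")

# On-time graduates: received a high school diploma in 2004.
on_time = els_raw["hs_diploma_year"] == 2004

# First postsecondary enrollment was at a community college by the 2006 follow-up.
cc_by_2006 = (els_raw["first_enroll_year"] <= 2006) & (els_raw["first_sector"] == "two_year")

# Never enrolled in any postsecondary education.
never_enrolled = els_raw["first_enroll_year"].isna()

# Analytic comparison: community college matriculants vs. high school-only graduates.
els = els_raw[on_time & (cc_by_2006 | never_enrolled)].copy()
els["cc_enrolled"] = cc_by_2006.loc[els.index].astype(int)
```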

The ELS collects detailed information about students and schools, and provides information on family and community life. This includes information about students’ prior achievement, college plans, and college enrollment decisions. As I describe below, I attempt to limit differences between students who enroll in community college and those who do not by controlling for student attributes, and the educational level and income of their parents. I also control for standardized math and reading scores on tests administered to all students while still in high school. Naturally, students with higher academic ability are more likely to enroll in postsecondary study, so controlling for precollege achievement levels helps isolate the impact of community college on employment and earnings.

I measure employment and earnings outcomes from the 2012 follow-up survey. These are self-reported measures of any employment for pay, and total annual labor earnings in the 2011 calendar year (NCES 2014). The students in this cohort were entering the labor market and/or finishing college at the start of the Great Recession. Indeed, the labor market prospects of young workers during this period were among the worst in a generation. At the time of the ELS follow-up, the unemployment rate for teens exceeded 25 percent, and was above 15 percent for those in their 20s.11

To limit academic and skills differences between community college students and their high school–educated peers that may affect subsequent labor market outcomes, I control for standardized scores from math and reading tests administered to the ELS sample while in the tenth grade. The math tests included questions on algebra, geometry, probability and summary statistics, and select advanced topics (NCES 2014). The reading assessment tested comprehension and other fundamental reading and English language skills. The ELS tests weighted problem solving and applications more heavily than did its predecessor, the NELS (NCES 2014). The scores are based on Item Response Theory, which uses the pattern of responses (correct vs. incorrect) to estimate the probability of correct answers for unanswered questions (weighted by question difficulty). The test scores used here are norm-referenced to the population of eligible tenth graders.

In table 1, I present descriptive statistics for the ELS sample of 3,025 who earned a high school diploma and ended their education there, or who enrolled in a community college within two years of high school graduation.12 Just under two thirds of the sample (65.9 percent) enrolled in community college. The sample is unremarkable on basic demographic characteristics, such as gender and race/ethnicity composition. The mean values of the math and reading assessments are about 48, with standard deviations of about 9. About 56 percent of sample members' parents attended at least some college, with 17 percent earning a bachelor's degree.

Table 1.
Descriptive Statistics of the Education Longitudinal Survey Sample
Variable                                Mean      Standard Dev.
Enrolled in community college 0.659 0.474 
Female 0.505 0.500 
White, non-Hispanic 0.631 0.483 
Black, non-Hispanic 0.121 0.326 
Hispanic 0.165 0.371 
Asian 0.031 0.173 
Math score (10th grade) 47.89 8.553 
Reading score (10th grade) 47.97 8.928 
Native English speaker (0/1) 0.84 0.367 
Parent highest education level   
Some college 0.393 0.489 
College graduate 0.170 0.376 
N = 3,025   

In addition to estimating the earnings premium associated with sub-baccalaureate education for Millennials, another goal of this paper is to understand whether this premium has changed over time. To assess this, I construct an analogous sample of NELS graduates from the high school class of 1992. I define control and independent variables to be directly comparable to the ELS. So, for the NELS cohort I include students matriculating at a community college between 1992 and 1994 as their first postsecondary enrollment. Employment outcomes were measured during the 2000 NELS follow-up survey. In both cases, employment outcomes were measured eight years after high school graduation, and at least six years after the onset of any postsecondary education. Because the main and new analyses here are based on the ELS cohort, I direct the interested reader to Marcotte (2010) for a detailed discussion of the NELS.

Empirical Methods

Regardless of whether data come from administrative records or surveys, the absence of as-good-as random assignment to college attendance means the potential for an omitted variables problem is an inherent complication for researchers in this area. To address this problem, using the ELS survey data, I start with models of the following type:
$$y_i = \alpha + \beta\, CC_i + X_i\gamma + S_i\delta + \varepsilon_i \qquad (1)$$
where yi is either a 0/1 indicator of whether individual i is employed or a measure of the log of individual i’s annual earnings. The treatment variable(s) of interest in the initial models is CCi, which measures whether individual i ever enrolled for credit in a community college. To limit heterogeneity between those who enroll in community college and those who receive no education beyond high school, in all models I include measures of family and demographic attributes (Xi) and previous schooling aptitude (Si), as described above. εi is a random disturbance term assumed to be normally distributed.

In this model, the coefficient β measures the average effect of enrollment in community college, conditional on observed family and student attributes. Specifically, it is a weighted average of the relative outcome differences for enrollees with more or fewer earned credits and with or without degrees. In separate models, I distinguish between students who complete different levels of credits or who earn degrees.
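A minimal sketch of how equation (1) might be estimated with off-the-shelf tools is shown below. The analytic file and variable names are placeholders, not the actual ELS extract, and the exact control set is an assumption based on the description above.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file with illustrative column names.
els = pd.read_csv("els_analytic_sample.csv")

# Equation (1) for the earnings outcome, estimated on the employed subsample:
# log earnings on community college enrollment plus demographic, family,
# and prior-achievement controls, with a quadratic in potential experience.
earnings_ols = smf.ols(
    "log_earnings ~ cc_enrolled + female + C(race) + C(parent_educ) + family_income"
    " + math_score + read_score + exper + I(exper**2)",
    data=els[els["employed"] == 1],
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

# The same specification with the 0/1 employment indicator as the outcome
# gives the linear probability model for employment.
employment_lpm = smf.ols(
    "employed ~ cc_enrolled + female + C(race) + C(parent_educ) + family_income"
    " + math_score + read_score + exper + I(exper**2)",
    data=els,
).fit(cov_type="HC1")

print(earnings_ols.params["cc_enrolled"], employment_lpm.params["cc_enrolled"])
```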

To attempt to further limit the potential impact of unobserved differences between community college students and high school graduates, I first make use of the clustered design of the ELS and estimate models of the following type:
$$y_{is} = \alpha + \beta\, CC_{is} + X_{is}\gamma + S_{is}\delta + \mu_s + \varepsilon_{is} \qquad (2)$$
where μs is a fixed effect for high school s. The employment and earnings outcomes and the residual εis now each have idiosyncratic and high school–specific components. With a high school fixed effect included, outcome differences are estimated by comparing students who enrolled in community college with peers from the same high school who did not, conditional on observed student and family attributes. Note that model 2 can only be estimated for individuals who graduated from ELS high schools where at least one sample member enrolled in community college and at least one sample member obtained no education beyond the high school diploma. In total, the ELS survey collected data on students in 750 high schools. Six hundred twenty-eight high schools contribute to the estimation of the fixed effect models of employment and 615 contribute to the earnings models.13

The high school fixed effects models limit threats to validity arising from the possibility that high schools vary in their academic culture and quality, or are in different labor markets, both of which can affect the likelihood of postsecondary study as well as employment prospects. Of course, models that include high school fixed effects still assume that within-school factors affecting postsecondary enrollment decisions are captured by observable variables. This is a weaker assumption than that required by the previous models, which also assume that observables adequately control for local economic and social factors shaping postsecondary enrollment decisions and employment prospects.
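One way to implement equation (2) is to absorb the school effects with indicator variables, as in the sketch below; the file and names remain illustrative, and clustering the standard errors by high school is one reasonable choice rather than necessarily the paper's.

```python
import pandas as pd
import statsmodels.formula.api as smf

els = pd.read_csv("els_analytic_sample.csv")  # hypothetical file, illustrative names
emp = els[els["employed"] == 1]

# Equation (2): a fixed effect for each high school via C(hs_id), so the enrollment
# coefficient is identified from within-school comparisons of community college
# enrollees and classmates with no postsecondary education.
fe_model = smf.ols(
    "log_earnings ~ cc_enrolled + female + C(race) + C(parent_educ) + family_income"
    " + math_score + read_score + exper + I(exper**2) + C(hs_id)",
    data=emp,
).fit(cov_type="cluster", cov_kwds={"groups": emp["hs_id"]})  # cluster by high school

print(fe_model.params["cc_enrolled"])
```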

In addition to utilizing high school fixed effects to limit threats to validity, I estimate outcome differences between community college–educated (treatment) and high school–educated (control) members of the ELS sample using nonparametric matching estimates. Matching estimators can be an improvement over regression analysis because they reduce model dependence (King et al. 2011). I estimate employment/earnings differences by estimating treatment propensity as a function of observable individual and family attributes, as well as high school attended. I then use the propensity scores as inverse probability weights (IPWs), where the weight for individual i is:
$$w_i = \frac{CC_i}{\hat{p}_i} + \frac{1 - CC_i}{1 - \hat{p}_i} \qquad (3)$$
where p̂i is the propensity that individual i received treatment.14 This strategy weights both the treatment and control groups up to the full sample, just as probability sampling weights are used to generate population estimates for disproportionately sampled subgroups in surveys (Stuart 2010). As with all matching estimators, IPW requires trimming samples to enforce common support. I estimate propensity scores with replacement, and drop all observations not in the area of common support. Though IPW rests on the same assumptions about unconfoundedness as the fixed effects regression models above, it requires no assumptions about functional form on treatment effects. The IPW estimates will provide a point of comparison to the regression-based estimates of average treatment effects of community college education.
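A sketch of this IPW estimator is below, under the assumption that the propensity is estimated by logit on the observables described above (the paper does not specify the propensity model's functional form, and conditioning on high school attended could be added via the commented term); names and the common-support rule are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

els = pd.read_csv("els_analytic_sample.csv")  # hypothetical file, illustrative names
emp = els[els["employed"] == 1].copy()

# Estimate the propensity of community college enrollment; "+ C(hs_id)" would
# additionally condition on high school attended.
pscore_fit = smf.logit(
    "cc_enrolled ~ female + C(race) + C(parent_educ) + family_income"
    " + math_score + read_score",
    data=emp,
).fit(disp=0)
emp["pscore"] = pscore_fit.predict(emp)

# Enforce common support (min-max rule) before weighting.
lo = emp.loc[emp["cc_enrolled"] == 1, "pscore"].min()
hi = emp.loc[emp["cc_enrolled"] == 0, "pscore"].max()
emp = emp[emp["pscore"].between(lo, hi)]

# Equation (3): inverse probability weights that scale both groups to the full sample.
emp["ipw"] = np.where(emp["cc_enrolled"] == 1, 1 / emp["pscore"], 1 / (1 - emp["pscore"]))

# Weighted contrast in mean log earnings between the two groups.
ipw_fit = smf.wls("log_earnings ~ cc_enrolled", data=emp, weights=emp["ipw"]).fit(cov_type="HC1")
print(ipw_fit.params["cc_enrolled"])
```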

I also estimate models of the relationship between degree and credit completion at community colleges and employment and earnings. To assess the relationship between degrees and employment outcomes, I include mutually exclusive dummy variables of the highest academic degree earned by students matriculating at a community college. These are either an associate's (AA) degree, or a bachelor's (BA) degree earned after transferring to a four-year college (even if an AA degree was earned). Of the ELS sample who started college at a community college, 13.9 percent earned an AA as their highest degree and 23.1 percent earned a BA. These are generally comparable to estimates from the National Student Clearinghouse data that about 39 percent of first-time college students enrolling in community college in Fall 2010 earned either an AA or BA after six years (Shapiro et al. 2016).

To study the impact of credit hours, I differentiate between community college students earning various multiples of 15 credit hours (a full load for one semester). Approximately 20 percent of those starting at a community college had earned fewer than 15 credit hours after eight years (figure 1). The distribution of completed credits is bimodal: each of the intermediate credit bins (between 15 and 60 credits) contains a smaller share of students than the group earning fewer than 15 credit hours, while the largest group earns at least 60 credits (two full years). Typically, 60 credits are required for an AA. Students transferring to a four-year college seeking a BA are often required to complete 120 credits.
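One way to construct these credit-hour bins is sketched below, assuming a hypothetical cc_credits field recording cumulative completed credits; the names are placeholders for illustration only.

```python
import numpy as np
import pandas as pd

els = pd.read_csv("els_analytic_sample.csv")  # hypothetical file, illustrative names

# Bin cumulative completed credits into multiples of 15 (one full-time semester).
bins = [0, 15, 30, 45, 60, np.inf]
labels = ["lt15", "c15to30", "c30to45", "c45to60", "ge60"]
els["credit_bin"] = pd.cut(els["cc_credits"], bins=bins, labels=labels, right=False)

# Keep the bins only for community college matriculants; high school-only
# graduates remain the omitted comparison group.
els.loc[els["cc_enrolled"] == 0, "credit_bin"] = np.nan

# Indicator variables for each bin, entered alongside the degree dummies in the models.
credit_dummies = pd.get_dummies(els["credit_bin"], prefix="credits")
```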

Figure 1. Distribution of Completed Postsecondary Credits of Community College Matriculants

It is important to point out that the completion of various milestones or credits may be associated with underlying differences in student ability or intent. Or, they may pick up essential heterogeneity in the impact of community college on student employment outcomes, since students learn about the value (and costs) of continued enrollment during their studies, and this may shape both decisions about persistence as well as subsequent outcomes.

To estimate the relationship between community college education, and employment and earnings, I estimate a series of models in which the dependent variables are either indicators of being employed, or the log of annual labor earnings (conditional on employment) at the time of the 2012 follow-up, when the modal age of respondents was 26 years. I first control only for student demographic characteristics and parents’ income and education. I then add in scores on math and reading achievement tests administered when respondents were in tenth grade, to control for differences in ability that might be correlated both with the likelihood of postsecondary study and labor market outcomes. I control for potential labor market experience, measured as a quadratic in months since last enrollment in school.

To assess whether the earnings and employment outcomes for community college–educated workers have changed, I develop a comparison sample from the NELS, and define outcome, treatment, and control measures identically, and estimate the same models, described above.15 In the case of the NELS, outcomes were measured in 2000.

As a first step in understanding earnings and employment differences between community college–educated workers and their high school–educated peers, in figure 2 I present characteristics of ELS sample members who attended community college compared with those whose education ended with a high school diploma. Each panel presents differences between community college– and high school–educated ELS sample members. Panel (a) shows differences between the two groups while still in high school (in 2002). Those who would go on to community college were twice as likely to be from high-income families (with annual incomes above $75,000) as those who would get no education beyond high school. They were also more likely to have parents who graduated from college, and they scored higher on standardized tests.16 All differences are statistically significant at the 5 percent level.

Figure 2. Comparison of High School– and Community College–Educated Education Longitudinal Survey Sample: In (a) 2002 (Tenth Grade) and (b) 2012

Note: All differences by education level are significant at the 5% level.

In panel (b), I present employment differences between the two groups in 2012, when they were typically 26 years old. Respondents with postsecondary education had better employment and earnings outcomes than their high school–educated peers. Among those with postsecondary education, 82 percent were employed at age 26 years, compared with 76 percent of high school graduates. Further, the average earnings of those with at least some college was $24,200, compared with $20,700 for their high school–educated peers.

To further assess the employment outcomes of community college students compared with their high school counterparts, table 2 presents results of the estimation of models 1 and 2 and the IPW matching estimator. In each case, the coefficient of interest provides an estimate of differences in the expected value of employment outcomes between those with any enrollment in community college and those with no education beyond high school, conditional on observed individual and family characteristics. The results in model 2 and the IPW estimator also condition/match on high school attended.

Table 2.
Community College Enrollment and Subsequent Employment and Earnings: Education Longitudinal Survey, High School Class of 2004
                           Employment                          ln(Earnings)
Independent variable       (1)        (2)        (3)           (4)        (5)        (6)
Enrolled in community college 0.052 0.026 0.094 0.284** 0.199* 0.197** 
 (0.032) (0.036) (0.054) (0.091) (0.097) (0.067) 
High school fixed effects No Yes Yes No Yes Yes 
Estimator OLS OLS IPW OLS OLS IPW 
N 3,025 3,025 2,272 2,669 2,669 1,903 
Number of high schools – 628 225 – 615 243 
R2 0.065 0.301  0.069 0.278  

Notes: All models control for respondent race, gender, potential labor market experience (quadratic), parental education, family income when in 10th grade, and performance on standardized reading and math assessments in high school. See text for details. Standard errors in parentheses. OLS = ordinary least squares; IPW = inverse probability weight.

*p < 0.05; **p < 0.01.

The left side of the table presents models where the dependent variable is employment at the time of the last follow-up survey. The first two columns are parametric estimates from linear probability models.17 The third column shows the IPW matching estimate. Regardless of the model or estimate, I find no significant difference in employment likelihood between community college– and high school–educated workers.

The right side of table 2 presents results from the models where the dependent variable is earnings conditional on employment. In column 4, I present the results from model 1. The results suggest that on average, by their late 20s, Millennial workers who enrolled in community college earned approximately 32.8 percent more annually than their high school–educated peers (p < 0.01), conditioning on observed demographic, family, and academic background.18 In column 5, I present results from model 2, which includes high school fixed effects. The conditional earnings difference between community college– and high school–educated young workers falls to 22 percent. Notably, the earnings difference between observationally identical high school– and community college–educated workers falls by about a third when we compare students who attend the same high school. This suggests that some of the differences observed in column 4 are due to differences in earnings that would have been expected anyway, because college students on average attended better high schools or lived in areas with better labor markets.
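For reference, the percent differences quoted here follow the usual conversion of a log-points coefficient b into a proportional difference, exp(b) − 1; for the estimates in columns 4 and 5:

$$\exp(0.284) - 1 \approx 0.328, \qquad \exp(0.199) - 1 \approx 0.220.$$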

Because of the importance of high school fixed effects, the IPW estimator matches on observable characteristics between workers with and without postsecondary education who attended the same high school. Of course, this is the same variation that underlies the parametric fixed effect model in column 5. The IPW estimate is nearly identical to the ordinary least squares estimate, suggesting the average community college–educated worker earns about 21.7 percent more than observationally comparable workers who attended the same high school but obtained no additional education.

The models in table 2 do not distinguish between community college students who complete just a few credits and those who complete more than 60 or who earn a degree. In table 3, I present results from models that build on the fixed effects estimates above (columns 2 and 5) to estimate employment and earnings differences relative to high school–educated workers, by community college persistence. Because there are now various levels of treatment, matching on all these dimensions is difficult. So, I estimate the coefficients here parametrically. The fact that the ordinary least squares and IPW estimates in table 2 are nearly identical provides some reassurance that within-school estimates of magnitudes are not driven by the choice of estimator. The columns on the left present results of models for which the dependent variable measures whether a respondent was employed in 2012. In column 1, community college students who earn degrees are distinguished from those who do not. In column 2, I make distinctions by completed credits for all community college students, in addition to maintaining the dummy variables for highest degree received. Note that the credit level and degree receipt indicators are not mutually exclusive.

Table 3.
Community College Enrollment and Subsequent Employment Outcomes
ELS: HS Class of 2004
                           Employment            ln(Earnings)
Independent Variable       (1)        (2)        (3)        (4)
Intensity of enrollment     
Any enrollment 0.036  0.225*  
 (0.036)  (0.098)  
<15 Credits  0.04  0.126 
  (0.035)  (0.098) 
15 to 30  0.062  0.340** 
  (0.041)  (0.109) 
30 to 45  0.041  0.176 
  (0.039)  (0.117) 
45 to 60  0.009  0.285** 
  (0.047)  (0.108) 
More than 60  0.038  0.271** 
  (0.036)  (0.086) 
Highest earned degree     
Associate's degree 0.080** 0.082* 0.127 0.101 
 (0.030) (0.087) (0.087) (0.087) 
Bachelor's degree (0/1) 0.097*** 0.102*** 0.197** 0.181** 
 (0.027) (0.028) (0.070) (0.069) 
High school fixed effects Yes Yes Yes Yes 
Joint significance of earned degrees (p-value of F-statistic) <0.001 <0.001 0.015 0.032 
Joint significance of earned credits (p-value of F-statistic)  0.544  0.011 
N 3,025 3,025 2,669 2,669 
R2 0.306 0.307 0.347 0.35 

Notes: Standard errors in parentheses. All models control for respondent race, gender, potential labor market experience (quadratic), parental education, family income when in tenth grade, and performance on standardized reading and math assessments in high school. See text for details.

*p < 0.05; **p < 0.01; ***p < 0.001.

In the results presented in columns 1 and 2, I estimate that the likelihood that a community college student is employed in 2012 is not significantly higher than the likelihood for a comparable high school graduate, unless the student earned an AA or BA degree. Regardless of the number of credits earned, I estimate that those who earn an associate's degree are about 8 percentage points more likely to be employed and those who earn a BA degree are about 10 percentage points more likely to be employed. The mean employment rate for high school graduates was 76.1 percent. These estimates imply an employment rate of 83 and 86 percent for AA and BA recipients, respectively.

In columns 3 and 4, I present the results of similar models where the dependent variable is the log of earnings, conditional on employment. Interestingly, AA degree receipt is not a key determinant of earnings advantages. In column 3, I estimate that regardless of accumulated credits, community college attendees earn about 25 percent more than their high school–educated peers. There is no statistically significant additional earnings difference conveyed to those who complete an AA degree. Those who earn a BA degree earn another 21.7 percent premium. So, the results in column 3 imply that those who started in community college and earned a BA degree had earnings in their late 20s that were more than 45 percent higher than comparable members of the class of 2004 who attended the same high school but did not go to college. The results in column 4 suggest that there is no sheepskin effect for an AA degree, but they do suggest that specific milestones are important. Namely, community college matriculants who earn fewer than 15 credits do not earn significantly more than high school graduates. However, those who complete up to one or two years’ worth of credits earn significantly more, even without a degree.

By Gender

Next, I consider the question of whether the relationship between community college enrollment and subsequent employment and earnings differs by gender. In table 4, I present estimates of the fully specified model of community college enrollment, credits, and degree receipt on employment and then earnings, separately by gender.19 In the first two columns, I find that the relationship between community college enrollment and employment is stronger for women than men. I find that women receiving an AA degree are 15.2 percentage points more likely to be employed than comparable women with a high school degree, and women continuing on to earn a BA degree are 11.5 percentage points more likely to be employed. I find weaker patterns—and no relationship between community college degree receipt and employment—for men. For both men and women, I find no significant relationship between community college education that does not lead to a degree and employment probability.

Table 4.
Community College Enrollment and Employment Outcomes, by Gender
                              Employment            ln(Earnings)
                              (1)        (2)        (3)        (4)
                              Men        Women      Men        Women
Intensity of enrollment
<15 credits −0.002 0.067 0.109 0.209 
 (0.047) (0.068) (0.155) (0.183) 
15 to 30 credits 0.072 0.007 0.503** 0.191 
 (0.047) (0.076) (0.164) (0.185) 
30 to 45 credits 0.013 −0.049 0.455** −0.001 
 (0.047) (0.078) (0.166) (0.236) 
45 to 60 credits −0.016 0.006 0.458** 0.148 
 (0.072) (0.085) (0.161) (0.211) 
More than 60 credits 0.025 0.006 0.462*** 0.159 
 (0.048) (0.067) (0.136) (0.157) 
Highest earned degree     
Associate's received 0.049 0.152* 0.157 0.098 
 (0.040) (0.067) (0.136) (0.144) 
Bachelor's received 0.092* 0.115* 0.093 0.132 
 (0.045) (0.048) (0.123) (0.118) 
High school fixed effects Yes Yes Yes Yes 
Joint significance of earned credits (p-value of F-statistic) 0.500 0.948 0.004 0.806 
Joint significance of earned degrees (p-value of F-statistic) 0.125 0.022 0.474 0.504 
N 1,496 1,529 1,385 1,284 
R2 0.459 0.442 0.498 0.514 

Notes: Standard errors in parentheses. All models control for respondent race, gender, potential labor market experience (quadratic), parental education, family income when in tenth grade, and performance on standardized reading and math assessments in high school. See text for details.

*p < 0.05; **p < 0.01; ***p < 0.001.

In columns 3 and 4, I present gender-specific estimates of the relationship between community college enrollment and earnings. The results here differ from the patterns seen for employment in two ways: a relationship is apparent only for accumulated credits and not for degrees, and it is large and significant only for men. Specifically, I find that men completing more than 15 credits earn over 50 percent more than comparable high school–educated peers. The point estimates for women are smaller and insignificant. One potential explanation for these differences is that men often take vocational and occupation-specific coursework at community colleges. Indeed, community colleges can have specific credentialing arrangements with employers in their areas. Taken together with the results in columns 1 and 2, the results in table 4 imply that community college education increases employment likelihood for women, and earnings for men.

That the earnings advantage of a community college education is at least as large for men as for women is an interesting and different outcome for the ELS cohort. For previous cohorts, the earnings differences for those with sub-baccalaureate education were smaller for men than for women (Marcotte 2010; Xu and Trimble 2016). The results here are consistent with findings in Jepsen, Troske, and Coomes (2014) of earnings benefits for men. If sub-baccalaureate education is beginning to have relatively large effects on men's earnings, this may slow the growing gap in educational outcomes between men and women.

Comparison to NELS Cohort

I next consider whether the employment and earnings effects of community college education are different for Millennials, compared with the earlier NELS cohort. Recall that the NELS cohort finished schooling approximately twelve years before the ELS cohort, and their employment outcomes were measured in 2000. In table 5, I present the results of conditional employment and earnings differences between community college–educated and high school–educated workers for both the NELS and ELS cohorts. In panel A, I measure community college as any enrollment (as in table 2). In panel B, I distinguish between different levels of completed credits and earned degrees (as in table 3). In panel A, I find no significant difference for either cohort in the likelihood of employment between young persons who attended community college and those whose education ended at high school graduation. For both cohorts, community college–educated workers earned more than comparable high school–educated workers. And, this difference increased between the NELS and ELS cohort. For those from the NELS cohort, community college–educated workers earned about 14.2 percent more than their high school–educated peers. Twelve years later, when the ELS cohort was of comparable age, community college–educated workers earned approximately 22 percent more than similar high school–educated workers.

Table 5.
Community College and Employment Outcomes: National Education Longitudinal Survey (NELS) and Education Longitudinal Survey (ELS)
Panel A: Any Community College Enrollment
Outcome:                   Employment            ln(Earnings)
Cohort:                    NELS       ELS        NELS       ELS
Independent Variable       (1)        (2)        (3)        (4)
Enrolled in community college 0.05 0.026 0.133* 0.199* 
 (0.026) (0.036) (0.065) (0.097) 
High school fixed effects Yes Yes Yes Yes 
N 1,801 3,025 1,677 2,669 
R2 0.483 0.301 0.59 0.278 
Panel B: Community College Credits and Degrees 
Outcome: Employment ln(Earnings) 
Cohort: NELS ELS NELS ELS 
Independent Variable (1) (2) (3) (4) 
Intensity of enrollment     
<15 credits 0.042 0.04 0.13 0.126 
 (0.029) (0.035) (0.084) (0.098) 
15 to 30 credits 0.021 0.062 0.03 0.340** 
 (0.033) (0.041) (0.081) (0.109) 
30 to 45 credits 0.043 0.041 0.013 0.176 
 (0.029) (0.039) (0.088) (0.117) 
45 to 60 credits 0.025 0.009 0.091 0.285** 
 (0.034) (0.047) (0.098) (0.108) 
More than 60 credits 0.067* 0.038 0.121 0.271** 
 (0.03) (0.036) (0.082) (0.086) 
Highest earned degree     
Associate's degree 0.001 0.082* 0.064 0.101 
 (0.021) (0.087) (0.07) (0.087) 
Bachelor's degree (0/1) 0.027 0.102*** 0.326*** 0.181** 
 (0.021) (0.028) (0.084) (0.069) 
Joint significance of earned credits (p-value of F-statistic) 0.310 0.544 0.489 0.011 
Joint significance of earned degrees (p-value of F-statistic) 0.237 <0.001 <0.001 0.032 
N 1,801 3,025 1,677 2,669 
R2 0.491 0.307 0.612 0.35 

Notes: Standard errors in parentheses. All models control for respondent race, gender, potential labor market experience (quadratic), parental education, family income when in tenth grade, and performance on standardized reading and math assessments in high school. See text for details.

*p < 0.05; **p < 0.01; ***p < 0.001.

In panel B I present estimates that go beyond these aggregates. It appears that the average differences in employment likelihood in panel A overlook higher employment levels for those with more intensive levels of study at community colleges. For the NELS cohort, workers who completed at least 60 credits were 6.7 percentage points more likely to be employed than their high school–educated peers. Whereas the earlier cohort saw no additional advantage for earning a degree, in the ELS cohort those earning AA and BA degrees were 8 to 10 percentage points more likely to be employed. The slightly larger employment differences might reflect the continued decline in the economic prospects of high school–educated workers over the period. More than 90 percent of the NELS cohort was employed, suggesting that in the booming economy of the late 1990s, postsecondary education was less necessary as a ticket to employment.

I find similar patterns for earnings differences in columns 3 and 4. Among the NELS cohort, only workers who continued on to earn a BA degree earned significantly more than high school–educated workers. By the time the ELS cohort was in their late 20s in 2012, the earnings differences between community college–educated and high school–educated workers had grown. For example, even for workers without degrees, those who had completed at least one or two years of community college earned over 30 percent more than high school–educated workers. Additionally, the total earnings difference for those who went on to earn a BA degree also increased. Overall, community college education appears to be more important for young people in the more recent cohort in providing access to employment. Then, conditional on employment, the earnings gap between those with a community college and a high school education grew over the period.

Robustness

The estimates presented come from regression models that exploit the main strength of survey data in this context: the ability to saturate models with rich sets of covariates. Of course, any research design that does not permit random assignment to treatment faces the inherent risk of bias in the parameter(s) of interest due to possible correlation between treatment and individual attributes that themselves affect outcomes. In the face of potential omitted variables bias, the relationship between estimated and true treatment effects depends on the relationship between treatment, omitted variables, and the control variables utilized to limit differences between treated and control observations. Altonji, Elder, and Taber (2005) suggest that researchers can evaluate the robustness of estimates by examining the relationship between treatment, observed variables, and variation in the outcome. In particular, they illustrate that under strict assumptions, the potential role of unobservables in influencing estimates of relationships between treatment and outcomes can be bounded using relationships between observed controls and outcomes. This formalizes the common practice of researchers of comparing treatment effects obtained from parsimonious versus fully specified models as an informal test of the potential role of unobservables. The utility of this exercise depends on the variance of the observed versus unobserved factors, since including low-variance controls provides less explanatory power than would including a high-variance control (Oster 2017).

Altonji, Elder, and Taber (2005) and Oster (2017) illustrate that if a researcher can be clear about the potential ratio of correlations between treatment and observables and treatment and unobservables, and about the maximum amount of variation in the outcome that could be explained by a full model, one can bound treatment effects using model estimates. The maximum amount of variation (Rmax) is the R² from a hypothetical fully specified model. Necessarily, Rmax is bounded between 0 and 1. An Rmax of 1 is implausible in the context of a wage equation, and not to be expected in any model with measurement or idiosyncratic errors. An alternative suggested by Oster (2017) is to set Rmax = min{Π·R̃, 1}, where Π scales R̃, the R² from the model estimated with all relevant observable controls. Using papers published in top-tier economics journals, Oster finds that 40 percent of results would not survive Π = 1.25.

To explore the potential importance of selection on unobservables in the current setting, I reexamine the estimate from the fully specified fixed effects model of earnings differences between workers with any community college education and those with high school educations (table 2, column 5). I set Π = 2 for estimating earnings differences due to community college enrollment, implying that if unobservable characteristics were included in the model, they would add as much explanatory power as the included observed characteristics. In figure 3, I show how the coefficient of interest is affected by different assumptions about relative selection into community college on unobserved versus observed attributes. The horizontal line is the coefficient estimate from table 2 (0.199). The x-axis is the ratio of selection, so that 0.33 would imply that the correlation between community college education and unobservables is only one third as large as the correlation with the observables included in the model. The implied treatment effect declines as selection on unobservables is assumed to become more important. Altonji, Elder, and Taber (2005) and Oster (2017) recommend equal selection as an appropriate upper bound on this ratio.20 Within this bound, I estimate conditional earnings differences between community college–educated and high school–educated workers in excess of 0.16. Estimates of these earnings differences would go to zero only if relative selection on unobservables became quite large.

Figure 3. Earnings Differences Between Community College and High School Educated Workers by Selection on Unobservables vs. Observables
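To make the bounding exercise concrete, the sketch below implements the simplified bias-adjusted coefficient from Oster (2017), beta* = beta_tilde - delta*(beta_dot - beta_tilde)*(Rmax - R_tilde)/(R_tilde - R_dot), where the "dot" quantities come from a regression with no controls and the "tilde" quantities from the fully controlled model. Only the controlled coefficient (0.199) and the choice Π = 2 come from the text; the uncontrolled coefficient and the R² values are hypothetical placeholders, and the full estimator behind figure 3 (e.g., Stata's psacalc, see note 20) would be used in practice.

# A minimal sketch of the Oster (2017) bounding exercise under proportional
# selection. This is only the simplified approximation, not the full
# estimator in Stata's psacalc, so its output is illustrative.

def oster_adjusted_beta(beta_dot, r_dot, beta_tilde, r_tilde, rmax, delta=1.0):
    """Bias-adjusted treatment effect.

    beta_dot, r_dot     : coefficient and R^2 from the model with no controls
    beta_tilde, r_tilde : coefficient and R^2 from the fully controlled model
    rmax                : R^2 of the hypothetical fully specified model
    delta               : ratio of selection on unobservables to observables
    """
    return beta_tilde - delta * (beta_dot - beta_tilde) * (rmax - r_tilde) / (r_tilde - r_dot)

# Only beta_tilde = 0.199 and Pi = 2 come from the text; the uncontrolled
# coefficient and both R^2 values below are hypothetical placeholders.
beta_dot, r_dot = 0.30, 0.05
beta_tilde, r_tilde = 0.199, 0.20
rmax = min(2 * r_tilde, 1.0)  # Rmax = min{Pi * R_tilde, 1} with Pi = 2

for delta in (0.33, 0.5, 1.0):  # selection ratios along the x-axis of figure 3
    print(delta, round(oster_adjusted_beta(beta_dot, r_dot, beta_tilde, r_tilde, rmax, delta), 3))

The adjusted coefficients produced by this sketch depend entirely on the placeholder inputs; with the actual short- and full-model estimates, the exercise traces out the curve shown in figure 3.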

The early labor market experiences of young Millennials provide useful evidence for assessing the prudence of recent political and policy proposals to encourage more young people to enroll in sub-baccalaureate education. The push for sub-baccalaureate education is motivated in part by the cost of college, and by interest in the labor market value of vocational training in Germany and elsewhere. Policy discussions in the United States encouraging postsecondary study at community colleges have rightfully drawn on the experiences of past students, who saw clear employment and earnings benefits from a community college education. Although the existing literature establishes that workers who enrolled in community colleges in the early 1990s fared well in their early careers relative to their high school–educated counterparts, we know less about those educated more recently. This paper updates the literature by studying a cohort that was in college in the mid 2000s.

I find that young people who entered the labor market in the late 2000s after study at a community college have fared at least as well as those who entered the labor market in the early 1990s. I estimate that community college students from the ELS cohort were more likely to be employed, and that those who were employed earned about 21 percent more than comparable peers with only a high school education. Those earning an AA degree earned approximately 40 percent more. This is slightly larger than the earnings difference for students from the NELS cohort, and equivalent to results from research using cohorts graduating high school in the 1970s and 1980s (Kane and Rouse 1995).

It is essential to remember that the current results pertain to a subset of community college students: those matriculating into a community college right out of high school. Studies using administrative data focus on a different population, namely adults with a work history. It is also important to recognize that the estimates here are based on quasi-experimental data, so interpreting the results as causal comes with the hazards attendant to nonexperimental settings. Although the models estimated here control for student and family attributes, cognitive ability measured in high school, and high school fixed effects, those enrolling in community college are likely to differ from their high school–educated peers in unobserved ways. This limitation of quasi-experimental data is difficult to overcome in the study of postsecondary education. In this context, I show that the estimated earnings advantage of those who attended community college is robust to specification and estimation methods. Further, I find that under reasonable assumptions about selection on unobservable relative to observable characteristics, the lower-bound estimate of the effect of community college enrollment on earnings exceeds 15 percent. This range is an estimate of average effects and is therefore a weighted average of the outcomes of those with different levels of completed credits, and with or without degrees. I find evidence that more intensive enrollment and degree completion are indeed important predictors of later earnings. This finding highlights the importance of continued focus on persistence in postsecondary education.

Because of the empirical challenges inherent in estimating the employment effects of postsecondary education, it is useful to benchmark estimates against those found in different settings and for different groups. In table 6, I summarize the main findings from selected studies of the earnings impacts of community college education, using both survey and administrative data. The survey-based studies are chosen because they use different data sets and are commonly cited. The first study using administrative data is included because it is seminal in the field, and the remaining four are chosen for their quality and the variety of states they cover.21 These studies are typical of their respective literatures, though not exhaustive.22 In each case, I include notes on the sample, setting, and how community college enrollment (“treatment”) is measured.

Table 6. Summary of Estimated Earnings Premia for Community College Students from Studies Using Survey and Administrative Data

Type | Study | Source | Sample | Outcome Year(s) | "Treatment" | Earnings Premium
Survey data | Kane and Rouse 1995 | NLSY-79 | 25- to 32-year-olds | 1990 | CC, no degree | 0.101
 | | | | | AA degree | 0.271
 | Gill and Leigh 1997 | NLSY-79 | 28- to 35-year-old returning students | 1993 | CC, no degree | 0.080
 | | | | | AA degree | 0.221
 | Gill and Leigh 1997 | HS&B | ~28-year-olds | 1992 | CC, no degree | 0.079
 | | | | | AA degree | 0.253
Administrative data | Jacobson, LaLonde, and Sullivan 2005 | Washington | Displaced workers | 1990-95 | CC, any enrollment | 0.026
 | | | | | CC, 4-75 credits | 0.148
 | Jepsen, Troske, and Coomes 2014 | Kentucky | 20- to 60-year-olds w/ work history | 2000-08 | AA degree | 0.522
 | Bahr 2016 | California | First-time college students | 2002-13 | AA degree | 0.351
 | Liu, Belfield, and Trimble 2015 | North Carolina | Adults entering CC in 2002-03 | 2011 | 1-year FTE CC | 0.101
 | | | | | AA degree | 0.277
 | Belfield 2015 | Arkansas | Adults with work history | 2000-12 | AA degree | 0.102

Notes: Estimates of earnings premia for community college students are weighted averages based on author's calculations using reported coefficients, means, and sample sizes. NLSY = National Longitudinal Survey of Youth; CC = community college; AA = Associate's; HS&B = High School and Beyond; FTE = full-time equivalent.

Previous research using survey-based data puts the earnings differences between those enrolling in community college without earning a degree and high school graduates at between 5 and 10 percent. Those earning an AA degree earn about 22 to 27 percent more than high school graduates. The estimates in the current paper from the ELS (table 3) imply a higher earnings premium for attendance without a degree (22 percent) and a lower premium for receipt of an AA degree (13 percent). This is the same pattern noted when comparing the ELS estimates with the NELS estimates in table 5. The ELS cohort appears to have earned a higher premium for enrollment in community college, but not for earning an AA degree.

The bottom panel of table 6 includes results from several recent studies using administrative data, along with the earlier seminal study of displaced workers by Jacobson, LaLonde, and Sullivan (2005). For the displaced worker population, community college enrollment resulted in an earnings premium only if the worker earned a substantial number of credits. More recent studies of broader populations find that those earning AA degrees earn between 10 and 50 percent more than comparable peers who enrolled in a community college but completed few or no credits. Liu, Belfield, and Trimble (2015) estimate that those earning at least one FTE year's worth of community college credit subsequently earn about a 10 percent premium. The results estimated here for the ELS cohort are smaller than the degree and diploma premia estimated in administrative data.

To make sense of the current estimates for the purposes of informing policy discussions, it is useful to consider their magnitude in relation to the costs of enrollment. The results in table 3 suggest that community college increases earnings by about 21 percent. At the mean, that would improve earnings by about $3,350 per year. Those earning at least one FTE year's worth of credits would earn about $6,300 more per year. The average tuition and fee cost net of grant aid for a year of full-time community college was $6,291 in 2009.23 A student who studied full-time at a community college for a year (without working) and left without a degree would incur foregone wages of about $24,000 in addition to tuition and fee costs. A simple calculation (without accounting for the countervailing effects of discounting and differential rates of wage growth) implies that the earnings effect would compensate for the opportunity and direct costs of community college within five years. Even if the estimates here are 50 percent higher than the true causal effects, the college investment would be paid off within seven years. And if the student worked during college (as many community college students do), the opportunity cost would be reduced and the investment costs recouped even faster.
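For concreteness, the short calculation below reproduces this back-of-envelope payback arithmetic using only the figures cited above; it is a sketch of the reasoning, not part of the paper's analysis, and it ignores discounting and wage growth just as the text does.

# Back-of-envelope payback period using the figures cited in the text,
# ignoring discounting and differential wage growth.

net_tuition_fees = 6_291   # net tuition and fees, one year of full-time community college (2009)
foregone_wages = 24_000    # approximate wages foregone during a year of full-time study
total_cost = net_tuition_fees + foregone_wages

annual_gain = 6_300        # extra annual earnings for one FTE year's worth of credits

print(total_cost / annual_gain)          # ~4.8 years -> paid off within five years
print(total_cost / (annual_gain / 1.5))  # ~7.2 years -> within seven if the effect is overstated by 50 percent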

Although the earnings gains associated with community college for the ELS cohort are clear, the declining real incomes of young workers are essential to interpreting the enduring relative earnings advantage of postsecondary education. In 2000, the median annual earnings of 25- to 34-year-olds employed full-year, full-time with an AA degree were $41,240. By 2012, the figure was $36,830.24 The median cost of the two years' worth of tuition and fees necessary for an AA degree was $5,426 for the ELS cohort (in 2014 dollars). For the NELS cohort it was $3,358.25 So for an ELS sample member, an associate's degree cost about 15 percent of subsequent median pre-tax annual earnings. For the NELS cohort, the tuition and fees needed to cover the cost of an AA degree took only 8 percent of median pre-tax annual earnings. Even if the earnings advantage associated with community college education is as large for Millennials as it was for Generation X (educated in the early 1990s), the cost of paying for that education relative to real earnings has risen markedly. These real changes are surely part of the reason for the misgivings expressed by young college-educated workers, who have paid more for a college education than their predecessors but earn less.
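A quick check of those two ratios, using only the dollar figures cited above (all in 2014 dollars):

# Cost of an AA degree relative to subsequent median earnings, by cohort.

els_tuition, els_earnings = 5_426, 36_830    # ELS cohort: two years of tuition/fees vs. 2012 median earnings
nels_tuition, nels_earnings = 3_358, 41_240  # NELS cohort: two years of tuition/fees vs. 2000 median earnings

print(round(els_tuition / els_earnings, 3))   # ~0.147 -> about 15 percent
print(round(nels_tuition / nels_earnings, 3)) # ~0.081 -> about 8 percent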

Notes

1. Estimates from Digest of Education Statistics, 2016, Table 302.10: https://nces.ed.gov/programs/digest/d16/tables/dt16_302.10.asp?current=yes.
2. During the end of the Great Recession, the relative increase in enrollment at two-year colleges was most pronounced, peaking at 28.2 percent in 2012, whereas rates of enrollment in four-year colleges fell.
3. Community colleges are public two-year postsecondary institutions that award associate degrees as their highest degrees. This includes junior colleges, but not proprietary schools.
5. For example, a subset of the literature using administrative data examines the labor market effects of career technical education at community colleges (e.g., Stevens, Kurlaender, and Grosz 2015). This is related substantially to the questions taken up by Jacobson, LaLonde, and Sullivan (2005).
6. This is an issue likely to be pertinent in many of the states relevant to this literature. For example, in Kentucky, the Louisville metro area straddles the Indiana border, and the Cincinnati (Ohio) metro area is among the largest in Kentucky. Virginia's most populous region is Northern Virginia, part of a metro area shared with Washington DC, Maryland, and West Virginia. Charlotte, the largest metro area in North Carolina, straddles the border with South Carolina.
7. A total of 96.2 percent of the ELS respondents with a high school diploma graduated high school by 2004.
8. I exclude those who enroll in for-profit colleges.
9. Approximately 14 percent of the ELS sample who enrolled in community college as their initial postsecondary education did so after 2006. The study here is limited to the 86 percent who enrolled within two years of high school graduation.
10. This feature of the ELS hampers comparison with findings from research using administrative data, which focuses on students enrolling in school after some period in the labor force. I discuss this external validity constraint in the conclusions.
11. U.S. Bureau of Labor Statistics (http://data.bls.gov).
12. The exclusion restrictions dropped 1,056 records of those who did not earn a high school diploma, and 5,890 who enrolled in four-year colleges after high school.
14. For discussion of propensity score matching, see Imbens (2004) and Stuart (2010).
15. The NELS provides an ideal comparison to the ELS for several reasons. These include readily comparable measures of family background, student ability, and employment outcomes.
16. Figure 2 presents math score differences. Reading score differences are nearly identical. In both cases, the differences are about 0.3 standard deviation.
17. Marginal effects at the means from logistic regression estimates are substantively similar to the linear probability model coefficients.
18. I exponentiate coefficients from log-linear models for more precise estimated percent changes.
19. Specifications in table 4 are identical to columns 2 and 4 in table 3.
20. Treatment effects under different degrees of selection on observed and unobserved variables can be calculated in Stata using psacalc.
21. The studies using administrative data were conducted via the Center for Analysis of Postsecondary Education and Employment, funded by the U.S. Department of Education's Institute of Education Sciences.
22. More extensive reviews of the literature can be found in Belfield and Bailey (2011, 2017).
23. Digest of Education Statistics (2015), Table 331.30.
24. Digest of Education Statistics (2015), Table 502.30. Both figures are in 2014 dollars.
25. Digest of Education Statistics (1995), Table 306. Dollars converted to 2014 using the Consumer Price Index for All Urban Consumers.

Thanks to Brad Herschbein, Stephanie Cellini, Amy Ellen Schwartz, and participants at the 2016 APPAM International Research Conference in London for helpful comments and suggestions. I benefitted from the research assistance of Kari Dalane and Molly Wiltshire. Any remaining errors are my own.

References

Altonji, Joseph G., Todd E. Elder, and Christopher R. Taber. 2005. Selection on observed and unobserved variables: Assessing the effectiveness of Catholic schools. Journal of Political Economy 113(1): 151-184.
Bahr, Peter Riley. 2016. The earnings of community college graduates in California. New York: Center for Analysis of Postsecondary Education and Employment (CAPSEE) Working Paper.
Belfield, Clive R. 2015. Weathering the Great Recession with human capital? Evidence on labor market returns to education from Arkansas. New York: Center for Analysis of Postsecondary Education and Employment (CAPSEE) Working Paper.
Belfield, Clive R., and Thomas Bailey. 2011. The benefits of attending community college: A review of the evidence. Community College Review 39(1): 46-68.
Belfield, Clive, and Thomas Bailey. 2017. Model specifications for estimating labor market returns to associate degrees: How robust are fixed effects estimates? New York: Center for Analysis of Postsecondary Education and Employment (CAPSEE) Working Paper.
Dadgar, Mina, and Madeline Trimble. 2015. Labor market returns to sub-baccalaureate credentials: How much does a community college degree or certificate pay? Educational Evaluation and Policy Analysis 37(4): 399-418.
Gill, Andrew, and Duane Leigh. 1997. Labor market returns to community colleges: Evidence for returning adults. Journal of Human Resources 32(2): 334-353.
Imbens, Guido W. 2004. Nonparametric estimation of average treatment effects under exogeneity: A review. Review of Economics and Statistics 86(1): 4-29.
Jacobson, Louis, Robert LaLonde, and Daniel G. Sullivan. 2005. Estimating the returns to community college schooling for displaced workers. Journal of Econometrics 124(1): 271-304.
Jepsen, Christopher, Kenneth Troske, and Paul Coomes. 2014. The labor-market returns to community college degrees, diplomas, and certificates. Journal of Labor Economics 32(1): 95-121.
Kane, Thomas, and Cecilia Rouse. 1995. Labor-market returns to two- and four-year college. American Economic Review 85(3): 600-614.
King, Gary, Richard Neilsen, Carter Coberley, James E. Pope, and Aaron Wells. 2011. Comparative effectiveness of matching methods for causal inference. Unpublished paper, Harvard University.
Levy, Frank, and Richard Murnane. 1992. U.S. earnings levels and earnings inequality: A review of recent trends and proposed explanations. Journal of Economic Literature 30(3): 1333-1381.
Liu, Vivian Y. T., Clive R. Belfield, and Madeline J. Trimble. 2015. The medium-term labor market returns to community college awards: Evidence from North Carolina. Economics of Education Review 44: 42-55.
Malamud, Ofer, and Abigail Wozniak. 2012. The impact of college education on geographic mobility. Journal of Human Resources 47(4): 913-950.
Marcotte, Dave E. 2010. The earnings effect of community college education. Contemporary Economic Policy 28(1): 36-51.
Marcotte, Dave E., Thomas Bailey, Carey Borkoski, and Greg Kienzl. 2005. The returns for education at community colleges: Evidence from the National Education Longitudinal Survey. Educational Evaluation and Policy Analysis 27(2): 157-175.
Murphy, Kevin M., and Finis Welch. 1992. The structure of wages. Quarterly Journal of Economics 107(1): 285-326.
National Center for Education Statistics (NCES). 2014. Education Longitudinal Study of 2002 third follow-up data file documentation appendixes (NCES 2014-364). Available at https://nces.ed.gov/pubs2014/2014364_Appendixes.pdf. Accessed 19 March 2019.
Oster, Emily. 2017. Unobservable selection and coefficient stability: Theory and evidence. Journal of Business & Economic Statistics 37(2): 187-204. doi:10.1080/07350015.2016.1227711.
Shapiro, Doug, Afet Dundar, Phoebe Wakhungu, Xin Yuan, Angel Nathan, and Youngsik Hwang. 2016. Completing college: A national view of student attainment rates - Fall 2010 cohort. Available at https://nscresearchcenter.org/wp-content/uploads/SignatureReport12.pdf. Accessed 22 March 2019.
Stevens, Ann Huff, Michal Kurlaender, and Michel Grosz. 2015. Career technical education and labor market outcomes: Evidence from California community colleges. NBER Working Paper No. 21137.
Stuart, Elizabeth A. 2010. Matching methods for causal inference: A review and a look forward. Statistical Science 25(1): 1-21.
Wozniak, Abigail. 2010. Are college graduates more responsive to distant labor market opportunities? Journal of Human Resources 45(3): 944-970.
Xu, Di, and Madeline Trimble. 2016. What about certificates? Evidence on the labor market returns to nondegree community college awards in two states. Educational Evaluation and Policy Analysis 38(2): 272-292.