When I started my job as research director of the Massachusetts Department of Elementary and Secondary Education twelve years ago, I thought my job was to figure out what worked. My agency was just beginning to have access to new and exciting longitudinal data on students and educators. I envisioned that I'd use those data along with strong designs for causal inference to determine which programs and policies were working and which were not. Once we knew those answers, I figured, we would get better policy that would improve outcomes for Massachusetts’ students.

But in my twelve years in this job, I've learned that the process of improving policy¹ through research is much subtler and more complex than I had initially imagined. Research influences policy more often than much of the academic community thinks, and more frequently every day as we learn how to do this work better. But its influence is less linear than researchers expect, and it is driven as much by relationships and organizational capacity as by the actual information studies produce. Research use operates through conversations, not code; structures in organizations, not standard errors; relationships, not randomized controlled trials.

I worry that the growing national efforts to connect research and policy too frequently start from the same “find what works” frame of mind that I did twelve years ago. The “find what works” approach misunderstands the problem of research use as one of lack of information—either lack of information about the impact of a policy or lack of awareness by the policy maker about the available information—and a need for “translation” across sectors (Penuel et al. 2015). This framing is at odds with the research literature on how research actually plays into the policy decision process. If research is insufficiently used in policy making, it's because we have too few conversations between the policy and research communities, not because we have too few policy briefs.

The research on research use is clear: If we want research to matter for policy, we need to devote resources to building relationships and strengthening organizational practices in service of building organizations that learn. This will require researchers, universities, policy makers, practitioners, and professional associations like the Association for Education Finance and Policy (AEFP) to reconsider their activities and priorities and create new ways of working across sectors. This work will be complex, messy, and at times uncomfortable. But it is work worth doing.

The irony of my early years in my role is that if I had stopped to read the research on how research is used in policy making, I might have become effective in my role much faster. Decades of studies on this topic, starting with Carol Weiss's groundbreaking work in the 1970s and 1980s, shed light on when and how research is used to effect organizational and political change (Weiss 1977, 1980, 1982).

A key finding from this literature is that decision making is a process, not an event. Policy makers don't just mark off their calendar for “Decision Day.” They gather information on an issue over time and from a variety of sources, often in the absence of a specific pending decision. They make initial choices when decision opportunities arise, and they adjust course in an iterative process (Weiss 1980, 1982; Kingdon 1984). Further, their decisions tend to be less about choosing between programs and more about designing a new system or process for a specific context (Penuel et al. 2017, 2018). Nor, for that matter, is there a singular decision maker. Legislators, bureaucrats, advocates, consultants, and others all play a role in the policy development and implementation process. Decisions “accrete through small uncoordinated steps taken in many offices” (Weiss 1980, p. 382).

Indeed, this is how decisions are made in any organization, not just state houses and school districts. Suppose, for example, that a university was considering creating a new doctoral program in education policy. Conversations about whether to pursue this would probably take place over several years, with many different stakeholders from across the university community weighing in. Program designers would gather information from prior research, but also from other universities with similar programs, funders, student enrollment and market conditions, and so forth. The moment when the provost or president approved the program would be the culmination of years of incremental discussions and decisions, each of which subtly influenced the ultimate outcome. And that process would continue even after the decision was made, as the program was implemented and the department learned which dimensions were succeeding or needed improvement.

It should be no surprise to researchers, then, that policy decisions work exactly the same way. Insights from research certainly weigh in policy makers’ minds, but so too do many other factors—personal and community values, constituent concerns, budgets, legal constraints, and so on (Weiss 1982). Indeed, that is how a democratic system is supposed to work. Policy makers’ jobs are to consider the available information on an issue and use that information in conjunction with their values and professional judgment to make decisions (Brighouse et al. 2018).

Research may inform those judgments directly, through what Carol Weiss calls “instrumental” use of research (1977). This type of use is what I had in mind when I arrived at my agency—using findings from particular studies to determine next steps for policy development. But just as often, research informs decisions indirectly, by creating new frameworks or ways of thinking about problems. Weiss calls this “conceptual” or “enlightenment” use (Weiss 1977, 1980). This can take different forms, such as introducing new concepts, seeing problems in a new light, shifting understandings about possible solutions, or providing a framework to guide action (Farrell and Coburn 2016b). Weiss argues that “the major effect of research on policy may be the gradual sedimentation of insights, theories, concepts, and ways of looking at the world” (1977, p. 535).

How do policy makers gain access to insights from research? A recent study by the National Center for Research in Policy and Practice sheds light. The study surveyed a nationally representative sample of district leaders in midsize and large districts and found that both the format and the messenger are critical. When asked to cite a piece of research that influenced their work, 58 percent of respondents cited a book; the next most common was a policy report, at 17 percent. Individual articles from peer-reviewed journals were cited even less frequently (Penuel et al. 2018). Respondents also rarely reported learning about research directly from researchers, relying instead on trusted, known intermediaries. The most common way that district respondents reported accessing research “often” or “all the time” was through their professional associations, at 53 percent. Conferences came next, at 40 percent, and people in other school districts third, at 39 percent (Penuel et al. 2017). Formal resources aimed at providing information to practitioners but lacking sustained personal connections, such as the What Works Clearinghouse and the Regional Educational Laboratories, and translators of research such as print or social media, were used at less than half these rates.

Organizational context and structure can also advance research use. Where research use is most sophisticated, policy makers may gain access to research through a person who sits between the research and policy communities, often referred to in this literature as a broker or boundary-spanner (Penuel et al. 2015). Brokers engage in “intentional efforts … to make space for and enter into joint work with partners whose work involves responsibilities, expertise, pressures, and strategies different from one's own” (Penuel et al. 2015, p. 190). They create organizational norms and routines that allow for connections across perspectives, and they push people beyond their comfort zone in service of advancing the partnership's goals. By doing so, they help span gaps in perspective and values across professional communities and find productive ways for them to work together (Farley-Ripple et al. 2018). They also increase the absorptive capacity of organizations—that is, their ability to interpret and act on findings from research (Farrell and Coburn 2016a). At their best, brokers help organizations learn.

Instead of beginning with a model of decision making as Decision Day, let us instead begin with a model of decision making as it happens in the real world: working through relationships, embedded in organizations, influenced by information from many sources, and evolving over time. If this is how policy decisions are made, then the next question for the research community is: What is the best way to maximize the influence of research on this untidy, indirect process?

The crucial insight from the research literature is this: Research use is relational, organizational, and interpretive. To have impact, research must be embedded in organizational structures and personal, trusting relationships that give policy makers space to interpret research and construct their own meaning from it (Coburn 2018; Farley-Ripple et al. 2018; Farrell, Coburn, and Chong 2018). Thankfully, several promising new strategies for addressing this issue have emerged—ones that explicitly acknowledge the role of relationships, organizations, and interpretation in helping policy makers to use information more effectively. AEFP is playing an important role in promoting and sustaining these new models.

If relationships among people are how research use happens, then networks and professional associations are linchpins in that process. And if the goal is to drive research use among policy makers, then an obvious first step is to put policy makers and researchers in the same room.

AEFP is the only professional policy research association I am aware of that has sought out grant funding to bring policy makers and practitioners to its conference, recognizing that travel funds are often limited for public sector employees. Over the past three years we have supported travel costs for well over one hundred policy makers to attend the conference. We also created Policy Talks, a new type of session that identifies the broad themes or findings in an area of research and creates a conversation between policy makers and researchers on those issues. And we created the Ambassadors Breakfast, in part to give policy makers and researchers an opportunity to interact informally around shared interests and begin to build personal connections early in the conference. Since making these shifts, we have seen association membership double among policy makers.

We have also become more strategic in identifying two types of people working in policy and practice that would most benefit from AEFP conference attendance: the people leading an agency's research efforts, and the people who work in research-related roles at policy associations, such as the Council of Chief State School Officers or Education Commission of the States. These are brokering and boundary-spanning roles; people in them need to connect with the research community to do their work well. Although the content presented at AEFP may sometimes be too narrow for a typical superintendent or commissioner, it is invaluable for a research director trying to stay up to speed on the latest research in the priority areas for her district or state, or for a person charged with organizing policy conferences who needs to find experts as panel members or advisors. We are now prioritizing our travel funds and other connecting activities on supporting these two types of practitioners.

We are working to build stronger connections with education policy associations, to create more structural opportunities for the policy and research communities to interact. For the first time this year, we asked these associations to weigh in on priority topics for the Policy Talks, to work toward the goal of having the content at the conference reflect both the best research and current policy needs. We are also convening an advisory group of staff from organizations that connect researchers and practitioners to discuss how we can build stronger, sustained relationships across our associations.

I doubt my agency had read the research on research brokers or absorptive capacity before creating my position in Massachusetts; after all, they had no broker to bring it to their attention. Nonetheless, a broker was exactly what they created—an internal role fostering relationships between policy makers and researchers and shifting organizational practices in a way that increased the ability of the agency to use research effectively.

Situating this type of role internal to an organization allows the research director to be more aware of the current policy issues and, crucially, more connected to the agency's needs and ongoing routines. I offer some “anecdata” to make my case. Last year, I forwarded to my deputy commissioner some materials on options for measuring student growth. He wrote back, “Many people send me articles that I have neither the time nor the inclination to read. What's annoying about you is that the articles you send are so on point to the work we're doing that I feel compelled to read them.” I could not have annoyed my deputy nearly as effectively if my role were not deeply ingrained in the agency's work.

Of course, embedded research directors can only be effective to the degree their positions are given the positional and relational authority to influence organizational practice. Siloed away from decision makers, operating only within one policy office or division, or given too many responsibilities for time-sensitive, intensive work (such as assessment or accountability), they cannot hope to increase the organization's overall capacity to build and use research evidence (Conaway 2015; Schwartz 2015). Conversely, when placed into a supportive structure, embedding a research director is one of the most effective ways for organizations to accelerate this work.

AEFP has furthered the professional development of research directors by giving us a space to connect with one another. Research director roles have been relatively common, though not ubiquitous, in larger school districts for a while now. But when I started in my role twelve years ago, I was the only state education agency research director of this type in the nation. I was what Dan Goldhaber memorably described as a “golden unicorn”—that rare person working in a policy or practice setting who has “an excellent grasp of what constitutes good research” (Goldhaber 2018).

I may have been a unicorn, but I was alone in the forest until Nate Schwartz came along. Nate joined the Tennessee Department of Education in a role comparable to mine in July 2012. That single connection to another person doing similar work dramatically improved my own. It allowed me to reflect on my own practice and gave me access to new ideas, strategies, and opportunities that I could then adapt and implement in my own setting. Now the golden unicorns extend nationally into at least fifteen states and many more districts, as well as higher education settings. We have even begun an informal Golden Unicorn Society at the AEFP conference to share common challenges and concerns, and support one another in our work. One unicorn is already something special, but a group of unicorns is—literally and figuratively—a blessing.

An increasingly popular strategy for conducting policy research with impact is via research–practice partnerships (RPPs), defined by Coburn and Penuel (2016) as “long-term collaborations between practitioners and researchers that are organized to investigate problems of practice and solutions for improving schools and districts” (p. 48). RPPs differ from traditional research models in part by focusing on the problems practitioners want to solve rather than the questions researchers want to answer. But they also differ by explicitly elevating and supporting the relationship side of research use. They are intentionally organized to build sustained relationships between researchers and practitioners as a means of improving practice. Some RPPs are formal and institutionalized—for example, the longstanding Chicago Consortium on School Research or many of the other members of the National Network of Education Research-Practice Partnerships. But RPPs can also be thought of as an orientation toward the work—a more collaborative, relationship-based approach to conducting research that could apply in any policy or practice setting.

Most RPPs are designed to improve outcomes for students while simultaneously improving access to, and use of, research in education organizations, and the relevance of research conducted by partners. The research on whether they attain these goals is nascent, and the nature of the intervention does not lend itself easily to causal inference (Coburn and Penuel 2016). But another National Center for Research in Policy and Practice study, this one examining the recipients of Institute of Education Sciences RPP grants, lends some insight into the organizational changes that may occur through RPPs (Farrell et al. 2018).

The study finds that “the majority of practitioners reported becoming better at using research in their work and were more likely to do so because of their participation in the partnership. Almost all of the researchers agreed that they had become better at conducting research that meets the needs of practitioners” (Farrell et al. 2018, p. 3). Interviews with RPP participants revealed that “both education leaders and researchers reported shifts in three key areas: their orientation toward research, their knowledge and skills about the research process, and their communication practices with stakeholders.” Further, their peers on the other side of the partnership also observed these changes (Davidson 2018).

RPPs are not the solution to all our research impact woes. They are not appropriate for all research questions—some require a more distant, hands-off relationship, and some don't merit the deep investment of time and effort necessary for an RPP to flourish. They are resource-intensive and thus tend to privilege more senior researchers (who worry less about getting publications for tenure), and larger education organizations (which tend to have greater administrative capacity for research and larger sample sizes that make inferential statistics more useful). And, they can be challenging to implement and sustain, precisely because they push the traditional boundaries of research and practice.

But having participated in several RPPs myself, I can attest to their value for changing how agencies use research evidence, particularly when the RPP is focused on a topic of long-term strategic interest for policy making. I can also attest to their impact on the relevance of research conducted by the research partners. As researchers become more connected to and embedded in organizations, they are better able to identify questions that practitioners value answering and find ways to include those questions in their research agendas.

The 2019 AEFP conference featured research from RPPs such as the Education Research Alliance at Tulane, the Education Policy Innovation Collaborative at Michigan State, the Tennessee Education Research Alliance at Vanderbilt, and many others. This demonstrates that a focus on questions of practice need not imply that the resulting work is less valuable to the research community.

All of these strategies hold great promise for increasing the influence of research on policy. But I think we can push even further. What if the research community thought of our end goal not as getting ideas from research into policy decisions, but as helping policy and practice organizations shift from organizations that do to organizations that learn?

Our whole way of approaching our work would be different. We would recognize that the most effective way to build systematic capacity to learn is through sustained connections across organizations and people. Therefore, we would value the time we spend on building relationships that allow us to ask meaningful questions and learn from their answers as much as we value the time we spend on producing research itself. We would appreciate that it is these relationships that allow ideas from research to take root.

We would see that the research community's contribution operates as much through its structured approach to learning as through any specific knowledge it generates. We would take advantage of that by collaborating to build structured approaches to learning within education organizations, supported through strategically positioned research brokers and partnerships. Crucially, these approaches would include organizational routines that allow policy makers and practitioners to make meaning from research and take appropriate action. Through all of this effort, ideas would diffuse organically across the policy and research communities, enriching both and making both more effective than they would otherwise have been (Gordon, Palmer, and Darling-Hammond 2019).

Enacting this vision would require change on the part of education agencies, individual researchers, universities, and professional associations like AEFP. Education agencies, whether states or districts, would need to invest in greater capacity for building and using evidence as a core part of their work. This capacity could come in a variety of forms: for example, training in the principles of evidence use; embedded research directors; and/or research partnerships. But the expectation should be that all educators are capable of evaluating evidence and using it to improve their organizations. The Institute of Education Sciences could play a role by catalyzing these investments through research partnership and training grants and by directly funding the embedded research directors that we know can dramatically shift organizational practices.

The work of individual researchers who want to help organizations learn would shift toward one of several pathways to impact. Some researchers might participate directly in building structured approaches to learning by serving as embedded research directors, brokers, and/or research partners themselves. But even researchers outside those roles could still influence organizational learning. Those who are working with a district or state to conduct a study could prioritize routines that create space for sharing preliminary findings and for discussing and interpreting results, just as a formal broker would. Others could consider writing a book, a framing article, or broad, nontechnical pieces about their field of inquiry that could help shift policy makers’ thinking, or they could share their work through Policy Talks at AEFP or at local or national meetings of policy makers.

Universities would need to shift in two ways. First, they would need to reconsider the balance of how different types of output are valued by their institutions, to put greater emphasis on effort spent on impact outside the ivory tower. At a time when the value of higher education is increasingly questioned, this would be a direct way to demonstrate the university's impact in the community. For inspiration, they might look to the Research Excellence Framework, which the United Kingdom uses to assess the quality of research output from its institutions of higher education. Impact is explicitly included as a criterion, and the United Kingdom has developed nuanced ways of measuring impact across the full range of academic disciplines (Research Excellence Framework 2019).

Second, universities would need to create opportunities for researchers, policy makers, and practitioners to learn the skills needed to do this work. Right now, people learn this the hard way, through investing a substantial amount of time and making mistakes along the way. Education organizations could much more quickly learn how to learn if their own staff and their research partners were explicitly trained in these skills. This includes how to use and build evidence as part of program and policy development, how to design a collaborative learning agenda, how to critically evaluate whether research is convincing and relevant, how to incorporate research into improvement processes, how to broker relationships across research and practice organizations, and so on. Training opportunities could range from short professional development or workshop opportunities for existing practitioners and researchers to full degree programs for masters or doctoral students preparing for these roles.

Professional associations like AEFP would play a unique role in this work. AEFP has been leading the way nationally with its efforts to make its content more directly relevant and appealing to policy makers and practitioners, and to explicitly seek them out as attendees and participants. If we want to increase the influence of research on policy, we need to redouble our efforts to invest in building strong relationships between these new attendees and our historically research-oriented membership. We can become one of the few spaces where perspectives from policy makers, practitioners, and researchers are all valued and where connections can be built across sectors. And we can demonstrate for others the benefits of taking this approach.

AEFP's mission is to promote understanding of the means by which resources are generated, distributed, and used to enhance human learning. To achieve our mission, we need to broaden our conception of “promoting understanding” to include the relational, organizational, and interpretive activities I have described in this essay. Society has invested tremendous resources in both education and research. We will maximize the return on that investment when we move beyond simplistic models of increasing research use to a model of building educational organizations that learn.

1. This essay centers on the influence of research on policy, rather than practice. This is both because I have more expertise in the policy process than I do in issues of direct practice and because AEFP members’ research tends to focus more on policy. I suspect, however, that many of the same insights would also apply in practice settings.

I benefited from conversations with many people in the development of the themes in this essay. Caitlin Farrell, Dan Goldhaber, Nora Gordon, Bob Lee, Andy Porter, and Nate Schwartz kindly read drafts and provided numerous helpful suggestions. Others whose input influenced my thinking include Paula Arce-Trigatti, Laura Booker, John Easton, Liz Farley-Ripple, Steve Fleischman, Venessa Keesler, Sara Kerr, Antoniya Marinova, Bill Penuel, Morgan Polikoff, Vivian Tseng, and the Results for America State Education Fellows. I wrote this essay when I was the Chief Strategy and Research Officer at the Massachusetts Department of Elementary and Secondary Education. I thank my colleagues at the agency for their generosity of time and spirit, and for the value they place on research in guiding their work. I dedicate this essay to our late commissioner, Mitchell Chester, who was as evidence-based a public servant as one could ever hope to work for.

Brighouse, Harry, Helen F. Ladd, Susanna Loeb, and Adam Swift. 2018. Educational goods: Values, evidence, and decision-making. Chicago: University of Chicago Press.

Coburn, Cynthia. 2018. Pathways from research to policy: Implications for researchers and policymakers (part 1 of 2). Available https://nzareblog.wordpress.com/2018/01/19/coburn-research-policy-1/. Accessed 7 August 2019.

Coburn, Cynthia, and William R. Penuel. 2016. Research-practice partnerships in education: Outcomes, dynamics, and open questions. Educational Researcher 45(1): 48–54.

Conaway, Carrie. 2015. Better policy through research: Pursuing high-impact research in state education agencies. In The SEA of the future: Building agency capacity for evidence-based policymaking, vol. 5, edited by Betheny Gross and Ashley Jochim, pp. 3–22. Austin, TX: Building State Capacity and Productivity Center.

Davidson, Kristen. 2018. What happens when educators and researchers work together in partnerships? Available http://blogs.edweek.org/edweek/urban_education_reform/2018/07/what_happens_when_educators_and_researchers_work_together_in_partnerships.html. Accessed 7 August 2019.

Farley-Ripple, Elizabeth, Henry May, Allison Karpyn, Katharine Tilley, and Kalyn McDonough. 2018. Rethinking connections between research and practice in education: A conceptual framework. Educational Researcher 47(4): 235–245.

Farrell, Caitlin C., and Cynthia E. Coburn. 2016a. Absorptive capacity: A conceptual framework for understanding district central office learning. Journal of Educational Change 18(2): 135–159.

Farrell, Caitlin C., and Cynthia E. Coburn. 2016b. What is the conceptual use of research, and why is it important? Available http://wtgrantfoundation.org/conceptual-use-research-important. Accessed 7 August 2019.

Farrell, Caitlin C., Cynthia E. Coburn, and Seenae Chong. 2018. Under what conditions do school districts learn from external partners? The role of absorptive capacity. American Educational Research Journal 56(3): 955–994.

Farrell, Caitlin C., Kristen L. Davidson, Melia Repko-Erwin, William R. Penuel, Mary Quantz, Hayla Wong, Rubbin Riedy, and Zane Brink. 2018. A descriptive study of the IES Researcher-Practitioner Partnerships in Education Research Program: Technical Report No. 3. Boulder, CO: National Center for Research in Policy and Practice.

Goldhaber, Dan. 2018. Impact and your death bed: Playing the long game. Education Finance and Policy 13(1): 1–18.

Gordon, Dan, Scott Palmer, and Sean Darling-Hammond. 2019. Transforming the education sector into a learning system: Harnessing the power of continuous improvement, research & development, and data to improve outcomes for each and every child. Washington, DC: Education Counsel.

Kingdon, John W. 1984. Agendas, alternatives, and public policies. Boston: Little, Brown.

Penuel, William R., Anna-Ruth Allen, Cynthia E. Coburn, and Caitlin Farrell. 2015. Conceptualizing research-practice partnerships as joint work at boundaries. Journal of Education for Students Placed at Risk 20(1–2): 182–197.

Penuel, William R., Derek C. Briggs, Kristen L. Davidson, Corinne Herlihy, David Sherer, Heather C. Hill, Caitlin Farrell, and Anna-Ruth Allen. 2017. How school and district leaders access, perceive, and use research. AERA Open 3(2): 1–17.

Penuel, William R., Caitlin C. Farrell, Anna-Ruth Allen, Yukie Toyama, and Cynthia E. Coburn. 2018. What research district leaders find useful. Educational Policy 32(4): 540–568.

Research Excellence Framework. 2019. Panel criteria and working methods (2019/02). Available https://www.ref.ac.uk/publications/panel-criteria-and-working-methods-201902/. Accessed 16 February 2019.

Schwartz, Nathaniel. 2015. Making research matter for the state education agency. In The SEA of the future: Building agency capacity for evidence-based policymaking, edited by Betheny Gross and Ashley Jochim, pp. 23–39. Austin, TX: Building State Capacity and Productivity Center.

Weiss, Carol. 1977. Research for policy's sake: The enlightenment function of social research. Policy Analysis 3(4): 531–545.

Weiss, Carol. 1980. Knowledge creep and decision accretion. Knowledge: Creation, Diffusion, Utilization 1(3): 381–404.

Weiss, Carol. 1982. Policy research in the context of diffuse decision-making. Journal of Higher Education 53(6): 619–639.