Conflict resolution as an academic field was built by pioneers like John Burton on the idea that conflict resolution academics must accomplish three basic interlocking tasks: conduct cutting‐edge research, educate students, and make a positive contribution to the real‐world work of practitioners engaged in resolving actual conflicts. As the field has grown, so have the demands to produce research that meets increasingly competitive and rigid traditional standards, and so too have the demands of teaching growing numbers of students. Research and teaching commitments thus diminish the time and support available to engage in practice. In this article, I consider those pressures within universities and also consider the options available to conflict resolution scholar–practitioners outside the traditional university.

When I was an undergraduate studying international relations at University College London, I was taught and much influenced by a professor who had very firm views about the utilization of research findings and about the role of academics — knowledge producers — in the employment of those findings. This was hardly surprising, as my professor had started his career as an Australian diplomat and, just after the end of the Second World War, had become the youngest head of the Australian Department of External Affairs. After this, he had been hounded out of public life during the McCarthy period of the Cold War, which was as bad in Australia as it was in the United States, if not worse, and had ended up teaching at London University in the early 1960s.

John Burton was firmly convinced that any academic institution worth its salt had three basic interlocking tasks to fulfill. The first was the traditional business of carrying out “cutting‐edge” research and producing insights into, and knowledge about, a particular field of study; in his case, the field was international conflict. The second task was to educate students about that field — what we knew about it and what we had yet to find out — and particularly those students who would form the next generation of researchers and decision makers in that field. So far, so traditional. His third task was more controversial, however, as Burton was thoroughly convinced that it was quite legitimate for academics to apply the latest knowledge about the field directly to contemporary problems and to offer insights about process and substance to those engaged professionally and politically in the business of conflict resolution, especially of conflicts deemed intractable and deep‐rooted.

For Burton, these three tasks were intimately connected. Engaging in research informed and enlivened the process of teaching interested, aware, and critical students and kept professors “on their toes” when faced with challenges from the critical young. Being involved in the ongoing practice of conflict resolution in a variety of practical ways would perform a similar role but, equally importantly, it would enable theorists and researchers to test out their ideas in the real world of protracted conflict. Burton frequently argued that if a theory or an insight was to be seen as valid as well as useful, it had to be recognized as such by people actually entrapped as adversaries in a real‐world conflict. This would be the real test of a hunch, a hypothesis, or a theory. He always believed that the work of those who came to be known as “scholar–practitioners” engaged in “Track Two” initiatives should be evaluated largely through the reactions of those parties in conflict who found the ideas appropriate and reflective of their own experience.

With this as a basis, Burton argued that a university was an ideal place from which to undertake not merely conflict research but also the practice of conflict resolution. In the 1960s and 1970s, when Burton's ideas and practice were being formulated, universities had a certain credibility as sources of new and useful knowledge and as neutral and disinterested pursuers of “scientific” knowledge. Hence, one of the features of his academic career was his effort to establish centers or institutions whose program comprised these three elements of teaching, research, and practice closely intertwined. First, he launched the Centre for the Analysis of Conflict at London University, which later moved to the University of Kent at Canterbury; then, with Edward Azar, he started the Center for International Development and Conflict Management at the University of Maryland; and, finally, he joined Bryant Wedge and Henry Barringer to found the Center (later the Institute, now the School) for Conflict Analysis and Resolution at George Mason University.

Looking back now, after almost fifty years as a scholar–practitioner within a number of academic settings both in Britain and the United States, I recognize how fully I have been operating according to Burton's assumptions about the advantages of practicing from a university base, and the extent to which I am both indebted to, and limited by, the institutional parameters set by my old professor. It is undoubtedly time — it may be past time — to reconsider some of the basic assumptions I absorbed many years ago at University College London (UCL) and ask whether academia is, indeed, the ideal setting for theoretically based practice.

Over the years, I have seen that trying to work within and from a university setting can have some serious disadvantages, some of which should have been obvious from the start. The advantages are certainly there — continuity and concentration of effort over the long term, time to reflect and reconsider before moving on to the next stage or the next case study, the opportunity to check up on what you have attempted and ask what went right and what did not. Certainly, I can see that the vast majority of my theoretical ideas and insights, for whatever they are worth — on conciliatory moves, on timing, on entrapping situations — have originated in things I have actually observed while engaged directly in efforts to mitigate or resolve real‐world conflicts. But do these undoubted benefits outweigh some of the disadvantages that experience has revealed? Are there other institutional settings that are unambiguously better than academia for such important, sensitive, and practical work?

On reflection, I have some initial excuse for not recognizing at the outset of my academic career some of the inevitable problems that arise when Burton's model confronts the realities of academic life. At UCL's Centre for the Analysis of Conflict, we were a remarkably privileged, small, and close‐knit group of academics, admittedly from different backgrounds, but sharing enough of a common outlook to engage in fruitful disagreement with each other. Equally important, we had enough time to think, research, write, travel, and engage in practice, free from any but the most occasional demands from teaching. Over the years, the teaching tasks multiplied and the realities implied by Burton's second, educational task began to take hold, but in my naiveté, I imagined that those first two or three years were typical of academia.

Much later, when I was trying to establish my own conflict management center at another university in London, I started mentally to list some of the obstacles that arose when trying to establish a practice in an institution devoted to teaching and traditional research. Three problems became clear almost at once.

Differing Rhythms

Sometimes one becomes so used to a way of life that it becomes hard to think outside the limits of that life and realize how atypical it is. The academic year has its own rhythms, timetables, and deadlines, and these impose strict constraints on what one can do and when. Obviously, full‐time academics normally have courses that they need to teach at certain times of the year and for which they need to be present. Classes have to be conducted, papers need to be graded, exams need to be marked, and degrees need to be awarded. There are committees that need members to sit on them: admissions, promotion and tenure, curriculum development, outreach, and so forth. The university year has its structure whether a particular institution is following a two‐semester or a three‐ or four‐term model. Traditionally and increasingly, research begins in the summer and is abandoned once the university year starts up again.

It goes without saying that protracted and intractable conflicts do not conform to academic timetables and thus neither can conflict resolution practice. No matter how much scholar–practitioners would like crucial events or key opportunities to occur during the summer vacation, they do not. Olive branches, Arab Springs, changes of leadership, and windows of opportunity do not occur conveniently synchronized to the academic timetable, and most scholar–practitioners do not have the flexibility to drop everything — students, research projects, committees, commitments — to rush off for three months, six months, a year to devote the needed time, attention, and resources to moving adversaries toward a durable solution to an inevitably complex problem.1 At best, a colleague will cover a few of your classes and your department chair will tolerate a couple of weeks of absence. However, that is usually the limit of flexibility for a typical teaching academic. The timetables of academia and of protracted conflict do not usually mesh.

Other Demands of Academia

Universities also set limits other than time availability, a point that was illustrated most vividly for me at a crucial workshop held at the University of Maryland in the mid‐1980s. This was the third of a series organized by the Center for International Development and Conflict Management (CIDCM) that focused on relations between Britain and Argentina over the Falkland/Malvinas Islands in the aftermath of the South Atlantic War of 1982. The previous workshop process had established good working relations among the participants, who now confronted the delicate task of coming up with a program of confidence‐building measures that could break the impasse between the two governments, exemplified by an official meeting in Berne, Switzerland, that had broken down on the very first evening. The Maryland meeting was meant to be low‐key, confidential, and off the record but it transpired that Ed Azar, as director of CIDCM, had invited a number of “observers” to the discussions, who duly turned up, observed briefly, and went away. This occasioned a major disagreement between Azar and John Burton, who — as ever — was insistent upon fulfilling the guarantees he had made to the participants that everything would be confidential and out of the eye of the British, Argentinean, and American publics.

At the time, as a series of deans, vice presidents, heads of departments, and colleagues came to observe and also to participate in the social events and “icebreakers” that had been built into the weeklong workshop, I failed to grasp what was going on. Later, of course, I recognized that Azar was doing what any director of a small and not well‐established center would do, which was to demonstrate to members of the university hierarchy what his center could achieve and that its activities were not merely a legitimate part of university work, but could redound to the credit of the institution and enhance its reputation. Centers and institutes constantly have to justify themselves to the university world, especially when the university underwrites that institution. Unfortunately, many of the necessary constraints imposed by conflict resolution practice — confidentiality, behind‐the‐scenes processes, low‐key meetings, off‐the‐record discussions, inability to publish findings openly — go directly against the central values of academia: that activities be open to the scrutiny of peers, that findings and evidence be presented publicly, that findings be replicable, and that skeptics have opportunities to challenge findings. On this occasion, this clash of values was the beginning of a major rupture between two colleagues who, up to that point, had worked together amicably and to great effect.

Standards and Academic Criteria

The third problem that arises from situating the practice of conflict resolution in an academic setting stems from the way in which rewards are distributed in academia and from how academics achieve status, reputation, and respect. The successful senior professor has usually achieved eminence by undertaking traditionally conceived research into fashionable topics, publishing a number of monographs on the subject, and heading a research team in that field for a number of years. Additional but less important criteria involve helping to run a department, a school, or the university itself, engaging in usually ill‐defined “outreach” beyond the university and — increasingly — success in bringing funding to the university. Occasionally, in the criteria for tenure, promotion, and academic kudos, mention will be made of “practice” or “practical application of research findings,” but as anyone who has worked in universities knows, when it comes to promotion, prestige, and reputation, nothing counts so much as a pile of single‐authored monographs, published by prestigious, usually academic, publishers, together with an even bigger pile of articles from refereed journals.2

Faced with the intellectual downgrading of practical work, which is the norm in the vast majority of universities — especially those seeking to rise in the national rankings that bring prestige, top students, and resources to universities — it is asking a good deal of a young scholar to consider devoting a large part of his or her career to the uncertain practice of applying conflict resolution concepts and theories to real‐world conflicts.3 The risks are too great and the potential costs too severe. In some institutions, this “research and prestige publications” culture is changing, but not in many.

One set of difficulties we had certainly not anticipated in those early days of struggle to become respectable scholar–practitioners was that which arose from academic “success.” This was especially so when success took the form of student demand for trainings, courses, and even degrees in conflict analysis and resolution that rose exponentially when the possibility of studying peace‐related topics became available. How did scholar–practitioners — and Burton's trilateral model — cope when the field became fashionable?

In effect, this has happened over the last twenty years, independent of more general developments in academia, such as changing student demographics and enrollments or rising and falling attitudes toward the economic utility of university degrees. The field has always tended to attract the idealistic young, and while the fortunes of the social sciences in general have fluctuated a great deal in recent decades, interest in courses at least claiming to lead to an understanding of the causes of and cures for conflicts, feuds, disputes, wars, and genocides has shown a steady increase.

This success can be seen in the sheer number of undergraduate and postgraduate degree courses currently offered in universities in North America, Europe, and increasingly in Latin America, Africa, and Asia. When I joined the Center for Conflict Analysis and Resolution at George Mason University, the center offered a master's degree in the subject and was just beginning a doctoral program. There were about thirty‐five master's students, eight doctoral students, and five full‐time faculty. Today, the School for Conflict Analysis and Resolution offers an undergraduate major and minor that together enroll more than two hundred students, teaches a number of short‐term certificate courses, and continues to train more than one hundred fifty two‐year master's students and more than eighty doctoral candidates. The faculty now numbers more than twenty. A similar trajectory has been followed by the program in International Peace and Conflict Resolution at nearby American University, while new degrees in the field are now offered by at least five universities in the greater Washington area.

While this university expansion, together with the growth in the number of careers that are now available to people in the field, is undoubtedly a success story, it poses serious challenges to the model of the scholar–practitioner and the trilateral institution pioneered in the 1960s and 1970s. Burton's ideal always envisioned the three tasks as being of equal weight and importance but, inevitably, as the demands of the teaching task become greater, the other two — research and practice — tend to suffer from shortages of time and effort.

This is especially true for practice. While traditional research and writing can be put temporarily on hold, perhaps until the next summer or the forthcoming sabbatical semester, the real world of conflict cannot wait. There is thus a real temptation to abandon practice, which inevitably involves uncontrolled stops and starts: times when nothing is happening, times when everything is happening very fast, and occasions when carefully planned initiatives come to nothing and scarce time, effort, and resources appear to have been totally wasted. Small wonder that even the most practically oriented scholar sees her best career path as one that avoids the world of ongoing, real conflict for the world of subsequent analysis after the fact, or large‐scale quantitative comparison, using the latest statistical techniques.

Another process that has affected the ability of academics to practice conflict analysis and resolution, as opposed to researching it, has been the gradual but quite noticeable loss of universities' reputation for being a source of disinterested and credible knowledge that will be sought and, if possible, applied, irrespective of whoever that knowledge will help. Of course, there was always an element of myth in this image of the university professor's pursuit of sound knowledge and objective truth come what may, as well as in the idea that this was the main business of a university quite irrespective of prevailing political and ideological fashion. Knowledge for the sake of knowledge, no matter how unfashionable or unpopular new findings might be, was always a tattered banner to fly over towers perceived from the outside as ivory and privileged. It was a worthwhile aim, however, and sustained many an academic who produced unpopular and unfashionable ideas and arguments. The pursuit of “truth” and “scientific objectivity” might have been abandoned at times and in some parts of the world (one thinks of Joseph Stalin's baleful influence on Soviet sciences, especially biology, or the Nazis' corruption of findings about race and genetics), but mostly it has served as a standard for judging the worth of ideas and the value of institutions, one that should be minimally influenced by the desires of the powerful and the wealthy.

Given the nature of the field of conflict analysis and resolution, the tradition of the application of sound and objective scientific knowledge has been especially important, if remarkably difficult, from its inception. During the 1960s and 1970s, I was frequently able to interest decision makers in alternative means of resolving or mitigating their conflict by means other than violence because a contact person — somewhat to my embarrassment — portrayed what my institution had to offer as a new “scientific approach” to handling conflict. It also helped that my institutional base was a well‐known university, which added the credibility of disinterest that helped to offset my own national, post‐colonial background.4

Gradually, however, the credibility of a university as a source of independent, objective, and above all disinterested knowledge has become extremely frayed, for a variety of reasons. It has, of course, been the case since the end of the Second World War that the funding, and hence the direction, of scientific research at universities in most “advanced” industrial countries has been supported and influenced by governments. Paradoxically, for conflict research and practice, the trend has been in the opposite direction, with governments until recently being less than interested in conflict research or practice.

As funding from a few private foundations (Hewlett, Rowntree, Volkswagen, MacArthur, Ford) has petered out, however, more and more conflict resolution funding has come from national government agencies although, relatively speaking, the amounts have remained modest. This concentration of funding support for conflict research and the switch to topics of interest to governments as a major focus for university research within the field has been especially marked over the past decade in the United States. The trend has accelerated since the “War on Terror” provided millions of dollars for research on anything remotely connected with a security threat to the country or with post‐conflict peacebuilding in places where military interventions have occurred.

This increasing dependence on governments as the major — in some places as the sole — source of funding for conflict research and related practice in universities raises important questions. For example, the establishment in the State Department of a special unit concerned with “conflict management and mitigation” seems, at first glance, to be a thoroughly commendable innovation. When this is accompanied, however, by a major government effort to organize independent, non‐governmental conflict research and resolution institutions, including university centers and departments, into approved “consortia” that will be the only entities permitted to bid for large contracts on preselected topics, the process begins to have some drawbacks. At the very least, the effect is likely to be a narrowing down of the range of inquiry in the field.

Another trend in the erosion of what is left of university independence has been an increase in efforts by donors — corporate and private foundations, as well as individual donors — to influence both the research agendas and, more alarmingly, research findings. University fundraising has always reflected donors' interests, and gifts to establish institutions for the study of international affairs or strategic studies, or to endow chairs in peace studies have long been features of academia, but usually the donors have been content to leave the running of such institutions or the appointment of particular scholars to a chair to the institution itself. Some twenty years ago, the University of Southern California turned down a large grant from a Middle East government, which had tried to veto the appointment of a distinguished scholar to an endowed chair on the grounds that he was Jewish. The furor that incident generated now seems rather old‐fashioned in the light of the agreement recently concluded between the Charles G. Koch Charitable Foundation and Florida State University, which gives the funder not only the right to decide the selection criteria used to fill faculty positions in the economics department but also to appoint the majority of members to the department's appointments committees.5

Leaving aside concerns about the encroachment of funders on independent scholarship at universities, my concern is more narrowly focused on practitioners who work out of university departments and their ability to present their services and their work as even‐handedly and as neutrally as possible. Practitioners in the field of conflict analysis and resolution usually need to attain some level of trust and credibility with individuals from political entities engaged in violent, protracted, and intense conflicts in which all parties tend to assume that others will inevitably take sides or have their own hidden and potentially damaging agendas. In such circumstances, trust is inevitably lacking or fragile, and the effort of building it up with properly suspicious adversaries is not helped by the gradual incorporation of universities into national governments or ideological institutions.

If we accept that working from a university base and using John Burton's trilateral model as a starting point for aspiring scholar–practitioners does have a number of serious drawbacks, what alternatives are there and what might be the advantages of, for example, an entirely freestanding institution devoted simply to practice? After all, toward the end of his career, John Burton had an idea for an independent “Green Cross” service that would offer mediation and other conflict resolution services to parties in protracted and intractable conflicts. In this essay, I do not have enough space to consider alternative models in great depth, but it would be only fair to mention four alternatives:

  • A completely freestanding and independent institution, self‐funding, and able to choose which situations it will attempt to influence. Examples of such institutions that have developed over the last forty years include, among the most successful, International Alert and Conciliation Resources in London, the Berghof Foundation in Berlin, and the Institute for Multilateral Diplomacy and Search for Common Ground, both in Washington, DC. There are at least as many examples of such institutions that were launched with high hopes but failed to survive. Their demise points to the major problem such initiatives face: finding a continuous stream of funding, which creates pressure to choose projects for which funding is readily available, often leaving the institution little real choice at all. This model may come closest to Burton's “Green Cross” model, but such institutions are weakened by their need to maintain a steady flow of funded projects, with a resultant lack of time and resources to reflect and to reanalyze.

  • A freestanding center or institution linked to a university department. In this case, the department will supply funded personnel as consultants to particular center projects, which relate to their own individual research programs or interests. With this model, the funding requirement is less acute, provided the university and the department are willing to lend faculty, funding, and facilities to the center and the latter manages to maintain the flow of “paid‐for” (grant‐funded) work. Perhaps the relationship between the Carter Center and Emory University, both in Atlanta, might serve as an example of this model, while, at least for a while, the example of the Conflict Clinic, Inc. (CCI) at George Mason University could be cited as a similarly successful partnership. However, CCI was eventually overwhelmed by its need for continuous funding.

  • A separate institution largely funded and sponsored by the national government. Again, there are a number of examples of this model, many from Scandinavia, some of which have been obvious successes, such as the Peace Research Institute Oslo (PRIO), the Stockholm International Peace Research Institute, the Tampere Peace Research Institute in Finland, and the United States Institute of Peace (USIP) in Washington, which is partly funded directly by the United States Congress. The major drawback of this model is that it is highly dependent on the support of political leaders. Levels of political support can alter significantly over time, however, so that PRIO's policy of seeking to diversify support beyond the government seems more than prudent. The fate of the Copenhagen Peace Research Institute in Denmark, once the social democratic government left the scene to be replaced by a much more conservative administration, serves as a reminder of Sir Thomas More's maxim that one should not put one's trust in princes. Moreover, too close a reliance on governments may be risky for institutions whose staff seeks to work in conflict zones as neutral parties.

  • Lastly, there is the institution that is actually an entity of a national government. Such an institution could well be regarded as a version of Benjamin Rush's proposed “Peace Office” as a counter to a “War Department.” This model tends to suffer from all of the drawbacks of the government‐sponsored model discussed previously; in addition, it can become the victim of political rivalries during campaigns when the opposition seeks to criticize and undermine the policies of whichever political party currently forms the government. The recent campaign in the U.S. Congress to defund the USIP is only the latest example of how departments concerned with conflict resolution and peacemaking can fall victim to political posturing and ideological prejudice.6

All the models currently in operation have weaknesses; most involve difficulty in obtaining resources and in maintaining independence, credibility, and competence. John Burton's arguments about the benefits to be derived from establishing a university‐based institution that mixes research, teaching, and practice in equal proportions turn out to have had a number of practical drawbacks because the demands of these three activities often come into conflict with one another and — given the basic nature of universities — it is often the practice component that must give way to the other two tasks. Universities, after all, have developed over time mainly as teaching and research entities, and the image of the ivory tower, separated from the real world, did not arise without reason.

On the other hand, having worked in universities for much of my life and having attempted during that time to put John Burton's model into effect — to teach about practice, to generate broad hypotheses from involvement in case studies, to test out ideas in real‐world conflicts, to learn from those involved in protracted struggles, and to involve students in the real world of intractable rivalries and resultant pain and damage — I still believe that whatever its drawbacks, this model has much to offer universities and the world of conflict analysis and resolution and it would be sad if this trilateral model were to vanish completely.

Looking back over the past almost fifty years, I can only say that I cannot imagine a better setting for the development of useful ideas about coping with protracted conflicts and cautiously trying them out to see if they are useful. More and more, I find myself echoing C. P. Snow's (1951: 345) sentiments about his mythical but reality‐based Cambridge college and applying them to the universities where I have worked: “[T]he college was the place where men lived the least anxious, the most comforting, the freest lives.”


Notes
1. Certainly, had I had any regular teaching duties, I would never have been able to take off in 1967 on a four‐week exploratory mission to renew contacts with Egyptian officials in Cairo, to clarify who the parties were and what the issues were in the conflict system in the Horn of Africa, and to explore who in the region might be open to the idea of a London‐based conflict resolution initiative. My journey eventually lasted six or seven weeks.


2. Few universities go to the extent of requiring candidates for tenure and promotion to have published with a select list of elite university presses, but some do. At some universities, this list may not include, for example, Syracuse University Press, which publishes an excellent series on conflict analysis and resolution.


3. In the United Kingdom, this connection between published research productivity (numerically calculated) and government‐provided resources has been formalized in the five‐year research review, which places each university department into one of five categories. This ranking then determines the resources provided for research over the next five years. Departments live in fear of being downgraded to a “mere” teaching department with no allowance for any research time or effort.


4. This image of objectivity, neutrality, and disinterest tended to help whatever third‐party role — facilitator, go‐between, consultant, intermediary — one was trying to play as an inevitably powerless scholar–practitioner. In more recent years, responding to the question, “Who is paying you to do this?” with the answer “The Commonwealth of Virginia” has tended to modify, if not completely banish, the suspicion that I must be working for the Central Intelligence Agency because I come from the United States.


5. Similar questions about unacceptable levels of outside influence from donors have been raised over agreements that have recently been concluded with Cornell University, Troy University, Brown University, and Utah State University.


6. Quite apart from the dangers arising from a change of political masters, it seems very likely that the establishment of a specialized government conflict resolution and peace agency will be seen as a rival to existing foreign ministries or departments of state, which are traditionally regarded as the appropriate government institutions to deal with the promotion of peace.

Reference

Snow, C. P. 1951. The Masters. London: Macmillan.
This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International (CC BY 4.0) license, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.