The attacks by terrorists against the United States on September 11, 2001, left a scar on the American psyche that will never fully heal. This date, too, will live in infamy, along with the Japanese bombing of Pearl Harbor on December 7, 1941. The brutality of the attacks on 9/11 awakened the American people to two central facts of the new century: first, we continue to live in a hostile world, despite the end of the Cold War, and, second, we are vulnerable to adversaries who not only reject our way of life but seek to destroy it.
The terrorist attacks have spurred a wide-ranging debate over the future of American foreign policy. The question of how best to organize the government for the common defense has been a central focus, with the proposal for a Department of Homeland Security providing Congress and the president with a framework for fashioning preliminary answers. Officials will continue to refine the organizational requirements for improved security as negotiations continue over the proper role for such a department, as well as its relationship to existing counterterrorist agencies, especially the CIA and the FBI.
The debate, though, goes far beyond adding new boxes on the government's organizational charts. Profound issues have arisen over where and when America should use military force in the war against terrorism, including whether lawmakers should set the parameters for a presidentially proposed forceful regime change in Iraq (a nation suspected of harboring terrorists and manufacturing weapons of mass destruction that might be given to terrorists or used directly by Saddam Hussein against the United States)—or instead merely hold the president's coat and offer patriotic exhortations from ringside.
British historian Sir Michael Howard observes: “A year after September 11, the United States finds itself more unpopular than perhaps it has ever been in its history” (2002, 16). As we ponder the reasons for this unpopularity, and as we continue the debate on the proper role of the United States in the world today and how to improve our national security, it is useful to bear in mind the harmful effects of seven “sins” that have plagued this nation's foreign policy over the years. We begin with a fundamental defect: America's inadequate understanding of other lands.
In light of the long shadow cast by the United States across the globe as the only superpower, one might reasonably expect Americans to know something about the world—if only to protect themselves from foreign threats. Yet, poll after poll of this nation's citizenry reveals an embarrassing lack of knowledge about the world's geography, events, and conditions.
In a 1988 Gallup sample of people between the ages of 18 and 24 living in nine Western nations, the United States finished dead last in geographic literacy (Leslie 1988, 31). Three-fourths of the Americans in this poll could not locate the Persian Gulf on a world map, even though at the time the U.S. Navy had gathered a sizable flotilla of warships in the waterway to protect commercial shipping. In other polls, 50% of high school students in Hartford, Connecticut, could not name three countries in Africa; nearly 50% of college students in a California survey could not locate Japan on a map; 95% of first-year students at a college in Indiana could not find Vietnam (Schwartz 1987, 29). A recent report from the American Council of Trustees and Alumni indicates that students at 55 of the nation's top colleges can graduate without taking a single course in American history (Strauss 2000, 13). As a 2000 Gallup Youth Survey summarizes, teenagers in the United States have an “appallingly low awareness of facts related to world events and leaders” (Gallup 2000).
Ignorance of world affairs is not the special preserve of young Americans. Gallup pollsters discovered in the 1980s that barely half of a broad sample of U.S. citizens realized that the Marxist-leaning Sandinistas and the American-backed contras were at war in Nicaragua, or that Arabs and Jews were at odds in Israel. Only a third could name a single member of NATO; and 18% thought the U.S.S.R. was a member of this defense pact, established in 1949 to thwart Soviet expansion (Leslie 1988, 31).
Only about 1% of Americans have studied a language other than English, even though most people on this planet have a different native tongue. Further, the United States is the only nation in the world where scholars can earn a doctorate without demonstrating competence in any foreign language (Atlanta Journal Constitution 1986, A6). Spotty knowledge of the world's languages extends even into those government agencies expected to gather information about foreign affairs and advise top policymakers. The U.S. Foreign Service is the only diplomatic corps in a major industrialized nation that does not insist on fluency in another language among its officers.
Inside the nation's intelligence agencies, speakers of Middle Eastern and African languages such as Farsi, Arabic, and Amharic are in short supply. So are analysts with a deep understanding of the history, politics, and culture of places like Afghanistan, Iraq, Iran, and North Korea. The Defense Intelligence Agency had only two Iraqi analysts at the time of the Persian Gulf War in 1990. During the subsequent buildup to the NATO bombing of Serbia, Serbo-Croatian translators were hard to find in the government. Prior to September 11th, the Federal Bureau of Investigation had only one strategic analyst with the requisite skills needed to track the Al Qaeda terrorist group considered responsible for the attacks against New York and Washington.
America's intelligence agencies have an abundance of documents and transcripts of telephone intercepts from around the world; but too much of this information—upwards of 90% (Millis 1996)—lies dust-covered in vaults, untouched in part because the agencies lack enough skilled translators. The most notorious recent case is the Al Qaeda message intercepted by the National Security Agency on September 10th that said: “Tomorrow is zero hour.” These fateful words were translated on September 12th.
A crash program is underway to remedy these deficiencies. It will take time, though, to recruit a cadre of linguists, to acquire spies and infiltrate them into terrorist cells and closed societies, and to nurture a new generation of analysts with insights into the nations of the Middle East and South Asia—locations largely overlooked during America's focus on the Soviet empire during the Cold War.
The nation's K-12 educational system is notoriously weak in the teaching of world history, geography, and foreign languages. Few pre-college curriculums offer instruction in Chinese and Japanese, let alone Arabic. Nor, for that matter, do many institutions of higher learning in the United States. Efforts to improve student awareness of different world faiths and cultures are limited—and can lead to controversy and opposition among some citizens, as in 2002 when the University of North Carolina asked incoming first-year students to read a study about the Koran. In addition, programs in area studies have been in decline at the academy. As a New York Times report concludes: “Try finding a full-time political scientist who specializes in the Middle East or South Asia at the nation's top universities and you'd almost be out of luck. Stanford and Princeton don't have a single political scientist who specializes in the Middle East; Yale has no political scientist on South Asia” (Kotkin 2002, A15).
A recent memo from the dean at a major university announced to faculty that, in the name of efficiency, all classes with fewer than 20 students enrolled would be canceled—presumably including those in which only a few hardy undergraduates had signed up for Arabic. (After protests from the faculty, the dean rescinded the order.) University bureaucracy aside, students themselves often demonstrate little interest in understanding cultures beyond America's shores. Many continue to equate foreign language study with root canal work, although since September 11th classes on Arab language, culture, and religion have filled at some universities, as student demand surges beyond the supply of competent instructors. Still, 50 students enrolled in Arabic—out of, say, 35,000 undergraduates at a state university—remains a small number. Moreover, typically fewer than 20% of undergraduates at state universities study abroad for a semester (though at some institutions this figure has shot up from 3% to 15–17% in just the past four years).
Clearly, until larger numbers of Americans commit themselves to learn more about the world—including traveling overseas with the intent of making friends and gaining a better appreciation of foreign cultures—other nations will look upon the citizens of the United States as unworthy of global leadership. Robert H. Swansbrough, a political scientist and administrator at the University of Tennessee, Chattanooga, has suggested (2002) that it may be time to pass something like the National Defense Education Act of the Cold War years, providing loans to students who seek to prepare themselves for careers related to area studies and language arts. The loans could be forgiven for those who graduate from college and devote five years to public service pursuits.
Hand in glove with ignorance comes an inability to empathize with other nations. Former President Jimmy Carter cautioned Americans in 1988 about “the increasing disharmony and lack of understanding between rich and poor nations” (1988, A23). This relationship has continued to deteriorate, as lamented in opening speeches by several world leaders at the UN World Summit on Sustainable Development held in Johannesburg in September 2002.
The statistics are grim (Jentleson 2000, 342–50). A UN study released last year reported that 2.8 billion of the world's six billion people live on less than $2 a day; and, among them, 1.2 billion eke out an existence on $1 a day (James 2002, 1). In third-world countries around the globe, such as in Benin, Guatemala, Haiti, Morocco, Pakistan, and Uganda, less than 30% of adults age 25 and over have completed primary school (USAID 2001).
Disease in poor nations is rampant (USAID 2002; 2001; 1987; 1986). Three thousand children under the age of five die every day from malaria. Almost 3 million people—mostly in the developing world—died of tuberculosis in 1995, surpassing the worst years of the TB epidemic that swept the earth at the beginning of the twentieth century. In sub-Saharan Africa, racked by tetanus, whooping cough, and measles (diseases all but unknown in the wealthy nations), one-fifth of all children never reach their fifth birthday. Diarrhea and acute respiratory infections also stalk the young, and polio cripples some 200,000 children a year. In 2000 alone, 11.1 million children under the age of five died from preventable diseases (USAID 2002).
The AIDS epidemic has claimed about as many victims—more than 40 million—as the Black Death in Europe in the mid-14th century. About 95% of the infected individuals live in the developing world. As reported by UNAIDS (an arm of the World Health Organization), HIV/AIDS strikes some 6,000 young people between the ages of 15 and 24 every day, along with 2,000 children under 15. Again, almost all of the afflicted reside in the developing world (Stolberg 2002, A18)—especially sub-Saharan Africa, home to 28.1 million people living with HIV/AIDS (USAID 2001). In 2000, a half-million children died from AIDS, while another half-million became newly infected, primarily from mother-to-child transmission (USAID 2001).
Malnutrition is a constant specter, too, placing (for example) between 12 and 14 million people at risk of serious illness in Southern Africa (World Health Organization 2002). Childbearing presents a great danger to mothers in the developing world, with a woman in Africa having a one-in-three chance of dying during pregnancy and childbirth (USAID 2002). As one would anticipate from these sad figures, life expectancy is substantially lower in poor nations—for instance, only 39 years in Sierra Leone (James 2002, 1).
While the developing world remains gripped in a vise of poverty and poor health, television saturates the globe with images of a luxurious lifestyle in wealthy nations. The United States, for example, is shown awash in oversized automobiles and trucks (which account for more than 20% of global CO2 emissions). Economist Robert L. Heilbroner compared the planet to “an immense train, in which a few passengers, mainly in the advanced capitalist world, ride in first-class coaches, in conditions of comfort unimaginable to the enormously greater numbers jammed into the cattle cars that make up the bulk of the train's carriages” (1975, 39). Little wonder resentment and envy churn among the world's have-nots.
The growing divide between the haves and the have-nots has generated a population of underprivileged, resentful people —a prime reservoir for the recruitment of terrorists and the fostering of further violence (Thomson 2002). By more aggressively addressing the underlying conditions of poverty and disease, the affluent nations can help to excise the cancer of despair in the developing countries before it metastasizes into acts of terrorism.
Native Americans speak of walking in another person's moccasins, visualizing life from that individual's point of view. As a nation, we must empathize with the situation faced by others around the globe. America's Secretary of State after the Second World War, George C. Marshall, displayed this capacity. In preparation for his 1947 Harvard University commencement address announcing the European Recovery Program (later known as the Marshall Plan), he crossed out a reference to “the Communist threat,” which an aide had placed into an early draft. Instead, the enemies he chose to list were “hunger, poverty, desperation, and chaos” (Lewis 1987, A31). Yet Marshall's wise approach to foreign policy has been shunted aside, as funds for international assistance have dwindled sharply across most of the developed world. With the exception of Denmark, reports Michael Ignatieff (2002, 30), “there isn't a country in the world that devotes even 1 percent of its gross domestic product to helping poor countries—the U.S. is nearly at the bottom of the pile, spending a derisory 0.1 percent of GDP.”
For many decades, the United States has sold more weapons abroad than any other country. Yet President George W. Bush has pointed to literacy and learning as “the foundation of democracy and development” (2001). What if Americans were better known for helping other nations build schools (as well as hospitals and churches) rather than selling weapons? What about the construction of highways across Afghanistan, providing projects that would pay local tribesmen a decent salary for a day's work and give them something more to do than shoot at each other, while at the same time knitting together the pieces of a fragmented society? America's military services are vitally important instruments of foreign policy; but so, for different reasons, are the Peace Corps, the diplomatic corps, and businesspeople who (ideally, in joint ventures with indigenous entrepreneurs) can provide jobs and hope for laborers in developing nations.
As a nation, it would be wise to heed the advice of the American journalist and diplomat Carl Rowan (1979, A14). “We need officials who care about these poor, weak nations and their peoples,” he said, “officials who will show up occasionally to ask, ‘What are your special problems? What can we buy from you, and what can we sell? What is it in medicine, food, education, technology that we can provide?’” Such an attitude reflects empathy and wins friends for the United States.
Instead of reaching out, we could simply turn our backs on the rest of the world, savoring our prosperity in splendid isolation, pouring resources into a ballistic missile defense, sealing our borders, constructing a Fortress America designed to barricade us against the forces of chaos beyond our Atlantic and Pacific moats. For some, it is tempting to pretend we can exist alone, shutting out of our lives those overseas who dislike us or raise troubling questions about policies fashioned in Washington, D.C., and those who write slogans like the one scrawled on a piazza wall in Venice last summer: “I wanna see the Constitution burn / Wanna watch the White House overturn.”
Isolationism was America's initial response to the wrangling world and remained so throughout most of our history. “Steer clear of permanent alliances with any portion of the foreign world,” George Washington warned in his farewell address. “Peace, commerce and honest friendship with all nations, entangling alliances with none,” Thomas Jefferson prescribed in his first inaugural address. These cautionary speeches made sense at the time, when a weak America could ill-afford to be drawn into the vortex of Continental wars. Even in those days, however, the founders were aware of the importance of maintaining trade relations abroad to enhance the economic growth of the new nation. Today, America's economic prosperity is even more closely tied to international commerce—the centerpiece of what we mean by the popular phrase “globalization.”
Yet, despite the increasing trade interdependence of nations, coupled with signs of a growing political and cultural integration, the isolationist instinct lives on in America. One can see it in letters-to-the-editor columns, or even in the ruminations of some presidential candidates. “To apply the Founders' principle today, the U.S. government should bring ALL military forces home from foreign bases,” a citizen in the rural South wrote recently, “and stop playing diplomatic footsy with ALL governments, but especially those in the Mideast” (Banner-Herald 2002, A6, original emphasis). Recommending withdrawal of U.S. forces from South Korea and Europe and an end to America's participation in foreign aid programs, GOP presidential candidate Pat Buchanan wrote in 1991: “All that buncombe about what history ‘placed on our shoulders’ sucked the Brits into two wars, and left them living off Uncle Sam's food stamps. If America does not wish to end her days in the same nursing home as Britannia, she had best can Beltway geo-babble about ‘unipolarity’ and ‘our responsibilities to lead’” (Buchanan 1991, C1).
Most Americans support a limited degree of international involvement, such as fighting world hunger and taking steps to clean up the global environment (McGrory 2002, 4; Richman 1996, 1). Nevertheless, a 1995 Times-Mirror survey found that about 80% of the public did not place a high priority on the protection of weaker nations against foreign aggression; the promotion and defense of human rights in other countries; the improvement of living standards in developing nations; or the advancement of democracy in other nations (Richman 1996, 1). A sizable majority (69%) thought that strengthening the UN should be a low priority, even though the UN's budget is only $1.3 billion a year (compared, for example, to the program costs of $38.1 billion for the Defense Department's crash-prone V-22 Osprey aircraft). Few saw much need to promote political and economic stability in Mexico or democracy in Russia. Admittedly, the world may appear more benign with one's head in the sand, but that is a vulnerable posture.
Nothing has so alarmed and disheartened America's allies as our recent unwillingness to work—or even consult meaningfully—with them before carrying out important foreign policy initiatives (Preston 2002, 22). One of America's closest friends, the German Chancellor Gerhard Schröder, learned through newspaper reports in August 2002 about the second Bush Administration's new policy of a possible pre-emptive military strike against Iraq. “Consultation cannot mean that I get a phone call two hours in advance only to be told, ‘We're going in,’” the Chancellor complained. “Consultation among grown-up nations has to mean not just consultation about the how and the when, but also about the whether” (Erlanger 2002, A1).
In another example of unilateralism, by referring in August 2002 to any new weapons inspections in Iraq as a “sham,” Donald H. Rumsfeld, the Secretary of Defense in the second Bush Administration, effectively undercut multilateral efforts by the United Nations to negotiate a resumption of the inspections (Alden and Hoyos 2002, 7). The Administration reversed itself the next month and sought UN approval for renewed weapons inspections; but, when Iraq agreed, the United States continued to push for a resolution that threatened the use of force if the Iraqis reneged—even though most members of the UN were prepared to see how the inspections went before escalating to a resolution in favor of military intervention. Inside the Vienna-based Organization for Security and Cooperation in Europe (OSCE), the Bush Administration again displayed hostility toward the principle of multilateral diplomacy by attempting to impose a 15% reduction in the OSCE's already thin budget—despite the important work this organization performs in reducing tensions and advancing human rights in turbulent regions of eastern Europe (OSCE 2002).
Granted, unilateralism is easier than working with others. Ultimately, though, success in the international arena is more likely through collective action, since the globe is too large, complex, and perilous for the United States to cope with alone. Moreover, when it comes to the loss of life in the name of peacekeeping, is it not better for the civilized community of nations to share this burden rather than have American troops make all the sacrifices?
“We're the ones who respond when the world dials 911,” a U.S. official told a reporter recently (International Herald Tribune 2002, 3). In a BBC interview, national security adviser Condoleezza Rice recalled the grave consequences of failing to respond. “Historically,” she said, “…how many dictators who ended up being a tremendous global threat and killing thousands, and indeed, millions of people, should we have stopped in their tracks?” (Rice 2002).
America's means for response are considerable. We are likely to spend more on our military in 2003 than virtually the whole rest of the world combined. But should we not become more circumspect about the costs to Americans in blood and treasure of serving as the world's sheriff? Moreover, do we really wish to promote the impression that inevitably accompanies widespread armed intervention, namely, that the United States is an imperial military power? Indeed, a power prepared to adopt a new doctrine of pre-emptive strikes (“preemption”) against any nation—Iraq at the moment—that possesses or might possess weapons of mass destruction that could be used against America?
How much do we really know about the military capabilities and intentions of nations like Iraq, Iran, and North Korea—proclaimed “an axis of evil” by the second Bush Administration? On the first anniversary of the 9/11 terrorist attacks, as the President appealed to the American public and members of the UN for their support of military action against Iraq, government officials acknowledged that the intelligence agencies had yet to prepare a major assessment (a National Intelligence Estimate) of Iraq's nuclear, chemical, and biological weapons capacities (Schmitt and Mitchell 2002). Further, how can we expect American diplomatic initiatives to compete with military options when the State Department budget is pared to the bone and the Defense Department bulges with new funding? When 85 cents of every dollar spent on intelligence is controlled by the Pentagon and used for military purposes, instead of gathering information about political, economic, and social conditions around the world (Johnson 2002, 124)?
No longer divided into two ideological camps, the world will experience extensive fragmentation, ethnic strife, human rights abuses, and violence for many years to come. If we are lucky, the global forces of political, economic, and cultural integration may draw nations together to a point where they will adopt more harmonious approaches to the settlement of international and internal disputes. In the meantime, we would do well to be more discriminating in our decisions to respond to world events with the introduction of U.S. troops. Sometimes we have sagely avoided the temptation to rush in with warriors, as some advised in response to the death of American soldiers in Somalia (1993) or when the war escalated in the Balkans. Sometimes we have failed to show the flag when a U.S. military presence (along with that of other nations) might have prevented widespread killings, as in the Rwandan genocide of 1994. Yet, too often, diplomacy is trumped by precipitate military force. In recent years, examples include the interventions in Grenada, Nicaragua, and Panama, along with the constant threat today of an invasion of Iraq before UN weapons inspectors have had a chance to determine the true nature of the threat posed by the Hussein regime.
The suggestion of greater discrimination should not be confused with appeasement. If attacked, the United States will respond with appropriate force, as Al Qaeda and the Taliban regime in Afghanistan discovered in 2001. When access to oil is threatened, the industrialized nations will not stand by idly; when modern-day autocrats (like Serbia's Slobodan Milošević) seek to build new empires, America will help organize opposition through the United Nations and regional defense pacts. But to concern ourselves with the vast majority of political and military eruptions that occur inevitably around the world is a sure prescription for sapping America's resources and energies, while portraying ourselves as an international meddler of the first order.
Just as we err in going it alone as a nation, so do we in letting the president act as a Lone Ranger. Certainly we do not want to exhibit weakness to adversaries and, at times, we may need to act with secrecy and dispatch; however, we are also a democracy—the world's oldest, with public institutions that are well regarded and emulated around the globe. At our best (the Marshall Plan, NATO, arms control accords, the advancement of human rights), we make open decisions after extensive debate within Congress and a working partnership between lawmakers and the president. At our worst (the Bay of Pigs, assassination plots, the Vietnam War, the Cambodian incursion, the Iran-contra scandal, and more recent war planning by executive fiat), we bypass debate and coalition building between the branches of government. We permit the White House to proceed as it wishes, in secrecy, free of “outside interference” from Congress (as advocated by national security adviser John M. Poindexter during the Iran-contra affair; 1987, 159), without the benefit of a solid foundation of public support.
The American people want neither an imperial president, free of legislative constraints, nor an imperiled president dominated by an overbearing Congress. The war in Vietnam, Watergate, domestic spy scandals, and the Iran-contra affair taught us anew the danger of an executive branch that operates in secrecy, without legislative consultation or accountability—a cautionary principle of governance that stands at the heart of the Constitution. Further, the disastrous Smoot-Hawley protectionist legislation of 1930 serves as a reminder that a Congress grown too strong can misuse its power as well.
In the modern era, the aggrandizement of power by Presidents Lyndon B. Johnson and Richard M. Nixon shocked the American people, as did the Iran-contra affair. But the institutional lesson to be learned from malfeasance in the White House is, as political scientist Aaron Wildavsky underscored (1975, 75), “not that the presidency should be diminished, but that other institutions should grow in stature. The people need the vigor of all their institutions.” Whether planning improvements in health policy or a war against an outlaw nation like Iraq, reliance on the judgment of the president and vice president alone is a foolish and risky course.
This lesson has been poorly understood by recent presidents, despite the unfortunate excesses of some of their predecessors. Only at the eleventh hour did President George H. W. Bush turn to the Congress for authorization to use military force against Iraq in 1990, claiming (before the clamor on Capitol Hill grew too loud to ignore) that he already had sufficient “inherent” authority from the Constitution. He maintained further that he enjoyed additional authority from the United Nations—as if that organization has the authority to decide when America goes to war. The first President Bush left the impression that, even if Congress decided formally to oppose his military plans against Iraq, he would proceed anyway. A potential constitutional crisis was narrowly averted when the Senate approved military action by a four-vote margin (and the House by a solid majority).
The apple fell close to the tree in 2002 when George W. Bush similarly implied that he had sufficient constitutional authority as president of the United States to use the war power as he saw fit in Iraq or, presumably, anywhere else. Yet, as Jack Rakove points out, “if an invasion of Iraq on the scale contemplated does not represent a decision for war within the meaning of the Constitution, it is hard to imagine any other military action that would ever again be subject to congressional approval or restraint” (2002, A31).
Like his father, George W. Bush eventually sought legislative and UN backing for a war against Iraq—but only after public anxiety and congressional reaction became intense. When UN members urged delay on military action, Bush, upset, turned to Congress for an open-ended resolution in support of the use of force against the Hussein regime. Whether he would honor a congressional resolution against an invasion or proceed anyway remained in question. Neither of the two Bush presidents seemed to care much about a bedrock principle of American government, well expressed in the modern era by a strong advocate of presidential power, Professor Eugene Rostow of Yale University: “If the President and the executive branch cannot persuade Congress and the public that a policy is wise, it should not be pursued” (1978, 1536).
The sum of all these sins is arrogance. Since the end of the Cold War, how seriously has the United States weighed the views of other nations? For instance, to what extent have we tried to comprehend the forces that fuel Islamic extremism? The zealotry that led to the tragedy of September 11th cannot be tolerated, but some grievances in the Islamic world deserve more serious consideration by Americans. Do we really need military bases in Saudi Arabia—so offensive to many Muslims, given the proximity of these installations to their most holy shrines, Mecca and Medina? Is it really necessary for our security interests for the United States to stand alone among our major friends and allies in our refusal to sign treaties that ban land mines, halt trafficking in small arms, combat pollution (the Kyoto accords), set targets to limit greenhouse gas emissions (the 1992 UN Conference in Rio), enhance primary education in developing countries, and create an International Criminal Court (ICC)?
Such stances reflect egoism, not empathy. What if, instead, we presented to the world a more humble demeanor, joining openly with others in a united search for world peace and tolerance? During the second presidential campaign debate in 2000, George W. Bush said: “The United States must be proud and confident of our values, but humble in how we treat nations that are figuring out how to chart their own course” (Kessler 2002, A1). The subsequent practice of American foreign policy has not lived up to this rhetoric.
Proponents of military solutions to international disagreements will balk at the notion of entering into collaborative endeavors with nations around the world. Fascinated by America's great arsenal of weapons and the possibility of quick results through the use of force, they are unable to imagine the advantages of a foreign policy in which the United States exhibits tolerance for the views of others and the kind of patience that got us through the tightest straits of the Cold War; a foreign policy that turns to military options only after thorough debate on Capitol Hill, authorization from lawmakers, and serious dialogue with our allies; that understands the inevitability of religious and ethnic conflicts for decades to come, which for the most part must be resolved by indigenous factions themselves and seldom by the United States; that relies primarily on diplomacy as the most vital instrument in our relations with other countries, and exhibits more humility about the risks of using military force or secret CIA operations.
America, a strong but empathetic power; America, a friend and partner; America, part of an international coalition dedicated to solving the problems that haunt the planet. Here is the hope of our allies, and the dread of our enemies. Two world wars and several regional wars have taught most Americans that—especially in this age of globalization—we cannot escape from the world, however much we may wish to at times. Still, we can be more discriminating in our involvements overseas. We can “intervene” first with brigades of school, church, and home builders; with nurses, physicians, and other health care specialists; with teachers, farmers, economists, investment bankers, experts, and technicians; with Foreign Service diplomats and Peace Corps volunteers—and only in the most pressing situations with the CIA, the Marine Corps, or our Special Forces.
Let our guiding example be the Marshall Plan, a mutually beneficial program that helped future allies find their economic legs again while at the same time opening markets for the United States. Let our overarching objective be the support of a flourishing international commerce for all nations and free democratic institutions around the globe.
Note
With the usual disclaimer that they are in no way responsible for the views we offer here, the authors would like to express their appreciation to the following colleagues who read an earlier draft of this piece and made helpful suggestions: Karl F. Inderfurth, William Jackson, Leena S. Johnson, Edward J. Larson, Jeffrey Pugh, Robert H. Swansbrough, and Reinhold Wagnleitner. We thank, too, Robert J-P Hauck, Sean Twombly, and Stephen Yoder of PS for their encouragement and their editorial assistance for this symposium.