With all of the diaries of late about "What the US should do" with respect to other countries' problems, especially in the Middle East, where so few even bother to consider cultural and societal differences, I decided to post a chunk of my research on the history of US interventions abroad.
Yes, it's lengthy. Yes, it's highly referenced. Students - consider this a gift bibliography...but be warned, the paper WILL show up in TURNITIN, so if you use any of my actual words, be sure to cite me!
Introduction
There are several paradigms through which the level and nature of American intervention abroad can be viewed: isolationism, capitalism, multilateralism, militarism, nationalism, and determinism. Isolationist nations seek to insulate themselves from revolution and other failures suffered outside their contiguous borders (Graebner, 1956, p. 19). Capitalist nations seek expansion and growth through private ownership, free enterprise, competition, investment, and acquisition of resources (Dowd, 1993, p. 58). Multilateralist nations seek mutual cooperative agreements for defense, trade, legal, political, and economic transactions (Aviel, Muldoon, Reitano, & Sullivan, 1999, p. 1).
Nations that are militaristic display "an attitude of belligerency and an act of hostility towards other nations" (Winslow, 1948, p. 3). Imperialistic nations not only possess the "ability and willingness to use military power, but also identify the territories on and against which such power must be employed" (Winslow, p. 3).
Should one doubt that the U.S. has militaristic and/or imperialistic tendencies, it is noteworthy that the United States is
the only nation that polices the world through five global military commands; maintains more than a million men and women at arms on four continents; deploys carrier battle groups on watch in every ocean; guarantees the survival of countries from Israel to South Korea; drives the wheels of global trade and commerce; and fills the hearts and minds of an entire planet with its dreams and desires (Ignatieff, 2003, p.1).
Americans tend not to think of themselves as an imperial power. Some scholars contend that America is an "empire by invitation" that is based on (1) ideology rather than territory; (2) wealth generation; and (3) countries struggling to join it, rather than being occupied and assimilated (Lundestad, 1990, p. 54).
Many contend that America's empire is different from other historical notions of imperialism, inasmuch as America's empire reflects the ideals of
a people who remember that their country secured its independence by revolt against an empire, and who like to think of themselves as the friend of freedom everywhere. It is an empire without consciousness of itself as such, constantly shocked that its good intentions arouse resentment abroad. But that does not make it any less of an empire (Ignatieff, 2003, p.2).
In the late nineteenth century, frustrations with the inequities of capitalism, imperialism, and colonialism led to the emergence of Marxism. Many of today's extremist groups claim that these same types of frustrations have led to their use of terrorist tactics to invoke change.
Determinist nations justify their policies and the imposition of these policies on others as being consistent with what is necessary for all people (Arat, 1999). While President Bush may claim that he and America have a "mission" to bring freedom to all parts of the world, others may view this as hegemonic conquest. Just as one man's terrorist may be another man's freedom fighter, one man's liberator may be another man's tyrant.
The American approaches to expanding democracy abroad have ranged from war, to military assistance and occupation, to humanitarian and economic aid.
Historian Tony Smith notes,
Until the 1990s, American scholarship neglected to investigate with any comparative framework or historical depth the consequences abroad of surely the greatest ambition of United States foreign policy over the past century: to promote democracy abroad as a way of enhancing the national security (Smith, 1994, p. 4).
Other recent historians and political scientists have alluded to the irony that the Framers of the U.S. Constitution had a deep distrust of government power, yet it is through this very power that the U.S. government seeks to impose its will on other nations (Barber, 2003, p. 20).
Fundamental principles of political theory
The 14th Century Arab historian Ibn Khaldun noted that "Social organization is necessary to the human species. Without it, the existence of human beings would be incomplete" (Khaldun, 1377, p. 46). The primary forms of social organization have been families, clans, tribes, and nation-states. Aristotle argued that among social organizations, the polis, or state, was "the most sovereign and inclusive association," to be directed toward the attainment of higher goals (Aristotle, 323 B.C.E.).
The division and organization of humankind has resulted in barbarism, war, progress, and peace. On the positive side, the power of association supports public safety, protection of property, and promotion of the common welfare (Fukuyama, 2004, p. 1). When abused, the power of association can result in the confiscation of private property and the denial of basic civil and human rights (Fukuyama, 2004, p. 1).
The most complex form of human organization, the state, can take on a variety of forms: monarchy, tyranny, democracy, theocracy, and oligarchy. There are many theories as to how and why governments are created and evolve. Fukuyama suggests that there are four factors that determine political development:
"(1) organizational design and management, (2) political system design, (3) basis of legitimization, and (4) cultural and structural factors" (Fukuyama, 2004, p. 23). The Roman historian Polybius argued that governments are cyclical, "changing from monarchy to tyranny to aristocracy to oligarchy to democracy to anarchy and then back to monarchy" (Donnelly, 2000, p. 173).
Within the corpus of each state, there are political, economic, and civil paradigms that affect development and stability (Chan, 2002, p. 2). The combination of these paradigms determines variations in the basic form of government - for example, a "democracy" can be a representative democracy, deliberative democracy, direct democracy, republic, empire, or a socialist state.
What is democracy?
"Democracy is a form of government in which the people rule" (Sorenson, 1998, p. 3). It is not perfect, nor is it necessarily egalitarian, but "as a value, democracy has near universal appeal among people from every ethnic group, practicing every religion, in every region of the world" (McFaul, 2004, p. 2). It can take many forms - republic, parliamentary, unitary, ministerial, or proportional representation.
The requirements for a "democracy" can vary, depending on the evolution of political mores and infrastructure of a nation. One common definition of democracy is that it consists of "free elections contested by freely organized parties under universal suffrage for control of the effective centers of governmental power" (Smith, 1994, p. 13).
The core value of a democracy is the ability of citizens to control and direct the government's agenda (Dahl, 2005, p. 193). In its most complex form, a democracy will encompass "free, fair, and frequent elections, freedom of expression, alternative sources of information, associational autonomy, and inclusive citizenship" (Dahl, 2005, p. 188; Sorenson, 1998, p. 12).
The most fundamental characteristic of modern-day democracies is universal suffrage (Dahl, 2005, p. 189). Inclusive citizenship is often the last component to develop within a complex democracy (Dahl, 2005, p. 191).
The development of democracies
Some of the more perplexing and contested issues in the study of democracy in modern times focus upon whether the spread of democracy should be actively encouraged, and if so, how. Can democracy be "imposed," either at the barrel of a gun or by attempting to create a set of conditions in which it will typically flourish? Alternatively, is democracy the result of an evolutionary process of reform, revolution, adaptation, and progress that takes time to develop and ferment? Another fundamental issue of contention among academics and politicians is whether, in a developing nation, democracy must precede free markets, or vice versa (Chan, 2002, p. 8; Odom & Dujarric, 2004, p. 6; Wolf, 1999, p. xiii).
A "liberal" democracy balances the governance by the majority with limitations on the power of government (Chan, 2002, p. 39). Historically, democracies have had to guard against the dangers that can arise from the "tyranny of the majority" and the factionalism that can arise from the oppression of the will of the minority (Hamilton, Madison, and Jay, 1788). In early American history, outside observers, such as Alexis de Tocqueville noted,
The right of governing society, which the majority supposes itself to derive from its superior intelligence, was introduced into the United States by the first settlers; and this idea, which of itself would be sufficient to create a free nation, has now been amalgamated with the customs of the people and the minor incidents of social life...
...The moral power of the majority is founded upon yet another principle, which is that the interests of the many are to be preferred to those of the few... If there existed in America a class of citizens whom the legislating majority sought to deprive of exclusive privileges which they had possessed for ages and to bring down from an elevated station to the level of the multitude, it is probable that the minority would be less ready to submit to its laws. But as the United States was colonized by men holding equal rank, there is as yet no natural or permanent disagreement between the interests of its different inhabitants (de Tocqueville, 1835, Ch. 15).
De Tocqueville's critique neglected to reference the concerns about the "tyranny of the majority" and the need to protect the rights of individuals (at least those individuals who were white males) that were urged by the authors of The Federalist papers and embodied in the Bill of Rights, Amendments I-X, of the U.S. Constitution (Hamilton, Madison, and Jay, 1788).
Western vs. Eastern thought - the individual vs. the collective
The emphasis on the individual and individual rights is the foundation of Western liberal democracies. The religious roots of these rights are derived from the concept of the "free examination" of ideas and the recognition of each individual as a being of infinite spiritual value (Odom & Dujarric, 2004, p. 14). These rights and principles are enshrined in the Corpus Juris Civilis (CJS) promulgated by the Roman Emperor Justinian between 529 and 535 C.E. and in the Magna Carta declaration of King John in 1215.
The CJS codified the civil laws that regulated civil liberties; master-slave, husband-wife, parent-child, and guardian-ward relationships; wills; commercial transactions; and the ownership of real and personal property. The CJS also made distinctions between "public" law and "private" law, and recognized the applicability of "natural law" in jurisprudence.
The Magna Carta, like so many other historical concessions of power, was not premised on idealism or goodwill, but rather, on economic necessity. The feudal lords and nobles agreed to acquiesce to taxation by the Crown in exchange for guarantees of civil liberties and due process of law. Although England remained a monarchy, the notion of uniting to limit the power of government became popular over the centuries that followed. It is important to note that while Judeo-Christian theologies no doubt heavily influenced western jurisprudence, a fundamental tenet of western thought is that divine law cannot provide the answer to all of life's conflicts; human law is necessary to supplement divine law and regulate everyday living, commerce, and human relationships (Aquinas, 1274).
The Cartesian ideal of the individual and the recognition of the necessity of human laws directly conflict with many tenets of Eastern theologies and philosophies. The importance of the "collective" and "unity" with other lives are foundational elements of Confucianism, Taoism, Buddhism, and Hinduism. At the geographic intersection of East meets West, competing philosophies and theologies also clash - the tension between the "one" and the "many" is evident in Islam.
The prophet Mohammed began proselytizing in the early 7th Century. The emergence of Mohammed and his teachings resulted in a division among the descendants of Abraham in the Middle East and northern Africa; adherents of Mohammed developed their own bodies of law and governance over the subsequent centuries, resulting in a far different societal evolution than that which occurred in the West.
Muslim scholars commonly characterize the progression of jurisprudence in Islam as natural law evolving to a monarchy evolving to the highest and preferred state, Shari'a, based on the Qur'an and teachings of Mohammed (El Fadl, Waldron, Esposito, & Feldman, 2004, p. 1). In Islam, God is sovereign over the polity, with a caliph and jurists appointed to implement His will (El Fadl, p. 10). While the Shi'a believe that Mohammed appointed a caliph before his death, and can trace successive rulers of the caliphate back to that appointment, other sects within Islam do not adhere to this belief, and contend that Mohammed did not name a successor (El Fadl, 2004, p. 11).
The profound differences between Western and Eastern thought have contributed to several wars fueled by religion, poverty, imperialism, and the quest for natural resources. Many efforts have been made to exercise western dominion over Arab nations, as well as to impose Western-style democracy - none has been successful to date.
The Founding Fathers' views of democracy, military power, and foreign affairs
There is little doubt that our Founding Fathers' views on democracy were quite constrained, particularly when compared with modern democracy. To the Founding Fathers, suffrage was not universal, but rather was limited to white males, and in some states, only to white male landowners (Dahl, 2005, p. 191).
The drafters of the Constitution intended primarily for the executive and judicial branches to "check" the powers of the Congress; they feared that "majority factions" could gain control of the Congress and pass legislation that was "unfair to many...or that interfered with property rights" (Lobel, 1998, p. 592). The founders believed that "checks and balances" would ensure stability and would prevent the government from acting "radically or impulsively" (Lobel, 1998, p. 593).
The system of "checks and balances" within the federal government was designed to naturally pit the ambitions of actors within each branch against the ambitions of actors in the other branches. One of the most pressing concerns of the framers was the capacity to wage war - they sought to prevent the ability of the President to "rush into war" and to generally make the entry into any war more difficult by the division of military powers between the Executive and Legislative branches (Lobel, 1008, p. 595).
Washington, like many battle-seasoned warriors, was well aware of the need to tread cautiously in military and foreign affairs. In his Farewell Address, Washington noted,
...permanent, inveterate antipathies against particular nations, and passionate attachments for others, should be excluded; and that, in place of them, just and amicable feelings towards all should be cultivated. The nation which indulges towards another a habitual hatred or a habitual fondness is in some degree a slave. It is a slave to its animosity or to its affection, either of which is sufficient to lead it astray from its duty and its interest. Antipathy in one nation against another disposes each more readily to offer insult and injury, to lay hold of slight causes of umbrage, and to be haughty and intractable, when accidental or trifling occasions of dispute occur. Hence, frequent collisions, obstinate, envenomed, and bloody contests. The nation, prompted by ill-will and resentment, sometimes impels to war the government, contrary to the best calculations of policy. The government sometimes participates in the national propensity, and adopts through passion what reason would reject; at other times it makes the animosity of the nation subservient to projects of hostility instigated by pride, ambition, and other sinister and pernicious motives. The peace often, sometimes perhaps the liberty, of nations, has been the victim (Washington, 1796).
In his first inaugural address, Thomas Jefferson noted that America was "kindly separated by nature and a wide ocean from the exterminating havoc of one quarter of the globe," for which he was glad, as it would aid in keeping the new nation out of unnecessary "foreign entanglements" (Jefferson, 1801). He advised, "The great rule of conduct for us in regard to foreign nations is, in extending our commercial relations to have with them as little political connection as possible" (Jefferson, 1801).
During the 19th Century, America was not completely isolated from the rest of the world, and it did seek to exert some influence in the world - but it did so through continental, rather than colonial, expansion, and through liberalized international trade (Judis, 2004, p. 16). Capitalism was unfettered as a domestic and foreign policy. The early colonists sought to escape the tyranny associated with the Old World monarchies of Europe, so the early disdain for imperialist expansion among those who settled our nation is understandable.
At the time James Madison, the fourth president of the United States, took office in 1809, Europe was embroiled in both revolutionary and international conflicts. In his first inaugural address, Madison solemnly declared:
The present situation of the world is indeed without a parallel, and that of our own country full of difficulties... Under the benign influence of our republican institutions, and the maintenance of peace with all nations whilst so many of them were engaged in bloody and wasteful wars, the fruits of a just policy were enjoyed in an unrivaled growth of our faculties and resources. Proofs of this were seen in the improvements of agriculture, in the successful enterprises of commerce, in the progress of manufacturers and useful arts, in the increase of the public revenue and the use made of it in reducing the public debt, and in the valuable works and establishments everywhere multiplying over the face of our land... Indulging no passions which trespass on the rights or the repose of other nations, it has been the true glory of the United States to cultivate peace by observing justice, and to entitle themselves to the respect of the nations at war by fulfilling their neutral obligations with the most scrupulous impartiality (Madison, 1809).
Madison's policies in the early part of his first term were premised on the notion that neutrality equals wealth and security: neutrality allowed the U.S. to profit from trading with all nations, while sparing it the wrath it would have incurred had it taken sides or given favorable trading status to particular nations over others.
Before Madison's first term ended, the U.S. became engaged in armed conflict in the War of 1812 due to disputes arising from piracy and trade sanctions against the U.S. The War of 1812 was the first of only five Congressionally declared wars in U.S. history (Ackerman & Grimmett, 2003, p. 8). Madison's view of foreign policy and military force changed accordingly. In his second inaugural speech, Madison proclaimed, "The war was just in its origin and necessary and noble in its object . . . in carrying it on, no principle of justice or honor, no usage of civilized nations, no precept of courtesy or humanity, have been infringed" (Madison, 1813).
Manifest destiny, continentalism, and isolationism
In the early 19th Century, American "foreign policy" pursuits focused on (1) "the Indian nations" and (2) international trade. As early as 1801, America's navy began deploying abroad to protect trade and shipping along the coast of Northern Africa and the Western Mediterranean from the Barbary pirates and others. There was no "higher cause" attributed to these naval actions; they were merely defensive in nature.
As American settlers expanded westward, conflict with Native Americans became more common, more organized, and more violent. President Monroe, in his second inaugural address indicated that a shift in policy was in order:
The care of the Indian tribes within our limits has long been an essential part of our system, but, unfortunately, it has not been executed in a manner to accomplish all the objects intended by it. We have treated them as independent nations, without their having any substantial pretensions to that rank. The distinction has flattered their pride, retarded their improvement, and in many instances paved the way to their destruction... We should become their real benefactors; we should perform the office of their Great Father, the endearing title which they emphatically give to the Chief Magistrate of our Union. Their sovereignty over vast territories should cease, in lieu of which the right of soil should be secured to each individual and his posterity in competent portions; and for the territory thus ceded by each tribe some reasonable equivalent should be granted, to be vested in permanent funds for the support of civil government over them and for the education of their children, for their instruction in the arts of husbandry, and to provide sustenance for them until they could provide it for themselves (Monroe, 1821).
In 1823, following continued skirmishes among the European powers, fueled by colonialist expansion around the world, President Monroe announced in his seventh annual message to Congress that "the American continents, by the free and independent condition which they have assumed and maintain, are henceforth not to be considered as subjects for future colonization by any European powers" (Monroe, 1823). This proclamation became known as "The Monroe Doctrine," and it would lead to several military interventions by the United States within the Western Hemisphere over the 182 years that followed.
The Monroe Doctrine was not premised on any "noble" principles of protecting or promoting democracy in our neighbors' countries in the Americas. The fundamental purpose of the policy was to protect American security and trade interests by minimizing the risk of war near our borders, our shipping lanes, and our geographically closest trading partners. There was no desire to interfere in the government of other nations or to impose American values upon any other nation. Monroe concluded his speech saying, "It is still the true policy of the United States to leave the parties to themselves, in the hope that other powers will pursue the same course" (Monroe, 1823).
Monroe's Secretary of State, and presidential successor, John Quincy Adams, was more adamant about not becoming involved in the affairs of other nations. He argued that America should
Abstain from interference in the concerns of others, even when conflict has been for principles to which she clings, as to the last vital drop that visits the heart. . . Wherever the standard of freedom and independence has been or shall be unfurled, there will her heart, her benedictions and her prayers be. But she goes not abroad, in search of monsters to destroy. She is the well-wisher to the freedom and independence of all. She is the champion and vindicator only of her own. . . She well knows that by once enlisting under other banners than her own, were they even the banners of foreign independence, she would involve herself beyond the power of extrication, in all the wars of interest and intrigue, of individual avarice, envy, and ambition, which assume the colors and usurp the standard of freedom. The fundamental maxims of her policy would insensibly change from liberty to force (Adams, 1821).
The overriding goal of American foreign policy abroad was to "...cultivate the friendship of all nations...decline alliances, as they are adverse to our peace...[and] commercial relations on equal terms" (Van Buren, 1837).
Although Americans were not interested in conquest abroad, conquest westward continued. Although U.S. presidents lodged protests concerning 19th century government upheavals and colonial strife in other parts of the Americas, these matters were considered not to be vital to American interests. It wasn't until the late 1880s that expansion beyond the continental U.S. was eyed in order to protect and preserve American interests (Judis, 2004, p. 27).
Empire and War - Imperialism and determinism
At the turn of the 20th Century, Roosevelt, McKinley, and other leaders sought to expand America's influence beyond its shores, primarily for commercial reasons, albeit cloaked in the justification that the U.S. would annex new lands in order to "civilize and Christianize" them (Judis, 2004, p. 4). Theodore Roosevelt argued that "the United States was justified in exercising 'international police power' to put an end to chronic unrest or wrongdoing" (Marina & Beito, 2004).
The popularity of "Social Dawrinism" further fueled imperialist goals. Rev. Josiah Strong epitomized this philosophy in his bestselling book, In our country (1895):
Americans are a race of unequaled energy, with all the majesty of numbers and the might of wealth behind it - the representative, let us hope, of the largest liberty, the purest Christianity, the highest civilization. [Having] developed peculiarly aggressive traits calculated to impress its institutions upon mankind, [America] will spread itself across the earth. . . . And can any one doubt that this race, unless devitalized by alcohol and tobacco, is destined to dispossess many weaker races, assimilate others, and mold the remainder, until, in a very true and important sense, it has Anglo-Saxonized mankind? (p. 178).
Social Darwinism provided the moral justification for Americans to abandon the warnings of the Founding Fathers and over 100 years of relative non-interference abroad (Barber, 2003, p. 75; Smith, 1994, p. 38; Zimmerman, 1998).
Introducing democracy, industry, and free market economies to foreign lands was said to be a benevolent act of the U.S. that furthered U.S. national security and economic growth (Peceny, 1999, p. 3). Interestingly enough, our relatively young nation acquired such grandiose and "altruistic" plans only after we had extended as far northward, southward, and westward as we could on the North American continent.
In 1893, American sugar growers revolted against the monarchy in Hawaii. U.S. Marines were sent to assist the Americans; the monarchy was overthrown, a republic was established, and American forces remained as an occupying presence on the islands. Hawaii became an "official" U.S. territory in 1900.
When President McKinley first took office in 1897, issues of international trade and currency were coming to a head, including the need to resolve conflicts in monetary valuation from the gold versus silver standards employed in conversion and calculation of reserves (McKinley, 1897). His speech also referenced concerns about "the depressed condition of industry on the farm and in the mine and in the factory" (McKinley, 1897).
One year later, following the sinking of the USS Maine in Havana harbor, the United States declared war on Spain. For several years preceding this incident, the United States had been trying to get the government of Spain to sell Cuba, but each offer was rejected. Simultaneously with the purchase negotiations, the U.S. began covertly assisting Cuban insurgents and encouraging revolt against the Spanish rulers.
The Spanish-American war is a "watershed moment" in U.S. history because it was a departure from the avoidance of colonialism and the "foreign entanglements" so feared by the Founding Fathers (Smith, 1994, p. 38). U.S. territorial claims were secured by American military occupation in Cuba, Puerto Rico, Guam, and the Philippines. American industries and investors reaped bounties from new opportunities and access to commodities in these markets (Nearing & Freeman, 1925, p. 180).
One of McKinley's communiqués to the newly conquered in the Philippines was that
the mission of the United States is one of benevolent assimilation, substituting the mild sway of justice and right for arbitrary rule. In the fulfillment of this high mission, supporting the temperate administration of affairs for the greatest good of the governed, there must be sedulously maintained the strong arm of authority, to repress disturbance and to overcome all obstacles to the bestowal of the blessings of good and stable government upon the people of the Philippine Islands under the free flag of the United States (Nearing & Freeman, 1925, p. 198).
Although promoting democracy in Cuba, the Philippines, and elsewhere was not the reason stated for declaring war on Spain, it became the justification for annexation and continued occupation. This was a pattern that would repeat itself throughout the century in other conflicts.
Ignatieff notes that "the dual nemeses of empire in the 20th century were nationalism, the desire of peoples to rule themselves free of alien domination, and narcissism, the incurable delusion of imperial rulers that the 'lesser breeds' aspired only to be versions of themselves" (Ignatieff, 2003, p. 10). As theologian Reinhold Niebuhr noted,
every nation is caught in the moral paradox of refusing to go to war unless it can be proved that the national interest is imperiled, and of continuing in the war only by proving that something much more than national interest is at stake (Niebuhr, 1960, p. 273).
America's transition to colonial master was not seamless. After the defeat of the Spanish fleet in the Philippines, the residents did not take kindly to American occupation, and an insurgency broke out. In the ensuing battles, over 4,000 Americans and over 220,000 Filipinos died (Smith, 1994, p. 42).
Multilateralism, Wilson, and "liberal democratic internationalism"
Woodrow Wilson was a "true believer" in democracy and its institutions. He is recognized as the father of public administration as a discipline (Shafritz & Russell, 2003, p. 24). Wilson recognized that "It is harder to run a constitution than to frame one" (Wilson, 1887). A society with a democratic core was believed to be more stable, because it offered a framework to enable people and factions to resolve their differences through elections and the rule of law, rather than through insurrection or war (Smith, 1994, p. 88).
Wilson managed to keep the U.S. out of World War I during the first three years of the conflict, even after the Lusitania was sunk by the Germans in 1915. It was only after the interception of the Zimmermann telegram in 1917, wherein Germany sought to entice Mexico to enter the war in exchange for regaining part of the Southwestern U.S., that Wilson acted and Congress declared war.
On the threshold of entering a war the likes of which the U.S. and the world had never before seen, Wilson remained the idealist. His speech requesting that the Congress declare war and listing the goals of the U.S. entering the war was one of a statesman, not a conqueror:
The world must be made safe for democracy. Its peace must be planted upon the tested foundations of political liberty. We have no selfish ends to serve. We desire no conquest or domination. We seek no indemnities for ourselves, no material compensation for the sacrifices we shall freely make. We are but one of the champions of the rights of mankind. We shall be satisfied when those rights have been made as secure as the faith and freedom of the nations can make them (Wilson, 1917).
Wilson was not aware of it at the time, but he was laying the foundation for American foreign policy for the century that followed (Smith, 1994, p. 85).
Wilson adopted a policy of encouraging American promotion of democratic change in other nations, not purely for commercial gain or altruistic reasons, but because he saw the competition for and the struggles within former colonies as being the root causes of World War I (Judis, 2004, p. 5).
Wilson believed that by dismantling the apparatus of imperialism throughout the world and establishing multilateral relationships among nations, future wars could be prevented, which, in turn, would promote the national security interests of the United States (Judis, 2004, p. 6).
At its most basic level, Wilsonian "liberal democratic internationalism" is premised on the belief that America's "security is best protected by the expansion of democracy worldwide" (Smith, 1994, p. 9). Wilson's ideals included fostering free trade, reducing international arms races, discouraging economic imperialism, and promoting collective security agreements (Judis, 2004, p. 6).
The Wilsonian recipe for global peace and prosperity sought to create "Nationalism wed to democracy; democracies wed in peace, prosperity, and mutual respect embodied in international law and institutions" (Smith, 1994, p. 87). After the ravages of World War I, Wilson fought tirelessly for the ratification of the Versailles Treaty and U.S. membership in the League of Nations. "Wilson bequeathed a singularly American response, disavowing the primacy of interest, affirming the benign influence of democratic governance, and extolling the benefits of arms limitation, open trade, international law and collective security." (Leffler, 2003, p. 1058).
Wilson believed that democratization of individual nations was a fundamental requirement for lasting peace. This is also the stated foundation for President Bush's current policy in the Middle East. The primary difference between Wilson and Bush is that Wilson believed that multilateralism, such as was embodied in the League of Nations, was the vehicle through which to advance democratic change. Bush advocates American hegemony and unilateralism to achieve stated objectives (Jervis, 2004, p. 12).
Retreat and a return to isolationism
During the 1920 presidential campaign, Warren Harding pledged that he "would not empower an Assistant Secretary of the Navy to draft a Constitution for helpless neighbors in the West Indies and jam it down their throats at the point of bayonets borne by U.S. Marines" (Schmidt, 1971, p. 118). The election of Harding signaled Americans' post-WWI desire to focus on domestic issues and retreat from the world's problems. Harding and the Republican majority in Congress viewed membership in the League of Nations as foisting upon America an unwanted duty to "police the world" (Zasloff, 1993, p. 1618).
Ignoring the economic strife caused by the Treaty of Versailles and the pools of radical discontent in Europe, America reveled in the "Roaring '20s" and imposed high tariffs on imports during the Harding and Hoover years. As Europe hurtled toward World War II in the late 1930s, America was mired in economic depression and political isolationism. While an overwhelming majority (88%) of Americans sympathized with our traditional allies in Europe, nearly the same number (77%) were unwilling to enter the war, even if it meant the Axis powers would defeat England and France and become continental hegemons (Nolan, 1993, p. 58).
Multilateralism returns - World War II
There was no ambiguity as to why the U.S. entered World War II as a combatant, and the conflict was the epitome of "total war." Following the December 7, 1941, attack on Pearl Harbor, Americans remained united and solid in their support of the war for its duration. Presidents Roosevelt and Truman were convinced that America's post-WWI isolationism was a major contributing factor leading to WWII, so they resolved to build a permanent framework of alliances, as well as a global organization for the resolution of differences, the United Nations (Pollard, 1985, p. 11).
Harkening back to the Lockean principle that "all men are born free," in the modern era several U.S. Presidents have argued that America's security depended on the expansion of democracy to other nations. The approaches to expanding democracy have ranged from war to military "assistance" to economic aid. In terms of non-military options,
Wilson focused on the League; Franklin Roosevelt helped craft the IMF, the World Bank and the United Nations; Truman, Eisenhower and Kennedy sought to fashion the political and economic instruments that nurtured the recovery of Germany and Japan and facilitated their peaceful integration into the international system (Leffler, 2003, p. 1061).
The Truman and Eisenhower Doctrines
Truman adopted many Wilsonian ideals in the aftermath of World War II, as is evident by his support of the United Nations, the International Monetary Fund, and the Marshall Plan (Judis, 2004, p. 7). The American idea of a world order opposed to imperialism and composed of independent, self-determining, preferably democratic states bound together through international organizations dedicated to the peaceful handling of conflicts, free trade, and mutual defense (a package of proposals that may be called "liberal democratic internationalism") has been with us in mature form since the early 1940s (Smith, 1994, p. 7).
America's historical tendency to retreat to isolationism following protracted conflict was thwarted in the post-WWII era by the perceived threat of the spread of communism and global domination by the Soviet Union and China. The outbreak of conflict on the Korean peninsula in 1950 served to fuel the race for military superiority and influence abroad between the U.S. and the U.S.S.R. (Nolan, 1993, p. 106).
American support for the occupation of foreign lands was lukewarm at times as the reconstruction of Germany and Japan began; but for the emergence of competition for satellite states during the Cold War, it is likely that U.S. involvement in these countries would have been substantially less (Western, 2004). In NSC Directive 68, President Truman initiated a Cold War paradigm with two primary objectives - "to reduce the power and influence of Moscow ... and to bring about a basic change in the theory and practice of international relations observed by the government ... in Russia" (Leffler, 2003, p. 1052).
The post-WWII Marshall Plan served many useful purposes as a comprehensive package of aid and programs to rebuild infrastructure, monetary systems, and free-market economies in Europe (Wolf, 1992, p. 22). Despite post-WWII military occupation by American forces, U.S.-imposed "democratization" efforts in Germany and Japan were limited to the economic realm (Dobbins, McGinn, et al., 2003, pp. 6-28).
In Europe, U.S. military forces provided security and a buffer against the expansion of the Soviet empire, but for the most part, U.S. intervention in the internal governance of these nations was limited and tangential. Nearly all of the Marshall Plan's aid was "government to government," thus decreasing the imprint of the U.S. on projects and programs that developed within these countries (Wolf, 1992, p. 24). This is substantially different from the post-war assistance currently being rendered in Afghanistan and Iraq, which is largely being done through U.S. payments to private contractors, many of whom are foreign rather than indigenous to the countries in which they are working.
The powers of the Presidency have grown relative to those of Congress over the last fifty years; this is particularly true with respect to national security, foreign affairs, and defense. The United States Constitution specifically delegates the power to declare war to the Congress, and designates the president as commander-in-chief. Our nation has deployed military forces to fight in Korea, Vietnam, Bosnia, Kosovo, Afghanistan, Iraq, and elsewhere during the past 50 years, and yet, the last time that the Congress actually declared war on another nation was in 1941.
Since World War II, the Executive Branch has become more independent of the Judiciary and Legislative branches. Congress delegates powers and continues to fund prerogatives as they are undertaken by the Presidency; it rarely exercises "the power of the checkbook" to check the power of the Executive Branch. The lack of meaningful Congressional oversight has also increased the power of the Executive Branch relative to Congress. John Yoo, a former senior member of the U.S. Justice Department and legal advisor to both President George H.W. Bush and President George W. Bush, argues that all Congress can do with respect to war or combat deployments initiated by the president is to impeach him or refuse to fund the operations (Fisher, 2000, p. 1637).
In 1950, President Truman deployed forces to the Korean peninsula without first obtaining Congressional approval. Truman acted under the guise of a U.N. mandate. His Secretary of State, Dean Acheson, coined the phrase "police action" to describe U.S. combat in Korea (Wolk, 2000). Of course, there is no mention in our Constitution of the U.N. being able to "authorize" U.S. troops to deploy and fight a war (Fisher, 2000, p. 1637). The notion of a "limited war" was also made part of U.S. strategy for the first time (Wolk, 2000).
Interestingly enough, Truman's successor, President Eisenhower, the former 5-star General and Supreme Allied Commander, believed that Truman's commitment of troops to Korea was not only a strategic blunder, but also unconstitutional (Fisher, 2000). Eisenhower pledged that he would never deploy troops to combat or hostile areas without a congressional declaration of war. Eisenhower set forth his policy as follows: "I deem it necessary to seek the cooperation of the Congress. Only with that cooperation can we give the reassurance needed to deter aggression" (Fisher, 2000).
Vietnam and the Kennedy, Johnson, and Nixon Administrations
In 1962, President Kennedy initiated a naval blockade of Cuba and took the nation to the brink of nuclear war, all without authorization by Congress. Kennedy believed that his inherent power as commander-in-chief allowed him to exercise military and war powers as he saw fit, without advance authorization or consultation with Congress (Fisher, 2000).
In 1964, President Johnson sought passage of the Gulf of Tonkin Resolution (H.J.Res. 1145) following alleged attacks upon U.S. vessels by the North Vietnamese. The Resolution authorized the president to use military force to "repel armed attack" and to "prevent further aggression in Southeast Asia" (Byrd, 2003, p. 1).
These actions subsequently "upped our ante" in Vietnam and began the "mission creep" that eventually led to deployments of up to 550,000 soldiers at a time in Vietnam. While there was pro forma talk of "democratization" and "freedom" as being an objective of the U.S. engagement in Vietnam, the reality was that the primary and sustaining objectives were the containment of the Soviet Union, China, and communism.
Ford, Carter, and the Post-Vietnam era
The War Powers Act (P.L. 93-148, henceforth referred to as the WPA) was passed in 1973 as a Joint Resolution of Congress, after it was originally vetoed by President Nixon ("The War Powers Act", 1973). The WPA arose out of multiple efforts by Congress to halt military action in Vietnam through means other than simply cutting off funding to troops in the field (Auerswald & Cowhey, 1997, p. 505). Its effect, however, has been to virtually eliminate the consultative and deliberative role of the Congress in pre-invasion planning and approval.
The Act sought to prevent future protracted conflicts from occurring due to "mission creep," deceptive phraseology like "police action," and the unchecked presidential power that was seen during the war in Vietnam. As a practical matter, the WPA has not checked presidents' use of the military as they have seen fit, because the WPA "has internal inconsistencies, lacks enforcement mechanisms. . . and because Congress lacks the will to challenge a president" who has advocated the use of force to defend American national security interests (Auerswald & Cowhey, 1997, p. 505). The WPA relegates Congress to a subordinate and purely "reactive" role in initial combat decisions should the President choose not to seek explicit authorization or a declaration of war prior to deploying troops (Farrar-Myers, 1998, p. 183). Once troops are so deployed, they must be withdrawn if, after 60 days, Congress has neither declared war nor enacted a specific authorization for the continuation of hostilities (Campbell, 2000, p. 1).
The United States has deployed troops into situations that have involved the WPA at least 173 times since 1973 (Grimmett, 2003). Formal reports to the Congress under the WPA seeking extended authorization and funding have been submitted at least 104 times. Surprisingly, President Clinton submitted 60 of the 104 reports; other totals include Presidents Ford (4), Carter (1), Reagan (14), George H.W. Bush (7), and George W. Bush (at least 18) (Grimmett, 2003). Clinton may not have been more bellicose than his counterparts, but rather, more conscientious about following the letter of the law when it came to the WPA.
When Jimmy Carter was elected president in 1976, he attempted to break away from some of the post-Vietnam isolationism that had taken hold of the country. Carter's foreign policy priorities were (1) laying the foundation for an enduring peace in the Middle East and (2) promoting democracy and human rights throughout the world. In Carter's inaugural speech, he said,
Peoples more numerous and more politically aware are craving and now demanding their place in the sun--not just for the benefit of their own physical condition, but for basic human rights. The passion for freedom is on the rise. Tapping this new spirit, there can be no nobler nor more ambitious task for America to undertake on this day of a new beginning than to help shape a just and peaceful world that is truly humane (Carter, 1977).
Unlike many of his predecessors, President Carter did not use the American military in his efforts to spread democracy and freedom. Although President Carter played a major role in getting Israel and Egypt to sign peace accords, his dreams of actually spreading freedom in that region were impeded by U.S. economic woes and the fall of the Shah of Iran.
Reagan and the beginning of the end of the Cold War
When Reagan began his presidency, he had no desire to engage in "Wilsonian" exports of democracy. His goals were "to contain the adversary, the Soviet Union; to reinvigorate U.S. alliances and foster support for American interests and goals; and to command respect for American hegemony among the nations of the third world" (Farrar-Myers, 2001). Rather than expanding the "free world," Reagan's initial policies were directed at "halting its erosion" (Johnson, 2000, p. 28).
In 1983, President Reagan sought Congressional authorization prior to sending troops to Lebanon in support of U.N. peacekeeping operations; although Congress approved the deployment, it did so with a number of advance conditions, including a limitation of the operational engagement to 18 months (Grimmett, 2003). Reagan also sent combat troops to invade Grenada in 1983 without notifying Congress beforehand (Farrar-Myers, 1998, p. 183).
Reagan's invasion of Grenada in October of 1983 was premised not on promoting democracy, but rather on concerns that the Russians, via Cuba, were attempting to "project power in the region" by building a new, 10,000-foot runway (Farrar-Myers, 2001). His stated reasons for the invasion were:
(1) to protect the American citizens on the island; (2) to preserve order; and (3) to assist in the restoration of conditions of law and order and of governmental institutions to the island of Grenada where a brutal group of leftist thugs violently seized power (Reagan, 1983, p. 1505).
Reagan's critics often accused him of being a reckless cowboy itching for a fight. To those who criticized foreign deployments and establishing military bases abroad, Reagan responded, "We're not somewhere else in the world protecting someone else's interests; we're there protecting our own" (Reagan, 1983, p. 1521).
When President Reagan launched bombing attacks against Libya in 1986, it was not due to any "imminent" threat or hostility, but rather, in retaliation for the bombing of a Berlin disco frequented by Americans that killed three people and injured over 200. Congress was notified of the Libyan operation only after the planes were launched and were nearing their targets.
Reagan's use of the military was rarely justified by flowery language about freeing people and promoting democracy. A notable exception was the attempt by members of his Administration to aid insurgents in Nicaragua, a nation that hardly posed a "credible threat" to U.S. national security. The praises of democratization and "freedom fighters" were sung from the highest rooftops in Washington even as numerous members of the Administration came under Congressional and criminal investigation and were later prosecuted.
Bush and Clinton revive Wilsonian ideals - A dash of determinism and a whole lot of multilateralism
When he ascended to the presidency in 1989, George H.W. Bush found himself at a "defining moment" of both U.S. and global history (Wolf, 1992, p. 3). The collapse of the Soviet Union changed power and trade dynamics in the world. Rather than gloating about the defeat of a long-time enemy, Bush assumed a Wilsonian approach to leadership:
We are called not to wage a war, hot or cold, but to win the democratic peace, not for half a world as before but for people the world over. The end of the cold war, you see, has placed in our hands a unique opportunity to see the principles for which America has stood for two centuries, democracy, free enterprise, and the rule of law, spread more widely than ever before in human history (Bush, G.H.W., 1992).
Bush further warned against the dangers that result when America retreats from the world stage, lest anyone wish to rest on our past laurels and the "peace dividend": "When a war-weary America withdrew from the international stage following World War I, the world spawned militarism, fascism, and aggression unchecked, plunging mankind into another devastating conflict" (Bush, G.H.W., 1992).
In the period following the break-up of the former Soviet Union, the United States began to shift the focus of military actions from defense to nation building and peacekeeping. President George H.W. Bush defended nation-building efforts to his Republican critics, arguing that, "our vision is not mere utopianism. The advance of democratic ideals reflects a hard-nosed sense of our own, of American self-interest" (Bush, G.H.W., 1992).
The U.S. wasn't alone in this shift in emphasis - during the first 40 years of its existence, the United Nations launched a total of 14 peacekeeping missions around the world; within the past decade, the United Nations has launched at least 40 such missions (Dobbins, Fukuyama, & Nash, 2005).
Several factors have contributed to the increased use of American military intervention in the post-Cold War period, to wit,
(1) the lack of a rival force of equal strength, (2) a number of regional and civil wars, (3) increased communication technologies resulting in stronger and more concerted human rights advocacy efforts around the world, and (4) the resurgence of terrorism and the prospect of terrorists acquiring WMD (Western, 2004).
Dr. Louis Fisher, senior historian for the Congressional Research Service at the Library of Congress, has summarized the Framers' intent on the issue of war: "The framers gave Congress the power to initiate war because they believed that Presidents, in their search for fame and personal glory, would have too great an appetite for war" (Fisher, 2000). Although the U.S. has engaged in hundreds of military deployments and lost thousands of soldiers in combat during the past 60 years, the last time that Congress declared war was in 1941, when the U.S. entered World War II. Attempts to have the U.S. Supreme Court declare such actions unconstitutional have been unsuccessful, as the Court has ruled that these are political questions that can be resolved through the "power of the purse" that Congress wields.
Many of the post-Cold War policy initiatives were similar to the Marshall Plan used in the post-World War II period (Wolf, 1992, p. 5). Technical assistance, privatization of property, implementation of intellectual property laws, and the establishment of monetary controls were used, in addition to grants, loans, and financial aid (Wolf, 1992, p.7). Unlike the Cold War period, however, there was little competition from other "suitors" seeking to influence satellite states (Wolf, 1992, p. 7). Absent competition and the need for a "balance of power," the dangers of hegemony become greater.
Presidents George H.W. Bush and Clinton had no desire for conquest or hegemony, but unlike President Carter, they did use the American military in support of foreign democratization programs. They employed "carrots" and "sticks" in their efforts to expand U.S. influence and maintain peace. Their approach to spreading liberal democracy was selective, and their overall rates of success were mixed.
One of the first "carrots" was the 1989 passage of the SEED (Support for Eastern European Democracies) Act, which provided economic and trade assistance to former Warsaw Pact (communist states in Eastern Europe that were under Soviet dominion) states. Most of the SEED funds that have been appropriated and disbursed in the past 15 years to Eastern Europe have been through the U.S. Agency for International Development (USAID) and NGO's (U.S. Department of State, 2004).
Beneficiaries of the SEED program include Albania, Bosnia-Herzegovina, Bulgaria, Croatia, the Czech Republic, Estonia, Hungary, Latvia, Lithuania, Macedonia, Poland, Romania, Serbia, Slovakia, and Slovenia (Holmes, 1997). SEED aid to war-torn areas has included funding for law enforcement training, emergency shelters, business loans, and rebuilding of infrastructure (Holmes, 1997). In the more stable and peaceful nations, e.g., Poland, Estonia, the Czech Republic, SEED funding has focused on expansion of entrepreneurship and enterprises that would attract private foreign investors (Holmes, 1997). The SEED program is considered by many to be one of the most successful U.S. foreign aid programs of the post-Cold War period.
President George H.W. Bush premised his first major military intervention, the invasion of Panama and the ouster of Manuel Noriega, on the argument that American lives were at risk, following the 1989 beating of a U.S. naval officer and his wife by Panamanian Defense Forces (Farrar-Myers, 2001). At the time, 35,000 Americans lived in Panama (Farrar-Myers, 2001). Once Noriega was captured and "a" change in government achieved, U.S. interests were satisfied, and there were no further nation-building efforts expended in Panama.
Bush's subsequent commitment of troops to Operation Desert Storm was part of a multi-national, United Nations effort. It was premised on the invasion of Kuwait, a sovereign nation, by Iraq; the fact that a large portion of the world's oil supply was threatened by the occupation of Kuwait united allies to take military action. Once the Iraqis were pushed back across the border and Kuwait was secured, there were no further efforts to "nation-build" or promote democracy in Kuwait or Iraq. The seeds of the "Neo-Conservative" movement were sown when President Bush refused to extend Operation Desert Storm into a conquest of Iraq.
In 1991, a coup d'etat in Somalia threatened to plunge that nation into civil war among regional warlords. The United Nations responded by adopting Resolution 751 (1992), which provided for a U.N. peacekeeping force and a framework for humanitarian intervention (Dobbins, McGinn, et al., 2003, p. 55). Over the objection of many Republicans, President Bush committed U.S. forces to the U.N. operation. Reconstruction, relief, and security were the mission objectives - there was no democratization component to the operation (Dobbins, McGinn, et al., 2003, pp. 58-60).
In the 1998 National Security Strategy, the Clinton Administration adopted many of the principles advocated by the Neo-conservatives, proclaiming that the "United States remains the world's most powerful force for peace, prosperity and the universal values of democracy and freedom" (Clinton, 1998, p. 4). Clinton's initiatives put forth in 1998 were premised on three core objectives, to wit, enhancement of national security, bolstering of U.S. economic growth, and promotion of democracy abroad (Clinton, 1998, p. 4).
Clinton justified the costs of his foreign policy initiatives by arguing that "Every dollar we devote to preventing conflicts, promoting democracy, and stopping the spread of disease and starvation brings a sure return in security and savings" (Clinton, 1998, p. 5).
Clinton's National Security Strategy (NSS) divided democratization efforts into five regions: (1) Europe; (2) Southeast Asia; (3) the Western Hemisphere; (4) the Middle East and Southwest Asia; and (5) Africa. Each region had a different set of objectives and methods to be applied per the NSS. Objectives for advancing democracy in the Middle East and Southwest Asia were quite limited, particularly when compared with programs for other regions; the only nations in the region specifically mentioned with respect to promoting democracy were Iraq and Iran (Clinton, 1998, p. 57). The omission of the Gulf Cooperation Council (GCC) states of Saudi Arabia, Kuwait, Oman, Qatar, the United Arab Emirates, and Bahrain, and the ignoring of the civil and human rights abuses that occur within these regimes, are viewed as hypocritical by many in the region and the world (Fandy, 1997).
Presidents George H.W. Bush and Bill Clinton sought to keep the U.S. in partnership with other nations in order to promote democracy, preserve peace, and expand global market opportunities. President Bush said in 1992, "Our choice as a people is simple: We can either shape our times, or we can let the times shape us. And shape us they will, at a price frightening to contemplate, morally, economically, and strategically" (Bush, G.H.W., 1992). As will be discussed at length in the Depth component, this message is one that his son, President George W. Bush, did not embrace until after the tragic events of 9/11.
American Determinism and the spread of democracy in the 20th Century
Historically, American plans for democratic development within foreign nations have tended to focus initially on privatization of industry, rather than on political and civil reform (Barber, 2003, p. 26). Critics contend that this historical pattern suggests that the U.S. is a neocolonial power, rather than a pure and benevolent exporter of democracy.
The demise of the Soviet Union and the Chinese dalliance with foreign capitalism within the past decade have resulted in a clarion call for global democratization by some, and a deterministic conclusion by others that western-styled liberal democracy is "the" only successful way to govern (Chan, 2002, p. 11). The IMF and World Bank have increased the conditions precedent for aid, requiring oversight in four areas of "good governance," to wit,
(1) public service management, (2) accountability, (3) a 'legal framework' for development (by which is meant rights, essentially property rights, what the Bank calls 'institutional' rather than 'substantive' aspects of law), and (4) the availability of good and sufficient information and transparency (Chan, 2001, p. 16).
Determinists contend that "pockets of illiberal creeds, racist norms, patrimonial rituals, and anti-democratic ideologies exist throughout the world, but only Osama bin Ladenism and its variants constitute a serious transnational alternative to liberal democracy today" (McFaul, 2004, p. 4). America is currently led by a determinist president who believes that his "responsibility to history is to 'rid the world of evil,'" primarily by ousting autocratic rulers and "spreading democracy" (Barber, 2003, p. 58).
The determinist theories of democracy are not without their critics. As Albert Camus once noted,
Democracy is not the best system of government. It is the least evil. We have had a small taste of all types of government and therefore we know that now. But this system can only be conceived, created and sustained by men who know that they don't know everything, who refuse to accept the condition of the proletariat and who will never put up with the misery of others but who will on the contrary refuse to aggravate that misery in the name of a theory or a blind messianism (Camus & Van Den Hoven, 2001).
The determinist view of democratization also ignores the sociological, philosophical, and economic evolution that is a necessary precedent for self-rule.
The danger in the seemingly "benevolent" desire by the United States to spread its brand of democracy abroad has been summarized as follows:
The New World Order is one in which the dominant liberal culture tends to diminish awareness of alternative values and ideologies and is conducive to the ready condemnation of others for not conforming to one's own perception of the norms appropriate to them (Hipler, 1995, pp. 9-10).
American condemnation and isolation of that which is different in foreign nations fosters resentment and "anti-imperialist" backlash that often manifests itself in the form of terrorist attacks against American institutions, allies, and people.
The core of many Eastern societies is the family, clan, or tribe. This core has intrinsic value that offers positive benefits that the individual-centered western society often lacks. Many scholars contend that the American and Western European precepts of representative democracy cannot be successfully transplanted to other cultures because the growth and evolution based on a society's own history and norms are necessary for democracy to flourish (Odegard, 1971, p.17).
Other scholars contend that western-styled democracy is the inevitable end to any nation's quest for greatness and stability, much as Marxists touted the "specter of democracy haunting Europe" in the late 19th Century (Dryzek, 1996, p.18; Howard, 2002, p.viii). The problem is that historically, violent conflicts are the necessary precursor to the emergence of democracies (Odom & Dujarric, 2004, p.217). It is doubtful that Americans are willing to shed more of their own blood on a large scale in order to bring about freedom for others, particularly when our traditional allies and those people to be liberated are unable and/or unwilling to fight alongside us.