Authors: Andrew J. Bacevich
Not for the first time in America’s War for the Greater Middle East, “narrative” was displacing reality. What policymakers in Washington wished to see—a new chance and a new beginning for Afghans—became what they saw, even if the “seeing” required first shutting their eyes. Much as he had with the Third Persian Gulf War, Obama had decided upon a date certain when the Second Afghanistan War was going to end, with the great body of Americans, even those who despised the president, willing to pretend that his words made it so. They did not.
In fact, the situation in Afghanistan by the summer of 2011 resembled that phase of the Korean War that occurred after the ceasefire negotiations with the Chinese began at Kaesong in the summer of 1951. Although hostilities in Korea were to continue for another two years, the outcome was foreordained: The peninsula’s division into two antagonistic halves was going to persist. When the end finally came, it was shrouded in ambiguity.
Similarly, in Afghanistan, fighting was going to continue, but without any real expectation of affecting the outcome. Rendered ungovernable by the Soviet occupation and the U.S.-backed insurgency of the 1980s, Afghanistan was destined to remain a shattered country. After more than a decade of exertions aimed at putting the country back together again, the “new” Afghanistan remained a figment of Washington’s imagination. Here too, the end, whenever it came, was sure to be shrouded in ambiguity.
On December 28, 2014, President Obama announced that “the longest war in American history” had come “to a responsible conclusion.”
To this presidential claim, an even semi-attentive public might have replied, “Huh?” In fact, the Afghanistan War had not reached any sort of conclusion, responsible or otherwise. No such conclusion was even dimly visible. Indeed, over ten thousand U.S. troops remained in Afghanistan beyond that date, albeit in what the administration arbitrarily defined as a noncombat role even though Afghanistan remained very much a combat zone.
What set the Afghanistan War apart was not that it was the longest war in U.S. history but that it was more quickly forgotten than any other conflict in which the United States had ever participated. As if by mutual agreement, the American people and their government erased the Afghanistan experience from memory even before the bloodletting had ended.
In June 2011, while announcing the beginning of U.S. troop withdrawals from Afghanistan, President Obama said, “We take comfort in knowing that the tide of war is receding.”
In fact, the tide was not receding. Although Obama’s efforts to extricate the United States from Iraq and Afghanistan attracted understandable attention, they tell the lesser part of the story. In other quarters of the Islamic world, the range of U.S. military activities was actually expanding. Even as it sought to convey the impression of striking out boldly in new directions, the Obama administration’s chief contribution to the War for the Greater Middle East was to enlarge it.
During the Obama era, the United States initiated military action on many fronts across the Islamic world. Some of those actions marked a return to sites of earlier interventions. Others occurred in locales that the U.S. military had previously considered unimportant and sought to avoid. Having for political reasons jettisoned the phrase “global war on terrorism,” the new administration grouped its various and sundry military campaigns under the blandly generic heading of “overseas contingency operations.”
What distinguished these campaigns was the absence of any unifying aim or idea. As a consequence, the principal result of Obama’s willingness to expend American military might in places as far afield as Libya, Pakistan, Somalia, Yemen, and West Africa was to dissipate energy without notable effect. Prior to 9/11, the abiding defect of U.S. military policy had been ignorance. In the years directly after 9/11, it became hubris. During the Obama presidency, by contrast, the problem was one of diffusion. U.S. forces were increasingly found scattered across the Greater Middle East without actually making a difference anywhere in particular.
This dispersion of effort occurred in rapidly changing political circumstances. Assumptions and preconceptions that had guided U.S. military planning and operations in the Islamic world during the 1980s and 1990s and even in the years immediately following 9/11 no longer pertained. Three changes in particular stand out, one at home, the second abroad, and the third with implications in both realms.
First, the long wars in Iraq and Afghanistan that followed 9/11 pretty much exhausted the willingness of the American people to commit anything more than token numbers of U.S. troops to engage in ground combat. By the beginning of President Obama’s second term, a variant of the Vietnam Syndrome, ostensibly “kicked” by Operation Desert Storm, had reasserted itself. This did not translate into the wholesale demilitarization of U.S. policy or even the emergence of an antiwar political party. Indeed, Washington’s bipartisan appetite for armed intervention in the Islamic world persisted. By now appetite had become tantamount to addiction.
Yet a public once again averse to casualties and quagmires required changes in method. When Secretary of Defense Robert Gates, shortly before leaving office, told West Point cadets that anyone proposing “to again send a big American land army into Asia or into the Middle East or Africa” needed to “have his head examined,” he was acknowledging this shift in public temper.
President Obama himself concurred, subsequently remarking that “a strategy that involves invading every country that harbors terrorist networks is naive and unsustainable.”
Not even the president’s sharpest critics contested the point. For the moment at least, the invade-and-occupy-to-liberate phase of America’s War for the Greater Middle East had passed.
Further change came in the form of political upheaval sweeping through much of the Greater Middle East, but especially its Arab quarter. In his June 2009 Cairo speech, President Obama had offered a compelling vision of economic opportunity, religious liberty, individual equality, and “governments that reflect the will of the people” ushering in a world order based on tolerance and mutual respect.
Echoing George W. Bush and other members of the previous administration, Obama emphasized the universality of that vision. It was as applicable to the Islamic world as to Europe or the United States itself.
Beginning in December of the following year, in a rolling series of uprisings, the will of Arab peoples found concrete expression. Demands for fundamental change assailed regimes that for decades had given every appearance of permanence, first in Tunisia and then in Egypt, Yemen, Bahrain, Libya, and Syria. Although the uprisings resembled one another in energy and spontaneity, the results achieved varied widely. In some instances, the old order collapsed, although not necessarily producing outcomes that found favor in Washington. In others, the old order refused to give way and employed strong-arm tactics to fend off challenges to its authority.
This so-called Arab Awakening posed a huge predicament for the United States. From having ignored democracy in the name of oil at the outset of its War for the Greater Middle East, the United States after 9/11 had declared itself democracy’s great champion across the Islamic world. Now, with popular insurrection producing results seemingly at odds with U.S. interests, Washington was having second thoughts. This much alone seemed certain: As a formula for fostering stability, the longstanding U.S. practice of paying lip service to democracy while accommodating autocratic monarchs and presidents-for-life appeared increasingly untenable. The costs associated with hypocrisy were rising.
Then there was change associated with Israel. Through the first three decades of America’s War for the Greater Middle East, the U.S.-Israeli relationship had weathered more than a few storms as successive American administrations and Israeli governments labored to temper or, if need be, ignore the tensions between U.S. and Israeli security interests. The key to papering over those differences was to maintain the fiction that authorities in Washington and Jerusalem were equally committed to achieving a two-state solution to the Israeli-Palestinian conflict. The “peace process” held out the theoretical prospect of comprehensive reconciliation between Israel and its Arab neighbors, thereby putting to rest the antagonisms triggered by the founding of the Jewish state in 1948. As long as both parties in the U.S.-Israeli relationship sustained the pretense of being committed to “peace,” Washington’s unstinting support for Israel, providing it with diplomatic cover along with enormous quantities of arms, remained minimally controversial.
Developments during the Obama era made it increasingly difficult to sustain this arrangement. Within the United States (and more broadly throughout much of the West), public opinion became less inclined to blame the absence of peace on obstreperous Arabs. Relentless Israeli colonization of territories captured in the 1967 war vastly complicated, if it did not altogether foreclose, prospects for creating a viable Palestinian state.
As if to drive home the point, a high-profile effort to restart peace talks, launched in 2013 by Secretary of State John Kerry, went nowhere. Meanwhile, periodic punitive actions against Palestinians that inflicted casualties wildly out of proportion with the ostensible provocation—“mowing the lawn,” in Israeli parlance—suggested a preference for the calculable effects of collective punishment over the uncertainties of negotiation. In the spring of 2015, running for reelection, Israeli Prime Minister Benjamin Netanyahu all but made it official. Declaring the two-state solution “unachievable,” he vowed that no Palestinian state would exist as long as he remained in office.
Israeli voters duly awarded him another term.
Within the United States, meanwhile, overt criticism of Israel, previously muted or confined to the political fringes (where it not infrequently carried a whiff of anti-Semitism), was now becoming sharper and more open. This was notably so on college campuses, long the bellwether of change in American political culture. To punish Israel for denying Palestinians their right to self-determination, a grassroots international campaign known as BDS (for boycott, divestment, and sanctions) gained traction, even as it generated controversy.
Acknowledging the existence and influence of a powerful pro-Israel lobby, once taboo, became commonplace.
That the West, to include the United States, had a moral obligation to support Israel in atonement for the Holocaust no longer elicited automatic assent. Nor did the argument that Israelis and Americans shared common values that bound the two countries together. Within and between both countries, issues related to identity, religion, democratic practice, and the basic requirements of social justice appeared increasingly at issue. There was no single Israel to align with a single America.
An episode purporting to demonstrate U.S.-Israeli solidarity had the opposite effect, drawing attention to the unseemly underbelly of the relationship. In March 2015, at the invitation of House Speaker John Boehner, Prime Minister Netanyahu presented himself before the United States Congress to address both houses and the American people. Boehner’s motives in extending the invitation (without bothering to check with the Obama White House) were nakedly political. As a Republican leader, he was intent on casting his party as Israel’s only truly reliable friend. As leader of Israel’s rightwing Likud Party, Netanyahu’s motives were equally partisan. He was intent on persuading the Israeli electorate that he alone could be counted on to deliver assured American support. Just as Boehner sought to score points at the expense of the Democratic Party, Netanyahu sought to score points at the expense of the Israeli political center and left. Yet while the occasion offered the Israeli prime minister an opportunity to flaunt his standing on Capitol Hill, his performance impressed some observers as both smarmy and presumptuous. As for the members of Congress who rewarded Netanyahu’s remarks with twenty-nine standing ovations, they came across as supine and puerile.
Netanyahu acted like he owned the place, with the actual owners eagerly conceding the claim. Some Israelis and some Americans found this reassuring, an affirmation of intimate friendship. For others, it was off-putting. The “jumping up and down, up and down, applauding wildly, shouting approval” reminded the acerbic Israeli commentator Uri Avnery of the Reichstag in the 1930s. The spectacle of “members of the most powerful parliament in the world behaving like a bunch of nincompoops” struck him as ridiculous.
The comedian and alternative news source Jon Stewart mocked Netanyahu’s performance as “the State of the Union Address the Republicans wanted, delivered by the leader they wish they had.”
The putative guardians of the Israeli-American relationship were making themselves laughingstocks.