Reversal

August 6, 2020

With survival at stake, can weapon makers change course?

Today, the seventy-fifth anniversary of the atomic attack on Hiroshima, should be a day for quiet introspection. I recall a summer morning following the 2003 U.S. “Shock and Awe” invasion of Iraq when the segment of the Chicago River flowing past the headquarters of the world’s second largest defense contractor, Boeing, turned the rich, red color of blood. At the water’s edge, Chicago activists, long accustomed to the…

Tomgram: Patrick Cockburn, My Pandemics

This article originally appeared at TomDispatch.com. To receive TomDispatch in your inbox three times a week, click here.

In the midst of the pandemic from hell, with a president who seems incapable of grasping the reality of, no less dealing with, the spreading virus, as deaths mount and cases cascade, in a land where a Covid-19 “second wave” in the fall isn’t conceivable because the first will never have ended, it’s easy to forget about pandemics past. In fact, I did.

But I did live through one in my childhood. As Patrick Cockburn reminds us in his piece today, the last century had repeated moments when the polio virus struck before Jonas Salk perfected a vaccine for it in 1955. One of my close friends in college had a bad limp thanks to a case he caught in the early 1950s and my father-in-law, who, I believe, got it in the 1920s, had a similar limp, as does Patrick Cockburn, the journalist who, for my money, has been perhaps our best reporter on this country’s disastrous conflicts in the Greater Middle East.

I’ve followed Cockburn’s work in the British paper the Independent for years now, as he produced a body of work about our forever wars and their consequences that is both chilling and superb. He begins his just-published book, War in the Age of Trump: The Defeat of Isis, the Fall of the Kurds, the Confrontation with Iran, with typical Cockburnian insight: our president’s unprecedented drone assassination of Iranian Major General Qasem Soleimani in January, as he was leaving Baghdad International Airport for a meeting with the Iraqi prime minister, may actually have saved Soleimani’s reputation and that of the Iranian leadership, too. “At the time of his assassination,” Cockburn writes, General Soleimani’s “strategy in Iraq and in other Middle Eastern countries with large Shia populations had become counterproductive. He is now guaranteed the status of an Iranian and Shia warrior-martyr, in spite of the mistakes he made in the last years of his life, the effects of which may, to some extent, have been reversed by President Donald Trump’s decision to kill him.”

Congrats to The Donald for bungling things yet again! You’ll find no such grasp of our president’s Iranian blunder in the American media, but it’s typical of what you will find in Cockburn’s must-buy, must-read book. Make sure to check it out and, in the meantime, consider his thoughts on ways in which war reporting and pandemic reporting eerily mirror each other in this strange moment of ours. Tom
War and Pandemic Journalism
The Truth Can Disappear Fast
By Patrick Cockburn

The struggle against Covid-19 has often been compared to fighting a war. Much of this rhetoric is bombast, but the similarities between the struggle against the virus and against human enemies are real enough. War reporting and pandemic reporting likewise have much in common because, in both cases, journalists are dealing with and describing matters of life and death. Public interest is fueled by deep fears, often more intense during an epidemic because the whole population is at risk. In a war, aside from military occupation and area bombing, terror is at its height among those closest to the battlefield.

The nature of the dangers stemming from military violence and the outbreak of a deadly disease may appear very different. But looked at from the point of view of a government, they both pose an existential threat because failure in either crisis may provoke some version of regime change. People seldom forgive governments that get them involved in losing wars or that fail to cope adequately with a natural disaster like the coronavirus. The powers-that-be know that they must fight for their political lives, perhaps even their physical existence, claiming any success as their own and doing their best to escape blame for what has gone wrong.

My First Pandemic

I first experienced a pandemic in the summer of 1956 when, at the age of six, I caught polio in Cork, Ireland. The epidemic there began soon after virologist Jonas Salk developed a vaccine for it in the United States, but before it was available in Europe. Polio epidemics were at their height in the first half of the twentieth century and, in a number of respects, closely resembled the Covid-19 experience: many people caught the disease but only a minority were permanently disabled by or died of it. In contrast with Covid-19, however, it was young children, not the old, who were most at risk. The terror caused by poliomyelitis, to use its full name, was even higher than during the present epidemic exactly because it targeted the very young and its victims did not generally disappear into the cemetery but were highly visible on crutches and in wheelchairs, or prone in iron lungs.

Parents were mystified by the source of the illness because it was spread by great numbers of asymptomatic carriers who did not know they had it. The worst outbreaks were in the better-off parts of modern cities like Boston, Chicago, Copenhagen, Melbourne, New York, and Stockholm. People living there enjoyed a good supply of clean water and had effective sewage disposal, but did not realize that all of this robbed them of their natural immunity to the polio virus. The pattern in Cork was the same: most of the sick came from the more affluent parts of the city, while people living in the slums were largely unaffected. Everywhere, there was a frantic search to identify those, like foreign immigrants, who might be responsible for spreading the disease. In the New York epidemic of 1916, even animals were suspected of doing so and 72,000 cats and 8,000 dogs were hunted down and killed.

The illness weakened my legs permanently and I have a severe limp so, even reporting in dangerous circumstances in the Middle East, I could only walk, not run. I was very conscious of my disabilities from the first, but did not think much about how I had acquired them or the epidemic itself until perhaps four decades later. It was the 1990s and I was then visiting ill-supplied hospitals in Iraq as that country’s health system was collapsing under the weight of U.N. sanctions. As a child, I had once been a patient in an almost equally grim hospital in Ireland and it occurred to me then, as I saw children in those desperate circumstances in Iraq, that I ought to know more about what had happened to me. At that time, my ignorance was remarkably complete. I did not even know the year when the polio epidemic had happened in Ireland, nor could I say if it was caused by a virus or a bacterium.

So I read up on the outbreak in newspapers of the time and Irish Health Ministry files, while interviewing surviving doctors, nurses, and patients. Kathleen O’Callaghan, a doctor at St. Finbarr’s hospital, where I had been brought from my home when first diagnosed, said that people in the city were so frightened “they would cross the road rather than walk past the walls of the fever hospital.” My father recalled that the police had to deliver food to infected homes because no one else would go near them. A Red Cross nurse, Maureen O’Sullivan, who drove an ambulance at the time, told me that, even after the epidemic was over, people would quail at the sight of her ambulance, claiming “the polio is back again,” dragging their children into their houses, or even falling to their knees to pray.

The local authorities in a poor little city like Cork where I grew up understood better than national governments today that fear is a main feature of epidemics. They tried then to steer public opinion between panic and complacency by keeping control of the news of the outbreak. When British newspapers like the Times reported that polio was rampant in Cork, the authorities dismissed this as typical British slander and exaggeration. But their efforts to suppress the news never worked as well as they hoped. Instead, they dented their own credibility by trying to play down what was happening. In that pre-television era, the main source of information in my hometown was the Cork Examiner, which, after the first polio infections were announced at the beginning of July 1956, accurately reported on the number of cases, but systematically underrated their seriousness.

Headlines about polio like “Panic Reaction Without Justification” and “Outbreak Not Yet Dangerous” regularly ran below the fold on its front page. Above it were the screaming ones about the Suez Crisis and the Hungarian uprising of that year. In the end, this treatment only served to spread alarm in Cork where many people were convinced that the death toll was much higher than the officially announced one and that bodies were being secretly carried out of the hospitals at night.

My father said that, in the end, a delegation of local businessmen, the owners of the biggest shops, approached the owners of the Cork Examiner, threatening to withdraw their advertising unless it stopped reporting the epidemic. I was dubious about this story, but when I checked the newspaper files many years later, I found that he was correct and the paper had almost entirely stopped reporting on the epidemic just as sick children were pouring into St. Finbarr’s hospital.

The Misreporting of Wars and Epidemics

By the time I started to research a book about the Cork polio epidemic that would be titled Broken Boy, I had been reporting wars for 25 years, starting with the Northern Irish Troubles in the 1970s, then the Lebanese civil war, the Iraqi invasion of Kuwait, the war that followed Washington’s post-9/11 takeover of Afghanistan, and the U.S.-led 2003 invasion of Iraq. After publication of the book, I went on covering these endless conflicts for the British paper the Independent as well as new conflicts sparked in 2011 by the Arab Spring in Libya, Syria, and Yemen.

As the coronavirus pandemic began this January, I was finishing a book (just published), War in the Age of Trump: The Defeat of Isis, the Fall of the Kurds, the Confrontation with Iran. Almost immediately, I noticed strong parallels between the Covid-19 pandemic and the polio epidemic 64 years earlier. Pervasive fear was perhaps the common factor, though little grasped by governments of this moment. Boris Johnson’s in Great Britain, where I was living, was typical in believing that people had to be frightened into lockdown, when, in fact, so many were already terrified and needed to be reassured.

I also noticed ominous similarities between the ways in which epidemics and wars are misreported. Those in positions of responsibility — Donald Trump represents an extreme version of this — invariably claim victories and successes even as they fail and suffer defeats. The words of the Confederate general “Stonewall” Jackson came to mind. On surveying ground that had only recently been a battlefield, he asked an aide: “Did you ever think, sir, what an opportunity a battlefield affords liars?”

This has certainly been true of wars, but no less so, it seemed to me, of epidemics, as President Trump was indeed soon to demonstrate (over and over and over again). At least in retrospect, disinformation campaigns in wars tend to get bad press and be the subject of much finger wagging. But think about it a moment: it stands to reason that people trying to kill each other will not hesitate to lie about each other as well. While the glib saying that “truth is the first casualty of war” has often proven a dangerous escape hatch for poor reporting or unthinking acceptance of a self-serving version of battlefield realities (spoon-fed by the powers-that-be to a credulous media), it could equally be said that truth is the first casualty of pandemics. The inevitable chaos that follows in the wake of the swift spread of a deadly disease and the desperation of those in power to avoid being held responsible for the soaring loss of life lead in the same direction.

There is, of course, nothing inevitable about the suppression of truth when it comes to wars, epidemics, or anything else for that matter. Journalists, individually and collectively, will always be engaged in a struggle with propagandists and PR men, one in which victory for either side is never inevitable.

Unfortunately, wars and epidemics are melodramatic events and melodrama militates against real understanding. “If it bleeds, it leads” is true of news priorities when it comes to an intensive care unit in Texas or a missile strike in Afghanistan. Such scenes are shocking but do not necessarily tell us much about what is actually going on.

The recent history of war reporting is not encouraging. Journalists will always have to fight propagandists working for the powers-that-be. Sadly, I have had the depressing feeling since Washington’s first Gulf War against Saddam Hussein’s Iraq in 1991 that the propagandists are increasingly winning the news battle and that accurate journalism, actual eyewitness reporting, is in retreat.

Disappearing News

By its nature, reporting wars is always going to be difficult and dangerous work, but it has become more so in these years. Coverage of Washington’s Afghan and Iraqi wars was often inadequate, but not as bad as the more recent reporting from war-torn Libya and Syria or its near total absence from the disaster that is Yemen. This lack fostered misconceptions even when it came to fundamental questions like who is actually fighting whom, for what reasons, and just who are the real prospective winners and losers.

Of course, there is little new about propaganda, controlling the news, or spreading “false facts.” Ancient Egyptian pharaohs inscribed self-glorifying and mendacious accounts of their battles on monuments, now thousands of years old, in which their defeats are lauded as heroic victories. What is new about war reporting in recent decades is the far greater sophistication and resources that governments can deploy in shaping the news. With opponents like longtime Iraqi ruler Saddam Hussein, demonization was never too difficult a task because he was a genuinely demonic autocrat.

Yet the most influential news story about the Iraqi invasion of neighboring Kuwait in 1990 and the U.S.-led counter-invasion proved to be a fake. This was a report that, in August 1990, invading Iraqi soldiers had tipped babies out of incubators in a Kuwaiti hospital and left them to die on the floor. A Kuwaiti girl reported to have been working as a volunteer in the hospital swore before a U.S. congressional committee that she had witnessed that very atrocity. Her story was hugely influential in mobilizing international support for the war effort of the administration of President George H.W. Bush and the U.S. allies he teamed up with.

In reality it proved purely fictional. The supposed hospital volunteer turned out to be the daughter of the Kuwaiti ambassador in Washington. Several journalists and human rights specialists expressed skepticism at the time, but their voices were drowned out by the outrage the tale provoked. It was a classic example of a successful propaganda coup: instantly newsworthy, not easy to disprove, and when it was — long after the war — it had already had the necessary impact, creating support for the U.S.-led coalition going to war with Iraq.

In a similar fashion, I reported on the American war in Afghanistan in 2001-2002 at a time when coverage in the international media had left the impression that the Taliban had been decisively defeated by the U.S. military and its Afghan allies. Television showed dramatic shots of bombs and missiles exploding on the Taliban front lines and Northern Alliance opposition forces advancing unopposed to “liberate” the Afghan capital, Kabul.

When, however, I followed the Taliban retreating south to Kandahar Province, it became clear to me that they were not by any normal definition a beaten force, that their units were simply under orders to disperse and go home. Their leaders had clearly grasped that they were over-matched and that it would be better to wait until conditions changed in their favor, something that had distinctly happened by 2006, when they went back to war in a big way. They then continued to fight in a determined fashion to the present day. By 2009, it was already dangerous to drive beyond the southernmost police station in Kabul due to the risk that Taliban patrols might create pop-up checkpoints anywhere along the road.

None of the wars I covered then have ever really ended. What has happened, however, is that they have largely ended up receding, if not disappearing, from the news agenda. I suspect that, if a successful vaccine for Covid-19 isn’t found and used globally, something of the same sort could happen with the coronavirus pandemic as well. Given the way news about it now dominates, even overwhelms, the present news agenda, this may seem unlikely, but there are precedents. In 1918, with World War I in progress, governments dealt with what came to be called the Spanish Flu by simply suppressing information about it. Spain, as a non-combatant in that war, did not censor the news of the outbreak in the same fashion and so the disease was most unfairly named “the Spanish Flu,” though it probably began in the United States.

The polio epidemic in Cork supposedly ended abruptly in mid-September 1956 when the local press stopped reporting on it, but that was at least two weeks before many children like me caught it. In a similar fashion, right now, wars in the Middle East and north Africa like the ongoing disasters in Libya and Syria that once got significant coverage now barely get a mention much of the time.

In the years to come, the same thing could happen to the coronavirus.

Patrick Cockburn is a Middle East correspondent for the Independent of London and the author of six books on the Middle East, the latest of which is War in the Age of Trump: The Defeat of Isis, the Fall of the Kurds, the Confrontation with Iran (Verso).


Tomgram: Andrea Mazzarino, Prioritizing Empire Over Health


No question about it. In 1991, it was the greatest power on the face of the Earth. There had never been anything like it — or so it seemed when the Berlin Wall came down and the Soviet Union, that other superpower of the Cold War era, imploded. Left alone on the planet, then, was a single mighty nation, wealthy beyond compare. To that one-of-a-kind land fell the obvious task of reorganizing a world and, given its military, what could possibly stand in its way? After all, it already possessed an unprecedented number of military bases stretching across the globe. History had simply never seen anything like it.

In the history to come, there would be nothing like the America that the Washington elite of that moment imagined either. Up against only the most pathetic of local anything-but-superpowers, with Russia reduced to a shadow of its Soviet self and China just beginning its rise, the U.S. military, funded like none other on Earth, stood alone — except for a few local autocrats and a small crew of Islamic extremists (whom the U.S. had once supported in a war against the Red Army in Afghanistan). And yet…

Oh yes, there was that First Gulf War against Saddam Hussein’s Iraq. It would be hailed as a wonder of a techno-triumph and celebrated with glorious victory parades here in the U.S., but somehow it would also prove strangely indecisive, leaving Saddam in power. Then, of course, there were the twenty-first-century invasions of Afghanistan and (again) Iraq and the utterly indecisive “forever wars” of this century that tossed trillions of American tax dollars down the drain to no purpose whatsoever. And all of that was, of course, before the pandemic arrived, turning this country’s disastrous wars into pandemic ones and its empire of bases into diseased and disease-spreading garrisons around the planet.

It’s been quite a story, how the greatest power in history took itself down. TomDispatch regular Andrea Mazzarino, a co-founder of the Costs of War Project, offers a very personal version of just what all of this means today. She does so from the point of view of someone who, as the spouse of a U.S. naval officer, is embedded in an increasingly diseased American military machine in this pandemic moment. Tom
The Military Is Sick
A Navy Spouse’s Take on Why We’re Not Getting Better
By Andrea Mazzarino

American military personnel are getting sick in significant numbers in the midst of the ongoing pandemic. As The New York Times reported in a piece buried in the back pages of its July 21st edition, “The infection rate in the services has tripled over the past six weeks as the United States military has emerged as a potential source of transmission both domestically and abroad.”

Indeed, the military is sick and I think of it as both a personal and an imperial disaster.

As the wife of a naval officer, I bear witness to the unexpected ways that disasters of all sorts play out among military families and lately I’ve been bracing for the Covid-19 version of just such a disaster. Normally, for my husband and me, the stressors are relatively mild. After all, between us we have well-paid jobs, two healthy children, and supportive family and friends, all of which allow us to weather the difficulties of military life fairly smoothly. In our 10 years together, however, over two submarine assignments and five moves, we’ve dealt with unpredictable months-long deployments, uncertainty about when I will next be left to care for our children alone, and periods of 16-hour workdays for my spouse that strained us both, not to speak of his surviving a major submarine accident.

You would think that, as my husband enters his third year of “shore duty” as a Pentagon staffer, the immediate dangers of military service would finally be negligible. No such luck. Around mid-June, as President Trump searched for scapegoats like the World Health Organization for his own Covid-19 ineptitude and worried about what rising infection rates could mean for his approval ratings, he decided that it was time to push this country to “reopen.”

As it turned out, that wouldn’t just be a disaster for states from Florida to California; it also meant that the Pentagon resumed operations at about 80% capacity. So, after a brief reprieve, my spouse is now required to report to his office four days a week for eight-hour workdays in a poorly ventilated, crowded hive of cubicles where people neither consistently mask nor practice social distancing.

All of this for what often adds up to an hour or two of substantive daily work. Restaurants, dry cleaners, and other services where Pentagon staffers circulate only add to the possibility of his being exposed to Covid-19.

My husband, in other words, is now unnecessarily risking his own and his family’s exposure to a virus that has to date claimed more than 150,000 American lives — already more than eight times the number of Americans who died in both the 9/11 terrorist attacks and the endless wars in Iraq and Afghanistan that followed.

In mid-August, he will transfer to an office job in Maryland, a state where cases and deaths are again on the rise. One evening, I asked him why it seemed to be business as usual at the Pentagon when numbers were spiking in a majority of states. His reply: “Don’t ask questions about facts and logic.”

After all, unless Secretary of Defense Mark Esper decides to speak out against the way President Trump has worked to reopen the country to further disaster, the movement of troops and personnel like my husband within and among duty stations will simply continue, even as Covid-19 numbers soar in the military.

America’s Archipelago of Bases

Global freedom of movement has been a hallmark of America’s vast empire of bases, at least 800 of them scattered across much of the planet. Now, it may prove part of the downfall of that very imperial structure. After all, Donald Trump’s America is at the heart of the present pandemic. So it’s hardly surprising that, according to the Times, U.S. troops seem to be carrying Covid-19 infections with them from hard-hit states like Arizona, California, Florida, and Texas, a number of which have had lax and inconsistently enforced safety guidelines, to other countries where they are stationed.

For example, at just one U.S. base on the Japanese island of Okinawa, the Marine Corps reported nearly 100 cases in July, angering local officials because American soldiers had socialized off-base and gone to local bars in a place where the coronavirus had initially been suppressed. No longer. In Nigeria, where official case counts are low but healthcare workers in large cities are reporting a spike in deaths among residents with symptoms, the U.S. military arms, supplies, and trains the national security forces. So a spike in cases among U.S. troops now places local populations (as well as those soldiers) at additional risk in a country where testing and contact tracing are severely lacking. And this is a problem now for just about any U.S. ally from Europe to South Korea.

What this virus’s spread among troops means, of course, is that the U.S. empire of bases that spans some 80 countries — about 40% of the nations on this planet — is now part of the growing American Covid-19 disaster. There is increasing reason to believe that new outbreaks of what the president likes to call the “Chinese virus” in some of these countries may actually prove to be American imports. Like many American civilians, our military personnel are traveling, going to work, socializing, buying things, often unmasked and ungloved, and anything but social distanced.

Public health experts have been clear that the criteria for safely reopening the economy without sparking yet more outbreaks are numerous. They include weeks of declining case counts, no more than four new daily cases per 100,000 people, adequate testing capacity, strictly enforced social-distancing guidelines, and the availability of at least 40% of hospital ICU beds to treat any possible future surge.

To date, only three states have met these criteria: Connecticut, New Jersey, and New York. The White House’s Opening Up America plan, on the other hand, includes guidelines of just the weakest and vaguest sort like noting a downward trajectory in cases over 14-day periods and “robust testing capacity” for healthcare workers (without any definition of what this might actually mean).

Following White House guidance, the Department of Defense is deferring to local and state governments to determine what, if any, safety measures to take. In March, when a military-wide lockdown began, troops were required to quarantine for 14 days before moving to their next duty station. At the close of June, the Pentagon broadly lifted those travel restrictions, allowing both interstate recreational and military travel by troops and their families. Now, in a country that lacks any disciplined and unified response to the global pandemic, our ever-mobile military has become a significant conduit of its spread, both domestically and abroad.

To be sure, none of us knew how to tackle the dangers posed by this virus. The last global pandemic of this sort, the “Spanish Flu” of 1918-1919 in which 50 million or more people died worldwide, suggested just how dire the consequences of such an outbreak could be when uncontained. But facts and lived experience are two different things. If you’re young, physically fit, have survived numerous viruses of a more known and treatable sort, and most of the people around you are out and about, you probably dismiss it as just another illness, even if you’re subject to some of the Covid-19 death risk factors that are indeed endemic among U.S. military personnel.

Perhaps what the spread of this pandemic among our troops shows is that the military-civilian divide isn’t as great as we often think.

Protecting Life in the Covid-19 Era

Full disclosure: I write this at a time when I’m frustrated and tired. For the past month, I’ve provided full-time child care for our two pre-school age kids, even while working up to 50 hours a week, largely on evenings and weekends, as a psychotherapist for local adults and children themselves acutely experiencing the fears, health dangers, and economic effects of the coronavirus. Like many other moms across the country, I cram work, chores, pre-K Zoom sessions, pediatrician and dentist appointments, and grocery shopping into endless days, while taking as many security precautions as I can. My husband reminds me of the need to abide by quarantines, as (despite his working conditions) he needs to be protected from exposing top Pentagon officials to the disease.

Yet the military has done little or nothing to deal with the ways the families of service members, asked to work and “rotate,” might be exposed to infection. In the dizziness of fatigue, I have little patience for any institution that carries on with business as usual under such circumstances.

What’s more, it’s hard to imagine how any efforts to quarantine will bear fruit in a country where even those Americans who do follow scientific news about Covid-19 have often dropped precautions against its spread. I’ve noted that, these days, some of my most progressive friends have started to socialize, eat indoors at restaurants, and even fly out of state to harder-hit places. They are engaging in what we therapists sometimes call “emotion-based reasoning”: “I’m tired of safety precautions, so they must no longer be necessary.”

And that’s not even taking into account the no-maskers among us who flout the safety guidelines offered by the Centers for Disease Control and Prevention to indicate their supposed love of individual liberties. A relative, an officer with the Department of Homeland Security, recently posted a picture on Facebook of his three young children and those of a workmate watching fireworks arm in arm at an unmasked July 4th gathering. The picture was clearly staged to provoke those like me who support social-distancing and masking guidelines. When I talk with him, he quickly changes the subject to how he could, at any moment, be deployed to “control the rioters in D.C. and other local cities.” In other words, in his mind, as in those of so many others the president relies on as his “base,” the real threat isn’t the pandemic; it’s the people in the streets protesting police violence.

I wonder how the optics of American families celebrating together could have superseded safety based on an understanding of how diseases spread, as well as a healthy respect for the unknowns that go with them.

Sometimes, our misplaced priorities take my breath away, quite literally so recently. Craving takeout from my favorite Peruvian chicken restaurant and wanting to support a struggling local business, I ordered such a meal and drove with my kids to pick it up. Stopping at the restaurant, I noted multiple unmasked people packed inside despite a sign on the door mandating masks and social distancing. Making a quick risk-benefit assessment, I opened the car windows, blasted the air conditioning, and ran into the restaurant without my kids, making faces at them through the window while I stood in line.

A voice suddenly cut through the hum of the rotisseries: “Shameful! Shameful!” A woman, unmasked, literally spat these words, pointing right at me. “Leaving your kids in the car! Someone could take them! Shameful!” I caught my breath. Riddled with guilt and fearful of what she might do, I returned to my car without my food. She followed me, yelling, “Shameful!”

Aside from the spittle flying from this woman’s mouth, what was notable was what she wasn’t ashamed of: entering such a place, unmasked and ready to spit, with other people’s children also running about inside. (Not to mention that in Maryland reported abductions of children by strangers are nil.)

What has this country come to when we are more likely to blame the usual culprits — negligent mothers, brown and Black people, illegal immigrants (you know the list) — than accept responsibility for what’s actually going on and make the necessary sacrifices to deal with it (perhaps including, I should admit, going without takeout food)?

Typically in these years, top Pentagon officials and the high command are prioritizing the maintenance of empire at the expense of protecting the very bodies that make up the armed services (not to speak of those inhabitants of other countries living near our hundreds of global garrisons). After all, what’s the problem, when nothing could be more important than keeping this country in the (increasingly embattled) position of global overseer? More bodies can always be produced. (Thank you, military spouses!)

The spread of this virus around the globe, now aided in part by the U.S. military, reminds me of one of those paint-with-water children’s books where the shading appears gradually as the brush moves over the page, including in places you didn’t expect. Everywhere that infected Americans socialize, shop, arm, and fight, this virus is popping up, eroding both our literal ability to be present and the institutions (however corrupt) we’re still trying to prop up. If we are truly in a “war” against Covid-19 — President Trump has, of course, referred to himself as a “wartime president” — then it’s time for all of us to make the sacrifices of a wartime nation by prioritizing public health over pleasure. Otherwise, I fear that what’s good about life in this country will also be at risk, as will the futures of my own children.

Andrea Mazzarino, a TomDispatch regular, co-founded Brown University’s Costs of War Project. She has held various clinical, research, and advocacy positions, including at a Veterans Affairs PTSD Outpatient Clinic, with Human Rights Watch, and at a community mental health agency. She is the co-editor of War and Health: The Medical Consequences of the Wars in Iraq and Afghanistan.


China and the United States Could Avoid an Unnecessary War

Although few Americans seem to have noticed, China and the United States are currently on a collision course—one that could easily lead to war.

Their dispute, which has reached the level of military confrontation, concerns control of the South China Sea.  For many years, China has claimed sovereignty over 90 percent of this vast, island-studded region—a major maritime trade route rich in oil, natural gas, and lucrative fishing areas.  But competing claims for portions of the South China Sea have been made for decades by other nations that adjoin it, including Brunei, Malaysia, the Philippines, Taiwan, and Vietnam.  Starting in 2013, China began to assert its control more forcefully by island-building in the Paracel and Spratly Islands—expanding island size or creating new islands while constructing ports, airstrips, and military installations on them.

Other countries, however, protested Chinese behavior.  In 2016, the Permanent Court of Arbitration at the Hague, acting on a complaint by the Philippines that Chinese action violated the freedom of navigation guaranteed by the UN Convention on the Law of the Sea, decided in favor of the Philippines, although it did not rule on the ownership of the islands.  In response, the government of China, a party to the UN treaty, refused to accept the court’s jurisdiction.  Meanwhile, the U.S. government, which was not a party to the treaty, insisted on the treaty’s guarantee of free navigation and proceeded to challenge China by sailing its warships through waters claimed by the Chinese government.

Actually, the positions of the Chinese and U.S. governments both have some merit.  The Chinese, after all, conducted a variety of operations in this maritime region for millennia.  Also, some of the islands are currently controlled by other claimants (such as Vietnam), and China has been working for years with the Association of Southeast Asian Nations on a Code of Conduct that might finally resolve the regional dispute.  Nevertheless, the U.S. government can point to China’s provocative militarization of the islands, the rejection of China’s stance by most other nations in Southeast Asia, and the ruling of the Permanent Court of Arbitration.

But the bottom line is that the issue of legitimate control remains unclear and, meanwhile, both the Chinese and U.S. governments are engaging in reckless behavior that could lead to disaster.

The U.S. military buildup in the South China Sea is quite striking.  As defense analyst Michael Klare wrote recently:  “Every Pacific-based US submarine is now deployed in the area . . . the Air Force has sent B-1 bombers overhead; and the Army is practicing to seize Chinese-claimed islands.”  Furthermore, in the past few months, the U.S. Navy has repeatedly sent missile-armed destroyers on provocative “freedom of navigation operations” into the waters just off the Chinese-occupied islands.  In July alone, the U.S. government deployed two nuclear-powered aircraft carriers (the USS Nimitz and the USS Ronald Reagan) to the South China Sea, accompanied by squadrons of cruisers, destroyers, and submarines.  This powerful U.S. armada was reinforced by two supersonic bombers and a nuclear-capable B-52 Stratofortress.

In response, China’s government has vigorously reasserted China’s claims in the South China Sea.  To demonstrate its determination, it has frequently deployed ships and planes of its own to shadow or harass American warships, sometimes escorting them out of the area.  At the same time, it has stepped up Chinese naval operations in the East and South China Seas.  In April, China’s first operational aircraft carrier, the Liaoning, moved into the region.  “China has several times experienced the threats posed by the U.S. in the [South China] Sea,” a retired Chinese naval officer announced on government media.  But “China’s resolve to safeguard its territorial integrity, sovereignty, and maritime interests will not waver [after] the latest threat posed by the U.S.  The Chinese military is prepared and will deal with the threat.”

This growing military confrontation has been accompanied by an escalating war of words.  Although previous U.S. policy called for a peaceful resolution of the South China Sea dispute between China and its neighbors, U.S. Secretary of State Mike Pompeo recently announced a much harder line.  In an official statement on July 13, he declared that “Beijing’s claims to offshore resources across most of the South China Sea are completely unlawful, as is its campaign of bullying to control them.”  The United States “stands with our Southeast Asian allies and partners in protecting their sovereign rights.”  On July 23, Pompeo issued an inflammatory, across-the-board denunciation of China’s foreign and domestic policies, proclaiming that “the free world will triumph over this new tyranny.”

Responding to Pompeo, a spokesperson for the Chinese Foreign Ministry claimed that China was working with all parties to the South China Sea dispute to settle it through negotiations.  By contrast, he said, U.S. military operations in the area were designed to create tensions in the region.  Furthermore, it was the U.S. government that violated international law and withdrew from international organizations and treaties.

Clearly, despite their professed concern for international law, the governments of the United States and China are engaged in a 21st-century-style gunboat diplomacy—one that, either intentionally or unintentionally, could escalate into war, even nuclear war.

If these two nuclear-armed governments are serious about settling the dispute over control of the South China Sea, they should call a halt to their provocative military operations and leave the job of sorting things out to the United Nations.  After all, resolving international conflicts is why the United States, China, and other countries created the world organization in the first place.  No single nation, however powerful its military forces, has the respect and credibility in the world community that the United Nations enjoys.  Nor does it have the legitimacy.  It’s time the governments of these two nations recognized these facts and ceased their threatening and dangerous military behavior.

Dr. Lawrence Wittner (https://www.lawrenceswittner.com/ ) is Professor of History Emeritus at SUNY/Albany and the author of Confronting the Bomb (Stanford University Press).

Tomgram: Karen Greenberg, Can the Pandemic Bring Accountability Back to This Country?

This article originally appeared at TomDispatch.com. To receive TomDispatch in your inbox three times a week, click here.

Yes, it’s possible that a vaccine for Covid-19 could be available by spring. I mean, I wouldn’t put my money on it, but it seems at least conceivable. Here’s something I would put a few bucks on, though: when a vaccine appears, the Trump administration will have so botched things that its widespread distribution any time soon — even in what could by then be Joe Biden’s America — will be, to put it mildly, a challenge.

Just imagine this for a moment: what’s still the world’s richest, most powerful country didn’t have a reasonable supply of protective gear and N95 masks when the virus hit. Nor, of course, did South Korea. That country’s government, however, managed to quickly intervene, ramp up production, and ensure that South Koreans got such masks on a national scale in a way that would help shut down the disease big time. The Donald and crew? They quite literally did the opposite, turning down an offer to ramp up mask production in January that could have made all the difference. In other words, the most powerful nation on the planet that, in a World War almost three-quarters of a century earlier, had geared up production lines at a remarkable speed to produce tanks and planes, couldn’t manage to coordinate the production of N95 masks, not even with a “wartime president” in the White House.

Call that remarkable indeed. Nor could the man in the Oval Office and his top officials produce a reasonable testing program for the coronavirus or a national team of contact tracers to track down those in touch with people who got the disease as, for instance, both China and Iceland were perfectly capable of doing. Yet the same president has proven quite capable of flooding the streets of Democratic-run cities with his own army of federal agents, togged out in military-style gear, and ready to promote his election-themed version of “law and order.”

Go figure. Or, as TomDispatch regular Karen Greenberg does today, think about what else is missing in this land of ours in 2020 — accountability — and how we lost it. Tom
Missing in Action
Accountability Is Gone in America
By Karen J. Greenberg

Whether you consider the appalling death toll or the equally unacceptable rising numbers of Covid-19 cases, the United States has one of the worst records worldwide when it comes to the pandemic. Nevertheless, the president has continued to behave just as he promised he would in March when there had been only 40 deaths from the virus here and he said, “I don’t take responsibility at all.”

In April, when 50,000 Americans had died, he praised himself and his administration, insisting, “I think we’ve done a great job.” In May, as deaths continued to mount nationwide, he insisted, “We have met the moment and we have prevailed.” In June, he swore the virus was “dying out,” contradicting the views and data of his just-swept-into-the-closet coronavirus task force. In July, he cast the blame for the ongoing disaster on state governors, who, he told the nation, had handled the virus “poorly,” adding, “I supplied everybody.” It was the governors, he assured the public, who had failed to acquire and distribute key supplies, including protective gear and testing supplies.

All told, he’s been a perfect model of deflecting all responsibility, even as the death toll soared over 150,000 with more than four million cases reported nationwide and no end in sight, even as he assured the coronavirus of a splendid future in the U.S. by insisting that all schools reopen this fall (and that the Centers for Disease Control and Prevention back him on that).

In other words, Donald Trump and his team have given lack of accountability a new meaning in America. Their refusal to accept the slightest responsibility for Covid-19’s rampage through this country may seem startling (or simply like our new reality) in a land that has traditionally defined itself as dedicated to democratic governance and the rule of law. It has long seen itself as committed to transparency and justice, through investigations, reports, and checks and balances, notably via the courts and Congress, designed to ensure that its politicians and officials be held responsible for their actions. The essence of democracy — the election — was also the essence of accountability, something whose results Donald Trump recently tried to throw into doubt when it comes to the contest this November.

Still, the loss of accountability isn’t simply a phenomenon of the Trump years. Its erosion has been coming for a long time at what, in retrospect, should seem an alarmingly inexorable pace.

In August 2020, it should be obvious that America, a still titanic (if fading) power, has largely thrown accountability overboard. With that in mind, here’s a little history of how it happened.

The War on Terror

As contemporary historians and political analysts tell it, the decision to go to war in Iraq in the spring of 2003, which cost more than 8,000 American lives and led to more than 200,000 Iraqi deaths, military and civilian, was more than avoidable. It was the result of lies and doctored information engineered to get the U.S. involved in a crucial part of what would soon enough become its “forever wars” across the Greater Middle East and Africa.

As Robert Draper recently reminded us, those in the administration of President George W. Bush who contested information about the presence of weapons of mass destruction in Saddam Hussein’s Iraq were ignored or silenced. Worse yet, torture was used to extract a false confession from senior al-Qaeda member Ibn Sheikh al-Libi regarding the terror organization’s supposed attempts to acquire such weaponry there. Al-Libi’s testimony, later recanted, was used as yet another pretext to launch an invasion that top American officials had long been determined to set in motion.

And it wasn’t just a deceitful decision. It was a thoroughly disastrous one as well. There is today something like a consensus among policy analysts that it was possibly the “biggest mistake in American military history” or, as former Senate Majority Leader Harry Reid (D-NV) put it four years after the invasion, “the worst foreign policy mistake in U.S. history,” supplanting the Vietnam War in the minds of many.

And that raises an obvious question: Who was held accountable for that still unending disaster? Who was charged with the crime of willfully and intentionally taking the nation to war — and a failed war at that — based on manufactured facts? In numerous books, the grim realities of that moment have been laid out clearly. When it comes to any kind of public censure, or trial, or even an official statement of wrongdoing, none was ever forthcoming.

Nor was there any accountability for the policy and practice of torture, “legally” sanctioned then, that took the country back to practices more common in the Middle Ages. (It’s worth noting as well that John Yoo, who wrote the memos authorizing such torture then, is now helping the Trump administration find ways to continue evading checks on the presidency.)

More than a decade ago at TomDispatch, I wrote about how the Bush administration supported such acts at the highest levels. As a result, in the early years of the war on terror, in 20 CIA “black sites,” located in eight countries, the U.S. government used torture, as a Senate Select Intelligence Committee Report of December 2014 would detail, to elicit information and misinformation from dozens of “high-value detainees.”

It should go without saying that torture violates just about every precept of the modern rule of law: the renunciation of adjudication in favor of brutality, the use of dungeon-like chambers and medieval equipment rather than the expertise of intelligence professionals gathering information, and of course the rejection of any conviction that civility and rights are valuable.

Among his first acts on entering the Oval Office, Barack Obama pledged that the United States under his leadership would “not torture.” Nonetheless, the lawyers who wrote the memos legally approving those policies were never held accountable, nor were the Bush administration officials who signed off on them (and had such techniques demonstrated to them in the White House); nor, of course, were the actual torturers and the doctors who advised them in any way censured or criminally charged in American courts.

Indeed, many of their careers only advanced, as they moved into positions as a federal judge, a professor at a prestigious law school, or a well-remunerated author. When suggestions for leveling criminal charges or holding congressional hearings and investigations were raised, the Obama administration decided not to proceed. Attorney General Eric Holder claimed that “the admissible evidence would not be sufficient to obtain and sustain a conviction beyond a reasonable doubt,” while President Obama insisted that the administration should “look forward as opposed to looking backwards.” Accountability was once again abandoned.

And looming over the war on terror, the invasion of Iraq, and those torture policies was a refusal to hold any agency, administration, or anyone at all responsible for failing to stop 9/11 from happening in the first place. The 9/11 Commission Report might have been an initial step in that process, but, as journalist Philip Shenon documented in his book The Commission: The Uncensored History of the 9/11 Investigation, even that inquiry stopped well short of assigning individual responsibility.

Best of TomDispatch: John Dower, Terror Is in the Eye of the Beholder

This article originally appeared at TomDispatch.com. To receive TomDispatch in your inbox three times a week, click here.

Our lives are, of course, our histories, which makes us all, however inadvertently, historians. Part of my own history, my other life — not the TomDispatch one that’s consumed me for the last 14 years — has been editing books. I have no idea how many books I’ve edited since I was in my twenties, but undoubtedly hundreds. Recently, I began rereading War Without Mercy: Race and Power in the Pacific War, perhaps 33 years after I first put pen to paper (in the days before personal computers were commonplace) and started marking up a draft of it for Pantheon Books, where I then worked, and where I later ushered it into the world.

As it happens, however, my history with the author of that book dips significantly deeper into time than that. I first met Pulitzer Prize-winning historian John Dower in perhaps 1968, almost half a century ago. We were both graduate students in Asian studies then, nothing eminent or prize-winning about either of us in an era when so much of our time was swept away by opposition to the Vietnam War. Our lives, our stories, have crossed many times since, and so it was with a little rush of emotion that I opened his book all over again and began reading its very first paragraphs:

“World War Two meant many things to many people.

“To over fifty million men, women, and children, it meant death. To hundreds of millions more in the occupied areas and theaters of combat, the war meant hell on earth: suffering and grief, often with little if any awareness of a cause or reason beyond the terrifying events of the moment…”

That book — on World War II in the Pacific as a brew of almost unbearable racial hatreds, stereotypes, and savagery — would have a real impact in its moment (as, in fact, it still does) and would be followed by other award-winning books on war and violence and how, occasionally, we humans even manage to change and heal after such terrible, obliterating events. John’s work has regularly offered stunning vistas of both horror and implicit hope. He’s an author (and friend) who, to my mind, will always be award-winning. So it was, I have to admit, with a certain strange nostalgia that, at age 72, so many decades after I first touched a manuscript of his, I found myself editing a new one. It proved to be a small, action- and shock-packed volume on American global violence and war-making in these last 75 years. In doing so, I met on the page both my old friend who had once stood with me in opposition to the horror that was America’s war in Indochina and the award-winning historian who has a unique perspective on our past that is deeply needed on this war- and violence-plagued planet of ours.

So many years later, it felt like a personal honor to be editing and then publishing his new work, The Violent American Century: War and Terror Since World War Two, at Dispatch Books. If it’s a capstone work for him, it seemed like something of a capstone for me as well, both as an editor and, like all of us, as a historian of myself. Tom
Memory Loss in the Garden of Violence
How Americans Remember (and Forget) Their Wars
By John Dower

Some years ago, a newspaper article credited a European visitor with the wry observation that Americans are charming because they have such short memories. When it comes to the nation’s wars, however, he was not entirely on target. Americans embrace military histories of the heroic “band of [American] brothers” sort, especially involving World War II. They possess a seemingly boundless appetite for retellings of the Civil War, far and away the country’s most devastating conflict where American war deaths are concerned.

Certain traumatic historical moments such as “the Alamo” and “Pearl Harbor” have become code words — almost mnemonic devices — for reinforcing the remembrance of American victimization at the hands of nefarious antagonists. Thomas Jefferson and his peers actually established the baseline for this in the nation’s founding document, the Declaration of Independence, which enshrines recollection of “the merciless Indian Savages” — a self-righteous demonization that turned out to be boilerplate for a succession of later perceived enemies. “September 11th” has taken its place in this deep-seated invocation of violated innocence, with an intensity bordering on hysteria.

Such “victim consciousness” is not, of course, peculiar to Americans. In Japan after World War II, this phrase — higaisha ishiki in Japanese — became central to leftwing criticism of conservatives who fixated on their country’s war dead and seemed incapable of acknowledging how grievously Imperial Japan had victimized others, millions of Chinese and hundreds of thousands of Koreans foremost among them. When present-day Japanese cabinet members visit Yasukuni Shrine, where the emperor’s deceased soldiers and sailors are venerated, they are stoking victim consciousness and are roundly criticized for doing so by the outside world, including the U.S. media.

Worldwide, war memorials and memorial days ensure preservation of such selective remembrance. My home state of Massachusetts does this to this day by flying the black-and-white “POW-MIA” flag of the Vietnam War at various public places, including Fenway Park, home of the Boston Red Sox — still grieving over those fighting men who were captured or went missing in action and never returned home.

In one form or another, populist nationalisms today are manifestations of acute victim consciousness. Still, the American way of remembering and forgetting its wars is distinctive for several reasons. Geographically, the nation is much more secure than other countries. Alone among major powers, it escaped devastation in World War II, and has been unmatched in wealth and power ever since. Despite panic about Communist threats in the past and Islamist and North Korean threats in the present, the United States has never been seriously imperiled by outside forces. Apart from the Civil War, its war-related fatalities have been tragic but markedly lower than the military and civilian death tolls of other nations, invariably including America’s adversaries.

Asymmetry in the human costs of conflicts involving U.S. forces has been the pattern ever since the decimation of Amerindians and the American conquest of the Philippines between 1899 and 1902. The State Department’s Office of the Historian puts the death toll in the latter war at “over 4,200 American and over 20,000 Filipino combatants,” and proceeds to add that “as many as 200,000 Filipino civilians died from violence, famine, and disease.” (Among other precipitating causes for those noncombatant deaths, U.S. troops shot most of the water buffalo farmers relied on to produce their crops.) Many scholarly accounts now offer higher estimates for Filipino civilian fatalities.

Much the same morbid asymmetry characterizes war-related deaths in World War II, the Korean War, the Vietnam War, the Gulf War of 1991, and the invasions and occupations of Afghanistan and Iraq following September 11, 2001.

Terror Bombing from World War II to Korea and Vietnam to 9/11

While it is natural for people and nations to focus on their own sacrifice and suffering rather than the death and destruction they themselves inflict, in the case of the United States such cognitive astigmatism is backlighted by the country’s abiding sense of being exceptional, not just in power but also in virtue. In paeans to “American exceptionalism,” it is an article of faith that the highest values of Western and Judeo-Christian civilization guide the nation’s conduct — to which Americans add their country’s purportedly unique embrace of democracy, respect for each and every individual, and stalwart defense of a “rules-based” international order.

Such self-congratulation requires and reinforces selective memory. “Terror,” for instance, has become a word applied to others, never to oneself. And yet during World War II, U.S. and British strategic-bombing planners explicitly regarded their firebombing of enemy cities as terror bombing, and identified destroying the morale of noncombatants in enemy territory as necessary and morally acceptable. Shortly after the Allied devastation of the German city of Dresden in February 1945, Winston Churchill, whose bust circulates in and out of the presidential Oval Office in Washington (it is currently in), referred to the “bombing of German cities simply for the sake of increasing the terror, though under other pretexts.”

In the war against Japan, U.S. air forces embraced this practice with an almost gleeful vengeance, pulverizing 64 cities prior to the atomic bombings of Hiroshima and Nagasaki in August 1945. When al-Qaeda’s 19 hijackers crash-bombed the World Trade Center and Pentagon in 2001, however, “terror bombing” aimed at destroying morale was detached from this Anglo-American precedent and relegated to “non-state terrorists.” Simultaneously, targeting innocent civilians was declared to be an atrocity utterly contrary to civilized “Western” values, and prima facie evidence of Islam’s inherent savagery.

The sanctification of the site of the destroyed World Trade Center as “Ground Zero” — a term previously associated with nuclear explosions in general and Hiroshima in particular — reinforced this deft legerdemain in the manipulation of memory. Few if any American public figures recognized or cared that this graphic nomenclature was appropriated from Hiroshima, whose city government puts the number of fatalities from the atomic bombing “by the end of December 1945, when the acute effects of radiation poisoning had largely subsided,” at around 140,000. (The estimated death toll for Nagasaki is 60,000 to 70,000.) The context of those two attacks — and all the firebombings of German and Japanese cities before them — obviously differs greatly from the non-state terrorism and suicide bombings inflicted by today’s terrorists. Nonetheless, “Hiroshima” remains the most telling and troubling symbol of terror bombing in modern times — despite the effectiveness with which, for present and future generations, the post-9/11 “Ground Zero” rhetoric altered the landscape of memory and now connotes American victimization.

Short memory also has erased almost all American recollection of the U.S. extension of terror bombing to Korea and Indochina. Shortly after World War II, the United States Strategic Bombing Survey calculated that Anglo-American air forces in the European theater had dropped 2.7 million tons of bombs, of which 1.36 million tons targeted Germany. In the Pacific theater, total tonnage dropped by Allied planes was 656,400, of which 24% (160,800 tons) was dropped on the home islands of Japan. Of the latter, 104,000 tons “were directed at 66 urban areas.” Shocking at the time, in retrospect these Japanese numbers in particular have come to seem modest when compared to the tonnage of explosives U.S. forces unloaded on Korea and later Vietnam, Cambodia, and Laos.

The official history of the air war in Korea (The United States Air Force in Korea 1950-1953) records that U.S.-led United Nations air forces flew more than one million sorties and, all told, delivered a total of 698,000 tons of ordnance against the enemy. In his 1965 memoir Mission with LeMay, General Curtis LeMay, who directed the strategic bombing of both Japan and Korea, offered this observation: “We burned down just about every city in North and South Korea both… We killed off over a million civilian Koreans and drove several million more from their homes, with the inevitable additional tragedies bound to ensue.”

Other sources place the estimated number of civilian Korean War dead as high as three million, or possibly even more. Dean Rusk, a supporter of the war who later served as secretary of state, recalled that the United States bombed “everything that moved in North Korea, every brick standing on top of another.” In the midst of this “limited war,” U.S. officials also took care to make it clear on several occasions that they had not ruled out using nuclear weapons. This even involved simulated nuclear strikes on North Korea by B-29s operating out of Okinawa in a 1951 operation codenamed Hudson Harbor.

In Indochina, as in the Korean War, targeting “everything that moved” was virtually a mantra among U.S. fighting forces, a kind of password that legitimized indiscriminate slaughter. Nick Turse’s extensively researched recent history of the Vietnam War, for instance, takes its title from a military order to “kill anything that moves.” Documents released by the National Archives in 2004 include a transcript of a 1970 telephone conversation in which Henry Kissinger relayed President Richard Nixon’s orders to launch “a massive bombing campaign in Cambodia. Anything that flies on anything that moves.”

In Laos between 1964 and 1973, the CIA helped direct the heaviest air bombardment per capita in history, unleashing over two million tons of ordnance in the course of 580,000 bombing runs — equivalent to a planeload of bombs every eight minutes for roughly a full decade. This included around 270 million bomblets from cluster bombs. Roughly 10% of the total Laotian population was killed. Despite the devastating effects of this assault, some 80 million of the cluster bomblets dropped failed to detonate, leaving the ravaged country littered with deadly unexploded ordnance to the present day.

The payload of bombs unloaded on Vietnam, Cambodia, and Laos between the mid-1960s and 1973 is commonly reckoned to have been between seven and eight million tons — well over 40 times the tonnage dropped on the Japanese home islands in World War II. Estimates of total deaths vary, but are all exceedingly high. In a Washington Post article in 2012, John Tirman noted that “by several scholarly estimates, Vietnamese military and civilian deaths ranged from 1.5 million to 3.8 million, with the U.S.-led campaign in Cambodia resulting in 600,000 to 800,000 deaths, and Laotian war mortality estimated at about 1 million.”

On the American side, the Department of Veterans Affairs places battle deaths in the Korean War at 33,739. As of Memorial Day 2015, the long wall of the deeply moving Vietnam Veterans Memorial in Washington was inscribed with the names of 58,307 American military personnel killed between 1957 and 1975, the great majority of them from 1965 on. This includes approximately 1,200 men listed as missing (MIA, POW, etc.), the lost fighting men whose flag of remembrance still flies over Fenway Park.

North Korea and the Cracked Mirror of Nuclear War

Today, Americans generally remember Vietnam vaguely, and Cambodia and Laos not at all. (The inaccurate label “Vietnam War” expedited this latter erasure.) The Korean War, too, has been called “the forgotten war,” although a veterans memorial in Washington, D.C., was finally dedicated to it in 1995, 42 years after the armistice that suspended the conflict. By contrast, Koreans have not forgotten. This is especially true in North Korea, where the enormous death and destruction suffered between 1950 and 1953 is kept alive through endless official iterations of remembrance — and this, in turn, is coupled with a relentless propaganda campaign calling attention to Cold War and post-Cold War U.S. nuclear intimidation. This intense exercise in remembering rather than forgetting goes far to explain the current nuclear saber-rattling of North Korea’s leader Kim Jong-un.

With only a slight stretch of the imagination, it is possible to see cracked mirror images in the nuclear behavior and brinksmanship of American presidents and North Korea’s dictatorial dynastic leadership. What this unnerving looking glass reflects is possible madness, or feigned madness, coupled with possible nuclear conflict, accidental or otherwise.

To Americans and much of the rest of the world, Kim Jong-un seems irrational, even seriously deranged. (Just pair his name with “insane” or “crazy” in a Google search.) Yet in rattling his minuscule nuclear quiver, he is really joining the long-established game of “nuclear deterrence,” and practicing what is known among American strategists as the “madman theory.” The latter term is most famously associated with Richard Nixon and Henry Kissinger during the Vietnam War, but in fact it is more or less embedded in U.S. nuclear game plans. As rearticulated in “Essentials of Post-Cold War Deterrence,” a secret policy document drafted by a subcommittee in the U.S. Strategic Command in 1995 (four years after the demise of the Soviet Union), the madman theory posits that the essence of effective nuclear deterrence is to induce “fear” and “terror” in the mind of an adversary, to which end “it hurts to portray ourselves as too fully rational and cool-headed.”

When Kim Jong-un plays this game, he is simultaneously ridiculed and feared to be truly demented. When the same game is played by their own leaders and nuclear priesthood, however, Americans have been conditioned to see rational actors at their cunning best.

Terror, it seems, in the twenty-first century, as in the twentieth, is in the eye of the beholder.

John W. Dower is professor emeritus of history at the Massachusetts Institute of Technology. His many books include War Without Mercy: Race and Power in the Pacific War and Embracing Defeat: Japan in the Wake of World War Two, which have won numerous prizes including the Pulitzer, the National Book Award, and the National Book Critics Circle award. His latest book, The Violent American Century: War and Terror Since World War Two (Dispatch Books), has just been published.


Tomgram: John Feffer, The No-Trust World

This article originally appeared at TomDispatch.com. To receive TomDispatch in your inbox three times a week, click here.

It wasn’t magic. It wasn’t astrology. Not faintly. But it was in the stars. No, not this specific pandemic, but a pandemic. In fact, back in 2010, TomDispatch ran a piece by John Barry on that very subject. He’s the expert on the “Spanish Flu,” the 1918-1919 pandemic that killed an estimated 50 million or more people on a significantly less populated planet. His 2005 book, The Great Influenza: The Story of the Deadliest Pandemic in History, has fittingly returned to the bestseller lists in the Covid-19 moment. A decade ago, his TomDispatch post “How Prepared Are We for the Next Great Flu Breakout?” concluded all too presciently this way: “Because H5N1 has not become a pandemic and H1N1 turned out to be mild, the idea that influenza is no longer a threat has become pervasive. Everything that happened in 2009 suggests that, if a severe outbreak comes again, failure to improve on that response will threaten chaos and magnify the terror, the economic impact, and the death toll. And it will come again.”

Yes, the nature of “it” may have been unpredictable, but a pandemic wasn’t. That was a decade ago and something like the Spanish Flu redux was already all too imaginable then. As Politico reported in March, it was so imaginable that, seven days before Donald Trump entered the Oval Office, Obama administration officials walked at least 30 members of his team, including future cabinet members, through a horrific pandemic scenario for 2017 in which a virus worse than the Spanish flu, let loose in Asia, began to spread across the planet.

Predictably enough, the Trump administration responded to this nightmare by “largely dismantling government units that were designed to protect against pandemics.” And then, of course, they were blindsided by what, to any virologist or epidemiologist, was all too predictable. With only election 2020 on their minds, the president and his crew suddenly faced their own version of the interloper from hell, Covid-19, and promptly ducked. They tried to push responsibility for dealing with it off on the states, even as they did their best to imagine it away and, in the process, consigned staggering numbers of Americans to an early grave. Thanks in part to such ignorant incompetents running the country, we now find ourselves in a version of hell (even if without the flames).

As TomDispatch regular John Feffer, weekly columnist for Foreign Policy in Focus and author of the Splinterlands series of dystopian novels, suggests today, The Donald and his crew might be considered the Great Unwinders on a previously globalized planet that looks to be coming apart at the seams. What that could possibly mean I leave him to explore. Tom
This Changes Everything (or Nothing)
How Covid-19 Could Upend Geopolitics
By John Feffer

I don’t trust you.

Don’t take it personally. It doesn’t matter whether you’re a friend or a stranger. I don’t care about your identity or your politics, where you work or if you work, whether you wear a mask or carry a gun.

I don’t trust you because you are, for the time being, a potential carrier of a deadly virus. You don’t have any symptoms? Maybe you’re an asymptomatic superspreader. Show me your negative test results and I’ll still have my doubts. I have no idea what you’ve been up to between taking the test and receiving the results. And can we really trust that the test is accurate?

Frankly, you shouldn’t trust me for the same reasons. I’m not even sure that I can trust myself. Didn’t I just touch my face at the supermarket after palpating the avocados?

I’m learning to live with this mistrust. I’m keeping my distance from other people. I’m wearing my mask. I’m washing my hands. I’m staying far away from bars.

I’m not sure, however, that society can live with this level of mistrust. Let’s face it: trust makes the world go around. Protests break out when our faith in people or institutions is violated: when we can’t trust the police (#BlackLivesMatter), can’t trust male colleagues (#MeToo), can’t trust the economic system to operate with a modicum of fairness (#OccupyWallStreet), or can’t trust our government to do, well, anything properly (#notmypresident).

Now, throw a silent, hidden killer into this combustible mix of mistrust, anger, and dismay. It’s enough to tear a country apart, to set neighbor against neighbor and governor against governor, to precipitate a civil war between the masked and the unmasked.

Such problems only multiply at the global level where mistrust already permeates the system — military conflicts, trade wars, tussles over migration and corruption. Of course, there’s also been enough trust to keep the global economy going, diplomats negotiating, international organizations functioning, and the planet from spinning out of control. But the pandemic may just tip this known world off its axis.

I’m well aware of the ongoing debate between the “not much” and “everything” factions. Once a vaccine knocks it out of our system, the coronavirus might not have much lasting effect on our world. Even without a vaccine, people can’t wait to get back to normal life by jumping into pools, heading to the movie theater, attending parties — even in the United States where cases continue to rise dramatically. The flu epidemic of 1918-1919, which is believed to have killed at least 50 million people, didn’t fundamentally change everyday life, aside from giving a boost to both alternative and socialized medicine. That flu passed out of mind and into history and so, of course, might Covid-19.

Or, just as the Black Death in the fourteenth century separated the medieval world from all that followed, this pandemic might draw a thick before-and-after line through our history. Let’s imagine that this novel virus keeps circulating and recirculating, that no one acquires permanent immunity, that it becomes a nasty new addition to the cold season except that it just happens to kill a couple of people out of every hundred who get it. This new normal would certainly be better than if Ebola, with a 50% case fatality rate if untreated, became a perennial risk everywhere. But even with a fatality rate in the low single digits, Covid-19 would necessarily change everything.

The media is full of speculation about what a periodic pandemic future will look like. The end of theater and spectator sports. The institutionalization of distance learning. The death of offices and brick-and-mortar retail.

But let’s take a look beyond that — at the even bigger picture. Let’s consider for a moment the impact of this new, industrial-strength mistrust on international relations.

The Future of the Nation-State

Let’s say you live in a country where the government responded quickly and competently to Covid-19. Let’s say that your government established a reliable testing, contact tracing, and quarantine system. It either closed down the economy for a painful but short period or its system of testing was so good that it didn’t even need to shut everything down. Right now, your life is returning to some semblance of normal.

Lucky you.

The rest of us live in the United States. Or Brazil. Or Russia. Or India. In these countries, the governments have proven incapable of fulfilling the most important function of the state: protecting the lives of their citizens. While most of Europe and much of East Asia have suppressed the pandemic sufficiently to restart their economies, Covid-19 continues to rage out of control in those parts of the world that, not coincidentally, are also headed by democratically elected right-wing autocrats.

In these incompetently run countries, citizens have very good reason to mistrust their governments. In the United States, for instance, the Trump administration botched testing, failed to coordinate lockdowns, removed oversight from the bailouts, and pushed to reopen the economy over the objections of public-health experts. In the latest sign of early-onset dementia for the Trump administration, White House Press Secretary Kayleigh McEnany declared this month that “science should not stand in the way” of reopening schools in the fall.

Voters, of course, could boot Trump out in November and, assuming he actually leaves the White House, restore some measure of sanity to public affairs. But the pandemic is contributing to an already overwhelming erosion of confidence in national institutions. Even before the virus struck, in its 2018 Trust Barometer the public relations firm Edelman registered an unprecedented drop in public trust connected to… what else?… the election of Trump. “The collapse of trust in the U.S. is driven by a staggering lack of faith in government, which fell 14 points to 33% among the general population,” the report noted.

Yemen: A Torrent of Suffering in a Time of Siege – Kathy Kelly

July 28, 2020

“When evil-doing comes like falling rain, nobody calls out ‘stop!’
When crimes begin to pile up they become invisible. When sufferings become unendurable, the cries are no longer heard. The cries, too, fall like rain in summer.”

—  Bertolt Brecht

In war-torn Yemen, the crimes pile up. Children who bear no responsibility for governance or warfare endure the punishment. In 2018, UNICEF said the war had made Yemen a living hell for children.

Tomgram: William Astore, Thinking About the Unthinkable (2020-Style)

He sent what House Speaker Nancy Pelosi called his “unidentified storm troopers” togged out like soldiers in a war zone onto streets filled with protesters in Portland, Oregon. Those camouflage-clad federal law enforcement agents were evidently from the Department of Homeland Security’s Federal Protective Service and the Customs and Border Protection agency. Soon, hundreds of them are evidently going to “surge” — a term that should sound eerily familiar — into Chicago and other cities run by Democratic mayors. In such a fashion, Donald Trump is quite literally bringing this country’s wars home. Speaking with reporters in the Oval Office, he recently described everyday violence in Chicago as “worse than Afghanistan, by far.” He was talking about the country the U.S. invaded in 2001 and in which it hasn’t stopped fighting ever since, a land where more than 100,000 civilians reportedly died violently between 2010 and 2019. By now, violence in Chicago (which is indeed grim) has, in the mind of the Great Confabulator, become “worse than anything anyone has ever seen” and so worthy of yet more militarized chaos.

Of course, in speaking of such violence, the president clearly wasn’t talking about Christopher David’s broken bones. That Navy veteran, having read of unidentified federal agents snatching protesters off Portland’s streets in unmarked vans, took a bus to the city’s nighttime protests. He wanted to ask such agents personally how they could justify their actions in terms of the oath they took to support the Constitution. For doing just that, they beat and pepper-sprayed him. Now, the president who claimed he would end all American wars (but hasn’t faintly done so) has offered a footnote to that promise. Admittedly, he’s only recently agreed, so it seems, to leave at least 4,000 American troops (and god knows how many private contractors) in Afghanistan beyond the November election, while U.S. air strikes there continue into what will be their 19th year. Now, however, he’s stoking violence at home as well in search of an issue to mobilize and strengthen his waning support in the upcoming election.

In other words, he’s giving the very idea of our wars coming home new meaning. As retired Air Force lieutenant colonel, historian, and TomDispatch regular William Astore suggests today, this country’s “forever wars” have become a kind of global pandemic of their own. It tells you all you need to know about this country in July 2020 that, even as congressional Democrats and Republicans fight over what kind of new bill to pass to help coronavirus-riven America, another bill will face no such issues in Congress. I’m thinking of the one that Republican Senator James Inhofe has labeled “the most important bill of the year”: to fund the U.S. military (and the military-industrial complex that goes with it). Oh, wait, unless the president decides to veto it because a mandate may be included in it to remove the names of Confederate generals from U.S. military bases.

Really, can you imagine a world in more of a pandemic mess than this one? Well, let Astore take a shot at it. Tom
Killing Democracy in America
The Military-Industrial Complex as a Cytokine Storm
By William J. Astore

The phrase “thinking about the unthinkable” has always been associated with the unthinkable cataclysm of a nuclear war, and rightly so. Lately, though, I’ve been pondering another kind of unthinkable scenario, nearly as nightmarish (at least for a democracy) as a thermonuclear Armageddon, but one that’s been rolling out in far slower motion: that America’s war on terror never ends because it’s far more convenient for America’s leaders to keep it going — until, that is, it tears apart anything we ever imagined as democracy.

I fear that it either can’t or won’t end because, as Martin Luther King, Jr., pointed out in 1967 during the Vietnam War, the United States remains the world’s greatest purveyor of violence — and nothing in this century, the one he didn’t live to see, has faintly proved him wrong. Considered another way, Washington should be classified as the planet’s most committed arsonist, regularly setting or fanning the flames of fires globally from Libya to Iraq, Somalia to Afghanistan, Syria to — dare I say it — Iran, in some quite imaginable future, even as our leaders invariably boast of having the world’s greatest firefighters (also known as the U.S. military).

Scenarios of perpetual war haunt my thoughts. For a healthy democracy, there should be few things more unthinkable than never-ending conflict, that steady drip-drip of death and destruction that drives militarism, reinforces authoritarianism, and facilitates disaster capitalism. In 1795, James Madison warned Americans that war of that sort would presage the slow death of freedom and representative government. His prediction seems all too relevant in a world in which, year after year, this country continues to engage in needless wars that have nothing to do with national defense.

You Wage War Long, You Wage It Wrong

To cite one example of needless war from the last century, consider America’s horrendous years of fighting in Vietnam and a critical lesson drawn firsthand from that conflict by reporter Jonathan Schell. “In Vietnam,” he noted, “I learned about the capacity of the human mind to build a model of experience that screens out even very dramatic and obvious realities.” As a young journalist covering the war, Schell saw that the U.S. was losing, even as its military was destroying startlingly large areas of South Vietnam in the name of saving it from communism. Yet America’s leaders, the “best and brightest” of the era, almost to a man refused to see that all of what passed for realism in their world, when it came to that war, was nothing short of a first-class lie.

Why? Because believing is seeing and they desperately wanted to believe that they were the good guys, as well as the most powerful guys on the planet. America was winning, it practically went without saying, because it had to be. They were infected by their own version of an all-American victory culture, blinded by a sense of this country’s obvious destiny: to be the most exceptional and exceptionally triumphant nation on this planet.

As it happened, it was far more difficult for grunts on the ground to deny the reality of what was happening — that they were fighting and dying in a senseless war. As a result, especially after the shock of the enemy’s Tet Offensive early in 1968, escalating protests within the military (and among veterans at home) together with massive antiwar demonstrations finally helped put the brakes on that war. Not, however, before more than 58,000 American troops had died, along with millions of Vietnamese, Cambodians, and Laotians.

In the end, the war in Indochina was arguably too costly, messy, and futile to continue. But never underestimate the military-industrial complex, especially when it comes to editing or denying reality, while being eternally over-funded for that very reality. It’s a trait the complex has shared with politicians of both parties. Don’t forget, for instance, the way President Ronald Reagan reedited that disastrous conflict into a “noble cause” in the 1980s. And give him credit! That was no small thing to sell to an American public that had already lived through such a war. By the way, tell me something about that Reaganesque moment doesn’t sound vaguely familiar almost four decades later when our very own “wartime president” long ago declared victory in the “war” on Covid-19, even as the death toll from that virus approaches 150,000 in the homeland.

In the meantime, the military-industrial complex has mastered the long con of the no-win war.

Speaking of Things That Should Be Torn Down

I lean more toward moving offensive monuments out of central squares and providing context and explanation in less prominent locations, as well as favoring the creation of numerous non-offensive public artworks. But if you’re going to tear anything down (or blast anything into outer space), shouldn’t the bust of Wernher von Braun in Huntsville, Alabama, be considered for inclusion on the list?

Out of a long list of major wars there are only a few the United States claims to have ever won. One of those is the U.S. Civil War, from which monuments to the losers later sprouted up like toxic mushrooms. Now they’re coming down. Another, although principally won by the Soviet Union, was World War II. Some of the losers of that one also have monuments in the United States.

The Confederate monuments were put up in the cause of racism. The celebrations of Nazis in Huntsville glorify, not racism, but the creation of the high-tech weaponry of war, which is only offensive if you notice who gets bombed or if you object to murdering anybody.

But we’re not dealing here with a view toward truth, reconciliation, and rehabilitation. The bust of Von Braun — or for that matter the U.S. postage stamp of him — is not meant to say: “Yes, this man used slave labor to build weaponry for the Nazis. He and his colleagues fit right into white Huntsville in 1950, from which point on they produced horrible murderous weaponry to kill only the proper people who truly needed killing, plus rockets that went to the moon thereby proving that the Soviets stank like doodoo — na – na – na – NA – na!”

On the contrary, naming things around Huntsville for Von Braun is a way to say “Thou shalt maintain a steadfast ignorance about what this man and his colleagues did in Germany, and squint hard when viewing what they contributed to in places like Vietnam. These people brought federal dollars and symphony orchestras and sophisticated culture to our backwater, and they understood our racist ways as only Nazis could. Remember, we still had slavery and worse in Alabama right up until World War II.”

Look at this screenshot of the website of the rocket museum in Huntsville:

Why does this museum have a biergarten? Nobody would guess it was to celebrate Nazis. Any explanation uses only the word “Germans.” Look at how a website for Alabama writes about the great Von Braun’s former house and memorabilia. Look how the Chattanooga Times Free Press writes about a tourist pilgrimage to all the Huntsville sites sanctified by Von Braun. Never a critical or vaguely questioning word anywhere. No discussion of second chances — rather, enforced amnesia.

After World War II, the U.S. military hired sixteen hundred former Nazi scientists and doctors, including some of Adolf Hitler’s closest collaborators, including men responsible for murder, slavery, and human experimentation, including men convicted of war crimes, men acquitted of war crimes, and men who never stood trial. Some of the Nazis tried at Nuremberg had already been working for the U.S. in either Germany or the U.S. prior to the trials. Some were protected from their past by the U.S. government for years, as they lived and worked in Boston Harbor, Long Island, Maryland, Ohio, Texas, Alabama, and elsewhere, or were flown by the U.S. government to Argentina to protect them from prosecution. Some trial transcripts were classified in their entirety to avoid exposing the pasts of important U.S. scientists. Some of the Nazis brought over were frauds who had passed themselves off as scientists, some of whom subsequently learned their fields while working for the U.S. military.

The U.S. occupiers of Germany after World War II declared that all military research in Germany was to cease, as part of the process of denazification. Yet that research went on and expanded in secret, under U.S. authority, both in Germany and in the United States, as part of a process that it’s possible to view as nazification. Not only scientists were hired. Former Nazi spies, most of them former S.S., were hired by the U.S. in post-war Germany to spy on — and torture — Soviets.

The U.S. military shifted in numerous ways when former Nazis were put into prominent positions. It was Nazi rocket scientists who proposed placing nuclear bombs on rockets and began developing the intercontinental ballistic missile. It was Nazi engineers, who had designed Hitler’s bunker beneath Berlin, who now designed underground fortresses for the U.S. government in the Catoctin and Blue Ridge Mountains. Known Nazi liars were employed by the U.S. military to draft classified intelligence briefs falsely hyping the Soviet menace. Nazi scientists developed U.S. chemical and biological weapons programs, bringing over their knowledge of tabun and sarin, not to mention thalidomide — and their eagerness for human experimentation, which the U.S. military and the newly created CIA readily engaged in on a major scale. Every bizarre and gruesome notion of how a person might be assassinated or an army immobilized was of interest to their research. New weapons were developed, including VX and Agent Orange. A new drive to visit and weaponize outer space was created, and former Nazis were put in charge of a new agency called NASA.

Permanent war thinking, limitless war thinking, and creative war thinking in which science and technology overshadowed death and suffering, all went mainstream. When a former Nazi spoke to a women’s luncheon at the Rochester Junior Chamber of Commerce in 1953, the event’s headline was “Buzz Bomb Mastermind to Address Jaycees Today.” That doesn’t sound terribly odd to us, but might have shocked anyone living in the United States anytime prior to World War II. Watch this Walt Disney television program featuring a former Nazi who worked slaves to death in a cave building rockets. Guess who it is.

Before long, President Dwight Eisenhower would be lamenting that “the total influence — economic, political, even spiritual — is felt in every city, every State house, every office of the Federal government.” Eisenhower was not referring to Nazism but to the power of the military-industrial complex. Yet, when asked whom he had in mind in remarking in the same speech that “public policy could itself become the captive of a scientific-technological elite,” Eisenhower named two scientists, one of them the former Nazi in the Disney video linked above.

The decision to inject 1,600 of Hitler’s scientific-technological elite into the U.S. military was driven by fears of the USSR, both reasonable and the result of fraudulent fear mongering. The decision evolved over time and was the product of many misguided minds. But the buck stopped with President Harry S Truman. Henry Wallace, Truman’s predecessor as vice-president, who we like to imagine would have guided the world in a better direction than Truman did as president, actually pushed Truman to hire the Nazis as a jobs program. It would be good for American industry, said our progressive hero. Truman’s subordinates debated, but Truman decided. As bits of Operation Paperclip became known, the Federation of American Scientists, Albert Einstein, and others urged Truman to end it. Nuclear physicist Hans Bethe and his colleague Henri Sack asked Truman:

“Did the fact that the Germans might save the nation millions of dollars imply that permanent residence and citizenship could be bought? Could the United States count on [the German scientists] to work for peace when their indoctrinated hatred against the Russians might contribute to increase the divergence between the great powers? Had the war been fought to allow Nazi ideology to creep into our educational and scientific institutions by the back door? Do we want science at any price?”

In 1947 Operation Paperclip, still rather small, was in danger of being terminated. Instead, Truman transformed the U.S. military with the National Security Act, and created the best ally that Operation Paperclip could want: the CIA. Now the program took off, intentionally and willfully, with the full knowledge and understanding of the same U.S. President who had declared as a senator that if the Russians were winning the U.S. should help the Germans, and vice versa, to ensure that the most people possible died, the same president who viciously and pointlessly dropped two nuclear bombs on Japanese cities, the same president who brought us the war on Korea, the war without declaration, the secret wars, the permanent expanded empire of bases, the military secrecy in all matters, the imperial presidency, and the military-industrial complex. The U.S. Chemical Warfare Service took up the study of German chemical weapons at the end of the war as a means to continue in existence. George Merck both diagnosed biological weapons threats for the military and sold the military vaccines to handle them. War was business and business was going to be good for a long time to come.

But how big a change did the United States go through after World War II, and how much of it can be credited to Operation Paperclip? Isn’t a government that would give immunity to both Nazi and Japanese war criminals in order to learn their criminal ways already in a bad place? As one of the defendants argued in trial at Nuremberg, the U.S. had already engaged in its own experiments on humans using almost identical justifications to those offered by the Nazis. If that defendant had been aware, he could have pointed out that the U.S. was in that very moment engaged in such experiments in Guatemala. The Nazis had learned some of their eugenics and other nasty inclinations from Americans. Some of the Paperclip scientists had worked in the U.S. before the war, as many Americans had worked in Germany. These were not isolated worlds.

Looking beyond the secondary, scandalous, and sadistic crimes of war, what about the crime of war itself? We picture the United States as less guilty because it maneuvered the Japanese into the first attack, and because it did prosecute some of the war’s losers. But an impartial trial would have prosecuted Americans too. Bombs dropped on civilians killed and injured and destroyed more than any concentration camps — camps that in Germany had been modeled in part after U.S. camps for Native Americans. Is it possible that Nazi scientists blended into the U.S. military so well because an institution that had already done what it had done to the Philippines was not in all that much need of nazification?

Yet, somehow, we think of the firebombing of Japanese cities and the complete leveling of German cities as less offensive than the hiring of Nazi scientists. But what is it that offends us about Nazi scientists? I don’t think it should be that they engaged in mass murder for the wrong side, an error balanced out in some minds by their later work for mass murder by the right side. And I don’t think it should be entirely that they engaged in sick human experimentation and forced labor. I do think those actions should offend us. But so should the construction of rockets that take thousands of lives. And it should offend us whomever it’s done for.

It’s curious to imagine a civilized society somewhere on earth some years from now. Would an immigrant with a past in the U.S. military be able to find a job? Would a review be needed? Had they tortured prisoners? Had they drone-struck children? Had they leveled houses or shot up civilians in any number of countries? Had they used cluster bombs? Depleted uranium? White phosphorous? Had they ever worked in the U.S. prison system? Immigrant detention system? Death row? How thorough a review would be needed? Would there be some level of just-following-orders behavior that would be deemed acceptable? Would it matter, not just what the person had done, but how they thought about the world?

I’m not against giving anyone a second chance. But where is the history of Operation Paperclip on the U.S. landscape? Where are the historical markers and memorials? When we talk of tearing down monuments, it’s an act of historical education, not historical erasure, that we should be after.