Tomgram: Karen Greenberg, Can the Pandemic Bring Accountability Back to This Country?


Yes, it’s possible that a vaccine for Covid-19 could be available by spring. I mean, I wouldn’t put my money on it, but it seems at least conceivable. Here’s something I would put a few bucks on, though: when a vaccine appears, the Trump administration will have so botched things that…

Best of TomDispatch: John Dower, Terror Is in the Eye of the Beholder


Our lives are, of course, our histories, which makes us all, however inadvertently, historians. Part of my own history, my other life — not the TomDispatch one that’s consumed me for the last 14 years — has been editing books. I have no idea how many books I’ve edited since I was in my twenties, but undoubtedly hundreds. Recently, I began rereading…

Tomgram: John Feffer, The No-Trust World


It wasn’t magic. It wasn’t astrology. Not faintly. But it was in the stars. No, not this specific pandemic, but a pandemic. In fact, back in 2010, TomDispatch ran a piece by John Barry on that very subject. He’s the expert on the “Spanish Flu,” the 1918-1919 pandemic that killed an estimated 50…

Yemen: A Torrent of Suffering in a Time of Siege – Kathy Kelly

July 28, 2020

“When evil-doing comes like falling rain, nobody calls out ‘stop!’

When crimes begin to pile up they become invisible. When sufferings become unendurable, the cries are no longer heard. The cries, too, fall like rain in summer.”

— Bertolt Brecht

In war-torn Yemen, the crimes pile up. Children who bear no responsibility for governance or warfare endure the punishment. In 2018, UNICEF said the war made Yemen a living…

Tomgram: William Astore, Thinking About the Unthinkable (2020-Style)


He sent what House Speaker Nancy Pelosi called his “unidentified storm troopers” togged out like soldiers in a war zone onto streets filled with protesters in Portland, Oregon. Those camouflage-clad federal law enforcement agents were evidently from the Department of Homeland Security’s Federal Protective Service and the Customs and Border Protection agency. Soon, hundreds of them are evidently going to “surge” — a term that should sound eerily familiar — into Chicago and other cities run by Democratic mayors. In such a fashion, Donald Trump is quite literally bringing this country’s wars home. Speaking with reporters in the Oval Office, he recently described everyday violence in Chicago as “worse than Afghanistan, by far.” He was talking about the country the U.S. invaded in 2001 and in which it hasn’t stopped fighting ever since, a land where more than 100,000 civilians reportedly died violently between 2010 and 2019. By now, violence in Chicago (which is indeed grim) has, in the mind of the Great Confabulator, become “worse than anything anyone has ever seen” and so worthy of yet more militarized chaos.

Of course, in speaking of such violence, the president clearly wasn’t talking about Christopher David’s broken bones. That Navy veteran, having read of unidentified federal agents snatching protesters off Portland’s streets in unmarked vans, took a bus to the city’s nighttime protests. He wanted to ask such agents personally how they could justify their actions in terms of the oath they took to support the Constitution. For doing just that, they beat and pepper-sprayed him. Now, the president who claimed he would end all American wars (but hasn’t faintly done so) has offered a footnote to that promise. Admittedly, he’s only recently agreed, so it seems, to leave at least 4,000 American troops (and god knows how many private contractors) in Afghanistan beyond the November election, while U.S. air strikes there continue into what will be their 19th year. Now, however, he’s stoking violence at home as well in search of an issue to mobilize and strengthen his waning support in the upcoming election.

In other words, he’s giving the very idea of our wars coming home new meaning. As retired Air Force lieutenant colonel, historian, and TomDispatch regular William Astore suggests today, this country’s “forever wars” have become a kind of global pandemic of their own. It tells you all you need to know about this country in July 2020 that, even as congressional Democrats and Republicans fight over what kind of new bill to pass to help coronavirus-riven America, another bill will face no such issues in Congress. I’m thinking of the one that Republican Senator James Inhofe has labeled “the most important bill of the year”: to fund the U.S. military (and the military-industrial complex that goes with it). Oh, wait, unless the president decides to veto it because a mandate may be included in it to remove the names of Confederate generals from U.S. military bases.

Really, can you imagine a world in more of a pandemic mess than this one? Well, let Astore take a shot at it. Tom
Killing Democracy in America
The Military-Industrial Complex as a Cytokine Storm
By William J. Astore

The phrase “thinking about the unthinkable” has always been associated with the unthinkable cataclysm of a nuclear war, and rightly so. Lately, though, I’ve been pondering another kind of unthinkable scenario, nearly as nightmarish (at least for a democracy) as a thermonuclear Armageddon, but one that’s been rolling out in far slower motion: that America’s war on terror never ends because it’s far more convenient for America’s leaders to keep it going — until, that is, it tears apart anything we ever imagined as democracy.

I fear that it either can’t or won’t end because, as Martin Luther King, Jr., pointed out in 1967 during the Vietnam War, the United States remains the world’s greatest purveyor of violence — and nothing in this century, the one he didn’t live to see, has faintly proved him wrong. Considered another way, Washington should be classified as the planet’s most committed arsonist, regularly setting or fanning the flames of fires globally from Libya to Iraq, Somalia to Afghanistan, Syria to — dare I say it — Iran in some quite imaginable future, even as our leaders invariably boast of having the world’s greatest firefighters (also known as the U.S. military).

Scenarios of perpetual war haunt my thoughts. For a healthy democracy, there should be few things more unthinkable than never-ending conflict, that steady drip-drip of death and destruction that drives militarism, reinforces authoritarianism, and facilitates disaster capitalism. In 1795, James Madison warned Americans that war of that sort would presage the slow death of freedom and representative government. His prediction seems all too relevant in a world in which, year after year, this country continues to engage in needless wars that have nothing to do with national defense.

You Wage War Long, You Wage It Wrong

To cite one example of needless war from the last century, consider America’s horrendous years of fighting in Vietnam and a critical lesson drawn firsthand from that conflict by reporter Jonathan Schell. “In Vietnam,” he noted, “I learned about the capacity of the human mind to build a model of experience that screens out even very dramatic and obvious realities.” As a young journalist covering the war, Schell saw that the U.S. was losing, even as its military was destroying startlingly large areas of South Vietnam in the name of saving it from communism. Yet America’s leaders, the “best and brightest” of the era, almost to a man refused to see that all of what passed for realism in their world, when it came to that war, was nothing short of a first-class lie.

Why? Because believing is seeing and they desperately wanted to believe that they were the good guys, as well as the most powerful guys on the planet. America was winning, it practically went without saying, because it had to be. They were infected by their own version of an all-American victory culture, blinded by a sense of this country’s obvious destiny: to be the most exceptional and exceptionally triumphant nation on this planet.

As it happened, it was far more difficult for grunts on the ground to deny the reality of what was happening — that they were fighting and dying in a senseless war. As a result, especially after the shock of the enemy’s Tet Offensive early in 1968, escalating protests within the military (and among veterans at home) together with massive antiwar demonstrations finally helped put the brakes on that war. Not before, however, more than 58,000 American troops died, along with millions of Vietnamese, Cambodians, and Laotians.

In the end, the war in Indochina was arguably too costly, messy, and futile to continue. But never underestimate the military-industrial complex, especially when it comes to editing or denying reality, while being eternally over-funded for that very reality. It’s a trait the complex has shared with politicians of both parties. Don’t forget, for instance, the way President Ronald Reagan reedited that disastrous conflict into a “noble cause” in the 1980s. And give him credit! That was no small thing to sell to an American public that had already lived through such a war. By the way, tell me that something about that Reaganesque moment doesn’t sound vaguely familiar almost four decades later, when our very own “wartime president” long ago declared victory in the “war” on Covid-19, even as the death toll from that virus approaches 150,000 in the homeland.

In the meantime, the military-industrial complex has mastered the long con of the no-win…

Speaking of Things That Should Be Torn Down

I lean more toward moving offensive monuments out of central squares and providing context and explanation in less prominent locations, as well as favoring the creation of numerous non-offensive public artworks. But if you’re going to tear anything down (or blast anything into outer space), shouldn’t the bust of Wernher von Braun in Huntsville, Alabama, be considered for inclusion on the list?

Out of a long list of major wars there are only a few the United States claims to have ever won. One of those is the U.S. Civil War, from which monuments to the losers later sprouted up like toxic mushrooms. Now they’re coming down. Another, although principally won by the Soviet Union, was World War II. Some of the losers of that one also have monuments in the United States.

The Confederate monuments were put up in the cause of racism. The celebrations of Nazis in Huntsville glorify, not racism, but the creation of the high-tech weaponry of war, which is only offensive if you notice who gets bombed or if you object to murdering anybody.

But we’re not dealing here with a view toward truth, reconciliation, and rehabilitation. The bust of von Braun — or for that matter the U.S. postage stamp of him — is not meant to say: “Yes, this man used slave labor to build weaponry for the Nazis. He and his colleagues fit right into white Huntsville in 1950, from which point on they produced horrible murderous weaponry to kill only the proper people who truly needed killing, plus rockets that went to the moon thereby proving that the Soviets stank like doodoo — na – na – na – NA – na!”

On the contrary, naming things around Huntsville for von Braun is a way to say “Thou shalt maintain a steadfast ignorance about what this man and his colleagues did in Germany, and squint hard when viewing what they contributed to in places like Vietnam. These people brought federal dollars and symphony orchestras and sophisticated culture to our backwater, and they understood our racist ways as only Nazis could. Remember, we still had slavery and worse in Alabama right up until World War II.”

Look at this screenshot of the website of the rocket museum in Huntsville:

Why does this museum have a biergarten? Nobody would guess it was to celebrate Nazis. Any explanation uses only the word “Germans.” Look at how a website for Alabama writes about the great von Braun’s former house and memorabilia. Look how the Chattanooga Times Free Press writes about a tourist pilgrimage to all the Huntsville sites sanctified by von Braun. Never a critical or vaguely questioning word anywhere. No discussion of second chances — rather, enforced amnesia.

After World War II, the U.S. military hired sixteen hundred former Nazi scientists and doctors, including some of Adolf Hitler’s closest collaborators, including men responsible for murder, slavery, and human experimentation, including men convicted of war crimes, men acquitted of war crimes, and men who never stood trial. Some of the Nazis tried at Nuremberg had already been working for the U.S. in either Germany or the U.S. prior to the trials. Some were protected from their past by the U.S. government for years, as they lived and worked in Boston Harbor, Long Island, Maryland, Ohio, Texas, Alabama, and elsewhere, or were flown by the U.S. government to Argentina to protect them from prosecution. Some trial transcripts were classified in their entirety to avoid exposing the pasts of important U.S. scientists. Some of the Nazis brought over were frauds who had passed themselves off as scientists, some of whom subsequently learned their fields while working for the U.S. military.

The U.S. occupiers of Germany after World War II declared that all military research in Germany was to cease, as part of the process of denazification. Yet that research went on and expanded in secret, under U.S. authority, both in Germany and in the United States, as part of a process that it’s possible to view as nazification. Not only scientists were hired. Former Nazi spies, most of them former S.S., were hired by the U.S. in post-war Germany to spy on — and torture — Soviets.

The U.S. military shifted in numerous ways when former Nazis were put into prominent positions. It was Nazi rocket scientists who proposed placing nuclear bombs on rockets and began developing the intercontinental ballistic missile. It was Nazi engineers, who had designed Hitler’s bunker beneath Berlin, who now designed underground fortresses for the U.S. government in the Catoctin and Blue Ridge Mountains. Known Nazi liars were employed by the U.S. military to draft classified intelligence briefs falsely hyping the Soviet menace. Nazi scientists developed U.S. chemical and biological weapons programs, bringing over their knowledge of tabun and sarin, not to mention thalidomide — and their eagerness for human experimentation, which the U.S. military and the newly created CIA readily engaged in on a major scale. Every bizarre and gruesome notion of how a person might be assassinated or an army immobilized was of interest to their research. New weapons were developed, including VX and Agent Orange. A new drive to visit and weaponize outer space was created, and former Nazis were put in charge of a new agency called NASA.

Permanent war thinking, limitless war thinking, and creative war thinking in which science and technology overshadowed death and suffering, all went mainstream. When a former Nazi spoke to a women’s luncheon at the Rochester Junior Chamber of Commerce in 1953, the event’s headline was “Buzz Bomb Mastermind to Address Jaycees Today.” That doesn’t sound terribly odd to us, but might have shocked anyone living in the United States anytime prior to World War II. Watch this Walt Disney television program featuring a former Nazi who worked slaves to death in a cave building rockets. Guess who it is.

https://www.youtube.com/watch?v=Zjs3nBfyIwM

Before long, President Dwight Eisenhower would be lamenting that “the total influence — economic, political, even spiritual — is felt in every city, every State house, every office of the Federal government.” Eisenhower was not referring to Nazism but to the power of the military-industrial complex. Yet, when asked whom he had in mind in remarking in the same speech that “public policy could itself become the captive of a scientific-technological elite,” Eisenhower named two scientists, one of them the former Nazi in the Disney video linked above.

The decision to inject 1,600 of Hitler’s scientific-technological elite into the U.S. military was driven by fears of the USSR, both reasonable and the result of fraudulent fear mongering. The decision evolved over time and was the product of many misguided minds. But the buck stopped with President Harry S Truman. Henry Wallace, Truman’s predecessor as vice president, who we like to imagine would have guided the world in a better direction than Truman did as president, actually pushed Truman to hire the Nazis as a jobs program. It would be good for American industry, said our progressive hero. Truman’s subordinates debated, but Truman decided. As bits of Operation Paperclip became known, the Federation of American Scientists, Albert Einstein, and others urged Truman to end it. Nuclear physicist Hans Bethe and his colleague Henri Sack asked Truman:

“Did the fact that the Germans might save the nation millions of dollars imply that permanent residence and citizenship could be bought? Could the United States count on [the German scientists] to work for peace when their indoctrinated hatred against the Russians might contribute to increase the divergence between the great powers? Had the war been fought to allow Nazi ideology to creep into our educational and scientific institutions by the back door? Do we want science at any price?”

In 1947 Operation Paperclip, still rather small, was in danger of being terminated. Instead, Truman transformed the U.S. military with the National Security Act, and created the best ally that Operation Paperclip could want: the CIA. Now the program took off, intentionally and willfully, with the full knowledge and understanding of the same U.S. President who had declared as a senator that if the Russians were winning the U.S. should help the Germans, and vice versa, to ensure that the most people possible died, the same president who viciously and pointlessly dropped two nuclear bombs on Japanese cities, the same president who brought us the war on Korea, the war without declaration, the secret wars, the permanent expanded empire of bases, the military secrecy in all matters, the imperial presidency, and the military-industrial complex. The U.S. Chemical Warfare Service took up the study of German chemical weapons at the end of the war as a means to continue in existence. George Merck both diagnosed biological weapons threats for the military and sold the military vaccines to handle them. War was business and business was going to be good for a long time to come.

But how big a change did the United States go through after World War II, and how much of it can be credited to Operation Paperclip? Isn’t a government that would give immunity to both Nazi and Japanese war criminals in order to learn their criminal ways already in a bad place? As one of the defendants argued at trial in Nuremberg, the U.S. had already engaged in its own experiments on humans using almost identical justifications to those offered by the Nazis. If that defendant had been aware, he could have pointed out that the U.S. was at that very moment engaged in such experiments in Guatemala. The Nazis had learned some of their eugenics and other nasty inclinations from Americans. Some of the Paperclip scientists had worked in the U.S. before the war, as many Americans had worked in Germany. These were not isolated worlds.

Looking beyond the secondary, scandalous, and sadistic crimes of war, what about the crime of war itself? We picture the United States as less guilty because it maneuvered the Japanese into the first attack, and because it did prosecute some of the war’s losers. But an impartial trial would have prosecuted Americans too. Bombs dropped on civilians killed and injured and destroyed more than any concentration camps — camps that in Germany had been modeled in part after U.S. camps for Native Americans. Is it possible that Nazi scientists blended into the U.S. military so well because an institution that had already done what it had done to the Philippines was not in all that much need of nazification?

Yet, somehow, we think of the firebombing of Japanese cities and the complete leveling of German cities as less offensive than the hiring of Nazi scientists. But what is it that offends us about Nazi scientists? I don’t think it should be that they engaged in mass-murder for the wrong side, an error balanced out in some minds by their later work for mass-murder by the right side. And I don’t think it should be entirely that they engaged in sick human experimentation and forced labor. I do think those actions should offend us. But so should the construction of rockets that take thousands of lives. And it should offend us whomever it’s done for.

It’s curious to imagine a civilized society somewhere on earth some years from now. Would an immigrant with a past in the U.S. military be able to find a job? Would a review be needed? Had they tortured prisoners? Had they drone-struck children? Had they leveled houses or shot up civilians in any number of countries? Had they used cluster bombs? Depleted uranium? White phosphorous? Had they ever worked in the U.S. prison system? Immigrant detention system? Death row? How thorough a review would be needed? Would there be some level of just-following-orders behavior that would be deemed acceptable? Would it matter, not just what the person had done, but how they thought about the world?

I’m not against giving anyone a second chance. But where is the history of Operation Paperclip on the U.S. landscape? Where are the historical markers and memorials? When we talk of tearing down monuments, it’s an act of historical education, not historical erasure, that we should be after.


Tomgram: Rebecca Gordon, Work in the Time of Covid-19


Here’s the strange thing. It never crossed my mind — how could it have? — but in work terms I’ve been testing out a Covid-19 world for the last decade and a half. I ran TomDispatch in those years, full time, from a small office in my own apartment in relative isolation. Yes, I could take the subway to see friends, swim at the Y, or visit my grandkids, but I was alone in that room and, however inessential my work and however unknown to myself, I was, it seems, preparing for a future pandemic. And yet, in some sense, little did all those years truly prepare me for the impact of this world in which the United States remains #1 (in coronavirus cases and in deaths: USA! USA!), nor was I emotionally prepared for the president I recently heard Noam Chomsky label the most dangerous man in history because he’s so intent on burning the human world to a crisp and so focused on himself that the deaths of others mean less than nothing to him.

And now, of course, I have it easy beyond compare. Imagine the lives of those who are today called “essential” workers but were once janitors, whose job is now to endlessly sanitize a world from hell and who, for little more than minimum wage, are catching Covid-19 and dying across the country thanks to people who won’t even wear face masks and to a president who has promoted the disease as if it were his political ally.

TomDispatch regular Rebecca Gordon, a teacher who has been working by Zoom in these last months, has had similar thoughts and today suggests how this upside-down, diseased planet of ours and the diseased political world that goes with it may change the very shape of work in our lives for years to come. Tom
Why Does Essential Work Pay So Little…
And Cost So Much?
By Rebecca Gordon

In two weeks, my partner and I were supposed to leave San Francisco for Reno, Nevada, where we’d be spending the next three months focused on the 2020 presidential election. As we did in 2018, we’d be working with UNITE-HERE, the hospitality industry union, only this time on the campaign to drive Donald Trump from office.

Now, however, we’re not so sure we ought to go. According to information prepared for the White House Coronavirus Task Force, Nevada is among the states in the “red zone” when it comes to both confirmed cases of, and positive tests for, Covid-19. I’m 68. My partner’s five years older, with a history of pneumonia. We’re both active and fit (when I’m not tripping over curbs), but our ages make us more likely, if we catch the coronavirus, to get seriously ill or even die. That gives a person pause.

Then there’s the fact that Joe Biden seems to have a double-digit lead over Trump nationally and at least an eight-point lead in Nevada, according to the latest polls. If things looked closer, I would cheerfully take some serious risks to dislodge that man in the White House. But does it make sense to do so if Biden is already likely to win there? Or, to put it in coronavirus-speak, would our work be essential to dumping Trump?

Essential Work?

This minor personal conundrum got me thinking about how the pandemic has exposed certain deep and unexamined assumptions about the nature and value of work in the United States.

In the ethics classes I teach undergraduates at a college here in San Francisco, we often talk about work. Ethics is, after all, about how we ought to live our lives — and work, paid or unpaid, constitutes a big part of most of those lives. Inevitably, the conversation comes around to compensation: How much do people deserve for different kinds of work? Students tend to measure fair compensation on two scales. How many years of training and/or dollars of tuition did a worker have to invest to become “qualified” for the job? And how important is that worker’s labor to the rest of society?

Even before the coronavirus hit, students would often settle on medical doctors as belonging at the top of either scale. Physicians’ work is the most important, they’d argue, because they keep us alive. “Hmm…” I’d say. “How many of you went to the doctor today?” Usually not a hand would be raised. “How many of you ate something today?” All hands would go up, as students looked around the room at one another. “Maybe,” I’d suggest, “a functioning society depends more on the farmworkers who plant and harvest food than on the doctors you normally might see for a checkup once a year. Not to mention the people who process and pack what we eat.”

I’d also point out that the workers who pick or process our food are not really unskilled. Their work, like a surgeon’s, depends on deft, quick hand movements, honed through years of practice.

Sometimes, in these discussions, I’d propose a different metric for compensation: maybe we should reserve the highest pay for people whose jobs are both essential and dangerous. Before the pandemic, that category would not have included many healthcare workers and certainly not most doctors. Even then, however, it would have encompassed farmworkers and people laboring in meat processing plants. As we’ve seen, in these months it is precisely such people — often immigrants, documented or otherwise — who have also borne some of the worst risks of virus exposure at work.

By the end of April, when it was already clear that meatpacking plants were major sites of Covid-19 infection, the president invoked the Defense Production Act to keep them open anyway. This not only meant that workers afraid to enter them could not file for unemployment payments, but that even if the owners of such dangerous workplaces wanted to shut them down, they were forbidden to do so. By mid-June, more than 24,000 meatpackers had tested positive for the virus. And just how much do these essential and deeply endangered workers earn? According to the U.S. Bureau of Labor Statistics, about $28,450 a year — better than minimum wage, that is, but hardly living high on the hog (even when that’s what they’re handling).

You might think that farmworkers would be more protected from the virus than meatpackers, perhaps because they work outdoors. But as the New York Times has reported: “Fruit and vegetable pickers toil close to each other in fields, ride buses shoulder-to-shoulder, and sleep in cramped apartments or trailers with other laborers or several generations of their families.”

Not surprisingly, then, the coronavirus has, as the Times report puts it, “ravaged” migrant farm worker communities in Florida and is starting to do the same across the country all the way to eastern Oregon. Those workers, who risk their lives through exposure not only to a pandemic but to more ordinary dangers like herbicides and pesticides so we can eat, make even less than meatpackers: on average, under $26,000 a year.

When the president uses the Defense Production Act to ensure that food workers remain in their jobs, it reveals just how important their labor truly is to the rest of us. Similarly, as shutdown orders have kept home those who can afford to stay in, or who have no choice because they no longer have jobs to go to, the pandemic has revealed the crucial nature of the labor of a large group of workers already at home (or in other people’s homes or eldercare facilities): those who care for children and those who look after older people and people with disabilities who need the assistance of health aides.

This work, historically done by women, has generally been unpaid when the worker is a family member and poorly paid when done by a professional. Childcare workers, for example, earn less than $24,000 a year on average; home healthcare aides, just over that amount.

Women’s Work

Speaking of women’s work, I suspect that the coronavirus and the attendant economic crisis are likely to affect women’s lives in ways that will last at least a generation, if not beyond.

Middle-class feminists of the 1970s came of age in a United States where it was expected that they would marry and spend their days caring for a house, a husband, and their children. Men were the makers. Women were the “homemakers.” Their work was considered — even by Marxist economists — “non-productive,” because it did not seem to contribute to the real economy, the place where myriad widgets are produced, transported, and sold. It was seldom recognized how essential this unpaid labor in the realm of social reproduction was to a functioning economy. Without it, paid workers would not have been fed, cared for, and emotionally repaired so that they could return to another day of widget-making. Future workers would not be socialized for a life of production or reproduction, as their gender dictated.

Today, with so many women in the paid workforce, much of this work of social reproduction has been outsourced by those who can afford it to nannies, day-care workers, healthcare aides, house cleaners, or the workers who measure and pack the ingredients for meal kits to be prepared by other working women when they get home.

We didn’t know it at the time, but the post-World War II period, when boomers like me grew up, was unique in U.S. history. For a brief quarter-century, even working-class families could aspire to an arrangement in which men went to work and women kept house. A combination of strong unions, a post-war economic boom, and a so-called breadwinner minimum wage kept salaries high enough to support families with only one adult in the paid labor force. Returning soldiers went to college and bought houses through the 1944 Servicemen’s Readjustment Act, also known as the G.I. Bill. New Deal programs like Social Security and unemployment insurance helped pad out home economies.

By the mid-1970s, however, this golden age for men, if not women, was fading. (Of course, for many African Americans and other marginalized groups, it had always only been an age of fool’s gold.) Real wages stagnated and began their long, steady decline. Today’s federal minimum wage, at $7.25 per hour, has remained unchanged since 2009 (something that can hardly be said about the wealth of the 1%). Far from supporting a family of four, in most parts of the country, it won’t even keep a single person afloat.

Elected president in 1980, Ronald Reagan announced in his first inaugural address, “Government is not the solution to our problem, government is the problem.” He then set about dismantling President Lyndon Johnson’s War on Poverty programs, attacking the unions that had been the underpinning for white working-class prosperity, and generally starving the beast of government. We’re still living with the legacies of that credo in, for example, the housing crisis he first touched off by deregulating savings and loan institutions and disempowering the Department of Housing and Urban Development.

It’s no accident that, just as real wages were falling, presidential administrations of both parties began touting the virtues of paid work for women — at least if those women had children and no husband. Aid to Families with Dependent Children (“welfare”) was another New Deal program, originally designed to provide cash assistance to widowed women raising kids on their own at a time when little paid employment was available to white women.

In the 1960s, groups like the National Welfare Rights Organization began advocating that similar benefits be extended to Black women raising children. (As a welfare rights advocate once asked me, “Why is it fine for a woman to look to a man to help her children, but not to The Man?”) Not surprisingly, it wasn’t until Black and Latina women began receiving the same entitlements as their white sisters that welfare became a “problem” in need of “reform.”

By the mid-1990s, the fact that some Black women were receiving money from the government while not doing paid labor for an employer had been successfully reframed as a national crisis. Under Democratic President Bill Clinton, Congress passed the Personal Responsibility and Work Opportunity Reconciliation Act of 1996, a bill that was then called “welfare reform.” After that, if women wanted help from The Man, they had to work for it — not by taking care of their own children, but by taking care of their children and holding down minimum-wage jobs.

Are the Kids All Right?

It’s more than a little ironic, then, that the granddaughters of feminists who argued that women should have a choice about whether or not to pursue a career came to confront an economy in which women, at least ones not from wealthy families, had little choice about working for pay.

The pandemic may change that, however — and not in a good way. One of the unfulfilled demands of liberal 1970s feminism was universal free childcare. An impossible dream, right? How could any country afford such a thing?

Wait a minute, though. What about Sweden? They have universal free childcare. That’s why a Swedish friend of mine, a human rights lawyer, and her American husband, who had a rare tenure-track university job in San Francisco, chose to take their two children back to Sweden. Raising children is so much easier there. In the early days of second-wave feminism, some big employers even built daycare centers for their employees with children. Those days, sadly, are long gone.

Now, in the Covid-19 moment, employers are beginning to recognize the non-pandemic benefits of having employees work at home. (Why not make workers provide their own office furniture? It’s a lot easier to justify if they’re working at home. And why pay rent on all that real estate when so many fewer people are in the office?) While companies will profit from reduced infrastructure costs and in some cases possibly even reduced pay for employees who relocate to cheaper areas, workers with children are going to face a dilemma. With no childcare available in the foreseeable future and school re-openings dicey propositions (no matter what the president threatens), someone is going to have to watch the kids. Someone — probably in the case of heterosexual couples, the person who is already earning less — is going to be under pressure to reduce or give up paid labor to do the age-old unpaid (but essential) work of raising the next generation. I wonder who that someone is going to be and, without those paychecks, I also wonder how much families are going to suffer economically in increasingly tough times.

Grateful to Have a Job?

Recently, in yet another Zoom meeting, a fellow university instructor (who’d just been interrupted to help a child find a crucial toy) was discussing the administration’s efforts to squeeze concessions out of faculty and staff. I was startled to hear her add, “Of course, I’m grateful they gave me the job.” This got me thinking about jobs and gratitude — and which direction thankfulness ought to flow. It seems to me that the pandemic and the epidemic of unemployment following in its wake have reinforced a common but false belief shared by many workers: the idea that we should be grateful to our employers for giving us jobs.

We’re so often told that corporations and the great men behind them are Job Creators. From the fountain of their beneficence flows the dignity of work and all the benefits a job confers. Indeed, as this fairy tale goes, businesses don’t primarily produce widgets or apps or even returns for shareholders. Their real product is jobs. Like many of capitalism’s lies, the idea that workers should thank their employers reverses the real story: without workers, there would be no apps, no widgets, no shareholder returns. It’s our effort, our skill, our diligence that gives work its dignity. It may be an old saying, but no less true for that: labor creates all wealth. Wealth does not create anything — neither widgets, nor jobs.

I’m grateful to the universe that I have work that allows me to talk with young people about their deepest values at a moment in their lives when they’re just figuring out what they value, but I am not grateful to my university employer for my underpaid, undervalued job. The gratitude should run in the other direction. Without faculty, staff, and students there would be no university. It’s our labor that creates wealth, in this case a (minor) wealth of knowledge.

As of July 16th, in the midst of the Covid-19 crisis, 32 million Americans are receiving some kind of unemployment benefit. That number doesn’t even reflect the people involuntarily working reduced hours, or those who haven’t…