Demonstrators participating in the Poor People's March at Lafayette Park and on Connecticut Avenue, Washington, D.C.
Toward the end of last summer, Fox News aired a nine-minute segment featuring a 29-year-old surfer in La Jolla, California, named Jason Greenslate. Not only did Greenslate possess certain qualities that seemed designed to enrage the cable channel’s over-65 demographic—long hair, mirrored sunglasses, a languid grin—but this particular surfer dude also happened to receive $200 a month in food assistance, some of which he spent on sushi and fresh lobster. Footage showed Greenslate driving a black Cadillac truck, jamming with his skate-punk band and generally enjoying himself. Close-ups of the offending shellfish were dutifully provided. (“It’s free food,” Greenslate said obligingly. “It’s awesome.”) Meanwhile, an on-screen graphic reminded viewers of The Great Food Stamp Binge.
Greenslate quickly became a talking point for Republican lawmakers, who cited his gourmet diet and cheerful indifference to a steady paycheck as evidence of a social welfare state run amok. In September, a month after the Fox News segment aired, House Republicans voted to slash food assistance by $40 billion. A GOP memo even mentioned “young surfers who aren’t working but cash their food stamps in for lobster”—as if one surfer in La Jolla had somehow self-replicated. Never mind that Greenslate was hardly representative of anyone other than himself: nine out of ten food stamp recipients live in a household with a child, a senior citizen or someone with a disability. The caricature he provided—vulgar, cocky, altogether annoying—was too perfect, and too useful. (Greenslate later told reporters that he had agreed to Fox News’s request in hopes of getting publicity for his band.) And so Lobster Boy joined the Welfare Queen in the ranks of America’s undeserving poor.
Being poor in the United States has rarely been taken to mean anything so simple as having too little money. Americans have long distinguished between those who deserve public or private charity and those who don’t. In the latest edition of The Undeserving Poor, first published in 1989, the historian Michael B. Katz writes about “the enduring attempt to classify poor people by merit.” This impulse is driven partly by policy calculations: given that resources are finite, how can the people who most need help get it? But the distinctions are often laced with moralism, too: Who are the real victims—the worthy ones? Who are the moochers trying to game the system? The deserving poor have typically included widows and children, along with “a few others whose lack of responsibility for their condition cannot be denied.” Katz says the working poor of today have also elicited some sympathy and support, though if our current political impasse is any indication—twenty-five Republican-controlled states have rejected Medicaid expansion, effectively shutting out half of all low-wage earners in the country from any kind of insurance coverage—that sympathy and support seem to come from just one side of the aisle.
There was a time when being poor didn’t carry the same stigma that it does now. Before the abundance of the twentieth century, poverty was ubiquitous as well as inevitable. American poor laws in the nineteenth century made the poor a community responsibility, with the result that local authorities (in what seems like a grim prelude to the pre-Obamacare insurance rolls) would dispatch their elderly or infirm to another town in an attempt to avoid paying for their care. Still, poverty wasn’t considered a deviant condition. “Resources were finite; life was harsh,” Katz writes. “Most people, as the bible predicted, would be born, live, and die in poverty.”
The Industrial Revolution changed that. Although opportunities and decent pay have never been as plentiful as laissez-faire boosters pretend (not to mention the hardship and displacement caused by industrialization itself), economic growth brought with it an extraordinary increase in the American standard of living. “With scarcity off the table,” Katz writes, “individual failings marked persons as all the more undeserving in a world of possibility where poverty no longer was inescapable.” According to the new dispensation, people were poor because of defects in character, not circumstance.
There have been eras that took exception to this attitude—during the Great Depression, when unemployment hovered around 25 percent, faith that anyone who truly wanted a job could get one was punctured by the miserable reality—but Katz shows how resilient the belief is that people are poor due to sheer laziness or incompetence. In a recent essay for the online magazine Berfrois, Katz explains that one of the reasons he decided to update the book was “the stubborn persistence of poverty as a blight on American life.” Even today, Republican attacks on food assistance have seemed more responsive to cultural assumptions than to economic facts; pols and pundits point to the swelling number of food stamp recipients as smoking-gun evidence of fraud and abuse, without acknowledging the possibility that more people need help because the economy is sputtering on fumes.
* * *
According to the official Census numbers, 46.5 million Americans are poor. That amounts to 15 percent of the population. Compared with the rest of the Organization for Economic Cooperation and Development countries, our poverty rate puts us on par with Poland, second only to Mexico. The extent of need among children, however, is unparalleled in the developed world: one out of five American children lives in poverty.
The statistics are distressing enough, but they’re also the result of a political calculation as much as a mathematical one. The Census equates poverty with a threshold, an absolute dollar amount pegged to inflation; the way Katz tells the story, it is as if the government stumbled upon a convenient measure of poverty and never looked back. In 1963, an economist at the Social Security Administration named Mollie Orshansky was tasked with researching the effects of poverty on children; there was no official poverty measure at the time, so Orshansky estimated a level of need based on the observation that poor people spent about a third of their income on food. The Office of Economic Opportunity, which was the lead agency for the War on Poverty, then decreed that the poverty threshold would be set at three times the Department of Agriculture’s low-cost food budget—which was 25 percent lower than Orshansky’s low-cost plan. Orshansky herself seemed surprised that policy was determined by a tool she had originally devised for her own research; the low-cost food budget, she said, was “a crude criterion of income inadequacy.”
Katz argues that the official statistics therefore underestimate the problem, and that the disparity between official and actual poverty has grown. Attempts to measure American poverty in relative terms—as a fraction of median income—have provoked plenty of resistance, even though the cost of food has decreased in the last fifty years, to the point where today it consumes between one-sixth and one-tenth of a household’s budget. A relative measure would bump the poverty rate up to 18 percent, which Katz, citing the Luxembourg Income Study, believes is more accurate, if more shameful: “Among the countries in the study, only in Mexico, India, and Guatemala did more people live on incomes this low.”
This bid to shake us out of complacency, to show that poverty is an even bigger problem for Americans than typically imagined, is well-meaning and sincere. Poverty measurements determine who qualifies for government assistance, and setting the poverty level low—above which a household might be considered struggling but not officially poor—is one way to prevent people from getting help. But sincerity and good intentions get us only so far, especially among those who evince not so much complacency as active antipathy toward the poor. Why would someone who isn’t inclined to worry about a poverty rate of 15 percent suddenly become alarmed to hear that it’s really 18 percent? Conservatives have shown themselves less interested in calibrating social welfare rolls than in discrediting the very idea of welfare in the first place. As Katz’s book makes clear, even on those occasions when conservatives and liberals agree on the numbers, their entirely different conclusions suggest radically different ways of looking at the same world. The dire numbers are often used by conservatives, who are eager to prove that the only accomplishment of anti-poverty programs has been to make poverty worse.
* * *
Gathering data on poor people was conceived as a way to help them. Orshansky devised her measure of poverty in an attempt to study and understand it; the measure became encoded in the official statistics because Lyndon Johnson had announced his “unconditional war on poverty” in 1964, and like any war effort, this one required a precise definition of the enemy.
Johnson deployed the military metaphor as a rhetorical strategy, one designed to rouse passions. “I wanted to rally the nation,” he later explained, “to sound a call to arms which would stir people in the government, in private industry, and on the campuses to lend their talent to a massive effort to eliminate this evil.” But the war itself, along with the rest of Johnson’s Great Society programs, would be conducted according to rational principles, and economics—a discipline that was becoming ever more mathematical—offered an attractive approach. “Economists met government’s need for systematic data, predictive models, and program evaluation,” Katz writes. Fixating on the intimate, devastating effects of poverty wouldn’t change anything; helping people on a massive scale required a degree of numeric abstraction. “For the purposes of government policy,” Katz observes, “poverty is not deprivation; it is a bureaucratic category.”
The poverty rate in 1963 was estimated to include between 20 and 25 percent of the American population. (A Senate study a few years before had calculated that 32 million Americans were poor.) By 1965, when anti-poverty measures were in full swing, the percentage had dropped below 15 percent, dipping to 11.1 percent in 1973—which remains the lowest point since.
One side sees these numbers as proof of the Great Society’s success, the other side as proof of failure. “By placing government policy on a scientific basis,” Katz writes, “poverty researchers hoped to transcend politics and ideology,” but their own research would eventually be used against them. The Undeserving Poor recounts not only the history of what happened, but also the many ways what happened has been interpreted and distorted.
A number of right-wing pundits and intellectuals have argued that economic growth, rather than government programs, accounted for the improving fortunes of the poor in the 1960s. The early attacks on the welfare state in the 1970s tried to “redefine poverty out of existence,” as Katz puts it. Poverty was no longer a major problem, according to this argument, because the poor had been overcounted; the numbers didn’t properly factor the in-kind benefits they received. When this fanciful theory foundered on actual facts—the problems of homelessness and hunger were too big to disappear up a Brooks Brothers sleeve—conservatives turned to a cultural approach. They argued that the welfare system was worse than useless: programs to help the poor actually harmed them.
George Gilder’s Wealth and Poverty was published in 1981 and became a Book of the Month Club pick, thereby ensuring a wide audience for his mix of techno-utopianism and primitive nostalgia. He had much to say about man’s “role as provider, the definitive male activity from the primal days of the hunt.” Man is “cuckolded by the compassionate state”; the government usurps his age-old role, which is why “welfare now erodes work and family and thus keeps poor people poor.” When women are less dependent on men, men no longer benefit from women’s civilizing powers, and all hell breaks loose: “Because female sexuality, as it evolved over the millennia, is psychologically rooted in the bearing and nurturing of children, women have long horizons within their very bodies, glimpses of eternity within their wombs.”
Underneath his ersatz sociology and wooly sex talk was a sustained assault on the welfare state, what Katz describes as “the intellectual ammunition” needed for cutting spending on the poor as well as taxes on the rich. “Intellectual,” however, doesn’t quite describe Gilder’s approach, nor its appeal. Scholars like Charles Murray might have given the conservative argument the sheerest patina of social-scientific respectability, but the ultimate power of the conservative attack derived from how directly it tapped into upper-middle-class fears and self-interest. (Katz picks apart Murray’s analysis to show how he got his facts backward. In his 1984 book Losing Ground, Murray wanted to prove that increased welfare benefits caused a rise in out-of-wedlock births among black mothers after 1972, whereas welfare benefits actually fell—sharply—at the same time.) Poor people, Murray argued, lived according to different values and “were engaged in self-destructive personal behavior that would keep them at the bottom of society”; spending money on them only produced “incentives to fail,” which perpetuated their depravity.
* * *
Conservatives like Gilder and Murray had revived an attempt to cast poverty in terms of culture instead of scarcity. There is no small irony in this, considering that the “culture of poverty” was introduced in the late 1950s as an attempt to sympathize with, rather than stigmatize, the poor. An anthropologist named Oscar Lewis wanted to show how the poorest Mexicans and Puerto Ricans developed traits that were “both an adaptation and a reaction…to their marginal position in a class-stratified, highly individuated, capitalistic society.” “Resignation” and “a lack of impulse control,” Lewis wrote, represented efforts “to cope with feelings of hopelessness and despair.”
This approach, as potentially perilous as it was, gained further traction on the left when Michael Harrington published The Other America in 1962. Harrington described poverty as “a culture, an institution, a way of life.” Like Lewis, he argued that fatalism and pleasure-seeking for the poor were “a piece of realism, not of vice,” and rather than become ensnared in “dry, graceless, technical matters,” he wanted to give his readers a sense of poverty as it was lived. He was writing about “the new poor” in an affluent society, those who were left behind by the “political and social gains” since the Depression, or else displaced by technological change—those for whom “progress is misery.” The new poor were an invisible minority, “the first poor not to be seen.” Harrington made ample and potent use of irony and paradox—“leisure is a burden to the aged”; “poverty is expensive to maintain”—as a way to rattle popular assumptions.
The Other America also contains some hoary stereotypes: Harrington warns that more women in the workforce would inevitably lead to “the impoverishment of home life, of children who receive less care, love, and supervision”; he cites Norman Mailer, of all people, in a wildly speculative passage about “the Negro psychology.” But there is something deeply human and humane about this book. We learn that the postwar shift toward big agriculture had left some small farm owners not only poor but truly hungry, to the point where “56 percent of low-income farm families were deficient in one or more basic nutrients in the diet.” Harrington describes life on the Bowery, where “men sell their blood in order to get enough money to drink—and then turn up in the hospital and need blood transfusions.” In a 13,000-word essay for The New Yorker on several books about poverty, the critic Dwight Macdonald singled out The Other America as “most important.” Reading it was a revelation, as unsettling as it was necessary. “Those who run things,” he wrote, “have been as unaware of the continued existence of mass poverty as this reviewer was until he read Mr. Harrington’s book.”
The Other America sold 70,000 copies in its first year, and more than a million copies in paperback thereafter. In an introduction to a later edition, the critic Irving Howe recalled being surprised by its success: “I remember thinking that Mike’s book, fine as it was, would probably be numbered among those ‘worthy’ publications that sell four or five thousand copies and then fade away.” John F. Kennedy was persuaded by Harrington’s book—or Macdonald’s review, depending on historical accounts—to put a comprehensive poverty program on the national agenda, and a war on poverty was born.
How did one little book—fewer than 180 pages—accomplish so much? Harrington attributed it to timing. The Other America was published during a period of relative affluence and declining unemployment, and the civil rights movement was creating a surge in social consciousness. “Had The Other America been published five years earlier or one year later,” Harrington later wrote, “it would not have had the impact it had.” Harrington was also an exceptional writer—lucid and straightforward, yet full of ardent empathy for the poor—but he was right to credit the temper of the time. As it happened, The Other America was published the same year as Macdonald’s Against the American Grain, in which he ridiculed a postwar generation of newly affluent, optimistic, middlebrow strivers. They didn’t just want to read; they wanted to be edified, and they vested critics like Macdonald with a kind of centralized cultural authority that not a single critic can claim today. Can you imagine the Obama administration making policy decisions based on a book review?
* * *
The influence of The Other America back in the 1960s is all the more remarkable when you consider what’s happened since. The afterwords that Harrington provided for subsequent editions are fascinating but disheartening documents; he catalogs the many ways that the progress of the 1960s was overtaken by unfortunate events (the Vietnam War, the oil crisis, stagflation), not to mention flagging political will and a hard shift to the right in the 1980s. The original text, plaintive as it was, carries none of that desperation, especially given that the situation for the poor initially improved after The Other America was first published. But while the poor may not be as invisible as they were when Harrington was writing, they no longer occupy a central space in the national imagination or the national agenda. This is why a Democrat could pledge to “end welfare as we know it”: the reforms Bill Clinton enacted during his presidency fed into the idea that poverty was mainly a problem of incentives, that the solution lay in making welfare harder to get, which would push people into the workforce.
The Personal Responsibility and Work Opportunity Reconciliation Act of 1996 exacerbated the distinctions between the deserving and the undeserving poor—which was part of the point. Some optimists believed that cutting benefits would actually inspire more generosity by increasing the number of working—that is to say, deserving—poor. As The New York Times reported at the end of the Clinton years, “The restrictions would help create a political climate more favorable to the needy. Once taxpayers started viewing the poor as workers, not welfare cheats, a more generous era would ensue. Harmful stereotypes would fade. New benefits would flow.” Needless to say, this kind of naïve hopefulness can only be read with a bitter irony now. It glossed over the prospects for poor people who, for whatever reason, couldn’t find or hold down a job. It ignored the fact that moving people off the welfare rolls was a simpler proposition during a time when jobs, especially low-wage jobs, were more readily available. And, finally, it failed to reckon with the possibility that the attempt to debunk a glib stereotype of the poor as merely unmotivated and lazy—a stereotype inflected by racial prejudice—might deepen it, rendering the poor even more vulnerable.
Just two months ago, House majority leader Eric Cantor went so far as to cast the recent House food stamp bill as the next step in Clinton’s welfare reform. “This legislation restores the intent of the bipartisan welfare reforms adopted in 1996 to the Supplemental Nutrition Assistance Program,” he declared. “It also refocuses the program on those who need it most.” Cantor’s rhetoric in 2013 is almost identical to Clinton’s in 1996, but it conceals substantially different motives. The similarity also shows that motives, in the end, count for little when it comes to political effectiveness. Food stamp recipients aren’t a powerful lobby; the fewer resources they share with the rest of the voting population, the easier it is to cut even more from the little they have left. The shrewder politicians have taken the extra step of trying to neutralize the possibility of empathy. Why feel anything but contempt for the poor if you can dismiss them as a bunch of surfer slackers and welfare queens?
Our current problems—grinding unemployment, persistent wage stagnation, increasing inequality—might be expected to alleviate some of the stigma. After all, when more people feel economically vulnerable, shouldn’t they be primed to understand the economic distress of others? The economist Benjamin Friedman suggests otherwise. In The Moral Consequences of Economic Growth, he points out that widespread economic insecurity has historically triggered the opposite effect. People tend to be more generous when they have more to give. “Attitudes among average citizens, now forced to question the security of their own economic position and made even more anxious for their children’s, became less generous and less tolerant.” The Great Society coincided with the Great Progression, or the growing middle class. “When incomes were rising for most Americans,” Friedman writes, “ways of making the society more inclusive had enjoyed broad appeal.” Even welfare reform during the Clinton “boom” years is evidence of this, as the income of the average American had stopped rising in real terms by 1996. More income inequality also means that the wealthy few who see government only in terms of a paved road and a tax cut will feel like they have nearly nothing at stake in everything else that government is supposed to do. “The very wealthy have little need for state-provided education or health care,” the economist Angus Deaton writes in his new book, The Great Escape. “They have even less reason to support health insurance for everyone, or to worry about the low quality of public schools that plagues much of the country.”
What Deaton describes sounds suspiciously like the economic arrangement in a developing country, in which a sliver of the population hunkers down in a gated community or jets off to other climes, their fellow citizens be damned. Eric Cantor has made it known that he and his Tea Party colleagues are “unapologetic believers in the concept of American exceptionalism”—which makes me wonder what their endgame is when it comes to social welfare. So many books about and campaigns against poverty end on a pleading note, an appeal to sympathy and conscience, but perhaps Cantor and the other “unapologetic believers” need to be reminded of something more fundamental to them, something in line with their stated beliefs and self-interest: the more stratified and unequal this country becomes, the more it starts to look like everywhere else.
“Remember the old days?” candidate Trump asked the crowd at a July 2016 rally. “A deserter, what happened?” Trump pantomimed aiming a rifle and pulled the invisible trigger, answering his own rhetorical question with a deadpan “Bang.” He was riffing on the case of Sergeant Bowe Bergdahl, who walked away from his battalion in 2009 and was captured by the Afghan Taliban. As CNN reported, six U.S. soldiers were killed during the months-long manhunt for Bergdahl. The facts of his captivity came under some dispute, as did his release as part of an unprecedented prisoner swap. The issue was a resonant drumbeat on Trump’s campaign stops, pounded out mercilessly as an example of bad deal-making. Hours after Trump’s inauguration, Bergdahl’s defense team filed a motion arguing that the commander-in-chief’s fiery rhetoric coupled with his new role created “unlawful command influence” that precluded any possibility that Bergdahl would get a fair trial. That motion was denied, and Bergdahl pleaded guilty in court today.
Over the years, the media has also made much of the dramatic old days when desertion was deadly. We revisit and reenact the events on January 31, 1945, in Sainte-Marie-aux-Mines, France, where Private Eddie Slovik was executed by a firing squad formed from a dozen of his fellow soldiers, under orders issued by Eisenhower. In Kurt Vonnegut’s Slaughterhouse-Five, or The Children’s Crusade: A Duty-Dance with Death, Billy Pilgrim finds William Bradford Huie’s The Execution of Private Slovik under a waiting-room seat cushion. From it, Vonnegut’s protagonist reads an excerpt of an actual appellate report written by a judge advocate general: “If the death penalty is ever to be imposed for desertion, it should be imposed in this case, not as a punitive measure nor as retribution, but to maintain that discipline upon which alone an army can succeed against the enemy.” Pilgrim reacts with glib fatalism—“So it goes.” In the made-for-TV version of The Execution of Private Slovik, Slovik is played by Martin Sheen who, five years later, would play Captain Benjamin L. Willard in Francis Ford Coppola’s Apocalypse Now (1979), a character tasked with the extrajudicial assassination of another deserter, Colonel Kurtz.
Bowe Bergdahl is one part Billy Pilgrim, one part Eddie Slovik—a symbol of our weaker selves and a stand-in for those of us who’ve lost heart. But unlike Pilgrim and Slovik, Bergdahl won’t get gunned down. He doesn’t face the death penalty, despite the severity of the Uniform Code of Military Justice, which states that “Any person found guilty of desertion or attempt to desert shall be punished, if the offense is committed in time of war, by death or such other punishment as a court-martial may direct.” Bergdahl’s lesser desertion charge carries a maximum sentence of five years. It’s his second charge under Article 99, “Misbehavior Before the Enemy,” an obscure carryover from the older Articles of War folded into the Uniform Code of Military Justice in 1951, that has Bergdahl facing the possibility of life in prison.
Given the unprecedented American incarceration rates, it may come as a surprise that Bergdahl is not presently in confinement. He’s flying a D4D—desk, 4 drawers—assigned to clerical duties at Fort Sam Houston, Texas. He’s even allowed off post in supervised company. Indeed, the extremes of the Bergdahl case have overshadowed this little-known revelation: the U.S. military is more merciful in its treatment of deserters than it lets on. The discreet Pentagon policy on desertion might even be renamed “So It Goes.” In his executioner’s rant, Trump hit upon something resembling truth: “Twenty years ago it was bang but slowly,” he said. “Ten years ago it was long prison. Today, they’re probably talking about nothing.” By and large, the U.S. Armed Forces don’t hunt down deserters. They don’t have the manpower or the resources. These days, some 95 percent of deserters aren’t prosecuted. Deserters receive an administrative discharge but only after they return, either voluntarily or by arrest, to military control. Each branch of the military differs, but the Army, the largest American force, displays little interest in bringing most deserters to court-martial; much cheaper and easier to let them walk. Discharge—in most cases, not dishonorable—is the most efficient way for the Army to rid itself of soldiers who are, statistically, the least enthusiastic and the most troubled enlistees, financially and emotionally. You desert, and the Army doesn’t have to provide your benefits. It doesn’t have to pay disability, and a high percentage of deserters have psychiatric problems. You’re cut off, and the Army got your service at cost. If you’re not a PFC Bergdahl deserting from a combat zone—and you’re among the estimated 50,000-plus absent soldiers presently at large—you’re essentially granted an amnesty.
This information isn’t readily available. The powers-that-be seem to believe that if every soldier knew the Pentagon’s unofficial policy on desertion, more soldiers would skedaddle. Indeed, Eddie Slovik’s legend is a powerful cautionary tale, but its moral is deceptive: during World War II, of the approximately 50,000 U.S. soldiers who deserted, 20,000 were tried and sentenced. Forty-nine received death sentences. All but one of those sentences were commuted. In fact, the only U.S. soldier actually executed for desertion since the Civil War was the infamous Private Eddie Slovik.
There is general agreement in the military that AWOL and desertion rates are an indicator of the stress on military personnel. After a decade-and-a-half in what feels like a forever war—if this were ancient Greece, Troy would be seven years sacked and Odysseus halfway home by now—most of us are spent, soldiers and civilians alike.
Comprehensive contemporary desertion rates are impossible to come by without classified security clearance, and the public numbers are woefully underreported. Mother Jones put the count for the first decade of the War on Terror, from 2001 to 2011, at 29,000 Army desertions. Nearly fifteen years ago, the U.S. Army Research Institute concluded its last unclassified comprehensive study on the issue, called “What We Know About AWOL and Desertion: A Review of the Professional Literature for Policy Makers and Commanders.” In 2012, a lesser AWOL and desertion assessment was included as part of “Army 2020: Generating Health and Discipline in the Force Ahead of the Strategic Reset.” It lays out a policy under consideration that would allow the Army to separate deserters in absentia, without returning them to military control. This would set apart soldiers “absent for more than two years and who are not facing additional charges or who are not considered high risk,” from more threatening absentees, like those “wanted for crimes including homicide, armed robbery, assault, sexual assault, illegal drug use or possess a top secret security clearance.” This first group of soldiers would “receive a characterization of service of Other Than Honorable,” and “discharging these soldiers in absentia would save Army time and resources.”
No doubt our preternaturally uninformed president would be shocked by this leniency, but he should have more sympathy for the volunteers who change their minds and want to walk. These American minds—not exactly starving hysterical naked, nor seeing Mohammedan angels—aren’t yet fully formed when they’re first made up. The young women and men, the vast majority of fellow Americans choosing to fight our wars for us, have a right to know what happens to them if they experience changes of heart or mind, and want to abandon their posts. Anything less is coercion.
If you’re one of those at-large soldiers, here’s what happens when you go AWOL. Your commanding officer notifies your next of kin. An inquiry is undertaken to ascertain your location and your possible reasons for disappearing. Relevant personnel, like the provost marshal, are notified. Necessary reports are filed. After thirty days, you’re classified as “Dropped from Rolls.” More forms are compiled in a deserter packet forwarded to the Army Deserter Information Point at Fort Knox, Kentucky. This is the normal progression from absence without leave to desertion, but there are any number of impediments. Of the 18,010 soldiers who deserted from FY2006–11, “only 13,443 were reported to law enforcement.” Army commanding officers, for a variety of reasons, failed to report deserters in one-quarter of cases.
So what normally happens if a deserter surrenders? You’re taken to a Processing Confinement Facility. There, you go through the paperwork drill, which takes a couple of days. In most cases, you’re discharged with an “other-than-honorable” designation on your separation papers. You’re even free to seek an upgrade to a “general discharge under honorable conditions.” Either way, most deserters will be given a Chapter 10 discharge. This discharge will not ruin your life: the federal government will still employ you; you get no GI Bill, but you’re still fully eligible for all other federal financial aid. Once you have your discharge papers, you lose TRICARE, but if you get your “other-than-honorable” upgraded, you get benefits back. With the 2016 Fairness for Veterans Act, signed into law by President Obama, these upgrades are now supposedly more likely than ever. But, given the flightiness of the current commander-in-chief, this could change at the drop of a tweet.
Some seventy-two years removed from the last American execution for desertion, the firing squad is no longer an effective deterrent. It’s become an act, a tough-guy pantomime for hacks and hams.
The hollow threat of a death sentence should no longer stand as an official punishment for desertion during wartime. If, under Trump, we are facing a further extension of conventional ground forces in Afghanistan after a full generation of American war-waging and deficit increases, the Trump Administration must find other, less expensive ways to boost troop morale. It’s time we strike the death penalty from Article 85 of the Uniform Code of Military Justice. The battlefield offers death threat enough.
If our civilian and military leaders can’t count on friendly fear to maintain the commitment of our fighting force, maybe future administrations will be less inclined to indefinitely commit our troops to questionable wars. Or so it goes.