Subscribe: History
http://reason.com/topics/topic/165.xml
Language: English
Tags:
america, american, civil religion, civil, history, insurance, life insurance, life, nixon, nuclear power, nuclear, people, power, war


Preview: History

History



All Reason.com articles with the "History" tag.



Published: Sat, 22 Jul 2017 00:00:00 -0400

Last Build Date: Sat, 22 Jul 2017 03:59:55 -0400

 



Impeach Eisenhower!

Fri, 07 Jul 2017 12:29:00 -0400

Impeachment talk has been in the air this week, with rallies in dozens of cities calling for Donald Trump to be ousted from office. Impeachment talk has been in the air for nearly a quarter-century now—you have to go back to George H.W. Bush for a president who didn't inspire a big chunk of the opposition to talk about kicking him out of the White House, and even then there was a small chunk of the opposition who wanted to kick him out of the White House. There always is.

In that spirit, here's the anarcho-pacifist Beat writer Lawrence Ferlinghetti reading his 1958 poem "Tentative Description of a Dinner to Promote the Impeachment of President Eisenhower" (with bonus video footage assembled ably by an anonymous YouTuber):

src="https://www.youtube.com/embed/e2Ldh1A5p-w" allowfullscreen="allowfullscreen" width="560" height="315" frameborder="0">

If you'd rather read to yourself than be read to, you can see the text of the poem here.

Ferlinghetti's four pages of antiwar verse did not inspire a mass movement to remove Eisenhower from office (nor was that the point), but it did help inspire a young broadcaster named Lorenzo Milam to try to start a pacifist radio station in Washington, D.C. I tell that story in chapter three of my book Rebels on the Air: An Alternative History of Radio in America; the short version is that it was 1958, the Cold War was in full swing, and the FCC wasn't about to license a dissident radio outlet in the nation's capital. After two years Milam gave up, applied for a license in Seattle instead (on the theory that maybe the authorities wouldn't care about an outlet located far away from the nation's capital), eventually got the go-ahead, and founded KRAB-FM, which in turn inspired a wave of listener-supported non-state, non-commercial radio stations around the country. Not a bad legacy. Certainly a better legacy than actually impeaching Eisenhower, which would've just saddled us with Richard Nixon a decade early.

(For past editions of the Friday A/V Club, go here.)




What Frederick Douglass Teaches Us about July 4th and American Exceptionalism

Tue, 04 Jul 2017 15:05:00 -0400

I don't think there's a greater Fourth of July speech than Frederick Douglass' 1852 address, "What to the slave is the Fourth of July?" The titular passage is the most searing indictment of precisely the sort of cheap and easy American exceptionalism that continues to clot political rhetoric with the phoniest sort of patriotism:

What, to the American slave, is your 4th of July? I answer: a day that reveals to him, more than all other days in the year, the gross injustice and cruelty to which he is the constant victim. To him, your celebration is a sham; your boasted liberty, an unholy license; your national greatness, swelling vanity; your sounds of rejoicing are empty and heartless; your denunciations of tyrants, brass fronted impudence; your shouts of liberty and equality, hollow mockery; your prayers and hymns, your sermons and thanksgivings, with all your religious parade, and solemnity, are, to him, mere bombast, fraud, deception, impiety, and hypocrisy—a thin veil to cover up crimes which would disgrace a nation of savages. There is not a nation on the earth guilty of practices, more shocking and bloody, than are the people of these United States, at this very hour.

Contemporary conservatives especially recoil from this sort of auto-critique that is in fact one of the most distinctive facets of our national identity. Even before the United States was a nation, figures such as Samuel Sewall (one of the judges in the Salem witchcraft trials who recanted his actions, wore sackcloth and ashes in penance, and authored the first anti-slavery tract in the colonies) and Roger Williams (the religious dissenter who first articulated a theory of fully secular government in English and is the subject of this brilliant biography) excoriated the my-country-right-or-wrong mentality that is hardly specifically American. Sure, there is something grotesque about intergalactic "apology tours" that never seem to right past wrongs or change future policy, but as the constantly shifting valorization of dissent reminds us, partisan politics is a weak foundation upon which to rely for moral standing. Contemporary liberals loved dissent under Bush, found it unpatriotic under Obama, and now with Donald Trump in the White House, are busy rebranding themselves as "the Resistance." Conservatives simply reverse the process.

In pre-abolition America, Douglass was of course specifically addressing slavery, a national original sin so monstrous that he notes its justification is elided in the founding document of the United States. The Constitution is a "glorious liberty document," he notes. But "if the Constitution were intended to be, by its framers and adopters, a slave-holding instrument," Douglass asks rhetorically, "why [is] neither slavery, slaveholding, nor slave...anywhere...found in it"? Yet for all the fury that courses through Douglass' lecture, he "do[es] not despair of this country." Instead, he paints a picture of globalization, interconnectedness, and progress toward more expansive freedom that resonates well over a century after he first spoke it:

While drawing encouragement from the Declaration of Independence, the great principles it contains, and the genius of American Institutions, my spirit is also cheered by the obvious tendencies of the age. Nations do not now stand in the same relation to each other that they did ages ago. No nation can now shut itself up from the surrounding world, and trot round in the same old path of its fathers without interference. The time was when such could be done. Long established customs of hurtful character could formerly fence themselves in, and do their evil work with social impunity. Knowledge was then confined and enjoyed by the privileged few, and the multitude walked on in mental darkness. But a change has now come over the affairs of mankind. Walled cities and empires have become unfashionable. The arm of commerce has borne away the gates of the strong city. Intelligence is penetrating the darkest corners of the globe. It makes its pathway over and under the s[...]



Is Libertarianism a 'Stealth Plan' To Destroy America?

Mon, 03 Jul 2017 08:30:00 -0400

As its title suggests, Democracy in Chains: The Deep History of the Radical Right's Stealth Plan for America, by Duke historian Nancy MacLean, is filled with all sorts of melodramatic flourishes and revelations of supposed conspiracies. Chains, deep history, radicals, stealth—is this nonfiction or an Oliver Stone film? Even the cover depicts a smoke-filled room filled with ample-chinned, shadowy figures!

This book, virtually every page announces, isn't simply about the Nobel laureate economist James Buchanan and his "public choice" theory, which holds in part that public-sector actors are bound by the same self-interest and desire to grow their "market share" as private-sector actors are. No, MacLean is after much-bigger, more-sinister game, documenting what she believes is the utterly chilling story of the ideological origins of the single most powerful and least understood threat to democracy today: the attempt by the billionaire-backed radical right to undo democratic governance...[and] a stealth bid to reverse-engineer all of America, at both the state and the national levels, back to the political economy and oligarchic governance of midcentury Virginia, minus the segregation.

The billionaires in question, of course, are Koch brothers Charles and David, who have reached a level of villainy in public discourse last rivaled by Sacco and Vanzetti. (David Koch is a trustee of Reason Foundation, the nonprofit that publishes this website; Reason also receives funding from the Charles Koch Foundation.)

Along the way, MacLean advances many sub-arguments, such as the notion that the odious, hypocritical, and archly anti-capitalistic 19th-century slavery apologist John C. Calhoun is the spirit animal of contemporary libertarianism. In fact, Buchanan and the rest of us all are nothing less than "Calhoun's modern understudies." Such unconvincing claims ("the Marx of the Master Class," as Calhoun was dubbed by Richard Hofstadter, was openly hostile to the industrialism, wage labor, and urbanization that James Buchanan took for granted) are hard to keep track of, partly because of all the rhetorical smoke bombs MacLean is constantly lobbing.

In a characteristic example, MacLean early on suggests that libertarianism isn't "merely a social movement" but "the story of something quite different, something never before seen in American history": Could it be—and I use these words quite hesitantly and carefully—a fifth-column assault on American democratic governance?

Calling attention to the term's origins to describe Franco's covert, anti-modern allies in the Spanish Civil War, MacLean writes the term "fifth column" has been applied to stealth supporters of an enemy who assist by engaging in propaganda and even sabotage to prepare the way for its conquest. It is a fraught term among scholars, not least because the specter of a secretive, infiltrative fifth column has been used in instrumental ways by the powerful—such as in the Red Scare of the Cold War era—to conjure fear and lead citizens and government to close ranks against dissent, with grave costs for civil liberties. That, obviously, is not my intent in using the term.... And yet it's the only term up for MacLean's job, since "the concept of a fifth column does seem to be the best one available for capturing what is distinctive in a few key dimensions about this quest to ensure the supremacy of capital."

Sure, "fifth column" is a dirty, lowdown, suspect term among historians because using it trades in hysteria at the service of the ruling class rather than rational analysis intended to help the downtrodden. But come on, people, we're in a twilight struggle here, with a movement whose goals have included, among other things, ending censorship; opening the borders to goods and people from around the world; abolishing the draft and reducing militarism; legalizing abortion, drugs, and alternative lifestyles; reforming criminal justice and sentencing; focusing on how existing government operations, especially K[...]



Confederate Monuments Deserve to Go

Sun, 02 Jul 2017 00:00:00 -0400

In 1871, the city of Richmond, Virginia, publicly celebrated the Fourth of July. It was an unfamiliar experience. There had been no general commemoration of Independence Day since 1860—before Virginia had seceded from the nation that was formed in 1776. Other Southern cities were not ready to resume participation in our national ritual. Cheraw became the first place in post-Civil War South Carolina to do so, in 1891. Jackson, Mississippi, waited until 1901 to hold a reading of the Declaration of Independence on the occasion. Vicksburg, Mississippi, didn't join the party until 1945.

Staunch supporters of the Lost Cause had little fondness for the United States. The Stars and Stripes was the banner of their enemy. When Union troops occupied Richmond in April 1865, the first thing they did was hoist the American flag over the capitol. The die-hards recognized what some Southerners miss: the deep contradiction between loving America and revering the Confederacy.

The struggle over what to do with monuments to rebel leaders is a conflict between those who think what they did was admirable or heroic and those who think it was disgraceful. My long-dead relatives include several men who fought for the South. One was Gen. Leonidas Polk, who commanded troops in several major battles before being killed in action. He was not the last person to illustrate that fallibility runs in the family.

In 1961, when I was a boy in the West Texas city of Midland, a new high school opened. It was named after Robert E. Lee, for reasons that are obvious: White resentment of the civil rights movement had produced widespread nostalgia for the Confederacy. San Antonio's Lee High School opened in 1958; Houston's in 1962. Midland Lee called its sports teams the Rebels and used the Confederate battle flag as its symbol. Black students didn't mind, because there weren't any. They attended a segregated black school.

The general did have a connection to Texas. His last U.S. Army command before the Civil War was at a fort in the Hill Country town of Mason—which has no Lee monument. Gerald Gamel, editor of the Mason County News, ascribes the omission to strong anti-secession sentiment in Mason. That tells you something about why other places honor Confederate heroes.

The town had good company in its resistance. Gov. Sam Houston, who fiercely opposed secession, was removed from office because he refused to take an oath to the Confederacy. His role comes to mind because of a recent rally in defense of a statue of him in Houston, which supposedly was under threat from leftists because he owned slaves. Armed counter-protesters, many expressing secessionist views, showed up on the appointed date. But the threat was a hoax, and Houston's self-styled defenders apparently didn't know that he saw disunion as treason.

It was. Yet grand memorials were erected across the South to celebrate what the traitors did. The monuments were built by whites at a time when blacks had no political power—a condition those whites were desperate to preserve. They failed, and they deserved to fail. It's only fitting that Southerners who reject the legacies of slavery, secession, and Jim Crow would prefer to be rid of these tributes to them. It's not a symptom of modern political correctness. Days after the Declaration of Independence was signed, a New York mob destroyed a statue of King George III.

If the men and women of the Revolution were eager to be rid of the images of those who had oppressed them and made war on America, why should African-Americans in the South feel differently about statues of leaders who fought to keep their race in chains? For a long time, American history was owned by white men and minimized the treatment of blacks, women, Indians, and Latinos. Accommodating our public spaces to their full citizenship doesn't erase history. It fills in parts that had been shamefully omitted. The Confederate monuments belong not in places of honor but in museums, as artifacts of past error. [...]



Law & Order: P.O.W. Unit

Fri, 23 Jun 2017 10:15:00 -0400

Resisting Enemy Interrogation was nominated for a Best Documentary Oscar, though it's not a documentary as the term is usually used today. It's a World War II–era military training film that tells a scripted story, dramatizing the ways that Germans might try to extract information from their prisoners. Carefully, methodically, the captors trick their captives into revealing important intelligence.

Here's what's weird about it: The story delves so deeply into the nitty-gritty of the interrogators' methods, observing as they piece together their puzzle, that the bulk of it is basically a police procedural shot from the Nazi point of view. If there's a Law & Order in the Man in the High Castle universe, it probably looks a lot like this:

src="https://www.youtube.com/embed/iZ0cu1UJX44" allowfullscreen="allowfullscreen" width="560" height="315" frameborder="0">

In 1951 the film was remade as a regular theatrical war movie, called Target Unknown. I don't think that happened with any of those training films about venereal disease, but you never know.

(For past editions of the Friday A/V Club, go here.)




'Atomic Humanism' and the Eco-Modernist Campaign to Promote Nuclear Power

Tue, 13 Jun 2017 11:40:00 -0400

"Only nuclear can lift all humans out of poverty while saving the natural environment," Michael Shellenberger said in his keynote address at yesterday's annual meeting of the American Nuclear Society. "Nothing else—not coal, not solar, not geo-engineering—can do that." This, he declared, was one of the first principles of "atomic humanism." Shellenberger is the founder of the pro-nuclear green group Environmental Progress, which argues that the best tool for fighting climate change is the no-carbon power generated by nuclear reactors. His speech offered a tour through the sorry history of environmentalist falsehoods and exaggerations about nuclear power. He began with Ralph Nader, who started training activists on how to stop new nuclear plants in the 1960s. (At one inflammatory moment, Nader declared: "A nuclear plant could wipe out Cleveland, and the survivors would envy the dead.") The Sierra Club soon jumped on board the anti-nuclear campaign. Shellenberger quoted a secret 1974 memo from then-executive director Michael McCloskey: "Our campaign stressing the hazards of nuclear power will supply a rationale for increasing regulation...and add to the cost of the industry." Unfortunately, this strategy worked to perfection. What was the activists' alternative to nuclear power? Fossil fuels. For example, Nader argued that we didn't "need nuclear power" because we "have a far greater amount of fossil fuels in this country than we're owning up to...the tar sands...oil out of shale...methane in coal beds." In 1976 Sierra Club consultant Amory Lovins declared that coal "can fill the real gaps in our fuel economy with only a temporary and modest (less than twofold at peak) expansion of mining." That same year, California Gov. Jerry Brown actually advocated the construction of coal-fired plants in place of nuclear power stations. The results? According to Shellenberger, California's carbon dioxide emissions are now two and a half times higher than they would have been had the planned nuclear plants been allowed to go forward. Meanwhile, vastly more people have died as a result of pollution from fossil fuel power generation than from nuclear power. It gets worse. Many prominent environmentalists, worried that abundant nuclear power would lead to overpopulation, endorsed strong anti-human sentiments. As Shellenberger noted: "Giving society cheap and abundant energy would be the equivalent of giving an idiot child a machine gun," said Paul Ehrlich. "It'd be little short of disastrous for us to discover a source of cheap, clean and abundant energy because of what we would do with it [emphasis original]," said Amory Lovins in 1977. "I didn't really worry about the accidents because there are too many people anyway....I think that playing dirty if you have a noble end is fine," confessed Martin Litton, the Sierra Club member who led the campaign to kill Diablo Canyon nuclear power plant in California. Shellenberger concluded by arguing for pro-nuclear activism including mass protests and sit-ins: There is no short-cut around political engagement. Nuclear energy's opponents are well-financed and well-organized. But they have this huge achilles heel: Their entire agenda rests on a rejection of simple physics and basic ethics. They are in the wrong factually and morally. As such, when they are confronted with the truth—when it is pointed out that the emperor is wearing no clothes—they lose their power.... It's time for action. We have to move. We must confront the truth, and confront the threat. 
By standing up to Sierra Club, NRDC, and other anti-nuclear greenwashers, we saved nuclear plants in Illinois and New York. A new grassroots movement, Generation Atomic, is backing measures to keep current nuclear power plants operating and also advocating the deployment of new advanced reactors. For example, Generation Atomic activists are now going door-to-door in Ohio urging voters to pressure st[...]



We Could Have Had Cellphones Four Decades Earlier

Sun, 11 Jun 2017 06:00:00 -0400

The basic idea of the cellphone was introduced to the public in 1945—not in Popular Mechanics or Science, but in the down-home Saturday Evening Post. Millions of citizens would soon be using "handie-talkies," declared J.K. Jett, the head of the Federal Communications Commission (FCC). Licenses would have to be issued, but that process "won't be difficult." The revolutionary technology, Jett promised in the story, would be formulated within months. But permission to deploy it would not. The government would not allocate spectrum to realize the engineers' vision of "cellular radio" until 1982, and licenses authorizing the service would not be fully distributed for another seven years. That's one heck of a bureaucratic delay.

Primitive Phones and Spectrum-Hoarding

Before there were cellphones, there was the mobile telephone service, or MTS. Launched in 1946, this technology required unwieldy and expensive equipment—the transceiver could fill the trunk of a sedan—and its networks faced tight capacity constraints. In the beginning, the largest MTS markets had no more than 44 channels. As late as 1976, Bell System's mobile network in New York could host just 545 subscribers. Even at sky-high prices, there were long waiting lists for subscriptions.

Cellular networks were an ingenious way to expand service dramatically. A given market would be split into cells with a base station in each. These stations, often located on towers to improve line-of-sight with mobile phone users, were able both to receive wireless signals and to transmit them. The base stations were themselves linked together, generally by wires, and connected to networks delivering plain old telephone service.

The advantages of this architecture were profound. Mobile radios could use less power, because they needed only to reach the nearest base station, not a mobile phone across town. Not only did this save battery life, but transmissions stayed local, leaving other cells quiet. A connection in one cell would be passed to an adjacent cell and then the next as the mobile user moved through space. The added capacity came from reusing frequencies, cell to cell. And cells could be "split," yielding yet more capacity. In an MTS system, each conversation required a channel covering the entire market; only a few hundred conversations could happen at once. A cellular system could create thousands of small cells and support hundreds of thousands of simultaneous conversations.

When AT&T wanted to start developing cellular in 1947, the FCC rejected the idea, believing that spectrum could be best used by other services that were not "in the nature of convenience or luxury." This view—that this would be a niche service for a tiny user base—persisted well into the 1980s. "Land mobile," the generic category that covered cellular, was far down on the FCC's list of priorities. In 1949, it was assigned just 4.7 percent of the spectrum in the relevant range. Broadcast TV was allotted 59.2 percent, and government uses got one-quarter. Television broadcasting had become the FCC's mission, and land mobile was a lark.

Yet Americans could have enjoyed all the broadcasts they would watch in, say, 1960 and had cellular phone service too. Instead, TV was allocated far more bandwidth than it ever used, with enormous deserts of vacant television assignments—a vast wasteland, if you will—blocking mobile wireless for more than a generation. How empty was this spectrum?
Across America's 210 television markets, the 81 channels originally allocated to TV created some 17,010 slots for stations. From this, the FCC planned in 1952 to authorize 2,002 TV stations. By 1962, just 603 were broadcasting in the United States. Yet broadcasters vigorously defended the idle bandwidth. When mobile telephone advocates tried to gain access to the lightly used ultra-high frequency (UHF) band, the broadcasters deluged the commission, argui[...]
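To make the frequency-reuse argument concrete, here is a minimal back-of-the-envelope sketch in Python. The 44-channel figure for early MTS markets comes from the article; the cell count, reuse factor, and cellular channel allocation are illustrative assumptions chosen for the sketch, not numbers from the piece.

```python
# Rough comparison of simultaneous-call capacity: old MTS vs. a cellular layout
# with frequency reuse. Illustrative assumptions except where noted.

MTS_CHANNELS = 44          # article's figure: largest early MTS markets had ~44 channels,
mts_capacity = MTS_CHANNELS  # and each channel covered the entire market

# Hypothetical cellular deployment over the same market (assumed numbers).
cells = 100                # market split into 100 cells
reuse_factor = 7           # classic 7-cell reuse pattern: a channel set repeats every 7 cells
channels_total = 395       # assumed total channels allocated to the cellular band

channels_per_cell = channels_total // reuse_factor
cellular_capacity = channels_per_cell * cells   # every cell uses its channel set at the same time

print(f"MTS simultaneous calls:      {mts_capacity}")
print(f"Cellular simultaneous calls: {cellular_capacity}")
# Splitting cells (raising `cells`) multiplies capacity again without any new spectrum.
```

The arithmetic just restates the article's point: because each channel set can be reused in every cell outside its interference neighborhood, capacity scales with the number of cells rather than with the raw channel count.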



The Indestructible Idea of the Basic Income

Sat, 03 Jun 2017 00:00:00 -0400

Andy Stern is a former president of the Service Employees International Union. Charles Murray may be America's most prominent right-wing critic of the welfare state. So when they appeared onstage together in Washington, D.C., last fall to discuss the basic income—the idea of keeping people out of poverty by giving them regular unconditional cash payments—the most striking thing about the event was that they kept agreeing with each other.

It isn't necessarily surprising that Stern and Murray both back some version of the concept. It has supporters across the political spectrum, from Silicon Valley capitalists to academic communists. But this diverse support leads naturally to diverse versions of the proposal, not all of which are compatible with one another. Some people want to means-test the checks so that only Americans below a certain income threshold receive them; others want a fully universal program, given without exceptions. Some want to replace the existing welfare state; others want to tack a basic income onto it. There have been tons of suggestions for how to fund the payments and for how big they should be. When it comes to the basic income, superficial agreement is common but actual convergence can be fleeting.

In Stern's case, the central issue driving his interest in the idea is the turmoil he expects automation to bring to the economy. In the future, he and Lee Kravitz predict in their 2016 book Raising the Floor, tens of millions of jobs will disappear, leaving much of the country stuck with work that is "contingent, part-time, and driven largely by people's own motivation, creativity, and the ability to make a job out of 'nothing.'" A basic income, he hopes, would bring some economic security to their lives.

Read Murray's first detailed pitch for a guaranteed income, the 2006 book In Our Hands, and you won't see anything like that. Its chief concern is shifting power from government bureaucracies to civil society. It doesn't just propose a new transfer program; it calls for repealing every other transfer program. And automation isn't a part of its argument at all. But onstage at the Cato Institute in D.C., Murray was as worried as Stern about technological job loss, warning that "we are going to be carving out millions of white-collar jobs, because artificial intelligence, after years of being overhyped, has finally come of age."

Meanwhile, Stern signaled that he was open not just to replacing welfare programs for the disadvantaged but possibly even to rethinking Social Security, provided that people still have to contribute money to some sort of retirement system and that Americans who have already paid in don't get shortchanged. He drew the line at eliminating the government's health insurance programs—but the other guy on the stage agreed that health care was different. Under Murray's plan, citizens would be required to use part of their grant to buy health insurance, and insurance companies would be required to treat the population as a single pool.

The Murray/Stern convergence comes as the basic income is enjoying a wave of interest and enthusiasm. The concept comes up in debates over everything from unemployment to climate change. Pilot programs testing various versions of the idea are in the works everywhere from Oakland to Kenya, and last year Swiss voters considered a plan to introduce a guaranteed income nationwide. (They wound up rejecting the referendum overwhelmingly, with only 23 percent voting in favor. I didn't say everyone was enthusiastic.)

This isn't the first time the basic income or an idea like it has edged its way onto the agenda. It isn't even the first time we've seemed to see an ideological convergence. This patchwork of sometimes-overlapping movements with sometimes-overlapping proposals has a history that stretches back centuries.

It Usually Begins with Tom Paine

Just where you pinpo[...]



When Buying Life Insurance Was Deemed Immoral

Mon, 29 May 2017 06:00:00 -0400

The product was perfectly legal. Many prominent clergymen endorsed it, including celebrity preacher Henry Ward Beecher, brother of Uncle Tom's Cabin author Harriet Beecher Stowe. The Pennsylvania House declared in 1811 that it "would be highly beneficial to many descriptions of citizens throughout the state." The need was clear, and the businesses that sold it were untainted by scandal, bankruptcy, or fraud. They delivered what they promised.

But in the early 19th century, Americans just wouldn't buy life insurance.

The problem wasn't mere procrastination. Many people deemed the very idea immoral. "Has a man the right to make the continuance of his life the basis of a bargain? Is it not turning a very solemn thing into a mere commercial transaction?" wrote a typical critic. Religious traditionalists believed they should trust in God's providence, not a financial contract, to care for their loved ones after death. Others, pointing to arson to collect fire insurance, worried that it might encourage murder. Paternalists—and competitors—warned that beneficiaries wouldn't know how to manage a sudden windfall.

The New York Times opined that life insurance eroded the work ethic and discouraged steady savings. It was, the paper editorialized, "calculated to encourage reliance upon something besides economy and industry and to lead accordingly to the relaxation and decay of those cardinal virtues of society." Taking a similar line, the president of a savings bank voiced concern that the "anodyne of security" defied God's plan, in which fear of poverty, which he called the "pressure of wants," encouraged thrift and hard work. (The right kind of thrift, of course, included a savings bank account.)

Then there were the women. Life insurance was supposed to protect widows from destitution if their husbands died, but wives were among its biggest opponents. "It is almost incredible that one of the obstacles to the universal practice of life insurance is found in the opposition of wives and mothers," complained a pro-insurance writer. Wives were often afraid that placing a bet on death would tempt fate. Many viewed a life insurance payout as untouchable "blood money."

Then something changed. "Not until the 1840s did life insurance begin selling in any significant degree. Then, suddenly, in the span of a few years, its rate of growth became astonishingly high," writes the economic sociologist Viviana Zelizer in her landmark 1979 study of the shift, Morals and Markets, now out in a new edition. "In 1840 there were 15 life insurance companies in the United States and the estimated amount of total insurance in force was under $5 million," she notes. "By 1860, there were 43 companies and almost $205 million of insurance in force."

One reason was undoubtedly economic. As Americans left the countryside to work for wages in growing industrial cities, life insurance became more valuable. A farmer who died left his wife and children a means to earn a living, carrying on as before. The widows and orphans of urban wage earners, by contrast, had no way to replace the lost income. Urban life tended to be more dangerous, with poorer nutrition and a greater chance of disease. In the years since Zelizer first published her book, careful research by the late Nobel laureate Robert Fogel and other economists has demonstrated that life expectancies for those who survived childhood were falling after about 1840, as were adult heights, an indicator of health and nutrition.
"Although the technological progress, industrialization, and urbanization of the nineteenth century laid the basis for a remarkable advance in health and nutritional status during the first half of the twentieth century," wrote Fogel, "their effects on the conditions of life of the lower classes were mixed at least until the 1870s or 1880s. In the U.S. the negative effects[...]



That Old-Time Civil Religion

Sat, 27 May 2017 06:00:00 -0400

The Tragedy of U.S. Foreign Policy: How America's Civil Religion Betrayed the National Interest, by Walter A. McDougall, Yale University Press, 408 pages, $30

Never mind the First Amendment; the United States has an official religion after all. It's a civil religion, and the deity's role is to bestow blessings on the state. The "Supreme Architect," "the Almighty Being," "the Infinite Power," and "the Being Who Regulates the Destiny of Nations" are just a few of the sobriquets that Benjamin Franklin, George Washington, Thomas Jefferson, and James Madison gave to the nation's nondenominational guardian spirit. For some the civil religion might be mere symbolism; others might conflate it with Christianity. Either way, it helps give the nation a sense of purpose, or so historian Walter McDougall contends.

In The Tragedy of U.S. Foreign Policy, McDougall traces how changes in the American civil religion (or "ACR") have shaped the country's attitudes toward war and peace. From the founding until the Spanish-American War of 1898, what McDougall calls the "Classical ACR" (or "Neo-Classical ACR" after the Civil War) prevailed. It was a faith of national expansion on the North American continent, but it did not, in the words of John Quincy Adams, "go forth in search of monsters to destroy" overseas. A new faith took hold in the last decade of the 19th century: the "Progressive American Civil Religion," which became an even more firmly entrenched "Neo-Progressive ACR" during the Cold War. This was a militant faith that conceived of the nation's mission as being, in George W. Bush's words, to "end tyranny in our world." Today a third faith, the "Millennial ACR," aspires to unite the world through a global economy and regime of universal rights. It too has roots in the Cold War, though McDougall identifies it primarily with presidents Clinton and Obama.

You'll notice a pattern. Each civil religion has a "neo" phase that emerges when its original formulation runs into trouble. The basic impulse—toward staying at home, asserting American primacy in international affairs, or uniting the world—stays the same, but the rhetoric gets updated. And the progression from one civil religion to the next is not strictly linear: After World War I, for example, the Progressive ACR was partly discredited and the broadly non-interventionist Classical ACR enjoyed a slight return. Similarly, the globalist Millennial ACR was knocked back by the 9/11 attacks and the wars of the George W. Bush years, which brought the Cold War–style "Neo-Progressive ACR" back into fashion.

McDougall, who teaches history and international relations at the University of Pennsylvania, is a zestful writer as well as a meticulous scholar. He sometimes writes like a prophet—not in the sense of foretelling the future, but in relying on compact insight rather than step-by-step logical argument. He covers the sweep of U.S. foreign policy over some 200 years in a little more than 350 pages. Hang tight and enjoy the ride.

McDougall is at his best when zooming in on the details of history and revealing the truth to be rather different from what other writers have led us to believe. The Tragedy of U.S. Foreign Policy is, among other things, a rejoinder to Robert Kagan's 2006 book Dangerous Nation, which argues that America has always aspired to remake the world in the image of its own values. McDougall shows that Abraham Lincoln, for one, never supported wars to promote revolution or to spread liberalism through empire building.
Lincoln's son Robert made a rare public statement to denounce an attempt by then–President Theodore Roosevelt to link his father's name to an imperialist foreign policy. Peopling the continent—even when it already had quite a few other people—was the great mission that America's first civil religion endors[...]



3 Ways We're Reliving the Watergate Culture War

Wed, 24 May 2017 14:45:00 -0400

Whether or not we're reliving the Watergate investigation, we sure do seem intent on reenacting the Watergate culture war. That isn't just true of Donald Trump's critics, who are understandably eager to compare the 37th and 45th presidents. It's true of Trump and his team, who keep echoing arguments offered by Richard Nixon and his defenders four decades ago:

1. The double-standard defense. Complain about something Trump has done, and someone is bound to ask why you didn't say a peep when Hillary Clinton or Barack Obama did some other bad thing. (You will get this response even if you protested Clinton or Obama's action quite loudly.) The most prominent person to talk like this, of course, is Donald Trump himself:

With all of the illegal acts that took place in the Clinton campaign & Obama Administration, there was never a special counsel appointed! — Donald J. Trump (@realDonaldTrump) May 18, 2017

But this defense is a lot older than the present president's political career. Throughout the Watergate investigation, Nixon complained angrily that his predecessors had gotten away with the very activities that were getting him in trouble. In his 2003 book Nixon's Shadow, the Rutgers historian David Greenberg lays out some examples:

"If I were a liberal," [Nixon] told [die-hard defender Baruch Korff], "Watergate would be a blip." He compiled a private catalogue of behaviors by others that he believed excused his own. On the basis of comments J. Edgar Hoover made to him, he frequently claimed, not quite accurately, that Lyndon Johnson had bugged his campaign plane in 1968. When Nixon was chided for spying on political opponents, he shot back that John and Robert Kennedy had done the same. And as precedents for his 1972 program of political sabotage, he regularly cited the pranks of Democratic operative Dick Tuck, who had hounded Nixon since his 1950 Senate race. During the Watergate Hearings, [White House Chief of Staff H.R.] Haldeman testified that "dirty tricks" maestro Donald Segretti was hired to be a "Dick Tuck for our side."

There's more—much more—but you get the idea. Now, Nixon may have gotten his facts a little scrambled when it came to that alleged airplane bug, and some of the supposed precursors to his crimes didn't actually fit the bill. (He seemed convinced that Daniel Ellsberg's leak of the Pentagon Papers was comparable to the Watergate break-in—a bizarre analogy, though if you've been following the debates over Edward Snowden you've probably heard worse.) But broadly speaking, the president had a point. Many American leaders had abused their powers, sometimes in ways that resembled the Nixon scandals, and the press hadn't always been quick to trumpet the news. Like Nixon, JFK had wiretapped reporters and used the IRS as a political weapon. LBJ may not have bugged Nixon's plane in 1968, but he did spy on Goldwater in 1964. And both Kennedy and Johnson, like many others who have held their job, presided over enormous violations of dissenters' civil liberties. You can make a decent case that Nixon's misbehavior was even worse than theirs, but you can see how the man could get a little resentful about the uneven attention.

The trouble with the double-standard defense is that it isn't much of a defense. The crimes of prior presidents aren't a reason to let Nixon off the hook; they're a reason to rein in not just one abusive president but the whole imperial presidency. The same goes for any Trumpian abuses today.

2. Intimations of a "coup."
Then as now, each side accused the other of plotting a coup. Rumors that Nixon was planning to seize dictatorial powers circulated not just on the political fringes but in official Washington; many of the president's foes feared that fascism was on the way. After Nixon had Special Prosecutor Archibald [...]



History Lessons Are Turning My Kid Into a Scofflaw (and I Couldn’t Be Happier)

Tue, 23 May 2017 00:01:00 -0400

"If I'd lived then, I'd have still gone to saloons," Anthony, my 11-year-old son, said as we watched the Ken Burns documentary, Prohibition. "But I'd have carried a gun in case I had to deal with police or militia." He commented after a scene in which Portland, Maine's Mayor Neal Dow—nicknamed "the sublime fanatic"—ordered troops in 1855 to fire on an angry crowd outside City Hall. They had gathered to protest the statewide ban on alcoholic beverages that Dow pushed through in his zeal to make the world a better place as he conceived such a thing. Like most fanatics, sublime or otherwise, the mayor didn't have a lot of patience for disagreement. One man was killed and seven wounded that day by the forces of mandatory sobriety. Interesting, well-produced, and drawing on multiple sources and experts, Prohibition lends itself beautifully to our homeschooling efforts. It does a thorough job of exploring the religious, reformist, and nativist roots of first the Temperance movement and then the push for full-on Prohibition. We've recently studied the Progressive Era and the fight for women's suffrage, and the documentary pulls in those histories, showing how social movements influence one another and often come together to achieve common goals—sometimes good, and other times leading to disastrous exercises in self-righteous presumption like Prohibition. The Prohibition website includes excellent additional material, too, including an activity asking students to decide between two conceptions of the role of government: In a democracy, people should have the freedom to make their own choices and be responsible for their actions. If they want to indulge in destructive personal behavior, that's their business, not the governments. A democratic government is made up of its citizens and a major responsibility of government is to guarantee equal opportunity for all. The government has a duty to alleviate social ills and guarantee that no one is in need. Those competing views of the state play an ongoing role through many of our lessons. Anthony knows my own opinions, and is no doubt influenced by them, but I always make sure to present him with competing viewpoints. Personally, I think the past speaks for itself as to which of those roles works better in practice, but I also see my job as raising my son to be a rational adult, not a clone of me. So when we studied the Progressive Era we worked with a series of Great Courses lectures by a college professor sympathetic to the progressives, online lectures from Hillsdale College that have a broadly conservative tone, readings from Thaddeus Russell's A Renegade History of the United States, and excerpts from Illiberal Reformers by Thomas C. Leonard. Anthony got an earful of would-be reformers decrying poverty and abuses in the world around them, but also disparaging individuals as "plastic lumps of human dough." He read pre-presidential Woodrow Wilson dismissing individual rights as "nonsense," and perused objections that adherence to respect for natural individual rights "prevent us from determining what social or individual tendencies we shall favor, what we shall depress; It will in general prevent us from imposing a social ideal, and compel us to leave a social anarchy." Anthony considered the Great Courses presenter too respectful of self-appointed shepherds who he found to be condescending and bossy, and the Hillsdale lectures overly deferential to religious authority—off-putting, to him, in its own way. 
To him, the pseudoscientific racism of the era, culminating in calls for eugenics controls and even elimination of whole populations, thoroughly tainted the confidence of the period's reformers that they were uniquely qualified to mold those lumps of human dough they saw all a[...]



Life Among the Ants

Fri, 19 May 2017 11:02:00 -0400

In the late 1940s and early '50s, GM chief Alfred P. Sloan funded a series of anti-communist cartoons. (The story behind the films is convoluted, but the compressed version is that Sloan's foundation gave grants to Harding College, an Arkansas-based Christian school, which then paid former Disney animator John Sutherland's studio to make them.) One of the shorts is Albert in Blunderland, a 1950 attack on the planned economy. It presents communism as an anthill society—literally, with actual ants.

While just about everyone involved in funding this film hailed from the political right, the cartoon was clearly aimed at a union-friendly working-class audience; it defends independent trade unions and warns that state factories will be able to impose harsh speed-ups with impunity. In a precursor of sorts to the Hard Hat Riot, it ends with a blue-collar worker beating up a socialist:

src="https://archive.org/embed/albert-in-blunderland-1950" allowfullscreen="allowfullscreen" width="640" height="480" frameborder="0">

(For past editions of the Friday A/V Club, go here. I haven't featured any of the Sloan/Harding/Sutherland films in the A/V Club before, but their 1948 effort "Make Mine Freedom" has turned up elsewhere on this website.)




Old Times There Are Best Forgotten

Tue, 16 May 2017 16:15:00 -0400

CHARLOTTESVILLE, VA—White supremacist provocateur Richard Spencer showed up in my town this past Saturday to roil the debate over the city council's planned removal of statues of Confederate Gens. Robert E. Lee and Stonewall Jackson. Spencer and a few score folks carrying flaming tiki torches gathered in Lee Park, a couple of blocks from my house, where they chanted, "What brings us together is that we are white, we are a people, we will not be replaced," and "Russia is our friend."

Of course, Spencer and his associates have, as my Reason colleague Robby Soave points out, the constitutional right to express their views in public. Spencer and his supporters are, as usual, in the wrong. The time has come to remove from public land the monuments honoring the men who led the Confederacy to defeat. But doing so doesn't mean we must then move on to purging slave-owning Founders or even memorials for dead southern soldiers. Looking back requires us to balance the good and the bad, and—on balance—Lee, Stonewall Jackson, and other Confederate leaders simply don't make the cut.

Before delving more deeply into the Confederate memorial controversy, let me set out my Southern bona fides. I was born in Texas and reared on my family's dairy farm in the Appalachian Mountains of Southwest Virginia. Our county schools were racially integrated in 1963 when I was in the third grade. My third grade Virginia history book referred to the Civil War as the War Between the States and asserted that that conflict was chiefly over states' rights. Virginia Generals Lee, Jackson, and Stuart were portrayed as honorable and heroic defenders of Southern rights. My high school's team name was the Rebels and our fight song was "Dixie." It was not uncommon to see the Stars and Bars being waved in stands during football games. It is, however, worth noting that in a school in which African Americans made up less than 10 percent of the student body, my class elected a black senior as our homecoming queen.

As a student at the University of Virginia in the early 1970s, I learned that many parts of the Commonwealth had not actually desegregated until 1971. At UVA I belonged to a literary and debating society whose members drank a great deal and often sang songs commemorating the Lost Cause, including "The Bonnie Blue Flag" and "Carry Me Back to Old Virginia," but also Yankee tunes like "The Battle Hymn of the Republic." I remained largely unconscious of how offensive Confederate symbols were to some people. That changed when my black roommate Dwayne Morris took a small Stars and Bars out of the coffee mug in which it was standing in our apartment, broke its staff in two and threw it in the trash. Several subsequent long, boozy conversations ended any residual sentimental attachment to the Lost Cause that I may have retained from my earlier schooling.

Still, as a young Virginian I never gave much thought to what the Confederate monuments and memorials that appear in nearly every southern town represented. After Reconstruction, Ladies Memorial Associations (LMAs) in the South sprang up to advocate for and oversee the repatriation of the remains of Confederate soldiers and to commemorate their deaths by erecting generic war memorial statues. Ultimately, the LMAs joined together to form the United Daughters of the Confederacy in 1894.
It is, however, plain historical fact that most of those memorials to the Confederate dead and monuments to Confederate leaders were erected between 1890 and 1925, when Jim Crow racial apartheid was being established in the South. They were meant and served as powerful symbols of resurgent white supremacy. For example, the monument to Confederate President Jefferson Davis that was just t[...]



30 Days a Black Man: How Ray Sprigle Exposed Jim Crow in 1940s America [Reason Podcast]

Fri, 12 May 2017 11:00:00 -0400

In 1948, veteran newsman Ray Sprigle, best known for having exposed Supreme Court Justice Hugo Black's membership in the Ku Klux Klan, published an explosive series detailing his month-long trip through the Jim Crow South. A white man, Sprigle altered his appearance and passed as black so that he could experience firsthand a part of the country that most Americans either didn't know much or care much about. Traveling with the well-known NAACP activist John Wesley Dobbs, Sprigle (pronounced sprig-el) published 21 articles and a book that detailed the ways in which segregation was ruthlessly enforced at every level of interaction between the races. Party-line phone operators, for instance, would never address blacks as mister or missus on a call, and shop owners would drape napkins or tissues over a black woman's head when she tried on a hat.

Bill Steigerwald's powerful new book, 30 Days a Black Man: The Forgotten Story That Exposed the Jim Crow South, documents Sprigle's exposé and does a masterful job of recreating an America in which de facto and de jure segregation was the rule not just in the former Confederacy but much of the North as well. It's a deeply disturbing and profoundly moving account of what Steigerwald, himself a veteran newsman whose previous book forced the publisher of John Steinbeck's Travels With Charley to reclassify the supposed travelogue as fiction, calls "superstars" fighting for equality under the law (along with Sprigle and Dobbs, Steigerwald points to NAACP head Walter White, who chose to identify as black despite being able to pass as white, and Eleanor Roosevelt, the former First Lady whose commitment to civil rights bore most of its fruit during the Truman years).

"When anybody goes back in history," Steigerwald tells Nick Gillespie in the latest Reason Podcast,

you learn that nothing is new, everything was worse, and what you thought was simple or true was not. When you look back at '48 and you see this stuff, and Ray Sprigle's reporting, he was a reporter. When he heard guys in Atlanta say, "Oh, Atlanta's a great city for black people. Nothing ever happens here." Well, he went down the courthouse and dug up some records and he came up with three cases in the last two years where young black males, this sounds a little familiar, were shot dead by cops or trolley conductors who were armed at the time and were able to shoot anybody. They were shot dead and the defense was always, "Oh, I thought he was reaching for a gun or something. I shot him dead," and they all got off. I mean, you could take those examples and put them in the paper today and people would say, "Well, yeah."... I have such a deeper appreciation for the punishment that black people received from their government for so long and the crass politics that perpetuated it.

Read Sprigle's original series here. Produced by Ian Keyser. Subscribe, rate, and review the Reason Podcast at iTunes. Listen at SoundCloud below:

[Embedded audio: https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/322279622&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&visual=true]

Don't miss a single Reason podcast! Archive here. Subscribe at iTunes. Follow us at SoundCloud. Subscribe at YouTube. Like us on Facebook. Follow us on Twitter. Subscribe to Reason magazine (print or digital) for just $15.

This is a rush transcript. It has not been checked for spelling or errors—check all quotes against the audio for accuracy.
Nick Gillespie: Today we're talking with Bill Steigerwald. He's a longtime newspaper man, author of several books, most recently an incredible story called 30 Days a B[...]