Subscribe: History
http://www.reason.com/topics/topic/165.xml
Language: English

History



All Reason.com articles with the "History" tag.



Published: Thu, 22 Jun 2017 00:00:00 -0400

Last Build Date: Thu, 22 Jun 2017 14:57:36 -0400

 



'Atomic Humanism' and the Eco-Modernist Campaign to Promote Nuclear Power

Tue, 13 Jun 2017 11:40:00 -0400

"Only nuclear can lift all humans out of poverty while saving the natural environment," Michael Shellenberger said in his keynote address at yesterday's annual meeting of the American Nuclear Society. "Nothing else—not coal, not solar, not geo-engineering—can do that." This, he declared, was one of the first principles of "atomic humanism." Shellenberger is the founder of the pro-nuclear green group Environmental Progress, which argues that the best tool for fighting climate change is the no-carbon power generated by nuclear reactors. His speech offered a tour through the sorry history of environmentalist falsehoods and exaggerations about nuclear power. He began with Ralph Nader, who started training activists on how to stop new nuclear plants in the 1960s. (At one inflammatory moment, Nader declared: "A nuclear plant could wipe out Cleveland, and the survivors would envy the dead.") The Sierra Club soon jumped on board the anti-nuclear campaign. Shellenberger quoted a secret 1974 memo from then-executive director Michael McCloskey: "Our campaign stressing the hazards of nuclear power will supply a rationale for increasing regulation...and add to the cost of the industry." Unfortunately, this strategy worked to perfection. What was the activists' alternative to nuclear power? Fossil fuels. For example, Nader argued that we didn't "need nuclear power" because we "have a far greater amount of fossil fuels in this country than we're owning up to...the tar sands...oil out of shale...methane in coal beds." In 1976 Sierra Club consultant Amory Lovins declared that coal "can fill the real gaps in our fuel economy with only a temporary and modest (less than twofold at peak) expansion of mining." That same year, California Gov. Jerry Brown actually advocated the construction of coal-fired plants in place of nuclear power stations. The results? According to Shellenberger, California's carbon dioxide emissions are now two and a half times higher than they would have been had the planned nuclear plants been allowed to go forward. Meanwhile, vastly more people have died as a result of pollution from fossil fuel power generation than from nuclear power. It gets worse. Many prominent environmentalists, worried that abundant nuclear power would lead to overpopulation, endorsed strong anti-human sentiments. As Shellenberger noted: "Giving society cheap and abundant energy would be the equivalent of giving an idiot child a machine gun," said Paul Ehrlich. "It'd be little short of disastrous for us to discover a source of cheap, clean and abundant energy because of what we would do with it [emphasis original]," said Amory Lovins in 1977. "I didn't really worry about the accidents because there are too many people anyway....I think that playing dirty if you have a noble end is fine," confessed Martin Litton, the Sierra Club member who led the campaign to kill Diablo Canyon nuclear power plant in California. Shellenberger concluded by arguing for pro-nuclear activism including mass protests and sit-ins: There is no short-cut around political engagement. Nuclear energy's opponents are well-financed and well-organized. But they have this huge achilles heel: Their entire agenda rests on a rejection of simple physics and basic ethics. They are in the wrong factually and morally. As such, when they are confronted with the truth—when it is pointed out that the emperor is wearing no clothes—they lose their power.... It's time for action. We have to move. We must confront the truth, and confront the threat. 
By standing up to Sierra Club, NRDC, and other anti-nuclear greenwashers, we saved nuclear plants in Illinois and New York. A new grassroots movement, Generation Atomic, is backing measures to keep current nuclear power plants operating and also advocating the deployment of new advanced reactors. For example, Generation Atomic activists are now going door-to-door in Ohio urging voters to pressure state legislators to support the ZEN (Zero Emissions Nuclear) bill, which aims to keep both Davis-Besse and Perry nuclear plants operating.[...]



We Could Have Had Cellphones Four Decades Earlier

Sun, 11 Jun 2017 06:00:00 -0400

The basic idea of the cellphone was introduced to the public in 1945—not in Popular Mechanics or Science, but in the down-home Saturday Evening Post. Millions of citizens would soon be using "handie-talkies," declared J.K. Jett, the head of the Federal Communications Commission (FCC). Licenses would have to be issued, but that process "won't be difficult." The revolutionary technology, Jett promised in the story, would be formulated within months. But permission to deploy it would not. The government would not allocate spectrum to realize the engineers' vision of "cellular radio" until 1982, and licenses authorizing the service would not be fully distributed for another seven years. That's one heck of a bureaucratic delay.

Primitive Phones and Spectrum-Hoarding

Before there were cellphones, there was the mobile telephone service, or MTS. Launched in 1946, this technology required unwieldy and expensive equipment—the transceiver could fill the trunk of a sedan—and its networks faced tight capacity constraints. In the beginning, the largest MTS markets had no more than 44 channels. As late as 1976, Bell System's mobile network in New York could host just 545 subscribers. Even at sky-high prices, there were long waiting lists for subscriptions. Cellular networks were an ingenious way to expand service dramatically. A given market would be split into cells with a base station in each. These stations, often located on towers to improve line-of-sight with mobile phone users, were able both to receive wireless signals and to transmit them. The base stations were themselves linked together, generally by wires, and connected to networks delivering plain old telephone service. The advantages of this architecture were profound. Mobile radios could use less power, because they needed only to reach the nearest base station, not a mobile phone across town. Not only did this save battery life, but transmissions stayed local, leaving other cells quiet. A connection in one cell would be passed to an adjacent cell and then the next as the mobile user moved through space. The added capacity came from reusing frequencies, cell to cell. And cells could be "split," yielding yet more capacity. In an MTS system, each conversation required a channel covering the entire market; only a few hundred conversations could happen at once. A cellular system could create thousands of small cells and support hundreds of thousands of simultaneous conversations. When AT&T wanted to start developing cellular in 1947, the FCC rejected the idea, believing that spectrum could be best used by other services that were not "in the nature of convenience or luxury." This view—that this would be a niche service for a tiny user base—persisted well into the 1980s. "Land mobile," the generic category that covered cellular, was far down on the FCC's list of priorities. In 1949, it was assigned just 4.7 percent of the spectrum in the relevant range. Broadcast TV was allotted 59.2 percent, and government uses got one-quarter. Television broadcasting had become the FCC's mission, and land mobile was a lark. Yet Americans could have enjoyed all the broadcasts they would watch in, say, 1960 and had cellular phone service too. Instead, TV was allocated far more bandwidth than it ever used, with enormous deserts of vacant television assignments—a vast wasteland, if you will—blocking mobile wireless for more than a generation. How empty was this spectrum?
Across America's 210 television markets, the 81 channels originally allocated to TV created some 17,010 slots for stations. From this, the FCC planned in 1952 to authorize 2,002 TV stations. By 1962, just 603 were broadcasting in the United States. Yet broadcasters vigorously defended the idle bandwidth. When mobile telephone advocates tried to gain access to the lightly used ultra-high frequency (UHF) band, the broadcasters deluged the commission, arguing ferociously and relentlessly that mobile telephone service was an inefficient use of spectrum. It may seem surprising that they were [...]
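To make the capacity arithmetic above concrete, here is a minimal, illustrative Python sketch under stated assumptions rather than reporting: the 44-channel pool is the article's figure for the largest MTS markets, while the 1,000-cell build-out and the 7-cell frequency-reuse cluster are hypothetical values chosen only to show the mechanism.

def mts_capacity(channels):
    # MTS: each conversation occupies one channel across the entire market.
    return channels

def cellular_capacity(channels, cells, reuse_cluster):
    # Cellular: the channel pool is divided among a cluster of cells, and every
    # cluster of `reuse_cluster` cells reuses the same frequencies.
    channels_per_cell = channels // reuse_cluster
    return channels_per_cell * cells

pool = 44
print("MTS simultaneous calls:", mts_capacity(pool))                    # 44
print("Cellular simultaneous calls:",
      cellular_capacity(pool, cells=1000, reuse_cluster=7))             # 6000

Splitting cells, as the article notes, raises the cell count and multiplies capacity again, which is why a cellular system could support hundreds of thousands of conversations where MTS topped out at a few hundred.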



The Indestructible Idea of the Basic Income

Sat, 03 Jun 2017 00:00:00 -0400

Andy Stern is a former president of the Service Employees International Union. Charles Murray may be America's most prominent right-wing critic of the welfare state. So when they appeared onstage together in Washington, D.C., last fall to discuss the basic income—the idea of keeping people out of poverty by giving them regular unconditional cash payments—the most striking thing about the event was that they kept agreeing with each other. It isn't necessarily surprising that Stern and Murray both back some version of the concept. It has supporters across the political spectrum, from Silicon Valley capitalists to academic communists. But this diverse support leads naturally to diverse versions of the proposal, not all of which are compatible with one another. Some people want to means-test the checks so that only Americans below a certain income threshold receive them; others want a fully universal program, given without exceptions. Some want to replace the existing welfare state; others want to tack a basic income onto it. There have been tons of suggestions for how to fund the payments and for how big they should be. When it comes to the basic income, superficial agreement is common but actual convergence can be fleeting. In Stern's case, the central issue driving his interest in the idea is the turmoil he expects automation to bring to the economy. In the future, he and Lee Kravitz predict in their 2016 book Raising the Floor, tens of millions of jobs will disappear, leaving much of the country stuck with work that is "contingent, part-time, and driven largely by people's own motivation, creativity, and the ability to make a job out of 'nothing.'" A basic income, he hopes, would bring some economic security to their lives. Read Murray's first detailed pitch for a guaranteed income, the 2006 book In Our Hands, and you won't see anything like that. Its chief concern is shifting power from government bureaucracies to civil society. It doesn't just propose a new transfer program; it calls for repealing every other transfer program. And automation isn't a part of its argument at all. But onstage at the Cato Institute in D.C., Murray was as worried as Stern about technological job loss, warning that "we are going to be carving out millions of white-collar jobs, because artificial intelligence, after years of being overhyped, has finally come of age." Meanwhile, Stern signaled that he was open not just to replacing welfare programs for the disadvantaged but possibly even to rethinking Social Security, provided that people still have to contribute money to some sort of retirement system and that Americans who have already paid in don't get shortchanged. He drew the line at eliminating the government's health insurance programs—but the other guy on the stage agreed that health care was different. Under Murray's plan, citizens would be required to use part of their grant to buy health insurance, and insurance companies would be required to treat the population as a single pool. The Murray/Stern convergence comes as the basic income is enjoying a wave of interest and enthusiasm. The concept comes up in debates over everything from unemployment to climate change. Pilot programs testing various versions of the idea are in the works everywhere from Oakland to Kenya, and last year Swiss voters considered a plan to introduce a guaranteed income nationwide. (They wound up rejecting the referendum overwhelmingly, with only 23 percent voting in favor. I didn't say everyone was enthusiastic.) 
This isn't the first time the basic income or an idea like it has edged its way onto the agenda. It isn't even the first time we've seemed to see an ideological convergence. This patchwork of sometimes-overlapping movements with sometimes-overlapping proposals has a history that stretches back centuries.

It Usually Begins with Tom Paine

Just where you pinpoint the start of that history depends on how broadly you're willing to define basic income. The idea's advocates have identified plenty o[...]



When Buying Life Insurance Was Deemed Immoral

Mon, 29 May 2017 06:00:00 -0400

The product was perfectly legal. Many prominent clergymen endorsed it, including celebrity preacher Henry Ward Beecher, brother of Uncle Tom's Cabin author Harriet Beecher Stowe. The Pennsylvania House declared in 1811 that it "would be highly beneficial to many descriptions of citizens throughout the state." The need was clear, and the businesses that sold it were untainted by scandal, bankruptcy, or fraud. They delivered what they promised. But in the early 19th century, Americans just wouldn't buy life insurance. The problem wasn't mere procrastination. Many people deemed the very idea immoral. "Has a man the right to make the continuance of his life the basis of a bargain? Is it not turning a very solemn thing into a mere commercial transaction?" wrote a typical critic. Religious traditionalists believed they should trust in God's providence, not a financial contract, to care for their loved ones after death. Others, pointing to arson to collect fire insurance, worried that it might encourage murder. Paternalists—and competitors—warned that beneficiaries wouldn't know how to manage a sudden windfall. The New York Times opined that life insurance eroded the work ethic and discouraged steady savings. It was, the paper editorialized, "calculated to encourage reliance upon something besides economy and industry and to lead accordingly to the relaxation and decay of those cardinal virtues of society." Taking a similar line, the president of a savings bank voiced concern that the "anodyne of security" defied God's plan, in which fear of poverty, which he called the "pressure of wants," encouraged thrift and hard work. (The right kind of thrift, of course, included a savings bank account.) Then there were the women. Life insurance was supposed to protect widows from destitution if their husbands died, but wives were among its biggest opponents. "It is almost incredible that one of the obstacles to the universal practice of life insurance is found in the opposition of wives and mothers," complained a pro-insurance writer. Wives were often afraid that placing a bet on death would tempt fate. Many viewed a life insurance payout as untouchable "blood money." Then something changed. "Not until the 1840s did life insurance begin selling in any significant degree. Then, suddenly, in the span of a few years, its rate of growth became astonishingly high," writes the economic sociologist Viviana Zelizer in her landmark 1979 study of the shift, Morals and Markets, now out in a new edition. "In 1840 there were 15 life insurance companies in the United States and the estimated amount of total insurance in force was under $5 million," she notes. "By 1860, there were 43 companies and almost $205 million of insurance in force." One reason was undoubtedly economic. As Americans left the countryside to work for wages in growing industrial cities, life insurance became more valuable. A farmer who died left his wife and children a means to earn a living, carrying on as before. The widows and orphans of urban wage earners, by contrast, had no way to replace the lost income. Urban life tended to be more dangerous, with poorer nutrition and a greater chance of disease. In the years since Zelizer first published her book, careful research by the late Nobel laureate Robert Fogel and other economists has demonstrated that life expectancies for those who survived childhood were falling after about 1840, as were adult heights, an indicator of health and nutrition. 
"Although the technological progress, industrialization, and urbanization of the nineteenth century laid the basis for a remarkable advance in health and nutritional status during the first half of the twentieth century," wrote Fogel, "their effects on the conditions of life of the lower classes were mixed at least until the 1870s or 1880s. In the U.S. the negative effects probably exceeded the positive ones through the 1870s." Yet the economic benefits of life insurance weren't enough in themselves to over[...]



That Old-Time Civil Religion

Sat, 27 May 2017 06:00:00 -0400

The Tragedy of U.S. Foreign Policy: How America's Civil Religion Betrayed the National Interest, by Walter A. McDougall, Yale University Press, 408 pages, $30 Never mind the First Amendment; the United States has an official religion after all. It's a civil religion, and the deity's role is to bestow blessings on the state. The "Supreme Architect," "the Almighty Being," "the Infinite Power," and "the Being Who Regulates the Destiny of Nations" are just a few of the sobriquets that Benjamin Franklin, George Washington, Thomas Jefferson, and James Madison gave to the nation's nondenominational guardian spirit. For some the civil religion might be mere symbolism; others might conflate it with Christianity. Either way, it helps give the nation a sense of purpose, or so historian Walter McDougall contends. In The Tragedy of U.S. Foreign Policy, McDougall traces how changes in the American civil religion (or "ACR") have shaped the country's attitudes toward war and peace. From the founding until the Spanish-American War of 1898, what McDougall calls the "Classical ACR" (or "Neo-Classical ACR" after the Civil War) prevailed. It was a faith of national expansion on the North American continent, but it did not, in the words of John Quincy Adams, "go forth in search of monsters to destroy" overseas. A new faith took hold in the last decade of the 19th century: the "Progressive American Civil Religion," which became an even more firmly entrenched "Neo-Progressive ACR" during the Cold War. This was a militant faith that conceived of the nation's mission as being, in George W. Bush's words, to "end tyranny in our world." Today a third faith, the "Millennial ACR," aspires to unite the world through a global economy and regime of universal rights. It too has roots in the Cold War, though McDougall identifies it primarily with presidents Clinton and Obama. You'll notice a pattern. Each civil religion has a "neo" phase that emerges when its original formulation runs into trouble. The basic impulse—toward staying at home, asserting American primacy in international affairs, or uniting the world—stays the same, but the rhetoric gets updated. And the progression from one civil religion to the next is not strictly linear: After World War I, for example, the Progressive ACR was partly discredited and the broadly non-interventionist Classical ACR enjoyed a slight return. Similarly, the globalist Millennial ACR was knocked back by the 9/11 attacks and the wars of the George W. Bush years, which brought the Cold War–style "Neo-Progressive ACR" back into fashion. McDougall, who teaches history and international relations at the University of Pennsylvania, is a zestful writer as well as a meticulous scholar. He sometimes writes like a prophet—not in the sense of foretelling the future, but in relying on compact insight rather than step-by-step logical argument. He covers the sweep of U.S. foreign policy over some 200 years in a little more than 350 pages. Hang tight and enjoy the ride. McDougall is at his best when zooming in on the details of history and revealing the truth to be rather different from what other writers have led us to believe. The Tragedy of U.S. Foreign Policy is, among other things, a rejoinder to Robert Kagan's 2006 book Dangerous Nation, which argues that America has always aspired to remake the world in the image of its own values. McDougall shows that Abraham Lincoln, for one, never supported wars to promote revolution or to spread liberalism through empire building. 
Lincoln's son Robert made a rare public statement to denounce an attempt by then–President Theodore Roosevelt to link his father's name to an imperialist foreign policy. Peopling the continent—even when it already had quite a few other people—was the great mission that America's first civil religion endorsed. God wanted America to grow. But projecting power into Europe or Asia, acquiring bases or imperial possessions overseas, was not part[...]



3 Ways We're Reliving the Watergate Culture War

Wed, 24 May 2017 14:45:00 -0400

Whether or not we're reliving the Watergate investigation, we sure do seem intent on reenacting the Watergate culture war. That isn't just true of Donald Trump's critics, who are understandably eager to compare the 37th and 45th presidents. It's true of Trump and his team, who keep echoing arguments offered by Richard Nixon and his defenders four decades ago: 1. The double-standard defense. Complain about something Trump has done, and someone is bound to ask why you didn't say a peep when Hillary Clinton or Barack Obama did some other bad thing. (You will get this response even if you protested Clinton or Obama's action quite loudly.) The most prominent person to talk like this, of course, is Donald Trump himself: With all of the illegal acts that took place in the Clinton campaign & Obama Administration, there was never a special counsel appointed! — Donald J. Trump (@realDonaldTrump) May 18, 2017 But this defense is a lot older than the present president's political career. Throughout the Watergate investigation, Nixon complained angrily that his predecessors had gotten away with the very activities that were getting him in trouble. In his 2003 book Nixon's Shadow, the Rutgers historian David Greenberg lays out some examples: "If I were a liberal," [Nixon] told [die-hard defender Baruch Korff], "Watergate would be a blip." He compiled a private catalogue of behaviors by others that he believed excused his own. On the basis of comments J. Edgar Hoover made to him, he frequently claimed, not quite accurately, that Lyndon Johnson had bugged his campaign plane in 1968. When Nixon was chided for spying on political opponents, he shot back that John and Robert Kennedy had done the same. And as precedents for his 1972 program of political sabotage, he regularly cited the pranks of Democratic operative Dick Tuck, who had hounded Nixon since his 1950 Senate race. During the Watergate Hearings, [White House Chief of Staff H.R.] Haldeman testified that "dirty tricks" maestro Donald Segretti was hired to be a "Dick Tuck for our side." There's more—much more—but you get the idea. Now, Nixon may have gotten his facts a little scrambled when it came to that alleged airplane bug, and some of the supposed precursors to his crimes didn't actually fit the bill. (He seemed convinced that Daniel Ellsberg's leak of the Pentagon Papers was comparable to the Watergate break-in—a bizarre analogy, though if you've been following the debates over Edward Snowden you've probably heard worse.) But broadly speaking, the president had a point. Many American leaders had abused their powers, sometimes in ways that resembled the Nixon scandals, and the press hadn't always been quick to trumpet the news. Like Nixon, JFK had wiretapped reporters and used the IRS as a political weapon. LBJ may not have bugged Nixon's plane in 1968, but he did spy on Goldwater in 1964. And both Kennedy and Johnson, like many others who have held their job, presided over enormous violations of dissenters' civil liberties. You can make a decent case that Nixon's misbehavior was even worse than theirs, but you can see how the man could get a little resentful about the uneven attention. The trouble with the double-standard defense is that it isn't much of a defense. The crimes of prior presidents aren't a reason to let Nixon off the hook; they're a reason to rein in not just one abusive president but the whole imperial presidency. The same goes for any Trumpian abuses today. 2. Intimations of a "coup." 
Then as now, each side accused the other of plotting a coup. Rumors that Nixon was planning to seize dictatorial powers circulated not just on the political fringes but in official Washington; many of the president's foes feared that fascism was on the way. After Nixon had Special Prosecutor Archibald Cox fired, Rep. Parren Mitchell of Maryland asked, "Will democracy as we have known it survive, or will fascism come to dominate in this [...]



History Lessons Are Turning My Kid Into a Scofflaw (and I Couldn’t Be Happier)

Tue, 23 May 2017 00:01:00 -0400

"If I'd lived then, I'd have still gone to saloons," Anthony, my 11-year-old son, said as we watched the Ken Burns documentary, Prohibition. "But I'd have carried a gun in case I had to deal with police or militia." He commented after a scene in which Portland, Maine's Mayor Neal Dow—nicknamed "the sublime fanatic"—ordered troops in 1855 to fire on an angry crowd outside City Hall. They had gathered to protest the statewide ban on alcoholic beverages that Dow pushed through in his zeal to make the world a better place as he conceived such a thing. Like most fanatics, sublime or otherwise, the mayor didn't have a lot of patience for disagreement. One man was killed and seven wounded that day by the forces of mandatory sobriety. Interesting, well-produced, and drawing on multiple sources and experts, Prohibition lends itself beautifully to our homeschooling efforts. It does a thorough job of exploring the religious, reformist, and nativist roots of first the Temperance movement and then the push for full-on Prohibition. We've recently studied the Progressive Era and the fight for women's suffrage, and the documentary pulls in those histories, showing how social movements influence one another and often come together to achieve common goals—sometimes good, and other times leading to disastrous exercises in self-righteous presumption like Prohibition. The Prohibition website includes excellent additional material, too, including an activity asking students to decide between two conceptions of the role of government: In a democracy, people should have the freedom to make their own choices and be responsible for their actions. If they want to indulge in destructive personal behavior, that's their business, not the governments. A democratic government is made up of its citizens and a major responsibility of government is to guarantee equal opportunity for all. The government has a duty to alleviate social ills and guarantee that no one is in need. Those competing views of the state play an ongoing role through many of our lessons. Anthony knows my own opinions, and is no doubt influenced by them, but I always make sure to present him with competing viewpoints. Personally, I think the past speaks for itself as to which of those roles works better in practice, but I also see my job as raising my son to be a rational adult, not a clone of me. So when we studied the Progressive Era we worked with a series of Great Courses lectures by a college professor sympathetic to the progressives, online lectures from Hillsdale College that have a broadly conservative tone, readings from Thaddeus Russell's A Renegade History of the United States, and excerpts from Illiberal Reformers by Thomas C. Leonard. Anthony got an earful of would-be reformers decrying poverty and abuses in the world around them, but also disparaging individuals as "plastic lumps of human dough." He read pre-presidential Woodrow Wilson dismissing individual rights as "nonsense," and perused objections that adherence to respect for natural individual rights "prevent us from determining what social or individual tendencies we shall favor, what we shall depress; It will in general prevent us from imposing a social ideal, and compel us to leave a social anarchy." Anthony considered the Great Courses presenter too respectful of self-appointed shepherds who he found to be condescending and bossy, and the Hillsdale lectures overly deferential to religious authority—off-putting, to him, in its own way. 
To him, the pseudoscientific racism of the era, culminating in calls for eugenics controls and even elimination of whole populations, thoroughly tainted the confidence of the period's reformers that they were uniquely qualified to mold those lumps of human dough they saw all around them. The sort of molding that evangelical Protestants and progressives attempted during Prohibition, for instance. Or that their h[...]



Life Among the Ants

Fri, 19 May 2017 11:02:00 -0400

In the late 1940s and early '50s, GM chief Alfred P. Sloan funded a series of anti-communist cartoons. (The story behind the films is convoluted, but the compressed version is that Sloan's foundation gave grants to Harding College, an Arkansas-based Christian school, which then paid former Disney animator John Sutherland's studio to make them.) One of the shorts is Albert in Blunderland, a 1950 attack on the planned economy. It presents communism as an anthill society—literally, with actual ants.

While just about everyone involved in funding this film hailed from the political right, the cartoon was clearly aimed at a union-friendly working-class audience; it defends independent trade unions and warns that state factories will be able to impose harsh speed-ups with impunity. In a precursor of sorts to the Hard Hat Riot, it ends with a blue-collar worker beating up a socialist:

src="https://archive.org/embed/albert-in-blunderland-1950" allowfullscreen="allowfullscreen" width="640" height="480" frameborder="0">

(For past editions of the Friday A/V Club, go here. I haven't featured any of the Sloan/Harding/Sutherland films in the A/V Club before, but their 1948 effort "Make Mine Freedom" has turned up elsewhere on this website.)




Old Times There Are Best Forgotten

Tue, 16 May 2017 16:15:00 -0400

CHARLOTTESVILLE, VA—White supremacist provocateur Richard Spencer showed up in my town this past Saturday to roil the debate over the city council's planned removal of statues of Confederate Gens. Robert E. Lee and Stonewall Jackson. Spencer and a few score folks carrying flaming tiki torches gathered in Lee Park, a couple of blocks from my house, where they chanted, "What brings us together is that we are white, we are a people, we will not be replaced," and "Russia is our friend." Of course, Spencer and his associates have, as my Reason colleague Robby Soave points out, the constitutional right to express their views in public. Spencer and his supporters are, as usual, in the wrong. The time has come to remove from public land the monuments honoring the men who led the Confederacy to defeat. But doing so doesn't mean we must then move on to purging slave-owning Founders or even memorials for dead southern soldiers. Looking back requires us to balance the good and the bad, and—on balance—Lee, Stonewall Jackson, and other Confederate leaders simply don't make the cut. Before delving more deeply into the Confederate memorial controversy, let me set out my Southern bona fides. I was born in Texas and reared on my family's dairy farm in the Appalachian Mountains of Southwest Virginia. Our county schools were racially integrated in 1963 when I was in the third grade. My third grade Virginia history book referred to the Civil War as the War Between the States and asserted that that conflict was chiefly over states' rights. Virginia Generals Lee, Jackson, and Stuart were portrayed as honorable and heroic defenders of Southern rights. My high school's team name was the Rebels and our fight song was "Dixie." It was not uncommon to see the Stars and Bars being waved in the stands during football games. It is, however, worth noting that in a school in which African Americans made up less than 10 percent of the student body, my class elected a black senior as our homecoming queen. As a student at the University of Virginia in the early 1970s, I learned that many parts of the Commonwealth had not actually desegregated until 1971. At UVA I belonged to a literary and debating society whose members drank a great deal and often sang songs commemorating the Lost Cause, including "The Bonnie Blue Flag" and "Carry Me Back to Old Virginia," but also Yankee tunes like "The Battle Hymn of the Republic." I remained largely unconscious of how offensive Confederate symbols were to some people. That changed when my black roommate Dwayne Morris took a small Stars and Bars out of the coffee mug in which it was standing in our apartment, broke its staff in two and threw it in the trash. Several subsequent long, boozy conversations ended any residual sentimental attachment to the Lost Cause that I may have retained from my earlier schooling. Still, as a young Virginian I never gave much thought to what the Confederate monuments and memorials that appear in nearly every southern town represented. After Reconstruction, Ladies Memorial Associations (LMAs) in the South sprang up to advocate for and oversee the repatriation of the remains of Confederate soldiers and to commemorate their deaths by erecting generic war memorial statues. Ultimately, the LMAs joined together to form the United Daughters of the Confederacy in 1894.
It is, however, plain historical fact that most of those memorials to the Confederate dead and monuments to Confederate leaders were erected between 1890 and 1925, when Jim Crow racial apartheid was being established in the South. They were meant and served as powerful symbols of resurgent white supremacy. For example, the monument to Confederate President Jefferson Davis that was just taken down in New Orleans was dedicated in 1911 during a "Whites Only" ceremony featuring a living Stars and Bars formation that sang "Di[...]



30 Days a Black Man: How Ray Sprigle Exposed Jim Crow in 1940s America [Reason Podcast]

Fri, 12 May 2017 11:00:00 -0400

In 1948, veteran newsman Ray Sprigle, best-known for having exposed Supreme Court Justice Hugo Black's membership in the Ku Klux Klan, published an explosive series detailing his month-long trip through the Jim Crow South. A white man, Sprigle altered his appearance and passed as black so that he could experience firsthand a part of the country that most Americans either didn't know much or care much about. Traveling with the well-known NAACP activist John Wesley Dobbs, Sprigle (pronounced sprig-el) published 21 articles and a book that detailed the ways in which segregation was ruthlessly enforced at every level of interaction between the races. Party-line phone operators, for instance, would never address blacks as mister or missus on a call and shop owners would drape napkins or tissues over a black woman's head when she tried on a hat. Bill Steigerwald's powerful new book, 30 Days a Black Man: The Forgotten Story That Exposed the Jim Crow South, documents Sprigle's exposé and does a masterful job of recreating an America in which de facto and de jure segregation was the rule not just in the former Confederacy but in much of the North as well. It's a deeply disturbing and profoundly moving account of what Steigerwald, himself a veteran newsman whose previous book forced the publisher of John Steinbeck's Travels With Charley to reclassify the supposed travelogue as fiction, calls "superstars" fighting for equality under the law (along with Sprigle and Dobbs, Steigerwald points to NAACP head Walter White, who chose to identify as black despite being able to pass as white, and Eleanor Roosevelt, the former First Lady whose commitment to civil rights bore most of its fruit during the Truman years). "When anybody goes back in history," Steigerwald tells Nick Gillespie in the latest Reason Podcast, "you learn that nothing is new, everything was worse, and what you thought was simple or true was not. When you look back at '48 and you see this stuff, and Ray Sprigle's reporting, he was a reporter. When he heard guys in Atlanta say, 'Oh, Atlanta's a great city for black people. Nothing ever happens here.' Well, he went down the courthouse and dug up some records and he came up with three cases in the last two years where young black males, this sounds a little familiar, were shot dead by cops or trolley conductors who were armed at the time and were able to shoot anybody. They were shot dead and the defense was always, 'Oh, I thought he was reaching for a gun or something. I shot him dead,' and they all got off. I mean, you could take those examples and put them in the paper today and people would say, 'Well, yeah.'... I have such a deeper appreciation for the punishment that black people received from their government for so long and the crass politics that perpetuated it." Read Sprigle's original series here. Produced by Ian Keyser. Subscribe, rate, and review the Reason Podcast at iTunes. Listen at SoundCloud below: (embedded SoundCloud player: https://w.soundcloud.com/player/?url=https%3A//api.soundcloud.com/tracks/322279622&auto_play=false&hide_related=false&show_comments=true&show_user=true&show_reposts=false&visual=true) Don't miss a single Reason podcast! Archive here. Subscribe at iTunes. Follow us at SoundCloud. Subscribe at YouTube. Like us on Facebook. Follow us on Twitter. Subscribe to Reason magazine (print or digital) for just $15. This is a rush transcript. It has not been checked for spelling or errors—check all quotes against the audio for accuracy.
Nick Gillespie: Today we're talking with Bill Steigerwald. He's a longtime newspaper man, author of several books, most recently an incredible story called 30 Days a Black Man: The Forgotten Story That Exposed the Jim Crow South. Bill, thanks for talking to us. Bill Steigerwald: Hi. How are you doing? N[...]



100 Years After the Russian Revolution, Russians Are Still Paying

Tue, 18 Apr 2017 07:00:00 -0400

On April 16, 1917, which is to say 100 years ago last Sunday, a train from Helsinki arrived at the Finland Station in Petrograd. The "sealed train" originated in Zürich, Switzerland. It carried on board 32 Russian revolutionaries, including Vladimir Lenin and his wife, as well as millions of German "goldmarks." Lenin, who desperately wanted to return home from his Swiss exile in order to take over the leadership of the Russian Bolsheviks, needed German logistical help to cross the Eastern Front as well as financial help to foment a revolt against the sitting Russian government of Alexander Kerensky. Both were duly furnished by the German imperial high command. As such, Russia experienced two revolutions in 1917. The February revolution deposed the Tsar, while the October revolution put the Bolsheviks in charge. Subsequent to the Bolshevik putsch, Russia withdrew from the Great War, thus allowing the Germans to move their divisions to the Western Front to face the combined might of the French, the British and the Americans. Once in charge, Lenin established a one-party dictatorship and the first gulags. The Soviet Union, with its accompanying horrors, was born. Communist apologists have often blamed Bolshevik crimes on Joseph Stalin, who took over the Russian government following Lenin's death in 1924. The Russian historian and politician Alexander Yakovlev, who headed the Presidential Commission for the Victims of Political Repression, however, noted that the "truth is that in punitive operations Stalin did not think up anything that was not there under Lenin: executions, hostage taking, concentration camps, and all the rest." Violence was inherent in the Bolshevik revolution. Per Lenin: "If we are not ready to shoot a saboteur and White Guardist [i.e., anti-communist Russian soldiers], what sort of revolution is that?" Estimating the cost of Bolshevik rule is not easy, although Yakovlev argues that 20 million lives were lost to state-sponsored violence, malnutrition, man-made famine, slave labor, etc. That seems like a very conservative estimate. Looking at other consequences of Russian communism, the story is similarly depressing. Comparing Russia with any other country is difficult. Russia's geography and history are unique. That said, I went back to Maddison's data in search of a European country that was, roughly speaking, at Russia's level of economic development in 1917. With average annual per capita income of $1,212 (in 1990 dollars), Portugal was closest to Russia's $1,085. Where would Russia be, had it matched the economic performance of Portugal—a country that is even today considered something of a European basket case? Let the data tell the story:
1. GDP per capita, per person, per year, 1990 Geary-Khamis dollars (1917-2010)
2. Life expectancy, years, 1960-2015
3. Democracy vs. autocracy, scale -10 (worst) to 10 (best), 1917-2015
4. Civil liberties, scale 1 (best) to 7 (worst), 1972-2015
5. Political rights, scale 1 (best) to 7 (worst), 1972-2015 [...]



Documentaries Put Spotlight on War Propaganda

Fri, 14 Apr 2017 15:00:00 -0400

Five Came Back. Available now on Netflix. American Experience: The Great War. PBS. Monday, April 10, 9 p.m. Growing up, I was completely absorbed by a CBS documentary series called The 20th Century that aired on weekends from 1958 to 1966. Every other episode, it seemed, was about a war. At the time, I thought the main reason was probably that Walter Cronkite, the narrator, had become famous as a combat correspondent. That may have had something to do with it, but with the passage of years and a widened perspective, I've come to suspect that the real reason is that war—preparing for it, fighting it, recovering from it, and arguing about what it meant—was the century's principal activity. From the decapitation fad during the Boxer War that opened the century to the trigger-happy streets of Mogadishu that closed it, war was a global avocation. TV this week takes a look back at the century's two biggest bangs with a pair of magnificent three-part documentaries. PBS' American Experience series spends six hours dissecting World War I (part of it, anyway; we'll get back to that), while Netflix explores how Hollywood enthusiastically picked up the propaganda gun during World War II with Five Came Back. Both shows convey an astonishing amount of information with a mixture of style and simplicity that other filmmakers could study to immense profit. World War I, as American Experience: The Great War paraphrases a conclusion already reached by the cast of Friends many years ago, is probably the biggest event in U.S. history of which Americans know next to nothing. In some ways, that will still be true even if they watch The Great War, which views the events strictly through the lens of how Americans were affected. The welter of royal bloodlines and backdoor treaties that turned a seemingly isolated event—the assassination of an Austrian nobleman by a Serbian teenager—into a worldwide conflagration involving Russia, France, England, Italy, Germany, the Austro-Hungarian Empire, the Ottoman Empire, Bulgaria, Japan, and the United States is barely explored. Nor are many of the war's geopolitical shockwaves. Even the implosion of Russia's czarist government, which would eventually result in a Cold War that for nearly five decades threatened to turn apocalyptically hot, only gets a minute or two. What The Great War does do, in truly spectacular fashion, is limn the voracious expansion of the American government midwifed by World War I. When Woodrow Wilson's uncertain attempts at neutrality floundered and he called for a declaration of war in 1917 because "the world must be made safe for democracy," it made the United States unique among the combatants, notes a historian in The Great War: "It was not fighting for survival. It was fighting for an ideal." But as The Great War documents in horrifying detail, that ideal was the creation of a Leviathan state with unprecedented power: to draft young men and send them to a foreign war. To set price controls on food and impose dietary restrictions. To arrest and even deport political dissidents. To create a powerful government propaganda organ aimed not at enemy nations but the American people. (It expanded from one employee to about 100,000 in a couple of months.) To send goon squads known as Liberty Loan Committees roaming neighborhoods offering deals on war bonds that couldn't be refused. Wilson's actions did not go without dissent (signs at a protest march in New York City: MR. PRESIDENT, WHY NOT MAKE AMERICA SAFE FOR DEMOCRACY?)
and dissent did not go without punishment. Wilson demanded, and got, a new Espionage Act that made it a crime to collect, record and disseminate information "harmful to the war effort," and he wielded it like an axe against the anti-war movement. By the fall of 1917,[...]



Adam Smith Needs a Paper Clip

Thu, 13 Apr 2017 06:00:00 -0400

Adam Smith famously used a pin factory to illustrate the advantages of specialization, choosing this "very trifling manufacture" because the different tasks were performed under one roof: "One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it at the top for receiving the head; to make the head requires two or three distinct operations; to put it on, is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them into the paper; and the important business of making a pin is, in this manner, divided into about eighteen distinct operations, which, in some manufactories, are all performed by distinct hands, though in others the same man will sometimes perform two or three of them." By improving workers' skills and encouraging purpose-built machinery, the division of labor leads to miraculous productivity gains. Even a small and ill-equipped manufacturer, Smith wrote in The Wealth of Nations, could boost each worker's output from a handful of pins a day to nearly 5,000. In the early 19th century, that number jumped an order of magnitude with the introduction of American inventor John Howe's pin-making machine. It was "one of the marvels of the age, reported on in every major journal and encyclopedia of the time," writes historian of technology Steven Lubar. In 1839, the Howe factory had three machines making 24,000 pins a day—and the inventor was clamoring for pin tariffs to offset the nearly 25 percent tax that pin makers had to pay on imported brass wire, a reminder that punitive tariffs hurt domestic manufacturers as well as consumers. "Considering the great quantity and value of pins used in this country—and their importance as an article of general use, and convenience, if not of necessity," Howe wrote, "it would seem reasonable that encouragement should be given to an attempt to manufacture them; or at least that no obstacle arising out of the past legislation of our government, should be allowed to remain in the way of such an undertaking." So what happened to all those pins? Nowadays, we think of straight pins as sewing supplies. But they weren't always a specialty product. In Smith's time and for a century after, pins were a multipurpose fastening technology. Straight pins functioned as buttons, snaps, hooks and eyes, safety pins, zippers, and Velcro. They closed ladies' bodices, secured men's neckerchiefs, and held on babies' diapers. A prudent 19th century woman always kept a supply at hand, leading a Chicago Tribune writer to opine that the practice encouraged poor workmanship in women's clothes: "The greatest scorner of woman is the maker of the readymade, who would not dare to sew on masculine buttons with but a single thread, yet will be content to give the feminine hook and eye but a promise of fixedness, trusting to the pin to do the rest." Most significantly, pins fastened paper. Before Scotch tape or command-v, authors including Jane Austen used them to cut and paste manuscript revisions. The Bodleian Library in Oxford maintains an inventory of "dated and datable pins" removed from manuscripts going as far back as 1617. Pin sales grew along with record-keeping and bureaucracy—the unfairly derided systems necessary to operate an enterprise of any scale. "The expanded market for pins came from expanded uses in business and administrative record-keeping, as well as in clothing," says economic historian Beverly Lemire. 
Before paper clips or staples, pins gave businesses an inexpensive, unobtrusive way to keep pieces of paper together. Compared to ribbons or cords that required holes or sealing wax, they marked a major advance. "My guess would be that the expansion of great trading companies—like the [...]



Guerrilla Lobs Bombs at Romanticized History of ‘70s Violence

Fri, 07 Apr 2017 15:30:00 -0400

Guerrilla. Showtime. Sunday, April 16, 9 p.m. Before we get to everything else about Showtime's Guerrilla—how it's intelligent, insightful, resonant, well-acted and all that—let's deal with the mysterious question of why a show about an underground black-nationalist terrorist group of the 1970s, written and produced by Americans, would be set in Great Britain. To be sure, London had its share (actually, much more than its share) of political terrorism in the 1970s. But nearly all of it was connected to the issue of Northern Ireland. Neither the British black-power movement nor the government response to it ever reached the extreme levels of violence that wracked their counterparts in the United States; there was no Mayfair chapter of the Black Panthers. The 1970s underground group that most closely resembles the one portrayed in Guerrilla was the Black Liberation Army, the Panther offshoot for which JoAnne Chesimard, a.k.a. Assata Shakur, robbed banks and shot it out with cops, but it was a purely American affair. The most obvious answer for the show's peculiar venue is that it's a co-production with Great Britain's Sky TV, which seemed to suggest that Guerrilla's creator-writer-director John Ridley, who won a screenwriting Oscar for 12 Years a Slave, couldn't round up enough funding within the United States. Why that should be is just one of those Hollywood imponderables. Guerrilla is a thoughtful and undidactic look at a time when the left went from nutty to nihilistic. In one 18-month stretch of 1971-72, the FBI recorded more than 2,500 bombings in the United States, more than five a day. And much of the revolutionary violence was directed not at the war in Vietnam, where American involvement was in steep decline, but at racial iniquities. Underground American groups like the Weathermen and the Symbionese Liberation Army explicitly declared that their violence was committed to combat black oppression, even if the fingers that were pulling the triggers or lighting the fuses were, in many cases, white. The Black Lives Matter movement is different in many, many ways, but the echoes are there nonetheless. Frieda Pinto (Slumdog Millionaire) and Babou Ceesay ('71) play the politically engaged young lovers Jas and Marcus. Marcus is a black teacher whose revolutionary impulses are strictly cerebral; trying to blaze the way for the fulfillment of Ho Chi Minh's dictum that "when the prison gates are opened, the real dragon will fly out," he spends his spare time teaching classes at a London jail, educating future cadres. Jas, a nurse and a red diaper baby with daddy issues (her father is in jail in India for killing soldiers), is less patient. "I have to be with someone who wants to do things," she warns Marcus. They're both jolted to action when a black friend is beaten to death by cops at a protest rally. But they immediately learn how easily violence can spiral out of control, when, breaking a Marxist street criminal named Dhari (Nathaniel Martello-White, Red Tails) out of jail in hopes that he can provide their movement with more muscular leadership, they accidentally kill a guard. Marcus is stricken by the blood on his hands, even when the hard-boiled Dhari scoffs, "No use talking you didn't do this, you didn't do that—you're in it." Jas, on the other hand, is enchanted to hear news reports speculate that their little group must be veteran revolutionaries, perhaps even an offshoot of the Panthers. 
"We're so fucking cool," she exclaims to Marcus, even as her newfound notoriety spurs her into new fits of rage. Ridley's keenly observant script clearly draws on the multiplying accounts of life underground by 1970s survivors who've come in from the cold, not only for detail[...]



The Myth of Isolationism

Wed, 05 Apr 2017 16:40:00 -0400

I'm always happy to see someone taking on the myth that America pursued an "isolationist" foreign policy between world wars one and two. So I recommend Andrew Bacevich's latest piece for The American Conservative, which makes the point concisely:

The oft-repeated claim that in the 1920s and 1930s the United States raised the drawbridges, stuck its head in the sand, and turned its back on the world is not only misleading, but also unhelpful....Here, by way of illustrating some of those relevant facts, is a partial list of places beyond the boundaries of North America, where the United States stationed military forces during the interval between the two world wars: China, the Philippines, Guam, Hawaii, Panama, Cuba, and Puerto Rico. That's not counting the U.S. Marine occupations of Nicaragua, Haiti, and the Dominican Republic during a portion of this period. Choose whatever term you like to describe the U.S military posture during this era—incoherent comes to mind—but isolationism doesn't fill the bill.

Bacevich, by the way, is responding to a Richard North Patterson column that doesn't merely mention isolationism; it invokes "the isolationism in Europe and America which precipitated World War II." Bacevich is too kind to dwell on that phrase "isolationism in Europe," but I'll be scratching my head over it for a while. Does Patterson mean the Munich agreement? That would be a bizarre use of the word isolationist, but every other possible reference I can think of is even stranger.