Published: Thu, 30 Mar 2017 00:00:00 -0400
Last Build Date: Thu, 30 Mar 2017 20:33:52 -0400
Thu, 30 Mar 2017 13:15:00 -0400
As you know, "Beauty and the Beast" is a fairy tale about a wealthy, powerful creature who holds a woman captive until Stockholm Syndrome kicks in and she learns to love Big Brother. Now that a new version of the story has hit the theaters, Dan Sanchez reminds us that the Beast's bride isn't his only victim:
As the new film's opening sequence makes explicit, the prince paid for his lavish lifestyle by levying taxes—so high that even lefty Hollywood regards them as excessive—on the hard-working, commercial townspeople.... The party-animal prince being transformed into a sulking beast may have amounted to a 100% tax cut for the town; no wonder the townspeople are so cheerful and thriving when we first meet them!
All princes and other nobles, after all, are descended from marauding warriors who settled down and transmuted plunder and tribute (protection money) into taxes and other feudal obligations.
This reminds me of my idea for a new version of "Sleeping Beauty"—one that focuses not on the comings and goings of various ruling-class parasites but on the prosperity that surely swept the countryside while the castle slept for 100 years.
Anyway, Sanchez (who claims he doesn't want to be a "childhood-ruining killjoy," but c'mon, that's half the fun) uses the new film as a nice hook for some anti-feudal, pro-commercial history. To read the rest, go here.
Fri, 24 Mar 2017 15:30:00 -0400
Harlots. Hulu. Available March 29.

Somewhere in the vast terrain between the hooker-as-fairytale-princess fantasy of Julia Roberts in Pretty Woman and the prim, grim Victorian sociology of Stephen Crane's Maggie: A Girl of the Streets lies Harlots, Hulu's odd but engrossing new drama about life inside an 18th-century London brothel. Screenwriter Moira Buffini, one of the five British women who produce, write, and direct Harlots, said in unveiling the project that the goal was "everything from the whore's-eye view." The result is that the women in Harlots are neither glamorous courtesans nor broken flowers, and their depiction is never erotic. There's plenty of nudity, of both sexes, but you've seen commercials for bladder medication that were sexier.

The Harlots hookers don't make much money, but it's a living—and they regard the cops and do-gooder moralists trying to close their house less as saviors than as a circling wolfpack. When a judge who's been asked to close the bordello as a public nuisance haughtily declares that "I grieve for the desperate women I have seen today who, faced with starvation, have sold their flesh," the prostitutes in the courtroom exchange looks laden with the unspoken question: "So you think we'll be better off in jail?"

Harlots opens in 1763 with a prologue that claims a fifth of the women in London were hookers. That runs far ahead of police estimates of the day, but there's little doubt prostitution was a major industry. One of the show's early scenes, in which the women amuse themselves by reading their own notices in a Consumer Reports-style guide to the various local hookers and their skill sets ("one of the finest, fattest figures as fully finished for fun and frolick as fertile fancy ever formed..."), is drawn from documented history.
The brothel at the heart of Harlots is operated by Margaret Wells (Samantha Morton, an Oscar nominee for her turn as the troubled young immigrant mother in In America), a veteran of the trade whose virginity was bartered away for a pair of shoes at age 10 by her own mother. Margaret, buffeted by high rent and increasing graft demands by cops, hopes to get a much higher price for the maidenhead of her teenaged daughter Lucy (British TV actress Eloise Smyth). And she's playing the even more lucrative long game with slightly older but much more reluctant daughter Charlotte, whom she's trying to place as an indentured consort to a wealthy nobleman. But her plans must be dangerously accelerated when cops raid her house, putting her out of business at least temporarily, and a rival madam (Lesley Manville of the British version of Law & Order) starts raiding her corps of whores.

It turns out these two events are not coincidental. In a classic example of the regulatory-economics parable known as Baptists and Bootleggers, the other madam has been funding a decency group to attack Margaret's brothel and clear away the competition.

That plot description sounds bleak, which is not entirely fair. Harlots burbles with the bawdy workplace humor of the hookers, from their theories about the sexual ontology of the reformers (the blind leader of the decency group, they speculate, lost her eyesight after putting her eyes out upon seeing her first penis on her wedding night) to tart—heh-heh—remarks about job training. Told she must undergo instruction in cultural refinements, one of the women inquires, wide-eyed: "So, you will teach my cunny French?" The humor extends to the casting of Charlotte, the steely daughter resisting indenturement.
She's played (quite well) by Jessica Brown Findlay, that sweet and gentle Lady Sybil of Downton Abbey, whose death in childbirth so unhinged PBS cultists that the Washington Post ran a medical story explaining preeclampsia, the obscure condition that killed her, and asking whether her treatment was adequate: "If Lord Grantham had listened to the country doc and sent his daughter to the hospital for a Caesarean section, would she have lived?" I'll buy a drink for the first one to write the [...]
Fri, 24 Mar 2017 14:05:00 -0400
Nuclear test films would be eerie and hypnotic even if they didn't have the extra resonance they reap by representing the power to inflict mass death. And now there are more of them around: The Lawrence Livermore National Laboratory is digitizing declassified atomic footage, and this month it started posting the clips online. Go to the lab's YouTube page, and you can watch the government nuking the water...
https://www.youtube.com/watch?v=XnrLY-phipw&list=PLvGO_dWo8VfcmG166wKRy5z-GlJ_OQND5
https://www.youtube.com/watch?v=L_jFQw78uzo&list=PLvGO_dWo8VfcmG166wKRy5z-GlJ_OQND5
...and the space beneath the ground:
https://www.youtube.com/watch?v=szbb00ncNNs&list=PLvGO_dWo8VfcmG166wKRy5z-GlJ_OQND5
To peruse the whole catalog, go here. Bruce Conner makes art from atomic-test footage here. Stanley Kubrick makes art from atomic-test footage here. The health consequences of being near such tests are explored here. Past editions of the Friday A/V Club are here.
Thu, 23 Mar 2017 09:50:00 -0400
Where did the idea come from that small-business owners and religiously affiliated charities are at the mercy of lawmakers to decide whether they can exercise their faith?
All around us, governmental bodies at the state and federal level are claiming the authority to override the choices of private groups and individuals on questions such as whether or not to participate in a same-sex marriage celebration, and whether or not to offer insurance coverage for the morning-after pill. It's easy to assume these efforts are a thoroughly modern phenomenon. But in truth, this fight began more than 150 years ago. The only difference is that back then, Christian traditionalists were the ones trying to regulate a religious practice out of existence via laws that openly targeted members of the nascent Mormon faith.
In 1862, Congress outlawed plural marriage despite anguished protest from the Latter-day Saints, who at the time believed God wanted their men to take multiple wives. In subsequent decades, the U.S. Supreme Court repeatedly upheld the government's right to do as much. In the process, the Court laid down a precedent that the Constitution protects your right to believe as you wish, but not your right to put those beliefs into practice in ways the state might not approve of.
"As the nation goes to war over birth control mandates and gay wedding cakes," I write in the April issue of Reason, "many religious supporters of traditional marriage and sexual mores understandably feel their rights are being trampled. But so did the Mormons a century ago. To justify the anti-polygamy laws forbidding that group to live out its faith, Christian traditionalists stretched the First Amendment to precarious lengths. Now, the arguments they created and employed are being turned against them."
Thu, 23 Mar 2017 08:15:00 -0400
A man who lived with more than one woman was anathema in the 19th century; the media called polygamy an "act of licentiousness" that deserved to be categorically denounced, its adherents disenfranchised. In 1885, the U.S. Supreme Court upheld a federal law making plural marriage a felony, declaring that "the union for life of one man and one woman in the holy estate of matrimony [is] the sure foundation of all that is stable and noble in our civilization." A New York Times editorial celebrated that result, observing cheekily that "we had not supposed there had ever been any serious question."

Today, it's the old-timey view that marriage is between one man and one woman only—and that sex should be reserved to that union—that raises the Gray Lady's ire. When Californians sought to ban gay marriage in 2008, the editors of the Times called the initiative a "mean-spirited" effort "to enshrine bigotry in the state's Constitution." Even assuming you think the paper was right the second time around, the reversal is striking. But while the norms have clearly changed, the desire to punish anyone who refuses to comply with those norms appears to be forever.

As the nation goes to war over birth control mandates and gay wedding cakes, many religious supporters of traditional marriage and sexual mores understandably feel their rights are being trampled. But so did the Mormons a century ago. To justify the anti-polygamy laws forbidding that group to live out its faith, Christian traditionalists stretched the First Amendment to precarious lengths. Now, the arguments they created and employed are being turned against them.

Discrimination Nation

"We can't promote a marriage that God says isn't really marriage," the blog post would have read. "Even if our beliefs are a bit different or unpopular, we have to stick to them." But those words, penned by Joanna Duka and Breanna Koski, were never published to their website.
The authors feared the government of Phoenix might come after them if they were. The young women, aged 23 and 24 respectively, are the owners of Brush & Nib Studio, an Arizona-based custom artwork and calligraphy shop. Shortly after getting their new business off the ground in 2015, they realized that a city ordinance passed two years earlier opened them up to enormous fines and even jail time as a result of their beliefs. The law forbids certain companies not just from discriminating against gays and lesbians but also from saying anything that so much as implies a customer would be unwelcome because of his or her sexual orientation.

Duka and Koski don't want to be forced to create wedding invitations and other artwork that celebrate same-sex marriage, so they're suing to overturn the Phoenix regulation as a violation of their First Amendment rights. Their prospects seem grim, however: In September of last year, the Maricopa County Superior Court denied their request for a temporary injunction to stop the law from being enforced while the challenge proceeds. "There is nothing about custom wedding invitations made for same-sex couples that is expressive," the decision, incredibly, reads.

That ruling is just one in a litany of recent instances in which small business owners have faced serious legal consequences for not wanting to be involved in commemorating same-sex unions. In Colorado, the owner of Masterpiece Cakeshop was hauled before the state's Civil Rights Commission. In Oregon, the proprietors of Sweet Cakes by Melissa were fined an eye-popping $135,000 and had to shutter their storefront. In New Mexico, the state Supreme Court told photographer Elaine Huguenin that she and her husband would be "compelled by law to compromise the very religious beliefs that inspire their lives." In upstate New York, a couple was forced to stop renting out their farm for wedding ceremonies unless they agreed to let gay couples marry there as well.
In theory, the Constitution is supposed to prevent things like this. The First [...]
Mon, 20 Mar 2017 10:59:00 -0400
"Good day to you," Jimmy Breslin told the crowd of cops. "I'd like the record to state that I'm here without a lawyer."

It was 1969. By this time Breslin, who died yesterday, was already a well-known newspaper columnist, but he wasn't giving a talk about journalism. He was campaigning to be president of the New York City Council. Also on the bill was Breslin's running mate, Norman Mailer, who was aiming to be mayor. The place was the John Jay College of Criminal Justice, and Breslin was about to give what Mailer's campaign manager, Joe Flaherty, later described as "the best speech he would deliver during the campaign." (It went over better than Mailer's turn with the crowd, which featured lines like "there were years when I hated some of you guys so much it wasn't funny" and "I'm as yellow as any good cop.")

After some opening jokes, Breslin got down to the heart of his pitch—elect his ticket, and "there will be no more New York Police Department as we know it":

Our idea is to have this city become a state, have the various sections of this city become cities right inside the state, and let them run their own police. Let's get the wisdom of the neighborhoods, give them the power, and let them run with it. I say the plan is far better from a police viewpoint than the way we're going, because in my estimation policemen today are being used. The police get all the mistakes of all the people who are supposed to be more important and smarter than us.

The argument that followed mixed lines a Black Panther could love ("Those days are gone when white people can rule the black neighborhoods") with sentences calculated to appeal to people afraid of Panthers ("I think the time also should be gone that we should ask a white person to go in there"). Breslin called for the radical decentralization not just of the police but of the schools, and he wrapped up with a joke about knowing a guy who could come in to teach a class on bookmaking.
After the official event was over, the candidates found themselves shooting the shit with some beat cops who had skipped the lecture. One of them told Breslin he had "doubts about you with your long, curly hair." Breslin shot back that he "wouldn't want to walk into a piss house with you alone either, baby."

By this time Mailer and Breslin weren't a conventional ticket so much as a double act, with each candidate taking the spotlight in front of different audiences. Breslin, a more natural populist, was better with Catholics and cops, Mailer with Jews and intellectuals; when they spoke before an audience of feminists, they both bombed. It never was completely clear how serious a campaign they were running. Breslin told one audience that "anyone who runs for office in this city, with the shape this city is in, and takes it as a joke, is committing a mortal sin." But it didn't take Flaherty long to decide that Breslin saw his candidacy "as a brief and witty exercise to discredit the regular pols, then an exit before the real campaigning began."

The duo definitely believed that stuff about decentralization and community control. But they were also prone to pitching proposals that were satiric, utopian, or maybe both: banning cars from Manhattan, inviting gangs to fight jousting matches in Central Park, holding a stickball World Series on Wall Street. Mailer took to promoting an idea he called Sweet Sunday—one day a month when, in his words, "New York would stop for 24 hours. Everything would stop running. Electricity, cars, planes, trains, name it. If nothing else, it would give New York a chance to clear itself once a month. And people would hear themselves think for a change." Pressed on whether he'd permit hospitals to run their generators on Sweet Sunday, Mailer backed down slightly and said he'd allow it. And air conditioning?
There he held firm, though he acknowledged that the people wouldn't like the results: "On the first hot day the populace would impeach m[...]
Thu, 16 Mar 2017 00:01:00 -0400
During Cold War debates about the merits of capitalism and communism, Americans offered a simple gauge: the movement of people. "You have the Berlin Wall," the argument went. "We have the Statue of Liberty. If communism is a blessing, why do people flee Cuba for America, not the other way around?"

Ronald Reagan, the hero of modern Republicans, knew that immigrants were not a threat to our way of life but a reinforcement of it. He welcomed them as allies, self-selected for their attraction to democratic ideals. They came here not because they wanted to change America but because they admired it as it was. "I have always believed there was some divine providence that placed this great land here between the two great oceans," he said in 1986, "to be found by a special kind of people from every corner of the world who had a special love for freedom and a special courage that enabled them to leave their own land, leave their friends and their countrymen, and come to this new and strange land to build a new world of peace and freedom and hope."

Imagine what Reagan would think of Rep. Steve King, R-Iowa, who abhors foreigners like a deadly virus. On Monday, King tweeted, "We can't restore our civilization with somebody else's babies." Last year, he declared, "Cultural suicide by demographic transformation must end." This week, King insisted that immigrants are "importing a different culture, a different civilization, and that culture and civilization, the imported one, rejects the host's culture."

King was talking about Middle Easterners, but his suspicions extend to undocumented immigrants, most of whom come from Latin America. He claims they are "refusing to assimilate into the American culture and civilization." Among the alleged sins of Latino immigrants are that they drag down wages and give birth at public expense. But if there's anything worse than poor foreigners, it's rich ones.
Steve Bannon, Donald Trump's chief White House strategist, has complained (inaccurately) that "two-thirds or three-quarters of the CEOs in Silicon Valley are from South Asia or from Asia." He sees their numbers as trouble because, he says, "a country is more than an economy. (It's) a civic society." Yes, it is. And Silicon Valley is a proud product of ours, showcasing the wonders that intellectual and economic freedom can create. It's absurd to think immigration undermines our civic life. Immigration has always been inseparable from our civic life. What is it about high-achieving Asian-Americans that Bannon finds threatening to our way of life—aside, that is, from their race?

From the start, immigrants have elicited groundless panic. Bannon, a Catholic, forgets that Catholic immigrants were once seen as fundamentally hostile to democratic principles. The eminent 19th-century Presbyterian minister Lyman Beecher warned that "the subjects of the pope" would "subvert our free institutions." Beecher would be surprised that subjects of the pope now dominate that free institution called the Supreme Court.

The court also has three Jewish justices, which would offend supporters of the Immigration Act of 1924. It was designed to keep out Jews, among others, who were seen as genetically inferior and politically radical. Jews, however, confounded anti-Semites by succeeding and integrating into American society just as every previous immigrant group had. There is no reason to think newcomers from Latin America or the Middle East will be any different.

King and others believe Islam is irredeemably violent and hostile to freedom and democracy—hence his opinion that admitting Muslim refugees amounts to "cultural suicide." But he underestimates the power of American culture.
A 2011 Gallup poll found that 89 percent of American Muslims say there is never a justification for an individual or group to target and kill civilians—compared with only 71 percent of Protestants and Catholics. American Mu[...]
Tue, 14 Mar 2017 06:00:00 -0400
In an unremarkable hotel room, a team of officers watches the footage streaming from a hidden camera next door. A middle-aged man is making arrangements to pay a young woman for sex. Once she agrees, the squad will rush in, shouting instructions, their bulletproof vests bulging with firearms and emblazoned with police or FBI. The woman—or is she a girl?—will have her hands tied behind her back and her phone confiscated. She will sit on the bed, partially undressed, as a team of men searches her room, pawing through her underwear drawer and toiletry bags, seizing any cash they find. She will eventually be fingerprinted, interrogated, and taken into police custody.

Welcome to Operation Cross Country, the U.S. government's huge, intrusive, and utterly ineffective effort to fight child sex trafficking.

Variations on the scene above play out again and again in sensationalized montages of footage from the stings, which the FBI has been proudly posting to YouTube since Operation Cross Country launched in 2008. The vignettes are unsettling. In one scene, someone can be heard crying in the background as the camera pans past her stuff—Skittles, electric toothbrush, makeup—and settles on cops counting stacks of money. Other clips follow officers tailing people in tight dresses and stiletto heels or scouring printouts of escort ads from hotel beds. Shot after shot shows authorities handcuffing young people, mostly women and girls, and parading them down dim hallways, thick gloved hands gripping skinny arms on either side, or pushing them up against cop cars, the camera lingering on cuffed wrists clasped tightly over baggy jeans or long, bare legs.

The latest iteration of the initiative—Operation Cross Country X—took place across 103 U.S. cities from October 13 to 16.
According to the FBI, it involved the efforts of 74 federally led Human Trafficking Task Forces, composed of officers from 55 FBI field offices and more than 400 federal, state, and local law-enforcement agencies. These included city and suburban police departments, county sheriff's offices, state police and investigative bureaus, juvenile detention departments, drug enforcement units, and an impressive array of federal entities: Homeland Security Investigations, Immigration and Customs Enforcement (ICE), the U.S. Marshals Service, the Drug Enforcement Administration (DEA), Customs and Border Protection, the Internal Revenue Service (IRS), the Coast Guard Investigative Service, the State Department, myriad U.S. Attorney's Offices, and the Bureau of Alcohol, Tobacco, Firearms, and Explosives. They were aided by the National Center for Missing and Exploited Children (NCMEC) and local nonprofits that had recently received federal grants.

According to an FBI press release, this mighty group conducted "sting operations in hotels, casinos, truck stops, and other areas frequented by pimps, prostitutes, and their customers." The focus: "recovering underage victims of prostitution," or, as FBI Director James Comey put it, offering sexually exploited children a "lifeline" from a "virtual prison."

Overall, the operation identified 82 "children" engaged in prostitution, an average of about 0.8 per city, or one for every five agencies participating. All were teenagers—mostly 16- and 17-year-olds—and a number of cities where they were found made no simultaneous pimping or sex trafficking arrests. To the feds, anyone under 18 who trades sex acts for money is defined as a victim of sex trafficking, regardless of whether they have experienced abduction, violence, restraint, or threats. In the end, only five men stand accused of federal crimes—with only two accused of crimes against actual minors.
None of these suspects was part of anything even remotely resembling an organized criminal enterprise. In the four months following Operation Cross Country X, U.S. prosecutors announced fe[...]
Mon, 13 Mar 2017 11:15:00 -0400
Yesterday was the start of Daylight Saving Time, and if anyone would like to form a SuperPAC to destroy politicians who jack around with my circadian rhythm I'll gladly chip in a few bucks. The twice-annual timepiece adjustment is outdated and irritating. States should pick a time zone and commit.

Let's first dispense with some of the myths behind Daylight Saving Time (DST). Many people assume we enacted DST to help farmers. That's nonsense. Most of my relatives who aren't in prison are farmers. I have no idea what time they wake up in the morning because whenever I visit they've already eaten lunch by the time I'm mixing a hangover cure. They rise before dawn to feed the cows, mow the corn, construct scarecrows, etc. All without directives from Congress.

Daylight Saving Time came about because of World War I. Germany, the United Kingdom, and the United States all pushed our clocks forward to better coordinate waking hours with light bulb use, thereby conserving electricity. The program lapsed until World War II, when President Franklin Roosevelt instituted "War Time Zones," which were basically the same thing, only with a cooler-sounding name. Astonishingly, despite originating as a temporary FDR government program, War Time Zones actually ceased at the conclusion of the war. Thereafter time zones defaulted to municipalities until 1966, when Congress enacted a permanent annual Daylight Saving Time, in part to standardize the plethora of discordant clocks across the nation.

Today all of these reasons are outdated. We probably won't go to war with Germany again for another 20 or 30 years. And all of the economic benefits seem to cancel each other out. While we saved about 1 percent on electricity when first enacting DST, that figure is now offset by an increase in air conditioning. The idea that we'll all revert to discordant municipal time zones set by the sundial in our mayor's front yard is utter nonsense.
Everyone I know owns a smartphone, set automatically by a clutch of nerds in Cupertino. Each year a dozen or so state legislatures consider ending Daylight Saving Time, only to drop the measure and return to squabbling about transgender bathrooms or determining what the official state reptile should be. (The Mountain Boomer, of course.)

Legally, if a state decides to drop Daylight Saving Time, it must then procure an exemption from the U.S. Department of Transportation. It's possible Secretary Elaine Chao would enforce federal time regulations with an iron fist and scream "this is the hill I will die on!" but I think we could probably win her over.

There's a healthy debate about whether places like California should scrap DST and permanently move an hour forward or backwards. Television companies consider darkness their ally, and know that the earlier the sun sets the quicker viewers drop irritating habits like family picnics or soccer games and return to the vital activity of watching The Big Bang Theory. Conversely, the Chamber of Commerce and its chorus of retailers lust for delayed sunsets, because shoppers will stay out later buying The Big Bang Theory paraphernalia at malls. I'm a devout evening person and also a shill for the Chamber of Commerce, so I'd prefer we postpone sunset until around 11:30 at night. If nothing else, to punish all of you sanctimonious morning people for bragging about what you accomplished before breakfast, such as mowing your corn, constructing scarecrows, watching an episode of The Big Bang Theory, and so forth.

That said, I think I speak for most Americans in saying: Just pick one! If Arizona and the territory of Guam can figure out how to commit to one time zone, surely the rest of us can.[...]
Tue, 07 Mar 2017 12:32:00 -0500
So the House Republicans' replacement for Obamacare is Obamacare Lite. If it passes, we'll keep the same basic subsidy-and-penalty insurance framework we have with the Affordable Care Act. If it doesn't pass—and there's a good chance it won't—we may well end up keeping the Affordable Care Act, full stop. Or perhaps we'll get a different bill that rearranges the system, maybe makes it stingier, but doesn't challenge its basic approach. What we probably won't get is something radically different.
This often happens when Democrats enact a big expansion of the social-welfare state. Republicans protest noisily, but when a Republican becomes president the party makes its peace with it. Dwight Eisenhower conserved the New Deal, and in fact built on it. (Among other things, he expanded Social Security coverage, launched some big infrastructure projects, and birthed the Department of Health, Education, and Welfare.) Barry Goldwater called the results a "dime-store New Deal"—programs for a president who was less ambitious than Harry Truman but didn't want to challenge the post-Roosevelt order. But Eisenhower was president and Goldwater wasn't.
Similarly, Richard Nixon ratified the core elements of the Great Society. Yes, he pared back some of it. He accelerated the process, already underway before he took office, of clearing out its semi-radical side—all those activists who took that "maximum feasible participation" talk seriously. But Medicare, Medicaid, food stamps, the expanded federal role in education: Nixon kept those in place, and so did Ronald Reagan. Call it the Great Society Lite, or maybe the Pretty Good Society.
Republicans have spent nearly a decade denouncing Obamacare, first as a concept and then as a law. But they've denounced a lot of things over the years. As of now, Phil Klein writes, the Democrats "have won the central philosophical argument, and Republicans are reduced to fighting over the mechanics." He's talking about health care, but you can apply his words to more than that.
Mon, 06 Mar 2017 13:42:00 -0500
Once there was an election when about two-thirds of Congress either retired or were defeated. This massive turnover was preceded by a wave of raucous town-hall meetings where hundreds of angry constituents "gathered to draft denunciatory resolutions, deliver angry speeches and, in some cases, stage mock court proceedings against their local House members."

The year was 1816. And if you'd like to hear more about what happened, you're in luck: Joshua Zeitz has written an engaging account of that ballot-box rebellion over at Politico, with an eye trained on how those old town-hall revolts resemble the Tea Party protests of the early Obama years and their anti-Trump counterparts of today.

If you grew up thinking of the years after the War of 1812 as a sedate "era of good feelings," Zeitz's story may come as a surprise. The immediate impetus for the protests of 1816 was the Compensation Act, a bipartisan bill to increase congressional pay. But the broader force at work, Zeitz argues, was a gradual shift away from the idea of explicit elite rule. More Americans were getting the right to vote, in part because new states were competing with old states for citizens. The ruling class was increasingly seen as a faction with its own interests, rather than as the disinterested defenders of the public good. And upstream from politics, a spirit of cultural leveling was overturning the old spirit of deference:

If elites were not the guardians of a fictitious public good, equally, they had no lock on truth or fact. In parallel with the democratization of politics and government, over the first half of the 19th century, professions like the law, medicine and ministry underwent a similar, dramatic democratization, with states loosening educational and licensing requirements.

Not everyone approved of these developments.
A college president in Pennsylvania anticipated with worry book titles like, "Every Man his own Lawyer," "Every Man his own Clergyman and Confessor," or "Every Man his own physician." "Truth," grumbled a concerned Federalist, "has but one side and listening to error and falsehood is indeed a strange way to discover truth."...Another opponent of this new hyper-democratic, relativist spirit warned against a world in which "the unalienable right of private judgment involves the liberty of thinking as we please on every subject." Two centuries later, we're hearing the same elite anxieties. Zeitz notes that the Jeffersonians tended to stoke that spirit of revolt. He also notes that the losses of 1816 hit Jeffersonian as well as Federalist incumbents. With that in mind, and with the Tea Party rebellion in the rear-view mirror, he ends his essay with this thought: Now [the Republican Party] controls every branch of government. They are the elite. And they may soon find, like members of Congress 201 years ago, that the forces of democratic populism are hard to contain, indiscriminate in whom they target and unforgiving of powerful people when they believe that those powerful people have betrayed them. Read the whole thing here. Related: "Trump Now Faces the Same Public Distrust That Propelled Him Into Office." Also related: "A Short History of Libertarian Moments."[...]
Fri, 03 Mar 2017 14:00:00 -0500
Arkansas State Rep. Kim Hendren (R) has introduced a one-page bill that would ban "study books or any other material authored by or concerning Howard Zinn" from the state's public schools, including charter schools. Zinn has been dead since 2010, so it's not like he's a commentator on current social or political affairs, and he's not the kind of writer typically taught in high schools. But in a phone interview with Reason today, Rep. Hendren explained why he introduced legislation to protect Arkansas teenagers from hearing the ideas of the late radical leftist historian (and favorite of fictional Matt Damon characters). Hendren, who says he's "not an expert" on Zinn, asserts that a number of his constituents have raised "concerns about some of the approaches that Howard Zinn has taken to history in the books he's written." He adds, "My basic personal philosophy is I think we ought to be open to hearing both sides of the situation and then try to do what's best for ourselves and our country. That's what will happen with this bill." The 79-year-old legislator — who also happens to be the brother-in-law of Arkansas Gov. Asa Hutchinson (R) — clarified that the bill is only meant to apply to elementary and secondary schools, not public colleges. When asked if he thinks this bill could set a precedent allowing for left-leaning states to ban conservative historians' perspectives from being considered in public education, Hendren said, "Ultimately the parents have a little more responsibility to what [children] are exposed to until they are a little bit older to be able to exercise more judgment. In college and so forth, I have no problem with it." Hendren says his concern is primarily with providing equal time for opposing political viewpoints to avoid "indoctrination" of one point of view, and that his aim is not necessarily to see the bill passed in its current form but rather to spark a conversation and debate. 
Hendren tells Reason that since news of his bill was first reported by the Arkansas Times, he has been inundated with hostile phone calls and tweets. He adds that he doesn't think he's done anything to make people think he's "a bad American or somebody that ought to be degraded or called a cracker." In late 2016 Hendren introduced a bill that would ban students from possessing any personal electronic or digital devices while at school, including video game consoles, cell phones, cameras, tablets, and pagers (what year is this?). Explaining his motivation for introducing the bill, Hendren told KATV, "If it's going to allow a young boy in that class to email, or however they do…Instagram or whatever they do, a girl in their class to send him a nude or partial nude picture, which is going on now in the public schools, it ought not be done in the classroom." As a state senator running for a U.S. Senate seat in 2009, Hendren found himself in hot water when he referred to Sen. Charles Schumer (D-N.Y.) as "that Jew." Hendren later apologized and tried to explain away his gaffe by saying, "I don't use a Teleprompter, and occasionally I put my foot in my mouth...I was attempting to explain that unlike Sen. Schumer, I believe in traditional values, like we used to see on 'The Andy Griffith Show.'" Telling Reason that he doesn't "think it harms people to discuss what we are discussing here," Hendren hopes for "an intelligent, respectful debate" in the Arkansas House of Representatives, adding, "we Arkansas folks think we ought to listen to each other and then try to work out a solution that's best for us and our country and our state and our young people." Of the people he says are conflating his actions with book-burning — or those who would describe his bill as fundamentally hostile to free speech and whic[...]
Thu, 23 Feb 2017 00:01:00 -0500
Any president can change the future. Donald Trump stands out for his ability to change the past, without even trying. He's already altered perceptions of what happened in America decades and centuries ago. We know that because of a new survey of presidential historians conducted by C-SPAN, asking them to rank presidents on various attributes and overall performance. The latest scorecard, which included responses from 91 historians, is similar in most respects to those compiled in C-SPAN's first two, in 2000 and 2009. But it holds some surprises that suggest that things look different with Trump in the picture. Some things are fixed. The greatest president is Abraham Lincoln, who has finished first in each poll. Coming in second, for the second straight time, is George Washington. Franklin Roosevelt is third, just ahead of cousin Theodore. The worst, three times running, is James Buchanan, who preceded Lincoln and whose indulgence of pro-slavery forces is blamed for helping to bring on the Civil War. Second-to-last each time has been Andrew Johnson, who succeeded Lincoln and was the first president to be impeached (though he was not convicted). This is the first poll to include Barack Obama, who came out ahead of his most recent predecessors. Obama is ranked No. 12, three spots below Ronald Reagan but ahead of George W. Bush (33), Bill Clinton (15) and George H.W. Bush (20). Obama is one of the lowest-rated presidents in terms of relations with Congress—worse, somehow, than William Henry Harrison, who died a month after taking office—and got mediocre marks on foreign relations, but he scored high on pursuing equal justice for all. The biggest improvement was registered by Dwight Eisenhower, ranked ninth in 2000 and eighth in 2009. He landed at fifth, jumping over John Kennedy, Thomas Jefferson and Harry Truman, who were ahead of him the last time around. 
The biggest decline was that of Andrew Jackson, who slid from 13th in 2000 and 2009 to 18th. Richard Norton Smith, a presidential biographer and member of C-SPAN's advisory team, suggests that the changing fortunes of Eisenhower and Jackson are both partly the product of a "Trump effect." Eisenhower, Smith told me, benefits from being "the anti-Trump—massively competent, self-effacing, moderate." He had been supreme Allied commander in Europe during World War II, and despite his Army background—or because of it—he warned, "A nation's hope of lasting peace cannot be firmly based upon any race in armaments but rather upon just relations and honest understanding with all other nations." Eight years of comparative peace and prosperity made his administration a stark contrast to those of Obama and George W. Bush, which featured endless war and a deep recession. The disappearance of centrism in the Republican Party doubtless elicits a nostalgia for Ike, who triumphed over Joseph McCarthy and others on the far right. Jackson, Smith suspects, has declined in public estimation as his slave ownership and brutal policies toward Native Americans have acquired new significance. It probably doesn't help that Trump's approach to foreign relations has been described as "Jacksonian" for its pugnacity, unilateralism and contempt for human rights considerations. The Broadway musical Jackson inspired (Bloody Bloody Andrew Jackson), unlike Alexander Hamilton's, never found an audience. Nothing about Trump, however, affects the standing of the highest-ranked presidents. Why not? Because he only highlights their well-known virtues. Lincoln is revered for his humanity, his unflagging resolve and his capacity for profound thought and eloquent word. Washington was a master of dignity and statesmanship. Franklin Roosevelt had a capacity to inspire and unite Americans[...]
Wed, 22 Feb 2017 16:00:00 -0500
Democracy's Detectives: The Economics of Investigative Reporting, by James T. Hamilton, Harvard University Press, 368 pages, $35
Sloshed on 30 percent profit margins, the news media went on a drunkard's tear over the final three decades of the 20th century. Some publishers, such as Gannett, spent their loot acquiring more newspapers. The Boston Globe blew a portion of its windfall on foreign bureaus, establishing its first in the early 1970s and eventually expanding to five. Newspapers everywhere expanded regional and national bureaus, sprouted additional sections, added color printing, hired more journalists, and boosted circulation as the money bender continued. Almost every news outlet—print or broadcast—spent heavily on investigative journalism, producing a scoop renaissance. The Johnny Deadlines dug deep to bust crooked cops, call out polluting corporations, and expose criminal justice outrages. Health care fraud, banking hijinks, payoffs, bribes, and government waste got a full press airing. But the renaissance stalled at the new century mark as cable TV and the web encroached on the advertising monopoly the press had grown sozzled on. Then came the 2008–09 recession, reversing the grand expansion. The Globe closed all of its foreign bureaus; newspapers shed their suburban and regional bureaus; whole newspaper sections folded; and tens of thousands of journalists got sacked. The investigative beat took a hit too, as James T. Hamilton, a Stanford professor of communications, explains in his comprehensive study, Democracy's Detectives: The Economics of Investigative Reporting. Unmistakable proof of the decline: No trade honors itself as grandly as journalism, so submissions to investigative award programs are a fine marker of how much of that genre is being produced. 
During the 2008–09 recession, submissions to the popular Investigative Reporters and Editors contest dropped 34.1 percent compared to 2006–07, indicating the extent of the cutback. The biggest losers haven't been journalists—who cares about them, anyway?—but members of the public, from whom more perfidy is concealed, while public officials, bureaucrats, and corrupt businessmen have scored. "Which stories get discovered where depends on economics," Hamilton writes. By bringing the economist's eye to the business of investigative journalism, Hamilton sharpens our appreciation of the craft as he explores its history, the motivations publishers have to fund the work, and the cash benefits investigations pay out. Investigative journalism, Hamilton tells us, produces extraordinary benefits—perhaps billions of dollars' worth. A journalistic investigation of government waste can save taxpayers a lot of dough if officials pay attention. A successful probe of commercial fraud can likewise prevent crooks from looting millions from consumers and investors. And where a dollar figure can be placed on health, the best investigations can save untold millions when policies change. The tragedy of investigative journalism is that its publishers can never come close to fully monetizing those benefits. Investigative journalism, in the economist's parlance, produces positive externalities by the tanker-load, which almost everybody except news outlets ends up reaping. If it were feasible for an outlet to claim even a tiny vig from the benefits it produces, we'd likely see tons more investigations. Instead, investigative journalists and their publishers must depend on indirect payouts. Reporters can reap psychic income for their work, for example, and the proliferation of investigative journalism prizes shows that they're cleaning up in that market. Some publications back investigative projects for partisan reasons[...]
Fri, 17 Feb 2017 15:00:00 -0500
The Good Fight. CBS. Sunday, February 19, 8 p.m.
Sun Records. CMT. Thursday, February 23, 10 p.m.
"All rock 'n' roll came out of Sun Records!" declares Jerry Lee Lewis in the opening moments of CMT's bopping new miniseries. Like a lot of things in Sun Records, it's not quite true, but you'll be too busy dancing to care. Sun Records rocks! Filling out the early-1950s birth certificate of rock 'n' roll is no easy task. Did the water break in Chicago, where Chuck Berry was underlining his tone poems about the lives of an emerging demographic, the teenagers, with a jangling guitar? Or Philadelphia, where Bill Haley was punching up western swing music with machine-gun saxophone lines? Or West Texas, where Buddy Holly's nerd glasses distracted parents from his ragged cries to their kids to rave on? Memphis, perched just above the Mississippi Delta at a strategic spot where icy bluesmen and hillbilly shouters were bound to collide, has as good a claim as any of them. And Sam Phillips, owner of the corner-store Sun Records, if not the father of rock 'n' roll, was surely its midwife. Phillips in 1951 cut what is perhaps the first rock 'n' roll record, Jackie Brenston's Rocket 88 (though fans of Wynonie Harris' 1949 Good Rockin' Tonight will argue the point unto death and beyond). He discovered and signed Elvis Presley, Jerry Lee Lewis, Roy Orbison, Johnny Cash, and Carl Perkins, then eventually lost them all because his mom-and-pop business instincts never rose to the epic level of his artistic vision. Three generations past the rise of rock 'n' roll, the thrill of its rise—the most exciting cultural revolution in American history—is in danger of being forgotten in an age of fans who don't know who Paul McCartney or Wings are, much less that he was in a band before that. But Sun Records is more than up to the task of its tale. 
The 10-episode miniseries starts out in 1951, just as Phillips is turning away from a successful career as a radio-station engineer to concentrate on his bandbox recording studio. Moving away from his bread-and-butter business of taping funerals and weddings, Phillips starts encouraging musical acts he spots in the down-and-dirty clubs along Beale Street, the main artery of Memphis' black nightlife. But his efforts are met with relentless hostility by record distributors, radio stations, parents and even his own wife. "I swear I heard the heavens open up," he exclaims as he plays his newest record for his wife. Sniffs she: "Sounds like the gate to Hell to me." Intercut with Phillips' story in Memphis are scenes of simmering discontent from a restless post-war generation. In rural Arkansas, a teenage Johnny Cash is trying to escape not only the fields where his parents sharecrop, but the dead-end schools where the three R's are reading, writing and the road to Detroit in hopes of a job on an automobile assembly line. In Louisiana, an adolescent Jerry Lee Lewis and his priapic-TV-evangelist-to-be cousin Jimmy Swaggart are sneaking into whorehouses to ogle the girls and, in the process, inadvertently picking up a thing or two about jump-blues piano. Back in a public-housing project, shy high-school kid Elvis Presley's cultural tourism is taking the opposite direction: He's slipping away from sermons at his own church to listen to the gospel singing at a black congregation on the other side of town. And in Nashville, Presley's soon-to-be manager, carny barker Tom Parker, has hustled his way from a gig with nickel-a-peek dancing ducks ("You shoot 'em! You eat 'em! You chase 'em around the yard! You see 'em in the pool! But you ain't never seen 'em dance!") to promoting country crooner Eddy Arnold. The backdrops to the inexorable mar[...]