Subscribe: Reason Magazine
Language: English
Updated: 2017-09-26T00:00:00-04:00


College Isn’t Higher Education and May No Longer Be the Best Way to Deliver the Goods


There's a "deep partisan divide on higher education," reported Inside Higher Ed in July. A month later, Gallup got more specific, asking, "Why are Republicans down on higher ed?" Is that really true? Have our red/blue tribal loyalties actually split us over our views of the value of education beyond the high school level? Let's see.

Well, the articles, based on separate polls from Pew and Gallup, found some strong partisan disparities. According to Pew, 58 percent of Republicans say that colleges and universities have a negative effect on the way things are going in the country (36 percent said they have a positive effect), compared to 19 percent of Democrats with a negative view of colleges and universities. Gallup found that only 33 percent of Republicans and those leaning Republican have a great deal of faith in colleges and universities (67 percent had some or very little), compared to 56 percent of Democrats and those leaning that way (43 percent had some or very little).

So why do Republicans have so little faith in— Wait a minute. Those headlines said "higher education," but poll respondents were asked about "colleges and universities." That's not necessarily the same thing. Sure, colleges and universities have long been the traditional means of hammering learning into the heads of adults, but asking about the delivery system isn't the same thing as asking about the product. And the delivery system is looking a bit seedy these days.

Pew shows a sharp flip in support for colleges and universities among Republicans from generally positive in 2010 to negative now (Gallup just added the question in its latest poll, so has no historical data). That flip occurred during years when colleges and universities have frequently featured in wince-worthy headlines about ideological intolerance, politicized instruction, and eroding due process.
In recent weeks, Reed College, a private, liberal-arts college in Oregon, canceled classes after student protesters disrupted lectures over accusations that a humanities course is too Euro-centric. "A group of freshmen also got involved, complaining that their lecture had been taken over, and the conversation became a shouting match," according to Inside Higher Ed.

At almost the same time, Bret Weinstein accepted a $500,000 settlement and he and his wife, Heather Heying, resigned from their positions teaching biology at Evergreen State College, in Washington. Weinstein was essentially chased off campus by activists for objecting to racially charged student protests. At the height of the controversy last spring, the campus closed amidst threats of violence and thousands of dollars in vandalism.

Students infuriated over disagreement and dissent? Well, why not? Too many disciplines—and entire campuses—have been captured by ideology, making opposition increasingly rare and risky. In 2015, social psychologist Jonathan Haidt cautioned that "As psychology has become politically purified, its concepts have morphed to make them more useful to social justice advocates trying to prosecute and convict their opponents. This political shift poses a grave danger to the credibility of psychology." Two years later, sociologist Musa al-Gharbi echoed that warning, writing, "The fact that many US universities are so out of step with broader society is also contributing to declining public confidence in them—and a growing inability among social researchers to relate to ordinary people."

That "out of step" quality bleeds out of the classroom and affects even students who might try to hide dissenting viewpoints, but still expect decent treatment. Amidst a tidal wave of lawsuits against colleges and universities for bypassing due process protections for the accused, U.S.
Education Secretary Betsy DeVos rescinded a federal "dear colleague" letter that pressured college administrators to pursue sexual assault charges against students through campus kangaroo courts run according to dubious rules. The change didn't come out of the blue; prominent Harvard law faculty had warned in 2014 that "As tea[...]

Humanizing the Struggle at the Oslo Freedom Forum in New York


"Some of our speakers don't live through the year because they're executed, or they're assassinated," Oslo Freedom Forum founder Thor Halvorssen told an audience at Alice Tully Hall in New York. The conference—a smaller version of an eight-year-old international event variously described as "the Davos of dissidents" and "a bit like Comic-Con, only all the heroes are real"—coincided with the United Nations General Assembly session a short distance away. At a pre-conference reception, Garry Kasparov, the Russian chess master and opposition activist, took note of the meeting of tyrants on one side of Manhattan and the meeting of freedom champions on the other. Somehow, he quipped, freedom always seems to be in the West.

Halvorssen said that one goal of the one-day New York conference last week was to "humanize the struggle" of freedom fighters and freedom seekers; indeed, it brought together people with very different stories and struggles.

North Korean defector and human rights activist Ji Seong-ho, now in his early thirties, survived the famine of the 1990s. He lost his left hand and foot in 1996 after being run over by a train while scavenging for coal to trade for food, and endured grueling surgery without anesthesia. A decade later, on wooden crutches, he and his brother made a 6,000-mile trek through China, Laos, Burma and Thailand to escape to South Korea. Ji's raw, emotional account in Korean, in a sometimes-breaking voice—aided by photos on a big screen, such as the schoolroom emptied by famine—told a powerful story even though I had forgotten to get a translation headset. Ji's story ended in finding freedom and in being made whole by modern medicine and prosthetics. No translator was needed for his final triumphant gesture, holding up the crutches he no longer needed. But it is also a story of unfinished work: Ji's activism on behalf of those still trapped under North Korea's hellish regime.
The coexistence of the harrowing and the upbeat, of victory and never-ending battle at devastating cost, was a central, if unspoken, theme of the Freedom Forum.

Iranian-born author Marina Nemat, now living in Canada, was arrested in 1982, at the age of 16, for criticizing the Islamic revolution in the school newspaper; she was tortured in prison and sentenced to death. (Her life was spared thanks largely to the intercession of a guard to whom she was forcibly married.)

Russian democracy activist Vladimir Kara-Murza not only lost a close ally and friend, Boris Nemtsov, to assassination but was himself the target of two apparent poisonings that left him comatose and near death. Kara-Murza spoke in flawless, almost unaccented English of the yearning for freedom in Russia, a country often stereotyped as craving the whip. He spoke of the risks taken daily by critics of the Putin regime, of political prisoners—now numbering about a hundred, comparable to the late Soviet period—and Nemtsov's murder ("when smears and threats fail, they use bullets as their final argument"). Kara-Murza recalled his "first conscious political memory" at the age of ten: the Soviet hardliners' coup in August 1991, when Russians, including his father, "not armed with anything except their dignity and their determination to defend their freedom," were able to stop the coup leaders who had everything, from the mass media to tanks, at their disposal.

The activism of Somali-born Leyla Hussein, now a psychotherapist living in England, is driven by a much more gruesome childhood experience: the genital cutting she suffered at the age of seven, together with her younger sister. "I started to campaign against [female genital mutilation] not because I thought it was wrong; I wanted to protect my daughter," Hussein said. She is co-founder of the non-profit Daughters of Eve and author of a much-discussed television documentary on FGM in England, The Cruel Cut.
Hussein said she has been threatened and physically assaulted for her activism, which some see as offensive to religious and cultural values. Because of safety concer[...]

Why We Need To Shrink the National Debt, And Fast!


It was big news when our national debt recently passed the $20 trillion mark. What's less understood is exactly why having such a massive debt is a bad thing. The short answer is that too much debt slows economic growth, reducing living standards.

The sheer size of the existing debt is deeply worrying to economists on both the left and the right, who agree that when debt stays at 90 percent of GDP or more for five years in a row, growth slows painfully, creating what's called a "debt overhang."

A group of progressive economists affiliated with the University of Massachusetts predicted in 2013 that a debt burden at that level would result in an annual growth rate of just 2.2 percent, which means economic stagnation and anemic job growth. (Earlier this year, one of those researchers co-authored a paper walking back that claim.)

So when will our debt load cross the 90 percent threshold? It's actually been at more than 100 percent of GDP for years now. Periods of slow growth associated with debt overhangs almost always last more than a decade and sometimes stretch out over a quarter century. That means that in 25 years, the overall economy will be about 75 percent of the size it would have been if the government had only gotten the debt in check.
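The arithmetic behind that "75 percent" figure is simple compound growth. Here is a minimal sketch, assuming (hypothetically) overhang growth of 2.2 percent a year against a counterfactual rate of 3.4 percent, closer to the postwar norm; the baseline rate is an illustrative assumption, not a figure from the article:

```python
# Compare 25 years of compound growth under a debt overhang
# versus an assumed healthier baseline path.
overhang_rate = 0.022   # annual growth under a debt overhang (the UMass estimate)
baseline_rate = 0.034   # assumed counterfactual rate, for illustration only
years = 25

# Size of the overhang economy relative to the no-overhang path
ratio = ((1 + overhang_rate) / (1 + baseline_rate)) ** years
print(f"Economy relative to the no-overhang path: {ratio:.0%}")  # ~75%
```

A gap of barely more than one percentage point in annual growth compounds to roughly a quarter of national output lost over 25 years, which is the scale of loss described above.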

That's not much of a future to look forward to.

Countries like New Zealand, Canada, and Germany have demonstrated that when governments reduce debt good things happen. U.S. spending, by contrast, has been above 20 percent of GDP for years, which is well above the historical average. No wonder the Congressional Budget Office predicts that the economy will grow less than 2 percent annually over the next decade. Compare that to growth rates of more than 3 percent for much of the post-World War II period.

Barack Obama and George W. Bush were leaders who lacked the integrity to do what's best for the country by keeping spending and debt in line. President Donald Trump also shows no interest in explaining to the public how runaway debt chokes off the future. That's a failure for which we'll all be paying for a very long time to come.

Edited by Mark McDaniel. Written by Nick Gillespie. Graphics by McDaniel and Meredith Bragg. Cameras by Jim Epstein and Alexis Garcia.



To Commemorate Constitution Day, Princeton Professor Says 'F%*# Free Speech'


Every year, Princeton University holds a Constitution Day to honor one of the most important documents in human history. This year's was a little different, with lectures on search and seizure policies in the Snowden era, and another on slavery and the Constitution. And then there was a lecture called "F%*# Free Speech: An Anthropologist's Take on Campus Speech Debate."

Professor Carolyn Rouse, the chair of the Department of Anthropology and director of the program in African Studies, asserted that "the way which free speech is being celebrated in the media makes little to no sense anthropologically," according to Campus Reform. Free speech absolutism doesn't exist because people self-censor in ways society deems appropriate, Rouse told her audience. Culture is the prime determiner of what speech is permissible and what speech is rejected, she said.

"Language is partial," Rouse argued. "It relies on context for comprehensibility, and can have implications that go far beyond simply hurting somebody's feelings. Put simply, speech is costly. So, contrary to the ACLU's statement on their website regarding the role of free speech on college campuses, the academy has never promoted free speech as its central value."

Absolute free speech means every idea is granted equal consideration no matter how crazy it sounds, and for this reason free speech absolutism should not be valued in academia, according to Rouse. "Free speech is also asymptotic with respect to the goal of allowing people to say whatever they want, in any context, with no social, economic, legal, or political repercussions," Rouse said. Free speech absolutism fails in an academic setting, Rouse argued, when it allows equal footing to the belief of a climate-change skeptic that "all the science discovered over the last X-number of centuries were irrelevant" and the arguments supporting climate change from a scientist.
Rouse seems to see in this scenario a failure of free speech, rather than an opportunity to challenge ideas and see how they hold up in the marketplace of ideas. Preventing the climate change skeptic from talking about his views won't make them disappear.

And it isn't just academia, Rouse contended. No other social institution values free speech absolutism. Every institution has some sort of speech constraint, she said. A defendant can't walk into a courtroom and just start preaching his innocence. The rules and procedures of court prohibit this, and appropriately so, Rouse said.

To some degree Rouse is correct. Institutions ranging from the courts to the media have some restraints on speech. People self-censor for a variety of reasons. But Rouse misses the mark when she suggests the goal of free speech is to allow people to say whatever they want, consequences be damned. The goal of free speech is to allow engagement in open dialogue with others in the marketplace of ideas without the government imposing censorship or punishment.

It is no accident that freedom of speech is included in the first of the Bill of Rights. The Founding Fathers valued the ability to speak freely for myriad reasons, particularly because it guaranteed citizens the right to openly criticize their government.

Freely criticizing the government is something Rouse should support. After all, she started a project called Trumplandia, documenting with essays, articles, poems, video clips, or other media the impact of Trump's presidency. Rouse calls Trump's campaign slogan "Make America Great Again" racist and authoritarian. Without free speech in academia, Rouse's project would not exist.

Absolute free speech does not mean unchallenged speech, as Rouse seems to believe. Rather, it secures the opportunity for even unpopular ideas to be explored. Restraining speech won't make "bad" ideas go away, and it won't suddenly change the minds of people who hold unpopular or even offensive ideas.
As Kat Timpf argues in National Review, "If you understand that you have a right to free speech bec[...]

Wearing a Mask in Public Shouldn't Be a Crime


Last weekend's demonstrations on Monument Avenue in Richmond, Va., didn't descend into rioting and mayhem, for which we can all be thankful. Only seven people were arrested—and four of them shouldn't have been. Three of them are students at Virginia Commonwealth University, and the fourth is a former student. They were on hand to protest the neo-Confederates who had come to town, and were arrested for wearing masks in public. One wore a bandanna over her face; the others wore Halloween masks.

In Virginia, wearing a mask or hood to conceal your identity is a felony. In one of those amusing coincidences of which the universe seems so fond, their trials have been set for Oct. 31—Halloween. In another amusing coincidence, the law they are accused of breaking was passed in 1952, in an effort to stymie the KKK's effort to start a chapter in Richmond.

Actually, that is neither amusing nor a coincidence. Laws passed for the sake of protecting racial minorities or limiting the power of the majority often wind up being used for precisely the opposite purpose. In April, for instance, two women were charged with a hate crime after they burned a sign supporting Donald Trump. Louisiana's "blue lives matter" law forbidding hate crimes against police officers has been interpreted to mean resisting arrest is a hate crime. Hate-speech laws have been used to shut down government critics in Kenya and punish anti-Israel activists in France, among many other examples.

As Glenn Greenwald wrote recently in The Intercept, "This is how hate speech laws are used in virtually every country in which they exist: not only to punish the types of right-wing bigotry that many advocates believe will be suppressed, but also a wide range of views that many on the left believe should be permissible, if not outright accepted... Ultimately, what constitutes 'hate speech' will be decided by majorities, which means that it is minority views that are vulnerable to suppression."
But the law against wearing masks in public is not a bad law because (or only because) it might affect college students protesting racism in addition to white supremacists trying to sustain it. It is a bad law because it infringes on individual freedom without justification.

To begin with, some people have legitimate grounds for wanting to conceal their identity in public. Just ask the many officers of the Virginia State Police who covered up their name tags while working last Saturday's protests. Like Carolyn Hill, the student who was arrested for wearing a bandanna, they did not want internet trolls tracking them down online and harassing them.

Second, people have other reasons for covering their faces in public. A political activist might wear a mask of Guy Fawkes or Nancy Pelosi to make a political point. That's free speech, protected by the First Amendment. Muslim women often wear a niqab out of modesty. That's religious freedom, protected by the First Amendment. People with weakened immune systems sometimes wear masks to protect themselves from infection. And of course, there's Halloween.

We could carve out exceptions for people in those circumstances—and the current statute does make medical and holiday exceptions—but why should we? Why should prohibition be the default position? In a free society, the default position should be the one that upholds individual liberty—and government should need a good reason to carve out an exception. After all, in 99 cases out of 100 a mask doesn't hurt anybody. The law should not prohibit things that do nobody any harm.

But what about the hundredth case? Easy. There's no blanket prohibition against using a firearm, but the law imposes additional penalties for using a firearm in the commission of a felony. We could treat masks the same way. True, sometimes cases might arise in which mask-wearing makes work harder for the police. If the police have surveillance video of a riot, for instance, identifying the culprits is easier [...]

There's Nothing Funny About Trump's Troubling Policing Edicts


During a July speech to police in Long Island, Donald Trump joked that when officers "put somebody in the car and you're protecting their head" that "you can take the hand away, okay?" Many of the cops laughed approvingly, but civil liberties groups—and even some law-enforcement officials—were upset that the president made light of police brutality, especially given some troubling nationally publicized incidents.

Trump's defenders argued that he was only joking about the treatment of killers, and that the rest of us need to lighten up. Didn't Ronald Reagan joke about bombing Russia as he prepared for a radio address? Well, yes. But those arguments aren't persuasive given that the administration's actual policing policies seem likely to encourage abusive police behavior in a variety of ways.

Even the Republican-controlled House of Representatives seems to understand that point. Last Tuesday, the House overwhelmingly approved amendments to a spending bill that try to limit the U.S. Justice Department's efforts to let police officers expand the use of a policy known as "civil asset forfeiture."

Some forms of forfeiture have been around for centuries, but the practice really ramped up in the early days of the drug war, with policies designed to let police grab property and proceeds from major drug enterprises. Like most government programs, it expanded beyond recognition. It's turned into an astoundingly abusive process by which police seize the property of people who have never been convicted—or even accused—of a crime. In 2012 in Anaheim, federal authorities tried to seize a $1.5 million commercial building from its owner after one of his tenants, a medical-marijuana clinic, was accused of selling $37 in marijuana to an undercover cop. The feds eventually dropped the case amid blistering media coverage, but it shows how seriously this power can be abused.
Many states, including California, have passed laws requiring police agencies to gain a conviction (in most cases) before taking a person's property. To get around those laws, local cops would "partner" with federal agencies and then operate under looser federal standards. After the property was taken, the local and federal folks would divvy up the proceeds—and then use the money to bolster their departmental budgets. Two Justice Department officials who helped start the program in the 1980s later argued that the process "has turned into an evil itself, with the corruption it engendered among government and law enforcement coming to clearly outweigh any benefits." The recent House vote seeks to block Attorney General Jeff Sessions from overturning Obama administration rules that put a few limits on these local-federal partnerships.

In another example of the administration's lax attitude toward abusive government practices, Sessions last month decided to restore a federal program that provided rocket launchers, tank-like vehicles and other military gear to local cops. Police departments are supposed to protect and serve the community, not behave like an occupying army.

Before the last administration reined it in, the military acquisition program had gotten out of hand. A San Diego school district received a $730,000 mine-resistant ambush-protected (MRAP) surplus vehicle from the military. Before they were pressured to return it, school officials said, "There will be medical supplies in the vehicle. There will be teddy bears in the vehicle." Oh please. What kind of uprising are these police departments and school security offices trying to subdue?

Years ago, one official told me that his department eschewed high-powered equipment. That's because once the agencies have new toys, they want to use them—even in situations where community policing operations are more appropriate. The equipment encourages police-state tactics.
Yet the Trump administration thinks this is a good idea. Newsweek reported that Sessions in June "submitte[...]

Self-Driving Cars Are Cool, but They're Not for Everyone


"I expect human driving to become illegal in the next 25–35 years in developed countries," insisted Rice University's Moshe Vardi in the course of plugging self-driving cars during a 2016 Reddit question-and-answer session. Tesla CEO Elon Musk sounded a similar note at a 2015 developers' conference, saying, "You can't have a person driving a two-ton death machine." It's an interesting perspective from a man who runs a company that manufactures such devices.

Once upon a time, mass transit was the technocrat's preferred method for prying people out of their wasteful, dangerous cars. If only we could subsidize the right combination of buses, trolleys, jitneys, light rail, monorail, and bullet trains—the thinking went—all our problems would be solved. To save the planet, "public transportation should be favored over private automobiles, and the cars heavily taxed," wrote Hugh McDonald of New York City College of Technology in a 2014 book on environmental philosophy. That view is shared by a number of other scholars and policy makers who hope to eliminate traffic deaths, largely by getting rid of cars. But now there's a new kid on the block: self-driving cars.

The trouble is that neither of these approaches takes into account the reality that almost 20 percent of the population of the United States live in the low-population rural areas that make up the majority of the country's land mass, and they're not about to trade in the F-150 for a newfangled robot chariot.

Against my advice, a friend of mine once insisted on relying on GPS navigation to get to my old house in rural Arizona. Once we managed to locate him, we started his visit by digging his car out of the sand in which he'd mired himself. He had gone down an unmaintained road that didn't actually lead to my address, no matter how enthusiastically the robotic navigator claimed otherwise. For reasons that are clear if you live in the boonies, self-driving cars look a little limited in their near-term potential.
They don't seem especially well-suited to paved but poorly mapped byways, let alone delivering passengers down miles of dirt lanes to hunting camps or trailheads. Many of those routes require a fairly responsive hand on the wheel to deal with unexpected washouts, deep ruts, and uncooperative quadrupeds. I'm also not sure how much fun off-roading would be with a robot calling the shots.

Public transportation has its own challenges in much of the non-urban world. Around me, one bus service connects some local workers with their tourist-industry jobs in Sedona. Another circulates through the town's business areas, and seems to do a heavy trade in getting hobos back and forth between the public library and wherever they're sleeping. Neither will take you to Target, which is 50 miles away. Or to Costco. Or to most residential areas, which are understandably spread out in this sparsely settled piece of the world. You're not riding the bus to dinner and a movie, either, since it shuts down after work hours.

So you can imagine that enthusiasts for public transit and/or a self-driving future are a little thin on the ground around here. The dispersed and sometimes uncharted nature of rural travel makes the former difficult—and while plenty of people view automated cars with gee-whiz interest, the odds are just a bit too high that on the way home from buying one, it'll take a nonexistent turn and dump the new owners into a ditch. They'll ultimately be excavated by future archeologists and displayed as "well-preserved examples of early adopters."

To address these problems, McDonald wants to "encourage settlement in cities," which he says "have much more to offer in the way of museums, performing arts, varied cuisines, and other amenities." It seems we'll get to ride the trolley and be civilized. No thanks. Some oracles of the coming transportation revolution are less presumptuous. "Low populatio[...]

George Washington's 'Founding War of Conquest'


Autumn of the Black Snake: The Creation of the U.S. Army and the Invasion that Opened the West, by William Hogeland, Farrar, Straus and Giroux, 448 pages, $28

The Battle of Little Big Horn may loom larger in popular consciousness, but it is the fray now known as St. Clair's Defeat that marks Native Americans' single largest victory over U.S. forces. In 1791, in what today is Ohio, a pan-tribal force under the direction of Shawnee, Miami, and Delaware leaders served notice to the fledgling American republic that continued incursion into Native lands would come at a dear price. In this case, that price was at least half the soldiers on the U.S. side killed—some sources suggest the number dead was far larger—and nearly 20 percent more badly wounded.

News of the rout caused President George Washington temporarily to lose his legendary cool. (More than one source reportedly heard from Washington's personal secretary, Tobias Lear, how the president raged about General Arthur St. Clair: "To suffer that army to be cut to pieces, hacked, butchered, tomahawked—by a surprise! The very thing I guarded him against! Oh God, oh God, he's worse than a murderer!") Once Washington simmered down, he embarked on a path that would define both his administration and his country: the creation of a standing national army and the pursuit of a war to secure the West for U.S. expansion.

In Autumn of the Black Snake, the independent historian William Hogeland tells the story of that war. His aim, he writes, is to fill in a "vacancy in American memory when it comes to what is perhaps the longest-lasting legacy of George Washington's career, and to the political, moral, and existential burden his career, and its national indispensability, will forever carry." The result is an imperfect but nevertheless compelling work of history. Hogeland rescues some colorful key players from obscurity and restores them to the main narrative of the early American republic.
The Black Snake himself is a case in point. Anthony Wayne began as a Pennsylvania boy enthralled with all things military and became a war hero during the Revolution, rising to the rank of major general. But "after 1776," Hogeland writes, "Wayne never really went home." Returning to civilian life in his late 30s, he proved unfit to manage anything competently: not marriage, not fatherhood, not property, not politics. Wayne was estranged from his family, barely one step ahead of his creditors, freshly relieved of his seat in the U.S. House of Representatives (after a House committee found fraud in his election), and in a downward spiral when Washington unexpectedly placed him in command of the country's new standing army, the Legion of the United States.

In that position, Wayne turned his obsessive focus toward preparing, supplying, and supporting his troops. He built forts, he instituted the first basic training for U.S. soldiers, and his tireless emphasis on discipline and preparedness earned him the nickname Mad Anthony from his men. His "preternatural vigilance"—the man could not be surprised and seemed never to sleep—also earned Wayne the title Black Snake from his enemies in the pan-tribal Western Confederacy.

Wayne ultimately vindicated Washington's trust and accomplished what the president wanted, breaking the back of Native resistance at the August 1794 Battle of Fallen Timbers, and securing both Native and British retreat from the Northwest Territory in the Treaty of Greenville a year later. And he did it all while his second in command both actively undermined him and served as a spy for Spain.

Hogeland also devotes attention to the impressive leaders of "the only confederation that had a chance of obstructing the westward expansion of the United States and came close to damaging the American project in its fragile infancy." One was the charismatic and flamboyant Blue Jacket, [...]

The Ninth Circuit's Foie Gras Blunder


Last week, the Ninth Circuit Court of Appeals overturned a District Court ruling that had struck down California's dumb and unconstitutional foie gras ban. The plaintiffs are already planning their appeal. Technically the ban is back, but the law won't be enforced while the appeal is pending.

"It is unprecedented and unconstitutional that the California legislature can dictate how New York farmers care for their animals, produced in compliance with New York's strict animal welfare laws, and processed under federal inspection," said Marcus Henley, manager of Hudson Valley Foie Gras, a co-plaintiff that's based in New York State, in an email to me this week.

"States have the right to protect their citizens from inhumane and substandard products," said Paul Shapiro, spokesperson for The Humane Society of the United States, which wrote an amicus brief in support of the state law, in an email to me this week. "Rather than continuing to fight a losing battle, foie gras agribusinesses should join the 21st century and accept that the vast majority of Americans find violently force-feeding ducks simply too much cruelty to swallow."

To Shapiro's credit, he predicted this outcome to me in 2015. While that prediction seems long ago, this case has been winding its way through the courts now for around five years.

The plaintiffs, led by an association of Quebec-area foie gras producers and Hudson Valley, had argued that California has no authority to regulate out-of-state and international foie gras producers. But a federal court rejected those arguments, determining in 2012 that such "vagueness, Dormant Commerce Clause, and preemption arguments [we]re 'unlikely to succeed on the merits.'" Ultimately, a U.S. District Court held in 2015 that the law was preempted by the federal Poultry Products Inspection Act (PPIA), which governs, among other things, poultry-product "ingredients."
The Ninth Circuit decision last week disagreed about the ingredients issue and overturned the lower court's ruling. "The PPIA prohibits states from imposing requirements on ingredients that contradict federal regulations," Reason's Scott Shackford wrote last week, in a post that nailed the details of the court's reasoning. "But this foie gras ban technically regulates a process, the manner by which the foie gras is made. Therefore, the judges ruled, the California law does not come into conflict with the PPIA at all."

"In our case, there can be no question that California imposes a requirement on the primary ingredient in my clients' USDA-approved foie gras products—i.e., that they may not contain any force-fed foie gras—which is a requirement that is 'in addition to or different than' those under federal law and is therefore preempted," said California attorney Michael Tenenbaum, who represented the plaintiffs in the foie gras case, in an email to me this week. "The last time the Ninth Circuit tried this—i.e., reversed a district court's preemption finding in an effort to save a misguided state ban on USDA-approved products on the ground that 'states are free to decide which animals may be turned into meat'—it was reversed, 9-0, by a Supreme Court opinion that said (literally), 'We think not,' as it would allow states to 'make a mockery' of federal preemption," Tenenbaum says. (Case link added for reference purposes.)

Tenenbaum and Shackford are correct in their facts and analysis. Ultimately, though, this case isn't about statutory interpretation or ingredients or processes or the PPIA. This is a case—plain and simple—about a farmer's right to raise animals that consumers want to eat, and a big bully of a state working hand in hand with animal rights activists to impose its vague and burdensome laws on other states, and even other countries. No state should have such power. Thankfully—hey!—the U.S. Constitution e[...]

Health Care Costs Are the Reason You're Not Getting a Raise


Every Labor Day, you can count on seeing a spate of news stories saying that "real wages" in the United States haven't grown since the 1970s. That's true, more or less, but the reason for the stagnation might surprise you. It's a complex story, but it boils down to this: Blame health care costs.

According to the Federal Reserve Bank of St. Louis, inflation-adjusted wages have grown by just 2.7 percent in the last 40 years. But inflation-adjusted total compensation—wages plus fringe benefits, such as health insurance, disability insurance, and paid vacation, along with employer-paid Social Security and Medicare taxes—increased by more than 60 percent in the same period. Wages still make up a significant share of your total compensation: 68.3 percent, according to 2017 data from the Bureau of Labor Statistics, vs. 31.7 percent that goes to benefits. But that latter piece has grown significantly, in no small part due to the rising cost of health insurance. And that trend is only going to get worse.

This has political consequences, since most workers don't appreciate how hefty the non-wage share of their compensation is, nor do they generally realize just how much of the money their employer is shelling out on their behalf gets eaten up by health care. As a result, they demand that politicians intervene to deliver more raw pay.

To control health care costs, Americans will have to stop relying on third-party payers to cover small, routine expenditures (as opposed to large and unforeseen ones). According to the U.S. Department of Health and Human Services, out-of-pocket spending—copays and the like—was only 11 percent of all health care spending in 2015, down from 43 percent in 1965. It's an economic truism that if someone else is covering the bulk of the cost of something, you're likely to use more of it—especially if you don't realize that you're paying for it with foregone wages and higher taxes.
This increases the overall demand for health services, which in turn increases the cost. It also creates an incentive for whoever is paying, be it the government or your insurance company, to start putting constraints on which services you can and cannot consume. The end result is that patients have become minor players in many of the financial and medical choices that deeply affect their lives.

Reversing this trend would be hard without a reduction in health care costs big enough to get people to stop expecting their insurance to pay for every little thing. Lower costs would also make it possible to free employers from the responsibility of providing coverage to their workers—because if quality care is cheap and abundant, you don't need to look to your boss to make sure you can get it. That in turn would reduce the gap between compensation and wages. Controlling costs, then, really is the key.

Easier said than done? Yes and no. This is, of course, a long-term project. It requires bringing to health care the kind of innovation we've seen in other sectors over the last few decades. And that means reducing the influence of government bureaucrats and special interests, which routinely obstruct new technologies and resist innovative ways for consumers to interact with their doctors. Introducing novel tools and services can make health care more expensive at first. But as long as the government refrains from setting price controls, costs will eventually go down, just as with consumer goods, allowing ever more people to gain access.

And some cost-saving steps can be taken immediately. For instance, why not allow medical tourism, reform the onerous Food and Drug Administration approval process, and end regulations that stop highly trained nurse practitioners and physician assistants from treating patients? By freeing the health care sector from the grip of government and speci[...]

Fall’s First Television Premieres May Make You Go ‘Meh’


Young Sheldon. CBS. Monday, September 25, 8:30 p.m.
Me, Myself, and I. CBS. Monday, September 25, 9:30 p.m.
The Brave. NBC. Monday, September 25, 10 p.m.
The Good Doctor. ABC. Monday, September 25, 10 p.m.

The hell with Charles Dickens. The new fall television season is certainly not the best of times, nor is it the worst of times (mostly, anyway, though CBS' 9JKL certainly gives pause). It is, perhaps, the most mediocre of television times since The Sopranos and Sex and the City established cable TV as a programming force in which a Nielsen rating of 35 could not be reasonably mistaken for the average IQ of the viewing audience.

Nineteen new series will debut on broadcast television between now and November 2. (Well, 18; The Orville, Fox's cartoon send-up of Star Trek, somehow slipped through security a couple of weeks ago, and if you're only learning this now, count yourself lucky.) And they are nothing if not diverse. There are American Special Forces troops in Syria (NBC's The Brave), American Special Forces troops in Liberia (CBS' Seal Team), and American Special Forces troops in America (The CW's Valor). There are remakes from the 1970s (CBS' S.W.A.T), remakes from the 1980s (The CW's Dynasty) and remakes from the 1990s (NBC's Will & Grace, less a remake than a desiccated zombie clawing its way back out of the grave, since it features the same cast). There are mutants battling a fascist military government (ABC's Marvel's Inhumans) and mutants battling a fascist civilian government (Fox's The Gifted).

What there is not is a surefire breakout hit. (Though, to be honest, my track record on picking hits is almost as bad as that of actual network programmers. When I started writing about television in 2002, I never dreamed Survivor and The Bachelor would still be with us 15 years later, or I might have stopped right then and there.) Even DVR-worthy shows were spotted less frequently than virginal Kardashians.
Wading through the pilots reminded me of the 2008 fall season, which followed a five-month writers' strike and produced a lineup pockmarked with shows like Stylista, which made me want to eat the brains of pretty people, and the remake of Knight Rider, which made me feel like my own brain was being eaten.

Why things went so badly this year, I can't tell. One argument that will doubtless be advanced is that so-called peak television—the glut of production triggered by digital services like Netflix and Hulu joining the business with broadcast and cable channels—has stretched the Hollywood talent pool too far. That doesn't make sense to me; the biggest bucks, generally speaking, are still in broadcast TV, which should ensure that it gets the top talent. Whatever the explanation, this is the couch-potato diversion we have chosen. All that's left to us is to clench our remotes tightly in our teeth as we ride boldly and well into the jaws of video banality.

The fact that NBC's The Brave is just one of three new series about U.S. combat operations in the Islamic world is, unfortunately, probably less a marker of television's follow-the-leader mentality than a depressing reminder that we've been at war there for 16 years, through three presidencies, and seem no closer to the end of the chapter than we were at the beginning. Of course, that wouldn't be true if these TV troopers were unleashed over there.

As movies like Hacksaw Ridge have grown more gruesomely honest about the quantity and quality of battlefield deaths during wartime, TV has gone the opposite direction. The Brave guys—err, persons; gender integration of combat units is a lot further along on television than it is in the Pentagon—are all but impervious to bullets and bombs. And because of all their high-tech toys that can look and listen through walls, in most[...]

We Read Hillary's Book So You Don't Have To


Hillary Clinton's new book What Happened attempts to explain Trump's upset victory in 2016 through a series of reasons that are not Hillary Clinton. The list runs the gamut from apathetic white lady voters to Russian meddling to the inscrutable popularity of Donald Trump. When Clinton's focus turns to herself, however, she's light on culpability. She admits she can be guarded, but that admission doesn't encapsulate the relentless political ambition, paired with shady financial dealings and a willingness to subvert national security, that turned off much of the country.

In the end Clinton was also just a bad candidate.

In the latest Mostly Weekly Andrew Heaton explores Clinton's new book, so you don't have to.

Watch past episodes.

Mostly Weekly is hosted by Andrew Heaton with head writer Sarah Rose Siskind.

Script by Sarah Rose Siskind with writing assistance from Andrew Heaton and Brian Sack.

Edited by Austin Bragg and Siskind.

Produced by Meredith and Austin Bragg.

Theme Song: Frozen by Surfer Blood.

Subscribe at YouTube.

Like us on Facebook.

Follow us on Twitter.

Subscribe to our podcast at iTunes.


California Legislative Session Ends With Higher Taxes, Anti-Trump and Union Priorities


California's legislative session, which completed its work in the wee hours Saturday morning, was one of the most controversial in years, given the degree to which the Democratic majority was able to secure various tax and fee increases. It was also one of the most divisive recent sessions from a partisan standpoint.

The most significant measures passed long before the session's deadline. In April, lawmakers passed a controversial 12-cents-a-gallon gas-tax increase by a razor-thin margin. The law also increased vehicle-license fees. In July, they passed a 10-year extension of the state's cap-and-trade program, with the help of several Republican legislators. The Legislative Analyst's Office estimates the measure could increase gas prices as much as 63 cents a gallon by 2021.

But the final hours of the session were still filled with tension. The housing package worked out between Gov. Jerry Brown (D) and legislative leaders had stalled in the final days, but snuck past the finish line. The package includes three bills. One (Senate Bill 35) would streamline the approval process for high-density affordable housing projects, but requires contractors to pay union-based prevailing wage rates on those subsidized projects in return. The other two parts of the deal have a bigger tax-and-spend element to them. SB2 imposes new fees of $75 to $225 on various real-estate transactions to help fund subsidized high-density housing projects. SB3 will place before voters on the November 2018 ballot a $3 billion state housing bond that likewise will fund the construction of low-income housing units.

The gas tax increase has sparked a GOP-led recall effort against Fullerton-area Democrat Josh Newman, mainly because of his vote to support the increase—and because he represents a GOP-heavy district. Democrats passed two bills this session to change the recall rules to help the embattled senator, but that issue is working its way through the courts.
If Newman loses, Democrats would lose their supermajority in the Senate. Anti-tax activists also are gathering signatures for an initiative that would repeal the new gas tax and license fees.

Nevertheless, some commentators were relieved that the session wasn't worse, from a tax-hiking perspective. Joel Fox, editor of Fox and Hounds Daily, referred to this as a "tax-happy session," but noted the California Chamber of Commerce's success in defeating nine so-called "job killer" bills that proposed some form of tax increase. Some of these defeated tax proposals include: A tax on contractors who do business with the California Department of Corrections and Rehabilitation; excise taxes on manufacturers, distributors and wholesalers of distilled beverages; an excise tax on distributors of sweetened soft drinks to fund a new health program; an increase in the personal income tax of 14.3 percent; a tax on opioid distributors; a new retail tax to fund affordable housing; expansion of the capital gains tax; and a measure to lower the vote threshold for local property tax increases.

California Democratic leaders spent a lot of time this session positioning themselves to resist the Donald Trump presidency. Many efforts involved little more than posturing and press conferences, but the Legislature passed three substantive bills that are designed to either affect the next presidential election or confront the Trump administration over its controversial immigration policies. For instance, the Legislature passed SB568, which moves up the state's presidential primary. It now occurs late in the primary process in June, but would move to March. That would make California the fifth state to vote for president in the 2020 election, provided other states don't play leapfrog with their[...]

Don't Fall for Jimmy Kimmel's Cheap Zero-Sum Emotionalism


In recent months, late-night talk show host Jimmy Kimmel has taken to scaremongering his audience with well-worn Democratic Party talking points regarding health care insurance policy. Between yuks, he occasionally accuses Republicans of being would-be baby killers, which is treated as an important political development because, well, Jimmy Kimmel is famous.

This week, the comedian was back to explain why the new Graham-Cassidy Republican "repeal" bill is bad news. There were only two things wrong with his monologue: Almost everything he said was either completely untrue or highly misleading, and his simplistic emotional appeal was completely disconnected from the real world.

The comedian's interest in policy was sparked by the harrowing experience of having a newborn son who suffered from a rare health condition. Thankfully, his boy is OK. "If your baby is going to die and it doesn't have to, it shouldn't matter how much money you make," said an emotional Kimmel in May. "I think that's something that, whether you're a Republican or a Democrat or something else, we all agree on that, right?" Yes, everyone agrees. As far as I know, there isn't a single politician in America who has ever supported allowing babies to die because they are born with birth defects, even if the parents can't pay. Not pre-Obamacare, and not post-Obamacare.

In any event, after Kimmel's May rant, Louisiana Sen. Bill Cassidy (R) showed up on his show to explain his position. These kinds of cultural encounters shouldn't be dismissed, because most viewers are unaware of specific policies and have only a notional understanding that's prejudiced by the establishment media's coverage. Cassidy came up with something he called the "Jimmy Kimmel Test": that no "family should be denied medical care, emergency or otherwise, because they can't afford it." Kimmel claims that the new bill doesn't meet this threshold.
"This guy, Bill Cassidy, just lied right to my face," the talk-show host said on Tuesday night during an extended political rant. He went on to say, "And by the way, before you post a nasty Facebook message saying I'm politicizing my son's health problems, I want you to know: I am politicizing my son's health problems." OK.

He explained: "Coverage for all? No. Fact, it will kick about 30 million Americans off insurance." Not a single person would be "kicked off" his or her insurance. Rather, the Congressional Budget Office review of the AHCA found that of the 24 million Americans who would no longer have health insurance after an Obamacare repeal, 14 million would choose not to buy insurance in 2018 in the absence of a penalty. And if Obamacare were not repealed, the CBO projects another 6 million people would voluntarily leave the Obamacare markets. Now, if you don't believe Americans should be afforded the choice to leave or not buy insurance, just say that. No one is being kicked off. Moreover, if Kimmel supports the individual mandate, Graham-Cassidy allows California to institute it—as I am sure it would.

Kimmel says: "Pre-existing conditions? Nope. If the bill passes, individual states can let insurance companies charge more if you have a pre-existing condition." States would be allowed to apply for waivers to change what qualifies as an essential health benefit as long as they still preserve "adequate and affordable health insurance coverage" for people with pre-existing conditions. You may prefer price fixing to allowing states flexibility to try and fix these problems, but Graham-Cassidy does not break the "Jimmy Kimmel Test." Kimmel might not be aware that there is no plan in place that has government cutting checks after every surgery. Kimmel then implored his au[...]

Movie Review: Kingsman: The Golden Circle


Like its predecessor, the new Kingsman movie is a bit too much of a good thing. In the first film, Kingsman: The Secret Service, director Matthew Vaughn came up with a giddy take on the early Bond movies, this time focusing on an "independent intelligence agency" headquartered in a high-end tailor shop in London's Savile Row. It was a cute concept—the agents were naturally dapper and their brollies were bulletproof—and it was fun watching Colin Firth, as head spy Harry Hart (workname: Galahad), apply his recessive charm to the bashing of bad guys and the tracking of uber-villains.

That first film was fun because Vaughn has a mad gift for action and an utter disregard for tender sensibilities (something already clear in his wonderfully vicious Kick-Ass movies). But Secret Service was hobbled by a plot thread that required Firth to affectionately mentor a wayward youth called Eggsy (Taron Egerton), who was thought to have agent potential. It turned out that he did, but arriving at this realization was a slog.

The Golden Circle has a different sort of problem, although it's not apparent at first. Vaughn launches the movie with a blazing chase, pegged to Prince's "Let's Go Crazy," that has Eggsy in ferocious battle with an opposition killer all over the inside of a careening taxi. The killer has a cybernetic arm that takes on a life of its own. The taxi is naturally being pursued by a pack of cars with roof-mounted machine guns. Vaughn is in his element here, bringing fresh perspectives to venerable action-movie tropes.

But soon the aforementioned problem crops up—and it turns out to be Colin Firth. We're not expecting to see Firth because at the end of the first Kingsman movie he was shot in the face, and thus rendered, we can surely be forgiven for thinking, dead. But Firth added important marquee heft to that movie, which grossed more than $400 million worldwide. So now his character has been awkwardly—and time-consumingly—exhumed.
We see him first as one of the ghostly deceased agents at a supernatural Kingsman board meeting; later we encounter him as a potty lepidopterist; and then, finally, in his familiar bespoke form. One difference: he has an eye patch now, covering the only injury he sustained from—again—being shot in the face. This tests the bounds of even comic-book logic, but if you can get past it, there are better things ahead.

First of all, there's an international drug cartel run by a chatty psycho called Poppy (Julianne Moore, definitely having fun), who maintains a cheery smile even while feeding an unfortunate underling into a meat-grinder. Poppy's HQ compound—Poppy Land—is located in the Cambodian jungle; it includes a donut shop, a bowling alley, a classic chrome-and-vinyl diner, and a movie theater…where Elton John is being held hostage in one of his old psychedelic-feather-monkey stage outfits. (Why Elton? you might wonder. I have no idea, but he's around a lot and he gives a game performance.)

Moore's Poppy character has a surprising mission in life, and it's the movie's best joke: she wants to end the War on Drugs! Because then she can begin dealing out in the open and people will hail her as the drug-biz legend she should rightly be. In order to make this dream come true, she has spiked vast quantities of heroin, cocaine and crystal meth with a lethal virus cooked up in her very Bondian mountaintop lab high in the Italian Alps. She's prepared to kill millions of people unless…well, let's move on. Poppy—whose professional trademark is a golden-circle tattoo—eventually comes up on the Kingsman radar, and before long Eggsy and the revivified Harry and tech specialist Merlin (Mark St[...]

Breaking Addicts in Order to Fix Them


The Recovery Revolution: The Battle Over Addiction Treatment in the United States, by Claire Clark, Columbia University Press, 336 pages, $35

Though nearly forgotten today, Synanon—the first organization to claim to have a drug-free cure for heroin addiction—had an enormous impact on American culture. At the peak of the drug war of the 1980s and early '90s, at least half of all publicly funded addiction treatment was based on its model of communal and intensely confrontational living.

Synanon was founded in 1958 by Chuck Dederich, a member of Alcoholics Anonymous who felt that A.A. wasn't tough enough. (Dederich claimed to have coined the popular self-help phrase, "Today is the first day of the rest of your life.") By the mid-1960s, it had evolved into a California-based commune widely celebrated as the counterculture's antidote to drugs, and populated by hipsters, Hollywood stars, and jazz musicians. In those days it was lionized by Life magazine, the major TV networks, and even a 1965 Columbia Pictures film; Milton Berle, Jack Lemmon, and other celebrities promoted it.

But then came word—and later proof—of child abuse and of beatings of both adults and children for noncompliance. Dederich decided that members' children were a drain on the community, so he pressured men into having vasectomies performed by Synanon doctors on the spot and ordered women to get abortions or be forced out. "Marathon" groups extended for days without food or bathroom breaks, featuring sadistic emotional attacks on those who were seen as backsliding or disloyal.

To the extent that it is remembered now, Synanon is probably best known for having put a four-and-a-half-foot-long de-rattled rattlesnake in the mailbox of an attorney who was starting to win legal cases against it. Among journalists it also has the notoriety of winning the largest libel judgment in history, after Time called the group a cult in 1977.
Only later was the truth about the organization fully established, thanks to the courageous reporting of the small-town Point Reyes Light, which won a Pulitzer for its exposés in 1979.

Claire Clark's The Recovery Revolution prefers to emphasize the positive. The book doesn't even mention that snake attack, nor the fact that Dederich was convicted of conspiracy to commit murder as a result of it. Perhaps the author wanted to avoid covering familiar ground. But by failing to explore how Dederich and Synanon set the precedent for the abuses she chronicles in later programs based on it—and by failing to wrestle with how those harms stemmed directly from its structure and practices—she undermines her credibility as an objective observer. Worse still: Even though a great deal of evidence suggests that the methods pioneered by Synanon do more harm than good, she doesn't mention any of this scholarship. Instead, she uncritically accepts studies that claim to show that what has become known as the American "therapeutic community" is an effective model for addiction treatment.

Synanon made it acceptable to use tactics in treating addiction that had long been seen as barbaric when used for mental or physical illness. Since the group's core idea was that the "addict" personality was marked by "character defects," Synanon aimed to eradicate it by brute force: shouting abuse at participants for hours and even days on end, spitting in people's faces, humiliating them by making them dress in diapers or drag, and combining food deprivation, sleep deprivation, and relentless confrontation in an attempt to erase a person's old identity. Not a single study shows that these methods are superior to kinder, safer approaches. Indeed, re[...]

The Vietnam Syndrome: How We Lost It and Why We Need It Back


In Kabul, Afghanistan, American Embassy personnel who want to meet with their counterparts at the nearby U.S. military base have to travel a mere 100 yards. But they don't make a practice of walking or driving. They go by military helicopter, reports The New York Times. The space between is too dangerous to cross on the ground.

It's the sort of bizarre fact that might have emerged in Ken Burns' new PBS series on the Vietnam War, illustrating our inability to turn South Vietnam into a safe, stable place. But it's not the past; it's the present.

The Vietnam War was the greatest U.S. military catastrophe of the 20th century. A conflict begun under false pretenses, based on ignorance and hubris, it killed 58,000 Americans and as many as 3 million Vietnamese. It ended in utter failure. Never in our history have so many lives been wasted on such monumental futility. It was a national trauma worse than any since the Great Depression, and it left deep gashes in the American psyche. It instilled an aversion to wars of choice that became known as the Vietnam syndrome.

The allergy might have lasted for generations. It didn't. In 2001, just 26 years after the fall of Saigon, the United States invaded Afghanistan. American troops have been fighting there twice as long as we fought in Vietnam. Once again we find ourselves mired in an incomprehensible land, amid people who distrust us. Once again we are aligned with a corrupt regime that couldn't survive without our help as we incur casualties in the pursuit of goals we never reach.

In Burns' documentary, President Lyndon B. Johnson is heard in 1965 confiding, "A man can fight if he can see daylight down the road somewhere, but there ain't no daylight in Vietnam." Afghanistan has also been an endless journey down a pitch-black mine shaft. The American military drew some obvious conclusions from Vietnam. Gen. Colin Powell, who served in combat there, had them in mind when he formulated what became known as the Powell Doctrine.
It advised going to war only if we can identify a vital interest, have clear, achievable purposes, are prepared to use decisive force, and know our exit strategy. But Powell's wisdom eventually was forgotten.

How could we be repeating the mistakes of Vietnam already? We didn't wake up one day with severe amnesia. It was not a one-step process. It occurred through a succession of military interventions that convinced us we were clever enough to avoid the pitfalls that had brought us to such ruin in Southeast Asia.

Ronald Reagan lamented the Vietnam syndrome but shrewdly declined to send American forces to fight leftists in Central America. He did, however, undertake one brief, low-risk invasion—of the Caribbean island of Grenada, against a Castro-backed Marxist regime. Our forces removed the government, and we soon departed.

In 1989, George H.W. Bush tried a more ambitious mission, invading Panama to eject a dictator. Then came Iraq's invasion of Kuwait, which provoked Bush to send a huge air and ground force to expel Saddam Hussein's army—a fight that proved far easier than expected. Bill Clinton had his own victory, an 11-week bombing campaign that forced Serbia to leave the breakaway province of Kosovo. He managed it without a single American combat fatality.

By 2001, when terrorists attacked the World Trade Center and the Pentagon, Americans had gotten our swagger back. We had proved we could bring about regime changes in hostile countries while incurring few casualties and avoiding long-term entanglement. We thought we had cracked the code of successful military interventions. The general attitude in Washington wa[...]

3 Priorities to Guide Tax Reform


Congress is finally tackling the tax code, which is good news because reform is badly needed. Our outdated code is complicated by thousands of credits, deductions and exemptions for individual and corporate interests—and it imposes high rates that inhibit economic growth. However, as we've seen with the failed efforts to repeal and replace Obamacare, getting a consensus among Republican members is easier said than done. It should boil down to three priorities.

First, though overhauling the whole tax code would be great, if the goal is economic growth, reforming the corporate side is the most pressing priority. Everyone knows that the corporate tax system is punishingly inefficient and a large driver of corporate tax avoidance. Ideally, a reform plan could cut the rate dramatically and move the United States from the highest to one of the lowest rates among industrialized nations. The president has talked about 15 percent, which would make U.S. companies significantly more competitive abroad and at home while dramatically reducing the need for tax avoidance and inversions.

It should also replace "depreciation" with "full expensing." This sounds like a bunch of tedious jargon, but all you need to know is that companies generally aren't allowed to immediately deduct (expense) their investment costs when calculating taxable income and that this creates a bias against business investment. Some exceptions exist and create their own problematic biases because they're targeted toward particular industries or activities supported by politicians. Different rules make for a more complex tax code, encourage lobbying and lead to special privileges for the well-connected. Full expensing would flatten all this out.

These reforms would boost the economy, American competitiveness and job creation the most. A corporate tax reduction would boost standards of living through higher wages, too.
That's because the majority of the corporate tax is shouldered by workers, in the form of lower wages.

The second priority? Congress needs a budget. Without that, there's no reconciliation—the process by which Republicans can bypass the need for 60 votes in the Senate. Without that, there's no reform. However, the rules of reconciliation require that tax reform be deficit-neutral outside the 10-year budget window. A lot of the current tension about tax reform is caused by a disagreement about how to meet the deficit-neutral constraint.

The third priority is that tax reform be paid for. The best way to do that, however, is to restrain spending. We're $20 trillion in debt and heading once again to a $1 trillion deficit, even before the tax cuts. Extending and strictly enforcing the previously bipartisan and quite modest Budget Control Act caps of 2011 until 2025 would pay for tax reform without resorting to new sources of revenue such as the misguided value-added tax, a carbon tax or a border adjustment tax.

Getting rid of genuine loopholes that benefit individuals and corporate interests would also help pay for tax reform. The exclusion for employer-provided fringe benefits, the state and local tax deduction, and the deduction for U.S. production activities are ripe for repealing and could allow for trillions of dollars in tax cuts. Congress could approve a tax cut that expires after 10 years, of course, but temporary tax cuts are less conducive to growth because entrepreneurs and investors realize that there's no permanent change in incentives to create jobs, income and wealth.

All of this leads to a problem. If Congress and President Donald Trump aren't willing to impose spending[...]

The Politician Behind California High-Speed Rail Now Says It's 'Almost a Crime'


High-speed rail lines began popping up in Europe and Asia in the early 1980s. Passengers were exhilarated by the futuristic trains rocketing between cities on glass-smooth rails at upwards of 200 miles per hour. With high-profile roll-outs in France and Japan, bullet train mania was underway. And then reality set in. "The costs of building such projects usually vastly outweigh the benefits," says Baruch Feigenbaum, assistant director of transportation policy at the Reason Foundation, the 501(c)(3) that publishes this website. "Rail is more of a nineteenth century technology [and] we don't have to go through these headaches and cost overruns to build a future transportation system." Supporters claim that most high-speed rail systems operate at a profit, but those claims rest on accounting tricks like leaving out construction costs and indirect subsidies. If you tabulate the full costs, only two systems in the world operate at a profit, and one breaks even. But politicians can't resist the ribbon-cutting ceremonies and imagery of sleek trains hurtling through the lush countryside. So the projects keep coming. California's high-speed rail line was sold to voters on the bold promise that it would someday whisk passengers between San Francisco and Los Angeles in under three hours. Nine years later, the project has turned into such a disaster that its biggest political champion is now suing to stop it. An icon of California politics known as the "Great Dissenter," Quentin L. Kopp introduced the legislation that established the rail line and became chairman of the High-Speed Rail Authority. He helped convince voters in 2008 to hand over $9 billion in bonds to the Rail Authority to get the project going. Since he left, Kopp says, the agency has mangled his plans. "It is foolish, and it is almost a crime to sell bonds and encumber the taxpayers of California at a time when this is no longer high-speed rail," says Kopp. 
"And the litigation, which is pending, will result, I am confident, in the termination of the High-Speed Rail Authority's deceiving plan." Voters supported the bond measure to pay for construction on the condition that the train would be self-sustaining. But multiple outside analyses conclude that the Rail Authority will have to massively hike ticket prices or rely on taxpayer largesse. According to one recent estimate, the project's latest iteration would suck up at least $100 million in annual subsidies. Since 2008, lawsuits have multiplied, private investors have fled, and even the official price tag has nearly doubled, from $33 billion to $64 billion. When the legislature cleared the way for the Rail Authority to begin selling the voter-approved bonds in early 2017 to fund construction, the agency declared it a "milestone." Kopp was livid. "It's deceit. That's not a milestone, it's desperation, because High-Speed Rail Authority is out of money," Kopp told Reason. Kopp joined a lawsuit brought by attorney Stuart Flashman, who has represented environmental and transportation groups in several previous actions against the Rail Authority. The suit aims to stop the project on the grounds that the agency broke numerous promises to voters enshrined in the 2008 ballot measure, including that all the financing for a segment had to be in place before construction could begin. Feigenbaum believes that starting construction even though there isn't enough funding allocated to finish the project is a deliberate strategy: The Rail Authority intends to sink as much money into the project as possible, hoping to extract further taxpayer subsidies to complete it.[...]

Unlicensed Tour Guides Not Allowed in Savannah


Michelle Freenor's business almost failed before it began. That would have been a loss, since her Savannah, Georgia, walking tour gets only good reviews from customers. "Top notch tour guide giving us a lot of history of Savannah's Historic District," said one five-star Yelp review. "Great, informative," said another. But that didn't matter to Savannah politicians. They said she had to get a government license if she wanted to charge people for tours. And getting the license was difficult. She had to pay $100 and then "pass a college-level history exam with tons of obscure gotcha questions," Freenor told us. Passing required "three to five months of studying because it was about 120 pages. I had to map out where I was standing, what I was saying." It's one more example of abuse of licensing rules. Dick Carpenter, author of the book Bottleneckers, details how these regulations strangle new businesses. "She also had to do a criminal background check, which meant she had to give a urine sample and a blood sample," Carpenter told me. "She also had to go through a physical fitness test." No matter, said the city, you must pass the test and you must pay the fee. "The city was making a nice bit of money," says Freenor. A video producer went to Savannah to confront the licensing rules' biggest promoter, Alderman Bill Durrence. "A lot of people think that this fee is just another money grab by the city," he admitted, "but I hear a lot of tour guides saying things that make me cringe." So what? Some of the more popular Savannah tours are "ghost tours." Those tour guides must take the test, too, although it includes no questions about ghosts. The city even had some wrong answers on its test. It claimed "Jingle Bells" was written in Savannah. Most people say it was written in Massachusetts. The test also misidentified the city's largest square. 
Savannah's politicians demanded aspiring tour guides pass a test that included rules about horse-and-buggy and tram tours, even if the guides only intended to walk. Freenor took the exam and passed it on her first try. But then she got sick; she has lupus. "When I told them, hey, I don't think I can pass the physical this year, I was actually told by a city official, well, I guess you're going to have to find another occupation." Durrence admits, "There were a couple of points that maybe went a little too far in the licensing process, (like) having to have a physical exam periodically, maybe the cost of the test." Yes—politicians routinely go too far. Fortunately, the Institute for Justice, the libertarian law firm where Carpenter works, helped Freenor sue Savannah, and the city backed down. Robert Johnson, one of Freenor's lawyers, points out that such licensing laws violate the guides' right to free speech. "What tour guides do is talk for a living. They're just like stand-up comedians, journalists or novelists. In this country, you don't need a license from government to be able to talk." An Institute for Justice lawsuit got rid of the Washington, D.C., tour guide test, too. Will terrible guides start giving terrible tours because of that? No, says Freenor. "The free market is taking care of itself. Bad tour companies don't last." She's right. Competition is the best way to decide which tour is good. In fact, a recent Institute for Justice study—using Trip Advisor review data—found that tour guide quality was no different after Washington's test was eliminated. Alderman Durrence doesn't like the fact that the market, rather than government, determines c[...]

Congress Does Not Want Its War Power


The short-lived CBS series BrainDead, now available on Netflix, is a science-fiction satire about an invasion of Washington, D.C., by extraterrestrial bugs that crawl into people's ears and hijack their minds as part of a plot to conquer the world. But the most implausible aspect of the story is a dramatic Senate committee vote on whether to authorize military action in Syria. In the real world, of course, no such vote is necessary, because the president does whatever he wants with the armed forces he controls while Congress abdicates its constitutional responsibility to decide when the country should go to war. Last week, 61 senators showed they are happy with that situation by tabling an amendment that would have forced a debate about endless, metastasizing wars that cost trillions of dollars and thousands of lives without making Americans any safer. The amendment, introduced by Sen. Rand Paul (R-Ky.), would have repealed the 2001 authorization for the use of military force (AUMF) against the perpetrators of the 9/11 attacks and the 2002 resolution approving the war in Iraq. The repeal would have taken effect in six months, giving Congress time to consider the justification for continued U.S. military involvement in Afghanistan, Iraq, and the various other countries supposedly covered by those resolutions. "The war in Afghanistan has gone on 16 years now," Paul said before the vote on his amendment. "We have people who will be fighting in the war…in the next year or so who were not yet born on 9/11. We have long since killed the people who perpetrated 9/11." For years Donald Trump opposed what has become America's longest war, calling it "a total and complete disaster" that has "wasted an enormous amount of blood and treasure." After becoming president, he changed his mind, reaffirming the U.S. commitment to remain in a country where by his own account "we don't know what we are doing." But as far as 61 senators are concerned, there is nothing to debate here. 
Barack Obama said the 2001 AUMF should be repealed because it was dangerously obsolete. He nevertheless claimed it authorized military action against ISIS, which did not exist when the resolution was passed. Obama belatedly sought congressional permission for that war while insisting he did not need it. But as far as 61 senators are concerned, there is nothing to debate here. Paul notes that Congress never approved U.S. intervention in Libya, Syria, Yemen, Nigeria, or Somalia. As a presidential candidate, Trump criticized such ham-handed meddling in foreign civil wars. As president, not so much. But as far as 61 senators are concerned, there is nothing to debate here. Obama opposed the war in Iraq. So did Trump, although not until after it started. Even Hillary Clinton, who as a senator voted for the war, eventually conceded it was a mistake. "For years now," Paul noted last week, "some senators and candidates have lamented that they voted for the Iraq war." But as far as 61 senators are concerned, there is nothing to debate here. Those 61 senators include every Republican aside from Paul, Mike Lee (Utah), and Dean Heller (Nev.), who opposed tabling Paul's amendment, and Marco Rubio (Fla.), who did not vote. Opponents of the amendment also included 13 Democrats, several of whom have publicly questioned Trump's fitness for office. Sen. Claire McCaskill (D-Mo.) thinks Trump is a "buffoon." Sen. Sheldon Whitehouse (D-R.I.) says Trump is attacking "basic institutions of government…in unprecedented ways." Sen. Jeanne[...]

Stossel: Tour Guides Under Attack


Want to earn money showing someone around? It's not as simple as it sounds. Many cities require a license in order to do that.

For Michelle Freenor, owner of "Savannah Belle Walking Tours" in Savannah, this meant a background check complete with blood and urine samples, a physical fitness test, plus months of studying for a college-level history exam. The city charges $100 every time the exam is taken. She passed on her first try, but many fail.

All of this, just to speak for a living.

Bill Durrence, alderman of Savannah's 2nd District, admits parts of the licensing requirements may have gone too far, but says: "The licensing and the testing, I thought was a good idea just to make sure people had the accurate information."

When Michelle was diagnosed with lupus, she told the city she might not be able to pass the physical. A licensing bureaucrat told her, "you'll have to find another occupation... if you don't like it then you can sue us."

So she did.

The Institute for Justice, a libertarian law firm, took her case for free. The Savannah bureaucrats backed down, but that doesn't happen easily, says Dick Carpenter. "There's discovery, depositions are taken... [it can take] months, often years."

But Savannah isn't the only city to create bottlenecks for those who want to give tours: Charleston (SC), New York (NY), Williamsburg (VA), St. Augustine (FL), and New Orleans (LA) all have tests.

Washington, D.C., used to, until the Institute for Justice fought its test too. Watch John Stossel give his own Segway tour in D.C. and learn about yet another way that the government makes it harder for people to find jobs.

It is part three of our Bottleneckers series.

Produced by Naomi Brockwell. Edited by Joshua Swain.

Stossel on Reason

Subscribe to our YouTube channel.

Like us on Facebook.

Follow us on Twitter.

Subscribe to our podcast at iTunes.


Venezuelan Price Controls Lead to Predictable Shortages


Yesterday, Bloomberg had an interesting article about food shortages in Venezuela. Contrary to popular perception, the Venezuelan shops are not empty. Bakeries, for example, offer "a wide variety of freshly-made breads," including "a fat, dense loaf called the gallego, or a soft sobado." But "the canilla, a soft, buttery take on the baguette that's been the beloved bread of choice in this South American country for decades," is missing from the shelves. Why? The canilla has disappeared because its price is set by the state. The price of the bread is "set at such a low level—1,500 bolivars versus the 4,500 to 7,500 a gallego commands—that bakers complain it doesn't come close to covering their costs. So they use new-found supplies of wheat in the country to bake every other kind of bread imaginable." Say what you will about socialism, but it always follows a predictable pattern. In an attempt to make something available to everyone, the socialists ensure that it is not available to anyone (except for the politically well-connected). As a child growing up behind the Iron Curtain, I recall constant shortages of basic foodstuffs. The price of meat, for instance, was kept artificially low due to political considerations. Low prices created an impression of affordability. On their trips abroad, communists would often boast that workers in the Soviet empire could buy and produce more meat than their Western counterparts. In reality, shops were often empty. The deleterious consequences of price controls should not come as a surprise to anyone with a basic understanding of economics, including supply and demand, and the role that free markets play in allowing the price mechanism to function properly. Back in 1979, Robert Schuettinger of Oxford University and Eamonn Butler of the Adam Smith Institute wrote a brilliant series of essays entitled Forty Centuries of Wage and Price Controls: How Not to Fight Inflation. 
The authors noted that price and wage controls go back, at the very least, 4,000 years to ancient Egypt. "For centuries the Egyptian government strived to maintain control of the grain crop, knowing that control of food is control of lives. Using the pretext of preventing famine, the government gradually regulated more and more of the granaries; regulation led to direction and finally to outright ownership; land became the property of the monarch and was rented from him by the agricultural class." According to the French historian Jean-Philippe Levy, "There was a whole army of inspectors [in Egypt]. There was nothing but inventories, censuses of men and animals … estimations of harvests to come... In villages, when farmers who were disgusted with all these vexations ran away, those who remained were responsible for absentees' production... [one of the first effects of harsh price controls on farm goods is the abandonment of farms and the consequent fall in the supplies of food]. The pressure [the inspectors] applied extended, in case of need, to cruelty and torture." As Venezuelans can attest, the basic laws of economics have not changed since the time of Hammurabi. And, as they can also attest, neither have the means—cruelty and torture—by which governments attempt to make price controls work in the real world. [...]
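The bakers' switch away from the canilla is simple margin arithmetic: when the state pins a price below cost, every controlled loaf is baked at a loss, so scarce flour flows to uncontrolled breads instead. A quick sketch using the article's bolivar prices (the per-loaf cost figure is an assumption for illustration only):

```python
# Why the price-controlled canilla vanishes while the gallego stays on the shelf.
# Bolivar prices come from the Bloomberg piece; the cost per loaf is an assumed figure.

COST_PER_LOAF = 2_500  # assumed cost of flour, labor, and power per loaf

prices = {
    "canilla (state-set price)": 1_500,  # the controlled price
    "gallego (market price)": 4_500,     # low end of the uncontrolled range
}

for bread, price in prices.items():
    margin = price - COST_PER_LOAF
    decision = "worth baking" if margin > 0 else "baked at a loss, so skipped"
    print(f"{bread}: margin {margin:+,} bolivars ({decision})")
```

At any assumed cost above 1,500 bolivars the controlled loaf loses money on every sale, so the shortage follows from the bakers' own arithmetic, not from a shortage of wheat.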

Mississippi's Jump-Out Boys


Betty Jean Tucker, a 62-year-old resident of Canton, Mississippi, says she was hosting a barbecue for family and friends in 2014 when several unmarked cars appeared. Plainclothes deputies from the Madison County Sheriff's Department (MCSD) jumped out. Without a warrant, they detained and searched all her guests, going so far as to rummage through everyone's pockets, she says. After finding nothing, the deputies got back in their cars and drove off without explanation. It wasn't the first time Tucker had a run-in with the MCSD. About five years ago, she says, her teenage grandson was in her front yard, fixing his brother's bicycle, when an unmarked truck sped toward him and stopped. Two plainclothes officers jumped out, tackled him to the ground, and searched him. Again finding nothing, the deputies left. Tucker shouted at them, asking what he had done. "Tell your grandson to wear a shirt next time," they allegedly replied. Tucker is now a named plaintiff in an American Civil Liberties Union (ACLU) class action lawsuit against Madison County. The suit, filed in May, alleges that the sheriff's department and its plainclothes "jump-out" squads systematically target and violate the Fourth and Fourteenth Amendment rights of residents like her just for being black and living in the wrong place. The dozens of similar stories unearthed in the ACLU's suit and a subsequent Reason investigation are stunning, but the Madison County Sheriff's Department's tactics are not unique. Paloma Wu, a Mississippi ACLU attorney, says they represent only a "sharpened iteration" of the methods used widely in other places across the country. The ACLU is also suing Milwaukee for its high-volume stop-and-frisk program, which the civil rights group says subjects minority residents to suspicionless searches. 
Milwaukee Police Chief Ed Flynn is a disciple of "broken windows" policing—the theory that a heavy police presence in a community, combined with proactive enforcement of low-level nuisance crimes, will deter more serious offenses. Under Flynn, the combined number of police traffic and pedestrian stops in Milwaukee nearly tripled in an eight-year period, rising from 66,657 in 2007 to 196,434 in 2015, according to the lawsuit. And minorities bear the brunt of that saturation policing. Flynn argues the strategy logically focuses on areas with the most crime. In practice, however, community activists and civil rights groups say these strategies make minority residents feel like they're under siege from police. It's also expensive. In 2016, the city of Milwaukee paid out $5 million to 74 black residents who said they were illegally strip-searched. If nothing else, the policy hurts relations between neighborhoods and police. Earlier this year, the Baltimore Police Department declared that it was ending its use of plainclothes police known as "jump-out boys" or "knockers" by locals. The announcement came after the federal indictment of seven Baltimore plainclothes officers on charges of robbery, extortion, racketeering, and filing false police reports. The Metropolitan Police Department in Washington, D.C., used to have an aggressive jump-out squad that operated in the poorer wards of the District, but the city disbanded these vice squads amid community complaints in 2015. "Half of the time, they pull you over, or if you walking and they stop you, [it's because], oh, you fit a description," a black high school student i[...]

Let’s Lay Off the 1930s Reenactments


We can be forgiven, I think, for wondering if historical reenactors are looking to the 1930s for inspiration and excitement. Two groups of losers (to borrow a favorite pejorative of the moment)—white supremacists/neo-Nazis and antifa/neo-communists—have revived the hoary collectivist ideologies of that era in a seemingly conscious effort to live out their Weimar dreams of heroic street fighting. It would be bad enough if the 1930s reenactments were confined to bloody clashes in the streets, but they're not. Then, as now, those clashes were the outcome of a larger loss of faith in individualism, tolerance, and the sort of open, unbossy, relatively live-and-let-live society that we sum up, within broad parameters, as liberal democracy. "Democracy has become much more dysfunctional in the West, and particularly in the United States, I think more than anywhere else," Adrian Wooldridge told a Harvard Business School interviewer after the 2014 publication of his book The Fourth Revolution: The Global Race to Reinvent the State. "And that is it seems to be completely gridlocked, it seems to be incapable of taking long-term decisions, and it seems to have become a prisoner of various interest groups." Writing before the rise of Trump, as well as of populist movements and politicians across Europe, Wooldridge, management editor of The Economist, and co-author John Micklethwait, former editor-in-chief of the same magazine, posited that modern democracies have paralyzed themselves with unaffordable welfare-state commitments. "Ever-bigger government meant ever-greater social dysfunction. Vested interests competed ever more viciously for their share of the pie," they write in the book. Debt-ridden and sclerotic, democratic governments are unable to keep their promises or clean up their messes, and so anger their constituents. 
In desperation, voters look further and further afield for solutions—even to explicitly authoritarian fixes and demagogues who finger unpopular groups or classes to blame. We've been here before. In the 1920s and 1930s, after the bloodbath of the Great War and even more with the onset of the Great Depression, Americans and Europeans also lost faith in liberal democracy. "In the wake of global economic disaster, there was no particular reason to prefer the political system most closely associated with capitalism—liberal democracy—to new systems that promised a brighter future," muses historian Wolfgang Schivelbusch in his 2006 book, Three New Deals. The fate of Weimar Germany is well-known, as is the rise of authoritarianism and totalitarianism across most of Europe of the time. But the U.S. did better only by comparison. The atmosphere in Washington, D.C. was "strangely reminiscent of Rome in the first weeks after the march of the Blackshirts, of Moscow at the beginning of the Five-Year Plan.…America today literally asks for orders," the New York Times reported on May 7, 1933. In short order, administration officials including Hugh Johnson, Rex Tugwell, and even President Franklin Delano Roosevelt were openly praising the policies and precedents set by Italian dictator Benito Mussolini. And they weren't fond of naysayers. "Roosevelt appointed loyalists to the Federal Communications Commission who made it clear that licenses would be revoked for broadcasters who aired programs critical of the government," Thaddeus Russell notes in A Renegade His[...]

The New Red Scare


These days America sometimes looks as if it were slipping into the grip of another Red Scare. Only this time the object of fear and loathing is the far-right menace, not the far-left one. The first Red Scare happened after the Bolshevik Revolution in 1917. The second followed World War II and coincided with the start of the Cold War. Both scares involved a hysterical overreaction to a genuine threat. Totalitarian communism was antithetical to America's most cherished values, and anti-communism was the morally correct position to take. Some took it too far. The overreaction led to loyalty oaths and star-chamber hearings before the House Un-American Activities Committee and Hollywood blacklists and a general atmosphere of what, today, we might call political correctness: an intolerance of dissenting ideas that challenged, or were insufficiently devoted to, the prevailing anti-communist orthodoxy. The more common name for the overreaction is McCarthyism. All of this produced almost inevitable blowback, which came to be known as anti-anti-communism. Anti-anti-communists did not support communism, but they also opposed McCarthyism. To muddle the issue even further, many on the left were at least sympathetic to communism, and at least a few were objectively pro-Soviet, so it was easy to lump anti-anti-communists in with those who were pro-communist, and it could be difficult to navigate all of the finely grained distinctions. Those debates have passed into history's sepia pages. Now the current debate over the alt-right has begun to display some of the same hallmarks. To begin with, there is the undeniable existence of a clear and present danger. The racist right's identitarianism is antithetical to America's most cherished values, and opposing the alt-right is the morally correct position to take. The threat must be countered at every turn. At the same time, the wholesome and necessary opposition to bigotry has started to metastasize into something less healthy. 
You can see that in the way Berkeley reacted to a speech by Ben Shapiro. From the militarized police preparation to the emotional counseling for students, you'd have thought Shapiro, a Jewish conservative who opposes Donald Trump, was the reincarnation of Adolf Eichmann. You can see it at the Oregon Bach Festival, which recently fired British conductor Matthew Halls for affecting a Southern accent while joking with a friend. The friend, Reginald Mobley, is from the South, and black. A woman reported Halls for making racist comments. Mobley insists "there was nothing racist or malicious" about his friend's joke. Too bad, festival officials said; Halls is out. Mobley told a British newspaper Halls "has been victimized and I'm very upset about it." You can see it at the University of Iowa, which requires job applicants to promise they will "demonstrate their contribution to diversity and inclusion" if they are hired. (Virginia Tech tried to impose a similar litmus test for faculty members a few years ago.) To consider why that might be problematic, imagine the university were to demand that applicants "demonstrate their fidelity to capitalism and free enterprise." You can see it in the proliferation of college "bias response teams," which swing into action when somebody reports somebody else—informs on them—for saying or doing something that might be viewed as offensive or hurtful. On today's campus, tha[...]

How Alexander Hamilton Screwed Up America


Having now endured a more than two-year orgy of adoration for the Broadway hip-hop musical, Hamilton, the public surely deserves a historical corrective. Historian Brion McClanahan's latest work on the Revolutionary period, How Alexander Hamilton Screwed Up America, is being released Monday. Ron Paul, the Libertarian and Republican candidate for president and longtime U.S. Representative from Texas, has written the foreword, which he graciously shared in advance with Reason. The central government has always been the greatest threat to liberty in America, but most Americans don't understand how modern America became the warfare state. How did the president acquire so much unconstitutional power? How did the federal judiciary become, at times, the most powerful branch of government? How were the states reduced to mere corporations of the general government? Why is every issue, from abortion to bathrooms to crime to education, a "national" problem? The people have very little input into public policy. They vote, they rally, they attend "town hall" meetings, but it does very little to stop the avalanche of federal laws, regulations, and rules that affect every aspect of American life. We have a federal leviathan that can't be tamed, and Americans are angry about it. They want answers. Certainly, the Framers of the Constitution did not design our system this way. They intended the checks and balances between the three branches of government and also between the states and the central government to limit the potential for abuse, but somewhere along the way that changed. Who or what changed the system? It wasn't Barack Obama or George W. Bush. It wasn't even Franklin Roosevelt, his cousin Teddy, or Woodrow Wilson. They certainly helped, but as Brion McClanahan argues in the following pages, the architects of our nationalist nightmare were none other than Alexander Hamilton and a trio of Supreme Court justices: John Marshall, Joseph Story, and Hugo Black. 
Identifying the source of the problem is essential for correcting it. Hamilton has become one of the more popular figures in America for the Left and the Right, so accusing him of making a mess of the United States is certainly shocking. But it is also accurate. Hamilton's constitutional machinations created the outline for literally every unconstitutional federal act, from executive and judicial overreach to the nationalization of every political issue in the country. He lied to the American public about his true intentions before the Constitution was ratified and then used sly doublespeak to persuade others that so-called "implied powers" were part of the plan from the beginning. We would not have abusive unilateral executive authority in foreign and domestic policy, dangerous central banking, and impotent state governments without Hamilton's guidance. Hamilton is the architect of big government in America. Marshall, Story, and Black certainly acted as co-conspirators. Marshall's landmark decisions could have been written by Hamilton. His reading of the Constitution was at odds with how the document was explained to the state ratifying conventions in 1788. Had the document been explained to those conventions as Marshall later read it, the people would have rejected it. His belief in federal judicial supremacy and unchecked national authority has been the keystone to every subsequent outrageous federal ru[...]

The Juggalos March on Washington


"We have the right to listen to any kind of music we want without being labeled a gang," says Nellie Aldred, a Juggalo and mother of two. Aldred and family traveled from South Carolina to the D.C. area to participate in the Juggalo March on Washington. Juggalos are the fans of the Detroit horrorcore rap group Insane Clown Posse (ICP), and they are protesting a gang classification given to them by the Federal Bureau of Investigation in 2011.

Aldred says she and her family were needlessly stopped by police over a hatchetman sticker on their car. (The hatchetman is a symbol that identifies one as a Juggalo.) Furthermore, Juggalos say the gang label has led to lost jobs and been used against them in child custody disputes.

"[The march] is our mark on history. It's showing the world who we are. We've been hiding under the streets for way too long and we are about to come up top and show everybody who we are," says Aldred.

"If the government can get away with this, then what the fuck happens to us next?" said Shaggy 2 Dope (Joseph Utsler), one of the members of ICP, to a crowd in front of the Lincoln Memorial.

The march included testimonies from people who have had the gang label negatively applied to them as well as speeches from supporters like writer and Juggalo Nathan Rabin.

"There is no such thing as an ordinary Insane Clown Posse show. It's always a spectacle," says Rabin, author of You Don't Know Me But You Don't Like Me and 7 Days in Ohio. "The challenge was to show the world that Juggalos are good people. Juggalos are a law-abiding people. Juggalos love each other and are a positive force for the community and I think that's been illustrated here."


Produced by Paul Detrick and Jim Epstein. Camera work by Epstein, Todd Krainin and Meredith Bragg. Sound by Mark McDaniel.


Stossel: The Best Part of the Constitution


This Constitution Day, John Stossel asks: What's the most important part of the 230-year-old document?

Many people we asked in New York couldn't name a single right.

Parts of the Constitution are hard to read. But they're still important! John Stossel goes through some of the most important ones, like:

-- The right to free speech
-- The right of the people to bear arms
-- The guarantee of trial by jury
-- The 13th Amendment, which outlawed slavery

Stossel also asks liberty-supporting people like Senator Mike Lee (R-Utah), Rep. Thomas Massie (R-Ky.), and Rep. Justin Amash (R-Mich.) for their picks.

Produced by Maxim Lott. Edited by Joshua Swain.

Stossel on Reason



Documentarian Ken Burns on How Vietnam Explains the Current Political Moment


Filmmaker Ken Burns is best known as the disembodied voice accompanying black-and-white photographs of everything from baseball players to whiskey bootleggers to Confederate soldiers. This fall, he's back with a new 10-part documentary for PBS, The Vietnam War, created with longtime collaborator Lynn Novick. In June, Reason's Nick Gillespie sat down with Burns to discuss the project.

Q: Why should we be talking about Vietnam now?

A: We think it's the most important event in American history in the second half of the 20th century. If we want to understand the political divisions and the lack of civil discourse that bedevil us today, the seeds of that were planted in Vietnam. If you could unpack the fraudulence of the conventional wisdom, and repack it benefiting from the testimony of people who lived through it and the recent scholarship that has taken place—and also triangulate with the South Vietnamese and North Vietnamese perspectives, which are almost always left behind—you have an opportunity to understand it better and maybe pull out some of these fuel rods of discourse.

Q: In one scene from the documentary, General Westmoreland goes on TV and says, "I can give you a bunch of statistics on how we're winning," and it's like, "We're shooting this many bullets." But there was no real indicator of what success would look like.

A: Part of the tragedy is that many people in government, at high policy levels, understood this and did not reflect it. You can hear in the tapes the anguish of [Lyndon Johnson] or the anguish of Richard Nixon, and then they both go out the next day and say the exact opposite of what's going on.

Q: Looking back, the access that reporters got to troops in the field during Vietnam is stunning.

A: This is the key ingredient. During World War II, [press access] was really limited. In Vietnam, you got your credentials, you promised not to betray ongoing operations, and you were free. What happened is that the Vietnam War revealed itself to the press and they reported it back to the United States. What the military learned is: We're not doing that anymore. So the "embed" idea is a way of babysitting a journalist. You're not going to ever get to watch [the war] as Morley Safer did: soldiers burning a village in retaliation for the fact that they'd received some fire from there, and then quite frankly saying, "We have no feeling for these people," even though the obvious calculus is that if you destroy the village you are creating more enemies.

Q: In one episode, the Marine Karl Marlantes says, "Think about how many times we get ourselves into scrapes as a nation because we are always the good guys. Sometimes I think that if we thought we weren't always the good guys, we might actually get into less wars." Has America changed its self-image when it comes to military interventions?

A: You know, we learned some lessons, and the military was very anxious to apply those lessons with Desert Storm [the 1991 Gulf War]: to have a very clear rationale, a very clear sense of beginning, and middle, and end.

Q: And the importance of having a wide, multinational consensus.

A: Right, and we had a clear enemy who had done a bad thing, invaded another country, so it hearkened back to other very clear, delineated world[...]