2016-12-10T07:00:00-05:00

I last visited McDowell County, West Virginia, over 40 years ago. Even then, I was already an outsider, a visitor to my family's past. Sometime around 1950, my grandparents and all six of their grown children pulled up stakes and left McDowell behind. My grandfather bought a dairy farm 100 miles away in Clinchburg, Virginia, and my father joined him after he left the Air Force in the mid-1950s.

The house I grew up in didn't have a bathroom until I was 5. My sisters and I bathed in a zinc washtub using water warmed on the chunk burner in our kitchen. Since our house was heated entirely by two wood-burning stoves, I spent a good portion of my summers chopping and stacking cordwood. My upstairs bedroom was unheated, so I slept in a cast-iron bed beneath three heavy unzipped U.S. Army canvas sleeping bags to stay warm. We got a telephone when I was 13 years old; it was a five-party line.

But it was the folks in McDowell—including many of my relatives—whom I thought of as poor. To my eyes, a huge number of the houses we drove past in hamlets like Squire, Cucumber, English, Bradshaw, Beartown, and Iaeger on our way to visit my father's hometown of Panther were little more than shacks. Many were covered with tarpaper. Indoor bathrooms and running water were luxuries. The houses that did have bathrooms more often than not simply ran a pipe from their sinks, tubs, and commodes directly to the nearest stream. My grandparents' old home was a nice, fairly spacious white clapboard house, but they got water from an outside hand pump and resorted to a first-class outhouse to answer nature's call. The water tasted distinctly of iron and sulfur. Except deep inside Panther State Forest, where the Bailey family held our annual Labor Day reunion, coal dust coated most buildings and automobiles.
I do not long for the chilly, dusty, impoverished life I remember—my experience of the past is whatever the opposite of nostalgia is—but in retrospect, I was witnessing the tail end of McDowell's golden era. Mechanization, especially the development of continuous mining machines, enabled coal companies to mine much more coal with far fewer workers. Out of a population of nearly 100,000 in 1950, 15,812 worked as miners. By 1960 that number was just 7,118. Today there are only about 1,000 employees working for coal companies in the county, out of a population of less than 20,000. The county's dwindling economic prospects were further devastated by massive floods in 2001 and 2002 that destroyed hundreds of houses and businesses and killed four people.

In recent years, McDowell has attracted attention for the worst possible reasons. It consistently shows up at the bottom of rankings, with the lowest level of employment and the worst overall health in West Virginia, and the shortest male life expectancy in the nation. And it sits very near the top of lists of counties with the most drug overdoses, obesity, and suicides. The rather unsentimental question I set out to answer as I made my way back this autumn was: Why don't people just leave?

Bad News

One sign things are not going well in your county is when the kids in the social service programs know to do pre-emptive damage control with the press. "Don't you focus just on the negative," warned Destiny Robertson, a spunky African-American senior at Mount View High School and a participant in the Broader Horizons program for at-risk kids devised by the Reconnecting McDowell task force. But it's hard not to focus on the negative when it can seem like that's all there is. Asked about their hometown, the kids shout out the usual list of woes with a world-weary attitude: bad schools, no jobs, drug addiction.
They're right to worry: McDowell County has been the iconic symbol of poverty in America ever since the 1960 presidential campaign, during which then–Sen. John F. Kennedy visited the county four times. In his May 3, 1960, speech in the town of Welch, Kennedy cited the collapse of employment in the coal industry and declared that had President Eisenhower "come to McDowell Coun[...]
2016-12-09T13:30:00-05:00

No matter who's running the government, America's "transition to a clean energy economy is irrevocably underway," the Natural Resources Defense Council asserted in its Accelerating into a Clean Energy Future report this week. Report co-author Ralph Cavanagh added, "The nationwide momentum for pollution-free energy is undeniable and irresistible because clean energy now costs less than dirty energy."

As if to confirm the Council's claim, the infotech giant Google announced this week that, by the end of next year, its global operations will be fueled 100 percent by electricity generated from renewable sources. ("The science tells us that tackling climate change is an urgent global priority," the company's press release explained.) This does not mean that Google gets its electricity directly from solar panels on the roofs of its data centers or from wind turbines churning away on its corporate campuses. The company basically makes purchasing commitments to renewable projects that offset the conventionally generated electricity that it gets from local utilities.

There is, of course, nothing wrong with a business legally adopting measures that it thinks are in the best interests of its customers and shareholders. If the company is on the wrong track, those stakeholders will let Google's executives know through their purchasing and investment choices. But if clean energy really does cost less than dirty energy, then what is there to resist? In that case, surely the invisible hand of the marketplace will make the transition to a clean-energy economy irrevocable. So can we all put aside our worries about catastrophic climate change?

Not so fast. You see, policies are needed. Google notes that during "the last six years, the cost of wind and solar came down 60 percent and 80 percent, respectively, proving that renewables are increasingly becoming the lowest cost option."
Yet even as proponents insist that clean energy now outcompetes fossil fuels, they nevertheless want to enhance its irrevocability with a little help from the government. As Google obliquely puts it, "We believe the private sector, in partnership with policy leaders, must take bold steps."

What might that "partnership" look like? Google doesn't say, but you can get a sense of what might be required by reading From Risk to Return, a new report from the Risky Business Project. This group is supported by the media mogul Michael Bloomberg, the Bush-era treasury secretary Henry Paulson, and the hedge fund manager and prominent Democratic Party donor Thomas Steyer. Its report presents four pathways toward restructuring America's energy infrastructure, with the goal of cutting U.S. carbon dioxide emissions 80 percent by 2050. While the paper does not favor any of those four pathways—renewables, nuclear, carbon capture, and a mix—it focuses mostly on the costs and benefits of the fourth, which reduces emissions via a combination of renewables, nuclear, carbon emissions captured from fossil fuels, and the transformation of transportation toward reliance on electricity, hydrogen, and biofuels. By 2050, the report projects, the extra expenditures for building out low-carbon energy production and consumption infrastructure would be more than offset by savings on fuel.

The authors argue that clean energy is unfortunately not yet ready to compete head-on with fossil fuels. "The private sector alone cannot solve the climate change problem," the Risky Business report concludes. "We know from our collective business and investment experience that the private sector will take action at the necessary speed and scale only if it is given a clear and consistent policy and regulatory framework."

What sort of policies do they think are necessary? First and foremost, they want government to put a price on carbon emissions.
From their point of view, this would level the energy playing field. In addition, they rightly want to eliminate tax incentives for fossil fuel extraction, end subsidized flood insurance in high-risk areas, and lower regulatory a[...]
2016-12-02T13:30:00-05:00

Emma Morano turned 117 on Tuesday. The Italian woman is, as far as we know, the oldest person in the world and the only living person who was born in the 1800s. The secret of her longevity? Eating three raw eggs a day and being single since 1938. The person known to have lived the longest ever was Jeanne Calment, who died in 1997 at 122 years of age.

In October, Nature published an article, "Evidence for a limit to human lifespan," by three researchers associated with the Albert Einstein College of Medicine in the Bronx. Noting that the longest known lifespan has not increased since the 1990s, they argue that there is a fundamental limit to human longevity. The occasional outlier aside, they think that limit is about 115 years. Maybe, maybe not.

In the 21st century, almost everything that kills people, except for accidents and other unintentional causes of death, has been classified as a disease. Aging kills, so it's past time to declare it a disease too and seek cures for it. In 2015, a group of European gerontologists persuasively argued for doing just that. They rejected the common fatalistic notion that aging "constitutes a natural and universal process, while diseases are seen as deviations from the normal state." A century ago osteoporosis, rheumatoid arthritis, high blood pressure, and senility were considered part of normal aging, but now they are classified as diseases and treated. "There is no disputing the fact that aging is a 'harmful abnormality of bodily structure and function,'" they note. "What is becoming increasingly clear is that aging also has specific causes, each of which can be reduced to a cellular and molecular level, and recognizable signs and symptoms."

So why do people age and die? Basically, because of bad chemistry. People get cancer when chemical signals go haywire, enabling tumors to grow.
Heart attacks and strokes occur when chemical garbage accumulates in arteries and chemical glitches no longer prevent blood cells from agglomerating into dangerous clumps. The proliferation of chemical errors inside our bodies' cells eventually causes them to shut down and emit inflammatory chemicals that damage still-healthy cells. Infectious diseases are essentially invasions of bad chemicals that arouse the chemicals comprising our immune systems to try to destroy them—and, too often, fail.

Also in 2015, another group of European researchers pointed out that we've been identifying a lot of biomarkers for detecting the bad chemical changes in tissues and cells before they produce the symptoms associated with aging. Such biomarkers enable pharmaceutical companies and physicians to discover and deploy treatments that correct cellular and molecular malfunctions and nudge our bodies' chemistry back toward optimal functioning. As a benchmark, the researchers propose the adoption of an "ideal norm" of health against which to measure anti-aging therapies. "One approach to address this challenge is to assume an 'ideal' disease-free physiological state at a certain age, for example, 25 years of age, and develop a set of interventions to keep the patients as close to that state as possible," they suggest.

Most people's body chemistry is at its best when they are in their mid-twenties. In fact, Americans between ages 15 and 24 are nearly 500 times less likely to die of heart disease, 100 times less likely to die of cancer, and 230 times less likely to die of influenza and pneumonia than people over the age of 65. For lots of us who are no longer in our twenties, television talk show host Dick Cavett summed it up well: "I don't feel old. I feel like a young man that has something wrong with him."

Meanwhile, lots of progress has been made toward ameliorating many of the diseases whose prevalence increases with aging.
For example, the five-year survival rate for cancer patients in 1975 was 50 percent; today it is about 68 percent. The annual rates of heart disease and strokes in the U.S. have fallen from 500 and 130 per 100,000 respectively in 1970 to[...]
2016-11-25T13:30:00-05:00

Some 1.2 billion people do not have access to electricity, according to the International Energy Agency's World Energy Outlook 2016 report. About 2.7 billion still cook and heat their dwellings with wood, crop residues, and dung. In its main scenario for the trajectory of global energy consumption, the IEA projects that in 2040, half a billion people will still lack access to electricity and 1.8 billion will still be cooking and heating by burning biomass.

The agency defines the initial threshold for modern energy access as 250 kilowatt-hours (kWh) per year for rural households and 500 kWh for urban ones. How much is that? "In rural areas, this level of consumption could, for example, provide for the use of a floor fan, a mobile telephone and two compact fluorescent light bulbs for about five hours per day," the IEA explains. For comparison, in 2015 the average annual electricity consumption for a U.S. household was 10,812 kWh—43 times the IEA's energy access threshold for rural households.

In September the United Nations issued 17 new sustainable development goals that are supposed to be achieved by 2030. Universal access to affordable and clean energy is number 7. To achieve this goal, the U.N. says countries can "accelerate the transition to an affordable, reliable, and sustainable energy system by investing in renewable energy resources, prioritizing energy efficient practices, and adopting clean energy technologies and infrastructure."

The transition to renewable energy resources in poor countries was discussed in "Scaling of Innovative Solutions for Mitigation and Adaptation," a side event at the U.N. climate change conference in Marrakech, Morocco, last week. The panel highlighted the distribution of solar lanterns to poor households in Africa and of small solar panels that can be used for lighting and to recharge mobile phones.
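The magnitudes in the IEA's illustration are easy to sanity-check. The wattages in the sketch below are assumptions chosen for illustration (the agency does not publish them), not IEA figures:

```python
# Rough plausibility check on the IEA's 250 kWh/year rural access threshold.
# Appliance wattages are illustrative assumptions, not IEA data.
appliances_watts = {
    "floor fan": 50,
    "mobile phone charger": 5,
    "CFL bulb 1": 15,
    "CFL bulb 2": 15,
}

total_watts = sum(appliances_watts.values())           # 85 W with everything running
hours_per_day = 5
annual_kwh = total_watts * hours_per_day * 365 / 1000  # watt-hours -> kWh

print(f"{annual_kwh:.0f} kWh/year")                    # ~155 kWh, under the 250 kWh threshold
print(f"US household multiple: {10812 / 250:.0f}x")    # the 43x figure cited above
```

Under these assumed wattages, the fan-phone-and-two-bulbs bundle comes in comfortably below the 250 kWh line, which is the point: the threshold describes a very modest standard of living.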
Giving poor people access to such technologies is certainly better than nothing, but it still leaves them mired in energy poverty.

The eco-modernist Breakthrough Institute takes a very different view from the U.N. in a new report, Energy for Human Development. Eco-modernists argue that through technological progress humanity will increasingly withdraw from nature, enabling a vast ecological restoration over the course of this century. The Breakthrough report rejects any approach based around small-scale energy projects aimed chiefly at supplying tiny amounts of electricity to millions of subsistence farmers. "There is no nation on earth with universal electricity access that remains primarily agrarian," the authors note. "Modern household energy consumption has historically been achieved as a side effect of electrification for non-household purposes such as factories, electrified transportation, public lighting, and commercial-scale agriculture." Rural electrification has always come last, after urbanization and economic development have taken off. For example, in the U.S. nearly 90 percent of city dwellers had electricity by the 1930s, but only 10 percent of rural Americans did.

Given this universal growth dynamic, the Breakthrough writers call for prioritizing energy development for productive, large-scale economic enterprises. Copious and reliable energy will accelerate the creation and spread of higher-productivity factories and businesses, which will generate the opportunities for a better life; that, in turn, will draw poor subsistence farmers into cities. They further note that energy access and electricity access are not the same thing. In fact, in 2012 electricity accounted for only about 18 percent of the energy consumed globally. "Efforts to address energy poverty must address needs for transportation fuels and infrastructure, and for fertilizer and mechanization of agriculture," they argue.

But what about climate change?
Current renewable sources of energy are not technologically capable of lifting hundreds of millions of people out of energy poverty. Consequentl[...]
2016-11-25T12:00:00-05:00

"Transhumanism is becoming more respectable, and transhumanism, with a small t, is rapidly emerging through conventional mainstream avenues," Eve Herold reports in her astute new book, Beyond Human (Thomas Dunne). While Transhumanism, with a capital T, is the activist movement that advocates the use of technology to expand human capacities, lower-case transhumanism refers to the belief or theory that the human race will evolve beyond its current physical and mental limitations, especially by means of deliberate technological interventions. As the director of public policy research and education at the Genetics Policy Institute, Herold knows these scientific, medical, and bioethical territories well.

Movements attract countermovements, and Herold covers the opponents of transhuman transformation too. These bioconservatives range from moralizing neocons to egalitarian liberals who fear that new technologies somehow threaten human dignity and human equality. "I began this book committed to exploring all the arguments, both for and against human enhancement," she writes. "In the process I have found time and again that the bioconservative arguments are less than persuasive."

Herold opens with a tale of Victor Saurez, a man living a couple of centuries from now who at age 250 looks and feels like a 30-year-old. Back in the dark ages of the 21st century, Victor was ideologically set against any newfangled technologies that would artificially extend his life. But after experiencing early-onset heart failure, he agreed to have a permanent artificial heart implanted because he wanted to know his grandchildren. Next, in order not to be a burden to his daughter, he decided to have vision chips installed in his eyes to correct blindness from macular degeneration.
Eventually he agreed to smart guided nanoparticle treatments that reversed the aging process by correcting the relentlessly accumulating DNA errors that cause most physical and mental deterioration.

Science fiction? For now. "Those of us living today stand a good chance of someday being the beneficiaries of such advances," Herold argues. Consider artificial hearts. The SynCardia Total Artificial Heart (TAH) now completely replaces the natural heart and is powered by batteries carried in a backpack. As of this month, 1,625 TAHs have been implanted; one person lived with one for 4 years before receiving a donor heart. In 2015, an ongoing clinical trial began in which 19 patients received permanent TAHs. Herold goes on to describe pioneering research on artificial kidneys, livers, lungs, and pancreases. "Artificial organs will soon be designed that are more durable and perhaps more powerful than natural ones, leading them to become not only curative but enhancing," she argues. In the future, people will be loaded up with technologies working to keep them healthy and alive.

(One troubling issue this raises: What do we do when someone using such biomedical technologies chooses to die? Who would actually be in charge of deactivating those technologies? Would the law treat deactivation by a third party as tantamount to murder? In such cases, something akin to today's legalized physician-assisted dying may have to be sanctioned.)

Artificial organs have considerable competition too. Herold, unfortunately, does not report on the remarkable prospects for growing transplantable human organs inside pigs and sheep. Nor does she focus much attention on therapies using stem cells that could replace and repair damaged tissues and organs. But such research supports her view that biotechnologies, information technologies, and nanotechnologies are converging to yield a plethora of curative and enhancing treatments.
The killer app of human enhancement is agelessness—halting and reversing the physical and mental debilities that befall us as we grow old. Herold focuses a great deal of attention on the development of nanobots that would patrol the body to repair and remove th[...]
2016-11-11T13:30:00-05:00

"The only antidote to decades of ruinous rule by a small handful of elites is a bold infusion of popular will. On every major issue affecting this country, the people are right and the governing elite are wrong," wrote Donald Trump in an April op-ed for The Wall Street Journal. That could be a textbook definition of populism.

Two political scientists—J. Eric Oliver of the University of Chicago and Wendy M. Rahn of the University of Minnesota—define populist rhetoric as a style "that pits a virtuous 'people' against nefarious, parasitic elites who seek to undermine the rightful sovereignty of the common folk." They add, "A populist moment requires the right rhetoric spoken by the right person to the right audience at the right time. And, as we look to the data, the 2016 election has all the hallmarks of a populist moment." Tuesday's vote proved them right.

Consider a post-election television interview of Trump voters from central Virginia, where I live. "People want the people to be in control of the country, not the politicians," Greene County resident Chad Aylor told WCAV-TV. The report noted that "for Aylor, voting Trump was not against Clinton or voting Republican. Aylor said he considers himself voting as an American." Trump trounced Clinton 62 to 31 percent in the mostly rural county.

So why is 2016 the right time for populism? University of Georgia political scientist Cas Mudde argues that "populism is an illiberal democratic response to undemocratic liberalism." Populism, he suggests, criticizes "the exclusion of important issues from the political agenda by the elites." Oliver and Rahn agree, proposing that populism arises "when existing political parties are not responding to the desires of large sections of the electorate." This opens up a "representation gap" that can be exploited by a would-be populist leader. Oliver and Rahn measure this gap with polling data.
People are asked whether they agree with such statements as "Public officials don't care much about what people like me think" and "People like me don't have any say about what the government does." They found higher percentages of Americans agreeing with such sentiments as the 21st century advanced. They also compared those responses to the degree of partisan conflict in Congress, which they gauged by tracking the number of strict party-line votes. The idea is that the representation gap grows as the distance between the major parties' core supporters and swing voters grows larger. They find that such a gap opened in the early 1990s, which saw the entrance of such populists as Ross Perot and Pat Buchanan. The representation gap in 2016 was even larger.

To get a handle on the degree of populism represented by U.S. presidential candidates from both major parties, Oliver and Rahn analyzed the announcement speeches of the top seven candidates: Trump, Bernie Sanders, Ted Cruz, Marco Rubio, John Kasich, Hillary Clinton, and Ben Carson. They looked for "anti-establishment" and "blame" language, along with rhetoric signifying the creation of a unified people. In addition, they measured the simplicity and "everydayness" of the candidates' language—basically, shorter sentences and shorter words.

Trump and Sanders used considerably more blame language, with Sanders focusing more on economic grievances and Trump on political transgressions. Trump made more "we-they" contrasts and pointed more often to international threats than the other candidates did. Trump and Kasich used fewer words longer than six letters, and Trump's sentences averaged 10 words in length. Sentences from Sanders and Rubio were twice as long. "Trump scores high in targeting political elites, blame language, invoking both foreign threats and collective notions of 'our' and 'they,' and the simplicity and repetition of his language," report Oliver and Rahn.
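The simplicity measures just described—average sentence length and the share of long words—are straightforward to approximate. The sketch below is a rough illustration using assumed tokenization rules, not Oliver and Rahn's actual coding scheme:

```python
import re

def simplicity_metrics(text):
    """Approximate two 'everydayness' proxies: mean sentence length in
    words, and the share of words longer than six letters. Tokenization
    here (split on terminal punctuation, words as letter runs) is a
    simplifying assumption."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    avg_sentence_length = len(words) / len(sentences)
    long_word_share = sum(len(w) > 6 for w in words) / len(words)
    return avg_sentence_length, long_word_share

# Short declarative sentences score as "simpler" on both measures.
avg_len, long_share = simplicity_metrics(
    "We are going to win. We will make our country great again."
)
print(avg_len, long_share)  # 6.0 words/sentence; 1 of 12 words is "long"
```

On such measures, a speech of terse, repetitive sentences scores low on both axes, which is the pattern the researchers report for Trump's announcement speech.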
In contrast, they find that Sanders uses more complex locutions and f[...]
2016-11-04T13:30:00-04:00

The Paris Agreement on Climate Change comes into effect today, November 4. Auspiciously for climate warriors, that is just three days before the next United Nations climate change conference opens in Marrakech, Morocco. The Marrakech negotiations, held in conjunction with the Conference of the Parties to the U.N. Framework Convention on Climate Change (COP-22), will be the first meeting of the parties to the Paris Agreement, known in UN-speak as CMA1. Countries that have not yet joined the Paris Agreement can attend and participate in the CMA sessions, but only as observers. They do not have decision-making authority.

The goal of the Paris Agreement, adopted last December, is to hold "the increase in the global average temperature to well below 2°C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5°C above pre-industrial levels." Current global temperature hovers around 1°C above the pre-industrial average. Each party to the agreement has submitted its nationally determined contribution (NDC) to the United Nations explaining how it plans to help keep the climate from growing excessively warm. The U.S. NDC involves reducing economy-wide greenhouse gas emissions by 26 to 28 percent below their 2005 level in 2025 and making best efforts to reach the full 28 percent reduction.

The really interesting feature of the Paris Agreement is that the nationally determined contribution plans filed by each of its signatories are voluntary. Its legally binding sections are chiefly mandates that countries provide and periodically update the levels of greenhouse gases they are emitting. This reporting requirement is not much different from our obligations under the previously ratified U.N. Framework Convention on Climate Change. Basically, countries are supposed to tell the others how they are doing.
A critical part of President Obama's climate change strategy is the Environmental Protection Agency's Clean Power Plan (CPP), which would require carbon dioxide emissions from electric power generation to fall 30 percent below 2005 levels by 2030. The U.S. Supreme Court has stayed its implementation, and a number of states have filed federal lawsuits to stop the program.

Meanwhile, thanks largely to massive new supplies of cheap shale gas liberated by fracking, U.S. carbon dioxide emissions for the first six months of this year were the lowest since 1991. (Lots of electric power generators have switched from coal to natural gas, which emits only about half as much carbon dioxide.) As a consequence, greenhouse gas emissions in 2014 were 9 percent below 2005 levels. (In 2010, President Obama pledged to cut U.S. emissions by 17 percent below 2005 levels in 2020. Only four years to go!)

So how likely is the U.S. to fulfill Obama's greenhouse-gas pledges? Keeping firmly in mind that energy modeling is largely an exercise in policy fiction and wishful thinking, let's take a look at some projections of future U.S. energy use and greenhouse emissions. A new study by the ICF International consultancy, commissioned by the American Petroleum Institute, modeled how the power sector would evolve if market forces determined the fuel generation mix and new capacity additions—as opposed to government-mandated choices under the Clean Power Plan. The analysts found that carbon dioxide emissions from the power sector would drop by 30 percent from 2005 levels by 2030—deeper cuts than those that would result from CPP regulations. This outcome, however, depends on high-end estimates of how much recoverable shale gas there is. In either the market or mandate scenario, coal continues to fade as a source of energy.
Another analysis—by researchers at the Lawrence Berkeley National Laboratory, published in the October issue of Nature Climate Change—calculated that current policies being pursued by the Obama administration will fall far shor[...]
In July, the Zika virus spread to Florida. The outbreak prompted the U.S. Centers for Disease Control and Prevention to issue its first ever health-related travel advisory for the mainland United States: Because of the disease's link to fetal birth defects, the agency warned pregnant women and women who may become pregnant to avoid certain Miami neighborhoods. Municipalities have tried to control the Aedes aegypti mosquito, which carries the virus, by spraying. This has proven ineffective, perhaps because the species has become resistant to the insecticides being used.
But all isn't lost. There is another technology that could reduce the numbers of these deadly bugs. Developed by the British company Oxitec, the genetically engineered Friendly™ GMO mosquitoes could be deployed to spread a gene that is lethal to the larvae of the disease-carrying pests. In Brazil, the release of Oxitec's mosquitoes reduced the transmission of dengue fever by more than 90 percent.
The same week the Florida outbreak started, the Cayman Islands government approved the release of these insects to control the Zika virus there. Unfortunately, the Food and Drug Administration, bowing to the demands of anti-biotech activists, waited more than five years to finally approve them in August.
2016-11-01T12:00:00-04:00

St. Petersburg, Russia—Did legislation in the United Kingdom and the United States inspire Russian authorities to adopt strong new domestic spying laws?

Russian President Vladimir Putin signed the anti-encryption and data monitoring "Yarovaya Law" on July 7. Named after Irina Yarovaya, the ultraconservative legislator who pushed for it, the legislation is styled as an "anti-terrorism" measure. Among other things, it mandates that internet service providers and other telecommunications companies store all telephone conversations, text messages, videos, and picture messages for six months. In addition, telecom companies must retain customers' metadata—that is, information about with whom, when, for how long, and from where they communicated—for three years. Under the Yarovaya Law, providers of telecommunication services—such as messenger apps, social networks, email clients, and websites that encrypt their data—are required to help Russia's Federal Security Service decipher messages sent by users. In other words, the new law essentially requires internet service providers and other tech firms to install back doors in their services. The fine for refusing to cooperate can be as high as a million rubles (more than $15,000).

In order to comply, telecom firms operating in Russia claim that they will have to build vast new data storage infrastructure costing many times more than they now make in profits. They also point out that most of the required data storage technologies are manufactured outside of Russia. And they plausibly argue that the new rules will bring information technology investment and innovation in the country to a halt. From the authorities' point of view, the fact that most Russian telecoms will not be able to comply with the Yarovaya Law is a feature, not a bug.
As the U.S.-based Electronic Frontier Foundation notes, those companies are now "de facto criminals," giving the Russian government "the leverage to extract from them any other concession it desires." Russia Direct, a website funded chiefly by the government newspaper Rossiyskaya Gazeta, tellingly observes that "in Russia, the legislation is compared to the USA Patriot Act."

No doubt the extensive capabilities exercised in secret by America's National Security Agency (NSA) and disclosed by whistleblower Edward Snowden in 2013 elicited considerable professional envy among Russian spy agencies. Those revelations did provoke alarm among civil libertarians at home, prompting Congress to pass the USA Freedom Act last year, ending the agency's clandestine bulk collection of Americans' telecommunications data. But some analysts argue that, even with the new law, when it comes to the NSA's domestic spying, not much has actually changed.

There is one big difference between what's happening in the U.S. and what may soon be allowed in Russia: "While the Patriot Act prescribed covert surveillance of citizens, the new so-called 'Yarovaya Law' mandates open surveillance," the Russia Direct article continues. In other words, Putin is implementing what some lawmakers in the United States and the United Kingdom have long advocated.

For example, Britain's Investigatory Powers Bill, nicknamed the "Snooper's Charter," sets up a review process that will likely end up authorizing the bulk collection and retention of telecommunications and internet metadata. It was passed in June—before the Russian Duma passed its new domestic spying law—by an overwhelming majority in the House of Commons, and is now under consideration by the House of Lords. (In July, the Court of Justice of the European Union ruled that Britain's data retention mandates violate the privacy of its citizens. But Brexit will make such rulings moot.)
The bill "will fundamentally shift the relationship between citizen and state, allowing mass interception and mass hack[...]
2016-10-28T13:30:00-04:00

"Extremism in the defense of liberty is no vice," Barry Goldwater famously declared as he accepted the Republican presidential nomination in 1964. I have never really understood why some people thought those words were beyond the pale. Yet some more anxious and thoughtful souls worry that such "extremism" might provoke people to engage in murderous mayhem. So how does political extremism play out in American politics? One new study shows that extremist arguments can be effective in shifting public policy debates. Another reports that voters do not penalize more ideologically extreme presidential candidates.

"Exposing people to extreme conservative policies makes them more likely to prefer moderate conservative policies relative to liberal ones, and vice versa," reports the New York University political scientist Gábor Simonovits. Simonovits' study, which was published in the journal Political Behavior, is basically an empirical confirmation of how the "Overton Window of Political Possibilities" works. As Simonovits explains, Joseph Overton was a libertarian policy analyst at the Mackinac Center who "argued that the range of policies or opinions deemed acceptable by the public is in a constant flux and can be shifted by introducing and defending ideas not yet 'on the table.'" Or as Daily Kos blogger David Atkins once summarized it: "You win policy debates by crafting arguments for extreme positions—and then shifting the entire window of debate."

Simonovits tested this claim by deploying surveys that made statements about various policy proposals along the standard left-right political spectrum. Some were designed to be more centrist, others more extreme. (He defines extreme positions as "those far from the policies that mainstream political actors stand for in a given time and place.") The goal was to see how exposure to the extreme policy proposals affects the participants' views.
In one survey, participants read centrist liberal and conservative policy statements saying it should be made somewhat easier or harder to immigrate. Extreme liberal and conservative versions stated that immigration should not be limited at all or should be banned, respectively. In another survey, the centrist policy statements proposed slight increases or decreases in welfare; the extreme ones said welfare should be radically increased or entirely abolished. In another, the centrist statements argued that abortion should be legal in most cases or illegal in most cases; the extreme positions held that it should be either always legal or entirely banned. In the last one, the centrist arguments held that the federal minimum wage should be increased to $10.10 or kept at $7.25; the policy extremes involved increasing it to $15.10 or reducing it to $5.10.

After running these surveys on about 4,000 participants, Simonovits reports, "Moderate conservative policies were perceived as more centrist when an extremely conservative alternative was introduced; likewise, respondents rated moderate liberal alternatives as significantly more centrist when an extremely liberal policy was added to the choice set." For example, respondents exposed to the proposal that welfare be abolished were more likely to see the idea of somewhat reducing welfare spending as more centrist than the idea that it should be increased somewhat. Simonovits found the same effect for each of the policy statements he surveyed. The relatively small effects in these one-off surveys suggest that political entrepreneurs have a real hope that by continuously hammering their issues they can eventually get them "on the table" of mainstream policy discourse.

The received wisdom of the pundit class is that American presidential candidates must run to the center in order to win, because wary voters punish candidates who are perc[...]
2016-10-21T13:30:00-04:00

"We've been busing people in to deal with you fucking assholes for 50 years, and we're not going to stop now," the Wisconsin Democratic operative Scott Foval declares in Rigging the Election, a video released this week by the conservative undercover-media activist James O'Keefe. In the video, Foval drunkenly discusses how to pull off a voter impersonation fraud scheme by sending folks with fake IDs to vote in neighboring states. The indiscreet Foval has since lost his job.

Republican presidential candidate Donald Trump invited O'Keefe to attend the third presidential debate in Las Vegas. During the debate, Trump refused to say whether or not he would concede if he lost the vote in November, insinuating that there is a conspiracy to rig the election against him. "The O'Keefe videos will add some evidence to Trump's claims about a rigged election," says Joe Uscinski, a political scientist at the University of Miami. "They will give him some red meat to throw around."

When asked how he thinks the public will respond to the O'Keefe videos, the Western Washington University political scientist Todd Donovan replied in an email, "My guess is that the viewers will respond to it through their partisan perspectives. It reinforces pre-existing Republican attitudes; Democrats will see the source and assume it's a hack job of editing." In a prepublication study, "The Effect of Conspiratorial Thinking and Motivated Reasoning on Belief in Election Fraud," Uscinski and his colleagues point out that significant proportions of both major parties believe that electoral fraud is common. "Republicans are especially prone to believing that people are casting ballots they should not, whereas Democrats are more concerned that they are not able to cast ballots," they write.
As evidence they cite a national poll taken in July 2012 in which 54 percent of Democrats believed that voter suppression was a major problem, compared to 27 percent of Republicans. On the other hand, 57 percent of Republicans believed that casting illegal ballots was a major problem, compared to 38 percent of Democrats. "Electoral fraud is a form of conspiracy theory," Uscinski tells me. "And like any other conspiracy theory it is hard to disprove. Evidence that the plot didn't happen actually works in favor of the conspiracy theory: 'Look how hard they're working to cover it up.'"

How common is electoral fraud? As Uscinski notes, since the would-be perpetrators do not want their schemes to be detected, voter fraud is by definition hard to measure. Nevertheless, most scholars have concluded that voter fraud, especially voter impersonation fraud of the sort that Foval appeared to be discussing, is rare in American elections. Uscinski thinks scholars probably undercount instances of voter fraud because undetected successful instances don't get tallied. But he also thinks such frauds are vastly overestimated in the popular imagination. Keeping a national electoral fraud scheme hidden would be exceedingly hard to do, Uscinski points out: It would be a huge coordination problem involving lots of people in very uncertain circumstances with many opportunities for blunders.

Donovan agrees. In an email, he writes: "Even if we take at face value the 'description' on the edited video of how to commit fraud, the execution wouldn't be possible. It would require thousands of voters per state (tens of thousands?) to affect these elections. Renting cars in dozens of states to move voters to dozens of Republican controlled states, where they would have fake addresses to vote under, would require 20,000 people or 200,000 people or even more people with rental cars (or each in a car bought at an auction?) and just as many fake addresses.
You would need to convince 20[...]
2016-10-14T13:30:00-04:00

Progress: Ten Reasons to Look Forward to the Future, by Johan Norberg, Oneworld Publications, 256 pp., $27.99

Johan Norberg wrote his excellent new book Progress for three reasons. First, because something important happened. Second, because no one believes it. And third, because it's dangerous that they don't believe it. Norberg's book comprehensively documents the myriad ways the state of humanity has vastly improved over the past couple of centuries.

Global life expectancy was just 31 years in 1900. Now it has risen to over 71 years. In 1800, no country on earth had a life expectancy greater than 40 years. Now no country has a life expectancy under 40 years. And people aren't just living longer; they're living longer with fewer disabilities.

The World Bank has defined abject poverty as living on the equivalent of less than $2 per day. In 1800, when world population was around one billion, 94 percent of our ancestors lived in abject poverty. In 1990, some 37 percent of people still lived below the abject poverty line. Since then, the percentage of people on earth living in abject poverty has fallen below 10 percent. Global GDP increased as much in the past 30 years as it did in the previous 30,000 years. In 1986, global GDP stood (in inflation-adjusted terms) at $33 trillion. It now exceeds $73 trillion. Thirty years ago, global per capita GDP was $6,600. It is $10,000 today.

Being healthier has gotten cheaper, too. In 1900, for example, the infant mortality rate in countries with a per capita income of $1,000 was 20 per 100 live births. Today, in a country with exactly the same per capita income, the infant mortality rate is 7 per 100 births. "So even if a country had not experienced any economic growth in 100 years, infant mortality would have been reduced by two-thirds," Norberg writes. Spillovers in sanitation and medical knowledge help even the very poorest live longer and healthier lives.
We probably live at the most peaceful time in recorded history; your chances of being killed by another human being are far lower than in the past. For example, the annual homicide rate in medieval Europe was 32 people per 100,000. In the late 20th century, that rate dropped to about 1 per 100,000. Death rates in war have also fallen steeply, from 195 people per million in 1950 to 8 per million in 2013.

The environments in which people live, especially as countries become wealthy, have dramatically improved. Thanks to modern farming, the world is approaching peak farmland, which means that millions of acres of land will be reverting to nature over the course of this century. Composite air pollution levels in the U.S. are 63 percent lower than they were in 1980. In a recent talk at the Cato Institute, Norberg presented a graph showing the global progress made on hunger, poverty, illiteracy, child mortality, and U.S. pollution since 1990.

Norberg also writes intelligently about tradeoffs in the environmental arena. For example, he points out that spending $10 billion to build natural gas electric generation plants could help lift 90 million people out of poverty. Spending the same amount on renewable sources of electricity would help only 20 to 27 million people, leaving more than 60 million still living and dying in poverty.

Norberg also celebrates the progress made on boosting education. In 1800, only 12 percent of adults could read. As late as 1950, the global literacy rate was just 40 percent. It is now 86 percent. The literacy differential between men and women is also shrinking. Among those aged 15 to 24, the international female literacy rate is almost 96 percent of the male rate. Educating women is key to even faster progress. Study after study finds that enabling girls and women to comple[...]
2016-10-07T00:01:00-04:00

Americans remain deeply divided along partisan lines on the issue of climate change, according to a new Pew Research Center poll. Seven in 10 liberal Democrats trust climate scientists to give full and accurate information on the causes of climate change, whereas only 15 percent of conservative Republicans do. In addition, 54 percent of liberal Democrats believe that climate scientists have a good understanding of the causes of climate change, compared to only 11 percent of conservative Republicans. Liberal Democrats also believe that climate research reflects the best available evidence most of the time. Only 9 percent of conservative Republicans agree.

Why this partisan difference over what is essentially an empirical question? Some researchers have concluded that conservatives are less likely than liberals to be open-minded or to engage in effortful cognition when evaluating scientific evidence, especially when accepting those data means undermining their faith in free markets. But research from the Yale Cultural Cognition Project supports a different notion: This polarization tends to occur when accepting or rejecting a scientific thesis becomes a signal to your fellow partisans that you're on their side. For example, research by the Yale law professor Dan Kahan finds that as scientific literacy goes up, so too does partisan polarization on the issue of climate change. In other words, the more science people know, the better they are able to seek out and find information justifying their beliefs.

In a new study, Kahan and his colleagues assess the relationship between accepting the evidence for man-made global warming and a measure of actively open-minded thinking. Actively open-minded thinking is defined as the "willingness to search actively for evidence against one's favored beliefs, plans or goals and to weigh such evidence fairly when it is available."
In a survey, some 1,600 Americans were sorted by political orientation and their propensity toward actively open-minded thinking. Psychologists have devised various questionnaires that aim to measure an individual's propensity to engage in such salutary cognition; Kahan's survey used a seven-item scale that asked participants to rate their agreement with such statements as "allowing yourself to be convinced by an opposing argument is a sign of good character" and "changing your mind is a sign of weakness."

In the past, many researchers have argued that political conservatives tend to be deficient in actively open-minded thinking. Consequently, they contend that if for some odd reason a conservative did have such a disposition, he would be more likely to accept the scientific evidence in favor of climate change. In fact, the opposite occurred. Since most liberals in the survey already believed that there is solid evidence of recent global warming due mostly to human activity, the probability that they would accept that conclusion rose only modestly with higher actively open-minded thinking scores. On the other hand, the higher conservatives scored on actively open-minded thinking, the lower the probability they would agree that there is solid evidence for man-made global warming. The gap between liberals and conservatives on beliefs about climate change widens the more both engage in actively open-minded thinking.

What is going on? The researchers argue that "actively open-minded thinking in fact enhances the proficiency of reasoning aimed at forming identity-congruent beliefs." Actively open-minded thinkers are "simply better at screening information for identity-congruent inferences." In other words, sophisticated reasoning skills enable people to more easily find and[...]
2016-10-01T12:00:00-04:00

Algorithms are everywhere. You can't see them, but these procedures or formulas for solving problems help computers sift through enormous databases to reveal compatible lovers, products that please, faster commutes, news of interest, stocks to buy, and answers to queries. Dud dates or boring book recommendations are no big deal. But John Danaher, a lecturer in the law school at the National University of Ireland, warns that algorithms take on a very different profile when they're employed to guide government behavior. He worries that encroaching algorithmic governance, or what he calls algocracy, could "create problems for the moral or political legitimacy of our public decision making processes."

And employ them government agencies do. The Social Security Administration uses algorithms to aid its agents in evaluating benefits claims; the Internal Revenue Service uses them to select taxpayers for audit; the Food and Drug Administration uses algorithms to study patterns of foodborne illness; the Securities and Exchange Commission uses them to detect trading misconduct; and local police departments employ their insights to predict the emergence of crime hotspots.

Conventional algorithms are rule-based systems constructed by programmers to make automated decisions. Because each rule is explicit, it is possible to understand how and why the algorithm produces its outputs, although the continual addition of rules and exceptions over time can make keeping track of what the system is doing difficult in practice. Alternatively, so-called machine-learning algorithms (which are increasingly being deployed to deal with the growing flood and complexity of data that needs crunching) are a type of artificial intelligence that gives computers the ability to discover rules for themselves—without being explicitly programmed. These algorithms are usually trained to organize and extract information after being exposed to relevant data sets.
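The distinction is easy to see in miniature. The sketch below is purely illustrative: the numbers, the "flag large deductions" rule, and the training records are all invented for this example and describe no agency's actual system. The point is simply that a hand-written rule can be read and audited line by line, while a "learned" rule is whatever the training data happens to produce, so understanding it means interrogating the data rather than the code.

```python
# A hypothetical tax-audit flagging decision, two ways.

def rule_based_flag(income, deductions):
    """Explicit, human-written rule: the logic is visible and auditable."""
    return deductions > 0.5 * income  # flag unusually large deductions

def learn_threshold(history):
    """Toy 'machine learning': derive the flagging threshold from past
    (income, deductions, was_fraud) records instead of writing it down."""
    ratios = [d / i for i, d, fraud in history if fraud]
    return sum(ratios) / len(ratios)  # mean deduction ratio in fraud cases

# Invented training records: (income, deductions, confirmed fraud?)
history = [(50_000, 30_000, True), (80_000, 10_000, False),
           (60_000, 42_000, True), (90_000, 9_000, False)]
threshold = learn_threshold(history)  # 0.65, but only because of this data

print(rule_based_flag(70_000, 40_000))   # True: the explicit rule fired
print(40_000 / 70_000 > threshold)       # False: the learned rule did not
```

With four training records the learned threshold is still inspectable; with millions of records and thousands of interacting features, as in real machine-learning systems, that transparency disappears.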
It's often hard to discern exactly how the algorithm is devising the rules it's using to make predictions. While machine learning is highly efficient at digesting data, the answers it supplies can be skewed. In a recent New York Times op-ed titled "Artificial Intelligence's White Guy Problem," Kate Crawford, a researcher at Microsoft who serves as co-chairwoman of the White House Symposium on Society and Artificial Intelligence, cited several instances of these algorithms getting something badly wrong. In 2015 Google Photos' facial recognition app tagged snapshots of a couple of black men as gorillas, for example, and in 2010, Nikon's camera software made headlines for misreading images of some Asian people as blinking. "This is fundamentally a data problem. Algorithms learn by being fed certain images," Crawford noted. "If a system is trained on photos of people who are overwhelmingly white, it will have a harder time recognizing nonwhite faces."

But algorithmic misfires can have much more dire consequences when they're used to guide government decisions. It's easy to imagine the civil liberties implications that could arise from, say, using such imperfect algorithms to try to identify would-be terrorists before they act. Crawford cites the results of a May investigation by ProPublica into how the COMPAS recidivism risk assessment system evaluates the likelihood that a criminal defendant will reoffend. Although judges often take COMPAS risk scores into consideration when making sentencing decisions, ProPublica found that the algorithms were "particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants." In addition, "white defendants were mislabeled as low[...]
U.S. life expectancy now averages 78.8 years, according to the Centers for Disease Control and Prevention. Even better, a June 2015 National Bureau of Economic Research report finds that since 1992, the number of years of healthful life that Americans over age 65 can expect to enjoy has increased by 1.8 years. That study also found that disabled life expectancy had declined by 0.5 years.
This achievement—longer and healthier lives for most people—is largely the result of improved medical care. Cataract surgery and prophylactic treatments that prevent heart disease, such as medications to lower blood pressure, have significantly reduced the incidence of disabilities at younger ages.
"We identify the medical conditions that contribute the most to changes in healthy life expectancy," the Harvard-based researchers write. "The largest improvements in healthy life expectancy come from reduced incidence and improved functioning for those with cardiovascular disease and vision problems. Together, these conditions account for 63 percent of the improvement in disability-free life expectancy."