Subscribe: Ronald Bailey: Reason Magazine articles.
http://reason.com/staff/show/133.xml
Language: English
Updated: 2017-02-20T00:00:00-05:00

 



Bad News: The Government Wants to 'Help' Driverless Car Companies

2017-02-15T06:00:00-05:00

Google's parent company, Alphabet, revealed in December that an unaccompanied blind man had successfully traveled around Austin, Texas, in one of the company's cars, which had neither a steering wheel nor floor pedals. That same month, Alphabet announced that it is spinning off its self-driving vehicle technology into a new division called Waymo. Also in December, Uber launched an experimental self-driving ride-sharing service in San Francisco.

The future is rushing toward us. Unfortunately, the government wants to help.

In the case of Uber, the California Department of Motor Vehicles (DMV) was so eager to help that it ordered the company to shut down its service, declaring that its regulations "clearly establish that an autonomous vehicle may be tested on public roads only if the vehicle manufacturer, including anyone that installs autonomous technology on a vehicle, has obtained a permit to test such vehicles from the DMV."

Anthony Levandowski, head of Uber's Advanced Technology Group, responded by observing that "most states see the potential benefits" of self-driving technology and "have recognized that complex rules and requirements could have the unintended consequence of slowing innovation." By refraining from excessive regulation, added Levandowski, these jurisdictions "have made clear that they are pro technology. Our hope is that California, our home state and a leader in much of the world's dynamism, will take a similar view." Uber moved its self-driving fleet to Arizona.

The U.S. Department of Transportation (DOT) likewise wants to "accelerate the next revolution in roadway safety"—so in September, naturally, the agency issued a 116-page Federal Automated Vehicles Policy that outlines a 15-point design and development checklist applicable to the makers of automated cars. In case that was not enough help, the agency then issued a 392-page Notice of Proposed Rulemaking to mandate that all new light cars talk to each other using a very specific vehicle-to-vehicle (V2V) technology.

Instead, these rules are likely to slow down innovation and development. Compliance with the agency's 15-point safety assessment is supposedly voluntary, but woe betide any company that fails to file the proper paperwork. Even more worrying, the DOT is calling for a shift from the current regime, in which automakers self-certify that their vehicles meet safety standards, to a system where the agency tests and approves the product before it can go to market. This would bring all of the speed and efficiency of the federal government's drug approval process to the auto industry.

Plus, as Competitive Enterprise Institute researcher Marc Scribner points out, the safety benefits of the V2V mandate "will be trivial for the next 15 years, at which point far superior automated vehicle technology may be deployed to consumers." Self-driving cars equipped with autonomous collision avoidance technologies will likely provide all of the supposed benefits of V2V communications—and do it sooner. If the incoming Trump administration really wants to help, it'll get Washington out of the way and withdraw these two proposed rules.




High Population Density Just Might Be Good for You

2017-02-10T13:30:00-05:00

High population density might induce better habits, according to some new research at the University of Michigan. If so, that's good news for the residents of an ever more highly populated world—and a big surprise for a generation of social critics. "Popollution" threatens to destroy the planet, Larry Gordon warned in his 1982 presidential address to the American Public Health Association. "When we consider the problems of hunger, poverty, depletion of resources, and overcrowding among the residents of our planet today, the future of human welfare looks grim indeed," he declared. Overcrowding was a big concern for those 20th-century prophets of population doom. In 1962, National Institute of Mental Health researcher John Calhoun published an influential article, "Population Density and Social Pathology," in Scientific American. Calhoun had conducted experiments in which he monitored overcrowded rats. As population density increased, female rats became less able to carry pregnancies to full term—and they so neglected the pups that were born that most died. Calhoun also documented increasing behavioral disturbances among the male rats, ranging "from sexual deviation to cannibalism and from frenetic overactivity to a pathological withdrawal." All of these pathologies amounted to a "behavioral sink" in which infant mortality ran as high as 96 percent. Calhoun's work was cited both by professional researchers and by overpopulation popularizers. Gordon, for example, argued that "too many members of the human species are already being destroyed by violence in overpopulated areas in the same manner as suggested by laboratory research utilizing other animals." In his 1961 book The City in History, the anti-modernist critic Lewis Mumford cited "scientific experiments in rats—for when they are placed in equally congested quarters, they exhibit the same symptoms of stress, alienation, hostility, sexual perversion, parental incompetence, and rabid violence that we now find in the Megapolis." In The Pump House Gang (1968), the hipster journalist Tom Wolfe referenced Calhoun's behavioral sink: "Overcrowding gets the adrenalin going, and the adrenalin gets them hyped up. And here they are, hyped up, turning bilious, nephritic, queer, autistic, sadistic, barren, batty, sloppy, hot-in-the-pants, chancred-on-the-flankers, leering, puling, numb..." And in his 1968 screed The Population Bomb, the Stanford biologist Paul Ehrlich declared that he had come to emotionally understand the population explosion "one stinking hot night in Delhi" during a taxi ride to his hotel. "The streets seemed alive with people. People eating, people washing, people sleeping. People visiting, arguing, and screaming. People thrusting their hands through the taxi window, begging. People defecating and urinating. People clinging to buses. People herding animals. People, people, people, people....[T]he dust, the noise, heat, and cooking fires gave the scene a hellish prospect." The theme of dystopian overcrowding inspired many popular books in the 1960s and 1970s, note the London School of Economics historians Edmund Ramsden and Jon Adams. Among the texts they cite are Terracide (1970), by Ron M. Linton; My Petition for More Space (1974), by John Hersey; and the novels Make Room! Make Room! (1966), by Harry Harrison; Logan's Run (1967), by William Nolan and George Johnson; and Stand on Zanzibar (1968), by John Brunner. 
But now, in stark contrast to these visions of chaos and collapse, new research suggests that increased population density isn't a disaster at all. Indeed, it's channeling human efforts and aspirations in productive directions. So says a report by a team of researchers led by Oliver Sng, a social psychologist at the University of Michigan. In "The Crowded Life Is a Slow Life," a new paper published in the Journal of Personality and Social Psychology, Sng and his colleagues probe how life history strategies change as population densities increase. "Life history" involves the tradeoffs individual organisms mu[...]



Terrorism and Liberty in the Trump Era

2017-02-03T13:30:00-05:00

After the atrocities of September 11, 2001, President George W. Bush's approval rating soared from 50 to 90 percent. A month after the attacks, nearly 60 percent of Americans said they trusted the government in Washington to do what is right almost always or most of the time; that was the highest it had been in 40 years. In the weeks after 9/11, more than 50 percent were very to somewhat worried that they or a family member would be a victim of a terrorist attack. Keying off of these fears, various commentators stepped forward to sagely intone that the "Constitution is not a suicide pact." (I prefer "Give me liberty or give me death.") Evidently averse to potentially committing suicide, 74 percent of the country agreed that "Americans will have to give up some of their personal freedoms in order to make the country safe from terrorist attacks." In 2002, an ABC News/Washington Post poll reported that 79 percent of Americans agreed that it was "more important right now for the federal government to investigate terrorist threats even if that intrudes on personal privacy." Support for intrusive investigations purportedly aimed at preventing terrorist attacks fell to only 57 percent in 2013, shortly after Edward Snowden's revelations of extensive domestic spying by the National Security Agency (NSA). In the most recent poll, it has ticked back up to 72 percent.

Instead of urging Americans to exercise bravery and defend their liberty, our political leaders fanned fears and argued that we must surrender freedoms. The consequences included the creation of the Department of Homeland Security, the proliferation of metal detectors at the entrances of public buildings, the requirement to show government-issued IDs at more and more public venues, the increased militarization of our police forces, and tightened travel restrictions to neighboring countries where passports were once not required.

In October 2001, the House of Representatives passed the Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism (USA PATRIOT) Act just 15 minutes after its 315 pages of text were made available to members. This law eviscerated the Fourth Amendment's privacy protections, and a massive secret domestic spying operation run by the NSA was set up. (Years later, numerous reports by outside and government analysts found that surrendering our civil liberties had been useless, since NSA domestic spying had had "no discernible impact on preventing acts of terrorism.") The Central Intelligence Agency was authorized to torture suspected terrorists; that too proved not just illiberal but ineffective. According to a recent Brown University study, the Global War on Terror has cost $3.2 trillion, in addition to leaving nearly 7,000 American military personnel dead and scores of thousands wounded.

How would President Donald Trump react to a significant terrorist attack, especially one motivated by radical jihadist beliefs? In a rally-around-the-flag reaction, his approval rating could surge. It is theoretically possible that such a crisis would reveal Trump as a fierce defender of American liberties, but the signs all point in a more authoritarian direction. In a 2015 speech aboard the U.S.S. Yorktown, Trump argued for "closing that internet in some way" to prevent ISIS from recruiting people. "Somebody will say, 'Oh freedom of speech, freedom of speech,'" he said. "These are foolish people. We have a lot of foolish people."
When Apple refused the FBI's demand that it provide a backdoor to San Bernardino terrorist Syed Farook's iPhone, Trump asked, "Who do they [Apple] think they are? No, we have to open it up." He urged Americans to boycott Apple until it complied with the FBI's demand to decrypt the phone. More generally speaking, Trump has said that he tends "to err on the side of security" and that he thinks the NSA should collect Americans' phone records. He added, "I assume when I pick up my telephone, people are listening to my conver[...]



Energy Efficiency Mandates Are Worse for Poor Americans Than Energy Taxes

2017-01-27T13:30:00-05:00

When U.S. automakers met with President Donald Trump this week, they asked him to relax the vehicle fuel efficiency standards imposed by his predecessor. Just before Barack Obama left office, the Environmental Protection Agency issued a final determination that its Corporate Average Fuel Economy (CAFE) standard requiring fleet-wide fuel economy of 50.8 miles per gallon for new cars by 2025 was achievable. "At every step in the process the analysis has shown that the greenhouse gas emissions standards for cars and light trucks remain affordable and effective through 2025, and will save American drivers billions of dollars at the pump while protecting our health and the environment," said outgoing EPA head Gina McCarthy. Ratcheting up the mandatory energy efficiency standards for vehicles and appliances was a major part of Obama's effort to reduce greenhouse gas emissions. The Department of Energy calculated that the Obama administration's energy efficiency standards would save consumers more than $520 billion on electricity costs by 2030.

But not all consumers are alike. In a new study contrasting the effects on consumers of energy efficiency standards versus energy taxes, the Georgetown economist Arik Levinson notes that both function as a regressive tax, taking a larger percentage of a lower income and a smaller percentage of a higher income. His analysis aims to find out which is more regressive—in other words, which is worse for poor Americans. Levinson cites earlier research estimating that a gasoline tax would cost 71 percent less than the comparable CAFE policy per gallon of fuel saved. Meanwhile, a 2013 study calculates that CAFE standards cost more than six times as much as a corresponding gas tax for the same reduction in fuel consumption. In other words, if policy makers want people to use less fuel and drive more fuel-efficient cars, taxing gasoline is a much cheaper way to achieve that goal than mandating automobile fuel efficiency. Levinson concludes that "efficiency standards are, ironically, inefficient."

But would energy taxes be more regressive? Many analysts argue that while both hit low-income Americans, energy efficiency standards whack them less. Levinson disagrees. He argues that energy efficiency standards can be treated analytically as an equivalent to a tax on inefficient appliances and vehicles. Using data from the 2009 National Household Travel Survey, he compares the amount of gasoline consumed by Americans at various income levels. The poorest 5 percent (with annual incomes of under $10,000) consume an average of 247 gallons per year; for the richest 20 percent (over $100,000), the average is 991. Assuming a gasoline tax of 29 cents per gallon, the poor pay $71 per year, compared to $286 for the wealthy. Families with 10 times the income pay only four times more in fuel taxes. At the outset Levinson cites research that rejects the notion that consumers are shortsighted when it comes to purchasing more expensive vehicles and appliances that will save them money in the long run. He compares the consequences of a 29 cent per gallon gas tax with a notional CAFE standard "tax" on inefficient vehicles that would raise the same amount of revenue. Rich folks own more and larger vehicles and drive more miles than do poor Americans, so they would pay more in either gas taxes or CAFE "taxes." Another wrinkle makes CAFE standards even more regressive.
In 2012, the Obama administration set CAFE footprint standards based on vehicle size, with footprint determined by multiplying the vehicle's wheelbase by its average track width. Basically, a vehicle with a larger footprint has a lower fuel economy requirement than a vehicle with a smaller footprint. This means that gas guzzlers like full-sized Cadillacs can now meet their targets more easily than smaller Sonics can. Recall that under a 29-cent gas tax, the richest Americans pay an average [...]
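To make the regressivity comparison concrete, here is a minimal back-of-the-envelope sketch in Python. The gallons-consumed figures, income brackets, and the 29-cent tax are the ones quoted from Levinson above; the script itself is purely illustrative and is not Levinson's model.

```python
# Back-of-the-envelope sketch of the gas-tax burden comparison quoted above.
# Consumption and income figures are the ones cited from Levinson; this is
# illustrative arithmetic, not his actual model.

GAS_TAX_PER_GALLON = 0.29  # dollars per gallon, the notional tax in the comparison

groups = {
    "poorest 5% (incomes under $10,000)":  {"gallons": 247, "income": 10_000},
    "richest 20% (incomes over $100,000)": {"gallons": 991, "income": 100_000},
}

for name, g in groups.items():
    tax_paid = g["gallons"] * GAS_TAX_PER_GALLON
    share = tax_paid / g["income"] * 100  # tax as a share of (approximate) income
    print(f"{name}: about ${tax_paid:.0f} per year, roughly {share:.2f}% of income")

# Roughly $72 vs. $287 per year (the article rounds to $71 and $286):
# households with ten times the income pay only about four times the tax,
# which is why a flat per-gallon levy is regressive.
```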



Is President Trump a 'Climate Menace'?

2017-01-20T13:30:00-05:00

"Donald Trump is a climate menace, no doubt about it," asserts Greenpeace U.K. spokesperson John Sauven. "President-elect Donald Trump threatens our environment and we vow to fight him every step of the way," declares Kate Colwell from Friends of the Earth. The Union of Concerned Scientists Research Director Gretchen Goldman warns, "It is hard to imagine a Trump administration where science won't be politicized." It is the case that in a 2014 tweet Trump notoriously asked, "Is our country still spending money on the GLOBAL WARMING HOAX?" In 2012, Trump tweeted that the concept of global warming had been created by the Chinese to make American manufacturing noncompetitive. During the presidential campaign, he vowed that he would "cancel" the Paris Agreement on climate change. Being his usual consistently inconsistent self, Trump claimed during a Fox News interview last year that the Chinese tweet was a "joke," and he told The New York Times after the election that he would keep an "open mind" about the Paris Agreement. Yet none of Trump's cabinet picks seem to agree that man-made climate change is hoax. In the hearings for various cabinet nominees, Democrats have sought valiantly to unmask them as "climate change deniers." So far, not one has questioned the scientific reality of man-made global warming. On the other hand, they have tended not to be as alarmed as their interlocutors, and/or have failed to endorse the climate policies that Democrats prefer. Take Scott Pruitt. The Oklahoma attorney-general, nominated to run the Environmental Protection Agency, stated flatly: "I do not believe that climate change is a hoax." He added, "Science tells us the climate is changing and human activity in some manner impacts that change. The ability to measure and pursue the degree and the extent of that impact and what to do about it are subject to continuing debate and dialogue." Sen. Bernie Sanders (I-Vt.) was particularly annoyed that Pruitt pointed to uncertainties about the future course of warming. But those uncertainties are real. The latest report from the Intergovernmental Panel on Climate Change (IPCC) argues that warming will continue unless GHG emissions are curbed, but it also notes that "the projected size of those changes cannot be precisely predicted." IPCC further observed that "some underlying physical processes are not yet completely understood, making them difficult to model." Pruitt is one of the 27 state attorneys-general that are challenging the legality of President Obama's Clean Power Plan (CPP), which would require electric utilities to cut their emissions of carbon dioxide by 30 percent below their 2005 levels by 2030. The Supreme Court stayed the implementation of the CPP last February, which indicates that Pruitt and his fellow attorneys-general have substantial legal grounds to challenge that EPA regulation. In November, the eco-modernist think tank the Breakthrough Institute released a study that suggested that the U.S. could well speed up its GHG reduction trends if the CPP was abandoned. Other nominees asked about their views on climate change include former ExxonMobil CEO Rex Tillerson (nominated to run the State Department), Montana Rep. Ryan Zinke (Interior Department); Alabama Sen. Jeff Sessions (Justice Department); businessman Wilbur Ross (Commerce Department); and former Texas governor Rick Perry (Energy Department). 
Tillerson testified, "I came to the decision a few years ago that the risk of climate change does exist and the consequences could be serious enough that it warrants action." Zinke similarly declared that he does not believe climate change is a "hoax." Sessions offered, "I don't deny that we have global warming. In fact, the theory of it always struck me as plausible, and it's the question of how much is happening and what the reaction would be to it." Ross would head the department in charge of the National Oceanic and Atmospheric Administration that just report[...]



Why Aren't More Americans Moving?

2017-01-13T13:30:00-05:00

"I believe that each of us who has his place to make should go where men are wanted, and where employment is not bestowed as alms," advised New York Tribune editor Horace Greeley in a famous 1871 letter. "Of course, I say to all who are in want of work, Go West!" Basically, Greeley was telling Americans to pick up and go to where the jobs and opportunities are. Americans were once more willing to heed Greeley's advice. From the end of World War II through the 1980s, the Census Bureau reports, about 20 percent of Americans changed their residences annually, with more than 3 percent moving to a different state each year. Now more are staying home. In November, the Census Bureau reported that Americans were moving at historically low rates: Only 11.2 percent moved in 2015, and just 1.5 percent moved to a different state. Yet many of the places where people are stuck offer few opportunities. Why have we become homebodies? In a draft article called "Stuck in Place," Yale law professor David Schleicher blames bad public policy. Schleicher argues that more Americans are stuck in places with few good jobs and little opportunity, largely because "governments, mostly at the state and local levels, have created a huge number of legal barriers to inter-state mobility." To get a handle on the mobility slow-down, Schleicher identifies and analyzes the policies that limit people's ability to enter job-rich markets and exit job-poor ones. He also describes how economically declining cities get caught in a policy spiral of fiscal and physical ruin that ultimately discourages labor mobility. The effects of lower labor mobility, he argues, include less effective monetary policy, significantly reduced economic output and growth, and rising inequality. Consider monetary policy. A dollar doesn't buy the same amounts of goods and services across the country. In a sense there are New York dollars, Ohio dollars, Mississippi dollars, California dollars, and so on. Think of what a worker earning the average household income of $30,000 in economically depressed Youngstown, Ohio, would need to have the same standard of living in other more prosperous regions of the country. In San Francisco, according to CNN's cost of living calculator, a Youngstown job seeker would need an annual salary of more than $63,000. (San Francisco's housing, groceries, transportation, and health care are 366, 56, 34, and 42 percent higher than Youngstown's, respectively.) In Manhattan, he'd need nearly $82,000. The median household income in San Francisco is around $84,000, up in real dollars from $59,000 in 1995. Economic theory suggests that this income differential should be bid down considerably as folks from declining areas like Youngstown move to economically vibrant centers such as San Francisco, but that is not happening. The per capita GDP among the states was converging before the 1970s, as people moved from poor states for more lucrative opportunities in richer states. That process has stopped. Why? First, lots of job-rich areas have erected barriers that keep job-seekers from other regions out. The two biggest barriers are land use and occupational licensing restrictions. Prior to the 1980s, strict zoning limitations were mostly confined to rich suburbs and did not appreciably check housing construction in most metropolitan areas. 
But now many prosperous areas in the United States require specific lot sizes, zone out manufactured and rental housing, perversely limit new rental housing construction by establishing rent control, or set up "historic districts" that limit the changes that owners can make to their houses. Land-use restrictions limit construction, boosting housing and rental prices to the benefit of current property owners, who vote for local officials who support restrictive policies. How much of a barrier to movement are such land-use restrictions? Since 1996 San Francisco's housing stock rose by 12 percent[...]
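The Youngstown-to-San Francisco comparison above can be made concrete with a small sketch. The category price differences are the ones quoted from CNN's calculator; the household budget weights below are hypothetical assumptions used only for illustration, not the weights CNN's calculator actually uses.

```python
# Illustrative cost-of-living adjustment using the category differences quoted
# above (San Francisco vs. Youngstown). Budget weights are assumed, NOT CNN's.

youngstown_income = 30_000

# Price level in San Francisco relative to Youngstown (1.0 = same price).
relative_cost = {
    "housing": 4.66,          # +366%
    "groceries": 1.56,        # +56%
    "transportation": 1.34,   # +34%
    "health care": 1.42,      # +42%
    "everything else": 1.00,  # assumed unchanged for simplicity
}

# Hypothetical shares of a household budget (sum to 1.0).
budget_share = {
    "housing": 0.25,
    "groceries": 0.15,
    "transportation": 0.15,
    "health care": 0.08,
    "everything else": 0.37,
}

factor = sum(relative_cost[k] * budget_share[k] for k in relative_cost)
print(f"Overall cost factor: {factor:.2f}x")
print(f"Equivalent San Francisco salary: ${youngstown_income * factor:,.0f}")
# With these assumed weights the factor is about 2.08x (~$62,500), in the same
# ballpark as the ~$63,000 figure cited from CNN's calculator.
```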



Cancer Moonshot Misses the Mark

2017-01-12T07:00:00-05:00

In 1971, Richard Nixon vowed "a national commitment for the conquest of cancer" as he signed the law establishing the National Cancer Institute (NCI). Forty-five years later, Barack Obama declared in his 2016 State of the Union address that our country would embark upon a "new moonshot" with the aim of making "America the country that cures cancer once and for all"; Vice President Joe Biden would be in charge of "mission control." In its October 17, 2016, report, the Cancer Moonshot Task Force declared that its goal is "to make a decade's worth of progress in preventing, diagnosing, and treating cancer in just 5 years."

How? The usual federal bureaucratic efforts of "catalyzing," "leveraging," and "targeting" are promised. But there is some meat to the proposals. For example, the NCI is creating a pre-approved "formulary" of promising therapeutic compounds from 30 pharmaceutical companies, making them immediately available to researchers. In addition, the task force aims to establish open science computational platforms to provide data to all researchers on successful and failed investigations, and a consortium of 12 leading biotech and pharmaceutical companies is working together to identify and validate biomarkers for response and resistance to cancer therapies.

Prevention is also a focus. The moonshot aims to save lives by boosting the colorectal cancer screening rate among Americans 50 and older and raising HPV vaccination rates for adolescents.

The lifetime risk of cancer for American men is 1 in 2. For women it's 1 in 3. So what would a decade's worth of progress look like? According to the latest American Cancer Society figures, the cancer death rate has dropped by 23 percent since 1991, translating into more than 1.7 million deaths averted through 2012. The five-year survival rate has also increased from 49 to 69 percent. Doubling progress might mean doubling the annual reduction in cancer death rates for men to 3.6 percent and for women to 2.8 percent. That would cut the number of Americans dying of cancer from about 600,000 per year now to just above 500,000 in 2021.
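As a rough check on that arithmetic, here is a minimal projection sketch. The starting total (about 600,000 deaths a year) and the doubled annual declines (3.6 percent for men, 2.8 percent for women) come from the paragraph above; the even split between men and women is a simplifying assumption.

```python
# Rough projection of the "doubled progress" scenario described above.
deaths_now = 600_000   # approximate current U.S. cancer deaths per year
years = 5              # roughly 2016 through 2021

# Simplifying assumption: deaths split evenly between men and women.
men = women = deaths_now / 2
for _ in range(years):
    men *= 1 - 0.036    # doubled annual decline in the male death rate
    women *= 1 - 0.028  # doubled annual decline in the female death rate

print(f"Projected annual cancer deaths in 2021: {men + women:,.0f}")
# Comes out near 510,000 -- "just above 500,000," as stated above.
```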

But progress may happen even faster than that. The most exciting recent therapeutic breakthrough is immunotherapy—a treatment where cancer patients' immune cells are unleashed as guided missiles to kill their cancer. "It's actually plausible that in 10 years we'll have curative therapies for most if not all human cancers," declared Gary Gilliland, president and director of the Fred Hutchinson Cancer Research Center, at a conference in 2015. The good news is that the cancer moonshot may end up trailing advances that have already taken off.




Is Sugar an Addictive Poison?

2017-01-06T13:30:00-05:00

The Case Against Sugar, by Gary Taubes, Knopf, 368 pp., $26.95

Less than 1 percent of Americans—1.6 million people—were diagnosed with Type 2 diabetes in 1958. As of 2014, that figure had risen to 9.3 percent, or 29.1 million. If current trends continue, the figure could rise to more than 33 percent by 2050. Something has clearly gone wrong with American health.

The rising rate of diabetes is associated with the rising prevalence of obesity. Since the early 1960s, the percentage of Americans who are obese—that is, whose body mass index is greater than 30—has increased from 13 percent to 35.7 percent today. (Nearly 70 percent of Americans are overweight, meaning their BMIs are over 25.) Roughly put, the prevailing theory is that rising fatness causes rising diabetes. But what if both are caused by something else?

That is the intriguing and ultimately persuasive argument that Gary Taubes, author of Why We Get Fat (2011) and cofounder of the Nutrition Science Initiative, makes in his new book, The Case Against Sugar. Sugar—be it sucrose or high-fructose corn syrup—is "the principal cause of the chronic diseases that are most likely to kill us, or at least accelerate our demise," explains Taubes at the outset. "If this were a criminal case, The Case Against Sugar would be the argument for the prosecution."

In making his case, Taubes explores the "claim that sugar is uniquely toxic—perhaps having prematurely killed more people than cigarettes or 'all wars combined,' as [diabetes epidemiologist] Kelly West put it." Taubes surveys the admittedly sparse research on sugar's psychoactive effects. For example, researchers have found that eating sugar stimulates the release of dopamine, a neurotransmitter that is also released when consuming nicotine, cocaine, heroin, or alcohol. Researchers are still debating whether sugar is, in some sense, addictive.

In the course of his exploration, Taubes devastatingly shows that most nutrition "science" is bunk. Various nutritionists have sought to blame our chronic ills on such elements of our diets as fats, cholesterol, meat, gluten, and so forth. Few have focused their attention on sugar. His discussion of how nutritionists started and promoted the now-debunked notion that eating fats is a significant cause of heart disease is particularly enlightening and dismaying. Nowadays the debate over the role of fats in cardiovascular disease consists mostly of skirmishes over which fats might marginally increase risk. Interestingly, Taubes finds that a good bit of the research on fats was funded by the sugar industry.

It is not just a coincidence that the low-fat food craze took off when the U.S. Department of Agriculture issued its first dietary guidelines in 1980 advising Americans to eat less fat. The added sugar that made the newly low-fat versions of prepared foods more palatable contributed to the rise in sweetener consumption. The USDA guidelines did advise Americans to cut back on eating sugar, but they also stated that, "contrary to widespread opinion, too much sugar in your diet does not seem to cause diabetes." By the way, Taubes agrees that, since both sucrose and high-fructose corn syrup are essentially half glucose and half fructose, there are no important metabolic differences between them.

Taubes reviews the global history of sugar consumption. The average American today eats as much sugar in two weeks as our ancestors 200 years ago consumed in a year. The U.S. Department of Agriculture estimates that per-person annual consumption of caloric sweeteners peaked at 153.1 pounds in 1999 and fell to only 131.1 pounds in 2014. A 2014 analysis of data from 165 countries found that "gross per capita consumption of sugar correlates with diabetes prevalence." So how does eating lots of sugar cause disease? Reviewing the scientific literature, Taubes suggests tha[...]



23andme.com

2016-12-30T21:00:00-05:00

In its 2013 letter shutting down 23andMe's Personal Genome Service, the Food and Drug Administration (FDA) dreamed up some entirely hypothetical problems, but cited not a single example of customer confusion or dissatisfaction with the California-based genotype screening company. At the time, 23andMe had developed an excellent, transparent, and still improving online consumer interface that enabled users to obtain and understand the significance of their genetic test results. Customers were linked to the scientific studies on which 23andMe's interpretations of their data were based. The company was helping its customers learn how to understand and use genetic information.

Before the FDA's ban, the company ranked its results using a star system. Well-established research, in which at least two big studies found an association between a customer's genetic variants and a health risk, got four stars. Very preliminary studies got one star. As more scientific research came in, 23andMe would update each customer's health risks.

In my case, 23andMe reported that I had genetic variants that increased my risk for atrial fibrillation, venous thromboembolism, and age-related macular degeneration. On the other hand, based on the genes that were tested, it informed me that my risks for gout, Alzheimer's disease, melanoma, and rheumatoid arthritis were lower than average.

The old 23andMe provided me with results related to more than 125 different health risks associated with my genetic variants. In addition, the company reported my results for more than 60 different traits that I might have inherited, and it told me how I might respond to nearly 30 different pharmaceuticals. Now the FDA merely allows 23andMe to tell me less useful information, such as the fact that I carry more Neanderthal genetic variants than 85 percent of its other customers.

In all, the company is now legally permitted to provide me with just seven "Wellness Reports," which tell me, among other things, that I am probably not lactose intolerant, I don't flush when I drink liquor, I am not likely to be a sprinter, I probably have no back hair, and my second toe is probably longer than my big toe. Thanks for nothing, FDA.




3 Technologies Will Utterly Transform Your World in the Next Decade

2016-12-30T12:05:00-05:00

Technological innovation has permanently slowed down, and so too will economic growth, asserts Northwestern University economist Robert Gordon. Why? Because all of the low-hanging scientific and technological fruit has supposedly been plucked. You can invent broad technologies like electrification, the light bulb, plumbing and sanitation, the telephone, refrigeration, the internal combustion engine, and the digital computer only once. Therefore most new technologies will consist of slight improvements on the old ones, and that will not propel future economic growth. But have all broad technologies really been invented already? Below are three core technologies whose elaborations during the next decade will conjure into existence a world with far less transactional friction, amazing cures, and much smarter machines.

The digital currency Bitcoin is the first way most folks heard of blockchain technology. By one simple definition, a blockchain is a kind of independent, transparent, and permanent database coexisting in multiple locations and shared by a community. (For a nice, simple explanation of how blockchains work, go here; a minimal illustrative sketch also appears at the end of this excerpt.) The beauty of a public distributed blockchain is that records are permanent and cannot easily be falsified. It basically solves the problem of trust, since everyone can see what was agreed to and what transactions actually have taken place. Much of human society is structured with the aim of establishing trust through third parties, e.g., keeping track of who owns what, what was agreed to, and whether the transaction was completed. Think of intermediary institutions ranging from banks, stock markets, and property registries to the coercive functions of the state that mandate currencies and enforce contracts. Third parties are trusted to keep track of information and generally take a cut of the action for their trouble. Blockchain technology will cut out the middlemen and increase trust in records. Transactions can be securely paid for using blockchain currencies such as Bitcoin (which is rising toward $1,000 in value). My Reason colleague Jim Epstein just brilliantly reported how folks in Venezuela are using Bitcoin to survive amidst the rubble of socialism in that country. Some visionaries want to put the nation-state on the blockchain, including such functions as "an ID system based on reputation, dispute resolution, voting, national income distribution, and registration of all manner of legal documents such as land deeds, wills, childcare contracts, marriage contracts, and corporate incorporations." The new startup Publicism is developing blockchain anti-censorship tools as a way to promote free speech around the world. Of course, as with all technologies, there are kinks to be worked out. Hackers have famously stolen millions in Bitcoins, investments in the DAO, and the passwords for user accounts on the Ethereum Project's community forum. Such incidents identify the problems and speed up the process of improving security and standardization.

Biology is kludgy and complicated, and therefore biomedical and biotech progress is maddeningly slow compared to digital technologies. However, precise genome editing made possible by CRISPR greatly simplifies experimentation, will speed up the development of medical therapies and biotech-enhanced crops, and will even enable humanity to curate wild landscapes. CRISPR genome editing is derived from what is essentially a bacterial immune system by which bacteria protect themselves against attacking viruses.
Cheap and easy to use, CRISPR can edit genes much like a word processing program can edit text. And progress has been rapid. For example, Chinese researchers are using CRISPR in an attempt to boost immune responses in lung cancer patients. American researchers are soon to follow. CRISPR could be used to ameliorate a whole variety[...]
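To make the "permanent and cannot easily be falsified" property of a shared ledger concrete, here is a minimal hash-chain sketch in Python. It is a toy illustration of the linked-record idea only, not how Bitcoin or Ethereum actually works (real blockchains add digital signatures, peer-to-peer replication, and a consensus mechanism such as proof of work).

```python
# Toy hash-chain: each block commits to the hash of the previous block, so
# altering any earlier record breaks every later link and is easy to detect.
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Deterministically hash a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


def add_block(chain: list, transactions: list) -> None:
    """Append a block that records the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"time": time.time(), "prev_hash": prev, "transactions": transactions})


def is_valid(chain: list) -> bool:
    """Verify that every block still matches the hash recorded by its successor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))


ledger: list = []
add_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
print(is_valid(ledger))                       # True
ledger[0]["transactions"][0]["amount"] = 500  # attempt to falsify an old record
print(is_valid(ledger))                       # False -- tampering is detectable
```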



An Epidemic of Bad Epidemiology

2016-12-23T13:30:00-05:00

Getting Risk Right: Understanding the Science of Elusive Health Risks, by Geoffrey Kabat, Columbia University Press, 248 pp., $35.00

Eating bacon and ham four times a week could make asthma symptoms worse. Drinking hot coffee and tea may cause cancer of the esophagus. South Africa's minister of health warns that doggy-style sex is a major cause of stroke and cancer in men. And those claims are just drawn from the health headlines this week. The media inundate us daily with studies that seem to show modern life is increasingly risky. Most of those stories must be false, given that life expectancy for American men and women respectively has risen from 71.8 and 78.8 years in 1990 to 76.3 and 81.1 years now. Apparently, we are suffering through an epidemic of bad epidemiology.

When it comes to separating the wheat of good public health research from the chaff of studies that are mediocre or just plain bad, Albert Einstein College of Medicine epidemiologist Geoffrey Kabat is a national treasure. "Most research findings are false or exaggerated, and the more dramatic the result, the less likely it is to be true," he declares in his excellent new book Getting Risk Right. Kabat's earlier book, the superb Hyping Health Risks, thoroughly dismantled the prevalent medical myths that man-made chemicals, electromagnetic fields, radon, and passive smoking were significant causes of such illnesses as cancer and heart disease. His new book shows how scientific research, particularly epidemiology, so often goes wrong—and, importantly, how hard it is for it to go right.

Kabat first reminds readers that finding a correlation between phenomena X and Y does not mean that X causes Y. Nevertheless, many researchers are happy to overinterpret such findings to suggest causation. "If researchers can slip into this way of interpreting and presenting results of their studies," observes Kabat, "it becomes easier to understand how journalists, regulators, activists of various stripes, self-appointed health gurus, promoters of health-related foods and products, and the public can make the unwarranted leap that the study being reported provides evidence of a causal relationship and therefore is worthy of our interest."

From there he moves to some principles that must be kept in mind when evaluating studies. First and foremost is the toxicological maxim that the dose makes the poison. The more exposure to a toxin, the greater the harm. Potency matters greatly too. Often very sensitive assays show that two different compounds can bind to the same receptors in the body, but what really matters biologically is how avidly and how strongly one binds compared to the other. Another principle: Do not confuse hazard, a potential source of harm, with risk, the likelihood that exposure to the hazard will cause harm. Consider bacon. The influential International Agency for Research on Cancer declared bacon a hazard for cancer last year, but the agency does not make risk assessments. Eating two slices of bacon per day is calculated to increase your lifetime risk of colorectal cancer from 5 to 6 percent. Put that way, I suspect most people would continue to enjoy cured pork products.

Kabat also argues that an editorial bias skews the scientific literature toward publishing positive results suggesting harms. Such findings, he notes, get more attention from other researchers, regulators, journalists, and activists.
Ever since Rachel Carson's 1962 book Silent Spring wrongly linked cancer to exposures to trace amounts of pesticides, the American public has been primed to blame external causes rather than personal behaviors for their health problems. Unfortunately, as Kabat notes, the existence of an alarmed and sensitized public is all too useful to scientists and regulators. He quotes an honest b[...]
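Kabat's hazard-versus-risk point can be put in numbers using the bacon example quoted above. The 5 percent and 6 percent figures are the ones cited in the review; the script is only an illustration of the framing, not anything from the book.

```python
# Absolute vs. relative framing of the bacon risk figures quoted above.
baseline_lifetime_risk = 0.05   # ~5% lifetime risk of colorectal cancer
risk_with_daily_bacon = 0.06    # ~6% with two slices of bacon per day

absolute_increase = risk_with_daily_bacon - baseline_lifetime_risk
relative_increase = absolute_increase / baseline_lifetime_risk

print(f"Absolute increase in lifetime risk: {absolute_increase:.1%}")  # 1.0%
print(f"Relative increase: {relative_increase:.0%}")                   # 20%
# The IARC classification answers the hazard question (can bacon cause cancer
# at all?); these numbers address the risk question (how much exposure changes
# your odds): about one percentage point of lifetime risk, i.e. roughly 20%
# in relative terms.
```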



Is Economic Growth Environmentally Sustainable?

2016-12-16T14:00:00-05:00

Is economic growth environmentally sustainable? No, say a group of prominent ecological economists led by the Australian hydrologist James Ward. In a new PLoS ONE article—"Is Decoupling GDP Growth from Environmental Impact Possible?"—they offer an analysis inspired by the 1972 neo-Malthusian classic The Limits to Growth. They even suggest that The Limits to Growth's projections with regard to population, food production, pollution, and the depletion of nonrenewable resources are still on track. In other words, they think we're still heading for a collapse. I think they're wrong. But they're wrong in an instructive way.

The authors describe two types of "decoupling," relative and absolute. Relative decoupling means that economic output grows faster than material and energy consumption and environmental impact. Between 1990 and 2012, for example, China's GDP rose 20-fold while its energy use increased by a factor of four and its material use by a factor of five. Basically this entails increases in efficiency that result in using fewer resources to produce more value. Absolute decoupling is what happens when continued economic growth actually lessens resource use and impacts on the natural environment, that is, creating more value while using less stuff. Essentially, humanity becomes richer while withdrawing from nature.

To demonstrate that continued economic growth is unsustainable, the authors recycle the hoary I=PAT model devised in 1972 by the Stanford entomologist and population alarmist Paul Ehrlich and the Harvard environmental policy professor (and chief Obama science adviser) John Holdren. Human Impact on the environment is supposed to equal Population x Affluence/consumption x Technology. All of these are presumed to intensify and worsen humanity's impact on the natural world. In Ward and company's updated version of I=PAT, the sustainability of economic growth largely depends on Technology trends. Absolute decoupling from resource consumption or pollutant emissions requires technological intensity of use and emissions to decrease by at least the same annual percentage as the economy is growing. For example, if the economy is growing at three percent per year, technological intensity must fall roughly 20-fold over 100 years to maintain steady levels of resource consumption or emissions. If technological intensity falls faster than that, then resource use and emissions will decline over time, which would result in greater wealth creation with ever-lessening resource consumption and environmental spillovers.

Once they've set up their I=PAT analysis, Ward and his colleagues assert that "for non-substitutable resources such as land, water, raw materials and energy, we argue that whilst efficiency gains may be possible, there are minimum requirements for these resources that are ultimately governed by physical realities." Among the "physical realities" they mention are limits on plant photosynthesis, the conversion efficiencies of plants into meat, and the amount of water needed to grow crops, all of which supposedly determine the amount of agricultural land required to feed humanity. They also assert that "the upper limits to energy and material efficiencies govern minimum resource throughput required for economic production."
To illustrate the operation of their version of the I=PAT equation, they apply it to a recent study that projected it would be possible for Australia's economy to grow 7-fold while simultaneously reducing resource and energy use and lowering environmental pressures through 2050. They crank the notion that there are nonsubstitutable physical limits on material and energy resources through their equations until 2100, and they find that eventually consumption of both rises at the same rate as econo[...]
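The "20-fold over 100 years" figure above follows from simple compounding. Here is a minimal sketch of that arithmetic, using the three percent growth example from the excerpt (the I=PAT framing is as described above; the script is only the arithmetic, not the authors' model).

```python
# Decoupling arithmetic behind the "20-fold over 100 years" example above.
# In the I=PAT framing, impact = (P x A) x T, where P x A is total GDP and
# T is technological intensity (resource use or emissions per unit of GDP).

growth_rate = 0.03  # 3% annual GDP growth, the example used above
years = 100

gdp_factor = (1 + growth_rate) ** years
print(f"GDP after {years} years: {gdp_factor:.1f}x today's level")  # ~19.2x
# To hold total resource use or emissions constant, intensity T must fall by
# the same ~19-20x factor; if T falls faster than GDP grows, impact declines.
```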



Stuck

2016-12-10T07:00:00-05:00

I last visited McDowell County, West Virginia, over 40 years ago. Even then, I was already an outsider, a visitor to my family's past. Sometime around 1950, my grandparents and all six of their grown children pulled up stakes and left McDowell behind. My grandfather bought a dairy farm 100 miles away in Clinchburg, Virginia, and my father joined him after he left the Air Force in the mid-1950s. The house I grew up in didn't have a bathroom until I was 5. My sisters and I bathed in a zinc washtub using water warmed on the chunk burner in our kitchen. Since our house was heated entirely by two wood-burning stoves, I spent a good portion of my summers chopping and stacking cordwood. My upstairs bedroom was unheated, so I slept in a cast-iron bed beneath three heavy unzipped U.S. Army canvas sleeping bags to stay warm. We got a telephone when I was 13 years old; it was a five-party line.

But it was the folks in McDowell—including many of my relatives—whom I thought of as poor. To my eyes, a huge number of the houses we drove past in hamlets like Squire, Cucumber, English, Bradshaw, Beartown, and Iaeger on our way to visit my father's hometown of Panther were little more than shacks. Many were covered with tarpaper. Indoor bathrooms and running water were luxuries. The houses that did have bathrooms more often than not simply ran a pipe from their sinks, tubs, and commodes directly to the nearest stream. My grandparents' old home was a nice and pretty spacious white clapboard house, but they got water from an outside hand pump and resorted to a first-class outhouse to answer nature's call. The water tasted distinctly of iron and sulfur. Except deep inside Panther State Forest, where the Bailey family held our annual Labor Day reunion, coal dust coated most buildings and automobiles.

I do not long for the chilly, dusty, impoverished life I remember—my experience of the past is whatever the opposite of nostalgia is—but in retrospect, I was witnessing the tail end of McDowell's golden era. Mechanization, especially the development of continuous mining machines, enabled coal companies to mine much more coal with many fewer workers. Out of a population of nearly 100,000 in 1950, 15,812 worked as miners. By 1960 that number was just 7,118. Today there are only about 1,000 employees working for coal companies in the county, out of a population of less than 20,000. The county's dwindling economic prospects were further devastated by massive floods in 2001 and 2002 that destroyed hundreds of houses and businesses and killed four people.

In recent years, McDowell has attracted attention for the worst possible reasons. It consistently shows up at the bottom of rankings, with the lowest levels of employment and the worst level of overall health in West Virginia, and the shortest male life expectancy in the nation. But it sits very near the top of lists of counties with the most drug overdoses, obesity, and suicides. The rather unsentimental question I set out to answer as I made my way back this autumn: Why don't people just leave?

Bad News

One sign things are not going well in your county is when the kids in the social service programs know to do pre-emptive damage control with the press. "Don't you focus just on the negative," warned Destiny Robertson, a spunky African-American senior at Mount View High School and a participant in the Broader Horizons program for at-risk kids devised by the Reconnecting McDowell task force.
But it's hard not to focus on the negative when it can seem like that's all there is. Asked about their hometown, the kids shout out the usual list of woes with a world-weary attitude: bad schools, no jobs, drug addiction. They're right to worry: McDowell County has been the[...]



Despite Trump, Is the Clean Energy Future Inevitable?

2016-12-09T13:30:00-05:00

No matter who's running the government, America's "transition to a clean energy economy is irrevocably underway," the Natural Resources Defense Council asserted in its Accelerating into a Clean Energy Future report this week. Report co-author Ralph Cavanagh added, "The nationwide momentum for pollution-free energy is undeniable and irresistible because clean energy now costs less than dirty energy."

As if to confirm the Council's claim, the infotech giant Google announced this week that, by the end of next year, its global operations will be fueled 100 percent by electricity generated by renewable sources. ("The science tells us that tackling climate change is an urgent global priority," the company's press release explained.) This does not mean that Google gets its electricity directly from solar panels on the roofs of its data centers or from wind turbines churning away on its corporate campuses. The company basically makes purchasing commitments to renewable projects that offset the conventionally generated electricity that it gets from local utilities. There is, of course, nothing wrong with a business legally adopting measures that it thinks are in the best interests of its customers and shareholders. If the company is on the wrong track, those stakeholders will let Google's executives know through their purchasing and investment choices.

But if clean energy really does cost less than dirty energy, then what is there to resist? In that case, surely the invisible hand of the marketplace will make the transition to a clean-energy economy irrevocable. So can we all put aside our worries about catastrophic climate change? Not so fast. You see, policies are needed. Google notes that during "the last six years, the cost of wind and solar came down 60 percent and 80 percent, respectively, proving that renewables are increasingly becoming the lowest cost option." Yet even as proponents insist that clean energy now outcompetes fossil fuels, they nevertheless want to enhance their irrevocability with a little help from the government. As Google obliquely puts it, "We believe the private sector, in partnership with policy leaders, must take bold steps."

What might that "partnership" look like? Google doesn't say, but you can get a sense of what might be required by reading From Risk to Return, a new report from the Risky Business Project. This group is supported by the media mogul Michael Bloomberg, the Bush-era treasury secretary Henry Paulson, and the hedge fund manager and prominent Democratic Party donor Thomas Steyer. Its report presents four pathways toward restructuring America's energy infrastructure, with the goal of cutting U.S. carbon dioxide emissions 80 percent by 2050. While the paper does not favor any of those four pathways—renewables, nuclear, carbon capture, and a mix—it focuses mostly on the costs and benefits of the fourth, which reduces emissions via a combination of renewables, nuclear, carbon emissions captured from fossil fuels, and the transformation of transportation toward reliance on electricity, hydrogen, and biofuels. By 2050, the report projects, the extra expenditures for building out low-carbon energy production and consumption infrastructure would be more than offset by savings on fuel costs. The authors argue that clean energy is unfortunately not yet ready to compete head-on with fossil fuels. "The private sector alone cannot solve the climate change problem," the Risky Business report concludes.
"We know from our collective business and investment experience that the private sector will take action at the necessary speed and scale only if it is given a clear and consistent policy and regulatory framework." What sort of policies do they think[...]



Aging Is a Disease and It's Time to Cure It

2016-12-02T13:30:00-05:00

Emma Morano turned 117 on Tuesday. The Italian woman is, as far as we know, the oldest person in the world and the only living person who was born in the 1800s. The secret of her longevity? Eating three raw eggs a day and being single since 1938. The person known to have lived the longest ever was Jeanne Calment, who died in 1997 at 122 years of age.

In October, Nature published an article, "Evidence for a limit to human lifespan," by three researchers associated with the Albert Einstein College of Medicine in the Bronx. Noting that the longest known lifespan has not increased since the 1990s, they argue that there is a fundamental limit to human longevity. The occasional outlier aside, they think that limit is about 115 years. Maybe, maybe not.

In the 21st century, almost everything that kills people, except for accidents and other unintentional causes of death, has been classified as a disease. Aging kills, so it's past time to declare it a disease too and seek cures for it. In 2015, a group of European gerontologists persuasively argued for doing just that. They rejected the common fatalistic notion that aging "constitutes a natural and universal process, while diseases are seen as deviations from the normal state." A century ago osteoporosis, rheumatoid arthritis, high blood pressure, and senility were considered part of normal aging, but now they are classified as diseases and treated. "There is no disputing the fact that aging is a 'harmful abnormality of bodily structure and function,'" they note. "What is becoming increasingly clear is that aging also has specific causes, each of which can be reduced to a cellular and molecular level, and recognizable signs and symptoms."

So why do people age and die? Basically, because of bad chemistry. People get cancer when chemical signals go haywire, enabling tumors to grow. Heart attacks and strokes occur when chemical garbage accumulates in arteries and chemical glitches no longer prevent blood cells from agglomerating into dangerous clumps. The proliferation of chemical errors inside our bodies' cells eventually causes them to shut down and emit inflammatory chemicals that damage still-healthy cells. Infectious diseases are essentially invasions of bad chemicals that arouse the chemicals comprising our immune systems to try and (too often) fail to destroy them.

Also in 2015, another group of European researchers pointed out that we've been identifying a lot of biomarkers for detecting the bad chemical changes in tissues and cells before they produce symptoms associated with aging. Such biomarkers enable pharmaceutical companies and physicians to discover and deploy treatments that correct cellular and molecular malfunctions and nudge our bodies' chemistry back toward optimal functioning. As a benchmark, the researchers propose the adoption of an "ideal norm" of health against which to measure anti-aging therapies. "One approach to address this challenge is to assume an 'ideal' disease-free physiological state at a certain age, for example, 25 years of age, and develop a set of interventions to keep the patients as close to that state as possible," they suggest. Most people's body chemistry is at its best when they are in their mid-twenties. In fact, Americans between ages 15 and 24 are nearly 500 times less likely to die of heart disease, 100 times less likely to die of cancer, and 230 times less likely to die of influenza and pneumonia than people over the age of 65.
For lots of us who are no longer in our twenties, television talk show host Dick Cavett summed it up well: "I don't feel old. I feel like a young man that has something wrong with him." Meanwhile, lots of progress has[...]