The Myth of Big, Bad Gluten

By MOISES VELASQUEZ-MANOFF
New York Times
By permission from the author.
 

AS many as one in three Americans tries to avoid gluten, a protein found in wheat, barley and rye. Gluten-free menus, gluten-free labels and gluten-free guests at summer dinners have proliferated.

Some of the anti-glutenists argue that we haven’t eaten wheat for long enough to adapt to it as a species. Agriculture began just 12,000 years ago, not enough time for our bodies, which evolved over millions of years, primarily in Africa, to adjust. According to this theory, we’re intrinsically hunter-gatherers, not bread-eaters. If exposed to gluten, some of us will develop celiac disease or gluten intolerance, or we’ll simply feel lousy.

Most of these assertions, however, are contradicted by significant evidence, and distract us from our actual problem: an immune system that has become overly sensitive.

Wheat was first domesticated in southeastern Anatolia perhaps 11,000 years ago. (An archaeological site in Israel, called Ohalo II, indicates that people have eaten wild grains, like barley and wheat, for much longer — about 23,000 years.)

Is this enough time to adapt? To answer that question, consider how some populations have adapted to milk consumption. We can digest lactose, a sugar in milk, as infants, but many stop producing the enzyme that breaks it down — called lactase — in adulthood. For these “lactose intolerant” people, drinking milk can cause bloating and diarrhea. To cope, milk-drinking populations have evolved a trait called “lactase persistence”: the lactase gene stays active into adulthood, allowing them to digest milk.

Milk-producing animals were first domesticated about the same time as wheat in the Middle East. As the custom of dairying spread, so did lactase persistence. What surprises scientists today, though, is just how recently, and how completely, that trait has spread in some populations. Few Scandinavian hunter-gatherers living 5,400 years ago had lactase persistence genes, for example. Today, most Scandinavians do.

Here’s the lesson: Adaptation to a new foodstuff can occur quickly — in a few millenniums in this case. So if it happened with milk, why not with wheat?

“If eating wheat was so bad for us, it’s hard to imagine that populations that ate it would have tolerated it for 10,000 years,” Sarah A. Tishkoff, a geneticist at the University of Pennsylvania who studies lactase persistence, told me.

For Dr. Bana Jabri, director of research at the University of Chicago Celiac Disease Center, it’s the genetics of celiac disease that contradict the argument that wheat is intrinsically toxic.

Active celiac disease can cause severe health problems, from stunting and osteoporosis to miscarriage. It strikes a relatively small number of people — just around 1 percent of the population. Yet given the significant costs to fitness, you’d anticipate that the genes associated with celiac would be gradually removed from the gene pool of those eating wheat.

A few years ago, Dr. Jabri and the population geneticist Luis B. Barreiro tested that assumption and discovered precisely the opposite. Not only were celiac-associated genes abundant in the Middle Eastern populations whose ancestors first domesticated wheat; some celiac-linked variants showed evidence of having spread in recent millenniums.

People who had them, in other words, had some advantage compared with those who didn’t.

Dr. Barreiro, who’s at the University of Montreal, has observed this pattern in many genes associated with autoimmune disorders. They’ve become more common in recent millenniums, not less. As population density increased with farming, and as settled living and animal domestication intensified exposure to pathogens, these genes, which amp up aspects of the immune response, helped people survive, he thinks.

In essence, humanity’s growing filth selected for genes that increase the risk of autoimmune disease, because those genes helped defend against deadly pathogens. Our own pestilence has shaped our genome.

The benefits of having these genes (survival) may have outweighed their costs (autoimmune disease). So it is with the sickle cell trait: Having one copy protects against cerebral malaria, another plague of settled living; having two leads to congenital anemia.

But there’s another possibility: Maybe these genes don’t always cause quite as much autoimmune disease.

Perhaps the best support for this idea comes from a place called Karelia. It’s bisected by the Finno-Russian border. Celiac-associated genes are similarly prevalent on both sides of the border; both populations eat similar amounts of wheat. But celiac disease is almost five times as common on the Finnish side as on the Russian. The same holds for other immune-mediated diseases, including Type 1 diabetes, allergies and asthma. All occur more frequently in Finland than in Russia.

"Specifically, the gliadin and glutenin are acting as immunogenic anti-nutrients. [Grains] create an immunogenic response which increases...

First world countries have the luxury of access to a large amount of a variety of foods, whether rich or poor. Third world countries, where...

WHAT’S the difference? The Russian side is poorer; fecal-oral infections are more common. Russian Karelia, some Finns say, resembles Finland 50 years ago. Evidently, in that environment, these disease-associated genes don’t carry the same liability.

Are the gluten haters correct that modern wheat varietals contain more gluten than past cultivars, making them more toxic? Unlikely, according to a recent analysis by Donald D. Kasarda, a scientist with the United States Department of Agriculture. He analyzed records of protein content in wheat harvests going back nearly a century. It hasn’t changed.

Do we eat more wheat these days? Wheat consumption has, in fact, increased since the 1970s, according to the U.S.D.A. But that followed an earlier decline. In the late 19th century, Americans consumed nearly twice as much wheat per capita as we do today.

We don’t really know the prevalence of celiac disease back then, of course. But analysis of serum stored since the mid-20th century suggests that the disease was roughly one-fourth as prevalent just 60 years ago. And at that point, Americans ate about as much wheat as we do now.

Overlooked in all this gluten-blaming is the following: Our default response to gluten, says Dr. Jabri, is to treat it as the harmless protein it is — to not respond.

So the real mystery of celiac disease is what breaks that tolerance, and why, whatever that agent is, it has become more common in recent decades.

An important clue comes from the fact that other disorders of immune dysfunction have also increased. We’re more sensitive to pollens (hay fever), our own microbes (inflammatory bowel disease) and our own tissues (multiple sclerosis).

Perhaps the sugary, greasy Western diet — increasingly recognized as pro-inflammatory — is partly responsible. Maybe shifts in our intestinal microbial communities, driven by antibiotics and hygiene, have contributed. Whatever the eventual answer, just-so stories about what we evolved eating, and what that means, blind us to this bigger, and really much more worrisome, problem: The modern immune system appears to have gone on the fritz.

Maybe we should stop asking what’s wrong with wheat, and begin asking what’s wrong with us.


Knives Are Out for No-Show Diners

By Sumathi Reddy
THE WALL STREET JOURNAL
 

The morning after two groups of diners didn't show up at the restaurant Noma in Copenhagen last month, chef and co-owner René Redzepi took to Twitter. "And now a message from the Noma staff: to the people of two different no-show tables last night," he wrote, and sent a picture of staff members showing their middle fingers.

The tweet, deleted shortly after it was posted, was a joke, says Peter Kreiner, managing director of Noma. But at a restaurant that has just 12 tables and takes in as much as $500 per person for a meal, no-shows aren't taken lightly. "It's quite a large percentage of the sales that we missed out on," he says.
Fickle diners are every restaurant's worst nightmare. A select group of high-end chefs and restaurants are fighting back—from charging people who don't cancel in time to using Twitter and other social media to call out no-shows.
The impact of an empty table can be significant in an industry where average profit margins run as low as 3% to 5%. In cities like New York, it's not unusual to find 20% of diners unaccounted for on any given night.
 
Torrisi Italian Specialties of New York City is among the restaurants that charge people who don't show up for a reservation, in an effort to stave off no-shows.
Restaurant owners expend tremendous resources trying to confirm reservations. Some restaurants, like Wylie Dufresne's wd~50, will turn down a reservation from someone with a history of not showing up. Other chefs, like Ron Eyester of Rosebud in Atlanta, will jot down a note if a diner seems to be wavering on the phone so that the staff knows not to hold the empty table too long.
A number of high-end restaurants now require credit-card numbers from anyone reserving a table. Some, like Hearth in New York and Cochon in New Orleans, seek credit cards only for larger parties and for special occasions. Others, like Eleven Madison Park in New York and Coi in San Francisco, extend the policy to parties of any size.
 
At Chicago's Next, a nonrefundable-ticket system has left the restaurant with virtually no empty tables.
In January, Eleven Madison began charging $75 a head to anyone who didn't show up or cancel a reservation at least 48 hours beforehand. Owner Will Guidara says the restaurant was losing eight to 10 people per night. He adds, "With the length of our waitlist and how many people we were turning away, it just became really difficult to say, 'No, no, no,' to so many people and then have people who were supposed to be joining us just not showing up."
Since the policy has been in place, Mr. Guidara says he has had to charge only a couple of cards a week.
According to online-reservation system OpenTable, 10% of restaurants nationally seek credit-card numbers for certain reservations, while about 15% of restaurants in New York do so. Those numbers have been trending down, the company says.
 
Manhattan's Eleven Madison Park requires credit cards for reservations and charge people who don't show. Eleven Madison Park / Francesco Tonelli
But Sherri Kimes, a professor at the Cornell School of Hotel Administration, thinks the practice will only increase. Ms. Kimes says her research has found that consumers are open to being charged for last-minute cancellations—as long as restaurants keep up their end of the bargain. "When the customer shows up… their table better be ready," she says.
In Australia, a campaign to publicly name no-show diners through Twitter has been gaining steam. Erez Gordon, the owner of Sydney's Bistro Bruno, said in an email that he has outed customers just a few times when they failed to respond to his calls. He likened it to diners' jumping online to anonymously rate restaurants. "With Twitter, we are given the opportunity to respond in exactly the same manner as our guests respond if they feel we have let them down," he said.
In the U.S., too, frustrations run high. "Every single day I will look at how the previous night went and every single day there's upwards of 40—four, zero—no-shows at Nobu," says Drew Nieporent, owner of the Myriad Restaurant Group.
Mr. Nieporent has called people the next day to find out why they didn't show up. "Quite frankly, it's worse now, because with online reservations we're not even speaking to the customer," he says. "So it could be someone in theory who is a concierge at a hotel or a broker who can book prime-time tables 30 days in advance, hold on to tables for 29 days and maybe if they feel like it, call to cancel."
 
San Francisco's Coi also imposes fees for no-shows.
Often, the price charged for a no-show doesn't compensate a restaurant for its loss. At New York City's Del Posto and Jean-Georges, the no-show fee for OpenTable.com reservations is $50 a head. In October, Mr. Nieporent's Corton began requiring credit cards to reserve tables on Friday and Saturday nights and charging no-shows a $50 fee if they don't cancel 48 hours ahead.
Daniel Patterson, the owner of Coi, says that when he started a $25 and then a $50 penalty for no-shows about three years ago, he saw few results. It wasn't until he upped the amount to $100 that the rate dropped from 20% to 10%. "Our menu is $165, so we're still losing money," he says. "It's really not about charging people. It's really more about making sure they're serious about the reservation."
Other restaurants charge more. When Torrisi Italian Specialties in Manhattan began accepting reservations in November, it chose to charge diners for the full $125 tasting menu if they don't cancel 24 hours ahead. Diners who reserve its shorter $60 menu have until 4 p.m. that day.
At the Chef's Table at Brooklyn Fare, where reservations are snapped up six weeks ahead of time, consumers pay the full $225 prix-fixe price about a week in advance.
Perhaps most radical is the system started last year at Grant Achatz's Chicago restaurant Next. To dine there, customers must buy nonrefundable tickets for a meal in advance. A dynamic pricing system makes tickets at prime times pricier. Mr. Achatz's business partner, Nick Kokonas, says the system has been so successful they plan to use it at their Alinea restaurant.
Mr. Kokonas is working on a system for other restaurants. Another Chicago restaurant will pilot-test it soon. He sees one reaction from restaurateurs: "Show me how to do it."

 
