Is Parkinson’s Disease One Disease Entity? | Brain Blogger
We hear about Parkinson’s disease very often these days. It is indeed a very common neurological disorder, affecting about 6 million people worldwide. The disease is characterized by the selective loss of dopaminergic neurons in certain parts of the brain. This loss causes muscle rigidity, tremors, bradykinesia (slowness of movement) and postural problems.
Decades of research have produced treatments that ease the symptoms of Parkinson’s disease by counterbalancing the endogenous dopamine deficit. Unfortunately, we still don’t have a treatment capable of stopping the neuronal death and curing the disease.
The current mainstream treatments for Parkinson’s disease were introduced around thirty years ago. They focused on dopamine systems and the motor symptoms of advanced disease, and included levodopa therapy, dopamine agonists and the monoamine oxidase B inhibitor selegiline. More agonists and inhibitors have been introduced over the years, but the basic approach has not changed. Parkinson’s disease still remains a serious condition leading to disability. Newer molecular targets, biomarkers and a better understanding of the molecular mechanisms of this condition are needed.
How limited our understanding of Parkinson’s disease remains is clearly indicated by the fact that not all patients respond to existing treatments such as levodopa, the major dopamine precursor used in clinical practice. Part of the answer to why some patients respond to the existing treatments and others don’t lies in the fact that Parkinson’s disease is really an umbrella term for a number of conditions with similar symptoms. Diagnosis is still based on the descriptive definition provided by James Parkinson almost 200 years ago. The conditions covered by this description, however, are not all the same and can be caused by very different genetic and environmental factors. This has obvious implications for the development and application of any potential drugs targeting the disease.
A small portion (about 10%) of all Parkinson’s cases is related to genetics. Mutations in several genes have been identified as risk factors for the development of the condition. The involvement of three genes, parkin, PINK1 and DJ-1, in the disease pathogenesis seems to be linked to their neuroprotective properties. They encode proteins that counteract oxidative stress, prevent damage to mitochondrial DNA and are essential for the effective functioning of the ubiquitin-proteasome system. However, this is not specific to Parkinson’s disease. These three gene products play an equally critical role in a wide spectrum of neurodegenerative disorders. Is there anything more specific that can cause Parkinson’s disease, rather than any other neurodegeneration, in either humans or animals?
It seems that Parkinson’s disease pathogenesis requires not only genetic susceptibility, but also environmental exposure to harmful chemicals, together with ageing. Genetic factors alone are not enough to cause the disease. In 90% of cases, the disease is sporadic, without any clear genetic basis.
Current evidence suggests that oxidative stress, abnormal protein aggregation and mitochondrial dysfunction are possible early triggers of cell death in Parkinson’s disease. Parkinsonism can be induced, in both animals and humans, by the mitochondrial toxin MPTP (via its metabolite MPP+), as well as by the pesticides rotenone and paraquat.
Lewy bodies are abnormal aggregates of proteins observed inside the nerve cells of Parkinson’s patients. The formation of alpha-synuclein and tau inclusions in Lewy bodies in certain neurons is the most distinctive anatomical feature of the disease. Cell death in Parkinson’s disease is connected to both oxidative stress and the accumulation of alpha-synuclein. Abnormal accumulation of alpha-synuclein can, in turn, produce oxidative damage to mitochondria and promote the harmful oxidation of dopamine.
However, neither Lewy bodies nor alpha-synuclein and tau inclusions are exclusive to Parkinsonism. They are seen in a broad spectrum of other neurologic conditions, broadly classified as synucleinopathies, tauopathies and Lewy body disorders on the basis of the presence of these characteristic features. In Parkinson’s disease, these individual pathological features can be present in some patients and absent in others.
It seems that more diagnostic categories will be required in the future to properly characterize the sub-classes of Parkinson’s disease. What is now called “Parkinsonism” can include clinical Parkinson’s Syndrome, Lewy Body Parkinson’s Disease, several Lewy body disorders and synucleinopathies, and tauopathies of various aetiologies.
The Queen Square Brain Bank for Neurological Disorders (QSBB) has issued a list of criteria for the clinical diagnosis of Parkinson’s disease. It includes (a) clinical diagnostic criteria; (b) genetic testing: for mutations in the alpha-synuclein gene SNCA in patients with a family history of the disease; for mutations in leucine-rich repeat kinase 2 (LRRK2) and glucocerebrosidase (GBA) in sporadic patients; and for parkin, PINK1 and DJ-1 in patients with early onset of the condition, with additional genes tested if these come out negative; (c) a panel of various tests (neuroimaging); and (d) response to levodopa. This long list is in itself a manifestation of the fact that Parkinson’s disease can be caused or triggered by a variety of factors and therefore cannot be considered a single disease entity.
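To make the testing triage concrete, here is a minimal sketch in Python of how those recommendations could be encoded; the function and the simplified decision rules are illustrative only, not part of the QSBB criteria themselves:

# Illustrative sketch of the QSBB-style genetic-testing triage described above.
# The rules are simplified for demonstration; a real clinical work-up weighs
# far more factors, and the gene panels evolve over time.
def recommend_genetic_tests(family_history, early_onset):
    """Return the genes to screen first, following the triage in the text."""
    if family_history:
        return ["SNCA"]                      # familial cases: alpha-synuclein
    if early_onset:
        return ["parkin", "PINK1", "DJ-1"]   # early onset; widen panel if negative
    return ["LRRK2", "GBA"]                  # sporadic cases

print(recommend_genetic_tests(family_history=False, early_onset=True))
# ['parkin', 'PINK1', 'DJ-1']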
DNA analysis must become a compulsory element in the diagnostics of Parkinsonism. Improved sub-typing on the basis of genetic data might improve the prediction of disease outcomes. The red tulip, a symbol of the disease, should inspire researchers and clinicians to consolidate their efforts in developing and refining therapies to conquer this devastating illness.
18.3.14
There is More To Glucose Control Than Carbohydrates (4/?)
SuppVersity - Nutrition and Exercise Science for Everyone
In the last three weeks we've already covered the effects of protein, fat and vitamin D in this series about the "non-carbohydrate" (micro-)nutrients which have an impact on your blood glucose levels (browse the previous installments).
With vitamin D as the topic of the last installment, it appears only logical to jump from vitamins to minerals and take a look at the "bone mineral" calcium, whose management scientists long believed to be the main, if not the only, function of vitamin D.
In view of the fact that the word "calcium" did not even appear in last week's installment about the "sunshine vitamin", it may appear questionable whether it is even worth taking a closer look at this soft gray alkaline earth metal. As a SuppVersity reader who has read my previous articles about calcium, you will be aware that dismissing it would be as inappropriate as the shortsighted idea that the only function of 25OHD is to control the amount of calcium in your blood and bones.
There is more to calcium than bone health, but is glucose management part of the "more"?
There is in fact a plethora of studies suggesting that dietary calcium (specifically from dairy products; see Fumeron. 2011) and, in some cases, also supplements could have beneficial effects on the blood glucose levels of healthy and diabetic subjects (in some cases w/, sometimes w/out vitamin D supplementation; e.g. Pittas. 2007).
One of those vitamin D + calcium studies was conducted by Joanna Mitri et al. in 2011. The researchers tested the effects of 2,000 IU vitamin D (cholecalciferol) in conjunction with 2x400 mg calcium per day on pancreatic β cell function, insulin sensitivity, and glycemia in adults at high risk of diabetes. The marginal improvements in β cell function and the minimal attenuation of the rise in HbA1c that Mitri et al. observed over the course of the 16-week study are by no means what study titles such as "Regulation of adiposity and obesity risk by dietary calcium: mechanisms and implications" (Zemel. 2002) would suggest.
The reasons for this discrepancy become obvious if we take a look at the results from well-controlled animal trials: While there is (albeit inconclusive) evidence that high calcium diets markedly inhibit lipogenesis, accelerate lipolysis, increase thermogenesis and suppress fat accretion and weight gain, and conclusive evidence that they can promote a modest energy loss through increased fecal fat excretion (Soares. 2010), papers that confirm direct beneficial effects of calcium on glucose metabolism are rare: Even the often-cited effects Beaulieu et al. observed in a 1993 study are "vitamin D dependent", i.e. they occur only when the subjects are vitamin D depleted (interestingly, these observations were made in the absence of vitamin D supplementation; cf. Beaulieu. 1993).
Protein or calcium:
Specifically in the case of the "dairy calcium" studies it's difficult, in many cases even impossible, to know whether the beneficial effects on blood glucose homeostasis are brought about by the high calcium or by the high protein content and/or quality of dairy. Intervention studies with high calcium intake as a single variable, on the other hand, are scarce. It's thus most likely that it's the synergy of the two - a synergy you can get in concentrated form from dairy protein supplements (see box in the bottom line).
At least in the case of calcium supplements, the following examples from the contemporary scientific literature do not support the often-heard claim that calcium supplements have beneficial effects on insulin sensitivity:
- As an adjunct to an energy-reduced diet, 1,000mg/day of supplemental calcium will have no effect on either insulin sensitivity or the changes in body composition (Shalileh. 2010).
I guess one of the most important reasons the myth of the anti-diabetic effects of calcium supplements is so die-hard is the difference between the short- and long-term effects of high calcium meals vs. diets:
- Acutely, calcium supplements will have "beneficial" effects on the postprandial expression of hormones that are involved in the control of blood glucose, because they augment the postprandial production of glucose-dependent insulinotropic peptide (GIP) and glucagon-like peptide-1 (GLP-1).
Figure 1: Difference in plasma GIP, GLP-1, insulin, glucose, lactate and NEFA levels after the ingestion of a standardized breakfast w/ 248mg vs. 1,239mg calcium (Gonzalez. 2013).
That said, the downstream effects of isocaloric breakfasts providing 0.5 g carbohydrate/kg body mass (energy: 1,258 ± 33 kJ, 299 ± 8 kcal; protein: 11 ± 0 g; carbohydrate: 41 ± 1 g; fat: 10 ± 0 g) with either 248mg or 1,239mg of calcium on the blood glucose levels of the young, healthy, physically active study participants of the Gonzalez study were negligible.
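As a quick sanity check on those reported macros, the standard Atwater factors (4 kcal/g for protein and carbohydrate, 9 kcal/g for fat) reproduce the stated energy content almost exactly:

# Sanity check: do the reported macros add up to the reported ~299 kcal?
# Atwater factors: 4 kcal/g (protein), 4 kcal/g (carbohydrate), 9 kcal/g (fat).
protein_g, carb_g, fat_g = 11, 41, 10

kcal = 4 * protein_g + 4 * carb_g + 9 * fat_g
print(kcal)  # 298, in line with the reported 299 ± 8 kcal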
The increase in GLP-1 & Co is still not useless:
On the contrary, it's not unlikely that the ~150% increase in postprandial fatty acid oxidation and the protein-sparing effects of dairy calcium that Nicola Cummings et al. observed in a three-way cross-over study, in which subjects were randomly provided breakfast meals either low in dairy Ca, high in non-dairy Ca (calcium citrate; see figure to the left, values expressed relative to low Ca), or high in dairy Ca, are eventually triggered by said changes in GLP-1 & Co (Cummings. 2006).
- In the long run, on the other hand, any beneficial effects on blood glucose management (if they occur at all) are probably "side effects" of the accumulating beneficial effects of high calcium diets on lipid metabolism, body weight and energy balance. It is furthermore not clear to what extent these benefits are eventually driven by additional / synergistic nutrients in dairy - the "calcium source of choice" in ~90% of the pertinent long(er)-term studies.
Looking at the list of "high calcium" foods, which ranges from dairy (obviously), over broccoli, kale, water cress, peas, beans, almonds and brazil nuts, to sardines, salmon, apricots, and figs, it's actually no wonder that eating a diet that's naturally high in calcium is going to be beneficial for your glucose metabolism.
If, on the other hand, the addition of a bunch of calcium carbonate pills on top of the standard (obesogenic) Western diet would protect you against diabesity, those 5% of the US population who are taking calcium or calcium-containing supplements on a regular basis (Radimer. 2004) would have to be lean and insulin sensitive... needless to say that this is not the case, right?
Mind your total Ca intake if you use dairy protein:
Unless you have been bamboozled into buying overpriced, overprocessed specialty whey & casein products, these can easily provide you with a whopping 200mg (whey) and 500mg (casein) of calcium per serving... maybe another reason they help you to get and stay lean and insulin sensitive?
So what's the verdict then? When all is said and done, there are two fundamental conclusions you can take home from today's fourth installment of this series (browse the previous installments). The first one is that there is ample evidence that (a) a sufficient intake of calcium (800-1200mg total) is an important prerequisite for optimal glucose management and that (b) high calcium meals, due to their GLP-1-powered (learn more about GLP-1) thermogenic and "fat burning" effects, are another valuable tool in your weight loss toolbox.
The second one, on the other hand, will probably sound less exciting to the supplement maniacs among the SuppVersity readers. Conclusion #2 is, after all: If you are eating a whole foods diet with significant amounts of dairy and leafy greens in it, and consume a calcium-containing mineral water (in Germany 90% of the tapwater qualifies as "mineral water"), the use of supplements is at best useless, at worst detrimental to your health (think of the rumors about Ca supps and prostate cancer, for example).
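If you want to gauge whether you already land in that 800-1200mg range from food and water alone, a rough tally is easy to script; the calcium figures below are approximate per-serving reference values of mine, not numbers from this article:

# Rough daily calcium tally (mg per serving; approximate reference values,
# not taken from the article). Swap in your own foods and amounts.
CALCIUM_MG = {
    "milk, 250 ml": 300,
    "yogurt, 150 g": 180,
    "hard cheese, 30 g": 240,
    "kale, 100 g cooked": 140,
    "almonds, 30 g": 80,
    "mineral water, 1 l": 150,  # varies widely by source
}

total = sum(CALCIUM_MG.values())
print(f"Total: {total} mg")  # ~1090 mg, inside the 800-1200mg target band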
15.3.14
What's this subreddit's opinion on Dave Asprey (The Bulletproof Executive)? : Nootropics
12.3.14
Healthy Cooking Oils - The Ultimate Guide
May 11, 2013 | by Kris Gunnars
You have many options when it comes to selecting fats and oils for cooking.
But it’s not just a matter of choosing oils that are healthy, but also whether they stay healthy after having been cooked with.
The Stability of Cooking Oils
When you’re cooking at a high heat, you want to use oils that are stable and don’t oxidize or go rancid easily. When oils undergo oxidation, they react with oxygen to form free radicals and harmful compounds that you definitely don’t want to be consuming.
The most important factor in determining an oil’s resistance to
oxidation and rancidification, both at high and low heat, is the
relative degree of saturation of the fatty acids in it.
Saturated fats have only single bonds in the fatty acid molecules,
monounsaturated fats have one double bond and polyunsaturated fats have
two or more.
It is these double bonds that are chemically reactive. Therefore, saturated fats, with no double bonds, are by far the most stable (1).
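To make that rule of thumb concrete, here is a small illustrative sketch that ranks a few common fats by their dominant fatty-acid type; the category labels are simplified examples of mine, not measured compositions from the article:

# Illustrative ranking by the double-bond rule described above: more double
# bonds in the dominant fatty acids -> more prone to oxidation when heated.
DOMINANT_FAT_TYPE = {
    "coconut oil": "saturated",        # no double bonds
    "butter": "saturated",
    "olive oil": "monounsaturated",    # one double bond
    "soybean oil": "polyunsaturated",  # two or more double bonds
}

STABILITY_RANK = {"saturated": 0, "monounsaturated": 1, "polyunsaturated": 2}

for oil in sorted(DOMINANT_FAT_TYPE, key=lambda o: STABILITY_RANK[DOMINANT_FAT_TYPE[o]]):
    print(f"{oil}: {DOMINANT_FAT_TYPE[oil]}")  # most to least heat-stable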
Alright, now let’s discuss each type of cooking fat specifically.
Canola Oil: Good or Bad?
One study analyzed canola and soybean oils found on store shelves in the U.S. and found that 0.56% to 4.2% of the fatty acids in them were toxic trans fats (3).
This is not listed on the label, unfortunately.
Artificial trans fats are incredibly harmful and associated with many serious diseases, especially heart disease… the biggest killer in the world (4, 5).
When in doubt, keep this golden rule in mind: “Nature doesn’t make bad fats, factories do!” – Dr. Cate Shanahan
If you want to learn more about which cooking oils to eat and which to avoid, then read this article here: Healthy Cooking Oils – The Ultimate Guide.
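To put those percentages into grams: assuming roughly 14 g of fat per tablespoon of oil (a typical label figure, not a number from this article), the reported range works out as follows:

# Convert the reported trans-fat percentages into grams per tablespoon.
# Assumes ~14 g of fat per tablespoon of oil (typical label value).
fat_per_tbsp_g = 14.0

for pct in (0.56, 4.2):
    trans_g = fat_per_tbsp_g * pct / 100
    print(f"{pct}% trans -> {trans_g:.2f} g per tablespoon")
# 0.56% -> 0.08 g; 4.2% -> 0.59 g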
11.3.14
Why I’m Not Dismissing the Latest “Animal Protein is Bad” Study (But Not Losing Sleep Over It, Either) | Raw Food SOS
I keep doing this thing where I stand in the shower writing blog
posts in my head, emerging from the suds giddy and prune-fingered,
feeling strangely accomplished about the words I have not yet typed. And
then I squeegee the fog off the bathroom mirror and tell myself you can do it Denise!
and think about how awesome it will be to actually update my blog after
so much horrible silence. And then I load WordPress and think I’m blogging, I’m blogging, I’m finally blogging, it’s really happening.
And then suddenly it’s three hours later and I’ve opened 800 new
browser tabs in Firefox and have become distracted by something shiny,
Facebooky, or delicious, at which point all hope is lost.
This madness must end. Today, we blog.
So now I stand before you here in Cyberland, up on my soapbox, rantin’ muscles ready to flex. In case you haven’t heard, the world just got slammed with a new “meat is bad” tsunami—and it’s a doozy. We’ve got the familiar swirl of headlines designed to strike fear in our hearts (“That chicken wing you’re eating could be as deadly as a cigarette!” – The Financial Express), and pretty much every mainstream outlet caught it on their radar (hello ABC, Fox, The Guardian, Scientific American, Washington Post, and any other big-hitters I left out). The actual study, which is decidedly less popular than the press releases heralding its existence, is available here: Low Protein Intake Is Associated with a Major Reduction in IGF-1, Cancer, and Overall Mortality in the 65 and Younger but Not Older Population. Go take a gander. The gist is that animal protein will (purportedly) shorten your life and increase your risk of chronic disease—at least if you’re eating a bunch of it before you turn 66. (Once you’re in your golden years, though, the study implies animal protein is a good thing. Tricky, eh?)
So what’s really going on here? Should we all go vegan until we retire?
To be honest, I get weary blogging about what seems like the same study repackaged and regurgitated every few months under a different name (and it appears I’m not the only one). Observational meat studies are a dime a dozen. The media-viral ones seem to pop up at least a few times per year (I’ve already dissected a few). Ultimately, there’s only so much you can say about a study that uses wobbly survey methods, tries to squeeze causation from correlation, and falls victim to the confounders plaguing most epidemiological projects involving food. So whenever I see a new Meat Is Gon’ Kill Ya Dead study hijacking the airwaves, I feel kind of like
except with more sadness, and less nostril flare.
But this latest study grabbed my attention for a few reasons. For one, it doesn’t orbit around the usual meat-damning suspects—saturated fat and cholesterol—but instead looks at animal protein, which I’m rather fond of discussing due to my previous shenanigans on this blog. And two, the researchers padded their observational study with some follow-up work on mice and cells, which at least earns them an A for effort. It’s still not the sort of research that should keep you awake at night, but at least in my mind, it’s interesting enough to warrant a closer look.
And perhaps more importantly, I think there might be some truth to the researchers’ findings. Yep, I said it. Gasp shock horror!
So let’s plow into this thing, shall we?
The Study Low-Down
Here’s the gist. The study itself was a two-parter: half human, half mouse (I realize that sounds like some kind of weird centaur). The human part grabbed the most media attention, so let’s start with that.
For this leg of the study, the researchers analyzed data from NHANES III—a
giant survey of the health and nutritional status of American adults
and kiddos, which churns up reams of info about what the good folks of
this country eat. Basically, the researchers pulled data from almost
6,400 NHANES III participants aged 50 and over, looked at their food
consumption (gleaned from a 24-hour recall survey they answered two
decades ago), divided them up based on reported protein intake, and
followed their disease and mortality outcomes for up to 18 years. (As
best I can tell, that single recall survey was the sole source of the
study’s dietary data.)
Those eating less than 10 percent of their calories from protein were
scooted into the “low protein” group; those eating between 10 and 19
percent of their calories from protein comprised the “moderate protein”
group; and those eating at least 20 percent of their calories from
protein became the “high protein” group. Simple enough.
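A minimal sketch of how that grouping rule could be coded (an illustrative helper of mine, not anything from the paper):

# Illustrative helper reproducing the grouping rule described above:
# <10% of calories from protein -> "low", 10-19% -> "moderate",
# >=20% -> "high".
def protein_group(protein_kcal, total_kcal):
    pct = 100.0 * protein_kcal / total_kcal
    if pct < 10:
        return "low"
    if pct < 20:
        return "moderate"
    return "high"

print(protein_group(protein_kcal=300, total_kcal=2000))  # 15% -> 'moderate'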
Initially, the only visible pattern was a much higher death rate from
diabetes among the moderate- and high-protein groupers—not really worth
sweating, though, because the sample size was too small to draw any
meaningful conclusions. Other than that, protein consumption didn’t seem
to be doing anything statistically noteworthy for the group as a whole:
it was unrelated to all-cause mortality, death from cancer, and death
from heart disease.
But here’s where it gets interesting. Instead of keeping all the
participants lumped together, the researchers tried stratifying folks
based on age—with the 50 to 65 year olds ushered into one group and the
66+ folks into another. The goal was to test for an age interaction, where a variable behaves differently depending on how old the participants are.
And it turned out “age interaction” was there in spades. Suddenly, a
whole slew of diet-disease links cropped up—highlighting a trend for
high protein to be bad news for middle-aged adults (50 to 65 years) but a
boon for anyone older than that. Weird, right? It’s why protein didn’t
have many meaningful correlations for the participant pool as a whole:
its positive effects in the older crowd were canceled out by the
negative effects in the younger crowd, creating the illusion of
neutrality.
Anyway, the most interesting findings of that age stratification included:
- The 50 to 65 crowd had a 74 percent greater risk of death from all causes for the high-protein group compared to the low-protein group (hazard ratio: 1.74), and more than a four-fold risk of dying from cancer (hazard ratio: 4.33).
- Folks aged 66 and older had a 60 percent lower risk of cancer mortality for the high-protein group compared to the low-protein group (hazard ratio: 0.40), and a 28 percent decrease in deaths from all causes (hazard ratio: 0.72).
In other words, the 50-to-65ers eating the most protein had higher rates of cancer mortality and deadness in general. Meanwhile, the 66-and-older crowd was apparently benefiting from all things proteinaceous, and those eating the most were living longer and more cancer-freely. And because I can’t not: here’s a friendly reminder that this is an observational study, and we can’t slap a cause-and-effect label on any of these relationships.*
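Since hazard ratios trip people up, here is a tiny converter (my own sketch, not from the paper) showing how those HRs map onto the “percent greater/lower risk” phrasing:

# Convert a hazard ratio into the "percent greater/lower risk" phrasing above.
def hr_to_percent(hr):
    if hr >= 1:
        return f"{(hr - 1) * 100:.0f}% greater risk"
    return f"{(1 - hr) * 100:.0f}% lower risk"

for hr in (1.74, 4.33, 0.40, 0.72):
    print(hr, "->", hr_to_percent(hr))
# 1.74 -> 74% greater risk; 4.33 -> 333% greater risk (i.e., >4-fold)
# 0.40 -> 60% lower risk;   0.72 -> 28% lower risk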
* Important caveat: both in the media hoopla and throughout the text of the Cell Metabolism paper, the results are reported as relative risk (e.g., “five-fold greater chance of getting heart disease”) rather than absolute risk
(e.g., “3 percent died of heart disease”)—a great way to make findings
seem wildly more dramatic and scary than they really are. For instance,
this study found that among the NHANES III participants who were
diabetes-free at the study’s onset, those eating the most protein were
73 times more likely to die of diabetes (yikes!). But if we
look at the absolute numbers, which are tucked away in a little PDF
supplement accompanying the study, we’d see that 0.2 percent of the
low-protein group died of diabetes (one person) versus 2.0 percent of
the high-protein group. That’s an absolute difference of 1.8 percent, which no longer sounds quite as horrifying.
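To see how differently the same numbers sound in each framing, here is a quick worked example using the diabetes figures just quoted:

# Relative vs. absolute risk, using the diabetes-death figures quoted above.
low_risk = 0.002   # 0.2% of the low-protein group died of diabetes
high_risk = 0.020  # 2.0% of the high-protein group

relative = high_risk / low_risk          # 10-fold in the raw percentages
absolute = (high_risk - low_risk) * 100  # percentage-point difference

print(f"{relative:.0f}-fold relative difference")    # 10-fold
print(f"{absolute:.1f} percentage points absolute")  # 1.8 points
# (The paper's "73 times" figure presumably comes from its adjusted hazard
# model, not from these raw percentages.)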
The researchers also added another layer to their analysis: percent
of calories from animal protein and percent of calories from plant
protein. Here’s where the plot thickens. When adjusting for animal
protein, all those links between protein intake, cancer mortality, and
all-cause mortality went poof into the abyss—with the
protein-cancer connection significantly diminishing, and the
protein-total-mortality connection disappearing entirely. But when the
researchers tried adjusting for plant protein in the same way, nothing
happened.
So what does that mean? In a nutshell, that animal protein
specifically was driving those disease links, whereas plant protein
didn’t elicit an effect one way or another. (That rules out the
possibility that plant protein had special mortality-slaying superpowers
that made animal protein look bad by comparison.)
Should You Freak Out?
To figure out how seriously we should take this, let’s look at the study’s lifeblood: its dietary intake data. Although the Cell Metabolism paper is strangely silent about how people’s food intakes were gauged (a bit unnerving, considering how heavily this study depends on that data being sound), we know that NHANES collects its information via 24-hour recalls. The CDC website has a file discussing the whole process. Basically, participants get phoned by an interviewer, are asked to name everything they ate from midnight to midnight of the previous day, get prodded to make sure they didn’t forget any snacks or butter pats or late-night cookie nibbles, and then receive some follow-up questions about tap water and salt and other fun things. According to the CDC file, the participants also answer a short questionnaire “to ascertain whether the person’s intake on the previous day was usual or unusual.”
After looking over that questionnaire, I’ve got to say the word “ascertain” seems a bit optimistic to me. Keep in mind, the 24-hour recall is the sole source of dietary data in this study—so it darn well better strive for accuracy. And indeed, the NHANES survey employs a five-step strategy to help participants remember every bite they ate, described in “Nutrition in the Prevention and Treatment of Disease” (PDF) as follows:
- An initial “quick list,” in which the respondent reports all the foods and beverages consumed, without interruption from the interviewer;
- A forgotten foods list of nine food categories commonly omitted in 24-hour recall reporting;
- Time and occasion, in which the time each eating occasion began and what the respondent would call it are reported;
- A detail pass, in which probing questions ask for more detailed information about the food and portion size, in addition to review of the eating occasions and times between the eating occasions; and
- Final review, in which any other item not already reported is asked.
As far as boosting reporting accuracy, that’s all a great help. But it appears the interviewers only asked one question to gauge how typical each participant’s reported diet was, relative to what they generally eat: “Was the amount of food that you ate yesterday much more than usual, usual, or much less than usual?”
That’s it. No qualifier for what “much more” or “much less” actually meant; no queries about specific foods; no prodding to see whether yesterday happened to feature a birthday barbeque, thus skewing the day’s frankfurter-to-kale ratio in a meatier direction than usual. Just one vague question about total food quantity, whose answer could only ever be subjective. (After the diet recall, each person’s reported intake was converted into food codes and nutrient components—so any flaws in that initial reporting trickled upstream to the final statistical analysis.)
And it gets worse. While it’d be nice to suspend disbelief and pretend the NHANES III recall data still manages to be solid, that’s apparently not the case. A 2013 study took NHANES to task and tested how accurate its “caloric intake” data was, as calculated from those 24-hour recall surveys. The results? Across the board, NHANES participants did a remarkable job of being wrong. Nearly everyone under-reported how many calories they were consuming—with obese folks underestimating their intake by an average of 716 calories per day for men and 856 calories for women. That’s kind of a lot. The study’s researchers concluded that throughout the NHANES’ 40-year existence, “energy intake data on the majority of respondents … was not physiologically plausible.” D’oh. If such a thing is possible, the 24-hour recall rests at an even higher tier of suckitude than does its cousin, the loathsome food frequency questionnaire.
(And in case that’s not enough to make your blood boil: the NHANES data is what the US government uses to determine what the nation is eating and to shape national nutrition policy.)
If you’re wondering why researchers would rely on such a sketchy way of determining food intake, the answer is simple: it’s a heck of a lot cheaper (and easier) to ask people what they’re eating than to helicopterishly stalk them all day long, weighing and measuring every morsel of food headed for their lips. When it comes to massive surveys like NHANES that track thousands of enrollees, affordability and convenience reign supreme. And sometimes that means cutting corners with precision.
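For the curious, a standard way researchers flag implausible intakes is a Goldberg-style cutoff that compares reported energy intake to an estimated basal metabolic rate. Below is a minimal illustrative sketch; the 1.35 threshold and the Mifflin-St Jeor BMR formula are common conventions in this literature, not details taken from the 2013 paper:

# Goldberg-style plausibility screen: a reported intake far below ~1.35 x BMR
# is physiologically implausible for a free-living adult.
# BMR via Mifflin-St Jeor (male form): 10*kg + 6.25*cm - 5*age + 5.
def is_plausible(reported_kcal, weight_kg, height_cm, age, cutoff=1.35):
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + 5
    return reported_kcal >= cutoff * bmr

# A 100 kg, 178 cm, 50-year-old man reporting 1,500 kcal/day:
print(is_plausible(1500, weight_kg=100, height_cm=178, age=50))  # False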
Bottom line, it’s almost a given that the recall data here is less
than stellar. And despite all the magical things math can do, no amount
of statistical wizardry will heal numbers that are wrong from the start.
And to add insult to injury, keep in mind that this was the only diet
information collected for each participant over the course of 18 whoppin’ years.
Even if the recall managed to be accurate for the time it was recorded,
there’s no way to know whether the participants’ diets evolved over the
next two decades, and how any changes in their noshing habits impacted
mortality outcomes.
That’s a lot of trust to put in one day’s worth of self-reported eating!
Diamonds Among Coals?
Even with all those caveats, something about this study kept tickling my brain. Typically, if we dig into an observational
study about meat, we see the heavy meat eaters—particularly those
daredevils mowing down on red and processed varieties—engaging in a
variety of lifestyle practices that spell I AM NOT HEALTH CONSCIOUS loud
and clear: more smoking and drinking, less exercise, higher calorie
intake, fewer fruits and vegetables each day, the works.
In turn, all those health-defeating behaviors tend to confound the
true relationship between meat and various diseases and mortality. It’s
hard to decipher whether meat itself is increasing cancer and
heart disease and early death, or if reckless-with-their-health
folks—already on a crash course towards chronic illness—just eat more of
it because they don’t heed any conventional advice about diet and
lifestyle.
If a situation like that was at play in this study, and protein
intake was a surrogate for health-unconsciousness the same way meat
tends to be, we’d expect to see the folks in the high-protein group
fitting a similar anti-health profile—poorer diets overall, more risky
behaviors. In turn, that would mean the study’s results could’ve been
biased against the high-protein consumers due to all that residual
confounding.
So was that the case?
Unfortunately, the paper doesn’t make it very easy to answer that
question. There’s no data for biggies like drinking or exercise in the
paper’s “participant characteristic” list. But we can see that the high-protein group actually had relatively fewer
smokers than the low-protein group (18.2 percent versus 21.8 percent
for current smokers; 37.8 percent versus 39.8 percent for former
smokers), and that the high-protein group reported a lower
calorie intake than the low-protein group (though heaven knows if
that’s accurate). In addition, more people in the high-protein group
than the low-protein group reported trying to lose weight during the
past year (43.9 percent versus 37.5 percent), as well as changing their
diet for health reasons (29.3 percent versus 15 percent). But it’s hard
to say whether that’s a reflection of greater health awareness or poorer
health when the study started.
What can we piece together from that?
One more thread worth pulling is IGF-1. In the world of aging research, IGF-1 is a bona-fide spotlight stealer. As its name implies, insulin-like growth factor 1 is
a hormone molecularly similar to insulin, with its modus operandi being
“grow, grow, grow!” It promotes growth for nearly every cell in your
body—building muscle, making young’uns get taller, creating new brain
cells, repairing nerve damage, and doing other awesome things that keep
your body cranking like the fabulous machine it is. But IGF-1 is kind of
a double-edged sword. And the bad-slicey side plunges right through the
heart of longevity.
Much of the early excitement came from calorie-restricted worms, which could slip into a long-lived larval “dauer” stage instead of catapulting towards maturity like usual. (Which made
their lives longer, but not necessarily more enjoyable: the dauer stage
consists mostly of not eating, not reproducing, and sitting like a lump
on a bump until food supply becomes abundant. Even if we humans had a
dauer stage, I can’t imagine wanting to stay there for very long. It
sounds too much like high school.)
The reasons behind calorie restriction’s perks? A biggie was thought
to be its suppressive effect on growth hormone and IGF-1—essentially
slowing down aging and age-related diseases. Calorie-restricted
organisms had much lower levels of IGF-1 than their more abundantly fed
counterparts, at least in the creatures and fungi that’d been studied up
to that point. And those reduced IGF-1 levels seemed to help protect
cells from DNA damage—a boon for warding off cancer. It made sense:
there’s a huge evolutionary and survival advantage to halting growth in
times of food scarcity.
Although there wasn’t controlled data available for humans (we live
too darn long to make lifespan studies an easy feat), the benefits of
calorie restriction were expected to be universal. And thus emerged an
era of books, gurus, theories, and research funding all pouring towards
the promising new field of “CR.”
But soon cracks in the calorie-restriction theory started appearing. More comprehensive rodent studies, including one looking at 41 different strains of mice, found that calorie restriction shortened the
lifespan in more strains than it extended. Likewise, in wild mice, as opposed to massively lab-domesticated ones, a lower energy intake did nada for average life expectancy (though it did curtail cancer rates). A 25-year rhesus monkey study—whose completion the longevity world had awaited with bated breath—failed to show any life-extension benefit from feeding the monkeys less. And while studies on calorie-restricted humans weren’t far enough
along to offer mortality data, the existing numbers showed their IGF-1 levels were pretty much the same as everyone else’s, casting doubt on the hope those earlier rodent and yeast and worm studies would be translatable to humans.
What the heck was going on?
Eventually it emerged that calorie restriction, for most species, was only effective if it also restricted protein intake.
And as the study gods breathed more and more research into being, it
seemed all that deliberate hunger might be for naught. Protein
restriction alone could push down IGF-1 levels
and spark a cascade of longevity-enhancing changes. (In case you’re
wondering, neither fat restriction nor carbohydrate restriction seemed
to increase lifespan, at least in rodent models.)
But it didn’t end there! A new wave of studies zeroed in on methionine, a
sulfur-containing amino acid abundant in muscle meat and eggs. In mice,
restricting methionine—without reducing calories—was enough to increase lifespan and induce health perks
like slowing down immune aging, improving blood glucose, reducing IGF-1
and insulin levels, and protecting organ cells from oxidative damage.
The reason? It appeared to be twofold: methionine metabolism tends to generate toxic byproducts (homocysteine among them), and restricting it seems to dial down the mitochondrial free-radical production that drives oxidative damage.
And then the plot turned once more! Seriously, this saga had more
twists than a pretzel factory. A fascinating but woefully little-known
study in 2011 showed that in mice, supplementing with glycine—an amino
acid found abundantly in connective tissue and gelatin and bone
broth—had the exact same life-extending effect
as restricting methionine. Without reducing calories or other amino
acids, glycine supplementation increased the rodents’ lifespan, reduced
fasting glucose and insulin, decreased IGF-1 levels, and nearly halved
their triglycerides—the very perks that’ve variously been attributed to
calorie restriction, protein restriction, and methionine restriction.
Let me make it abundantly clear: THIS IS HUGE.
If the glycine finding translates over to humans (which I strongly
suspect it does), life-extension-seekers may be barking up the wrong
tree—or at least an unnecessarily complicated one—by trying to
selectively reduce animal protein in order to live longer, as Longo
seems to support. A wiser method could be simply getting a more
“biologically appropriate” balance of amino acids than the standard
Western diet typically provides. That means eating more of the
glycine-rich foods that’ve been gradually stripped from our menus—things
like skin, bones, tendons, tripe, feet, hooves, ears, connective
tissue, and some organ meats—and less of the muscle meat typically
dominating our fridges and freezers.
So to put all that in a Reader’s Digest version, the history of life-extension research went something like this:
“Calorie restriction extends rats’ lifespans. We must eat less to live longer!”
“Wait… reducing protein without reducing calories does the same thing. We must eat less protein to live longer!”
“Well I’ll be darned! The whole ‘life extension’ thing works just by
limiting methionine. Methionine bad. Other amino acids okay! Down with
meat!”
And now, it seems we’re at yet another crossroads—one where
methionine intake may become less important than its balance and
interaction with other nutrients, especially glycine.
*Note: This is deliberately oversimplified and lots of other
really interesting discoveries happened. But heaven knows this blog post
doesn’t need to be even longer than it already is.
And what about IGF-1 in this study? As we might expect, the low-protein eaters had significantly lower levels than their protein-gorging brethren.
Then there’s the mouse arm of the study, where the animals were fed purified diets concocted from a short list of refined ingredients. A lot of things are wrong with that picture, such as “where are the food things?”—but for the sake of brevity, I’m just going to focus on the second
ingredient: casein. It’s one of the major proteins in milk, and it’s got
an awful track record for promoting tumor growth more than other types of protein, including its dairy-derived cousin whey.
I’ve already written tomes on casein, cancer, and rodents in previous blog entries—including my Fork’s Over Knives critique and China Study critique—so I won’t torture you by rehashing it all here. Chris Masterjohn also has some awesomesauce posts
on the subject, so hop on over there if you’re insatiably curious about
it all. The bottom line is that when we look at the mice-and-protein
studies outlined in Longo’s paper, this is what we’re dealing
with: a cocktail of purified ingredients, with the protein component
being a well-known promoter of cancer in rodents. It’s not at all
surprising that the mice eating the most of it sprouted tumors like mad.
But it’s impossible to say how much of that’s due to protein per se, or
to casein—especially casein that’s been stripped of all the other
goodies in dairy and tossed into a party bag of refined junk.
Putting It All Together
For those of us in the ancestral, paleo, “real food,” low carb, and
other related communities, there’s a tendency to see a study like this
and be like RAAWRRR KILL IT BEFORE IT BREEDS at the first whiff of its
correlation-is-causation tone. And as someone who generally places
epidemiology in the ninth circle of Research Hell, I’ve certainly been
guilty of that myself. But one of the biggest gifts of observational
studies like this one is the opportunity to explore new hypotheses and
test out perspectives that challenge what we believe.
I think that’s definitely the case here.
Think of it this way. For most of human history, dietary consistency
was a fairy tale. Famines struck. Periods of scarcity tangoed with those
of abundance. We gorged on energy-dense foods like meat when they
became available, knowing full well we might not be so lucky the next
day or week. And to be sure, we ate the whole freakin’ animal after a
kill—not just the skeletal muscle.
Constant abundance and pickiness is absolutely new to our bodies,
even for those of us eating foods we deem ancient or ancestral. So it’s
really not all that far-fetched to think that America’s animal protein
habits—heavy on the methionine-rich muscle meats, scant on the glycine,
swimming in ceaseless surplus instead of punctuated with scarcity—could
be a problem for our health.
Perhaps it’s not a coincidence that many of the world’s
longest-living populations eat fairly low-methionine diets or
periodically abstain from protein-rich foods (like in Ikaria,
where the predominantly Orthodox Christian residents cyclically fast
from animal products). And perhaps just as relevant as the types
of foods we eat is the manner in which we eat them—nose-to-tail for
animals, with some plant-only days thrown in for good measure.
That doesn’t mean the solution is to go vegan. Nor is it necessarily
to eat a low-animal-protein diet. But perhaps it’s time to seriously
explore options like protein cycling, periodic fasting, or just cooking
up heaps o’ bone broth to get that glycine down our gullets.
Just to be clear, nothing I’ve written here—even my moments of
quasi-defending this study—changes the fact that the NHANES III data is
observational and the diet recalls are basically handicapped from the
start, thanks to the history-revising sinkhole that is the human mind.
As always, correlation isn’t causation. It’s pretty
disappointing that the study’s own researchers seemed to forget that.
The reason I’m not sliding this study straight into the slush pile is
because regardless of its validity, it at least opens the door to some
important discussion. The bigger point is that the trends it excavated
and hypotheses it explored could feasibly be real—evolutionarily,
biologically, logically. In my opinion, the greatest value of this
study, then, is its role as a springboard for breaking out of the
comfort zone of what we think—and want—to be true.
Otherwise, I guess it could make a nice doorstop.
I keep doing this thing where I stand in the shower writing blog
posts in my head, emerging from the suds giddy and prune-fingered,
feeling strangely accomplished about the words I have not yet typed. And
then I squeegee the fog off the bathroom mirror and tell myself you can do it Denise!
and think about how awesome it will be to actually update my blog after
so much horrible silence. And then I load WordPress and think I’m blogging, I’m blogging, I’m finally blogging, it’s really happening.
And then suddenly it’s three hours later and I’ve opened 800 new
browser tabs in Firefox and have become distracted by something shiny,
Facebooky, or delicious, at which point all hope is lost.
This madness must end. Today, we blog.
So now I stand before you here in Cyberland, up on my
soapbox, rantin’ muscles ready to flex. In case you haven’t heard, the
world just got slammed with a new “meat is bad” tsunami—and it’s a
doozy. We’ve got the familiar swirl of headlines designed to strike fear
in our hearts (“That chicken wing you’re eating could be as deadly as a cigarette!” – The Financial Express), and pretty much every mainstream outlet caught it on their radar (hello ABC, Fox, The Guardian, Scientific American, Washington Post,
and any other big-hitters I left out). The actual study, which is
decidedly less popular than the press releases heralding its existence,
is available here: Low
Protein Intake Is Associated with a Major Reduction in IGF-1, Cancer,
and Overall Mortality in the 65 and Younger but Not Older Population.
Go take a gander. The gist is that animal protein will (purportedly)
shorten your life and increase your risk of chronic disease—at least if
you’re eating a bunch of it before you turn 66. (Once you’re in your
golden years, though, the study implies animal protein is a good thing. Tricky, eh?)
soapbox, rantin’ muscles ready to flex. In case you haven’t heard, the
world just got slammed with a new “meat is bad” tsunami—and it’s a
doozy. We’ve got the familiar swirl of headlines designed to strike fear
in our hearts (“That chicken wing you’re eating could be as deadly as a cigarette!” – The Financial Express), and pretty much every mainstream outlet caught it on their radar (hello ABC, Fox, The Guardian, Scientific American, Washington Post,
and any other big-hitters I left out). The actual study, which is
decidedly less popular than the press releases heralding its existence,
is available here: Low
Protein Intake Is Associated with a Major Reduction in IGF-1, Cancer,
and Overall Mortality in the 65 and Younger but Not Older Population.
Go take a gander. The gist is that animal protein will (purportedly)
shorten your life and increase your risk of chronic disease—at least if
you’re eating a bunch of it before you turn 66. (Once you’re in your
golden years, though, the study implies animal protein is a good thing. Tricky, eh?)
So what’s really going on here? Should we all go vegan until we retire?
To be honest, I get weary blogging about what seems like
the same study repackaged and regurgitated every few months under a
different name (and it appears I’m not the only one). Observational meat studies are a dime a dozen. The media-viral ones seem to pop up at least a few times per year (I’ve already dissected a few).
Ultimately, there’s only so much you can say about a study that uses
wobbly survey methods, tries to squeeze causation from correlation, and
falls victim to the confounders plaguing most epidemiological projects
involving food. So whenever I see a new Meat Is Gon’ Kill Ya Dead study
hijacking the airwaves, I feel kind of like
the same study repackaged and regurgitated every few months under a
different name (and it appears I’m not the only one). Observational meat studies are a dime a dozen. The media-viral ones seem to pop up at least a few times per year (I’ve already dissected a few).
Ultimately, there’s only so much you can say about a study that uses
wobbly survey methods, tries to squeeze causation from correlation, and
falls victim to the confounders plaguing most epidemiological projects
involving food. So whenever I see a new Meat Is Gon’ Kill Ya Dead study
hijacking the airwaves, I feel kind of like
except with more sadness, and less nostril flare.
But this latest study grabbed my attention for a few reasons.
For one, it doesn’t orbit around the usual meat-damning suspects—saturated fat and cholesterol—but instead looks at animal protein, which I’m rather fond of discussing due to my previous shenanigans
on this blog. And two, the researchers padded their observational study
with some follow-up work on mice and cells, which at least earns them
an A for effort. It’s still not the sort of research that should keep
you awake at night, but at least in my mind, it’s interesting enough to
warrant a closer look.
on this blog. And two, the researchers padded their observational study
with some follow-up work on mice and cells, which at least earns them
an A for effort. It’s still not the sort of research that should keep
you awake at night, but at least in my mind, it’s interesting enough to
warrant a closer look.
And perhaps more importantly, I think there might be some truth to the researchers’ findings. Yep, I said it. Gasp shock horror!
So let’s plow into this thing, shall we?
The Study Low-Down
mouse (I realize that sounds like some kind of weird centaur). The human
part grabbed the most media attention, so let’s start with that.
For this leg of the study, the researchers analyzed data from NHANES III—a
giant survey of the health and nutritional status of American adults
and kiddos, which churns up reams of info about what the good folks of
this country eat. Basically, the researchers pulled data from almost
6,400 NHANES III participants aged 50 and over, looked at their food
consumption (gleaned from a 24-hour recall survey they answered two
decades ago), divided them up based on reported protein intake, and
followed their disease and mortality outcomes for up to 18 years. (As
best I can tell, that single recall survey was the sole source of the
study’s dietary data.)
Those eating less than 10 percent of their calories from protein were
scooted into the “low protein” group; those eating between 10 and 19
percent of their calories from protein comprised the “moderate protein”
group; and those eating at least 20 percent of their calories from
protein became the “high protein” group. Simple enough.
Initially, the only visible pattern was a much higher death rate from
diabetes among the moderate- and high-protein groupers—not really worth
sweating, though, because the sample size was too small to draw any
meaningful conclusions. Other than that, protein consumption didn’t seem
to be doing anything statistically noteworthy for the group as a whole:
it was unrelated to all-cause mortality, death from cancer, and death
from heart disease.
But here’s where it gets interesting. Instead of keeping all the
participants lumped together, the researchers tried stratifying folks
based on age—with the 50 to 65 year olds ushered into one group and the
66+ folks into another. The goal was to test for an age interaction, where a variable behaves differently depending on how old the participants are.
And it turned out “age interaction” was there in spades. Suddenly, a
whole slew of diet-disease links cropped up—highlighting a trend for
high protein to be bad news for middle-aged adults (50 to 65 years) but a
boon for anyone older than that. Weird, right? It’s why protein didn’t
have many meaningful correlations for the participant pool as a whole:
its positive effects in the older crowd were canceled out by the
negative effects in the younger crowd, creating the illusion of
neutrality.
Anyway, the most interesting findings of that age stratification included:
- The 50 to 65 crowd had a 74 percent greater risk of death from all
causes for the high-protein group compared to the low-protein group
(hazard ratio: 1.74), and a 433 percent greater risk of dying from
cancer (hazard ratio: 4.33). - Folks aged 66 and older had a 60 percent lower risk of cancer
mortality for the high-protein group compared to the low-protein group
(hazard ratio: 0.40), and a 28 percent decrease in deaths from all
causes (hazard ratio: 0.72).
higher rates of cancer mortality and deadness in general. Meanwhile, the
66-and-older crowd was apparently benefiting from all things
proteinaceous, and those eating the most were living longer and more
cancer-freely. And because I can’t not: here’s a friendly reminder that
this is an observational study, and we can’t slap a cause-and-effect label on any of these relationships.
* Important caveat: both in the media hoopla and throughout the text of the Cell Metabolism paper, the results are reported as relative risk (e.g., “five-fold greater chance of getting heart disease”) rather than absolute risk
(e.g., “3 percent died of heart disease”)—a great way to make findings
seem wildly more dramatic and scary than they really are. For instance,
this study found that among the NHANES III participants who were
diabetes-free at the study’s onset, those eating the most protein were
73 times more likely to die of diabetes (yikes!). But if we
look at the absolute numbers, which are tucked away in a little PDF
supplement accompanying the study, we’d see that 0.2 percent of the
low-protein group died of diabetes (one person) versus 2.0 percent of
the high-protein group. That’s an absolute difference of 1.8 percent, which no longer sounds quite as horrifying.
The researchers also added another layer to their analysis: percent
of calories from animal protein and percent of calories from plant
protein. Here’s where the plot thickens. When adjusting for animal
protein, all those links between protein intake, cancer mortality, and
all-cause mortality went poof into the abyss—with the
protein-cancer connection significantly diminishing, and the
protein-total-mortality connection disappearing entirely. But when the
researchers tried adjusting for plant protein in the same way, nothing
happened.
So what does that mean? In a nutshell, that animal protein
specifically was driving those disease links, whereas plant protein
didn’t elicit an effect one way or another. (That rules out the
possibility that plant protein had special mortality-slaying superpowers
that made animal protein look bad by comparison.)
Should You Freak Out?
To figure out how seriously we should take this, let’s look at the study’s lifeblood: its dietary intake data. Although the Cell Metabolism paper
is strangely silent about how people’s food intakes were gauged (a bit
unnerving, considering how heavily this study depends on that data being
sound), we know that NHANES collects its information via 24-hour
recalls. The CDC website has a file discussing the whole process.
Basically, participants get phoned by an interviewer, are asked to name
everything they ate from midnight to midnight of the previous day, get
prodded to make sure they didn’t forget any snacks or butter pats or
late-night cookie nibbles, and then receive some follow-up questions
about tap water and salt and other fun things. According to the CDC
file, the participants also answer a short questionnaire “to ascertain
whether the person’s intake on the previous day was usual or unusual.”
is strangely silent about how people’s food intakes were gauged (a bit
unnerving, considering how heavily this study depends on that data being
sound), we know that NHANES collects its information via 24-hour
recalls. The CDC website has a file discussing the whole process.
Basically, participants get phoned by an interviewer, are asked to name
everything they ate from midnight to midnight of the previous day, get
prodded to make sure they didn’t forget any snacks or butter pats or
late-night cookie nibbles, and then receive some follow-up questions
about tap water and salt and other fun things. According to the CDC
file, the participants also answer a short questionnaire “to ascertain
whether the person’s intake on the previous day was usual or unusual.”
After looking over that questionnaire, I’ve got to say the
word “ascertain” seems a bit optimistic to me. Keep in mind, the 24-hour
recall is the sole source of dietary data in this study—so it darn well
better strive for accuracy. And indeed, the NHANES survey employs a
five-step strategy to help participants remember every bite they ate,
described in “Nutrition in the Prevention and Treatment of Disease” (PDF) as follows:
word “ascertain” seems a bit optimistic to me. Keep in mind, the 24-hour
recall is the sole source of dietary data in this study—so it darn well
better strive for accuracy. And indeed, the NHANES survey employs a
five-step strategy to help participants remember every bite they ate,
described in “Nutrition in the Prevention and Treatment of Disease” (PDF) as follows:
- An initial “quick list,” in which the respondent reports all the
foods and beverages consumed, without interruption from the interviewer; - A forgotten foods list of nine food categories commonly omitted in 24-hour recall reporting;
- Time and occasion, in which the time each eating occasion began and what the respondent would call it are reported;
- A detail pass, in which probing questions ask for more detailed
information about the food and portion size, in addition to review of
the eating occasions and times between the eating occasions; and - Final review, in which any other item not already reported is asked.
As far as boosting reporting accuracy, that’s all a great
help. But it appears the interviewers only asked one question to gauge
how typical each participant’s reported diet was, relative to
what they generally eat: “Was the amount of food that you ate yesterday
much more than usual, usual, or much less than usual?”
That’s it. No qualifier for what “much more” or
“much less” actually meant; no queries about specific foods; no prodding
to see whether yesterday happened to feature a birthday barbeque, thus
skewing the day’s frankfurter-to-kale ratio in a meatier direction than
usual. Just one vague question about total food quantity, whose answer
could only ever be subjective. (After the diet recall, each person’s
reported intake was converted into food codes and nutrient components—so
any flaws in that initial reporting trickled downstream to the final
statistical analysis.)
And it gets worse. While it’d be nice to suspend disbelief
and pretend the NHANES III recall data still manages to be solid, that’s
apparently not the case. A 2013 study took NHANES to task
and tested how accurate its “caloric intake” data was, as calculated
from those 24-hour recall surveys. The results? Across the board, NHANES
participants did a remarkable job of being wrong. Nearly everyone
under-reported how many calories they were consuming—with obese folks
underestimating their intake by an average of 716 calories per day for
men and 856 calories for women. That’s kind of a lot. The study’s
researchers concluded that throughout the NHANES’ 40-year existence,
“energy intake data on the majority of respondents … was not
physiologically plausible.” D’oh. If such a thing is possible, the
24-hour recall rests at an even higher tier of suckitude than does its
cousin, the loathsome food frequency questionnaire.
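(For the curious, here’s roughly what a “physiological plausibility” check looks like in practice. This is a minimal Python sketch, assuming the Mifflin-St Jeor equation for basal metabolic rate and an illustrative cutoff ratio; the 2013 paper used its own, more involved method, and the example numbers below are invented.)

```python
def bmr_mifflin_st_jeor(weight_kg, height_cm, age, sex):
    """Estimate basal metabolic rate (kcal/day) via Mifflin-St Jeor."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age
    return base + (5 if sex == "male" else -161)

def intake_is_plausible(reported_kcal, weight_kg, height_cm, age, sex,
                        min_ratio=1.2):
    """Flag a reported intake as implausible when it falls below
    min_ratio x estimated BMR; free-living adults can't sustain
    intakes much below their basal burn rate for long."""
    bmr = bmr_mifflin_st_jeor(weight_kg, height_cm, age, sex)
    return reported_kcal >= min_ratio * bmr, bmr

# Hypothetical participant: a 95 kg, 170 cm, 45-year-old woman
# reporting 1,400 kcal/day on her 24-hour recall.
ok, bmr = intake_is_plausible(1400, 95, 170, 45, "female")
print(f"estimated BMR: {bmr:.0f} kcal/day; plausible report: {ok}")
# -> estimated BMR: 1626 kcal/day; plausible report: False
```

Scale a check like that across tens of thousands of recalls and you get the paper’s damning conclusion: most people report intakes their own bodies couldn’t survive on.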
(And in case that’s not enough to make your blood boil: the
NHANES data is what the US government uses to determine what the
country is eating, formulate dietary guidelines, and divvy up funding.
Your tax dollars hard at work!)
NHANES data is what the US government uses to determine what the
country is eating, formulate dietary guidelines, and divvy up funding.
Your tax dollars hard at work!)
If it’s that bad with calories, can we really expect the protein data to be much better?
In case you’re wondering why anyone uses such a destined-for-failure way of determining food intake, the answer is simple: it’s a heck of a
lot cheaper (and easier) to ask people what they’re eating than to
helicopterishly stalk them all day long, weighing and measuring every
morsel of food headed for their lips. When it comes to massive surveys
like NHANES that track thousands of enrollees, affordability and
convenience reign supreme. And sometimes that means cutting corners with
precision.
Bottom line, it’s almost a given that the recall data here is less
than stellar. And despite all the magical things math can do, no amount
of statistical wizardry will heal numbers that are wrong from the start.
And to add insult to injury, keep in mind that this was the only diet
information collected for each participant over the course of 18 whoppin’ years.
Even if the recall managed to be accurate for the time it was recorded,
there’s no way to know whether the participants’ diets evolved over the
next two decades, and how any changes in their noshing habits impacted
mortality outcomes.
That’s a lot of trust to put in one day’s worth of self-reported eating!
Diamonds Among Coals?
Now that I’ve bashed the NHANES diet survey to the moon and
back, let’s look at why it might actually have some legitimacy. Bear
with me!
While combing through the Cell Metabolism paper, one thought
kept tickling my brain. Typically, if we dig into an observational
study about meat, we see the heavy meat eaters—particularly those
daredevils mowing down on red and processed varieties—engaging in a
variety of lifestyle practices that spell I AM NOT HEALTH CONSCIOUS loud
and clear: more smoking and drinking, less exercise, higher calorie
intake, fewer fruits and vegetables each day, the works.
In turn, all those health-defeating behaviors tend to confound the
true relationship between meat and various diseases and mortality. It’s
hard to decipher whether meat itself is increasing cancer and
heart disease and early death, or if reckless-with-their-health
folks—already on a crash course towards chronic illness—just eat more of
it because they don’t heed any conventional advice about diet and
lifestyle.
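To see how that plays out in numbers, here’s a toy simulation in Python. Everything in it is made up; the point is just that when meat intake merely travels with an unhealthy lifestyle, meat looks deadly even though it never touches the risk equation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent trait: how thoroughly someone ignores health advice.
unhealthy = rng.normal(size=n)

# Meat intake tracks the trait, but has NO direct effect on death.
meat = 0.8 * unhealthy + rng.normal(size=n)

# Death risk depends only on the trait (smoking, drinking, inactivity...).
p_death = 1 / (1 + np.exp(-(unhealthy - 2)))
died = rng.random(n) < p_death

high_meat = meat > np.median(meat)
print(f"mortality in high-meat half: {died[high_meat].mean():.3f}")
print(f"mortality in low-meat half:  {died[~high_meat].mean():.3f}")
# The high-meat half dies noticeably more often, purely via confounding.
```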
If a situation like that was at play in this study, and protein
intake was a surrogate for health-unconsciousness the same way meat
tends to be, we’d expect to see the folks in the high-protein group
fitting a similar anti-health profile—poorer diets overall, more risky
behaviors. In turn, that would mean the study’s results could’ve been
biased against the high-protein consumers due to all that residual
confounding.
So was that the case?
Unfortunately, the paper doesn’t make it very easy to answer that
question. There’s no data for biggies like drinking or exercise in the
paper’s “participant characteristic” list. But we can see that the high-protein group actually had relatively fewer
smokers than the low-protein group (18.2 percent versus 21.8 percent
for current smokers; 37.8 percent versus 39.8 percent for former
smokers), and that the high-protein group reported a lower
calorie intake than the low-protein group (though heaven knows if
that’s accurate). In addition, more people in the high-protein group
than the low-protein group reported trying to lose weight during the
past year (43.9 percent versus 37.5 percent), as well as changing their
diet for health reasons (29.3 percent versus 15 percent). But it’s hard
to say whether that’s a reflection of greater health awareness or poorer
health when the study started.
What can we piece together from that?
Here’s my take. Contrary to what we might assume, the deck probably wasn’t
stacked against the high-protein eaters from the start. If anything,
the study’s confounders should have given them an advantage in their
health outcomes. And I think that possibility is supported by more than
just the (admittedly sparse) participant characteristics.
Here’s why. When the researchers took their protein
correlations and adjusted for fat and carbohydrate intake (as percent of
total calories), the numbers didn’t budge. That’s pretty
interesting, because this batch of NHANES III surveys happened at the
height of the nation’s fat-phobia, when mainstream thought was that all
fat was bad—regardless of whether it came from something hooved, winged,
or rooted in the dirt. Since adjusting for fat intake didn’t dissolve
the links between protein and mortality, it reduces the likelihood that
fat was acting as a confounder here.
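(If you’re wondering what “adjusting” means mechanically: the macronutrients get entered as covariates alongside protein in the survival model, so protein’s hazard ratio is estimated with fat and carbs held constant. A minimal sketch using the lifelines library in Python, with invented column names and random stand-in data rather than anything from NHANES:)

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1_000

# Stand-in, NHANES-shaped data: one row per participant.
df = pd.DataFrame({
    "protein_pct": rng.normal(16, 4, n),  # % of calories from protein
    "fat_pct":     rng.normal(33, 6, n),
    "carb_pct":    rng.normal(50, 8, n),
    "age":         rng.integers(50, 80, n),
})
df["years"] = rng.exponential(15, n).clip(0.1, 18)  # follow-up time
df["died"] = (rng.random(n) < 0.3).astype(int)      # event indicator

# Cox proportional hazards: every column besides duration/event acts as
# a covariate, so protein's coefficient is "adjusted" for the others.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()  # hazard ratios per covariate
```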
Likewise, protein—at least until this study came out and
ignited terror in omnivorous hearts near and far—has been the only
macronutrient not demonized by any popular diets or mainstream
health authorities. Fat and carbs have received more than their fair
share of bashing over the years, but protein, as far as conventional
thought goes, has clung tightly to its health halo—emerging unscathed
from even the bloodiest of diet wars. (And the perception of “good
protein” certainly includes that from animal sources, thanks in large
part to the USDA’s push to consume our meat and dairy lean. How many
egg-white omelets and skinless chicken breasts have been choked down
in the name of health?)
So again, if we were going to find any bias in the survey data, it’d probably lean towards protein being a good
thing—at least in the eyes of the health-conscious crowd. The fact that
a non-stigmatized macronutrient had such defined links with mortality
cranks up its relevance, in my mind.
Of Mice and Rodent Chow (And Growth Factors and Protein)
Is your brain full yet? Save room, because there’s still
another piece of the study to run through our wringer—and this one’s a
lot more rambunctious and furry. To understand why protein might be
linked to cancer and overall mortality as their human study suggested,
the researchers conducted a series of experiments on mice, feeding them a
range of protein levels mirroring that of the NHANES III participants—4
percent to 18 percent of calories. The prime goal was to see whether
tweaking those protein levels would impact levels of insulin-like growth factor 1 (IGF-1) circulating in the mice’s bodies, as well as cancer incidence and progression.
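(A quick back-of-envelope to translate those percentages into food terms, since protein runs about 4 kcal per gram. This helper is my own illustration, not anything from the paper:)

```python
def protein_grams_per_day(total_kcal, protein_pct, kcal_per_gram=4):
    """Convert protein's share of daily energy into grams of protein."""
    return total_kcal * protein_pct / 100 / kcal_per_gram

# On a 2,000 kcal/day human diet, the study's 4-18% range spans:
for pct in (4, 18):
    grams = protein_grams_per_day(2000, pct)
    print(f"{pct}% of calories = {grams:.0f} g protein/day")
# 4% is a mere 20 g/day (very low); 18% is 90 g/day (pretty ordinary).
```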
But first, let’s back up for a moment and get some context on this whole IGF-1 thing and why it’s so relevant.
As you might’ve seen in some of the news reports, the lead
researcher of this study was Valter Longo—the director of the University
of Southern California’s Longevity Institute, who already has a scroll
of really cool studies under his belt (mostly on fasting and cancer). And he was profiled on “Through the Wormhole” with Morgan Freeman, which ups his awesomeness quotient considerably. Because science.
And in the world of aging research, IGF-1 is a bona fide spotlight stealer. As its name implies, insulin-like growth factor 1 is
a hormone molecularly similar to insulin, with its modus operandi being
“grow, grow, grow!” It promotes growth for nearly every cell in your
body—building muscle, making young’uns get taller, creating new brain
cells, repairing nerve damage, and doing other awesome things that keep
your body cranking like the fabulous machine it is. But IGF-1 is kind of
a double-edged sword. And the bad-slicey side plunges right through the
heart of longevity.
Part of the problem is that, while fulfilling its
growth-promoting duties, IGF-1 doesn’t distinguish between healthy cells
and damaged ones—potentially spurring cancer proliferation and contributing to tumor growth, if the conditions are right. High levels of IGF-1 have been linked to breast cancer, prostate cancer, bladder cancer, colorectal cancer, endometrial cancer, and lung cancer
(though most of that research is observational, so there’s always the
possibility of tumors increasing IGF-1 levels instead of the other way
around, or a third unmeasured variable raising both). On the flip side,
folks with a genetic deficiency in IGF-1 appear nearly immune to cancer—a phenomenon Longo himself has investigated.
Apart from the potential cancer connection, IGF-1 plays a
huge role in the aging process. After all, the cycle of cells growing,
dividing, and repairing is just a fancy way of explaining that they’re
aging—so IGF-1 is pretty much orchestrating how rapidly that happens.
And the evidence comes from more than just the usual rat
and test-tube studies. As far as human data goes, there’s some
interesting research showing a connection between IGF-1 levels and
lifespan when we look at the oldest of the old. A disproportionate
number of centenarians have mutations affecting their IGF-1 receptor activity, which probably plays a role in their long-livedness. Likewise, the offspring of centenarians have lower IGF-1 levels
than others of their age, gender, and BMI—suggesting the hereditary
component of longevity could be due to reduced IGF-1 trickling through a
family’s bloodline. (It’s less useful to look at IGF-1 levels in
centenarians themselves, since the hormone naturally declines with age
and will be pretty low in anyone who reaches the century mark.)
For longevity researchers, there’s an ongoing quest to
“hack” all this life-extending genetic stuff and help us average Joe
Shmoes reap the same benefits. Quite a few things can influence your
body’s levels of IGF-1 beyond genes—everything from your stress level to
your ethnicity to your estrogen status to the time of day—but diet is a
huge determinant, and perhaps the easiest to tweak and control. So it’s
not too surprising that food gets so much attention in this field. And
as Longo was keenly aware of, protein is chief among the IGF-1 governors
we consume.
But the life-extension hunt is still a work in progress. For at least 60 years, the darling of longevity seekers was calorie restriction (CR)—with the first case of its life-extension properties appearing in 1935, when an experiment showed that rats lived longer if their energy intake was reduced. Ditto for mice, flies, crustaceans, and yeast, more studies revealed. And subsequent research showed the same longevity effect in calorie-restricted roundworms, which idled in their larval “dauer”
stage instead of catapulting towards maturity like usual. (Which made
their lives longer, but not necessarily more enjoyable: the dauer stage
consists mostly of not eating, not reproducing, and sitting like a lump
on a bump until food supply becomes abundant. Even if we humans had a
dauer stage, I can’t imagine wanting to stay there for very long. It
sounds too much like high school.)
The reasons behind calorie restriction’s perks? A biggie was thought
to be its suppressive effect on growth hormone and IGF-1—essentially
slowing down aging and age-related diseases. Calorie-restricted
organisms had much lower levels of IGF-1 than their more abundantly fed
counterparts, at least in the creatures and fungi that’d been studied up
to that point. And those reduced IGF-1 levels seemed to help protect
cells from DNA damage—a boon for warding off cancer. It made sense:
there’s a huge evolutionary and survival advantage to halting growth in
times of food scarcity.
Although there wasn’t controlled data available for humans (we live
too darn long to make lifespan studies an easy feat), the benefits of
calorie restriction were expected to be universal. And thus emerged an
era of books, gurus, theories, and research funding all pouring towards
the promising new field of “CR.”
But soon cracks in the calorie-restriction theory started appearing. More comprehensive rodent studies, including one looking at 41 different strains of mice, found that calorie restriction shortened the
lifespan in more strains than it extended. Likewise, in wild mice as
opposed to massively lab-domesticated ones, a lower energy intake did nada for average life expectancy
(though it did curtail cancer rates). A 25-year rhesus monkey
study—whose completion the longevity world had awaited with bated
breath—failed to show any life-extension benefit from feeding them
less. And while studies on calorie-restricted humans weren’t far enough
along to offer mortality data, the existing numbers showed their IGF-1 levels were pretty much the same as everyone else’s, casting doubt on the hope those earlier rodent and yeast and worm studies would be translatable to humans.
What the heck was going on?
Eventually it emerged that calorie restriction, for most species, was only effective if it also restricted protein intake.
And as the study gods breathed more and more research into being, it
seemed all that deliberate hunger might be for naught. Protein
restriction alone could push down IGF-1 levels
and spark a cascade of longevity-enhancing changes. (In case you’re
wondering, neither fat restriction nor carbohydrate restriction seemed
to increase lifespan, at least in rodent models.)
But it didn’t end there! A new wave of studies zeroed in on methionine, a
sulfur-containing amino acid abundant in muscle meat and eggs. In mice,
restricting methionine—without reducing calories—was enough to increase lifespan and induce health perks
like slowing down immune aging, improving blood glucose, reducing IGF-1
and insulin levels, and protecting organ cells from oxidative damage.
The reason? It appeared to be twofold: methionine tends to generate
toxic byproducts—
And then the plot turned once more! Seriously, this saga had more
twists than a pretzel factory. A fascinating but woefully little-known
study in 2011 showed that in mice, supplementing with glycine—an amino
acid found abundantly in connective tissue and gelatin and bone
broth—had the exact same life-extending effect
as restricting methionine. Without reducing calories or other amino
acids, glycine supplementation increased the rodents’ lifespan, reduced
fasting glucose and insulin, decreased IGF-1 levels, and nearly halved
their triglycerides—the very perks that’ve variously been attributed to
calorie restriction, protein restriction, and methionine restriction.
Let me make it abundantly clear: THIS IS HUGE.
If the glycine finding translates over to humans (which I strongly
suspect it does), life-extension-seekers may be barking up the wrong
tree—or at least an unnecessarily complicated one—by trying to
selectively reduce animal protein in order to live longer, as Longo
seems to support. A wiser method could be simply getting a more
“biologically appropriate” balance of amino acids than the standard
Western diet typically provides. That means eating more of the
glycine-rich foods that’ve been gradually stripped from our menus—things
like skin, bones, tendons, tripe, feet, hooves, ears, connective
tissue, and some organ meats—and less of the muscle meat typically
dominating our fridges and freezers.
So to put all that in a Reader’s Digest version, the history of life-extension research went something like this:
“Calorie restriction extends rats’ lifespans. We must eat less to live longer!”
“Wait… reducing protein without reducing calories does the same thing. We must eat less protein to live longer!”
“Well I’ll be darned! The whole ‘life extension’ thing works just by
limiting methionine. Methionine bad. Other amino acids okay! Down with
meat!”
And now, it seems we’re at yet another crossroads—one where
methionine intake may become less important than its balance and
interaction with other nutrients, especially glycine.
*Note: This is deliberately oversimplified and lots of other
really interesting discoveries happened. But heaven knows this blog post
doesn’t need to be even longer than it already is.
Now back to Longo’s mice.
In a nutshell, the researchers put groups of mice on
different experimental diets: one relatively high protein (18 percent of
total calories) and one very low (4 to 7 percent of total calories).
Then the mice were injected with cancer cells—melanoma for one
experiment, breast cancer for another—in order to kick off the
tumor-growing process.
Now brace yourself for some China Study déjà-vu.
Fifteen days after getting their melanoma implants, all of the mice on
the high-protein diets had developed measurable tumors—compared to only
80 percent of the low-protein group. (Over the course of the experiment,
that number rose to 90 percent, but never got any higher.) What’s more,
the low-protein group’s tumors seemed to grow at a much slower rate: by
the final day of the experiment, the average tumor size in the
high-protein group was 78 percent bigger than in the low-protein group.
When that experiment was repeated with breast cancer cells,
the results were pretty similar—except the tumor rate of the
low-protein group maxed out at 70 percent of the mice, while the
high-protein group was universally tumor-stricken.
And to tie IGF-1 back into the picture, the low-protein mice—as we
might expect—had significantly lower levels than their protein-gorging
brethren.
So there’s the gist. Is it a legitimate strike against eating lots of protein?
There’s one major reason I’m reluctant to draw any conclusions from all this (apart from the whole we aren’t mice
thing). And that reason is called “AIN-93G standard chow.” That’s the
name of the lab diet used for the high-protein mice, according to some
notes in the paper’s supplement. You can download the AIN-93G specs here, but if you’d like to save yourself the effort (and hard drive space), here are the top six ingredients (amounts per kilogram of diet):
- Corn starch (397 g)
- Casein (200 g)
- Maltodextrin (132 g)
- Sucrose (100 g)
- Soybean oil (70 g)
- Cellulose (50 g)
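(To put that list in macronutrient terms, here’s a rough energy-share tally. I’m treating casein as pure protein and ignoring the minor ingredients below the top six, so the numbers are approximate:)

```python
# Grams per kilogram of diet, straight from the list above.
carb_g    = 397 + 132 + 100   # corn starch + maltodextrin + sucrose
protein_g = 200               # casein, treated here as pure protein
fat_g     = 70                # soybean oil
# Cellulose is indigestible fiber, so it contributes no calories.

kcal = {"carb": carb_g * 4, "protein": protein_g * 4, "fat": fat_g * 9}
total = sum(kcal.values())
for macro, k in kcal.items():
    print(f"{macro}: {k} kcal ({100 * k / total:.0f}% of energy)")
# protein comes out around 20% of energy, right in the neighborhood of
# the paper's "18 percent of calories" high-protein condition.
```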
A lot of things are wrong with this picture, such as “where are the food things?”—but
for the sake of brevity, I’m just going to focus on that second
ingredient: casein. It’s one of the major proteins in milk, and it’s got
an awful track record for promoting tumor growth more than other types of protein, including its dairy-derived cousin whey.
I’ve already written tomes on casein, cancer, and rodents in previous blog entries—including my Forks Over Knives critique and China Study critique—so I won’t torture you by rehashing it all here. Chris Masterjohn also has some awesomesauce posts
on the subject, so hop on over there if you’re insatiably curious about
it all. The bottom line is that when we look at the mice-and-protein
studies outlined in Longo’s paper, this is what we’re dealing
with: a cocktail of purified ingredients, with the protein component
being a well-known promoter of cancer in rodents. It’s not at all
surprising that the mice eating the most of it sprouted tumors like mad.
But it’s impossible to say how much of that’s due to protein per se, or
to casein—especially casein that’s been stripped of all the other
goodies in dairy and tossed into a party bag of refined junk.
Putting It All Together
For those of us in the ancestral, paleo, “real food,” low carb, and
other related communities, there’s a tendency to see a study like this
and be like RAAWRRR KILL IT BEFORE IT BREEDS at the first whiff of its
correlation-is-causation tone. And as someone who generally places
epidemiology in the ninth circle of Research Hell, I’ve certainly been
guilty of that myself. But one of the biggest gifts of observational
studies like this one is the opportunity to explore new hypotheses and
test out perspectives that challenge what we believe.
I think that’s definitely the case here.
Think of it this way. For most of human history, dietary consistency
was a fairy tale. Famines struck. Periods of scarcity tangoed with those
of abundance. We gorged on energy-dense foods like meat when they
became available, knowing full well we might not be so lucky the next
day or week. And to be sure, we ate the whole freakin’ animal after a
kill—not just the skeletal muscle.
Constant abundance and pickiness are absolutely new to our bodies,
even for those of us eating foods we deem ancient or ancestral. So it’s
really not all that far-fetched to think that America’s animal protein
habits—heavy on the methionine-rich muscle meats, scant on the glycine,
swimming in ceaseless surplus instead of punctuated with scarcity—could
be a problem for our health.
Perhaps it’s not a coincidence that many of the world’s
longest-living populations eat fairly low-methionine diets or
periodically abstain from protein-rich foods (like in Ikaria,
where the predominantly Orthodox Christian residents cyclically fast
from animal products). And perhaps just as relevant as the types
of foods we eat is the manner in which we eat them—nose-to-tail for
animals, with some plant-only days thrown in for good measure.
That doesn’t mean the solution is to go vegan. Nor is it necessarily
to eat a low-animal-protein diet. But perhaps it’s time to seriously
explore options like protein cycling, periodic fasting, or just cooking
up heaps o’ bone broth to get that glycine down our gullets.
Just to be clear, nothing I’ve written here—even my moments of
quasi-defending this study—changes the fact that the NHANES III data is
observational and the diet recalls are basically handicapped from the
start, thanks to the history-revising sinkhole that is the human mind.
As always, correlation isn’t causation. It’s pretty
disappointing that the study’s own researchers seemed to forget that.
The reason I’m not sliding this study straight into the slush pile is
because regardless of its validity, it at least opens the door to some
important discussion. The bigger point is that the trends it excavated
and hypotheses it explored could feasibly be real—evolutionarily,
biologically, logically. In my opinion, the greatest value of this
study, then, is its role as a springboard for breaking out of the
comfort zone of what we think—and want—to be true.
Otherwise, I guess it could make a nice doorstop.