Cutting through the Paleo hype


Fad diets come and go. One of the most popular fad diets of recent times is Paleo.

The Palaeolithic diet, also called the ‘Stone Age diet’ or simply ‘Paleo’, is as controversial as it is popular. It has been growing in popularity over the last few years, and some amazing claims have been made about it by wellness bloggers and celebrity chefs. Advocates like ‘Paleo’ Pete Evans of MKR fame claim that the Palaeolithic diet could prevent or cure polycystic ovarian syndrome, autism, mental illness, dementia and obesity [1].

So what does the published medical literature say? Is there really good research evidence to support the vast and extravagant claims of Paleo?

About 10 months ago, I started reviewing the medical research to try to answer that very question. My review of the medical literature turned up some interesting results, so rather than post it just as a blog, I thought I would submit it to a peer-reviewed medical journal for publication. After a very nervous 9-month gestation of submission, review and resubmission, my article was published today in Australian Family Physician [2].

So, why Paleo, and what’s the evidence?

Why Paleo?

The rationale for the Palaeolithic diet stems from the Evolutionary Discordance hypothesis – that human evolution ceased 10,000 years ago, and our stone-age genetics are unequipped to cope with our modern diet and lifestyle, leading to “diseases of civilization” [3-9]. Thus, only foods that were available to hunter-gatherer groups are optimal for human health – “could I eat this if I were naked with a sharp stick on the savanna?” [10] Therefore meat, fruits and vegetables are acceptable, but grains and dairy products are not [11].

Such views have drawn criticism from anthropologists, who argue that there is no blanket prescription of an evolutionarily appropriate diet, but rather that human eating habits are primarily learned through behavioural, social and physiological mechanisms [12]. Other commentators have noted that the claims of the Palaeolithic diet are unsupported by scientific and historical evidence [13].

So the Palaeolithic diet is probably nothing like the actual palaeolithic diet. But pragmatically speaking, is a diet sans dairy and refined carbohydrates beneficial, even if it’s not historically accurate?

Published evidence on the Palaeolithic Diet

While the proponents of the Palaeolithic diet claim that it’s evidence based, there are only a limited number of controlled clinical trials comparing the Palaeolithic diet to accepted diets such as the Diabetic diet or the Mediterranean diet.

Looking at the studies as a whole, the Palaeolithic diet was often associated with increased satiety independent of caloric or macronutrient composition. In other words, gram for gram, or calorie for calorie, the Paleo diets tended to make people feel fuller, so they tended to eat less. Of course, that may also have been because the Paleo diet was considered less palatable and more difficult to adhere to [14]. A number of studies also showed improvements in body weight, waist circumference, blood pressure and blood lipids. Some studies showed improvements in blood sugar control, and some did not.

The main drawback of the clinical studies of Paleo is that they were short, used different designs, and recruited too few subjects to give them much statistical power. The strongest of the studies, by Mellberg et al, showed no long-term differences between the Palaeolithic diet and a control diet after two years [15].

The other thing to note is that, in the studies that measured them, there was no significant difference in inflammatory markers as a result of consuming a Palaeolithic diet. So supporters of Paleo don’t have any grounds to claim that Paleo can treat autoimmune or inflammatory diseases. No clinical study on Paleo has looked at mental illness or complex developmental disorders such as autism.

Other factors also need to be considered when thinking about Paleo. Modelling of the cost of the Palaeolithic diet suggests that it is approximately 10% more expensive than an essential diet of similar nutritional value, which may limit Paleo’s usefulness for those on a low income [16]. Calcium deficiency also remains a significant issue with the Palaeolithic diet, with the study by Osterdahl et al (2008) demonstrating a calcium intake of about 50% of the recommended dietary intake [17]. Uncorrected, this could increase a patient’s risk of osteoporosis [18].

To Paleo or not to Paleo?

The bottom line is the Paleo diet is currently over-hyped and under-researched. There are some positive findings, but these positive findings should be tempered by the lack of power of these studies, which were limited by their small numbers, heterogeneity, and short duration.

If Paleo is to be taken seriously, larger independent trials with consistent methodology and longer duration are required to confirm the initial promise in these early studies. But for now, claims that the Palaeolithic diet could treat or prevent conditions such as autism, dementia and mental illness are not supported by clinical research.

If you’re considering going on the Palaeolithic diet, I would encourage you to talk with an accredited dietitian or your GP first, and make sure that it’s right for you. Or you could just eat more vegetables and drink more water, which is probably just as healthy in the long run, but without the weight of celebrity expectations.

Comparison of the current Australian Dietary Guidelines Recommendations [19] to the Palaeolithic diet [17]

| Australian Dietary Guidelines | The Palaeolithic Diet |
| --- | --- |
| Enjoy a wide variety of nutritious foods from these five groups every day: | |
| Plenty of vegetables, including different types and colours, and legumes/beans | Ad libitum fresh vegetables and fruits |
| Fruit | |
| Grain (cereal) foods, mostly wholegrain and/or high cereal fibre varieties, such as bread, cereals, rice, pasta, noodles, polenta, couscous, oats, quinoa and barley | All cereal/grain products prohibited, including maize and rice |
| Lean meats and poultry, fish, eggs, tofu, nuts and seeds, and legumes/beans | Ad libitum lean meats and poultry, fish, eggs, tofu, nuts and seeds, but all legumes prohibited |
| Milk, yoghurt, cheese and/or their alternatives, mostly reduced fat (reduced fat milks are not suitable for children under 2 years) | All dairy products prohibited |
| And drink plenty of water. | Ad libitum water (mineral water allowed if tap water unavailable) |

References

[1]        Duck S. Paleo diet: Health experts slam chef Pete Evans for pushing extreme views. Sunday Herald Sun. 2014 December 7.
[2]        Pitt CE. Cutting through the Paleo hype: The evidence for the Palaeolithic diet. Australian Family Physician 2016 Jan/Feb;45(1):35-38.
[3]        Konner M, Eaton SB. Paleolithic nutrition: twenty-five years later. Nutrition in clinical practice : official publication of the American Society for Parenteral and Enteral Nutrition 2010 Dec;25(6):594-602.
[4]        Eaton SB, Eaton SB, 3rd, Konner MJ. Paleolithic nutrition revisited: a twelve-year retrospective on its nature and implications. European journal of clinical nutrition 1997 Apr;51(4):207-16.
[5]        Eaton SB, Konner M. Paleolithic nutrition. A consideration of its nature and current implications. The New England journal of medicine 1985 Jan 31;312(5):283-9.
[6]        Kuipers RS, Luxwolda MF, Dijck-Brouwer DA, et al. Estimated macronutrient and fatty acid intakes from an East African Paleolithic diet. The British journal of nutrition 2010 Dec;104(11):1666-87.
[7]        Eaton SB, Konner MJ, Cordain L. Diet-dependent acid load, Paleolithic [corrected] nutrition, and evolutionary health promotion. The American journal of clinical nutrition 2010 Feb;91(2):295-7.
[8]        O’Keefe JH, Jr., Cordain L. Cardiovascular disease resulting from a diet and lifestyle at odds with our Paleolithic genome: how to become a 21st-century hunter-gatherer. Mayo Clinic proceedings 2004 Jan;79(1):101-08.
[9]        Eaton SB, Eaton SB, 3rd, Sinclair AJ, Cordain L, Mann NJ. Dietary intake of long-chain polyunsaturated fatty acids during the paleolithic. World review of nutrition and dietetics 1998;83:12-23.
[10]      Audette RV, Gilchrist T. Neanderthin : eat like a caveman to achieve a lean, strong, healthy body. 1st St. Martin’s Press ed. New York: St. Martin’s, 1999.
[11]      Lindeberg S. Paleolithic diets as a model for prevention and treatment of Western disease. American journal of human biology : the official journal of the Human Biology Council 2012 Mar-Apr;24(2):110-5.
[12]      Turner BL, Thompson AL. Beyond the Paleolithic prescription: incorporating diversity and flexibility in the study of human diet evolution. Nutrition reviews 2013 Aug;71(8):501-10.
[13]      Knight C. “Most people are simply not designed to eat pasta”: evolutionary explanations for obesity in the low-carbohydrate diet movement. Public understanding of science 2011 Sep;20(5):706-19.
[14]      Jonsson T, Granfeldt Y, Lindeberg S, Hallberg AC. Subjective satiety and other experiences of a Paleolithic diet compared to a diabetes diet in patients with type 2 diabetes. Nutrition journal 2013;12:105.
[15]      Mellberg C, Sandberg S, Ryberg M, et al. Long-term effects of a Palaeolithic-type diet in obese postmenopausal women: a 2-year randomized trial. European journal of clinical nutrition 2014 Mar;68(3):350-7.
[16]      Metzgar M, Rideout TC, Fontes-Villalba M, Kuipers RS. The feasibility of a Paleolithic diet for low-income consumers. Nutrition research 2011 Jun;31(6):444-51.
[17]      Osterdahl M, Kocturk T, Koochek A, Wandell PE. Effects of a short-term intervention with a paleolithic diet in healthy volunteers. European journal of clinical nutrition 2008 May;62(5):682-85.
[18]      Warensjo E, Byberg L, Melhus H, et al. Dietary calcium intake and risk of fracture and osteoporosis: prospective longitudinal cohort study. BMJ 2011;342:d1473.
[19]      National Health and Medical Research Council. Australian Dietary Guidelines. Canberra: National Health and Medical Research Council; 2013.

Gardasil – saves your cervix, does nothing to your ovaries

Vaccine myths are like the fart smell that somehow gets trapped in your car’s air-conditioning. They both seem to keep going around and around, reappearing at random, and are similarly fetid.

Doing the rounds of the social media sites this week is the old chestnut that Gardasil, the human papilloma virus vaccine, caused a teenage girl’s ovaries to implode, and that Merck, that rich powerful conglomerate of evil, conveniently forgot to investigate the effects of the vaccine on the female reproductive system.

Actually, this is old news. I wrote a couple of blogs in the past about Gardasil conspiracy theories, including one about the whole Gardasil-kills-ovaries thing (and another here). In the last couple of years, nothing much has really changed, well, except that the benefit of the HPV vaccine has become much clearer, and the hysteria of the anti-vaxxers has become more pronounced as a result.

For example, the article that’s recently been making the rounds is a 2013 article by Jonathan Benson at Natural News.  This particular article was discussing a paper published in the highly esteemed British Medical Journal [1] (which you can read for yourself here). Benson’s opening paragraph shows how ignorant and/or biased anti-vaccine proponents can be.

“A newly-published study has revealed that Merck & Co., the corporate mastermind behind the infamous Gardasil vaccine for human papillomavirus (HPV), conveniently forgot to research the effects of this deadly vaccine on women’s reproductive systems. And at least one young woman, in this case from Australia, bore the brunt of this inexcusable failure after discovering that her own ovaries had been completely destroyed as a result of getting the vaccine.”

There are a couple of big errors here.  First, the article in the BMJ isn’t a study, merely a case report.  There’s a big difference, namely the fact that a case report is just that, a report of a single case.  It’s not a study that proves that one thing causes another, but merely raises the possibility that there might be something going on that other peers should be aware of or further investigate.  The lack of definitive proof didn’t stop Benson from making his other big error, leaping to a rather tenuous conclusion that this girl’s ovaries imploded because of Gardasil.

In actual fact, Premature Ovarian Insufficiency (or POI) was known about long before the Gardasil vaccine was invented. In 1986, the known incidence was about 1 in 10,000 young women between the ages of 15 and 29 [2], and there’s no known cause in more than 90% of cases.

So, if Gardasil was one cause of ovary implosion in young women, then it stands to reason that the rate of ovary implosion would be much higher after the introduction of Gardasil.  Is that the case?

As it turns out, the answer is no.  A big fat no.  According to the Therapeutic Goods Administration in Australia, the number of Gardasil doses that have been administered in Australia has been more than 9 million.  The number of reports of ovary implosion? Three.  Just three.

That works out to be a rate of about 0.003 per 10,000.

That’s quite a lot less than the rate of ovary implosion before Gardasil was invented.  Maybe Gardasil protects your ovaries rather than destroys them.
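If you want to check the back-of-the-envelope arithmetic behind those two rates, it only takes a few lines (using the round figures quoted above, 3 reports over 9 million doses, so the exact TGA totals may differ slightly):

```python
# Back-of-the-envelope check of the reported-POI rate after Gardasil.
# Round figures from the article: 3 reports over 9 million doses in Australia.
reports = 3
doses = 9_000_000

rate_per_10k = reports / doses * 10_000
print(f"Reported rate: {rate_per_10k:.4f} per 10,000")  # ~0.0033 per 10,000

# Background incidence of POI in 15-29 year-olds (Coulam et al, 1986): ~1 per 10,000
background_per_10k = 1.0
print(f"Background rate is roughly {background_per_10k / rate_per_10k:.0f}x higher")
```

In other words, the pre-Gardasil background rate of POI is roughly 300 times higher than the rate of POI reports following vaccination, which is why the TGA keeps finding no signal.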

So Gardasil isn’t rendering anyone’s daughters infertile.  The TGA has reviewed this issue a number of times and reached the same conclusion every time … there is NO link between Premature Ovarian Insufficiency and the HPV vaccine.

What the HPV vaccine is doing is reducing the incidence of genital warts and gynaecological cancers. For example, in the years leading up to the introduction of the HPV vaccine, the number of women presenting with genital warts was about 1 in 10. In the four years after the vaccine was introduced, the rate of genital warts had fallen by between 70 and 90%, depending on the age group. The effect was especially obvious in women under the age of 21, whose rate of genital warts dropped from over 1 in 10 to less than 1 in 100 after the introduction of the vaccine.

The rate of pre-cancerous cervical changes also fell, with a study by Gertig and colleagues in 2013 showing that a full three doses of the HPV vaccine decreased the risk of high-grade (that is, nasty pre-cancerous) pap smear changes by nearly 50% [3].

So you won’t hear this from the Natural News team or others of their ilk, but vaccination with the HPV vaccine decreases your risk of genital warts by over 90%, decreases your risk of nasty cervical cancer changes by about 50%, and increases your risk of ovarian implosion by about 0%.

Don’t let the repugnant hot air of the anti-vaxxers put you off.  Vaccination with the HPV vaccine is safe and effective, not harmful like the vaccine myths would have you believe.

References

[1]        Little DT, Ward HR. Premature ovarian failure 3 years after menarche in a 16-year-old girl following human papillomavirus vaccination. BMJ Case Rep 2012;2012.
[2]        Coulam CB, Adamson SC, Annegers JF. Incidence of premature ovarian failure. Obstet Gynecol 1986 Apr;67(4):604-6.
[3]        Gertig DM, Brotherton JM, Budd AC, Drennan K, Chappell G, Saville AM. Impact of a population-based HPV vaccination program on cervical abnormalities: a data linkage study. BMC medicine 2013;11:227.

Christian male modelling


Some love him.  Some hate him.  It doesn’t change the fact that he was still “ridiculously good looking”.

Zoolander was one of those cult movies that polarised people into “absolutely love it” or “absolutely loathe it” camps.  I admit, I’m one of the former.  (“Moisture is the essence of wetness, and wetness is the essence of beauty”  … It still cracks me up!)

For those who aren’t familiar with the story, Derek Zoolander was a top male model who was famous for his different looks: “Blue Steel”, “Ferrari”, “Le Tigre” and the famous “Magnum”. They were all the same pose, of course, but everyone thought they were different. Except for evil fashion designer, Mugatu, who in a burst of rage at the climax of the movie, yells, “Who cares about Derek Zoolander anyway? The man has only one look … Blue Steel? Ferrari? Le Tigra? They’re the same face! Doesn’t anybody notice this? I feel like I’m taking crazy pills!”

There are times when I read Dr Leaf’s social media posts, and I feel the same as Mugatu.

“Dr Leaf isn’t a scientific expert … ‘When we think, we learn because we are changing our genes and creating new ones’ … That’s not scientifically possible! Doesn’t anybody notice this? I feel like I’m taking crazy pills!”


Dr Caroline Leaf is a communication pathologist and a self-titled cognitive neuroscientist.  If Dr Leaf was a legitimate scientist, she would know that our genes do not change when we process new information. Our genes are stable. They do not change unless there’s a mutation, which occurs in one out of every 30 million or so genes. We do not make new genes at will. Last year, scientists at MIT were reported to have shown that DNA breaks when new things are learnt, but in a normal nerve cell, these breaks are quickly repaired. That’s certainly interesting, but that’s not changing the DNA or making new genes. Making claims that we make new genes to hold new information is like saying that pigs fly.

Dr Leaf’s supporters would likely counter that she probably didn’t mean that genes really change, or that we make new genes; she just hasn’t worded her meme properly. Well, there are two responses to that, neither of which reflects any better on Dr Leaf. First, scientists who really are experts don’t make errors so large that you can spelunk through them. Second, this isn’t the first time Dr Leaf has made claims about how our genes fluctuate; she made a similar claim back in September 2014. Saying the same thing several times isn’t a mistake; it shows she really believes that we change our DNA code by the power of our thoughts.

Whether someone thinks DNA is changeable isn’t likely to cause any great harm to that person, but what is concerning is that Dr Leaf has been given her own show on the Christian cable TV network TBN to discuss mental health. She’s already proven that her knowledge of psychiatric medications is dangerously flawed. If Dr Leaf doesn’t know the basics of DNA, then giving her a platform to preach something that can affect whether a person might live or die is particularly perilous.

Dr Leaf’s rise is also a worrying symptom of a Christian church that is intellectually imploding. In a 2013 blog for the Huffington Post, Charles Reid wrote,

“Christians must provide effective witness against both extremes. But before Christianity can engage atheism it must first address the scientific illiteracy in its own house. For the greatest danger Christianity confronts at the present moment is not incipient persecution, but increasing marginalization and irrelevance. If Christians cannot engage reasonably and responsibly with science, there will be no place for them in the public life of advanced societies.”

Reid was paying particular attention to Ken Ham in this blog, but the principle remains the same. Scientifically illiterate Christians quickly lose credibility with people. We can’t meaningfully engage with a person who has even a rudimentary understanding of biology by proudly telling them that we create new genes with the power of thought. That makes us sound like a male model.

For the sake of other Christians’ health and well-being, and for the sake of our credibility and our witness, we need to critically assess Dr Leaf’s work, not promote it as another gospel.

I love this sunburnt country

I love this sunburnt country.
I know there’s been some pains,
when colonists advanced and pillaged
and subdued our coasts and plains.

But white, black, red or yellow,
or whatever your skin may be,
Together we are Australians,
and together, we live free.

Our unity is our strength,
many cultures give us beauty.
Our past may be dark and painful,
but our future’s as bright as can be.

So let’s love this sunburnt country,
Together, let’s take a stand,
To treat everyone as equals,
To extend a welcome hand.

Let’s celebrate this country
And all that makes us tick
Today, and every Australia Day,
Each January twenty-six.

Does helping others help you?

John Holmes wrote “There is no exercise better for the heart than reaching down and lifting people up.”

We all know that exercise is good for us, but is the exercise of the heart, “reaching down and lifting people up” just as good for us?

Dr Caroline Leaf is a communication pathologist and self-titled cognitive neuroscientist.  Her meme of the day today was a claim that “Helping others can increase your lifespan.”  She explained that “Researchers found a link between serving others, improved health and decreased mortality! See more at: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3780662/pdf/AJPH.2012.300876.pdf”.


The journal she referenced was a 2013 article by Poulin et al in the American Journal of Public Health [1].  Poulin and his colleagues examined data from nearly 850 people in the Detroit area.  At the start of their study, they asked their participants about stressful life events in the last year and whether they provided tangible assistance to friends or family members.  They then followed their participants for five years and analysed the characteristics of who died in that time.

According to the study by Poulin, those who helped others were younger, healthier, more likely to be White, of higher socioeconomic status, and higher in social support and social contact than those who didn’t help, all factors that have been shown to influence mortality.  They also noted that 70% of their cohort didn’t experience any stressful life events.  While they adjusted for these variables, their statistics would still be affected by them.  As it turns out, while their results were significant, their numbers had broad confidence intervals, so the effect they found is very weak.

What about other studies looking at the same question in a different way? Well, there are mixed findings. Roth and colleagues published a study in 2013 in the American Journal of Epidemiology which also showed that caregivers had better life expectancy than matched controls, but a number of other studies showed the opposite. The Caregiver Health Effects Study found that those who were providing care to a disabled spouse, and who reported some strain associated with that care, had a 63% elevated risk of death compared with non-caregiving spouses [2]. Other studies suggest that caregivers have poorer mental and physical health status than non-caregivers [3], and caregiving has been widely portrayed as a serious public health problem in the professional literature [4, 5].

So while Poulin found a loose association between helping others and decreased mortality, Dr Leaf has taken that a step too far:

> Firstly, correlation does not equal causation. Just because a study found that those who helped others had decreased mortality doesn’t mean that helping others causes you to live longer. There may be other explanations.
> Secondly, other studies show conflicting results, so Poulin’s study may be a statistical hiccough.

It’s not clear that helping others is actually good for our health. That doesn’t mean we shouldn’t help others. I think we should, if for no other reason than the golden rule, “Do unto others as you would have them do unto you.” But we can’t definitively say that helping others will help us directly by making us live longer. That’s scientifically still up in the air.

References

[1]        Poulin MJ, Brown SL, Dillard AJ, Smith DM. Giving to others and the association between stress and mortality. Am J Public Health 2013 Sep;103(9):1649-55.
[2]        Schulz R, Beach SR. Caregiving as a risk factor for mortality: the Caregiver Health Effects Study. JAMA : the journal of the American Medical Association 1999 Dec 15;282(23):2215-9.
[3]        Pinquart M, Sorensen S. Differences between caregivers and noncaregivers in psychological health and physical health: a meta-analysis. Psychol Aging 2003 Jun;18(2):250-67.
[4]        Talley RC, Crews JE. Framing the public health of caregiving. Am J Public Health 2007 Feb;97(2):224-8.
[5]        Centers for Disease Control and Prevention. Caregiving, A Public Health Priority.  2010, 7 Dec 2010 [cited 2016 Jan 16]; Available from: http://www.cdc.gov/aging/caregiving/index.htm

Mobile phone mothering – one more thing for mums to feel unnecessarily guilty about

Mothers.  They are probably the single most important group of people in the world.

It’s not that I’m belittling the role of fatherhood, or demeaning the amazing work that fathers do for their children, but simply put, we wouldn’t be here if it wasn’t for the tireless patience and sacrifice of our mums.  Nine months of nausea, sore breasts, swollen appendages and having your organs used as punching bags.  Then there’s the trauma of birth itself, which is rewarded with the full-time care of a screaming, incessantly ravenous alimentary canal which has taken the form of a baby.  Over the years, the screaming and the pooping become slightly more manageable, but most mothers remain the head chef, playmate, laundromat, ironing lady, teacher, taxi-driver, nurse and drill sergeant for their offspring.

Despite these daily feats of amazement, most mothers are haunted by this nagging sense of not being good enough – Mother Guilt.  As author Mia Redrick wrote,

“Mother’s guilt is real. Nearly all of us experience it. We are racked with guilt, feeling that our best isn’t good enough. We struggle when work commitments prevent us from attending school events and we are crushed by the looks of disappointment on our children’s faces. We wonder if choices we have made, such as what school to send our kids to, have not had far-reaching negative consequences, if a different path would have resulted in happier, more well-adjusted kids. We moms might feel guilty when we can’t afford something for our kids or are nagged by the feeling that we simply don’t spend enough time with them.”

Mothers seem to feel guilty about anything, and everything, for the whole day …

“The kids are in the bed again. I was sure I shushed them back to their beds at 2am, they must have snuck in during the wee hours. Tonight I will make sure they sleep all night in their own beds. How will they ever learn to sleep if I keep letting them come in to my bed?”
“Whose children get only eight hours of sleep a night? I am sure at this age they are meant to be getting 12 – 14 hours sleep. I am going to damage them for life. Maybe I should let them sleep in my bed so they get more sleep?”
“Oh so much sugar in EVERYTHING.  Don’t you read the articles? Don’t you hear the “experts”? Don’t you see those diagrams with spoonful upon spoonful of the deadly substance displayed, a visual representation of poison imprinted on your mind each and every time you take the bran flakes from the cupboard?”

And so it goes on.

Today, Dr Leaf added one more thing for mothers to feel guilty about – smartphones.


“Mothers, put down your smartphones when caring for your babies! That’s the message from researchers, who have found that fragmented and chaotic maternal care can disrupt proper brain development, which can lead to emotional disorders later in life.”

She then exhorted her followers, “Lets get some real eye-to-eye contact going – dads included!”

Dr Caroline Leaf is a communication pathologist and self-titled cognitive neuroscientist.  Credit where credit’s due – in the past, Dr Leaf has pathologically avoided citing her references, but today, she cited the article itself and the news story that promoted it.

But again, like the meme she posted a couple of days ago about sadness making people sick, Dr Leaf has posted the opening paragraph of a promotional PR puff piece and made it sound like a scientific pronouncement.  When you actually read the journal article that the news story is promoting, it has nothing to do with smartphones.  Or indeed, human beings.

The research was performed entirely on rats.

The research itself, by Molet and colleagues [1], seemed entirely legitimate.  The rat pups raised in a more chaotic way appeared to have higher levels of anhedonia, because they didn’t engage as much in the things that rats normally find pleasurable, namely, drinking sugar water or playing with their rat buddies.

I’m not sure if you’ve ever seen a mother rat on a smartphone. I certainly haven’t, which means the news article Dr Leaf took her meme from, the one published on Science Direct, relies on some pretty tenuous assumptions:

  1. Chaotic mothering to rat pups is the cause of rat anhedonia
  2. Rat mothering and human mothering have similar outcomes
  3. Smartphone use causes fragmented and chaotic maternal care
  4. Not using smartphones would improve outcomes.

There’s no evidence from this study, or any work that I know of, that definitively proves any one of these things.  There are a number of alternative explanations as to why those rat pups weren’t as happy as the control group, but even if the chaotic nurturing of the rat babies was THE cause of their unhappiness, human beings are completely different to rats in cages.  And there are many things, other than smartphones, that can strain the mother-baby relationship.  Excessive mother guilt for one.

Dr Leaf’s meme is a good example of just how misinformation can spread quickly through the internet.  The PR department of a university writes a puff piece on the article to promote the university and its research.  But no one wants to read about depressed rats – they need a better hook.  There’s a love-hate relationship with smartphones in our culture, and lots of Mommy-guilt, so they use a sentence about smartphones and mothering to grab people’s attention, even though the journal article had nothing to do with either.

Science Direct then simply republished the press release from the university without filtering it, where it was picked up by wannabe scientists and self-titled experts like Dr Leaf, who passed on the misinformation to hundreds of thousands of their followers. Pretty soon, mothers everywhere are feeling guilty about looking at their phone instead of their children’s eyes, when it probably doesn’t make a blind bit of difference.

The take home messages:

  1. Unless you’re a rat, there’s no evidence that using your smartphone makes you a bad mother.
  2. Be wary of social media memes, and what you read on the internet.
  3. Dr Leaf is hurting her own credibility by reposting the opening paragraphs of sciencey promotional PR articles instead of reading the actual article first. We need experts to reduce the amount of misinformation clogging the internet, not increase it.

References

[1]        Molet J, Heins K, Zhuo X, et al. Fragmentation and high entropy of neonatal experience predict adolescent emotional outcome. Translational psychiatry 2016;6:e702.

Does sadness make you sick?


We’ve all heard of being “homesick”, or “heartsick”, or “lovesick”.   Sometimes when we’re extremely sad, we feel the knot in our stomachs, the pressure in our chests, or the confusion and distraction in our minds as the waves of sadness wash over and discombobulate us.

But can being sad really make you physically ill as well as emotionally distraught?

Dr Caroline Leaf declared today on her social media platforms that “Feeling sad can alter levels of stress-related opioids in the brain and increase levels of inflammatory proteins in the blood that are linked to increased risk of comorbid diseases including heart disease, stroke and metabolic syndrome.”

Dr Caroline Leaf is a communication pathologist and a self-titled cognitive neuroscientist.  She believes that our cognitive stream of thought determines our physical and mental health, and can even influence physical matter through the power of our minds.

She also added some further interpretation to her meme: “So this is more evidence that our thoughts do count: they have major epigenetic effects on the brain and body! We need to apply the principles in the Bible and listen to the Holy Spirit – no excuses this year!”

With all due respect to Dr Leaf, the study she quotes doesn’t prove anything of the sort.

Dr Leaf’s meme is a copy and paste of the opening paragraph of a news report published by the university’s PR people to promote their faculty.  This isn’t a scientific summary, it’s a hook to draw attention to an article which amounts to a PR puff piece.  If Dr Leaf had read further into the article, I don’t think she would have been quite so bold in claiming what she did.

The article discussed a study by Prossin and colleagues, published in Molecular Psychiatry [1].  You can read the original study here.  The study specifically measured changes in the activity of the opioid neurotransmitter system, and in levels of the pro-inflammatory cytokine IL-18, across two experimentally induced mood states and in two different groups of volunteers: people with depression, and those without.

For a start, it’s important to note that the study isn’t referring to normal day-to-day sadness.  This was an experimentally induced condition in which a sad memory was rehearsed so that the same feeling could be reproduced in a scanner, and the study was looking at the effect of this sad “mood” on people who were pathologically sad, that is, people diagnosed with major depression.

It’s well known that people with depression are at a higher risk of major illnesses, such as heart attacks, strokes and diabetes [2].  The current study by Prossin et al looked experimentally at one possible link in the chain, a link between a neurotransmitter system that’s thought to change with emotional states, and one of the chemical mediators of inflammation.

They found that:

  1. Depressed people were much sadder to start with, and remained so throughout the different conditions.  The depressed people stayed sadder in the ‘neutral’ phase, and the healthy cohort couldn’t catch them in the ‘sad’ phase.
  2. Depressed people had a much higher level of the inflammatory marker to start with, and interestingly, this level dropped significantly with the induction of the neutral phase and the sad phase.  What was also interesting was that the level of the inflammatory marker was about the same in the baseline and the sad phase for the healthy volunteers.
  3. A completely different pattern of neurotransmitter release was seen in the two groups.  People with depression had an increase in neurotransmitter release over a large number of areas of the brain, whereas in the healthy controls with normal mood, the sad state actually resulted in a decreased amount of neurotransmitter release, and in a much smaller area within the brain.  This suggests that the opioid neurotransmitter system in the brains of depressed people is dysfunctional.

Affect/Sadness Scores – Prossin et al, Molecular Psychiatry 2015 Aug 18.

IL-18 v Mood state/diagnosis – Prossin et al, Molecular Psychiatry 2015 Aug 18.

Effectively, the results of the study reflect what’s already known – the emotional dysregulation seen in people with depression is because of an underlying problem with the brain, not the other way around.  And, sadness in normal people is not associated with a significant change in the evil pro-inflammatory cytokine.

So, according to Prossin’s article,

  1. normal sadness in normal people is not associated with physical illnesses.
  2. sadness is abnormally processed in people who are depressed, which may be related to an abnormal inflammatory response, which might explain the known link between depression and increased risk of illness.

The article is not “more evidence that our thoughts do count.”  If anything, it shows that underlying biological processes are responsible for our thoughts and emotions and their downstream effects, not the thoughts and emotions themselves.

And unfortunately, it appears that Dr Leaf hasn’t got past the opening paragraph of a puff piece article before jumping to a conclusion which only fits her worldview, not the actual science.

References

[1]        Prossin AR, Koch AE, Campbell PL, Barichello T, Zalcman SS, Zubieta JK. Acute experimental changes in mood state regulate immune function in relation to central opioid neurotransmission: a model of human CNS-peripheral inflammatory interaction. Molecular Psychiatry 2015 Aug 18.
[2]        Clarke DM, Currie KC. Depression, anxiety and their relationship with chronic diseases: a review of the epidemiology, risk and treatment evidence. Med J Aust 2009 Apr 6;190(7 Suppl):S54-60.