2016: A New Hope

“Hope. It is the quintessential human delusion, simultaneously the source of your greatest strength, and your greatest weakness.” – The Architect, The Matrix Reloaded

I confess, sometimes I can be a little bit rigid.  And grumpy.

Every New Year’s Eve, I get a teensy bit frustrated by the vague aspirations that adorn social media statuses everywhere.  From the self-realisation types …

“Let’s make 2016 the best year ever / I’m gonna take 2016 to the next level / Be the love, feel the power, live the life, bask in the light”

through to the typical vague self-improvement ones …

“This year I’m gonna lose weight / stop smoking / be nicer to people / save more / give more / love more / exercise more / eat less …”

It’s all a bit too much for my inner cynic.

My pragmatic cynic dismissed them as pointless. “These aspirations that people post are just pathetic; they won’t benefit anyone.  Goals need to be SMART – Specific, Measurable, Attractive, Realistic and Time-Framed.  Why bother with anything else?”

The activist cynic chimed in, “Honestly, there are so many other more important things … who cares about ‘going to the next level’ when you’re being conned every day by charlatans and snake-oil salesmen.”

My core cynic was like, “What’s the fuss anyway? The transition into 2016 holds no more meaning than the silent segue from March 14th into March 15th, or the almost imperceptible movement of the minute hand as 2:38pm becomes 2:39pm. If we’re going to celebrate one meaningless moment passing, then shouldn’t we extend the same courtesy to all other moments too? Why does the passage of time matter so much more at the stroke of midnight? I bet 11:58pm feels a bit miffed.”

I thought about letting my sceptical trinity loose on this post today, but somehow I felt like it wasn’t quite right.  And then I had a small epiphany – each aspiration represents more than vague self-affirmation and cyclical mediocrity.  Together, they represent hope, and who am I to stifle the incredible power of hope?

The power of hope has been increasingly recognised by secular psychology in recent times.  Hope involves having goals, along with the desire and a plan to achieve them.  Dr Shane Lopez, a leading expert on the psychology of hope, describes hope as “the golden mean between euphoria and fear. It is a feeling where transcendence meets reason and caution meets passion.”

Hope leads to everything from better performance in school to more success in the workplace to greater happiness overall.  There may also be a role for teaching hopefulness in the treatment of depression.

So how can we harness the power of hope?  How can we use hope to make 2016 a better year than 2015?  Hopeful people share four core beliefs:
1. The future will be better than the present.
2. I have the power to make it so.
3. There are many paths to my goals.
4. None of them is free of obstacles.

So if we’re going to engage the power of hope, we need to believe that the future is brighter and it’s within our grasp, so long as we keep moving toward it, in spite of the expected obstacles.

Of course, as the Architect noted in The Matrix Reloaded, hope can sometimes be a weakness.  As Lopez noted, hope needs the right mix of caution and reason, not just passion and transcendence.  If you want to move forward into a better future, you have to keep your feet on the ground.  You need to be aware of those who would take advantage of blind trust.

The conclusion: I’m glad to have my sceptical inner trinity on board, so long as I temper them with a bit more optimism, and maybe an occasional self-affirmation or two.

I hope that 2016 brings you new hope, along with prosperity and peace.

Happy new year everyone!

Bibliography

http://psychcentral.com/blog/archives/2013/03/21/the-psychology-of-hope/

http://psychcentral.com/news/2008/08/19/hope-therapy-for-depression/2778.html

http://wonkyperfectionism.blogspot.com.au/2015/01/new-years-vague-sort-of-aspirations.html

Why we need Christ at the beginning of Christmas


The tinsel has been adorning shopping centres for weeks now, while houses glow with festive spirit and the rainbow of thousands of tiny bulbs.  And yet it’s only now, with Christmas less than a week away, that I’ve had enough of a chance to slow down and contemplate the place of Christmas in the world of 2015.

It’s certainly a different world now than it used to be.  I remember only a few years ago, the meaning of Christmas seemed to be drowning in a rampant flood of commercialism.  This year, the meaning of Christmas seems like it’s being assaulted by rampant secularism on the one hand, and a pervading sense of terrorism-related apprehension on the other.

Jason Wilson recently wrote an opinion piece for The Guardian Australia.  The tone was a bit hubristic, but the conclusion was fair:

“It has long since stopped being a primarily religious event in Western culture, so the secular left does not need to be too concerned about reclaiming Christmas for themselves.  And the way to do that is to insist on the enactment of its deepest meaning for Christians and secularists alike, which is a radical generosity – to refugees, to those who do not share our faith (or lack thereof), and even to our political enemies.”

Wilson is right on both counts: Christmas is, and always has been, about radical generosity, and Christmas has lost its traditional Christian roots.

What I’ve been pondering is this: is it possible to have radical generosity without “Christ” as the first part of “Christmas”?

After all, Christmas is Christmas because of the ultimate example of radical generosity: the Son of God giving himself as a sacrifice to a world that despised, tortured and killed him.  Whether you’re a Christian or an atheist, the moral of the Christmas story is a universal principle that we can all aspire to.

There’s also a lot more about Christmas that can inspire us, especially those of us who do celebrate the deeper spiritual meaning of our Saviour’s birth.

Jesus taught that he was “the way, the truth and the life”.  It seems that the average western Christian has forgotten this fundamental truth.  Jesus gives life a direction, a unity of purpose that should fuse us together into a unified body, inspired by and continually pursuing the truth of the gospel.  Instead, it seems that we’re scattered, running in different directions like spooked horses, ignoring the common truth of the gospel and blindly accepting every alluring pseudo-profound notion, so long as it has a bit of out-of-context scripture mixed in.

Jesus also taught that he was the light of the world.  Paris, Kenya, Nigeria, the Lindt Café, or San Bernardino … it seems that we’re being overwhelmed by darkness.  Evil seems to be touching all corners of the globe at the hands of ISIS, Al-Shabaab, Boko Haram, or just lone wolves with tar-pitch souls and itchy trigger fingers.  It seems that anyone could be a victim of the new terrorism, that no one is ever truly safe.

The thing about darkness is that it’s not a force of its own.  Darkness is only present because of an absence of light.  It’s human to fight darkness with more darkness – radical Muslims have waged war on the West, and the natural human response is to retaliate against other Muslims.  But adding darkness to darkness doesn’t enlighten.  We need to add light.  As Christians, we need to be the light that Jesus shines into the darkest places.

It isn’t easy.  I’m certainly not going to pretend that I have it all worked out, or put myself up as a shining example of love and tolerance.

Not that anyone can do it all on their own either.  It takes thousands of little bulbs to light up a prize-winning Christmas-lights display.  And it takes all of us working as the body of Christ to overcome the darkness.  Whether your bulb is dull and flickering, or burning brightly, if we all give God our best, he will put us together to become the perfect display of his light.

This year, put your little light on display by putting Christ at the beginning of Christmas.

And have a very Merry Christmas (and a safe holiday season)!

Should pregnant women still take antidepressants if they’re depressed? – SSRIs and the risk of autism

As is my usual habit, I sat down tonight to do something useful and wound up flicking through Facebook instead.  Procrastination … avoidance behaviour … yeah, probably.  But at least this time it turned out to be rather useful procrastination, because I came across a science news story on Science Daily about a study linking the use of anti-depressants in pregnancy with an 87% increased risk of autism.

Actually, this is old news.  Other studies have linked the use of some anti-depressants with an increased risk of autism, such as Rai et al in 2013 [1].

The latest study to come out used data from a collaboration called the Quebec Pregnancy Cohort and followed 145,456 children from the time of their conception up to age ten.  In total, 1,045 children in that cohort were diagnosed with autism of some form, which sounds like a lot, but it’s only 0.72% of the cohort, which is actually lower than the currently accepted community prevalence of autism of about 1%.

What the researchers got excited about was the risk of developing autism if the mother took an antidepressant medication at least once during her pregnancy.  Controlling for other variables like the mothers’ age, wealth and general health, a woman who took an anti-depressant during pregnancy had 1.87 times the chance that her baby would end up with ASD, compared to women who did not take an anti-depressant [2].

An 87% increase sounds like an awful lot.  In fact, it sounds like another reason why anti-depressants should be condemned … right?

Well, like all medical research, you’ve got to consider it all in context.

First, you’ve always got to remember that correlation doesn’t always equal causation.  In this particular study, a large number of women were followed, and their children were followed for long enough to capture all of the likely diagnoses.  So that’s a strength.  They also tried to control for a large number of variables when calculating the risk of anti-depressants, which also adds more weight to the numbers.

Although the numbers are strong, studies like these can’t prove that one thing causes another, merely that they’re somehow linked.  It might be that taking anti-depressants causes the brain changes of autism in the foetus, but this sort of study can’t prove that.

Even if the relationship between anti-depressants and ASD were cause-and-effect, what’s the absolute risk?  Given the numbers in the study, probably pretty small.  With a generous assumption that ten percent of the study population was taking anti-depressants, the increase in the absolute risk of a woman taking anti-depressants having a child with ASD is about 0.5%.  Put another way, there would be one extra case of autism for every 171 women who took anti-depressants.
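
For those who like to see the working, here’s a rough sketch of that back-of-the-envelope calculation in Python.  The cohort size, number of ASD cases and the 1.87 figure come from the study as quoted above; the ten percent exposure rate is the same generous guess as above, not a figure from the paper, and the reported hazard ratio is treated as a simple relative risk for the sake of the estimate.

```python
# Rough back-of-the-envelope estimate of the absolute risk increase and the
# "number needed to harm" from the Quebec Pregnancy Cohort figures quoted above.
# Assumptions: 10% of mothers took an anti-depressant (a guess, not a figure
# from the paper), and the reported hazard ratio of 1.87 is treated as a
# simple relative risk.

total_children = 145_456   # children followed in the cohort
asd_cases = 1_045          # children diagnosed with ASD
relative_risk = 1.87       # reported risk increase with anti-depressant exposure
exposure_rate = 0.10       # assumed proportion of exposed pregnancies

exposed = total_children * exposure_rate
unexposed = total_children - exposed

# Find the baseline (unexposed) risk p such that the total case count matches:
#   exposed * 1.87 * p  +  unexposed * p  =  asd_cases
baseline_risk = asd_cases / (exposed * relative_risk + unexposed)
exposed_risk = baseline_risk * relative_risk

absolute_risk_increase = exposed_risk - baseline_risk
number_needed_to_harm = 1 / absolute_risk_increase

print(f"Baseline risk of ASD:      {baseline_risk:.2%}")
print(f"Risk of ASD if exposed:    {exposed_risk:.2%}")
print(f"Absolute risk increase:    {absolute_risk_increase:.2%}")
print(f"One extra case per roughly {number_needed_to_harm:.0f} exposed pregnancies")
```

Run with those assumptions, it lands at an absolute increase of a bit over half a percent, or roughly one extra case for every 170 or so exposed pregnancies – the same ballpark as the figures above.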

Hmmm … when you think of it that way, it doesn’t sound as bad.

You also have to consider the increase in risk to women and their offspring when depression remains untreated, or when women stop their anti-depressant medications.  There is some evidence that babies born to women with untreated depression are at risk of prematurity, low birth weight, and growth restriction in the womb, as well as higher impulsivity, poor social interaction, and behavioural, emotional and learning difficulties.  For the mothers themselves, depression in pregnancy increases the risk of postpartum depression and suicidality, as well as pregnancy complications such as preeclampsia, and high-risk health behaviours such as smoking, drug and alcohol abuse, and poor nutrition.  Women who discontinued their antidepressant therapy relapsed significantly more frequently (at five times the rate) compared with women who maintained their antidepressant use throughout pregnancy [3].

So the take home messages:

  1. Yes, there’s good evidence that taking anti-depressants in pregnancy is linked to an increased risk of a child developing autism.
  2. But the overall risk is still small. There is one extra case of autism for every 171 women who take anti-depressants through their pregnancy.
  3. And this should always be balanced out by the risks to the mother and child by not adequately treating depression through pregnancy.
  4. If you are pregnant or you would like to become pregnant, and you are taking anti-depressants, do not stop them suddenly. Talk to your GP, OBGYN or psychiatrist and work out a plan that’s best for you and your baby.

References

[1]       Rai D, Lee BK, Dalman C, Golding J, Lewis G, Magnusson C. Parental depression, maternal antidepressant use during pregnancy, and risk of autism spectrum disorders: population based case-control study. BMJ 2013;346:f2059.
[2]       Boukhris T, Sheehy O, Mottron L, Bérard A. Antidepressant use during pregnancy and the risk of autism spectrum disorder in children. JAMA Pediatrics 2015:1-8.
[3]       Chan J, Natekar A, Einarson A, Koren G. Risks of untreated depression in pregnancy. Can Fam Physician 2014 Mar;60(3):242-3.

Dr Caroline Leaf’s war on drugs

Today, Dr Leaf posted this on her social media feeds.  It’s clearly meant to shock and enrage her followers.

[Screenshot of Dr Leaf’s social media post, 12 December 2015]

Dr Caroline Leaf is a communication pathologist and a self-titled cognitive neuroscientist.  She’s also cast herself as an expert on mental health.

To the detriment of her followers, and sadly, to the rest of the Christian church, most people believe her.

Her most recent book, and her social media memes over the last couple of months, have made it clear that Dr Leaf is pursuing her own personal war on drugs … though her target is prescription psychiatric drugs, not the illicit kind.

Unfortunately, her attacks on prescription psychiatric drugs have amounted to nothing more than a hysterically illogical smear campaign under the guise of her concern for public safety.

Today’s offering follows the same pattern of narrow-minded hysteria.

Her main quote was from Robert Whitaker: “Twenty years ago, our society began regularly prescribing psychiatric drugs to children and adolescents, and now one out of every fifteen Americans enters adulthood with a ‘serious mental illness’.”

Whitaker, like Dr Leaf, is an outspoken critic of modern psychiatric treatment with a poor understanding of how psychiatric medications actually work.  The statement that Dr Leaf quotes is remarkable for its poor logic.  The quote implies that the rise in childhood mental illness is because of the rise in psychotropic medication use in children.  But correlation does not equal causation.  Even if one in fifteen Americans enters adulthood with a ‘serious mental illness’, and twenty years ago our society began regularly prescribing psychiatric drugs to children and adolescents, there’s no evidence that the psychiatric medications are actually causing the psychiatric problems.

Then there’s Dr Leaf’s emotionally charged statement that “They are even prescribing these psychoactive substances to infants!”

The New York Times article that she linked to discusses the case of Andrew Rios, a child suffering from severe epilepsy who had his first seizure at 5 months.  It’s clearly more complicated than just “simple” epilepsy – he’s pictured wearing a helmet, which suggests that he has a myoclonic epilepsy that is clearly uncontrolled.  It’s also clear from the article that the child was having mood swings and violent behaviour before the anti-psychotic was given.  The history of early seizures with ongoing poor control and violent behaviour means that this unfortunate young boy likely has a severe and complicated neurological syndrome, quite possibly because of an underlying abnormality of his brain.  And the symptoms he had which the mother claimed were from the antipsychotic were just as likely to have been night terrors, a common problem in two-year-olds.

In the end, who really knows?  But there’s certainly not enough in this article to clearly convict antipsychotics of being toxic or evil.

Neither is the use of antipsychotics for infants widespread.  20,000 prescriptions for antipsychotic medications sounds like a travesty, but according to the article, the real number is probably much lower – about 10,000 – since not every prescription is filled.  Even 10,000 sounds like a lot, but that represents 0.0002% of all prescriptions in the US, and most of those scripts are not actually being taken by the child, but by an uninsured parent.

Indeed, as the article itself said, “In interviews, a dozen experts in child psychiatry and neurology said that they had never heard of a child younger than 3 receiving such medication, and struggled to explain it.”

So the prescribing of antipsychotics to infants is extremely rare, almost unheard of, and is only likely to be done in extreme cases where all other options have been exhausted.

That’s certainly not the impression you get from Dr Leaf’s post, which is just another misinformed smear against anti-psychotic medications.

Dr Leaf’s war against psychiatric medications is reckless.  When people who need psychiatric medications don’t take them, suffering increases, as do suicides.

It’s time Dr Leaf stopped spreading needless fear about these medications.  They help more people than they harm – people who already suffer from the stigma of having a severe mental illness, and who don’t need any more suffering stemming from Dr Leaf’s so-called “expertise”.

Does our attitude towards aging increase Alzheimer Dementia?

“I think I’m forgetting something …”

For the last few years, I’ve worked as a doctor for a number of my local nursing homes.  On my morning rounds, I would literally reintroduce myself to every second patient, because even though I’d seen them every week for the previous few months, they still couldn’t remember who I was.

And it’s not just because I have a less than memorable face.  Most of my nursing home residents had dementia.

While there are many different causes of dementia, the one first described by Dr Alois Alzheimer in the early 1900s is the best known and most feared.  It is also the most common, and is a significant drain on the nation’s economy, as well as on quality of life in the twilight years.

Recently, an article was published by a group of researchers from Yale University in the US which claimed to show that the attitude a person holds towards aging contributes to their chance of developing Alzheimer Disease.  I first saw it yesterday on the social media feed of Dr Caroline Leaf, communication pathologist and self-titled cognitive neuroscientist.  Dr Leaf is known for her scientifically dubious assumptions that the mind changes the brain, not the other way around, and has previously publicly stated that dementia is caused by toxic thinking.  On the surface, this article seems to vindicate her assumptions.

[Screenshot of Dr Leaf’s social media post, 10 December 2015]

However, this article also made it onto Facebook’s trending list and was picked up by news sites all over the world (such as this article in The Australian: http://goo.gl/RavbMl), so the interest wasn’t just from Dr Leaf, but also from the broader public.  And I can understand why.  No one wants to ‘grow old and senile’, or to ‘lose our marbles’.  Any potential cure or prevention for Alzheimer Dementia is worth paying attention to.

I admit, the headline intrigued me too, both personally and professionally.  I wasn’t aware that one’s attitude towards aging would contribute to Alzheimers, since Alzheimers is predominantly genetic, and the other associated risk factors have more to do with physical health (like diabetes, high blood pressure etc).  Psychological stress is a risk factor for Alzheimers in mice, but good evidence in humans has been lacking [1].

So, do negative attitudes to aging really cause stress which then leads to Alzheimers, as the reports suggested, or is there a much better explanation?

The scientific article that the news reports were based on is A Culture-Brain Link: Negative Age Stereotypes Predict Alzheimer’s Disease Biomarkers [2].  This study was done in two stages.  Volunteers were recruited from a larger study called the Baltimore Longitudinal Study of Aging.  At entry point, the participants answered a questionnaire about their attitudes towards aging.  This was about 25 years before the participants were actively studied.

The first study examined the change in volume of a part of the brain called the hippocampus (which plays an essential part in our memory system).  The second part of the study examined the volunteers’ brains at autopsy for markers of Alzheimer Dementia, namely ‘plaques’ and ‘tangles’.  The numbers of plaques and tangles were combined to form a single composite score, which was then compared to the baseline attitude-towards-aging score.

In the first study, the researchers reported that those people who held negative views of aging were more likely to have a smaller hippocampus which more rapidly decreased in size over time.

In the second study, the researchers reported that those people who held negative views of aging were more likely to have more plaques and tangles in their brain.

On the surface, this seems to suggest that holding negative views on aging contributes to the development of Alzheimer Dementia, and this is certainly how the different news agencies seemed to interpret the outcomes of the study.  But on deeper palpation, a number of questions arise about how the researchers did the study and chose to interpret the results.

For example, the aging attitude survey was only done once, which means there’s a gap of 25 years or longer between the questionnaire and the active studies.  That’s a long time, and the attitudes of the volunteers may have improved or worsened in that time, but that doesn’t seem to have been considered.

Levy and her researchers also report that the average size of the hippocampus changed significantly when they averaged the size of the left and the right hippocampus.  But when they analyzed the two sides separately, there was no significant change over time.  So this makes me wonder about the validity of their analysis too – if the volume of each side separately doesn’t change much at all, then how can the average volume of the two sides change so much?

I’m not much of a statistician, but I wonder if the secret’s in their modelling.  They used a linear regression model to compare their data to their hypothesis, a legitimate statistical method, but one which involves adjusting for other variables.  If you do enough adjusting, you can reach statistical significance, but according to their own numbers, their Cohen’s d was 0.29, which is considered a weak effect overall.

Then there’s the question of clinical significance.  Even if the hippocampus did shrink in those who thought aging was negative two decades ago, was the shrinkage enough to contribute to the cognitive impairment seen in Alzheimer Dementia?  When compared to other studies, probably not.  Looking at Levy’s graph, the “negative” attitudes group changed by about 150 mm³ over the 10-year follow-up period, or about 5%.  A recent study also showed that the hippocampal size of subjects with mild memory loss is about 12% less than that of healthy age-matched controls [3].

The same problems are seen in study 2 – Levy and her researchers reported an increase in the number of plaques and tangles in the “aging is bad” group.  But the numbers are small, and not statistically strong.  And again, the question of clinical significance arises.  Plaques and tangles represent biomarkers of Alzheimer Dementia, not necessarily a diagnosis.  Normal aging brains without dementia also have plaques and tangles, and it’s the number of tangles that seems more significant for developing cognitive impairment [4, 5], not the combined score that they used in this study.

And when all is said and done, all Levy and colleagues have shown is a correlation between attitude to aging and changes in the brain.  But correlation does not equal causation.  Just because two things are associated does not mean that one causes the other.  There may be another variable or factor that causes both observations to co-occur.

In Levy’s case, the common connecting cause could easily be neuroticism, which they discussed as a covariate but did not say if or how they corrected for it.  The other thing they did not examine in this study is the ApoE gene subtypes, which contribute significantly to the onset of Alzheimer Dementia [6].  Could the action of ApoE subtypes in the brain contribute to both negative attitudes and the brain changes of Alzheimers?
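
To see why a hidden common cause matters, here’s a toy simulation – completely made-up numbers, not Levy’s data – in which a single hidden trait (standing in for something like neuroticism, or an ApoE effect) nudges both a person’s attitude-to-aging score and their plaque-and-tangle score.  Neither outcome influences the other, yet the two still end up correlated.

```python
# Toy illustration of confounding: a hidden common cause creates a correlation
# between two variables that have no direct causal link to each other.
# All numbers are invented purely for illustration; this is not Levy's data.
# (statistics.correlation requires Python 3.10 or later.)
import random
import statistics

random.seed(42)
n = 10_000

attitude = []    # stand-in for a "negative attitude to aging" score
pathology = []   # stand-in for a "plaques and tangles" composite score

for _ in range(n):
    hidden_trait = random.gauss(0, 1)   # e.g. neuroticism, or an ApoE subtype effect
    # Each outcome depends on the hidden trait plus its own independent noise,
    # but neither outcome depends on the other.
    attitude.append(0.5 * hidden_trait + random.gauss(0, 1))
    pathology.append(0.5 * hidden_trait + random.gauss(0, 1))

r = statistics.correlation(attitude, pathology)
print(f"Correlation between attitude and pathology: {r:.2f}")
# Prints roughly 0.2, even though neither variable causes the other.
```

The correlation is real, but it says nothing about which arrow, if any, points from attitude to brain.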

The bottom line is that Levy’s study shows a weak correlation between a single historical sample of attitude towards aging, and some changes in the brain that are known to be markers for Alzheimer Dementia some three decades later.

They’ve certainly NOT shown that stress, or a person’s attitude to aging, in any way causes Alzheimer Dementia.  They did not correct for genetics in this study, which is the major contributor to the risk of developing Alzheimers.  So the results mean very little as they stand, and further research is required to delineate any cause-and-effect relationship here.

So don’t stress.  It’s far from proven that how you view the aging process determines your risk of dementia.  There will be those, like Dr Leaf, who will trot out this cherry-picked little titbit of information in the future to try and justify their pretence that thought can change our brain and impact our mental health, but what the press release says and what the study shows appear to be two different things altogether.

References

[1]       Reitz C, Brayne C, Mayeux R. Epidemiology of Alzheimer disease. Nat Rev Neurol 2011 Mar;7(3):137-52.
[2]       Levy BR, Slade MD, Ferrucci L, Zonderman AB, Troncoso J, Resnick SM. A Culture-Brain Link: Negative Age Stereotypes Predict Alzheimer’s Disease Biomarkers. Psychology and Aging 2015;30(4).
[3]       Apostolova LG, Green AE, Babakchanian S, et al. Hippocampal atrophy and ventricular enlargement in normal aging, mild cognitive impairment (MCI), and Alzheimer Disease. Alzheimer Dis Assoc Disord 2012 Jan-Mar;26(1):17-27.
[4]       Nelson PT, Alafuzoff I, Bigio EH, et al. Correlation of Alzheimer disease neuropathologic changes with cognitive status: a review of the literature. J Neuropathol Exp Neurol 2012 May;71(5):362-81.
[5]       Jansen WJ, Ossenkoppele R, Knol DL, et al. Prevalence of cerebral amyloid pathology in persons without dementia: a meta-analysis. JAMA : the journal of the American Medical Association 2015 May 19;313(19):1924-38.
[6]       Liu CC, Kanekiyo T, Xu H, Bu G. Apolipoprotein E and Alzheimer disease: risk, mechanisms and therapy. Nat Rev Neurol 2013 Feb;9(2):106-18.

Dr Caroline Leaf – The mystery of he said/she said is no longer a mystery

This week’s edition of New Scientist magazine carried an article entitled “Scans prove there’s no such thing as a ‘male’ or ‘female’ brain” [1].  The article was inspired by a journal article published in PNAS last month [2], which reviewed the scans of 1,400 different people to see if there were specific differences in the neuroanatomy of the brains of men and women (i.e., are there ‘male’ and ‘female’ brains, or are the commonly accepted male/female differences just a myth, or a cultural rather than biological phenomenon?).

According to the article, there is an “extensive overlap between the distributions of females and males for all gray matter, white matter, and connections assessed. Moreover, analyses of internal consistency reveal that brains with features that are consistently at one end of the ‘maleness-femaleness’ continuum are rare. Rather, most brains are comprised of unique ‘mosaics’ of features.” [2]

So essentially, there’s no strong biological basis for gender differences after all.  “This means that, averaged across many people, sex differences in brain structure do exist, but an individual brain is likely to be just that: individual, with a mix of features. ‘There are not two types of brain,’ says Joel.” [1]
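
As a toy illustration of what ‘extensive overlap’ and a ‘mosaic’ mean in practice – again with invented numbers, not the study’s data – give ten hypothetical brain features a small average male/female shift, then count how many simulated individuals land on their group’s side of the divide for every single feature.

```python
# Toy illustration of the "mosaic" idea: even when every feature shows a small
# average group difference, individuals who sit consistently at one end of the
# continuum across all features are rare.  Numbers are invented for illustration.
import random

random.seed(1)
n_people = 10_000
n_features = 10
shift = 0.5          # assumed average group difference per feature, in SD units

consistent = 0
for _ in range(n_people):
    in_group_a = random.random() < 0.5
    mean = shift / 2 if in_group_a else -shift / 2
    features = [random.gauss(mean, 1) for _ in range(n_features)]
    # "Consistent" here means every feature falls on the same side of zero.
    if all(f > 0 for f in features) or all(f < 0 for f in features):
        consistent += 1

print(f"Individuals consistent across all {n_features} features: "
      f"{consistent / n_people:.1%}")
# With a 0.5 SD shift, each feature lands on the "expected" side only about 60%
# of the time, so being consistent on all ten is rare (well under 1% of people).
```

Most of the simulated ‘brains’ end up as a mixture of features, which is loosely the kind of mosaic pattern the PNAS authors describe, albeit from a very crude model.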

This news is a blow to one of Dr Leaf’s less renowned books, “Who switched off your brain? Solving the mystery of he said/she said” [3].

Dr Caroline Leaf is a communication pathologist and a self-titled cognitive neuroscientist.  Her ‘he said/she said’ book is based on the idea that there are definitive characteristics of the male and female brain which define each gender.  From her conclusion on page 211,

“Men and women are different.  Both the physical anatomy and functional strategy of our brains are different.  We can’t attribute this to social engineering, cultural norms or our up-bringing.  We’ve been created different – it’s in our fundamental design.  Our parents, our communities, and the cultural context of our childhood and adolescence certainly have a prominent developmental role in each of our lives.  But your brain has been fashioned in a specific way that shapes your ‘true you’ long before any of these other factors have had the opportunity to exercise their influence on you.”

As a quick aside, this quote shows the confusion in Dr Leaf’s teaching.  As I’ve discussed before in other blogs, Dr Leaf contradicts herself by claiming in some books (like ‘He said/she said’ and ‘The gift in you’) that our brain determines our gifts and our behaviours, while claiming in the rest of her teaching that our thought life controls our brains and our physical reality.  So which is it?

But this quote also sounds the death knell for her book, in light of the recent scientific evidence to the contrary.  Which is a shame, since out of all of her books, this one initially seemed the most scientifically robust.

Even though the book is based on a now defunct theory, I wonder if the thrust of her book still holds true to a point.  We’ve all been created to be different, and we should celebrate those differences and how they complement other people around us.  It just so happens that those differences aren’t inherent to our gender, but to us as individuals, uniquely designed by God “for good works, which God prepared in advance for us to do” (Ephesians 2:10).

So, yes, the mystery of he said/she said has been solved, but not quite as Dr Leaf envisaged.

References

[1]        Hamzelou J. Scans prove there’s no such thing as a ‘male’ or ‘female’ brain. New Scientist. 2015 Dec 5.
[2]        Joel D, Berman Z, Tavor I, et al. Sex beyond the genitalia: The human brain mosaic. Proceedings of the National Academy of Sciences of the United States of America 2015 Nov 30.
[3]        Leaf CM. Who switched off your brain? Solving the mystery of he said/she said. Texas, USA: Inprov, Ltd, 2011.