transcript of episode 4: EXTREME DREAM ARGUMENTS, 26th February 2021
[🎶 INTRO: "Spring Swing" by Dee Yan-Kee 🎶]
welcome to the error bar: where cerebral claptrap is called out
in this episode: how your brain does maths while you dream, the brain differences between agreeing and disagreeing with someone, & why extreme opinions slow down decisions
here is the brain news on the 26th February 2021:
[🎶 BIG BEN BONGS 🎶]
PEOPLE CAN ANSWER QUESTIONS WHILE ASLEEP
Scientific American, The Independent, and The Daily Mail all reported on a remarkable study of apparent two-way communication between researchers and sleeping volunteers. the volunteers were experiencing 'lucid dreams', in which they had some control over their dreams
in the study - which was actually four independent studies from four countries, all with slightly different methods and participants - 36 volunteers were trained to give simple eye-movements or facial twitches in response to (very) simple maths questions. three left-right eye movements is the correct answer to 'one plus two'; five eyebrow twitches is the wrong answer to 'two plus two'
the volunteers then went to sleep in the laboratory, connected to electrodes recording activity from their brains and facial muscles. the remarkable bit of the story is that when the volunteers started showing signs of the 'rapid eye movement' sleep phase that indicates dreaming, they were asked the maths questions again. and some of them responded - sometimes - with eye movements or facial twitches. and for a not-insignificant proportion of those questions, the movements corresponded to the correct answer
the journalists concluded that two-way communication is possible during sleep - that we can break into people's dreams and leave them messages. while the number of successful communications remained quite small, the scientists hailed this as a proof-of-concept study
[🎶 INTERLUDE 🎶]
can we really communicate with the sleeping?
remarkable claims require remarkable evidence. this is a remarkable report, so the error bar is raising its evidence threshold to 'remarkable'
i want to believe it. it should be possible to communicate with lucid dreamers. it fits with my expectations. but i'm sorry, listeners - i'm not convinced. i'm not saying the evidence isn't there in the data; i just don't think the reported analyses demonstrate it sufficiently - or correctly
first, the overall study is actually four independent studies from four groups that have been somewhat cobbled together post-hoc. multi-site studies can be fantastically powerful ways to do prospective science, but this seems to be a retrospective collaboration. was the evidence from the individual groups insufficient? perhaps this paper is the pilot study for a prospective collaboration?
second, each study tests a different population, recruited in a different way, using different methods. the experimental training, task, interventions & analysis are all different, yet the data are all pooled
and third - this is my main problem with the paper - all the data are pooled together as if they were equivalent. but they're not. the data come from different numbers of people & different numbers of sleeping & dreaming events. for example, 80% of the data come from just 27% of the participants. but the statistics all ignore this hierarchical or clustered structure in the data
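to make that concrete, here's a minimal sketch - with entirely made-up numbers, not the paper's data - of why clustered data shouldn't just be pooled: a couple of prolific dreamers can dominate the pooled accuracy, and resampling participants rather than trials gives a more honest picture

```python
# a minimal sketch with made-up numbers - NOT the paper's data - of why
# clustered (hierarchical) data shouldn't simply be pooled across participants
import numpy as np

rng = np.random.default_rng(0)

# hypothetical: trials contributed and correct responses, per participant
trials  = np.array([40, 30, 10, 5, 5, 5, 3, 2])   # a few people dominate
correct = np.array([20, 12,  1, 0, 1, 0, 0, 0])

pooled_accuracy = correct.sum() / trials.sum()    # every trial counts equally
per_person = correct / trials                     # accuracy per person
participant_accuracy = per_person.mean()          # every person counts equally

print(f"pooled (per-trial) accuracy:   {pooled_accuracy:.2f}")
print(f"mean per-participant accuracy: {participant_accuracy:.2f}")

# a by-participant bootstrap respects the clustering: resample people, not trials
boot = []
for _ in range(5000):
    idx = rng.integers(0, len(trials), len(trials))
    boot.append(correct[idx].sum() / trials[idx].sum())
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% CI (resampling participants, not trials): [{lo:.2f}, {hi:.2f}]")
```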
22 participants in the US were trained to dream lucidly; 10 Germans were recruited from an (expert) lucid dreamers' forum; 37 Dutch volunteers had had at least one lucid dream; & there was a single French narcolepsy patient. these 70 participants were reduced in various slightly opaque ways to the 36 mentioned in the report
from these 36 dreamers, only 6 produced one or more correct responses to questions asked during sleep. statistically, then, should we be analysing the 36 tested participants, or the 6 selected 'responders'? we're then asked to focus on 158 communication attempts, themselves selected from a total of 850 (there were also 802 control attempts). and from these 158 selected attempts, only 29 resulted in a correct answer. some of these 29 events were presented graphically in the paper
much of the analysis was done by independent experts' eyes, blind to condition - which is great. but why was there no attempt to analyse the data computationally? surely such signals - if they exist in the data - can be detected automatically?
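for what it's worth, here's a minimal sketch of the kind of automated first pass one might imagine - the sampling rate, response window, and deflection threshold are all assumptions of mine, not values from the paper

```python
# a minimal sketch of an automated check: count large eye-movement deflections
# in a window after each question and compare the count to the correct answer.
# the sampling rate, window length, and prominence threshold are assumptions,
# not values from the paper.
import numpy as np
from scipy.signal import find_peaks

FS = 250            # assumed EOG sampling rate (Hz)
WINDOW_S = 10       # assumed response window after each question (s)
PROMINENCE = 100.0  # assumed deflection size (arbitrary units)

def count_responses(eog: np.ndarray, question_onset: int) -> int:
    """Count distinct deflections in the response window after a question."""
    window = eog[question_onset : question_onset + FS * WINDOW_S]
    peaks, _ = find_peaks(np.abs(window), prominence=PROMINENCE,
                          distance=FS // 4)  # at most ~4 deflections per second
    return len(peaks)

# usage: a trial is 'correct' if the counted deflections equal the right answer
# eog, onset = load_trial(...)                  # hypothetical loader
# correct = count_responses(eog, onset) == 3    # e.g. 'one plus two'
```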
finally, the statistical approach here is a mess: sometimes the stats use the participant as the 'unit of analysis' - an important concept about what is being measured - sometimes it's the experimental session being analysed, sometimes it's the data epoch, & sometimes it's the individual responses. what you want to be able to say at the end of a study like this is that 'people can answer questions when they're asleep'. but we can't say that. what we can say is that 'some questions can be answered when people are asleep' - the unit of analysis is the questions, not the people
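to put numbers on it - these are the counts reported in the paper, arranged to show how the claim changes with the unit of analysis

```python
# unit of analysis: the same counts support different claims depending on
# what you treat as the unit. the counts are those reported in the paper.

total_attempts    = 850   # all communication attempts during sleep
selected_attempts = 158   # the subset the paper focuses on
correct_answers   = 29    # correct responses within that subset
tested            = 36    # participants tested
responders        = 6     # participants who ever produced a correct response

# questions as the unit: supports 'some questions can be answered during sleep'
print(f"correct / selected attempts: {correct_answers / selected_attempts:.2f}")
print(f"correct / all attempts:      {correct_answers / total_attempts:.3f}")

# participants as the unit: the unit the headline claim actually needs
print(f"responders / tested people:  {responders / tested:.2f}")
```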
in summary, there is so much whittling & selection of data here that it is very hard to tell what's wheat and what's chaff
conclusion
a highly-selective analysis of highly selected participants shows that a small minority of them seem able to respond to simple questions while they sleep. the stats here are a mess
sources
the science was by Konkoly et al. 2021: Current Biology; reported in Scientific American by @DianaMKwon on 18/Feb/21, & The Independent by @adamndsmith on 23/Feb/21, & The Daily Mail by @jwillchad on 18/Feb/21
[🎶 BIG BEN BONGS 🎶]
ARGUING BRAINS REVEAL SOMETHING, MAYBE
the psychologist Susan Pinker, writing in her column in the Wall St Journal, reports on a study of people having their brains scanned during discussions of controversial topics
nineteen familiar pairs of volunteers were selected for agreeing on two controversial political or social topics and disagreeing on two others
they discussed these four topics in a carefully orchestrated turn-taking discussion, speaking for 15 seconds at a time
while they did so, optical brain imaging was performed on both sides of their heads. the data were separated according to whether the topic was one on which they agreed or disagreed, and whether they were currently talking or listening
the results revealed a number of (not very surprising) things - mostly that regions of the brain known to be involved in speaking and listening were activated during these conversations
[🎶 INTERLUDE 🎶]
do arguing brains reveal something?
not really, no
the study suffers from a fundamental flaw that makes it very difficult to compare the different experimental conditions: there was no baseline condition against which to compare each of the others. so, while the authors talk about the 'neural correlates of talking or listening or agreeing or disagreeing', they can only really talk about the differences between talking and listening, or between agreeing and disagreeing
if this doesn't sound like a problem, then consider cheese. two people could both like cheddar cheese a little bit more than they like wensleydale cheese. but one of these people could really hate both cheeses, and the other could really love both cheeses. if we only look at the difference between cheddar and wensleydale, we can't tell whether either of them actually likes cheese. the brain is the same
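here's the cheese problem as a minimal worked example, with invented ratings

```python
# the cheese problem in numbers: identical difference scores, very different
# overall levels. ratings out of 10, invented for illustration.
ratings = {
    "cheese lover": {"cheddar": 9, "wensleydale": 8},
    "cheese hater": {"cheddar": 2, "wensleydale": 1},
}

for person, r in ratings.items():
    difference = r["cheddar"] - r["wensleydale"]
    level = (r["cheddar"] + r["wensleydale"]) / 2
    print(f"{person}: cheddar - wensleydale = {difference}, overall liking = {level}")

# both differences are +1; only the overall level - the 'baseline' - separates
# the lover from the hater. a design with no baseline condition throws that away.
```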
so the study is unable to tell us, for example, if any brain areas are involved in both talking and listening, or both agreeing and disagreeing. all we know is that some brain areas do different things in the two conditions
but more importantly than the data, both the scientific article and the Wall Street Journal piece claimed that you can't get two people into the same MRI scanner. not true. the first MRI of a heterosexual couple having actual sex was published in 1999, the first custom two-person scanner was described in 2011, and there are dozens of articles on two-person MRI. just alta vista it
conclusion
while the neural correlates of agreement and disagreement would be extremely interesting to the error bar, this study doesn't seem able to tell us much more than what we already know about talking and listening
sources
the science was by Hirsch et al. 2020: Frontiers in Human Neuroscience; reported in The Wall Street Journal by Susan Pinker on 11/Feb/21
and the brain in brief...
[🎶 BIG BEN BONGS 🎶]
EXTREMISTS ARE WORSE AT COGNITIVE TASKS
the occasionally-extremist Guardian newspaper reported on a study that measured more than 500 people's personalities and cognitive task performance, then later measured a subset of nearly 400 of them on their political views
the very large dataset from multiple tasks and multiple questionnaires was analysed to create much smaller, more consistent sets of measurements. these measurements were then compared to each other to see if any of the cognitive task performance was related to the political views
they were! people with more conservative, dogmatic, or religious views tended to be more 'cautious' (i.e., slower), less strategic, and less forward-thinking in decision making tasks
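the general shape of that analysis - reduce many measures to a few dimensions, then relate them to the political measures - looks something like this. random stand-in data, and not the authors' exact pipeline (they used factor analyses and Bayesian model averaging)

```python
# a rough sketch of a 'reduce, then relate' pipeline with random stand-in data;
# not the authors' exact analysis, just the general shape of it
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n_people, n_measures = 400, 30            # stand-in shapes, not the real data
task_scores = rng.normal(size=(n_people, n_measures))
ideology = rng.normal(size=n_people)      # e.g. a conservatism / dogmatism score

# reduce many noisy measures to a few latent 'cognitive style' dimensions
fa = FactorAnalysis(n_components=3, random_state=0)
latent = fa.fit_transform(task_scores)

# then ask whether any latent dimension tracks the political measure
for i in range(latent.shape[1]):
    r = np.corrcoef(latent[:, i], ideology)[0, 1]
    print(f"factor {i + 1} vs ideology: r = {r:+.2f}")
```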
[🎶 INTERLUDE 🎶]
are extremists worse at cognitive tasks?
no
this was not a study of extremists. the Guardian seems to be making a little too much political capital out of this one
the study is, however, this episode's good one - a large, comprehensive dataset, very well analysed, displayed, & reported. just listen to some of these stats terms: "out-of-sample cross-validation", "Bayesian model averaging"... oooh, that hits the spot
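and for listeners wondering what 'out-of-sample cross-validation' actually buys you: the model is scored on people it was never fitted to, so any reported relationship has to generalise beyond the sample it was fitted on. a minimal sketch with stand-in data, not the study's

```python
# what 'out-of-sample cross-validation' means in practice: fit on some people,
# score on held-out people. stand-in data, not the study's.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 3))                  # e.g. latent cognitive factors
y = X @ np.array([0.4, -0.2, 0.0]) + rng.normal(scale=1.0, size=400)  # e.g. dogmatism

# 10-fold cross-validation: each fold is scored on people the model never saw
scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=10, scoring="r2")
print(f"out-of-sample R^2: {scores.mean():.2f} ± {scores.std():.2f}")
```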
what the study does show is that, across the entire sample of people studied, there are strong statistical relationships between measures of conservatism and cognitive task performance
but it's not justified to claim that extremists can't do puzzles, or that these tests would necessarily help prevent radicalisation, as claimed
conclusion
on a comprehensive set of cognitive, personality, & political tests, there are strong statistical relationships between decision-making skills & conservative or dogmatic views. no extremists were identified or studied
sources
the science was by Zmigrod et al. 2021: Philosophical Transactions of the Royal Society of London, B: Biological Sciences; reported in The Guardian by @NatalieGrover on 22/Feb/21
[🎶 BIG BEN BONGS 🎶]
WHAT'S GOOD FOR YOUR BRAIN THIS WEEK
the Daily Mail's regular assault of occasionally insightful scientific articles includes advice that:
caffeine temporarily reduces grey matter in the hippocampus - an important part of the brain associated with memory
but you can boost your memory by eating lots of cocoa which contains flavonols and, err, caffeine
if you don't take caffeine, you might lie in bed longer at weekends, which will give you depression
but if you don't lie in, and instead sleep for less than 5 hours a night, you'll get dementia
but you can counteract one form of dementia - Alzheimer's - if you eat an apple every day
finally, if you need the cognitive flexibility required to make sense of all this contradictory tabloid brain science, then start singing in a choir
I'm not kidding - every one of these was a Daily Mail brain science story in the last two weeks. every single one
conclusion
everything and nothing
sources
reported in The Daily Mail by @jwillchad on 18/Feb/21, & The Daily Mail by @jwillchad on 16/Feb/21, & The Daily Mail by @jwillchad on 12/Feb/21, & The Daily Mail by Dan Avery on 12/Feb/21, & The Daily Mail by @RyanMorrisonJer on 18/Feb/21, & The Daily Mail by @jwillchad on 12/Feb/21
[🎶 BIG BEN BONGS 🎶]
LESLIE UNGERLEIDER (1946-2020)
the scientific journals Nature Neuroscience and Neuron report on the death of Professor Leslie Ungerleider at the end of 2020
Professor Behrmann describes her as an "apeirogon, a generalized polygon with a seemingly infinite number of sides" and a "uniformly brilliant, consummate scientist"
Professor Kastner says she "fostered a culture of curiosity, creativity, and critical thinking"
her career spanned 50 years and as many research topics, but she's best known for the 'what and where' pathways in the visual brain - sorting out the primate visual system in an enormously influential book chapter published in 1982
Behrmann concludes: "Dr. Ungerleider stated that her career comprised one happy accident after another. Statistically, this seems highly improbable"
conclusion
rest in peace
sources
the science was by Behrmann 2021: Nature Neuroscience, & Kastner 2021: Neuron
a word from our sponsors
err, sorry, it's still me
i’ve read in a number of places that, if your podcast gets to 5 episodes, then you’re well on your way to an award-winning masterpiece. well, we’re here! if you count the trailer
so far, so good
i’ve managed to fit podcast production into my regular work schedule. most of the work is scanning the media, then reading, reviewing, and summarising scientific papers, which is part of my day job anyway. the recording and editing only takes two or three hours on a tuesday afternoon. but it’s the listening and re-listening on every device, and the obsessing over every syllable and background hiss, that eats the rest of the week before episode release. if you’re subscribed on a podcast app, it will arrive every other thursday at 6pm uk time. if you get it elsewhere, links will start appearing on friday mornings
lots of podcasts have adverts, or they regularly tap you for cash. unless i get sacked, i’m not going to do that – i think i get paid enough. but if you like the error bar, then you can show your appreciation in many other ways: by recommending the podcast to others, and by liking, following, or retweeting the various posts on twitter. there will be a facebook page at some point; and i’m hoping that academics will start recommending the error bar to their students – particularly those on methods, stats, or science communication courses
if you’d like to suggest a brain science story to be fact-checked, ask a question, or volunteer to record or collaborate on a story – whether news, views, controversy, or error – then email talk@theerrorbar.com
stay tuned – next episode there will be actual human guests!
[🎶 OUTRO: "Cosmopolitan - Margarita - Bellini" by Dee Yan-Kee 🎶]
it's closing time at the error bar, but do drop in next time for more brain news, fact-checking & neuro-opinions. take care.
the error bar was devised & produced by Dr Nick Holmes from University of Birmingham's School of Sport, Exercise and Rehabilitation Sciences. the music by Dee Yan-Kee is available from the free music archive. find us at the error bar dot com, on twitter at bar error, or email talk at the error bar dot com.