Healthcare (Commonwealth Union) – A cutting-edge technology that uses light waves to monitor brain activity in infants has given researchers the most comprehensive view yet of functions like hearing, vision, and cognitive processing outside of traditional brain scanners, according to a recent study led by UCL and Birkbeck.
This wearable brain-imaging headgear, created in collaboration with UCL spin-out Gowerlabs, revealed surprising activity in the prefrontal cortex—an area involved in emotional processing—when babies were exposed to social stimuli. This suggests that infants as young as five months old are already processing social interactions.
The latest version of this technology can track neural activity across the entire outer surface of an infant’s brain, whereas previous iterations could only focus on one or two areas at a time.
According to the researchers, this advancement could help map the connections between various brain regions, identifying the differences between typical and atypical neurodevelopment in early childhood. It could also provide insights into neurodiverse conditions such as autism, dyslexia, and ADHD.
The development of this device and the findings from early tests are described in a new study published in Imaging Neuroscience and were discussed at the British Science Festival on September 14.
Dr. Liam Collins-Jones, the study’s lead author from UCL Medical Physics & Biomedical Engineering and the University of Cambridge, pointed out that their earlier work introduced a wearable imaging method that could measure activity in specific brain regions.
“But this made it difficult to get a complete picture as we could only focus on one or two areas in isolation, whereas in reality different parts of the brain work together when navigating real-world scenarios”, he said.
“The new method allows us to observe what’s happening across the whole outer brain surface underlying the scalp, which is a big step forward. It opens up possibilities to spot interactions between different areas and detect activity in areas that we might not have known to look at previously.”
Professor Emily Jones, an author of the study from Birkbeck, University of London, stated that for the first time they have been able to measure differences in brain activity across such a broad area in babies using a wearable device. This includes regions responsible for processing sound, vision, and emotions.
Professor Jones indicated that the technology developed and tested in this study marks an important step forward in understanding the brain processes that drive social development, something that has been difficult to observe outside the constraints of an MRI scanner.
She also indicated that now, they are able to see what is happening in babies’ brains as they play, learn, and interact with others in a much more natural setting.
The new device was tested on sixteen infants between five and seven months old. While wearing it, the babies sat on their parent’s lap and watched two types of videos: one featuring actors singing nursery rhymes to simulate a social interaction, and another showing objects such as a ball rolling down a ramp to represent a non-social environment.
The researchers observed differences in brain activity between the two situations. In addition to the unexpected prefrontal cortex response to social stimuli, they found that brain activity was more concentrated during social scenarios than non-social ones, supporting earlier results from optical neuroimaging and MRI studies.
At present, the most detailed method for observing brain activity is through magnetic resonance imaging (MRI), which requires the subject to remain very still inside the scanner for 30 minutes or longer.
The downside of this method is that it is difficult to replicate real-life situations, such as conversing with someone or doing an activity, especially with babies, who must be either asleep or restrained for an MRI to capture their brain activity accurately.