What a baby hears while asleep matters more than previously thought
University of Colorado at Boulder News, Jun 07, 2017
What an infant hears during sleep has an immediate and profound impact on his or her brain activity, potentially shaping language learning later in life, suggests a new University of Colorado Boulder study of slumbering babies. "We found that even while babies sleep, they are still processing information about their acoustic environment, and their brains are using that information to develop pathways for learning," said lead author Phillip Gilley, PhD, principal investigator of the Neurodynamics Laboratory at the Institute of Cognitive Science (ICS).
Research dating back to the 1970s suggests that newborns can already recognize their mother's voice. What has remained unclear, however, is how early, and to what degree, infants can distinguish between the rapid-fire sounds, such as long or short vowels and consonants, that serve as the building blocks of human language.
To find out, Gilley and his colleagues enlisted the parents of 24 healthy infants under the age of 5 months to bring their newborns to a lab. Each infant had electroencephalogram (EEG) electrodes attached to his or her head and then fell asleep. Researchers tested sleeping infants because they lie more still and because infants spend up to 80 percent of their time asleep. A nearby speaker played a sequence of repeated sounds ("ah, ah, ah" or "bah, bah, bah") interspersed with an occasional oddball sound ("eeh" or "dah," respectively). Meanwhile, the EEG recorded the child's brainwaves.
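The setup described above is a standard auditory oddball paradigm: a frequent "standard" token with an occasional "deviant" mixed in. As an illustration only, here is a minimal Python sketch of how such a stimulus sequence might be generated; the trial count, deviant probability, and no-consecutive-deviants rule are assumptions made for the example, not details reported for this study.

    import random

    def oddball_sequence(standard="ah", deviant="eeh", n_trials=200,
                         p_deviant=0.15, seed=0):
        # Build a pseudo-random stimulus list: mostly the standard token,
        # with occasional deviants and never two deviants in a row
        # (a common constraint in oddball designs; assumed here).
        rng = random.Random(seed)
        sequence = []
        previous_was_deviant = False
        for _ in range(n_trials):
            if not previous_was_deviant and rng.random() < p_deviant:
                sequence.append(deviant)
                previous_was_deviant = True
            else:
                sequence.append(standard)
                previous_was_deviant = False
        return sequence

    # Example: the first 15 trials of one generated sequence.
    print(oddball_sequence()[:15])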
Afterward, Gilley used an algorithm developed in his laboratory to identify and measure the infants' different brainwave patterns.
The findings, published in March in the journal BMC Neuroscience, came as a surprise.
When the "standard" sound hummed along (ah, ah, ah), the infant's brainwaves remained primarily in a theta, or low-frequency, wave. But within a few milliseconds of hearing the oddball sound, the brainwave pattern shifted to a complex blend of gamma, beta and theta frequencies, a sign that neurons in different regions of the brain were oscillating and harmonizing.
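The article does not describe the laboratory's analysis algorithm, but the bands it names (theta, beta, gamma) are conventional EEG frequency ranges, and a common way to quantify them is to estimate spectral power per band. Below is a minimal, illustrative Python sketch using Welch's method on a synthetic one-channel signal; the sampling rate, band edges, and signal are assumptions for the example, not values from the study.

    import numpy as np
    from scipy.signal import welch

    def band_power(signal, fs, low, high):
        # Estimate average power within [low, high] Hz using Welch's method.
        freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
        mask = (freqs >= low) & (freqs <= high)
        return np.sum(psd[mask]) * (freqs[1] - freqs[0])

    fs = 250                       # Hz; a typical EEG sampling rate (assumed)
    t = np.arange(0, 4.0, 1 / fs)
    # Synthetic one-channel trace: a dominant 5 Hz (theta) rhythm plus noise.
    eeg = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(len(t))

    # Conventional band edges (assumed): theta 4-8 Hz, beta 13-30 Hz, gamma 30-80 Hz.
    for name, (low, high) in {"theta": (4, 8), "beta": (13, 30), "gamma": (30, 80)}.items():
        print(f"{name}: {band_power(eeg, fs, low, high):.4f}")

Run on this synthetic trace, the theta estimate dominates, which is the kind of band-by-band comparison the quoted result implies.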
In essence, the brain quickly learned what the expected sound was, anticipated it, and reacted with surprise to a different one. In the process, Gilley notes, new neuronal pathways key to discriminating sounds were likely formed. That's important, because the ability to discriminate between distinct sounds is essential to learning speech and language.
"The most surprising finding here is how quickly these infants' brains are able to make those predictions. Within the span of one test, their brain learns a pattern and begins to respond to it," Gilley said.
The paper is the first of a series the group will be rolling out as part of a five-year, multi-center grant from the National Institute on Disability, Independent Living, and Rehabilitation Research.