By the age of six months, infants have trained their ears to the sounds of their native language, and they have learned to distinguish these sounds before actually learning words.
Those research findings, obtained by Patricia K. Kuhl and colleagues in the UW Department of Speech and Hearing Sciences, have overthrown previous theories about how language development occurs. Formerly, it was thought that the perception of sound becomes refined by about the age of one year, when babies begin to understand that sounds convey meaning. But Kuhl and colleagues demonstrated that infants' early experiences listening to the voices of their mothers and other speakers actually "prime" them for learning language. The infants learn to distinguish sound categories on the basis of the distribution of sounds in their parents' speech.
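The distributional learning described above can be illustrated with a toy sketch. This is not Kuhl's actual analysis; the numbers, the single "formant" axis, and the two-mean clustering are all illustrative assumptions. The point is only that an unlabeled distribution of sounds with two modes is enough for a learner to form two categories:

```python
# Toy sketch of distributional category learning (illustrative only;
# not Kuhl's method). Values stand in for a vowel formant in Hz;
# the learner gets no labels, only the distribution of sounds.

def two_category_kmeans(values, iters=20):
    """Split 1-D values into two categories by iterating:
    assign each value to the nearer center, then recompute centers."""
    centers = [min(values), max(values)]
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            i = 0 if abs(v - centers[0]) <= abs(v - centers[1]) else 1
            groups[i].append(v)
        # Recompute each center as the mean of its group (keep the old
        # center if a group happens to be empty).
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return sorted(centers)

# Two hypothetical "vowel" clusters in a caregiver's speech,
# around 300 Hz and 700 Hz.
speech = [290, 300, 310, 305, 295, 690, 700, 710, 705, 695]
print(two_category_kmeans(speech))  # → [300.0, 700.0]
```

The learner never sees a word or a label; the two category centers fall out of where the sounds pile up, which is the sense in which parents' speech "primes" the categories.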
The implications are that chronic ear infections that impair infants' hearing may set the stage for language problems later on. Furthermore, the way parents speak to their infants has an important influence on language learning. There may be an important reason for "baby talk" after all.
Kuhl explains that newborns are language universalists; they can learn any sound in any language, and they can distinguish between all the sounds uttered by human beings. Over time, however, they become language specialists--that is, they lose the ability to distinguish sounds not in the native tongue. Kuhl and colleagues set out to discover when this change occurs. They designed an experiment that led to the landmark results published in the journal Science in 1992 and picked up widely by the national news media.
Kuhl's experiment used a modification of the "conditioned head-turn" technique developed by two other UW faculty members in the 1970s--professors Gary Thompson and Wes Wilson of the UW Department of Speech and Hearing Sciences. That classic method is now used the world over to test infant hearing.
The experiment involved six-month-old infants in the U.S. and Sweden. Babies sat on their mothers' laps and listened to pairs of sounds. The sounds included variations on the English vowel sound "ee," as well as the Swedish vowel "y" (roughly a "eu" sound, made by rounding the lips as if to say "u" while saying "ee"). Babies were trained to look over their left shoulders at a moving toy when they heard a difference within a pair of sounds--for example, when two English "ee" vowels were pronounced slightly differently, or two Swedish "y" vowels were pronounced differently.
The researchers found that the American babies ignored any difference in pronunciation of the native "ee," but they distinguished even the slightest variations in the foreign "y" sounds. The exact opposite was found for the Swedish babies. The experimental findings demonstrate that sound perception does not depend on the use of words, and that language experience shapes perception much earlier than researchers had previously believed.
Baby talk, or "Parentese," may be vital to infants' early categorization of sounds. Kuhl suggests that the clear enunciation and prolonged vowel sounds of baby talk may be priming infants for language processing. Her work on "Motherese" in the 1980s demonstrated infants' preference for Motherese over other auditory signals.
The findings may also help to explain why adults find it difficult to learn foreign languages--especially to distinguish sounds in the foreign tongue that are not discriminated in their native tongue. For example, Japanese speakers have a hard time articulating and distinguishing the English sounds "l" and "r." In Japanese, Kuhl explains, these two sounds are distant members of a single sound category, whose prototype lies halfway between them. The Japanese ear has been conditioned since birth to regard them as fundamentally the same sound.
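The prototype idea can be sketched in a few lines. Everything here is a toy assumption: the one-dimensional "acoustic axis," the 0-to-1 positions, and the category labels are invented for illustration. A sound is simply perceived as the category whose stored prototype is nearest:

```python
# Toy sketch of prototype-based perception (positions and labels are
# hypothetical). A sound is heard as the nearest-prototype category.

def perceive(sound, prototypes):
    """Return the label of the prototype nearest to the sound."""
    return min(prototypes, key=lambda label: abs(sound - prototypes[label]))

# An English listener has separate /l/ and /r/ prototypes.
english = {"l": 0.2, "r": 0.8}
# A Japanese listener, as Kuhl describes, has one prototype
# halfway between them.
japanese = {"r-l": 0.5}

print(perceive(0.25, english))   # → l
print(perceive(0.75, english))   # → r
print(perceive(0.25, japanese))  # → r-l (both sounds land in one category)
print(perceive(0.75, japanese))  # → r-l
```

With two prototypes, acoustically different sounds land in different categories; with one prototype between them, the same two sounds are heard as the same thing--which is the difficulty the adult learner faces.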