Finding Meaning in a Word

EEG testing gives researchers the power to look at complex language skills in depth.

Nicola Bell
Jul 25, 2018

Language is power. When a spoken language user is among other members of the same community, their ability to speak gives them the power to tell stories, to inspire, to express emotion. Speech gives people the power to be heard.

Then again, it’s also just noise.

When I go to my local café, there’s nothing in the sound waves coming from my face that magically inception-ises the phrase ‘egg and bacon roll’ into the barista’s brain. If I were to produce the same sound waves towards a café worker in Iceland, I would likely receive a look of bemusement in response, rather than breakfast. ‘Egg and bacon roll’ has meaning in my local Brisbane café because it is part of the same language that my barista and I speak.

So just how do we translate those sound waves? How did we learn to pull meaning from the series of mouth-made noises that otherwise seem so arbitrary?

Measuring access to word meaning
Experimental tasks and technologies have been developed over the last few decades that allow researchers to measure how children’s brains process meaning in real time. In particular, electroencephalography (EEG) is a common way of capturing the brain activity that occurs in response to a certain word.

[Image: 'Brain study' by Simon Fraser University[1]]

Based on my own personal experience of EEG testing with children, the process goes like this:

  1. An elasticated cap of spongey electrodes is fitted onto the participant’s head by a lovely and well-meaning researcher.
  2. They sit in front of a computer screen and complete a simple task that requires them to access the meaning of a word. Meanwhile, their brain activity is recorded.
  3. After lots of repetitions of the same task (and a few reminders to limit blinking or, very occasionally, to wake up), the participant finishes, takes off the EEG cap, and goes home with a smile on their face and a grape-scented sticker on their shirt.
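Step 3 hinges on averaging over many repetitions, and before averaging, the continuous recording is cut into short 'epochs' time-locked to each stimulus. A minimal sketch of that segmenting step, using simulated data and NumPy (the sampling rate, onset times, and window are illustrative assumptions, not values from any real study):

```python
import numpy as np

# Toy example (not real data): one EEG channel sampled at 500 Hz.
fs = 500                        # sampling rate in Hz (assumed)
eeg = np.random.randn(60 * fs)  # 60 seconds of simulated signal

# Sample indices at which each target word appeared on screen.
stim_onsets = [2500, 7500, 12500, 17500, 22500]

# Cut an epoch from 100 ms before to 800 ms after each stimulus onset.
pre, post = int(0.1 * fs), int(0.8 * fs)
epochs = np.stack([eeg[s - pre : s + post] for s in stim_onsets])

print(epochs.shape)  # (5, 450): 5 trials x 450 samples each
```

Averaging these epochs trial by trial cancels out activity unrelated to the stimulus, which is what makes the event-related response visible at all.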

Sensing nonsense
The very first EEG studies to measure the processing of word meaning were conducted in the 1980s[2]. Adults were asked to read through a series of sentences, where the final word was either meaningful (e.g., ‘He took a sip from the [cup]’), or non-meaningful (e.g., ‘He took a sip from the [transmitter]’).

Over many hundreds of such meaningful and non-meaningful trials, the average waveforms produced in response to the final word were computed and compared.

The important result to come from this experiment was that meaningful and non-meaningful words elicited different patterns of activity within specific brain regions. Around 400 milliseconds after exposure to the final word, the waveform produced by non-meaningful items was more negative than that elicited by meaningful items. The participants were sensitive to whether or not the word’s meaning fitted with the rest of the sentence, and this sensitivity showed up in the brain’s response, measured within half a second of their reading the word.

The same findings have been replicated many times since the 80s[3], using the same or similar tasks that draw on a person's semantic processing, or their ability to sort out word meaning. That is, in specific parts of the brain, non-meaningful (or semantically 'incongruent') words stimulate more negative brain potentials than meaningful (or semantically 'congruent') words. This pattern is referred to as the 'N400 effect'. It has been elicited not only in studies with written words, but also in those with spoken language and sign language[4].
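The comparison behind the N400 effect can be mocked up end to end: average the epochs within each condition, then compare the mean amplitude in a window around 400 ms. A toy simulation in Python (every amplitude, noise level, and window bound here is invented purely for illustration):

```python
import numpy as np

fs = 500
t = np.arange(-0.1, 0.8, 1 / fs)   # epoch time axis in seconds
rng = np.random.default_rng(0)

def simulate_trials(n, n400_amp):
    """Simulated epochs: noise plus a negative deflection peaking near 400 ms."""
    bump = n400_amp * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))
    return rng.normal(0, 2, (n, t.size)) + bump

congruent = simulate_trials(200, n400_amp=-1.0)    # 'He took a sip from the cup'
incongruent = simulate_trials(200, n400_amp=-5.0)  # '...from the transmitter'

# Grand-average ERP per condition, then mean amplitude in a 300-500 ms window.
window = (t >= 0.3) & (t <= 0.5)
erp_con = congruent.mean(axis=0)
erp_inc = incongruent.mean(axis=0)
n400_effect = erp_inc[window].mean() - erp_con[window].mean()
print(n400_effect < 0)  # incongruent words are more negative: True
```

The negative difference between conditions in that window is, in essence, what the studies above report as the N400 effect.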

The N400 effect during childhood
In most studies involving children and infants, the EEG task is simplified by having the target word presented after a picture, rather than after a sentence. So, the N400 effect is elicited when the waveform in response to non-matching picture-word pairs (e.g., CAR-horse) is more negative than what is produced in response to picture-word pairs that do match (e.g., CAR-car).

Typically developing preschool-[5] and school-age[6] children consistently show the N400 effect. In fact, it's been observed in infants as young as 12 months old, although only in those whose word production skills were relatively advanced[7].

This doesn’t mean that babies who don’t show the N400 effect cannot tell the difference in meaning between two words. But it might mean their semantic representations of those words are not yet very specific or distinct. For example, if a child knows only that ‘dog’ refers to a four-legged animal, their semantic representation of a ‘dog’ may overlap with that of a ‘cat’. Those kinds of loose and under-specified semantic representations may not be enough to trigger the N400 effect[7].

In comparison with adults, the N400 effect tends to emerge a little later for young children and infants, and it also lasts for a shorter amount of time[7]. Semantic processing skills therefore seem to continue developing into adulthood, leading to more distinct and complex vocabulary knowledge, and more efficient access to that knowledge.

There is plenty yet to be discovered about how humans learn word meanings. To answer those sorts of questions, researchers are developing more and more elaborate ways of looking at complex brain processes. Gradually, such methodological advances will allow us to form a better understanding of how all the little skills – including those related to semantic processing – develop and interact and contribute to the ultimate goal of using language.

 

References

[1] 'Brain study' by Simon Fraser University available at https://www.flickr.com/photos/sfupamr/13878767643 under a Creative Commons Attribution 2.0. Full terms at http://creativecommons.org/licenses/by/2.0.

[2] Kutas, M., & Hillyard, S.A. (1980). Reading senseless sentences: brain potentials that reflect semantic incongruity. Science, 207(4427), 203-205.

[3] Kutas, M., & Federmeier, K.D. (2011). Thirty years and counting: finding meaning in the N400 component of the event-related potential (ERP). Annual Review of Psychology, 62, 621-647.

[4] Zachau, S., Korpilahti, P., Hämäläinen, J.A., Ervast, L., Heinänen, K., Suominen, K., Lehtihalmes, M., & Leppänen, P.H.T. (2014). Electrophysiological correlates of cross-linguistic semantic integration in hearing signers: N400 and LPC. Neuropsychologia, 59, 57-73.

[5] Byrne, J.M., Connolly, J.F., MacLean, S.E., Dooley, J.M., Gordon, K.E., & Beattie, T.L. (1999). Brain activity and language assessment using event-related potentials: development of a clinical protocol. Developmental Medicine and Child Neurology, 41, 740-747.

[6] Henderson, L.M., Baseler, H.A., Clarke, P.J., Watson, S., & Snowling, M.J. (2011). The N400 effect in children: relationships with comprehension, vocabulary and decoding. Brain and Language, 117, 88-99.

[7] Friedrich, M., & Friederici, A.D. (2010). Maturing brain mechanisms and developing behavioral language skills. Brain and Language, 114, 66-71.


Nicola Bell

PhD candidate, University of Queensland

Nicola Bell is currently completing her PhD at The University of Queensland, having graduated in 2014 with a Bachelor of Speech Pathology. The topic of her PhD is literacy development in children with cochlear implants, and her research interests extend more broadly to include language and literacy development in all school-age children.
