Your eyes reveal more than you know

Learn more about our new article, "Eye gaze patterns reveal how reasoning skills improve with experience", published in npj Science of Learning

My colleague Belén Guerra-Carrillo and I are based at the University of California in the United States, and our new research article, "Eye gaze patterns reveal how reasoning skills improve with experience", was published in the journal npj Science of Learning. The Nature team approached us with a few questions about the study, and we have provided further insight into our research findings.

What was the main aim of your study and why did you decide to investigate this topic?  

We wanted to know whether it's possible to hone one's general reasoning skills through education and, if so, how. We found previously that studying for the LSAT - an exam that heavily taxes reasoning skills - strengthens the brain network that supports reasoning and reduces activity in a brain region that is engaged when participants carry out a cognitively demanding task. However, that work didn't tell us in what way reasoning improved. To gain additional mechanistic insight, we leveraged eye tracking, a technique that can essentially track a participant's thought processes in real time.

Is eye gazing an activity that humans learn over time or is it purely a subconscious behaviour?

Moving our eyes to what we're focusing on is totally natural. It takes effort to learn not to look at what we're paying attention to. 

What were the key findings from the study?

We found that participants who were randomly assigned to study for the Logic Games section of the LSAT, as compared with the Reading Comprehension section, improved on a battery of reasoning tests that were very different from the kinds of problems they had practiced in their course. The LSAT Logic Games problems are text-based: students must read a series of premises and deduce a logical conclusion from them. By contrast, the transfer tasks that participants performed before and after taking the course are non-verbal: they require reasoning about the relationships among coloured shapes. Although they don't remotely resemble each other on the surface, all of these measures tax the ability to consider multiple relations between things, that is, relational thinking. This finding replicates our prior work in showing that practicing reasoning skills in one context can be beneficial in another.

To further test whether the mechanism of transfer of learning was an improvement in relational thinking, a foundational cognitive ability that is essential for reasoning, we turned to our eye tracking data. Behavioural analyses yield only two measures of performance: accuracy and response times. In contrast, eye tracking yields a rich dataset. By analysing the sequence of eye movements between stimuli and the duration of fixation on each stimulus while people solve problems on a computer screen, we can figure out how a participant identifies and uses the relevant pieces of information. Here, we used a transitive inference task in which participants had to determine, based on a complex visual stimulus array involving balance scales, which of the two coloured balls was heavier. We developed three eye gaze metrics to distinguish between practice-related changes in attention and two facets of reasoning on this task. We found that the biggest change associated with Logic Games practice was increased efficiency in encoding the relevant relations (e.g., a balance scale showing that a green ball is heavier than a yellow one). We could not have drawn this conclusion from the behavioural data alone.
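
As a rough illustration of how gaze data can be turned into metrics like these, here is a minimal sketch in Python. It assumes a hypothetical fixation sequence labelled by area of interest (AOI), with made-up AOI names and durations; it is not the analysis code used in the study, but it shows the basic ingredients such metrics are built from: total dwell time per AOI and the number of transitions between AOIs.

```python
from collections import Counter

# Hypothetical fixation sequence for one trial: (area of interest, duration in ms).
fixations = [
    ("left_scale", 320), ("right_scale", 410), ("left_scale", 180),
    ("response_balls", 260), ("right_scale", 300), ("response_balls", 220),
]

# Total dwell time per area of interest.
dwell_time = Counter()
for aoi, duration in fixations:
    dwell_time[aoi] += duration

# Transitions between consecutive fixations on different areas of interest.
transitions = Counter(
    (prev, curr)
    for (prev, _), (curr, _) in zip(fixations, fixations[1:])
    if prev != curr
)

print(dict(dwell_time))   # e.g. {'left_scale': 500, 'right_scale': 710, 'response_balls': 480}
print(dict(transitions))  # e.g. {('left_scale', 'right_scale'): 1, ...}
```

Quantities along these lines - how long a participant dwells on the relevant relations and how often they move back and forth between parts of the display - are the kind of raw material from which gaze metrics like the three described above can be constructed.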

Do other forms of non-verbal communication similarly influence reasoning ability? For example, hearing and touch.

This is an important area for future study. Can practicing reasoning in one modality transfer to another modality? If the tasks have overlapping cognitive demands, like relational thinking, I think that transfer of learning is plausible.


Will your study's findings have an impact on other industries? For example, health, education or artificial intelligence?

This type of approach could be applied in a variety of ways. Eye tracking provides more sensitive measures than behavioural data, so it could be useful for diagnosing cognitive difficulties in both educational and clinical settings, and for tracking change over time - for instance, improvements in cognitive functioning as a student takes a course or participates in a targeted intervention, or the effectiveness of a clinical treatment. Eye tracking is already employed in the field of artificial intelligence, for example, to study what drivers pay attention to on the road or to glean insights regarding personality. The eye-tracking industry is skyrocketing.

What is the next step for this field of research?

In the sphere of education, eye tracking could be used to determine when a student is struggling to understand a particular concept and to provide additional scaffolding. Eye tracking could also be used to identify more basic problems, such as weaknesses in processing speed (the ability to encode and use information quickly), selective attention (the ability to identify relevant pieces of information), working memory (the ability to keep relevant information in mind and manipulate it), or relational thinking (the ability to consider and integrate the relationships among pieces of information or ideas). In addition to diagnosing learning difficulties, eye tracking could be used to predict future problems and try to nip them in the bud.
