This article is the third of a weekly five-part series about how evidence can inform classroom practice. Read part one, in which Charlotte argues that an emphasis on evidence-based practice would lead to prescribed practice, or part two, on the role of a teacher and the purpose of education.
Research has both a cultural and an instrumental role to play in informing education practice
While research in almost all fields aims to approach “truth,” there is no single cookbook approach that can guarantee this outcome. Broadly, research is a process of testing our ideas. Education research is a form of social science research that aims to test ideas about education policy and practice. Social science research is characterised in much the same way as research in the pure sciences.
Research in all fields is based upon the same principles: the search for universalism (general principles); organisation (to conceptualise related ideas); scepticism (questioning assumptions and looking for alternative explanations); and communalism (a community that shares norms and principles for doing research; Merton, 1973).
However, social science research, and education research specifically, is different to science research in its context and scope. By the nature of the contexts that social scientists investigate, the questions that they pose, the methodologies that are used to collect evidence, and the forms of evidence that are collected, the evidence must be analysed and interpreted differently to that in the physical or natural sciences.
Like research in the physical sciences, education researchers pose significant questions that can be investigated empirically, link research with relevant theory, and use methods that permit direct investigation of the questions.
Like other social sciences, education research plays both an instrumental role in the generation of strategies, techniques, practices, and other means for achieving ends, and a cultural role in the provision of different frameworks for understanding and imagining social realities. Such frameworks can help teachers to develop different understandings of their practice, and to see and imagine their practice differently. The examination of practice through different lenses allows us to understand problems in new ways, or to see new problems we hadn’t anticipated (Biesta suggests feminist theory as an example of how cultural research can unveil problems not previously recognised, and help us toward resolution). These two roles, the cultural and the technical or instrumental, are distinguishable, but not easily separable, as they mutually inform and reinforce each other. If there is too much of either, education research risks losing relevance.
Education research is also different to other forms of social science research, such as psychology research. Education researchers pose questions about education policy and practice. They inquire about policy and practice at all levels of the education systems in which we operate, from individual students and teachers, to classrooms, schools, and school systems. They make predictions about what the impacts of policies and practices are or might be and why, about what teachers are teaching and how, and about what the outcomes of various practices might be in particular contexts and for particular students. Education researchers build on earlier work, challenging, re-examining, and extending ideas about what education is and can do, and about which educational strategies and activities “work,” and how.
Research from many other fields informs (and may be informed by) the results of education research: for example, psychology, particularly the fields of developmental, cognitive, and behavioural psychology.
Education research methods suit the purposes of education research. Some education research aims to test an explanatory hypothesis; some studies may be exploratory, identifying ideas for further examination; some may examine a particular case, place, event, interaction, system, policy, learner, teacher, practice, or technology. Education research with these aims may collect evidence, and the form and amount of evidence collected varies with the purpose and question of the research. This is the case in other fields of research, too. This instrumental research is balanced by cultural research that questions normative assumptions about education, constructs new frameworks, integrates ideas into new theories, and critically considers the normative roles, functions, practices, contexts, and values of education. Just as education and education practices cannot be value-free, nor can evidence and research, in education or in any other field. Such an assumption is fallacious.
Education, unlike science, involves interactions between related factors that are not tangible or concrete
There are important and irreconcilable differences between science and education. Education and the natural world are not as homologous as some would have us believe. Science describes physical interactions that are tangible, predictable, concrete, and often isolatable, leading us to develop reliable understandings of causal mechanisms. In contrast, education describes a complex process of mediated transactions between humans and their environment. “If teaching is to have any effect on learning, it is because of the fact that students interpret and try to make sense of what they are being taught” (Biesta, 2007, p. 8).
It is extremely difficult to measure, analyse, and interpret relationships between variables in education research in the same way that we can measure, for example, the laws of gravity or biochemical interactions in the cell. It is also much harder in education to manipulate a single variable at a time, or to examine a single relationship at a time, and virtually impossible to identify with perfect certainty practices as causes and learning as effects.
This is because education is an open, recursive, semiotic system: it has a high degree of interaction with external factors in the environment; it is characterised by behaviours which are caused by both internal and external feedback; and it operates through an exchange of meaning rather than physical force. In other words, while careful observations of the scientific world can reveal causal mechanisms, evidence collected about education practices is limited to suggesting approximate probable correlations.
“What works” often assumes education is a closed, deterministic system, in which relationships between factors can be controlled and observed directly, and in which causes necessarily trigger effects and effects must have direct causes.
While predictions about the outcomes of practice may be informed by evidence, they are by no means guaranteed. Claims about “evidence-based practice” are predicated on the belief that education is a straightforwardly causal process, so scepticism is warranted: evidence cannot give us causal rules for action in education.
Education research can fulfil a technical role in investigating practice (and other aspects of education) by collecting and interpreting evidence. The role of cultural and critical research is to ask important questions about the normative practices and values of education that allow us to identify, define, and respond to problems in education. This research is complementary to technical research, and the two forms of research each inform the other. Education is not like science, in that the interactions occurring are complex and largely unseen. There is a multitude of factors, known and unknown, that can affect the outcome of any intervention or practice, and thus we cannot assume causal relationships in the same ways we can in the physical sciences.