Beth Holland will soon be joining me as the co-author of EdTech Researcher, as she embarks on her doctoral work at Johns Hopkins. In this guest post, she reflects on how we should be assessing learning in a world of evolving access to information.
In my role as an EdTechTeacher instructor, I often begin workshops by asking, "What could learning look like?" As an initial activity, I encourage participants to examine the concept from the perspective of a school year. They identify the skills, knowledge, and preconceptions that students bring to the first day of school and then project to the last day. In this context, we define learning as growth - the sum of the knowledge and skills acquired over an extended period of time.
After that exercise, we break down the tasks. Learning becomes more immediate, and we attempt to identify not only the desired outcomes but also the processes to get there. However, this often becomes our stumbling block because we are asking what learning could look like as opposed to how it currently appears in our classrooms.
Too often as educators, we take our own ideas and experiences about teaching and learning and impose them on our students; and yet, none of us experienced school in the age of the iPhone, Google, or ubiquitous access to content and media. In Why School, Will Richardson warns that “we need to prepare kids for old-school expectations and new-world realities alike.” We need to consider both our experiences as well as their reality. Without this consideration, teaching and learning often become disconnected as students exist between two worlds - theirs and ours.
The Metrics of Learning
One of the greatest distinctions between our learning and our students’ may be their access to information. With a single device, they carry a global library as well as an instant connection to people around the world. The learning experience no longer needs to be confined to the walls of a classroom, the hours of a school day, or even the guidance of a single teacher. Many teachers and students agree that how students now learn has changed, and yet, considerable debate exists with regard to the methods used to measure that learning.
Throughout modern history, educators have treated learning as something that could be measured. Essays, tests, and entrance exams have been a staple of education because they served a critical purpose: they identified whether or not knowledge had transferred from the teacher to the student. Until recently, this has been the primary indicator of learning. Without a guarantee of this transaction, there was no assurance that a student could later recall the information. Given the historical scarcity of knowledge, this metric made perfect sense. Learning equaled the sum of all knowledge retained and repeated. A greater emphasis was placed on the ability to recall facts because the larger the mental library, the more effectively a student could reference that knowledge in order to build new connections.
Outside of the K-12 classroom, we regularly research answers and reach out to others to support our own learning. In fact, I contacted a number of colleagues in the UK for background information on the previous article. And yet, we continue to measure learning primarily by the efficiency of student recall. Professor David Perkins raises the question, "What's worth learning?" in his new book, Future Wise. He argues that when students have ubiquitous access to information and facts through mobile devices, then perhaps we should focus more on the content, processes, and skills that have relevance to their lives rather than on whether or not they can regurgitate a mountain of disparate facts.
The Google Dilemma
In his 2012 TEDx Talk, Ben Kestner said, “If the kid can Google the answer, maybe we’re asking the wrong questions.”
Along similar lines, John Seely Brown and Douglas Thomas call out the need for both explicit and tacit knowledge in A New Culture of Learning. They identify the former as easily repeated, articulated, and tested knowledge - essentially information that is Google-able. While students need explicit knowledge available as a schema in order to cultivate deeper understanding, they contend that learning should not stop at that point. For example, students need the explicit knowledge of vocabulary in order to be effective communicators as well as the ability to perform mental calculations in order to quickly apply mathematical thinking to more complex concepts.
Given the speed at which students can access content, and the rate at which that content may change, Brown and Thomas state that a greater emphasis can now be placed on the acquisition of tacit knowledge - the product of synthesis, comprehension, and experience. In Tacit Expertise vs. Explicit Knowledge, Bill Ferriter points out that in his twenty-plus years as a classroom teacher, no parent has ever prioritized explicit knowledge. When asked, parents - and teachers - want students "to 'learn how to learn' or to 'develop persistence' or to 'Discover their own interests' or to 'think critically.'" However, current assessments typically focus on the retrieval of explicit knowledge that may or may not be worth knowing rather than on tacit understanding.
As information continues to evolve, and as the technologies that empower students to construct understanding and create artifacts as evidence of their learning further develop, the need for students to piece explicit knowledge into tacit expertise becomes increasingly critical. The question remains, though: what should we be measuring, and how?
The opinions expressed in EdTech Researcher are strictly those of the author(s) and do not reflect the opinions or endorsement of Editorial Projects in Education, or any of its publications.