What exactly is environmental literacy, a phrase that gets bandied about a lot? And how should it be assessed?
A small group of experts in the subject—backed by a National Science Foundation grant—has been working on a project over the past year to address these matters, and yesterday released an executive summary of their forthcoming framework for assessing environmental literacy. The assessment equation, as articulated in that document, is not simply knowledge plus skills. It also involves dispositions and even “environmentally responsible behavior,” the framework suggests.
The effort comes as a global assessment of environmental literacy looms in 2015. That’s when the high-profile Program for International Student Assessment, or PISA, will for the first time include an optional exam on the topic. In fact, in August, the team that developed the new environmental literacy framework also submitted a plan to the Organization for Economic Cooperation and Development to help guide that exam.
In addition, a report on the results of a recent National Environmental Literacy Assessment was released earlier this year. This voluntary exam, focused on the middle school grades, was first administered several years ago. It is NOT affiliated, by the way, with NAEP. Work on it has been supported by both the National Oceanic and Atmospheric Administration and the Environmental Protection Agency. In fact, the second round of testing was given only to a set of schools with an environmental focus.
Anyway, back to the assessment framework unveiled yesterday. Crafted by a six-person team of experts in environmental literacy, in consultation with 17 outside experts across disciplines, the document is likely to spark some debate, especially on the matter of assessing “environmentally responsible behavior.” First off, that’s no easy task, even the authors acknowledge. Also, I imagine some people might question whether it’s even appropriate to try, at least in a school context. (I can see the critique already, that the framework is looking to inculcate an army of eco-warriors.)
I’ll come back to those issues in a moment, but another thing that is striking about the framework is that it’s short on detail. It appears to leave a lot to the imagination of test designers, or those who would engage them. Perhaps most notably, the executive summary (and I’m told the same is largely true of the forthcoming full framework) provides very little guidance on what exactly students should be expected to know about environmental issues, or even which general issue areas are the most important to understand today.
This is by design, but while some educators might be pleased by that, others may find it frustrating.
When the full document comes out next year, it will include some suggested “contexts” for environmental literacy, such as biodiversity, land use, and population growth, said Karen Hollweg, a member of the core writing team and the former president of the North American Association for Environmental Education. But she said the idea was to allow those who reference the framework, whether in states, at the national or global level, to decide which particular environmental concepts and core ideas are most pressing and relevant.
“We’ve suggested contexts that could be used for designing [test] items, but we’re only suggesting contexts, because, as I said, it will depend tremendously on the locale and age range and so forth of the population to be assessed.”
Stepping back, she explained: “The way I think of this framework, it’s a rough plan for building a house, a rough plan for building an assessment.” As such, it requires a lot of work to “fill in the blanks,” she added.
She also cautioned that this enterprise was operating with limited time and resources, basically a year with a $108,000 grant from the NSF.
So, what exactly does the framework say? The executive summary, only about five pages long, identifies four “interrelated components” of environmental literacy: knowledge, dispositions, competencies, and environmentally responsible behavior.
Under the knowledge strand, it highlights:
• physical and ecological systems;
• social, cultural, and political systems;
• environmental issues;
• multiple solutions to environmental issues; and
• citizen participation and action strategies.
The competencies include such skills and abilities as identifying, analyzing, and investigating environmental issues, as well as using evidence and knowledge to defend positions and resolve issues. The dispositions identified include sensitivity, attitudes, personal responsibility, and motivation, among others.
As for environmentally responsible behavior, there’s no quick list offered. Instead, the document outlines several ways of conceptualizing such behavior for purposes of large-scale assessments, including an approach that encompasses political and legal action as well as ecomanagement and persuasion.
In fact, the document concludes that environmentally responsible behavior is the “ultimate expression of environmental literacy.”
“If there is a place that people are going to have some trouble and disagreement, it’s probably on the behavior part [of the framework],” acknowledged William McBeth, a professor of education at the University of Wisconsin-Platteville who helped craft the document (and also has played a leading role in the National Environmental Literacy Assessment). “It’s hard to measure, and some may think we should not be prescribing behavior.”
But McBeth was quick to add that the framework does NOT prescribe any specific behaviors, even as it does make reference to some examples of how to conceptualize it, such as political and legal action.
Scott Marion, a testing expert who participated in yesterday’s event, offered praise for the framework in an interview, saying he appreciated the way it represents a “multifaceted domain” and its approach of moving beyond simply knowledge and skills.
At the same time, he suggested that far more flesh is needed on the bones before this kind of guidance could be handed over to test-makers.
“There’s not enough detail to build an assessment without allowing the assessment developer more leeway than we usually like to give them,” said Marion, the associate director of the National Center for the Improvement of Educational Assessment. “This is sort of a first step in trying to identify what are the broad topics to study.”
A version of this news article first appeared in the Curriculum Matters blog.