Published in Print: January 30, 2008, as Tests of Tech Literacy Still Not Widespread Despite NCLB Goals

Tests of Tech Literacy Still Not Widespread Despite NCLB Goals


Any educator who’s ever had to ask a pupil to fix a computer might be surprised to learn that not all students are technologically proficient—or at least not savvy enough to be considered “technologically literate.”

While that term has no universal definition, the core idea could be boiled down to this: Technologically literate students not only know how to operate hardware and software—they can also analyze the information flowing through it, evaluate that digital content’s relative merit and relevance, and use it creatively and ethically in communicating with others.

The federal No Child Left Behind Act, signed into law six years ago, made it a national goal for all 8th graders to be technologically literate. Unlike reading and math, though, tech literacy does not factor into the law’s school accountability provisions, and most states do not administer separate tech-literacy tests statewide.

Still, at least one test-maker has seen the NCLB goal as an opening and developed assessments of tech literacy for 8th graders. Appropriately, those tests do not use paper and pencil, but instead are delivered to students via computers.

Learning.com, a privately held Portland, Ore.-based company, has sold hundreds of thousands of its middle school version of TechLiteracy Assessment since the test was launched in 2005. “I think we’re at the early stages of this market—we’re just seeing a few of the early-adopter states that are doing an assessment [of tech literacy],” said company spokesman Mark Tullis.

The Educational Testing Service, the Princeton, N.J.-based nonprofit testing giant that administers the SAT, has produced an online version of a tech-literacy assessment called iSkills that is appropriate for high school seniors and college freshmen, and another version for college juniors.

Stephen Denis, ETS’ iSkills product manager, said that the assessment is marketed only to colleges and universities. He estimated that less than 5 percent of the roughly 15,000 iSkills tests that ETS has administered since the assessment was launched in 2005 were taken by precollegiate students.

Moreover, the company has no immediate plans to come up with a test for 8th graders, the grade level specified in the NCLB law, said ETS spokeswoman Karen Bogan.

“If [technological literacy] becomes part of NCLB [accountability requirements], we’d have more of a drive to do that,” she said. Under NCLB, states that receive federal Enhancing Education Through Technology grants must report their progress toward making their students technologically literate by the end of 8th grade, but each state defines technological literacy for itself.

The lack of teeth in that provision is often cited as a reason that the market for online assessments of students’ technological literacy has not caught up to the national goal. “It’s not that we don’t want to do that—there’s not a market demand for that right now,” said Ms. Bogan of the ETS.

“It’s a slow-growing market right now because it’s a voluntary test.”

No Momentum

When the NCLB law was enacted, “we were hoping that we’d see a wave of high-quality, 21st-century assessment tools,” said Donald G. Knezek, the chief executive officer of the International Society for Technology in Education, a Washington-based professional organization that advocates greater use of technology in schools.

Instead, he said—in part because the U.S. Department of Education didn’t collect information about states’ assessment of technological literacy, and because states were too busy testing reading and math proficiency for accountability purposes—“there wasn’t enough momentum to guarantee a market to invest in those quality products.”

“There’s so much pressure on the system to test to whatever tests are being required,” said Elsa M. Garmire, a professor of engineering at Dartmouth College who chaired a committee on the assessment of technological literacy for the National Academies. “Technology literacy has still not really been adopted, other than the concept of how to use computers.”

The committee’s 2006 final report, the culmination of a two-year study, found the assessment of technological literacy to be still in its infancy, concluding: “No one really knows the level of technological literacy among people in this country.” According to Technology Counts, an annual Education Week report on school technology, only four states—Arizona, Georgia, North Carolina, and Utah—offer statewide testing of students on technology.

“We have been frustrated,” said Mark Schneiderman, the director of education policy for the Software and Information Industry Association, a Washington-based trade group that includes many publishers and online-assessment companies. “My sense is that there’s a great desire at the state and local level to look at these kinds of [technological-literacy] skills. But there’s a challenge with curricular requirements and overtesting.”

Learning.com’s Mr. Tullis acknowledged those barriers, but said his company remains committed to working with states: “There are still states that want to know their students’ tech-literacy levels, regardless of whether the Department of Education is telling them what to do or not.”

Different Approaches

That was the case in Arizona, which became the first state to buy and launch Learning.com’s 8th grade tech-literacy assessment statewide in 2005.

“We did this just because we thought it was the right thing to do,” said Cathy J. Poplin, the state’s deputy associate superintendent for educational technology. “The least of the reason was the feds.”

Fifth and 8th graders in the state are tested twice a year, in the fall and in the spring.

“The data that we’ve received back has been phenomenal,” Ms. Poplin said. “Within 48 hours, [districts] can have their results back. School-level results, class-level results, student-level results. A teacher can drill down and make correlations.”

Other states have tweaked off-the-shelf assessments. “There’s nothing out there that meets our exact needs,” said Dee Appleby, the director of South Carolina’s office of e-learning.

“I think Learning.com’s probably the closest we’ve seen to the ISTE standards,” she said, referring to the tech-literacy standards drawn up by the International Society for Technology in Education, which have been adopted by most states as the starting point for their own standards.

Still, Ms. Appleby noted, the state has employed a Maryland-based company to customize Learning.com’s TechLiteracy Assessment for its particular needs.

“No two states are the same,” she said. “It’s difficult to come up with one baseline program that will work for everybody.”

Florida has taken the customization idea to the nth degree, having its technological-literacy assessment built to suit at Florida State University’s Florida Center for Interactive Media, in Tallahassee.

“You can’t build something that’s one-size-fits-all,” said Kate J. Kemker, the state’s bureau chief for instruction and innovation.

Congressional Action

This year, for the first time, the Education Department has collected data on tech literacy, something Mr. Schneiderman of the SIIA called a step in the right direction.

And bills introduced last year in both the U.S. Senate and the House of Representatives may portend even wider changes. Called the Achievement Through Technology and Innovation Act, the bills would uniformly define student tech literacy, and would authorize a maximum of $2 million a year to develop an annual national report on the subject.

Looking back on the federal approach to student technology literacy since the NCLB’s passage, ISTE’s Mr. Knezek decried what he called the Education Department’s “selective enforcement.”

“The negative leadership they’ve shown has cut seriously into 8th graders’ tech literacy,” he said.

Timothy J. Magner, the director of the Education Department’s office of educational technology, said it was “probably a fair statement” that commercial assessments of tech literacy would have grown more quickly if the department had collected such data since the NCLB’s enactment, and that states would by extension be farther along in assessing students’ tech literacy.

But, added Mr. Magner, a former executive director for K-12 education at the Microsoft Corp. and a former deputy executive director of the Council of Chief State School Officers: “I’m not sure it’s quite as causal as Don [Knezek] would say. That’s a market dynamic.”

Vol. 27, Issue 21, Pages 1,12
