Special Report

Reviewers Play Critical Role In Market for Digital Content

By Mary Ann Zehr — September 13, 2017

On a drizzly day here in this small town, a new piece of software is making a good impression on Warren Buckleitner, a former teacher and the editor of Children’s Software Revue magazine.

He’s sizing up a CD-ROM called What’s the Big Idea, Ben Franklin? that’s designed to teach 3rd to 7th graders about the life of the colorful statesman, writer, and inventor.

“I’m very excited about this. This is obviously not produced by amateurs,” Buckleitner says 20 minutes into cruising through the software, which is published by New York City-based Scholastic Inc.

But when he sits down to draft a rating for the software several hours later, his excitement has been tempered a bit, mainly because of some problems with its design.

“This content is very seductive,” he says. However, he adds, “The fact is, the program has a couple of things that could be better. It needs a little sanding.”

After evaluating the software against six main criteria (whether it’s easy to use, childproof, educational, entertaining, designed well, and worth its price), Buckleitner gives it a 4.4 rating on a 5-point scale. He’ll explain why in a written review for an upcoming issue of Children’s Software Revue, which is published by Active Learning Associates, the company Buckleitner and his wife, Ellen Wolock, founded in 1993.
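
The article doesn’t say exactly how the six criterion scores are combined into the overall 4.4. As a rough illustration only, here is a minimal Python sketch that assumes the overall rating is a simple average of 1-to-5 scores on the six criteria; the individual scores below are invented, not the magazine’s actual figures.

    # Hypothetical rubric scoring for the six Children's Software Revue criteria.
    # Assumption: the overall rating is a plain average of 1-to-5 criterion scores;
    # the individual numbers here are made up for illustration.
    criteria_scores = {
        "ease of use": 4.5,
        "childproof": 4.0,
        "educational": 5.0,
        "entertaining": 5.0,
        "design features": 3.9,
        "value (worth its price)": 4.0,
    }

    overall = sum(criteria_scores.values()) / len(criteria_scores)
    print(f"Overall rating: {overall:.1f} / 5")  # 4.4 with these invented scores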

While Buckleitner sometimes seems like a boy at play, snickering at how an animated Franklin talks out of the side of his mouth or toying with an innovative technical feature of the program, the educational software industry takes his evaluations seriously. The Software Publishers Association (now the Washington-based Software & Information Industry Association) named him its best software reviewer in 1995.

“In a competitive market, companies need reviews to get the name of a product out in front of potential buyers,” says Sue Kamp, the director of the education market division for the SIIA. Companies regularly ask her which reviewers they should woo, she says, and mention good reviews liberally in their marketing materials.

Educators rely on the reviews as well, she adds. “They are looking for sources to turn to [for recommendations] because they don’t want to have to walk through each product. They don’t even know what products to request for review.”

Wearing a Variety of Hats

The process Buckleitner goes through to review What’s the Big Idea, Ben Franklin? shows that evaluating educational software is, in many ways, more complex than assessing a printed textbook.

For one thing, most software titles are designed both for home and school use, so Buckleitner tries to evaluate each product from a variety of perspectives.

“I always have my psychology hat on, my teacher hat on, my parent hat on, my child-reality hat on,” he says.

Furthermore, while technical quality is rarely an issue for a book, it’s very important when considering the usefulness of a computer program.

In fact, Buckleitner has little to criticize about the actual subject material of What’s the Big Idea, Ben Franklin? He sums up the program as one of the most exciting he’s seen recently and says it will make his list of top 100 school software titles for the year.

But the technical aspects of the program, while generally well-executed, have a few snags, the reviewer finds.

For one, the software allows the teacher to turn off the sound for certain sections of the program, but not all of them, something for which Buckleitner docks the rating slightly.

“I’d like as a teacher to be able to turn the whole bloody thing off,” he says, noting that teachers often find audio distracting.

Also, What’s the Big Idea, Ben Franklin? doesn’t permit a student to choose a particular level of subject matter from the outset, nor does it automatically adjust the level based on a student’s responses. Buckleitner takes the rating down another notch.

One oversight by the developer causes Buckleitner no small annoyance. After a half-dozen tries, he can’t figure out how to get into the program’s teacher resources. He finally discovers he must type in the word “teacher” in the space labeled “name,” something that isn’t noted in the program or teacher’s manual. The rating slips ever so slightly yet again.

Buckleitner and the other three reviewers for Children’s Software Revue acknowledge having biases about what constitutes high-quality software, but they try to be upfront about them.

They say they prefer programs that give a child control over the learning and technical aspects of the program. For example, they note whether students can exit a program at any time or determine how quickly they move through the lessons.

The reviewers also place a high priority on how entertaining the software is. They try to assess whether their instincts are right by testing each program with at least one child before they finish up a review.

“If it’s not fun, forget it,” says Wolock, the magazine’s managing editor.

A Team Approach to Reviews

While Children’s Software Revue is a leader in the software-evaluation business, there are at least a dozen other outfits, each with its own criteria, that publish reviews in magazines, in books, or on the Web.

Several are nonprofit operations. The California Instructional Technology Clearinghouse, for example, which is run by the Stanislaus County Office of Education, pays particular attention to how a software title’s content corresponds to the state’s academic standards.

All the reviewers are classroom teachers who have received a day and a half of evaluation training from the clearinghouse.

Each piece of software is reviewed in one sitting by a team of three teachers, who decide on a rating. One of the teachers then tests the software in his or her classroom to make sure it works out as the team expects.

Some districts, however, prefer to use their own criteria for software evaluation and buy software based on reviews by their own teachers.

Prince George’s County, Md., schools, for example, assign a team of three teachers plus one community member to evaluate every piece of software being considered for purchase. The group meets to examine and discuss how the software performs in 10 areas, including critical thinking, curriculum content, and pedagogy. The finer points range from whether the objectives of the software are clearly stated to whether it is free of racial and ethnic stereotypes.

The reviewers test the software together with a hands-on activity and try to simulate as much as possible how it might be used in the classroom. If they approve the software, they write a description of the product and how it fits into the district’s curriculum in a guide that goes to teachers.

While any teacher may ask that a particular piece of software be considered, he or she cannot purchase it unless it makes the district’s official approval list.

“Teachers should not be using any software in any classroom unless it’s on this approved list,” says Judy Finch, the chief of technology support and training for the 127,000-student suburban Washington district. “We want high-quality materials used with the youngsters.”

The 92,000-student Jefferson County, Ky., district, meanwhile, opts for a less rigid approach. School technology coordinators may buy software of their choosing, but are expected to preview the titles before writing up a purchase order, which is channeled through the district.

The district, which spends about $2.5 million each year on curriculum-based software, facilitates the process by maintaining an up-to-date software library and list of titles recommended by school technology coordinators and district-level resource teachers.

In addition to reviews, awards given out by education groups, journals, and the software-publishing industry can act as a form of evaluation and add credibility to a piece of education software. Among the most prominent are the Codie Awards presented each year by the SIIA.

Web Sites Evaluated, Too

Reviewers are starting to pay more attention to evaluating Web sites, though most formal outfits evaluating digital content still give more attention to stand-alone software.

Children’s Software Revue, for example, has developed separate criteria for Web-site evaluation and includes information about children’s Web sites in the magazine, but so far hasn’t published any full-length reviews of Web sites. This is partly because most Web sites are free, and the magazine has traditionally focused on commercial digital content.

“We intend to cover the Internet more,” says Buckleitner. “It’s taken us like a tidal wave .... We definitely are thinking along the lines of it doesn’t matter what form content is. If it’s electric and on a screen, we want to evaluate it.”

A fair amount of Web-site evaluation is happening on a grassroots level, says Kathy Schrock, who manages the widely used Kathy Schrock’s Guide for Educators on the Discovery Channel School site. This is appropriate, she says, because the bottom line for a site’s quality is whether it serves a teacher’s purpose.

“I really think teachers are starting to look at the information [on the Web], compare it to their knowledge base, effectively evaluate it, and teach those skills to students,” says Schrock, who has written a book on how to evaluate Web sites.

She notes as well that a number of organizations that manage educational sites, including the American Library Association, have evaluation criteria they use before adding a link to their sites and often post these criteria on the Web.

The MCI WorldCom Foundation has set up a Web site, called MarcoPolo, that brings together six such organizations: the American Association for the Advancement of Science, the National Council of Teachers of Mathematics, the National Endowment for the Humanities, the Council of the Great City Schools, National Geographic, and the National Council on Economic Education.

Each partner, in turn, has given its “seal of approval” to other Web sites that educators might find useful.

The AAAS, for instance, promotes “Super Science Sites” that have met 12 criteria, including whether the content is presented in a logical sequence, whether different scientific viewpoints are presented where appropriate, and whether the science content is accurate.

Evaluation of digital content, of course, is not an exact science.

A survey of 544 MarcoPolo users found that most gave the overall site high marks and believed it is very important for individual sites to be endorsed by a leading educational organization. But Schrock is less impressed with the effort.

As of this summer, Schrock hadn’t included the MarcoPolo site on her recommended list, saying it doesn’t yet contain enough useful information.

“I’m reserving judgment,” she says.


A version of this article appeared in the September 23, 1999 edition of Education Week
