Special Report
Classroom Technology

Testing Tech Products in the Classroom

By Michelle R. Davis — June 06, 2016

At Pittsburgh’s Avonworth School District, educators are experimenting with a new way to test digital tools they might buy for their classrooms.

In the past, the approach to such an ed-tech pilot project might have involved an administrator or teacher hearing buzz about an app or piece of software, trying it out in a class for some period of time, then recommending it based on whether students or teachers said they liked it. But in Avonworth this year, that process is more formal, with upfront planning, a relationship with the product vendor, and conclusions based on hard data.

The old way “was more of an impulse buy,” said Scott Miller, the principal of the Avonworth Primary Center, a K-2 school. “That’s not really effective. We want to make an educated, informed decision to see if a product is a fit for us.”

School districts routinely do some kind of testing to sample ed-tech products for their students and often invest in much of that technology. In 2014, pre-K-12 schools spent $8.3 billion on educational software and digital content, according to the Software & Information Industry Association.

But the evaluations often look very different from district to district, or even within the same district. They can be short-lived or stretch over several academic years. And the trials can be amorphous exercises with no defined way to determine which products are best.

“I see a lot of misunderstandings during this process,” said Katrina Stevens, the deputy director of the office of educational technology at the U.S. Department of Education. “It’s ripe for improvement.”

As districts are inundated with ed-tech products that aim to solve their “pain points” and claim to provide everything from personalized instruction to gamified content, finding ways to help districts run more effective pilot projects—and ultimately make better spending decisions—has become a high priority.

More structured pilot projects are now being encouraged through a number of initiatives. For example, the Learning Assembly project, funded by the Bill & Melinda Gates Foundation, brings together seven organizations working to improve school and district ed-tech projects. (The Gates Foundation also provides support for Education Week’s coverage of college- and career-ready standards and personalized learning.)

One of those organizations, the Washington-based nonprofit Digital Promise, which promotes the use of ed-tech in schools, is working with the 1,650-student Avonworth district, and Miller said that’s made a big difference.

Under the project, Avonworth was paired with researchers from Pittsburgh’s Carnegie Mellon University, who helped the district set up two elementary-grade pilots this academic year. eSpark, a personalized-learning program, is being used in 1st grade classes, while the digital toy Puzzlets is being sampled in grades K-2. Students used the tools from October through April.

District officials worked with researchers upfront to determine if the products were aligned with district needs, Miller said. And the district collaborated closely with both vendors to be sure teachers were using the products as intended. Teachers also provided feedback about how eSpark and Puzzlets worked or didn’t, Miller said.

For eSpark, the district will primarily use student-growth and -achievement data to determine effectiveness. For Puzzlets, the district is strictly looking at engagement and student interest, Miller said. Out of this process, the district hopes to craft a system or checklist for pilot projects that can be replicated when its grant through the Gates Foundation runs out, Miller said.

“This is going to allow us to make an educated and informed decision on whether these products are a fit for us,” he said. “If they are, great, but do they need any tweaks? If not, we’ll walk away, no harm, no foul.”

Financial concerns play a major role in the growing interest in creating more formal ed-tech pilot projects in schools.

Ed-tech products “can be an enormous investment for a district,” said Julia Freeland Fisher, the director of education research for the Clayton Christensen Institute, which studies blended learning. “They want to make sure they’re spending their scarce dollars wisely.”

The Education Department’s Stevens said her office is currently trying to improve rapid-cycle evaluations for ed-tech products by working to create an online pilot “wizard,” akin to a TurboTax for school-product-testing projects.

The digital toolkit would guide districts, on the front end, through how to do a needs assessment, the technicalities of rolling out a pilot, what questions to ask the product developer, and how to collect and analyze data to determine whether a product should be used on a wider basis.

Currently, the department is creating a prototype of the pilot tool and will test it out in districts in the fall, she said.

“We want to walk a school or district leader through setting up a pilot and evaluating the tools being used in their system,” Stevens said.

Other efforts to streamline the pilot-testing process are more regional. LEAP Innovations, a Chicago-based nonprofit that is also part of the Learning Assembly project, is working with schools to bridge the “gap between innovation and education,” said CEO Phyllis Lockett.

LEAP partners with Chicago-area K-8 schools to match them with ed-tech companies seeking to pilot their products. “Many of our schools get calls from vendors constantly, and they don’t know where to start,” she said.

LEAP has a panel of experts, including learning scientists, educators, and ed-tech investment experts, to evaluate and vet ed-tech products. Those that are approved are matched with individual Chicago-area schools for one-year pilots. Educators involved receive semester-long professional development before their project launches to hone their role in the process, Lockett said.

LEAP then works with researchers to crunch the data. “We can tell schools if the solution moved the dial on achievement,” Lockett said.

Digital Promise is working on pilots with several other districts in addition to Avonworth. It also plans to use the information it has gleaned to create a product-testing template, which can then be tailored to each district’s unique characteristics, said Aubrey Francisco, the organization’s research director. Digital Promise is also hoping to share the results of district pilot projects to provide information to other educators.

“A district might look at a study and say, ‘I feel comfortable using this product,’ based on the research done elsewhere,” she said.

Along those lines, Jefferson Education, a commercial entity advised by the University of Virginia’s Curry School of Education, hopes to build a system to share robust pilot-project information with valid data on a wider scale, so that every district doesn’t have to do its own test of a product, said CEO Bart Epstein. The project is in the beginning stages, he said.

“Very few schools have the bandwidth to be able to do pilots properly,” he said. “Right now there are probably 1,000 school districts all reviewing the same 15 math products.”

In a 2015 Digital Promise study, researchers found that districts’ prevailing processes for testing technology products are largely informal and often lack a clear approach and consistency. The study also found a disconnect between the aims of companies looking to test their products and the aims of the schools hosting those tests.

“There’s a real need to have a more structured process to talk about what is needed, how to bring teachers in early so they buy in, how to work with the developer, implement properly, and measure success,” Francisco said.

Some of these new pilot efforts may also help districts that have already purchased ed-tech software and digital tools in an ad hoc way. That’s the situation in the 1,200-student West Ada district in Meridian, Idaho, where 50 different math programs are being used across schools, said Eian Harm, the district’s research and data coordinator.

West Ada is, in effect, trying to do pilot projects in reverse on the five most popular math programs to determine which ones are most effective, Harm said.

To that end, a tool like the new EduStar platform might be of help, said Benjamin F. Jones, a professor of entrepreneurship and strategy at Northwestern University’s Kellogg School of Management, who is a co-creator of the platform.

EduStar, developed in collaboration with the nonprofit digital-learning provider PowerMyLearning, aims to provide rigorous and rapid trials of digital-learning tools and more granular content, like a lesson, a video, or a game. Those trials can take just a few minutes—to test an app, for example—and are done through an automated system, he said. Currently, the system is being tested with 40 schools already using the PowerMyLearning platform, but Jones said he hopes to add many more that want to test out digital content.

The goal is to provide feedback to the developer about how a product works in a real classroom and to communicate deeper research about why or how certain games or techniques work or don’t, Jones said.

“In the long run,” he said, “we hope the system can scale so it could test large numbers of digital-learning activities and provide a Consumer Reports function in the marketplace.”

Coverage of trends in K-12 innovation and efforts to put these new ideas and approaches into practice in schools, districts, and classrooms is supported in part by a grant from the Carnegie Corporation of New York at www.carnegie.org. Education Week retains sole editorial control over the content of this coverage.
