Published in Print: May 17, 2006, as Chat Wrap-Up: Technology Counts 2006

Commentary

Chat Wrap-Up: Technology Counts 2006


Education Week sponsors regular online chats on its Web site, edweek.org. On May 5, the topic under discussion was the special report Technology Counts 2006: The Information Edge: Using Data to Accelerate Achievement (May 4, 2006). On hand to answer readers’ questions were Assistant Managing Editor Caroline Hendrie, the Technology Counts project editor, and Christopher B. Swanson, the director of the Editorial Projects in Education Research Center. Below are excerpts from the discussion.

Question: Computer technology changes rapidly. With so many budgetary constraints, how do schools keep up with the need to constantly update equipment? And how do they keep their staff up to date?

For More Info
A full transcript of this chat is available at www.edweek.org/chat/tc06/

Hendrie: Many schools and districts report feeling the pinch as they try to keep current with the rapidly evolving world of education technology, especially as some prominent sources of federal and state funding for technology have retreated in recent years. That said, many educators have had success in pursuing grant funding for technology initiatives from private foundations as well as government sources. Many companies are seeking partners to help them pilot products in a real-world setting as well.

As for professional development, there’s no doubt that this area needs to be a priority. Without training, computers and other technology risk becoming dust-catchers. Technology Counts points to one potential bright spot in this area: Our research center’s survey of the states found that two-thirds of them cited professional development as one of their top two priorities for education technology spending this school year. That percentage was far greater than for any other single priority cited.

Question: Do you think there will be a future requirement for SIF-compliant data to address the centralization of state data?

Swanson: For those not familiar with the alphabet soup of education technology, SIF stands for Schools Interoperability Framework. To oversimplify, this initiative deals with developing standards so that data systems can communicate more seamlessly and effectively with each other.

As states increasingly get down to the business of building and developing data systems, issues such as interoperability are becoming a hot topic (at least as hot as you can get in the world of data and data systems). Whether or not there will be formal requirements from the federal government or the states is hard to tell.

The key issue, though, is whether states are taking interoperability into account voluntarily as they design their systems. Data systems will be both more functional from a technical perspective and more educationally useful if they are designed with the same interoperability standards in mind.

Technology Counts reports, based on information from Market Data Retrieval, that 82 percent of schools are not aware of SIF. So there is much work to be done.

Question: Many colleges and universities have integrated e-learning study skills into their “college survival” courses for new students. Did you see any similar trends with states?

Hendrie: Though this level of detail was not part of this year’s report, we do know that many schools are incorporating online study skills into their curricula. School librarians are often acting as point people to help students acquire skills for identifying reliable sources on the Web or avoiding plagiarism, for example. We also have reported on the recent move by Michigan to require students to complete an online-learning assignment as part of a package of changes to its graduation requirements. Students could reportedly satisfy the requirement by taking an online course for credit, for example, or a noncredit test-preparation course online, among other ways.

Question: How do you respond to those who advocate for more and more money for technology even though a causal link between authentic student achievement in school and access to technology remains to be established?

Hendrie: First, let’s make clear that we’re neither advocates nor critics of education technology. And the effectiveness of computers in increasing student achievement was not the main topic of this year’s report, although it has been in the past. I will say, though, that it stands to reason that access to technology, coupled with teachers trained to use it to promote higher-order-thinking skills, can help prepare students for a world in which familiarity with information technology is more and more taken for granted. Whether that presumption stands up to scientific scrutiny is something that lots of folks feel the field needs to examine more closely.

One potentially useful addition to the knowledge base is expected later this year, because the U.S. Department of Education is slated to release the findings of a major evaluation of the effectiveness of 16 computer-based products designed to help teach reading and math.

Question: Do the grades that each of the states earns proportionally reflect the monies allocated for technology in these states?

Swanson: The technology-leadership grades in this year’s report do not directly account for technology spending (although that was the special theme of last year’s report). Of course, it’s probably fair to say that some of the indicators included in our grading indirectly reflect financial support from the state in one way or another. For example, state funding may have been used to purchase computers, which factors into the grades.

But other indicators reflect actions that states can take without making a substantial financial investment. An example of a low-cost lever might be a policy establishing standards that outline expectations for what students should know about and be able to do with technology. So, while funding may help in some ways, we look at a mix of state-level strategies in the report.

Question: What is the greatest challenge faced by schools, school districts, and states in implementing data-driven decisionmaking?

Hendrie: Our research suggests that what’s needed most at the local level are easy-to-use tools that can connect the dots for educators and that educators feel comfortable using. All the data in the world doesn’t matter if folks can’t figure out what it means for them. But most teachers and administrators were not trained to be data analysts, and most of the people developing computerized data systems aren’t educators. So there is a disconnect there that needs bridging. Some leading thinkers in this field believe that the gap between what the data tools can do and what educators can do with them is getting wider. So that suggests that time and resources for training are critical.

Question: Are there states that use technology to manage and transfer knowledge among schools and/or districts and the research community?

Swanson: All states use technology to communicate data in one form or another. But states vary tremendously at this point in how sophisticated that communication is and the audiences they are able to reach. This can range from a Web site that contains “report cards” on school performance to a secure, Web-based data system that teachers can log into and call up detailed information on individual students.

The survey we conducted for this year’s Technology Counts addressed this issue to some extent. In particular, you can find a detailed table with a state-by-state view of the kinds of access and analysis tools the state offers educators and the public.

Question: What does good data-driven decisionmaking look like in the classroom, in the schools, and in districts? How do teachers or administrators use data to make instructional decisions?

Hendrie: For teachers, good data can reveal strengths and weaknesses of individual students, and for their classes as a whole. That can give them a road map for what they should be doing in class tomorrow, next week, next month, and next year. Our report found plenty of examples of this sort of activity around the country. In one school in Massachusetts, for instance, teachers who used to cut and paste old test items to cobble together warm-up tests for their kids are now letting sophisticated software programs do that work, leaving them time to tailor lessons to address their students’ needs. In California, groups of teachers are getting together in grade-level teams to scrutinize the results of benchmark tests their kids are taking three or four times a year, and then figuring out how to group them and get them what they need to succeed on state tests. And in Pennsylvania, teachers are getting students involved in looking at their own data, and in figuring out the strategies they should pursue to fill in their learning gaps.

One teacher engaged in that kind of work told us that analyzing data had made her a much more comprehensive teacher. She said she’s more attuned now to what her students need, rather than what she happens to enjoy teaching. “I can’t teach irony for six months just because I like it,” she told us.

Vol. 25, Issue 37, Page 34
