Education Chat

Technology Counts 2006: The Information Edge

Christopher B. Swanson, director of the EPE Research Center, and Caroline Hendrie, project editor of Technology Counts, discussed the findings of this year’s Technology Counts report and states’ use of technology and data.

May 5, 2006

Kevin Bushweller (Moderator):
Welcome to today’s online chat about Technology Counts 2006: The Information Edge: Using Data to Accelerate Achievement.

While the No Child Left Behind Act has touched off a boom in school data collection, much work needs to be done before the vast amounts of student information can be harnessed to improve learning, according to the findings of this year’s report, which is based on a systematic analysis of the structure and quality of states’ computerized data systems, and how those systems are being used.

We have some excellent questions waiting to be answered. So let’s get the discussion started ...


Question from Jennifer Adams, Teacher of Educational Technology:
I would like to have a copy of Technology Counts 2006. Where would I go to find it?

Caroline Hendrie:
You can find the report online at www.edweek.org/techcounts06. As for a print version, there is a link from that page to online ordering information. Also, all subscribers to the print version of Education Week receive a copy as part of their subscription. Nonsubscribers can receive a copy for $6 from Kable Product Services, P.O. Box 554, Mt. Morris, IL 61054; Phone: 800-788-5692; Fax: 815-734-5864. Thanks for your interest!


Question from Connie Louie, Instructional Technology Director, Massachusetts Department of Education:
In your report you counted the ratio of one instructional computer to the number of students in the CLASSROOM. What about schools that put their computers in computer labs? Or schools whose computers are in carts that they call mobile labs? The computers are NOT in the classroom, but when teachers use them, the ratio of computers to students is 1:1.

Caroline Hendrie:
Thanks for your question, Connie. The Access to Technology section of our grading scheme takes into account four indicators: two of them measure students per instructional computer in the classroom, and the other two look more broadly at all instructional computers in the school (including computer labs and laptop carts). Each state’s scores in those four categories carry equal weight in its access grade. Hope that helps.


Question from Sandra Johnson, Secondary Regional Instructional Specialist:
Computer technology changes rapidly. How do schools keep up with the need to constantly update equipment when many are faced with budgetary constraints?

Also how do we keep the staff up-to-date with constantly evolving technology?

Caroline Hendrie:
Excellent questions. Many schools and districts report feeling the pinch as they try to keep current with the rapidly evolving world of educational technology, especially as some prominent sources of federal and state funding for technology have retreated in recent years. That said, many educators have had success in pursuing grant funding for technology initiatives from private foundations as well as government sources. Many companies are seeking partners to help them pilot products in real-world settings, as well. As for professional development, there’s no doubt that this area needs to be a priority. Without training, computers and other technology risk becoming dust-catchers. Tech Counts 2006 points to one potential bright spot in this area. Our Research Center’s survey of the states found that two-thirds of states cited professional development as one of their top two priorities for education technology spending this school year. That percentage was far greater than for any other single priority cited. So the need for more and better training of educators does seem to be front and center for many state technology leaders.


Question from Debbie Pravatta, Software Developer, Technology Institute of Teamwork:
Do you think there will be a future requirement for SIF compliant data to address the centralization of state data?

Christopher B. Swanson:
For those out there not familiar with the alphabet soup of education technology, SIF stands for Schools Interoperability Framework. To oversimplify, this initiative deals with developing standards so that data systems can communicate more seamlessly and effectively with each other.

As states increasingly get down to the business of building and developing data systems, issues such as interoperability are becoming a “hot” topic (at least as hot as you can get in the world of data and data systems). Whether or not there will be formal requirements related to SIF compliance from the federal government or the states is hard to tell.

The key issue, though, is probably whether the states are taking interoperability issues into account on a more voluntary basis as they design their systems. Data systems will be both more functional from a technical perspective and more educationally useful if they are designed with the same interoperability standards in mind. This would apply to communication among different data systems that reside at the state level and to communication between state and local systems.

This year’s Technology Counts reported some information obtained from Market Data Retrieval regarding familiarity with SIF. That data showed that 82% of schools are not aware of SIF. So there is clearly much work to be done in building awareness of these issues at the school level.


Question from Sasheen Phillips, Assistant Director, Ohio Department of Education:
What was Ohio’s grade?

Christopher B. Swanson:
As you may know, this is the 9th edition of Technology Counts. But it is the first time we have awarded states letter grades for their leadership in technology policymaking.

States received a grade based on a set of 14 indicators that cover three specific areas: Access to technology, Use of technology, and Capacity for educators to use technology effectively. We have been tracking some of these indicators from the earliest years of the report.

Results for all states appear in the Technology Counts report, which can be accessed online at www.edweek.org/techcounts06. This year we also prepared individualized web-only reports that highlight the results for each state.

Here are some highlights for Ohio:

Ohio received an overall grade of B-minus, putting it above the national average (C-plus). The state did particularly well in the area of providing access to technology in the schools (receiving an A, compared with a C-plus in the average state). Ohio lags a bit behind the national average in the Use of Technology category, having implemented only one of the four policies we examine there. In the area of Capacity, Ohio is again ahead of the curve, receiving a B-minus in that category for its policies related to technology standards for teachers and administrators, as well as for including a technology requirement in the state’s initial teacher-licensing process.

I would also add that Ohio was one of the states that stood out from the crowd for having a highly developed statewide data system for students and teachers. That was the special theme of this year’s Technology Counts. You can find plenty more on that topic in the report.


Question from Ryan Watkins, Professor of Educational Technology, George Washington University:
Many colleges and universities have integrated e-learning study skills into their “college survival” courses for new students. Did you see any similar trends, with states focusing attention on preparing students with study skills for effectively using technologies (e.g., email, chat rooms, blogs, wikis) for academic success?

Caroline Hendrie:
For this year’s Tech Counts, our Research Center really didn’t get down to that level of detail when reviewing whether states had technology standards for students. So it’s a good question, but we haven’t really analyzed state standards from that standpoint. Meanwhile, we do know that many schools are incorporating online study skills into their curricula. School librarians are often acting as point people to help students acquire skills for identifying reliable sources on the Web or avoiding plagiarism, for example. We also have reported on the recent move by Michigan to require students to complete an online learning assignment as part of a package of changes to its graduation requirements. Students could reportedly satisfy the requirement by taking an online course for credit, for example, or a noncredit test-preparation course online, among other ways.


Question from Gregg Sinner, Education Alliance, Brown University:
How do you respond to those who advocate for more and more $$ for technology even though a causal link between authentic student achievement in school and access to technology remains to be established?

Caroline Hendrie:
Excellent question. First, let’s make clear that we’re neither advocates nor critics of education technology. And the effectiveness of computers in increasing student achievement was not the main topic of this year’s report, although it has been in the past (our 1998 Tech Counts focused on that, and featured an original study by Harold Wenglinsky of ETS on the relationship between technology and student scores on NAEP math tests). I will say, though, that it stands to reason that access to technology, coupled with teachers trained to use it to promote higher order thinking skills, can help prepare students for a world in which familiarity with information technology is more and more taken for granted. Whether that presumption stands up to scientific scrutiny is something that lots of folks feel the field needs to examine more closely. One potentially useful addition to the knowledge base is expected later this year, because the U.S. Department of Education is slated to release the findings of a major evaluation of the effectiveness of 16 computer-based products designed to help teach reading and math. Meanwhile, this year’s Tech Counts, while not focused specifically on the causal link between technology and achievement, did find plenty of evidence that educators are starting to use data-analysis tools to help students progress. Given the time, training, and tools, teachers said they are doing a better job of pinpointing and responding to their students’ learning gaps, both as individuals and as a group. Granted, much of this activity centers on various forms of standardized testing, and whether such tests effectively capture what you call “authentic student achievement” is of course a matter of lively debate!


Question from Angelina Boyce, Online Instructor, Florida Virtual Schools:
Florida currently has good backing each year from our legislature for technological initiatives, and our virtual school enrollments continue to grow yearly at most grade levels. Do the grades that each of the states earn proportionally reflect the monies allocated for technology in these states?

Christopher B. Swanson:
The technology leadership grades in this year’s report do not directly account for technology spending (although that was the special theme of last year’s report).

Of course, it’s probably fair to say that some of the indicators included in our grading indirectly reflect financial support from the state in one way or another. For example, state funding may have been used to purchase computers, which factors into the grades.

But other indicators reflect actions that states can take without making a substantial financial investment. An example of a low-cost lever might be a policy establishing standards that outline expectations for what students should know about and be able to do with technology.

So while funding may help in some ways, we look at a mix of state-level strategies in the report.


Question from Irene Spero, Vice President, CoSN:
What is the greatest challenge faced by schools, school districts, and states in implementing data-driven decision making?

Caroline Hendrie:
Fabulous question. Our research suggests that what’s needed most at the local level are easy-to-use tools that can connect the dots for educators and that educators feel comfortable using. All the data in the world doesn’t matter if folks can’t figure out what it means for them. But most teachers and administrators were not trained to be data analysts, and most of the people developing computerized data systems aren’t educators. So it seems there is a disconnect there that needs bridging. As you well know, some leading thinkers in this field believe that the gap between what the data tools can do and what educators can do with them is getting much wider. So that suggests to me that time and resources for training are critical.


Question from Jeffrey L. Peyton, Founding Exec. Dir., Puppetools:
At a time when testing imposes itself throughout the learning culture and a state education division actually receives recognition for using technology to facilitate ‘on-line’ testing, what models of technology are emerging that effectively change the way teachers teach and feel about themselves (creatively), the way they perceive and work with subject matter, and the way they understand their students? In other words, are there any uses of technology out there that impact education at its heart?

Caroline Hendrie:
We’ve run some really interesting stories lately on what teachers and students are doing in the classroom with blogs, podcasts, and wikis. These new tools offer different means of self-expression that some teachers and kids feel are helping them be more creative. The interactive nature of the technology, and the sense that they are connecting with the world outside their classroom walls, is something that some kids are finding stimulating, based on what we’ve seen in our reporting. We also ran a thought-provoking piece not long ago on how some educators and videogame mavens are starting to put their heads together. Among other questions they’re considering is whether and how the motivational and problem-solving aspects of gaming can be harnessed to help teach kids academic subjects. As far as changing how teachers perceive their students, I recall one teacher saying that some kids who clammed up in class were far more forthcoming in classroom blogs. So we’re finding that certain technology tools can open up new avenues for communication and creativity in an academic setting.


Question from Glenn E Snelbecker, Professor, Temple University, Philadelphia:
I’m curious as to where my state, Pennsylvania, stands in these state technology ratings. I suspect that people from other states may have similar interests.

Is there some place where we can see the survey results? At the moment I can’t recall reading or hearing about how to access the results.

Glenn Snelbecker

Caroline Hendrie:
Well, please go to www.edweek.org/techcounts06 and pull up the State Technology Report on Pennsylvania. These online-only reports are a totally new feature of our report this year, and we’re eager to hear what you think of them.


Question from Marti Maguire, Reporter, News & Observer, Raleigh, N.C.:
Did the researchers consider whether funding plays any systematic role in the rankings? I.e., are the lower-ranked states getting less money or are they simply using the money ineffectively? Or was there no pattern?

Christopher B. Swanson:
We did not look at the connection between state funding for technology and the grades they received.

It’s an interesting question but one that is also tricky to answer, for a couple of reasons.

One is that some of the indicators used for grading may require a substantial financial investment, but others do not. Another issue is that how much states spend on education technology can be difficult to track.

Buying computers, for example, may be a clearly technology-related expense. And it may also be relatively easy to document (although those dollars might come from a variety of sources). However, other technology-related spending may be harder to see. States may invest significant resources in training educators how to use technology, but this may show up as a professional-development expense.

But I think you have hit on an important issue - the distinction between the amount of money that is being spent and how effectively those funds are being used. Technology Counts has a state-level focus. But we hope that our report will help other researchers get a better grasp on those issues at a local level as they follow up with their own studies.


Question from Pam Richau, Tech Director, Lockwood School:
What do you say to educators who feel we are assessing more than teaching?

Caroline Hendrie:
You know, our reporters definitely encountered variations on that theme among some teachers whose classrooms they visited while gathering information for this year’s report. Schools and districts that are using what are called either benchmark or formative assessments seem particularly prone to that feeling. (For those not familiar with these terms, in the context of school data analysis they usually refer to tests given in the classroom periodically, say monthly, quarterly, or three times a year. They are often scored and analyzed electronically and can be designed to predict how students would do on state tests, among other things.) On the downside, some teachers feel that these tests are too narrowly focused and that the administration and subsequent dissection of the results can be so time-consuming that they detract from other important instructional objectives. But others feel that the tests provide them with valuable feedback on where their students stand and therefore provide a road map to help them course-correct their lessons. Teams of teachers who analyze benchmark testing results together and then group their students and plan team-teaching approaches accordingly have sometimes reported marked improvement in student test scores. So in these instances, the assessments are leading to more targeted teaching, for good or ill.


Question from Mark Meldola, Consultant, www.mlabs.info:
Since technology instruction has been cut back in many states, why didn’t you look at the amount of technology instruction offered in your assessments? You can have raw access to personal computers or TCP/IP (Internet), but that is not a good measure of expectations for acquiring technological or engineering skills, in my opinion.

Caroline Hendrie:
Our grading scheme does consider whether states have academic standards for what students should know and be able to do with technology. But I do think that delving more deeply into just what those standards require is a good topic for further inquiry.


Question from Camille LoParrino, M.S. Ed, Ed Technology, Mercy College:
Being in New York, and reviewing your report, I am wondering how we could have received such a low score. What do your guests feel should be done to improve this situation?

Caroline Hendrie:
Hi, Camille. I would encourage you to take a look at our in-depth report on New York at www.edweek.org/techcounts06. But I can say that what hurt New York most was its ranking in the area of students per instructional computer, which we define as access to technology. New York got a D in this area because it has more students per computer than the national average. For example, nationally an average of eight students share each classroom computer with high-speed Internet access, while in New York, an average of nine students must share each one, according to our data.


Question from Minda Aguhob, Researcher:
Did you include data portability between district and state data systems, etc. in your evaluations?

Christopher B. Swanson:
The issue of data portability and interoperability did come up somewhat in the context of work related to the special theme of this year’s report. Based on conversations I am hearing in policy circles, I think this is an increasingly prominent topic among state and local data systems gurus. (It’s also something that the rest of us should keep an eye on because it will have important implications for how functional and educationally useful these data systems will be for educators, parents, and the public as they evolve.)

Our grading of the states, however, does not look at this particular topic.


Question from John McCreary, Technology Trainer, Minneapolis Public Schools:
Minnesota is one of a small handful of states that does not allocate funds specifically for technology. We are 49th in the nation in technology spending. What evidence can we present to our legislators to show them that technology spending can lead to higher achievement in our students?

Caroline Hendrie:
Hi, John. Of course, we’re not in the business of advocating for more school technology. But our report has numerous examples of schools and districts that have seen tangible improvements that they attribute to the introduction of sophisticated systems for data management and analysis. Take a look at our story about Philadelphia, for instance. But whether there’s a causal link between dedicated state funding for technology and higher student achievement would seem to be a good topic for future research.


Question from Jim Kohlmoos, President, NEKIA:
In addition to collecting and analyzing data, are there states that use technology to manage and transfer knowledge among schools and/or districts and the research community?

Christopher B. Swanson:
All states use technology to communicate data in one form or another. States, however, vary tremendously at this point in how sophisticated that communication is and the audiences they are able to reach. This can range anywhere from a website that contains report cards on school performance to a secure web-based data system where teachers can log in and call up detailed information on individual students.

The survey we conducted for this year’s Technology Counts addressed this issue to some extent. In particular, you can find a detailed table with a state-by-state view of the kinds of access and analysis tools each state offers educators and the public.

So far, much of the discussion on state data systems has been focusing on questions of structure (whether states assign unique identification codes to students, the particular data points and pieces of information collected, and the like). But I think we’ll be hearing much more about exactly this issue - knowledge transfer and communicating with educators and other stakeholders - in the coming years.

In a way this is a natural evolution of the issue. First, collect the data. Then, get it into the hands of the people who need that information. Of course, the ongoing development of these information systems would benefit if we leapfrogged this discussion a bit. That is, as these systems are being built, we should be aware of the kinds of information that educators (and others) need and the ways that information can be most effectively communicated to them. That way, we’ll have a better chance of getting the data systems that we really need.


Question from Catherine Burdt, Sr. Analyst, Eduventures:
How are schools funding the systems that record and manage the data?

Caroline Hendrie:
Well, as you well know, Catherine, these systems can cost a pretty penny. We heard from some educators that funding was a real obstacle. Businesses told us that financial constraints often limited what they could do for school districts, particularly compared with better-funded corporate clients. What’s more, some of the districts that we focused on in our report had received discounts on their data systems because they were partnering with companies to serve as R&D sites for their products. That said, states and districts are being pushed to make spending on data a priority, in large part by the No Child Left Behind Act and state accountability systems, so many are finding money for it in their budgets. The survey that our Research Center conducted for Tech Counts 2006 found that one-third of states cited data-management systems as one of their top two priorities for education technology this school year. Moreover, the federal government has a grant program, which we write about in our report, that supports the design and implementation of statewide longitudinal data systems.


Question from Joe Petrosino, Mid-Career Doctoral Student, Penn:
How can data be collected, coalesced, and interpreted to facilitate building a community of trust in a school community? How can technology improve the quantitative research process?

Christopher B. Swanson:
Trust is an important issue, although not one you hear about too often in these circles.

Information can be a powerful tool for school improvement and also, potentially, a way to help individualize instruction for students. Transparent and meaningful dialogue among educators within a school can be one key to turning data into action, action into student learning, and learning into better school performance.

An important step here would be to get information into everyone’s hands and to make sure all teachers and administrators have the know-how to make sense of the numbers. Here, I think collaboration among staff members - learning from each other - can be particularly effective.

We are all very aware that this is a time when accountability pressures are high. But data should be about more than just AYP. If used the right way, data on student performance (and by this I do not just mean test scores) can be a powerful tool for improving teaching and learning. But it takes a village, so to speak. And trust will help there.


Question from John Craig, Principal, Twenty First Century Christian Academy, Philadelphia, PA:
What was Pennsylvania’s grade?

Caroline Hendrie:
Hi, John. Pennsylvania scored an overall grade of C, slightly below the national average of C+. Its highest grade in our three grading subcategories was a B- in “capacity to use technology.” Its lowest, a D+, came in “use of technology,” and it got a C in “access to technology.” Please check out our State Technology Report on Pennsylvania at www.edweek.org/techcounts06 for further information and explanation.


Question from Charles Pyle, Director of Communications, Virginia Department of Education:
I found it curious that none of the feature articles in the report focused on a state, or a district in a state, in the first quintile.

Christopher B. Swanson:
Technology Counts does have two somewhat separate parts.

One is the grading of the states on a set of indicators we have been tracking over the years in the areas of Access, Use, and Capacity. But much of the report, and the journalism in particular, focuses on a special theme. That special topic typically addresses a hot or emerging issue in education technology and changes from year to year.

So it can be true that states that fare well in our Technology Leadership grading are not necessarily featured in the theme-related journalism.


Question from Mark Sampson, citizen:
The information is great. How do you distribute it to the people who need it the most?

Christopher B. Swanson:
Technology Counts is a special issue of Education Week, which reaches a broad readership of policymakers, administrators, and educators. So there is always the newsstand / mailbox route.

We are also increasingly reaching out through our website, where readers can find the full report online as well as some web-only extras (www.edweek.org/techcounts06).

But word of mouth is very important too. So we hope readers will also tell their friends and colleagues about the report.


Question from Willow Sussex, Education Researcher, SRI International, CA:
What do you think good data-driven decision making LOOKS like in the classroom, in the schools, in districts? Can you cite examples of teachers or administrators, and how you have seen them using data to make instructional decisions? What kind of decisions do they make? How common is this, in your experience?

Caroline Hendrie:
I’m really glad you asked these questions. For teachers, good data can reveal strengths and weaknesses of individual students, and of their classes as a whole. That can give them a road map for what they should be doing in class tomorrow, next week, next month, and next year. Our report found plenty of examples of this sort of activity around the country. In one school in Massachusetts, for instance, teachers who used to cut and paste old items from state tests to cobble together warm-up tests for their kids are now letting sophisticated software programs do that work, leaving them time to tailor lessons to address their students’ needs. In California, groups of teachers are getting together in grade-level teams to scrutinize the results of benchmark tests their kids are taking three or four times a year and then figuring out how to group them and get them what they need to succeed on state tests. And in Pennsylvania, teachers are getting students involved in looking at their own data, and in figuring out the strategies they should pursue to fill in their learning gaps. One teacher engaged in that kind of work told us that analyzing data had made her a much more comprehensive teacher. She said she’s more attuned now to what her students need rather than to what she happens to enjoy teaching. “I can’t teach irony for six months just because I like it,” she told us. So how common is this? Experts tell us that the overwhelming majority of schools haven’t really gotten out of the starting gate when it comes to mining the mountains of data they could be accessing.


Question from Teri Sanders, Director K-12 Outreach, California K-12 High-Speed Network:
Is it correct to say that the matters reviewed and incorporated in the grading focus more on technology policy and less on how technology is being used to enrich instruction in each state?

Christopher B. Swanson:
Technology, like any other educational improvement strategy, is complex. There are policies or other types of actions states can take to help things along. But then there are also the next steps: how policies are implemented and supported locally, whether instruction changes, and whether improved instruction leads to more student learning and gains in performance.

Our report focuses primarily on that first part of the puzzle - state-level policies and strategies. But we do believe the rest of the process linking the statehouse to the schoolhouse is critical if efforts like these are actually going to improve learning. So we hope that other researchers can take what we found as a starting point in future work that looks more directly at local implementation and practice.


Question from Alan Kunerth, WW II taxpayer:
Here in Sarasota, about 4,000 kids enter the 9th grade but only 2,500 show up in the 12th grade four years later. Even then, only 2,300 earn a standard high school diploma in four years.

Yet the school board insists that the dropout rate is 3.1% --

How can they get away with this year after year ?

Sarasotans pay $12,500 each year for each enrollee for expenses of education -- 80% from local property taxes, including construction, class-size reduction, etc.

Are we paying enough compared to the national average to get our kids to the 12th grade in four years ?

By the way, only 20% of our kids in Sarasota are Hispanic and Black, and virtually none are Asian. Only 20% are eligible for free lunch.

Christopher B. Swanson:
As some readers may know, I have personally been heavily involved in research and analysis on high school graduation rates over the past few years.

I won’t speak to that particular issue in detail today ...

But I would like to let folks know that, with the support of the Bill & Melinda Gates Foundation, we are launching a new special report series focusing specifically on graduation and other high school issues.

The first of four annual installments - entitled Diplomas Count - will be released as a special issue of Education Week next month (June 22, to be exact). So be on the lookout for that.


Question from Gerald Isaacson, Supervising Program Development Specialist, University of Medicine and Dentistry of NJ:
How are schools using discipline data to improve school climate and achievement?

Caroline Hendrie:
Thanks for asking this. One interesting development we found is that teachers in some schools are using data systems that instantly enable them to see how many days a particular child has been absent. A school we wrote about is piloting an electronic grade book that lets teachers, parents, and students monitor not only students’ daily academic progress but also attendance on a class-by-class basis. I’d be very interested in hearing about any examples out there of schools using discipline data in the way you suggest, however.


Question from John Terry, Teacher, Carnegie MS:
The grade given to California, a C-, doesn’t truly tell the story. If the state can get a “B” in capacity and an “F” in access, how do we improve access? How can I as a teacher improve access for my students if I only have two computers in my classroom?

Christopher B. Swanson:
You make a good point. It’s always important to look beneath the surface to understand a process like implementing technology and using it in instructionally effective ways.

We know that there are many pieces to that puzzle. That’s why we include several categories within our Technology Leadership grade. States can be strong in one area but not in others.

Ideally, access to computers and other technology should be paired with strong educator proficiency in using that technology, expectations for students, and other kinds of innovative ways to integrate technology into instruction to promote learning.


Kevin Bushweller (Moderator):
Thank you for joining us for today’s discussion about “Technology Counts 2006: The Information Edge: Using Data to Accelerate Achievement.” Please join us Wednesday, May 10, from 3 p.m. to 4 p.m. Eastern time for our second chat accompanying the release of “Technology Counts.” Our guests for that chat will be Lisa Petrides, president of the Institute for the Study of Knowledge Management in Education, and David J. Hoff, a senior writer for “Technology Counts.” Meanwhile, don’t forget you can read the report online at www.edweek.org/techcounts06.

