Education Chat

Knowledge Management in Education

This chat, the second of two discussing the findings of “Technology Counts 2006,” examined the process of using data to improve decisionmaking at all levels of education.

Knowledge Management in Education
May 10, 2006

Guests:

  • Lisa Petrides, president, Institute for the Study of Knowledge Management in Education; and
  • David J. Hoff, senior writer for “Technology Counts 2006”

Kevin Bushweller (Moderator):
Welcome to today’s online chat about using data to improve decisionmaking at all levels of education. This is the second of two chats accompanying the release of “Technology Counts 2006: The Information Edge: Using Data to Accelerate Achievement.” We have some very thought-provoking questions already. So let’s get the discussion started.


Question from Christopher Forney:
What is your definition of knowledge management and how do you propose its use in the educational setting?

Lisa Petrides:
My definition of knowledge management is that it is a human-centered, organization-wide approach to knowledge sharing and learning. This requires a conscious integration of people, processes and technology brought together for the purpose of collecting, sharing, and using data and information--the goal of which is to build organizational capacity for continuous improvement. So in that sense it is really an approach or methodology. It’s not something you can buy off a shelf, nor is it a way to control information within your organization. Ultimately, it is about transforming what an organization knows into how it acts.

There are those throughout the education sector who are already doing pieces of what I would call knowledge management, but not necessarily as part of a concerted effort. There are those who focus on building information systems, technology training, improving school culture, etc. I think the real benefit of knowledge management comes from bringing those pieces together in a larger framework or ecosystem. Today we hear a great deal about everything needing to be “data-driven,” but what does that really mean? The organizational processes and policies that are in place can either enhance or inhibit the sharing of information and knowledge. Do you know how data flow throughout your organization? Who has access? And is knowledge created and information shared cross-functionally, or is it locked in isolated silos? These are some of the questions that knowledge management raises for the education sector, with the purpose of improved student achievement and increased organizational effectiveness.


Question from Danielle Logan, Reading Coach Trainer:
With the influx of data available for analyzing student performance, who do you see at the school level as the person responsible for gathering and analyzing it? Do you foresee a new direction for the educational professional? How should we equip ourselves to make the most of what we are able to collect and study?

David J. Hoff:
On the school level, the principal will be the first person who learns how to use these data tools. Once these tools are commonly available, teachers will need to start using them as well. I think the new direction will require all educators to become computer savvy. Just as software has changed the way tax accountants do their jobs, teachers will need to learn how to use these tools to figure out how to change student performance. The first step will be learning the basics of how to use these tools. From there, educators at all levels will need to get further training to get the maximum value out of them.


Question from Joanna James:
How can schools and facilities work with Union representatives to improve decision-making?

Lisa Petrides:
On a general level, openly sharing data and engaging in meaningful conversations around analysis and interpretation is often the first place to start. It’s not that all arguments can be solved with data, or that multiple interpretations won’t arise, but supporting positions with data is often a good place to build a case from. It’s interesting that while unions have worked hard to secure things for teachers, such as better wages, working conditions, and health care, the work of accountability and assessment has been left to state legislatures and those who create the tests and assessments in the first place. Yet when it comes to the real work of teachers, that which takes place in the classroom every day, union representatives are not typically brought into the fold to discuss or help solve issues surrounding the improvement of student achievement systemwide.

Meet with union representatives and ask them how they can be more involved in decisionmaking around these issues. Ask districts to develop and implement annual surveys of parents, staff, teachers, and administrators, and then to engage all stakeholders in discussing the results. It’s also important to insist on transparency and self-assessment at the top if you expect teachers to feel more comfortable sharing what they know with each other and with school and district administrators. Ultimately, it’s about unions becoming more engaged in professional issues related to helping teachers improve teaching and learning.


Question from Loniece Ningo, Teacher, Mauldin HS:
How can you get individuals of the same mind set to accept another approach to collecting and subcategorizing data, so problems can be clearly delineated?

David J. Hoff:
It’s fair to say that learning new skills and changing existing practices is difficult. But because the No Child Left Behind Act is forcing teachers to pay more attention to their students’ academic needs than ever, teachers might be looking for any resources they can find to help their students meet those challenging achievement goals. The advocates of data-driven decisionmaking are counting on that to be the impetus for teachers to turn to technology.


Question from John Terry, Teacher, Carnegie MS:
Many of the teachers I work with rarely use data to inform instruction, or they think it is too much work. How does one go about creating an environment in which individuals want to look at data to inform decisions?

Lisa Petrides:
Most teachers are trying to figure out how to do more with less, so any new task or activity is seen as an additional burden on their time. This means that there needs to be a strong motivation for teachers to engage with data. Successful data coaches have told us that there is a tipping point when it comes to convincing teachers to use data to examine the impact of their own teaching practices, and that the process of understanding, for example, why a lesson worked and how to repeat it is extremely validating. I think we’ve learned this over and over again: people are more likely to accept evidence showing where they could be more effective in their work if they are themselves a part of the process of inquiry. They need to own the process and own the data. It is at that point that they begin to better understand and appreciate what their strengths as teachers are and what they still need to learn. So creating the conditions for this to happen is paramount.


Question from Dale Patrick Dempsey, writer/editor The Ohio Gadfly, Thomas B. Fordham Foundation, Dayton, Ohio:
Some teachers complain about the added workload required by No Child Left Behind. How would productivity gains through the use of technology/training ease that load?

David J. Hoff:
I think the experts expect the technology to make it easier and faster for teachers to identify students who need help and to tailor instruction to their needs. Teachers may save time there.

But I’m not sure the technology will provide any shortcuts in the instructional interventions needed to bring those students up to proficient performance. That’s still going to be a difficult and time-consuming task.


Question from Steven Boone; Coordinator of Developmental Reading; Towson University:
My research tells me high school GPA is the best predictor of students’ success in higher education. That said, why do standardized tests continue to fail to adequately describe students’ performance in the schools?

Makes one wonder. Perhaps the curricula of standardized tests need to be remediated? And the format in which students are tested?

David J. Hoff:
Your question goes to the heart of the matter: What is the best indicator of what a student has learned? Critics say standardized tests give only a shallow glimpse of what students know, and that students’ work is the best indicator of what they have learned. Supporters of standardized tests say they are an important data point to assure policymakers that students are learning something; without the test data, no one knows for sure whether a student who receives a passing grade actually knows the material--or was just given a grade to pass him along to the next grade.

When designing technology tools to assist teachers with instruction, policymakers will be challenged to include data from both standardized tests and coursework. With it, teachers would have access to everything they need to know to help students.


Question from John Shacter, consultant and teacher, Kingston, TN:
Major educational improvements will occur when we first agree on its core mission: To truly prepare graduates for world-wide quality employment, quality higher education, and quality participation in family and society. I have a multi-faceted, practical proposal to accomplish that. But computer programming and utilization are NOT at the core of it. Why or how would you argue that they should be?

Lisa Petrides:
I agree 100 percent that computer programming and utilization are not at the core of educational improvement. While technology can be leveraged for information sharing, the implementation of technology does not in and of itself promote the use of data. That being said, advances in information technologies have made, and will continue to make, data much more accessible, particularly for non-experts. Therefore, it is important to adequately fund and support new information technologies, aligning them with cross-functional goals, and to ensure widespread access to data and information in easy-to-query formats for non-experts.


Question from Rami Benbenishty, Researcher, Israel:
Do you envision progress toward gathering and analyzing information that is not directly related to academic achievement, such as school climate, victimization, safety, etc.? Further, do you envision using these tools to help in mental health assessment and early identification of students in emotional distress? Thanks!

David J. Hoff:
I think it’s certainly possible that some of the data regarding school safety and other issues will be part of state data systems. But I don’t think those issues will be the top priority for policymakers as they build these databases.

In the end, I don’t think these technology tools will be used to identify students in emotional distress, because they will be built primarily to assist teachers in understanding student learning. I think teachers will have to continue using their current resources to identify students who need help from mental health professionals.


Question from Pam Capps Instructional Technology:
I noticed that the Georgia technology report said technology was not required for recertification. This is not true, as all teachers have to be certified in technology. If they did not have it when they got their initial certification, they must complete courses or take an online test to meet the requirement before they can be recertified.

David J. Hoff:
When the state responded to our research survey, it told us that teachers are required to meet technology standards only if they did not do so in their initial certification. That means teachers who were certified before technology was required for the initial certification must earn a technology certification in order to be recertified. Those who received the technology certification when they entered the field don’t need to be recertified in technology. That’s why the state didn’t receive credit under that category.


Question from netranand pradhan, Head of the Dept. of Educational Administration, M.S.University of Baroda, India:
Why do some teachers and school principals not feel the need to use technology?

Lisa Petrides:
I’m not sure that they don’t feel the need to use technology so much as that they might not see its practical implications for their own practice. I think we’ve come to realize over the past 20 or 30 years that technology alone isn’t going to improve teaching practice. Yet the early mavericks of technology use, in their eagerness to implement and leverage technology, often side-stepped the issue of how technology brings about the change that is desired, particularly as it applies to improved student learning. Only now are these bridges beginning to be crossed, with a lot of excellent work on the part of organizations that are studying technology use and its impact on learning. One thing we’ve seen over and over again, which I alluded to earlier, is that it has been easier to raise money for hardware and software than for the education and training necessary to support and sustain them. And it is only through an emphasis on that training that we can hope for teachers and principals to gain a better understanding of how technology can be leveraged to help them do what they do more successfully.


Question from John Terry, Teacher, Carnegie MS:
How can I be a visible model for teachers on the benefits of using data to inform instruction?

Lisa Petrides:
One thing is to be open with other teachers about how you are using data and what you have learned from your own teaching practice. If you are not already doing so, meeting with teachers on a regular basis to share strategies on what worked, what didn’t, and where there is room for improvement begins to create an environment that is conducive to sharing. In some cases, schools have created deliberate learning communities to look at and analyze data around particular issues. Focusing on how to improve, rather than on who is not performing, is a way to build trust. You might also look at what trainings or other resources are available to you when starting the data inquiry process. Are there relevant technology trainings, or does the district have data coaches who can spend time guiding teachers through the array of issues? If not, you might help facilitate the development of these types of resources.

Another key thing is not to sit and wait for the perfect data to arrive. It’s better to start with whatever testing or assessment results are available; then, through the inquiry process itself, you will be better able to make suggestions for improving and expanding on what data are currently collected. The same holds true for technology. Some schools think that they can’t use data to inform decisionmaking until they have a new state-of-the-art information system in place. However, data come in all shapes and sizes. You can work with teachers on their own internal assessment data, or collect stories of what works and analyze them. It is the inquiry process that really takes the most time to develop.


Question from Bonita DeAmicis, SUSD, California:
I grow concerned that our focus on data that can be manipulated by technology decreases the attention we give to in-depth formative assessment in the classroom. Everything appears rather summative (and broad and thin) on a computer screen, yet formative is where the best difference can be made in terms of directing and improving instruction. Can you speak to using technology to collect information that helps a teacher determine improvement of writing (beyond conventions), thinking and pondering of science or social studies issues, solving and discussing complex math problems, or comprehending reading materials with depth and complexity? How can we put some emphasis back on these bigger, deeper issues of instruction? Are they dead in this age of accountability?

David J. Hoff:
We’re beginning to see an increase in the number of states that provide formative assessments for teachers to use throughout the year. It’s possible that these assessments will assess those “bigger, deeper issues” you refer to.

Any state that builds a comprehensive data system would incorporate such formative assessments into the tool that’s available to teachers.


Question from Nitin Julka, MBA Candidate 2007, Columbia Business School:
I am curious about the specific questions that EPE used to “grade” technology usage.

1) Student standards include technology (48 states with policy)
2) State tests students on technology (4)
3) State has established a virtual school (22)
4) State offers computer-based assessments (22)

The idea of a good grading system should be to have a wide distribution of scores. It is clear that question 1 is completely worthless because 48 / 50 states have that policy.

Question 2 also fails from this simple idea of having a wide distribution with only 4 positives.

Another important aspect of a quality grading system should be that the data accurately describes average situations in the state.

Question 3 asked if the state has established ONE, SINGLE virtual school.

Do you believe this data accurately describes technology usage in states across America?

David J. Hoff:
Think of it this way: All tests have easy questions designed to assess whether a student has basic knowledge of the subject and difficult questions to assess whether the student also has an advanced understanding of it.

It’s important to note that virtually all states have student standards that include technology. And it’s important to highlight the two states that don’t. Conversely, it’s important to note that states in general aren’t testing students on their technological skills, and to note the four that are.

As for the virtual schools, we asked whether the state has one virtual school because that’s all it takes to reach any student in the state. In fact, I’m not aware of a state that has more than one state-run virtual school.


Question from Sherry Doyle, Parent & Teaching Credential Student, Spcl Ed & Gen Ed:
Dear Lisa: Why does the State of California require each separate district to purchase a data management system? Why not have a statewide system that manages attendance data, cumulative data, and standards-based assessments? It seems like most districts are buying and revamping their systems constantly and with little satisfaction. In the private sector, for every $1,000 spent on technology, $700 is spent on training; in education, for every $1,000 spent on technology, only $170 on average is spent on training. This underserves our students, parents, teachers, and our communities. Please respond. Thanks, Sherry

Lisa Petrides:
Your question touches on several of the key factors that make mandating statewide systems a complex issue. The fact that we can even consider statewide systems has been made possible by technological advances. However, even if the software and hardware make it possible, most states are not yet there. What are some of the reasons for this? First, it takes a fair amount of convincing for state budget processes to commit the financial resources necessary at that level. So parsing it out to districts takes off some of that burden, at least in a deferred way.

Additionally, districts and schools argue that their needs might be different from the needs of the states, and they have concerns that those needs may not be met by the state. For example, states typically want benchmarking and report-card-based data. While districts also find that data important for comparison, what they need more are data that can help them improve what they do on a day-to-day basis. So while there may be plenty of reports available on test-score results, teachers are more likely to need data that can be used to improve their own practice. One teacher recently said to me, “It is about using assessments for learning, as opposed to assessments of learning.” So how can both sets of needs be met? Another problem has been that schools and districts are asked to feed their raw data up to the state level, but often don’t get the data back quickly enough to actually use them to do something differently in their own classroom teaching. I think these issues create resistance when decisions on statewide systems are being made.

Your last point on the financial resources spent on technology versus training is right on target. It seems easy to justify a hardware or software line item in a budget, but it is ten times more difficult to get the necessary training dollars funded, not to mention the support staff needed to keep the technology up and running and well-used over time. There are hundreds of technology grants made to districts each year, often from well-intentioned corporations and foundations, but these dollars are often quite restrictive when it comes to training and support. We all have countless examples of information-system projects that never quite got off the ground, or that faded out once the project’s champion moved on to a new district. In some ways, this makes the case that statewide coordinated efforts might be more efficient and effective in the long term. But in the short term, districts will also need some amount of say and control over policies of data collection and dissemination, and that is where the tough conversations begin, and where many states have begun to make important inroads.


Question from Tomas Saucedo, National Council of La Raza, Ed Data Project:
What could be done to improve turnaround time so that state & district assessment info reaches school staff in time to guide changes in curriculum, staff development & instructional practice? Do you see any evidence of states focusing on struggling schools [not making AYP] as a priority for improving school site data access and skill building via tools, methods & training?

David J. Hoff:
The No Child Left Behind Act requires test publishers to release test scores by the beginning of the following school year. Districts need the scores by then to meet the law’s deadline for determining which schools made adequate yearly progress during the previous school year. While that reporting timeline doesn’t help teachers create instructional plans for the school year in which the testing occurs, it does give students’ teachers for the following school year the chance to evaluate the test scores. And that’s a lot better than it used to be, when test publishers took longer to release scores.

As far as focusing data access on struggling schools, I have only seen that anecdotally so far. Because of the pressure to improve performance under NCLB, it only makes sense for policymakers to put the most effort into getting these data tools up and running in the lowest-performing schools first.


Question from Joe Petrosino, Mid Career Student , Penn:
I am studying how to build a community of trust in a high school. How can I use data to analyze the current level of trust in the school (i.e., administrator vs. teacher trust)? Then, how can I use data to begin to build a plan that will develop a community of trust? What type of data collection method would one use in relation to this topic?

Lisa Petrides:
There are certain readiness assessments and other diagnostics that you can use to determine how people are currently using data throughout your school. We have done studies where we look at how data flow throughout the organization, the politics of sharing information--meaning is it shared freely or hoarded--and then how people within the organization are rewarded for using data. All of these things contribute to the climate of trust that is desired.

There is a concept that I and others write about called a “culture of inquiry.” Simply put, a culture of inquiry is about creating the conditions for continuous improvement in schools. This includes establishing a level of consistency, trust, transparency, and self-reflection that begins at the top and affects practices and behaviors at all levels of the school. In practice, this would require you to a) agree on a particular set of problems or a specific issue in your high school that you would like to address, b) examine data in cross-functional settings (i.e., with teachers and administrators), c) hold sense-making or interpretation sessions where you analyze data together, and then d) develop action plans to address the problem. Ultimately, this also requires you to reassess or reevaluate the initial problem, to make sure that you correctly identified it in the first place and to find out how effective the action plan was. School administrators and teachers tell us that this process, aimed at creating an environment that is conducive to asking questions and proactively solving them, can take anywhere from 18 months to two years.

In terms of data collection methods, not all problems are solved equally well with the same methods, so I would suggest first clearly defining the problem, and then working backwards to determine what you would need to know to solve it. For example, is the issue about improving classroom instruction through professional development, or about aligning human resources with improvements in classroom instruction? Each issue requires a different type of data to address it.


Question from Marty McCall, Researcher, Northwest Evaluation Association:
From your experience and research, what kind of infrastructure fosters teacher inquiry? I am thinking of systems of data access, user-friendly analysis tools, professional training and venues for teachers to report and talk about data. What have you found to be the most effective?

Lisa Petrides:
Our research has shown that it really is a combination of all of those factors that you’ve mentioned: data access, user-friendly analysis tools, professional training, and opportunities for reflection and analysis that foster teacher inquiry. The development of principals and master teachers as instructional leaders has been shown to be an important part of the infrastructure as well.

One important finding from our research has been that there is not a one-size-fits-all model. This means that successful districts and schools do not necessarily engage in all of these processes at the same time, nor in the same order. Some strive to make sure that they are able to implement all of these pieces simultaneously, while others, depending on their unique historical context, the needs they face, and the opportunities that are currently in front of them, take one piece at a time.

Overall, having practices in place that clearly demonstrate how data use is linked to the overall mission of the school, and providing incentives for people to share and use what they know, are the most effective ways of fostering a culture of inquiry for both teachers and administrators.


Question from Brian, Graduate Student:
Teachers have at their fingertips an enormous amount of objective and subjective student data (grades, observations, etc.), most of which are not centralized in any way nor placed in the hands of the principal. How does an administrative team without a comprehensive database go about gathering and analyzing pertinent data (other than standardized test data) from all of the school’s teachers and support staff?

Lisa Petrides:
In some ways, this is a perfect opportunity to bring together teachers and support staff to determine the main issues or problems you are trying to address, both as a school and at the classroom level. Once a set of problems is well-defined, that is the time to determine what data, both objective and subjective, would be most important to start collecting and putting into some type of database. Too many times I’ve seen schools try to compile all of their data about everything into one database, only to find out that most of it isn’t directly useful and relevant to what they need. Deciding what data are collected and analyzed is not an objective process, so bringing all stakeholders into the discussion ensures more buy-in later on.


Question from Hal Portner, Consultant:
In the ongoing effort to reduce the number of new teachers leaving a district (and even the profession) after only a couple of years on the job, it is important to know in some detail why they leave and what might have changed their decision to leave. Do you know of any districts that use technology for this purpose, not only to collect and organize such data, but also to analyze and apply it?

Lisa Petrides:
The area of human resources in schools has gotten more and more attention over the past few years, and a number of districts are revamping their human resources departments, both by restructuring organizational hierarchies and by putting information systems in place that enable them to better track, monitor, and evaluate their own effectiveness. As you know, districts face many challenges in hiring and retaining enough qualified teachers, particularly in math and special education, and there are substantial challenges in placing and retaining qualified teachers in their lowest-performing schools. As such, many districts are working hard to become more “customer” oriented--for example, offering streamlined or online applications for teachers; creating more competitive mechanisms for notifying potential new teachers of their acceptance (thereby helping ensure that a district gets its top choice of teachers); assigning teachers to their new positions in a more timely manner so that they can plan accordingly for their fall classes; and conducting exit interviews to address the specific problem you raise.


Question from Dale Patrick Dempsey, writer/editor The Ohio Gadfly, Thomas B. Fordham Foundation, Dayton, Ohio:
Is there a rule of thumb as to how many hours in, say, a 40-hour week should be devoted to developing technology skills?

David J. Hoff:
I don’t know of a rule of thumb, per se. But the Gainesville City School District in Georgia recognizes that it takes a significant amount of time to prepare teachers to use technology. Even after several years of effort, the district still has monthly meetings where teachers discuss what data tell them about student achievement.


Question from Joe A. Abalos III, Executive Director of Planning & Accountability, Collier County School District:
Some school districts effectively use technology to collect, aggregate and analyze data. These districts are data-rich but the aggregated data do not translate into meaningful and actionable information. Thus, you have a data-rich but information poor organization. How do you set up a system whereby data is used to truly effect improvement and meaningful change?

Lisa Petrides:
You can be a data-rich district but information-poor, although it’s important to remember that having data is a good first step. The system to set up is one in which cross-functional problem-solving teams (sometimes referred to as communities of practice) work together to make sense of the data, determine what action needs to be taken, and then create an iterative process of feedback and reevaluation, so that you can check whether what you think is true today is still true tomorrow. It often won’t be.


Question from David Barry, program dir. for science, Boston Latin Academy (Ma.):
Is there a “best practices” database for science education, or for K-12 education in general?

Lisa Petrides:
There are, in fact, several sites collecting best practices and innovations, both in science education and in K-12 education in general. One important thing to note, however, is that while there is a great deal to learn from best practices, their context and applicability can vary greatly from one situation to the next. Therefore, a static best-practices database isn’t as effective as creating online venues in which dynamic conversations can take place. This could be as simple as being able to comment on and review items in a best-practices database. I also think it’s important to note that we have a lot to learn from our mistakes, although of course it isn’t always acceptable to say that, much less discuss it in detail. I’ve always liked the analogy of a product we all have under our sinks at home, WD-40. It gets its name from “water displacement” (hence the WD), but its makers didn’t get the formula right until the 40th try.


Question from Dr. Sharon E. Anderson, Senior Research Associate, MPR Associates, Inc.:
Thanks for a terrific and timely report. Over the past ten years, my company has worked with states, school districts, and schools to train principals and teachers on using data for decision making. Our experience is that such training is immediately applicable to teachers because it helps them understand how to use data to improve instruction as well as for accountability purposes. It is, however, a time and labor intensive process. Based on your research for the report, do you have an opinion on the best way to reach teachers in classrooms with this type of professional development?

Lisa Petrides:
Our research suggests that creating a culture of inquiry is probably the most important way to reach teachers. Additionally, the most successful professional development strategies we have observed involve creating support mechanisms for teachers and principals along the lines of coaching, real-time training, at-elbow support, peer support and just-in-time training. I think the main message here is that the old model of a one-time presentation or training session is simply not enough. It has to be an active ongoing engagement.


Question from Darell Cain, School Improvement Facilitator:
As states and large school districts (over 6,000 students) focus on developing policies related to data and information systems that support administrative functions, what would you recommend smaller districts and individual buildings do to support instructional functions? For example, would you recommend that teachers and administrators learn and apply data gathering, organizing, and analysis at the classroom and building level, while the states and larger districts focus on policy?

Lisa Petrides:
I think there are real differences between the needs of states and the needs of districts. States need to respond to accountability and assessment mandates, while schools need to focus on creating a culture centered on improved student achievement at the classroom and school level. Part of this is developing opportunities that allow schools to define their own terms of success, and to refine their ability to present their successes to external stakeholders. This might look different for small versus large districts, but ultimately, even the large districts still operate at the classroom level.


Question from Jim Kohlmoos, President, NEKIA:
In last week’s online chat on Technology Counts 2006 we learned that all states use technology to communicate data in one form or another but they vary tremendously when it comes to more sophisticated uses like knowledge dissemination and management. What are the preconditions for states to be able to advance beyond data collection and focus on knowledge utilization?

David J. Hoff:
One major problem for states is figuring out how to take data collected in a variety of formats and put them into a single format that users can analyze and draw conclusions from. Until a state can link a student’s test scores to other data--such as attendance, transcripts, and grades--the data are of limited use. Because each data point may be collected in a different format--a spreadsheet for one, a custom software application for another--the data can’t be analyzed until states figure out how to warehouse them in one place. Some states, such as Florida, have addressed this issue, but not many so far.
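To make the linking problem concrete, here is a minimal sketch of what it means to merge records from two differently formatted sources by a shared student ID. All of the data, field names, and file formats here are hypothetical illustrations, not any state’s actual schema; real warehousing efforts involve far more sources and far messier matching.

```python
# Hypothetical sketch: linking per-student records that arrive in
# different formats -- test scores exported as a CSV spreadsheet,
# attendance pulled from a separate application -- by student ID.

import csv
import io

# Test scores, as if exported from a spreadsheet (CSV text).
scores_csv = io.StringIO(
    "student_id,math_score\n"
    "1001,82\n"
    "1002,91\n"
)

# Attendance rates, as if queried from a custom application.
attendance = {"1001": 0.95, "1002": 0.88}

def link_records(scores_file, attendance_by_id):
    """Merge the two sources into one analyzable table keyed by student ID."""
    merged = []
    for row in csv.DictReader(scores_file):
        sid = row["student_id"]
        merged.append({
            "student_id": sid,
            "math_score": int(row["math_score"]),
            # None flags a student present in one system but not the other.
            "attendance_rate": attendance_by_id.get(sid),
        })
    return merged

linked = link_records(scores_csv, attendance)
```

Once the records share one table, an analyst can ask questions that neither source could answer alone, such as whether low attendance tracks with low scores.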


Question from Betty Jeung, Org. Specialist, Nat’l Education Association:
What role would a school site staff (teachers, paras, administrators, etc.) have in knowledge management?

Lisa Petrides:
Every person in the organization whom you want to become more effective at what they do, and to become part of a continuous learning culture, should be part of the knowledge management process. But remember that it is neither a quick fix nor an isolated approach.


Question from Susanna Kemp, Technology Assoc., Collaborative Communications Group:
Hi - We’re very interested in how KM has been adopted so far and would appreciate it if you could give us an overview of the history of KM in education. Thank you.

Lisa Petrides:
I’m sorry, but I won’t have time to adequately answer your question now. However, if you are interested, we do have a KM in education primer on our website at: http://www.iskme.org/monograph.html


Kevin Bushweller (Moderator):
Thank you for joining us for this online chat. And a special thanks to our guests for taking the time to answer your questions. This chat is now over. A transcript of the discussion will be posted shortly on edweek.org.



The Fine Print

All questions are screened by an edweek.org editor and the guest speaker prior to posting. A question is not displayed until it is answered by the guest speaker. Due to the volume of questions received, we cannot guarantee that all questions will be answered, or answered in the order of submission. Guests and hosts may decline to answer any questions. Concise questions are strongly encouraged.

Please be sure to include your name and affiliation when posting your question.

Edweek.org’s Online Chat is an open forum where readers can participate in a give- and-take discussion with a variety of guests. Edweek.org reserves the right to condense or edit questions for clarity, but editing is kept to a minimum. Transcripts may also be reproduced in some form in our print edition. We do not correct errors in spelling, punctuation, etc. In addition, we remove statements that have the potential to be libelous or to slander someone.

Please read our privacy policy and user agreement if you have questions.

Chat Editors