Education Chat

Chat Transcript: How Education Week Grades the States

The research staff at Education Week discusses how they collect and analyze mounds of data each year to deliver grades on education policy for the 50 states and the District of Columbia in Quality Counts.

About the Guests:

• Ron Skinner, research director, Education Week; and

• Research Associates Melissa McCabe, Jennifer Park, and Lisa Staresina

Kathryn Doherty, moderator (Moderator):
Welcome to Education Week on the Web‘s TalkBack Live session on how Education Week grades the 50 states and D.C. on their policies to improve education. Each year, for eight years running, Education Week has published an annual report called Quality Counts. In addition to exploring a special theme each year, Quality Counts provides a report card on the states in key policy areas: student achievement; standards, tests and accountability; improving teacher quality; school climate; and the adequacy and equity of resources provided to public schools. The report includes hundreds of indicators related to public education that can be compared across the states.

Today we decided to open up the floor to your comments, questions, suggestions (and perhaps complaints) about how Education Week‘s research team collects and analyzes a mountain of data and ultimately uses those data to determine grades for each state. We hope you will take the time to review Quality Counts 2004. In addition to grading states in the areas mentioned earlier, the report also explores the issue of special education policy and how states are coping with including students with disabilities in testing and accountability systems.

Our guests for today are Education Week‘s research staff: Ron Skinner (Research Director); and Jennifer Park, Lisa Staresina, and Melissa McCabe (research associates). Let’s jump right in...

Question from Kathryn Doherty, moderator:
Ron, to start, why don’t you give the audience a rundown of the top and bottom scoring states in each of the categories you graded in Quality Counts?

Ron Skinner, Education Week:
Sure, and I think an important point here is that we don’t give states an overall grade across categories, so while a state may do very well in one category, it may not do so well in another. Louisiana, New York, Ohio, and West Virginia were the highest scorers in our Standards and Accountability category, while D.C., Montana, and Iowa received the lowest grades in this category. In Efforts to Improve Teacher Quality, South Carolina and Connecticut received the top grades, both with an A-; the lowest grades went to Alaska and Arizona. For School Climate, Delaware and Minnesota were at the top of the heap, while Maryland and Louisiana were last. The resources section is divided into two parts, adequacy and equity--two very different resource categories. For the Adequacy of resources, New York, New Jersey, West Virginia, Vermont, and Wisconsin all received the highest grades, while the lowest went to Nevada, Arizona, and Utah. Hawaii received the only A in Equity because it is one statewide district, but Delaware and Minnesota also received high grades for equity. Pennsylvania and New Hampshire both received a D-, and Illinois received the only F in Equity.

Question from Gwen:
Exactly how was the target score determined? I understand you used a multiple regression model, but could you clarify why you used this type of grading and how you determined the target score?

Jennifer Park, Education Week:
The targeting score is one part of the grades we give states on the equity of their school resources, and represents the extent to which district property wealth influences state aid. In the statistical model we use to calculate this score we control for factors besides property wealth that may influence state aid, such as student enrollment, the geographic size of the district, and the number of students from low-income families or enrolled in special education. The targeting score is used with the state share of funding to calculate “state equalization effort,” which accounts for 50 percent of a state’s equity grade.
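The regression described above can be illustrated with a small sketch. This is a hypothetical reconstruction, not Education Week’s actual model: the district figures, aid formula, and variable names are all invented for illustration.

```python
# Hypothetical sketch of a targeting-score-style regression (not Education
# Week's actual model): regress per-pupil state aid on district property
# wealth while controlling for enrollment, district geographic size, and
# the shares of low-income and special-education students. The sign of the
# wealth coefficient indicates whether aid is targeted to poorer districts.
import numpy as np

rng = np.random.default_rng(0)
n = 50  # hypothetical districts

wealth = rng.uniform(50_000, 400_000, n)   # property wealth per pupil (invented)
enroll = rng.uniform(200, 20_000, n)       # student enrollment
area = rng.uniform(5, 500, n)              # district size, square miles
low_income = rng.uniform(0.05, 0.60, n)    # share of low-income students
sped = rng.uniform(0.08, 0.18, n)          # share in special education

# Simulated aid: poorer districts receive more state aid, plus noise
aid = 8_000 - 0.01 * wealth + 2_000 * low_income + rng.normal(0, 200, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), wealth, enroll, area, low_income, sped])
coef, *_ = np.linalg.lstsq(X, aid, rcond=None)

# A negative coefficient on property wealth means aid flows toward
# property-poor districts, holding the other factors constant.
print(f"coefficient on property wealth: {coef[1]:.4f}")
```

With simulated data like this, the estimated wealth coefficient comes out negative, which in this sketch would indicate that state aid is targeted toward property-poor districts.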

Question from Jihong Li, Journalist, Oriental Morning Post:
What do you think is the biggest shortcoming of standardized tests such as the SAT or GRE?

Lisa Staresina, Education Week:
The assessments section of our report does not contain information on norm-referenced tests such as the SAT, ACT or GRE. We do, however, keep track of the states that use other types of norm-referenced or “off-the-shelf” tests. Twenty states and the District of Columbia currently use such tests. The grading scale used in the standards and accountability section gives states credit for criterion-referenced/standards-based tests. Unlike norm-referenced tests, these tests are aligned with the state’s academic content standards. This makes it possible for states to assess whether or not their students are learning state standards. Norm-referenced tests purchased “off-the-shelf” by the state will not necessarily be aligned to a particular state’s academic content standards.

However, this year we found that twelve states are using a hybrid or augmented test. They are using a norm-referenced test and incorporating questions that measure their state academic content standards.

Question from Jim McComas-Bussa, Teacher, Minneapolis Public Schools:
Why did Minnesota receive a D+ when it has the highest percentage of teachers teaching within their major?

Melissa McCabe, Education Week:
Education Week grades the states on state-level education policy. The teacher quality section of the report measures state-level policy efforts to ensure a highly qualified teaching force; we aren’t grading the teachers themselves. States must have comprehensive education policies that hit on several areas of teacher quality - teacher education and qualifications, teacher assessment, professional support for teachers, and accountability for teacher quality - in order to earn high marks in this category. At the state level, MN does not require its teachers to complete a minimum amount of coursework in the subject areas they will teach and its middle school teachers are not required to take subject-knowledge tests. The state also does not require performance assessments such as local team evaluations or portfolios for teachers to obtain a more advanced level of certification, and does not require and finance mentoring for all new teachers. MN does not finance professional development or have comprehensive efforts in place to hold teacher education programs accountable for providing prospective teachers high-quality preparation. In short, MN is lacking comprehensive state-level policy across most of the areas that we grade in this section.

Question from Kelly Frankum, ACCEL (GT) Specialist, Mansfield ISD:
It appears that no group scored above 40% as the highest in any category of NAEP. Who determines “proficient?” Is it the same for everyone across the country? Are the standards too high?

Ron Skinner, Education Week:
It is true that no state has a majority of its students scoring at or above the proficient level on NAEP in any subject or grade level. Cut scores are determined by specialists at the National Assessment Governing Board, but in general proficiency on NAEP is defined as solid academic performance over challenging subject matter. It’s a high standard, but not an unreasonable goal.

Question from William Irwin, Senior Research Associate, PA School Boards Association:
In addition to the four disaggregation requirements associated with AYP, NCLB requires further disaggregation of data by gender and migration status in state report cards. Your evaluation of state report cards under “School Accountability” doesn’t address this requirement. Is this an oversight?

Lisa Staresina, Education Week:
In the Standards and Accountability section, we looked at what states required on their school-level report cards. In fact, we asked states about all the subgroups by which they disaggregate data, and included a few of these subgroups in the report. Although at times there is an overlap between the data we track and No Child Left Behind requirements, Quality Counts should not be seen as a report card on how well states are complying with the federal law.

Question from Carletta Fellows, Director, Extended Learning Programs, Friendship Edison Public Charter School:
Most state districts performed poorly in the area of equity of educational resources across the states. What areas within the states seem to receive less of the educational resources, and how does this affect student achievement among that population of students? How can mechanisms be put in place that determine resource needs and distribution?

Jennifer Park, Education Week:
You are right that there are a lot of states that do not do well on our equity grades. According to our analysis, districts that receive less revenue for education are those with lower property wealth. Although most states provide at least half of the total state and local revenue for education (reducing the reliance on local property taxes) and target state aid to property-poor districts, the majority of states still have inequities in funding due to property wealth. We found that in 42 states property-rich districts, on average, actually have more revenue for education than property-poor districts.

We did not look at the connection of these inequities to student achievement or the evaluation of resource needs.

Comment from Jim Watts, Vice President Southern Regional Education Board:
No question, just a compliment.

Once again you have done an outstanding job.

We use Quality Counts information frequently with our sixteen legislatures, Governors and their staffs. It’s an excellent starting point for discussions on policy and state progress. It works hand in hand with SREB goals and our tracking of state and Southern progress.

Question from Anita Karney, parent:
Regarding Special Education ... I noticed that under the Participation Chart, “Children Served by IDEA”, on page 2 of the online report, that the Deaf/Blind category was often 0.0 ... why?

Jennifer Park, Education Week:
According to U.S. Department of Education data for the 2002-03 school year, only 1,500 of the almost 6 million students served under IDEA were in the Deaf-blindness category. The states in our report with a 0.0 in that column just have very few students in this category and their percentage rounds to zero.
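The arithmetic behind those 0.0 entries can be checked directly. The sketch below uses the national totals cited above for illustration; individual state counts would of course differ.

```python
# Why some states display 0.0: the deaf-blindness share is tiny, and a
# one-decimal-place percentage rounds it down to zero. The figures here
# are the national totals cited above, used purely for illustration.
served_deafblind = 1_500
served_total = 6_000_000

pct = 100 * served_deafblind / served_total
print(f"{pct:.4f}% -> displayed as {pct:.1f}%")  # prints "0.0250% -> displayed as 0.0%"
```

So a state would need roughly one in two thousand of its IDEA students in the deaf-blindness category before the displayed figure rose above 0.0.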

Question from Marcella Lewis, Dedicated Parent:
If one happens to live in a state that received a low grade, what can concerned parents do to ensure that the State’s Education Department makes the necessary corrections? As a parent, I often feel powerless to enact change. What can parents do to help our children receive a better educational experience?

Melissa McCabe, Education Week:
Education Week’s grades provide a comprehensive and comparative look across the states and D.C. on education policy at the state level. The grades are a good jumping off point for discussion about how to improve the quality of education across the states, but it is also important to remember that local and federal policy shape education as well. Parents are particularly well-positioned to enact positive changes at the school and local levels. School report cards are a great place for parents to start. They often contain a wealth of information about each school and can be a good way to assess strengths and weaknesses. Education Week found that 46 states and D.C. require that school report cards include disaggregated student performance data; 29 states have school report cards with school safety information; and 41 states have school or district report cards that include teacher qualification data.

Question from Sharon Teasley, student, University of Phoenix Online:
How has school safety affected the grade each state received?

Ron Skinner, Education Week:
School safety is 20% of the states’ grades for school climate. We look at a few policies relevant to school safety: whether school safety information is included on school report cards, whether states have enacted policies to prevent bullying, and whether they require specific penalties for incidents of school violence. We also include in the grades data from the background survey of NAEP that indicate the percent of students in schools where physical conflicts are considered not a problem or only a minor problem.

Question from Phyllis Fisher, Retired Lib. Aut Spec, NYC Schools:
Adequacy of Resources and Resource Equity

Were there any attempts to assess equity of access to information in computing states’ scores?

Jennifer Park, Education Week:
We do not ask each state to report their school funding figures. Instead we use federal data on district-level spending (mostly from the National Center for Education Statistics and the U.S. Census Bureau) to grade states on the adequacy and equity of school resources. These data work well for our report because they are collected in a way that makes comparisons across states possible. The downside to using federal data is that the most recent information available is from the 2000-01 school year.

Question from Thomas A. Wilson, Education Research Associate, Brown University:
When you grade accountability for Rhode Island, why do you consistently overlook the SALT school visit, an interesting and particularly effective component of the Rhode Island State Department of Education’s accountability system? Rhode Island has invested considerable resources in the development over 7 years of this rigorous school visit based on international research. Peer visiting teams have now written public reports on how well students are learning in more than 60% of RI public schools. The visit has won international acclaim as an important accountability innovation. It is judged to be effective and rigorous by most who know it. It is unusually successful in connecting assessment with action that leads to improvement in student learning.

Lisa Staresina, Education Week:
Site visits are not a graded component of the report, so Rhode Island’s grade did not suffer due to the lack of credit in that column. We rely on the states to give us information on various aspects of their accountability systems through our survey. Rhode Island did not provide information on the school visit portion of the SALT program. Thank you for bringing this to our attention. We will follow up with the state and update the data on the web if appropriate.

Comment from J. H. Kennedy, NAEP State Coordinator for Maine:
Regarding the question about who determines “proficiency” on the NAEP: the National Assessment Governing Board does. More information about the NAEP achievement levels is available from the board.

Question from Mike Starrett, Assistant Principal, Faith Middle School:
Why does Edweek omit the data from the Dept. of Defense schools in its annual report? I think this would be a real eye-opener to the nation of just how well the DoD schools perform. Seeing the level of performance is an indicator of just how much “Quality Counts.”

Ron Skinner, Education Week:
We’re often asked about including jurisdictions other than states (like Puerto Rico, Guam, or DoDEA) that have state-like control over schools. But I believe one of the best features of Quality Counts is the comparability of data across states, and non-state actors often make for awkward, if not unreasonable, comparisons. The DoDEA schools do serve as a great example in some areas, though; they have some of the smallest achievement gaps (differences in scores between white students and their minority counterparts) across the jurisdictions included in NAEP.

Question from Andrew Reeves, Regional President, Vantage Learning.:
1) What is the historic relevance of these numbers? Are we better or worse than in previous years? Do you see any trends developing?

2) Much has been written regarding accountability for students with disabilities. Are they reflected in these numbers, and if so, what impact did they have?

Ron Skinner, Education Week:
With regard to NAEP scores across the states, there’s been a general improvement in student math achievement since NAEP started being administered, and several states had a statistically significant improvement in their 2003 scores since the last time they took the test in 2000. Reading scores have been more stagnant. Students with disabilities are included in NAEP, and while you find these students contributing at all achievement levels, as a group their scores are significantly lower than those of students without disabilities. For example, on the grade 4 reading test, 32% of students without disabilities scored at or above proficient, while only 9% of students with disabilities reached that level.

Question from Charlotte Thompson,Graduate Research Asst.,UT-Tyler:
The Governor of Texas is pushing for college graduates to get on-the-job training instead of enrolling in an alternative course to become certified. I feel this is a good idea; what do you think? Also, after 2 years they must take the state-mandated test to become certified, so this is not a free pass. Please tell me why people think this is not a good idea.

Melissa McCabe, Education Week:
Quality Counts shows that state alternative-route programs and certification requirements for teachers vary considerably. While some alternative routes require comprehensive education courses, field experiences, and subject-matter tests that are not unlike all the requirements that traditional teacher candidates must meet, others are much more streamlined. Alternative routes typically involve some amount of on-the-job training, but the Governor’s plan for Texas could be controversial because college graduates potentially enter the classroom with little or no preservice education training at all.

Question from Jane Forrestal, M.Ed. candidate, Antioch N.E. Graduate School:
I wonder what other states aside from New Hampshire fund public education through local and/or statewide property taxes?

Jennifer Park, Education Week:
New Hampshire and Vermont are good examples of states that have implemented a statewide property tax for education in an effort to increase the equity of school funding. Most states, however, still rely heavily or solely on local property taxes for the locally generated portion of education funding, which is the main reason many states have inequitable funding systems. In Quality Counts we found that in 42 states property-rich districts have, on average, more total education revenue than property-poor districts.

Question from Carol Gillen, parent/volunteer, Enlarged City School District of Middletown, NY:
Are the criteria for establishing the breakpoints in test scores at which a student is considered proficient the same across all states, and, if not, how much variation occurs nationwide?

Lisa Staresina, Education Week:
Proficiency cut scores are set by each state and vary widely from state to state. We do not track the different cut scores by state, but comparisons have been made between the percent of students scoring at or above the “basic” or “proficient” level on the National Assessment of Educational Progress (NAEP) tests and the percent considered proficient based on state tests.

Question from John Stallcup, Parent, River School Napa Unified:
Given that students raised in the Asian culture tend to outscore upper-class Anglo students, the states with the highest percentage of Asian students would have higher overall NAEP scores. Why aren’t Asian (Chinese, Indian, Japanese, Singaporean) students broken out?

Ron Skinner, Education Week:
NAEP provides scores broken out by several racial and ethnic categories. The category closest to what you are looking for is the combined category Asian/Pacific Islander. And while it’s true that in math these students generally outperform other groups, that’s not always so in reading, most likely due to those students learning English as a second language. You can get breakdowns directly from NAEP using the NAEP data tool.

Question from GloJean Todacheene, Curriculum Support, CCSD#22:
With all the rankings and punitive measures against schools, why the disregard for societal factors that impact on student learning? We have a high number of children in poverty, single parent households, working middle class poor, etc. We have to fix these problems before students will be able to learn. I think we are out of focus.

Ron Skinner, Education Week:
I think you make an important point on a couple of levels. First, education and learning are bigger than just schools, and second, defining a problem may be the biggest part of solving it. We know that schools don’t have influence over some of the factors correlated with achievement, so it’s important that they use all the tools at their disposal to raise the achievement of all their students, regardless of background.

Question from Karolyn Hayes, Parent, Milford, DE:
How are special education services evaluated? Are they part of the total, evaluated separately, not included?

Lisa Staresina, Education Week:
Special education services are not evaluated as part of our graded sections. Every year Quality Counts features a special theme, and this year’s theme was special education. The “State of the States” section contains four sections with indicators that are tracked and graded from year to year: “Standards and Accountability,” “Efforts to Improve Teacher Quality,” “School Climate,” and “Finance.” There are a few indicators in the Standards and Accountability section that track states’ accountability for the performance of students with disabilities. For example, there are columns indicating which states show disaggregated performance data for students with disabilities on school report cards, and which states include the performance of students with disabilities as part of their school rating systems.

Question from Coralee Reiss, Education Activist:
I had read before NCLB became law, that the scoring companies were unprepared to cope with the workload which would develop. I know there have been problems in New York. In Connecticut, we don’t have our Connecticut Mastery results yet, because McGraw Hill failed. By the time they get them, it will be too late to help students who need it.

Are there any other states in this situation? Is there somewhere I could find out which other states have this problem, and which scoring companies are failing to deliver?

Ron Skinner, Education Week:
Under NCLB states are supposed to have their report cards with the most recent assessment data by the start of the next school year. While there have been some isolated scoring problems, I’m not aware of any systemic issues around lack of capacity. But as more states move to meet the 2005-06 deadline for implementing tests in grades 3-8 and once in high school, grading these results in a timely manner (so that the results can inform instruction as well as accountability) could become a more prominent issue.

Question from Maxine Sullivan-Pepper, Coordinator Parent & Community Involvement Programs, Grant Joint Union High School District:
How can parents support changing schools to support parents and students in increasing the performance of students?

Ron Skinner, Education Week:
There’s a whole host of groups out there with information and ideas to support parent involvement in schools. We include just a few indicators of parent involvement in the school climate section of Quality Counts, but I would refer you to our issue page on parent involvement, which includes links to organizations, news, and research on the topic.

Question from Edee from Illinois - Admin in Education:
Would you please comment on Illinois’ “F” in equity? What does that grade entail?

Jennifer Park, Education Week:
Illinois earned the lowest grade in equity this year. The state only provides about 40 percent of total state and local funding, indicating a heavy reliance on local property taxes, and the state does less than most states to target aid to property-poor districts. Illinois also ranks 48th out of the 50 states on our wealth-neutrality score, meaning there is a strong link between the property wealth of a district and the amount of state and local revenue available for education. We also grade on how much of a difference there is in funding across districts in the state, and Illinois has moderate disparities.
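The link between district property wealth and revenue that a wealth-neutrality score captures can be sketched in a few lines. This is an illustration, not Education Week’s actual formula; the district figures are invented.

```python
# Illustrative sketch (not Education Week's actual wealth-neutrality
# formula): measure the relationship between district property wealth and
# combined state and local revenue per pupil as a fitted slope. A positive
# slope means wealthier districts end up with more revenue -- the kind of
# inequity described above. All district figures are invented.
import numpy as np

wealth = np.array([80_000, 120_000, 200_000, 300_000, 420_000], float)
revenue = np.array([8_200, 8_400, 8_900, 9_600, 10_500], float)  # per pupil

# Least-squares line: revenue ~ slope * wealth + intercept
slope, intercept = np.polyfit(wealth, revenue, 1)
print(f"revenue gained per dollar of property wealth: {slope:.4f}")
```

In this invented example the slope is positive, so revenue rises with property wealth; a state aiming for wealth neutrality would want that relationship to be flat or negative.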

Comment from Karen Brock, Audiologist, Baltimore City Public School System:
Measuring student achievement for students with disabilities involves more than measuring teacher qualifications and student performance. The impact of related-service providers in this picture should somehow be included in the stats.

Question from martha patrick, interested citizen and school counselor:
What commonality have you found among states with the highest NAEP performances? What does MN have in common with New Hampshire or North Carolina? Is school and state success always directly tied to statewide SES, and only to that, or are there other common factors more related to educational delivery, school funding, etc.?

Ron Skinner, Education Week:
Although Quality Counts doesn’t analyze the relationship between student achievement and other factors, you’re right that SES and other demographic factors are the most highly correlated with achievement. But research has consistently shown that, among factors under the control of the school, teacher quality accounts for a larger portion of variations in achievement than other school characteristics.

Question from Paulette Black, Arts Education Director, Oklahoma Arts Council:
Since the arts have been added to the core curriculum, do you envision a time that either student assessment or district allocation of instructional time/resources, etc. might be added to the report card? If not, why not?

Ron Skinner, Education Week:
While I think we will probably continue to limit our annual analysis to English/language arts, science, math, and social science, the arts have come up in the past as a potential special theme to focus on in Quality Counts. I think the diversity of programs in this subject would lend itself much better to a more in-depth look than we could undertake annually. As far as student assessment around the arts is concerned, NAEP did an interesting national assessment and survey around the arts in 1997, but unfortunately doesn’t have the arts on its schedule again until 2008.

Question from Doug Clements, Dean of Students, Glenbrook Elementary:
If I understand the data, it is the states that follow the prescribed mandates that receive the higher score. Who establishes the prescribed mandates? And are they best educational practices?

Ron Skinner, Education Week:
You’re right, Quality Counts focuses on state-level policy; states that leave decision making around the policies we track to the discretion of districts will not do well in our grading. The indicators we track are based on the best research and advice of experts in each of our graded areas, and we consistently seek expert advice to inform our work in these areas.

Question from michelle Parent connecticut:
Why did Connecticut score the way it did?

Melissa McCabe, Education Week:
Connecticut received a B- in Standards and Accountability. The state has clear and specific standards in science and math at all three grade spans (elementary, middle, and high school). It has such standards in English at the elementary and high school levels and in social studies/history at the middle school level only. In addition, state tests in English and math are aligned with the state standards in elementary, middle, and high school, and the high school science test is also aligned with state standards. CT publishes school report cards and assigns school ratings, but other accountability efforts are lacking, such as sanctions for Title I and non-Title I schools that are rated as low-performing and rewards for high-performing schools.

CT received an A- in Efforts to Improve Teacher Quality because it has minimum coursework requirements in place for teachers entering the classroom as well as a comprehensive battery of tests for teachers to pass before they receive their initial licensure. The state also finances professional development for teachers and has established a system for holding teacher-preparation programs accountable for the quality of education they provide prospective teachers.

The state received a B- in School Climate because of positive ratings that school officials gave their schools on the NAEP background survey around issues such as absenteeism and parent involvement. However, CT does not have strong policies around school choice.

Finally, CT received an A- in Resource Adequacy because it spends well above the national average per pupil and scores well on our adequacy index. But the state received a D in Resource Equity because it ranked second to last among the 50 states in its state share of total state and local funding for education. CT targets a fair amount of state aid toward property-poor districts but has lingering inequities in the distribution of funding based on property wealth.

Question from Sue Roehrich, Curriculum Director, Winona Area Public Schools, Winona, MN:
How do we truly grade the quality of education in our country when the academic progress of private and home schools is not measured. Aren’t we leaving some children behind?

Jennifer Park, Education Week:
We focus only on public K-12 education in our Quality Counts report, but we do include indicators on state policy about school choice in our school climate section of the report. The National Assessment of Educational Progress actually does track the achievement of private school students.

Kathryn Doherty, moderator (Moderator):
Each year, in addition to grading the states on standards and accountability, teacher quality, school climate, and resources, we also pick a special theme and explore state policy in that area in depth. This year it was special education. Last year, teaching. The year before, early childhood education. We welcome suggestions from the audience for future Quality Counts special themes. Any suggestions?

Question from :
From reading web pages, it would appear that definitions for common terms vary from state to state. How have these differences been factored into the results?

Ron Skinner, Education Week:
That’s an important part of our vetting process. States have different terms for the same thing and the same term for different things. And when you look at funding mechanisms it gets even worse. We do our best to account for this in the many steps of our survey process--from how we ask and define our questions, to how we confirm our final results with the states before publication.

Question from Kathryn Doherty, moderator:
Lisa, can you talk a little bit about how our indicators related to standards, testing and accountability line up with the requirements states are supposed to meet for NCLB?

Lisa Staresina, Education Week:
The requirements of No Child Left Behind have significantly affected developments in the area of standards and accountability. The public disclosure of disaggregated student performance data has doubled since last year. For example, last year about half the states provided test data on school report cards by race, ethnicity, poverty, limited English proficiency, or disability. This year, 46 states and the District of Columbia provide school-level test data disaggregated by at least some subgroups. Last year, 30 states had a system for rating schools. This year, with the expansion of AYP to include all schools, 50 states and the District of Columbia received credit for school ratings. For additional information on the progress states are making in implementing No Child Left Behind, please see Lynn Olson’s 12/10/03 article, “In ESEA Wake, School Data Flowing Forth.”

Question from Forest Thigpen, President, Mississippi Center for Public Policy:
When grading on school size, did you consider smaller schools as better than larger schools, or worse than larger schools?

Ron Skinner, Education Week:
States received credit toward their school climate grade based on the percentage of students enrolled in schools below a certain enrollment threshold, which varied by the level of the school (elementary, middle, high). So yes, smaller schools were treated as better than larger ones. But by using the percentage of students rather than the percentage of schools, we minimized the impact of very rural areas, where extremely small schools might not be the most efficient and effective way of educating students.
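The calculation described above can be sketched as follows. This is a hypothetical illustration only: the enrollment thresholds and the `percent_in_small_schools` function are assumptions for the example, not Education Week’s actual cutoffs or methodology.

```python
# Illustrative sketch of a school-size indicator: the percent of all
# students who attend schools below a level-specific enrollment cutoff.
# The threshold values below are invented for this example.

THRESHOLDS = {"elementary": 350, "middle": 600, "high": 900}  # assumed cutoffs

def percent_in_small_schools(schools):
    """schools: list of (level, enrollment) pairs.

    Returns the percent of all students enrolled in schools whose
    enrollment falls below the threshold for their level."""
    total = sum(enrollment for _, enrollment in schools)
    small = sum(enrollment for level, enrollment in schools
                if enrollment < THRESHOLDS[level])
    return 100.0 * small / total if total else 0.0

# A tiny example "state": one small elementary, one large elementary,
# and one very large high school.
schools = [("elementary", 300), ("elementary", 500), ("high", 2000)]
print(round(percent_in_small_schools(schools), 1))  # → 10.7
```

Weighting by students rather than by schools is what keeps a handful of tiny rural schools from dominating the result, as noted above.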

Lisa Staresina, Education Week:
As you are browsing through Quality Counts 2004, either on the web or in the print version, please take a few minutes to fill out our survey. Input from past survey respondents has been extremely helpful to our process. For example, the survey gives you the opportunity to tell us which special themes you think we should tackle in the future. Also, through February 29th, you can receive 25% off a new or renewal subscription to Education Week. A link to the web survey is located in the left-hand column of the major sections of the web version of the report. The print survey is located in the middle of the report, between pages 58 and 59.

Question from a reader:
In response to Kathryn Doherty’s question, how about a special issue on achievement gaps and changes in gaps in the states?

Ron Skinner, Education Week:
The achievement gap between the performance of different ethnic, racial, and special-needs groups is going to be a big issue in the years to come, as NCLB will require states to report specifically on the performance of all these groups. One issue we are considering for the next QC: what policies and practices are states implementing to help low-performing students and schools?

Question from Tom Easterly, retired Legislative analyst, IL:
From reading the web pages, it would appear that definitions for common terms, i.e., salary, vary from state to state. How have these factors been considered?

Jennifer Park, Education Week:
Although we collect most of our data through a survey of state departments of education, we also compile data from other sources for Quality Counts. We only choose sources where the data are comparable across the states. For example, for teacher salaries we use data from the annual survey of teacher salary trends from the American Federation of Teachers. For more information see the “Sources and Notes” section of the report.

Comment from Karen Brock, Audiologist, Baltimore City Public School System:
In response to K. Doherty, moderator: A fair number of parents have participated today. Perhaps a parental-involvement special theme highlighting state policies is an idea.

Question from Ryan Hayes:
How is it that some states that receive bad grades in your report have high levels of student achievement?

Jennifer Park, Education Week:
Well, this is something of a chicken-and-egg problem. Which comes first, state policy or high achievement? Since we grade on state policies, is it that a state has low achievement and therefore implemented more policies? Or do states with traditionally high achievement not need as much state intervention?

I also want to note that no state has a majority of its students performing at or above proficient on NAEP, and in our Student Achievement section we show very large achievement gaps between the performance of white and minority students. All states have a lot of work to do!

Question from morton egol, ceo,wisdom dynamics llc:
By publishing the test scores and creating a ranking, Ed Week is lending support to NCLB. Would it be better to publish a broader set of assessment measures that would diminish NCLB and support development of a more relevant and robust education system?

Lisa Staresina, Education Week:
Quality Counts does not only track indicators and data related to No Child Left Behind. As states make expected progress over the years, indicators are added to maintain a current picture of state policy across the core areas that we track. Oftentimes these indicators intersect with requirements of NCLB, but this isn’t always the case. For example, No Child Left Behind requires testing in grades 3-8 in reading and math and once in high school; Quality Counts grades the states on whether they have standards-based tests at the three grade spans: elementary, middle, and high school. In summary, the report tracks policies and indicators that quality research has shown to be important.

Question from Pete Schlieker Teacher Big Pine:
Working in a small school (fewer than 75 high school students), I see a wide range of standardized test scores. I also see a number of students who do not have a good home life, one that supports their education by ensuring that homework is done and that classroom behavior is acceptable. Given all these outside distractions (as well as others), how can we as educators wade through issues beyond our control and ensure that all students become proficient in the subjects we teach?

Ron Skinner, Education Week:
Because we analyze so many factors, most of which, we hope, have a positive influence on achievement, we do often get questions about the outside influences over which a school has no real control. Take a look at the issue pages available on the Ed Week website. We provide basic information on issues such as parent involvement, character education, violence and safety, and high school reform. Within each of these pages are links to various organizations with resources and information that might be of assistance to you.

Question from Joe Smith, Primary PE teacher, Jefferson Elementary Center:
When will the data showing that academic success rates increase where fitness and quality PE programs exist in school districts be used to expand and improve academic performance? With obesity and type 2 diabetes being pressing health issues, and with data showing the correlation between fitness and better academic performance, why do educational planners continue to remove PE/recess and art/music from curriculum priorities?

Jennifer Park, Education Week:
One of the main criteria we use for including indicators in our report is their relationship to student achievement, but unfortunately we cannot include everything, or no one would be able to get through all of the data! You can find information on current trends in student health and what schools are doing to help by searching the archives of Education Week for recent articles on the topic.

Question from Larry Reed, Supervisor, Milwaukee Public Schools:
In identifying and analyzing the racial disparities in the various educational categories, what trends are seen to explain the possible causes?

Melissa McCabe, Education Week:
I’m assuming that you are referring to racial disparities in the identification of students for specific special education classifications. The under- and overrepresentation of students typically occurs in the categories that are the most subjective to identify (mental retardation, emotional disturbance). Some scholars attribute this to the use of identification tools, such as IQ tests, that could be culturally biased. Also, some experts feel that students are being overidentified in general, due to more universal problems such as the misidentification of students with reading difficulties as needing special education services.

Ron Skinner, Education Week:
Since I also compile the data for the student achievement section, I want to point out a new indicator we’ve included this year: the “chance for college.” Although slightly complicated, the chance-for-college indicator is one way of looking at how well a state’s high schools are preparing its students for postsecondary education. It looks at the percent of ninth graders who go on to graduate with a regular diploma and enroll in a degree-granting program at a two- or four-year college.
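One plausible way to compute a pipeline indicator like the one described above is to multiply the stage-to-stage rates. This is a sketch under stated assumptions, not Education Week’s actual methodology; the function name and the cohort figures are invented for the example.

```python
# Illustrative "chance for college" pipeline: the share of an entering
# ninth-grade cohort that graduates with a regular diploma AND enrolls
# in a degree-granting two- or four-year college.

def chance_for_college(ninth_graders, graduates, college_enrollees):
    """Returns the percent of ninth graders who both graduate on time
    with a regular diploma and then enroll in college."""
    grad_rate = graduates / ninth_graders        # on-time graduation rate
    enroll_rate = college_enrollees / graduates  # graduates who enroll
    # The product collapses to college_enrollees / ninth_graders.
    return 100.0 * grad_rate * enroll_rate

# Hypothetical cohort: 100,000 ninth graders, 70,000 regular-diploma
# graduates, 42,000 of those enrolling in a 2- or 4-year college.
print(round(chance_for_college(100_000, 70_000, 42_000), 1))  # → 42.0
```

Chaining the rates this way is what makes the indicator sensitive to losses at every stage: a state can have a high college-going rate among its graduates and still score poorly if many students never reach graduation.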

Kathryn Doherty, moderator (Moderator):
Thank you all for your questions and comments about Quality Counts and about education policy in general. One of the main reasons we publish this report each year is to disseminate comparable, high-quality data that can help spark discussion, debate, and thoughtful analysis of education policy. Sharing information and becoming knowledgeable about policy is an important part of the process of improving our schools. So we thank you for your engagement in today’s conversation. Check back later for a transcript of this session, and please read more about state education policy in Quality Counts 2004.

The Fine Print
All questions are screened by an Education Week online editor and the guest speaker prior to posting. A question is not displayed until it is answered by the guest speaker. We cannot guarantee that all questions will be answered, or answered in the order of submission. Concise questions are encouraged.

Please be sure to include your name and affiliation when posting your question.

Education Week maintains Live Chat as an open forum where readers can participate in a give-and-take discussion with a variety of guests. Education Week reserves the right to condense or edit questions for clarity, but editing is kept to a minimum. Questions may also be reproduced in some form in our print edition. We attempt to correct errors in spelling, punctuation, etc. In addition, we remove statements that have the potential to be libelous or to slander someone. In cases in which people make claims that could be libelous, we will remove the names of institutions and departments. But in those cases, we will not alter the ideas contained in the questions.

Please read our Privacy Statement and Visitor Agreement if you have questions.