Break out the balloons and the bubbly drinks, it’s April 3! That’s right, it’s the first official deadline for states to turn in their plans for implementing the Every Student Succeeds Act to either the U.S. Department of Education or to their governors for review. (States that go the second route officially get to turn in their plans to the department on May 3.)
Late last year, 17 states and the District of Columbia said they were shooting to turn in their plans on April 3, although a couple, including Ohio, have decided to sit tight and keep working. There is a second deadline, on Sept. 18.
The plans will now be read by different teams of peer reviewers at the department. Political appointees, including U.S. Secretary of Education Betsy DeVos, are forbidden from monkeying with that process. But the secretary gets to give the plans the final thumbs up or down. More on how all that will work here.
The peer reviewers aren’t the only ones who will be looking at the plans with a fine-tooth comb. Advocates, educators, and reporters will also be scrutinizing them.
Here’s a preliminary list of questions to ask as you go through plans for your favorite state or states:
What did the state pick for its big, long-term goal, and what kinds of short-term goals is it setting?
What ESSA says: ESSA requires all states to set both long-range and shorter-term goals. But unlike under its predecessor, the No Child Left Behind Act, which required everyone to shoot for proficiency on state tests by 2013-14, ESSA allows states to pick their own goals. ESSA doesn’t set any sort of deadline for states.
These goals must address student achievement as measured by state tests, English-language proficiency, and graduation rates. Goals have to set an expectation that groups of kids that are furthest behind—like students in special education and English-learners—close gaps in achievement and graduation rates with their peers. The goals matter because they will give the state a big common mission, something to shoot for. Plus, individual schools will be judged by how close or far off the mark they are.
What that looks like in state draft plans: In Illinois’ draft plan, which has been submitted to the governor for review, the number to watch is 90 percent and the year to watch is 2032. (The kids who will be seniors in high school by that point aren’t even kindergartners today.)
For instance, the plan calls for 90 percent or more of 3rd graders to be reading at grade level on state tests by that year, and for 90 percent or more of 5th graders to meet or exceed expectations in math by then. It also calls for 90 percent of 9th graders to be on track to graduate from high school with their cohort by 2032, and for 90 percent of students to graduate high school ready for the work force.
Some subgroups of students have much farther to go than their peers. The Land of Lincoln’s overall graduation rate is 85.5 percent right now, so a climb of fewer than five points over the course of 15 years may arguably not be much of a stretch. But the graduation rate for students in special education is only 70.6 percent. Pushing that up by nearly 20 points could be a challenge.
Interestingly, Massachusetts’ draft plan notes that the state is still rolling out its new testing system, so the state says it will wait for data from the first year of the new assessment before setting long-term goals for things like academic achievement. (The plan does set grad rate goals.)
Want more on goals? Our colleague Daarel Burnette II has you more than covered in this great story.
What did states pick for their academic indicators, and how much do they matter in rating schools?
What ESSA says: States must pick at least three academic measures to gauge school performance for elementary and middle schools. At least two of them must be English-language proficiency and achievement (“proficiency”) on state tests. The other one can look at how much progress students are making on state tests, aka “growth.” That other academic indicator could also be something else, like how well schools are doing when it comes to closing the achievement gap. And states don’t have to stop at three academic indicators. (Massachusetts, for instance, pitched growth, proficiency, and gap-closing, along with English-language proficiency, in its draft plan.)
And although states must use reading and math tests in their accountability systems, they can add other subjects if they want. Vermont, for instance, would like to incorporate science, as well as health and physical education, according to its draft plan.
For high schools, states must consider at minimum English-language proficiency, plus proficiency on state tests, plus graduation rates in rating their schools.
These academic indicators, as a group, must count for “much more” than other indicators (like school climate) in figuring out a school’s overall rating.
What that looks like in state draft plans: The District of Columbia, for instance, pitched making academic achievement 30 percent of a school’s overall score, making academic progress 40 percent, and making English-language proficiency 5 percent. The remaining 25 percent would be for “school environment,” which includes chronic absenteeism, regular garden-variety attendance, re-enrollment in the district, and a measure that will be piloted down the road called “access and opportunity.”
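To make that weighting concrete, here’s a minimal sketch of how the District of Columbia’s proposed percentages would combine into an overall school score. The weights come from the draft plan; the component scores below are made up purely for illustration.

```python
# Weights from D.C.'s draft plan; component scores are hypothetical.
weights = {
    "academic_achievement": 0.30,
    "academic_progress": 0.40,
    "english_language_proficiency": 0.05,
    "school_environment": 0.25,
}

def overall_score(components):
    """Weighted sum of component scores, each on a 0-100 scale."""
    return sum(weights[name] * score for name, score in components.items())

# Hypothetical school: middling achievement, stronger growth.
example = {
    "academic_achievement": 70,
    "academic_progress": 80,
    "english_language_proficiency": 60,
    "school_environment": 75,
}
print(overall_score(example))  # 0.3*70 + 0.4*80 + 0.05*60 + 0.25*75 = 74.75
```

Because “academic progress” carries the largest weight, a school with strong growth but middling proficiency can still land a respectable overall score under a scheme like this.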
What did states pick for their indicator of student success and school quality?
What ESSA says: One of the biggest differences between ESSA and No Child Left Behind is that states have to factor something other than test scores—for example, school climate, teacher engagement, or access to arts education—into school ratings. This indicator (or indicators) must be broken out by school and subgroup. And it can’t count for more than all of the academic factors combined.
What this looks like in state draft plans: Chronic absenteeism was a big favorite. For instance, New Jersey’s draft plan makes it the “school quality” indicator, and weights it at 10 percent. States are also considering things like college and career readiness, early-childhood indicators, and more.
Massachusetts, for instance, is using chronic absenteeism (defined as kids missing 10 percent of school days), successful completion of 9th grade courses, and “successful completion of a broad and challenging curriculum” (defined as the percentage of kids taking Advanced Placement, International Baccalaureate, or dual enrollment courses).
Some states clearly want to incorporate certain things into their systems, but don’t feel like they’ve figured out how to measure them yet. For instance, Oregon would eventually like to include things like school climate surveys and staff absenteeism in its plan. It has listed those things as possible “future indicators” in the most recent draft on its website.
How will states communicate school ratings to parents, the feds, and the public?
What ESSA says, or at least implies: States can either use a dashboard—which displays a mix of different ratings for different indicators—or come up with an overall “summative score” (on an A through F scale, for instance). States are going both routes.
What that looks like in state draft plans: California has been working on its dashboard for years, even before ESSA passed, in fact. Oregon is using a dashboard approach, too. New Mexico, however, will rate schools as exemplary, highly effective, effective, minimally effective, and ineffective. The District of Columbia rates schools on a five-star system.
How are states tackling testing and opt-outs?
What ESSA says: ESSA kept annual testing, but put in place a range of flexibilities, including the opportunity for states to use a bunch of interim tests instead of one big summative exam. It also allowed high schools to use a nationally recognized college entrance exam (like the SAT or ACT) instead of the state test, with state permission.
The language on test participation in ESSA is really complex. In a nutshell, every school is supposed to test 95 percent of its students, just like under NCLB. Under NCLB, schools that dipped below the 95 percent threshold were deemed automatic failures. Under ESSA, states get to decide what happens in schools with low test participation. This issue has been a big, politically charged deal, in part because of the movement of parents opting their kids out of standardized tests.
What this looks like in state draft plans: Oregon, for one, is mulling some of the testing flexibilities, according to its state plan. It would like to pursue the nationally recognized test for high school students. And its draft plan envisions a new “pilot” for interim assessments.
States have a range of ideas for what to do for schools where fewer than 95 percent of students take tests. For instance, if a school in New Jersey doesn’t meet the 95 percent threshold, the Garden State will take public note of it on its school performance report.
Schools that miss the 95 percent target in the District of Columbia will get technical assistance and monitoring, according to its most recent draft plan. If those schools keep missing the participation mark, they may be identified for “additional actions and interventions.”
How are states handling school improvement?
What ESSA says: ESSA gave states and districts way, way more flexibility when it comes to fixing schools that are seriously struggling (those in “comprehensive improvement”) and schools where certain groups of students aren’t performing well (those in “targeted improvement”). Essentially, districts get to design a plan for the lowest-performing schools, monitored by the state. Schools with low-performing subgroups come up with their own plan to fix the problem, monitored by the district.
What it looks like in state draft plans: Some states are essentially sticking with what they feel has worked well in the past. Massachusetts, for instance, is staying with its school improvement system, which rates districts as well as schools and gives the state the option of taking over really low-performing districts, as Massachusetts has with Lawrence. Other states are pitching new ideas. Nevada wants to develop a state-run “achievement school district” like the one in Tennessee and other states.
And as part of its technical support, Colorado would like to provide its districts with a list of evidence-based strategies and partnerships, which it is planning to add to and evolve over time. Importantly, districts won’t be restricted to the things on the list—it’s just a resource.
There’s obviously much more to consider in state plans, everything from how states decide when subgroups are “consistently underperforming” to teacher distribution and how states are measuring English-language proficiency.
We know we missed a lot, even in the small sampling of draft plans we looked at briefly. As you read through the plans, email us at aklein@epe.org and aujifusa@epe.org and tell us what you’re finding.