If you still believe there is a clear consensus on what constitutes college readiness, allow me to share with you this story from the Chicago Tribune.
What happened here? The Tribune analyzed how ACT's "college readiness benchmarks" lined up with schools' own ideas of their students' preparedness. At some of Illinois' best-regarded high schools, as it turns out, substantial portions of students are falling short of the ACT's benchmarks, which are supposed to indicate whether students are ready to succeed in entry-level, credit-bearing college coursework.
Predictably, this sort of thing can prompt some squirming and defensiveness in high schools that are used to elite distinctions. But it takes us back to that persistent question: Exactly how do you define college readiness?
ACT's research led it to conclude that students who reach a certain score in a given subject stand a much better chance of succeeding in related college coursework than those who fall short of it. High schools can point out, as some in the Tribune story did, that their kids go off to college in droves and are very successful. But how many schools really track their students into college well enough to make that claim with credibility?