State report cards are broken and it feels intentional

The Way I See It | Jason Hawk, editor


Just try picking apart this year’s state report cards. I dare you.

I’ve been trying for days to analyze the often nonsensical, convoluted wreckage supposedly intended to give the public insight into how effective our schools are across Lorain County and the rest of Ohio.

The report cards use A’s, B’s, C’s, D’s, and F’s along with a smattering of apples-and-oranges numerical grades to tell how many kids passed last year’s state tests. They also measure how well school districts reach their best students, their worst students, and their disadvantaged students.

Here’s the problem: You might think you know what an A or an F means. The report cards bend those letter grades so far as to mean nothing.

Take Amherst as an example. The public schools were graded an A for progress among all students. But despite that high-ranked growth, the district’s overall grade paradoxically slipped from a B last year to a C now. And while the Amherst Schools have a long record of excellent grades for progress, their numerical performance rating has dropped each of the last three years.

I sat down with Michael Molnar, whose job includes overseeing testing in Amherst. He says the weirdness is the result of the Ohio Department of Education grading on a curve. The grades don’t reflect how local students do from year to year — they rank districts compared to each other. And in some cases, they’re just plain inaccurate and miscalculated.

Add that to the testing problems that have rankled Lorain County superintendents these past three years, including the disastrous PARCC exams, back-to-back-to-back changes in test providers and content, and the transition from paper-and-pencil to online tests.

Just this month they sent a letter to state lawmakers voicing exasperation with how the legislature is holding Ohio’s schools to far stricter testing standards than the federal government. Amherst, Oberlin, Firelands, and Wellington superintendents all signed.

And all suffered far lower grades on the report cards released last week:

• Oberlin got C’s both overall and for progress and F’s for elementary reading levels and “gap closing,” which measures success among minorities, children from poor families, and kids with disabilities.

• Firelands was graded a D overall and F’s for progress, elementary reading, and gap closing.

• Wellington was also slapped with a D grade overall and F’s for progress, elementary reading, and gap closing.

Ohio School Board member A.J. Wagner took to social media to call the report cards “a disaster,” not because of poor teaching but because of “poor test composition, poor expectations, poor measurement, poor supports, poor financing, and poor policy making.”

More than half of all districts and about 60 percent of all schools received a D or an F for achievement this time around, he noted. A third of all districts are in D or F territory for progress.

Lower grades were expected and dreaded this year. “Even though schools are seeing local improvement on many fronts, the results were not unexpected since students are being judged against new, higher state standards,” the Ohio School Boards Association said in a written statement.

“OSBA supports accountability and welcomes the opportunity to learn how students are progressing and where improvement is needed,” it continued. “At the same time, there are concerns about 2016 being the third year in a row with different tests and varying standards. Districts need adequate time to properly prepare for such transitions.”

OSBA president Eric Germann said report cards don’t speak to how districts fare with job, college and military placement, scholarships awarded, the arts, and community service.

He’s right. What’s more, standardized test results aren’t being released in a timely way that would help teachers learn what weak spots need to be addressed. Teachers never even get to see which test questions threw their students, which raises the question: How are the results helpful in improving the way students are reached in the classroom?

I can’t help but feel the report cards’ devolving quality the past several years is intentional. It feels like a deliberately broken system is being used to make public schools look bad as state officials continue to divert our tax money to charter schools, which are breeding academic failure but making investors rich.

At the very least, the nature of the report cards is political. As Molnar said: “This is all tied to politics because it’s the legislature that passes (educational) laws and policies into code, and then the ODE has to translate it into practice.”

You may have legitimate anger over any number of things happening in your local public school system. No school system is perfect.

But the state report cards should be the last bit of “evidence” you use to judge our local school districts. The metrics are deeply flawed and not to be trusted — and they will be even less trustworthy next year, when the state jacks up the minimum passage rates even higher: if fewer than 80 percent of students pass any given test, the result will be an F.
