PARCC practice tests (Ty Wright/AP)

(Update and correction: New information; new chart; percentage of students who took PARCC online was 64 percent, not 75 percent; clarifying headline)

If you want to understand how political the process of determining how well students do on tests can be, consider what happened in Ohio.

Ohio just released an initial wave of results from the Common Core test known as PARCC, which stands for the Partnership for Assessment of Readiness for College and Careers, the multi-state consortium that designed the exam with federal funds. (The test is aligned to what are called the Ohio Learning Standards, which are basically the Common Core State Standards with a different name.) The scores were for online PARCC tests, which were taken by about 64 percent of students who took the exam in Ohio.

Under the PARCC scoring guidelines used in other PARCC states, a little more than a third of the students who took the test online would be considered “proficient” in math and English language arts. But state officials some time ago approved a grading system for PARCC that uses a lowered benchmark for “proficiency,” and under it, the share of students considered “proficient” in Ohio is nearly 65 percent.

In a memo (see text below) about the Ohio proficiency controversy, Karen Nussle, executive director of the nonprofit Collaborative for Student Success, shows how test results can be viewed in different lights depending on what a passing score is determined to be.

“Proficiency as defined by the Ohio State Board of Education is inconsistent with how proficiency is defined by both the Partnership for Assessment of Readiness for College and Careers (PARCC) and the National Assessment of Educational Progress (NAEP), the nation’s report card. This discrepancy should give pause to parents, community leaders and policy makers who expect transparency in Ohio’s transition to higher standards and new tests. … [It] suggests that Ohio has set the proficiency bar too low and undermines the promise of ensuring kids are on track for college and career.”

This past summer, Ohio decided to stop using PARCC and contracted with a testing company to develop a new exam for its students.

Below is a chart provided by the Ohio Department of Education comparing Ohio’s grading levels for PARCC with PARCC’s own. In Ohio, students who “approached expectations” are considered “proficient.” For PARCC, Level 4, not Level 3, would represent college- and career-ready.

Level   Ohio (as required by law)   PARCC
5       Advanced                    Exceeded expectations
4       Accelerated                 Met expectations
3       Proficient                  Approached expectations
2       Basic                       Partially met expectations
1       Limited                     Did not yet meet expectations

Education officials in various states who don’t like the optics of very low test scores have taken a number of different actions over the years. As explained in this Answer Sheet post by educator Carol Burris, cut scores are selected points on the score scale of a test, and the points are used to determine whether a particular test score is sufficient for some purpose. Notice the word “selected.” Cut scores are selected based on criteria that the selectors decide have some meaning. It is often the case that the criteria have no real validity in revealing student achievement, which is the supposed mission of the test — and that means the scores have no meaning either.
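To make the mechanics concrete, here is a minimal sketch in Python. The five performance levels and their labels come from the chart above; everything numeric, the cut points and the sample scores, is hypothetical and invented purely for illustration, since the post does not give actual scale values. The point is simply that the same scores produce very different “proficiency” rates depending on which level is declared the bar.

```python
# A minimal sketch of how cut scores work. The five performance levels and
# their labels come from the Ohio/PARCC chart above; the numeric cut points
# and the sample scores below are hypothetical, invented for illustration.

OHIO_LABELS = {1: "Limited", 2: "Basic", 3: "Proficient",
               4: "Accelerated", 5: "Advanced"}
PARCC_LABELS = {1: "Did not yet meet expectations",
                2: "Partially met expectations",
                3: "Approached expectations",
                4: "Met expectations",
                5: "Exceeded expectations"}

# Hypothetical cut scores: the lowest scale score needed for each level.
CUTS = {2: 700, 3: 725, 4: 750, 5: 800}

def level(score):
    """Map a scale score to a performance level (1-5) via the cut scores."""
    result = 1
    for lvl, cut in sorted(CUTS.items()):
        if score >= cut:
            result = lvl
    return result

def pct_at_or_above(scores, min_level):
    """Percentage of test-takers at or above a given level."""
    return 100 * sum(level(s) >= min_level for s in scores) / len(scores)

# A made-up score distribution for ten test-takers.
scores = [680, 705, 710, 730, 735, 740, 755, 760, 790, 810]

# Ohio calls Level 3 proficient; PARCC treats Level 4 as college- and
# career-ready. Same scores, same cut points -- very different headlines.
print("Ohio's bar (Level 3+):", pct_at_or_above(scores, 3), "%")   # 70.0 %
print("PARCC's bar (Level 4+):", pct_at_or_above(scores, 4), "%")  # 40.0 %
print("A 730 is", OHIO_LABELS[level(730)], "in Ohio, but PARCC says",
      PARCC_LABELS[level(730)].lower())
```

With the same ten made-up scores, 70 percent clear a Level 3 bar but only 40 percent clear a Level 4 bar, roughly the shape of the gap between Ohio’s reported 65 percent and the one-third that PARCC’s benchmark would yield.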

In 2012, Florida gave a new standardized writing test to students in several grades, and only 27 percent of fourth-graders had proficient scores on the Florida Comprehensive Assessment Test, down from 81 percent the year before. Eighth-graders and 10th-graders also had dramatically lower scores than the year before. State education officials panicked, and the Florida Board of Education decided to lower the passing score on the exam.

Another way to game the testing system is to make the tests easier so the scores will be better, which is what happened in New York City public schools during the eight-year tenure of Joel Klein as chancellor. He had to resign in 2010 after it was revealed that test scores he pointed to as proof of the success of his business-based reforms were based on increasingly easy standardized exams.

Here’s the full memo by Karen Nussle:

Ohio this week became the first PARCC (Partnership for Assessment of Readiness for College and Careers) state in the country to release preliminary student test results measuring performance against Ohio’s Learning Standards, which are based on the Common Core State Standards.

That the number of students deemed proficient on the tests was lower than in previous years is no surprise. The Common Core State Standards are, after all, considerably more rigorous than previous K-12 standards. But what parents should pay attention to is the percentage of students determined by the state to be proficient under the new assessments.

Proficiency as defined by the Ohio State Board of Education is inconsistent with how proficiency is defined by both the Partnership for Assessment of Readiness for College and Careers (PARCC) and the National Assessment of Educational Progress (NAEP), the nation’s report card.

This discrepancy should give pause to parents, community leaders and policy makers who expect transparency in Ohio’s transition to higher standards and new tests.

Who’s Looking Out for Students?

According to the state, more than half of Ohio students who took the PARCC exam are now officially proficient in math and English. Both PARCC and NAEP, however, would consider that percentage to be significantly lower. The discrepancy suggests that Ohio has set the proficiency bar too low and undermines the promise of ensuring kids are on track for college and career.

For the past five years, since Ohio first adopted the Common Core State Standards, the state has been engaged in a very difficult transition, not only to bring transparency and honesty to the process of reporting student achievement but also to raise the bar in order to ensure kids are prepared for the next step after high school.

This herculean effort was needed because in past years, Ohio’s Honesty Gap – that is, the difference between proficiency rates reported by state tests and those on the NAEP – was among the most pronounced in the country.

By expanding the definition of proficiency to include students who are less than proficient, it appears the state is regressing. “I’m trying to understand how these [proficiency rates] are raising expectation,” said Sarah Fowler, a member of the state board of education. There has been no explanation as to why this decision was made, but we can speculate that it was so more students would score “proficient” on paper, and not because they truly earned that designation. We encourage Ohioans to ask these questions.

For further insight, read this post from the Fordham Institute’s Ohio Gadfly.

Local Control Still Needs to be Honest and Transparent

We very much support Ohio’s ability to make these decisions for itself, without intrusion by the federal government or other national entities. These decisions need to be made in Ohio by Ohioans.

But at the same time, Ohio parents deserve an honest assessment of student proficiency. ‘Local control’ cannot become a fig leaf that covers up a dumbing down of the system in order to make policy makers look good at the expense of kids.

At this critical time of transition, parents deserve transparency. And they deserve the truth. They should demand it in Ohio.