Good or bad? New rating system can’t decide about this principal

freepik.com

I recently published a post about how a teacher in New York was wronged by the state’s controversial new educator evaluation system, which is based in large part on student standardized test scores. Here’s a story about a school principal’s personal experience with the scores. This was written by Sean C. Feeney, principal of The Wheatley School in New York and president of the Nassau County High School Principals Association. He is a co-author of the New York Principals letter of concern regarding the evaluation of teachers by student test scores. It has been signed by more than 1,535 New York principals and more than 6,500 teachers, parents, professors, administrators and citizens. You can read the letter by clicking here.

By Sean C. Feeney

Superintendents in New York State’s public school districts have for the last week or so been receiving what are called “growth scores” for teachers of grades 4-8 and for principals serving grades 4-8 and 9-12. These scores are based on the new Common Core test results, which saw a 30-point drop in the percentage of students classified as “proficient.” Although our commissioner of education has repeatedly tried to assure the public that the dramatic drop in student scores does not reflect negatively on student, teacher or school performance, this message flies in the face of reality. As the state continues to “build its plane in the air” — meaning that it is implementing an educator evaluation system before it has been properly designed — those of us who work in schools are seeing how destructive these poorly planned initiatives have been.

As thousands of educators across New York State have publicly indicated, the excessive testing and the use of student scores to rate teachers, principals and schools are misguided, not based on sound research and rushed in their implementation. These facts are ignored at the peril of our students and schools.

The more we see of the results of New York State’s initiatives, the more problems and contradictions are revealed. A close look at the recently released scores makes three glaring problems immediately obvious:

1) The scores provided by the state don’t reflect reality.
As the principal of a Grade 8-12 school, I receive two separate growth scores from the state. According to the state’s growth measures for Grades 4-8 principals, I have been classified as a “Developing” principal. This is one step above the “Ineffective” rating. According to the growth measure for Grades 9-12 principals, I am an “Effective” principal. The notion that one can look at a single grade and extrapolate a rating for an educator is nonsensical and bad statistical practice. Additionally, it is not reflective of the educational offerings we provide our students across Grades 8-12.

Our school programs and offerings reflect a supportive community that has high expectations for its children. We offer robust music, theater and art programs, and students have ample opportunity to participate in student clubs and athletics. Our school has a strong commitment to community service, with faculty and students all participating in a daylong Day of Service and Learning. I have the blessing of working with a wonderfully talented faculty. As the principal, I have been smart enough to listen to and support them all the time, to stay out of the way some of the time, and to push hard at other times.

So how has our school done preparing students for college? Our most recent state report card reflects a graduation rate of 100 percent, with 89 percent of our students earning the higher Advanced Designation diploma. These rates are far above the New York State averages of 74 percent and 30 percent, respectively. Our school is highly ranked on national lists of top high schools. Virtually every graduate attends college, with over 90 percent of them attending a four-year college. Students have the opportunity to participate in career mentoring, science research and mathematics research. Every school in our country should have the support and programs we are able to provide our students. How do these two scores, as well as the scores that my teachers received, reflect our work in preparing all students for college and careers?

2) The scores provided by the state are not consistent.
In one of the many memoranda to schools, Commissioner John King boasts that “about three-quarters of individual teachers will earn the same or better [test score-based] HEDI rating than they did in 2011-12.” This is not something that should make our commissioner proud! If 75 percent of teachers earned the same or better rating than last year, that means about 25 percent of teachers earned a worse rating than they did in 2011-12. No matter how it is viewed, this is an alarming amount of movement in a model that purports to measure “teacher effectiveness.” Did that many teachers become worse at their craft from one year to the next? Which measure is accurate: last year’s or this year’s? This lack of inter-temporal stability in value-added measures has been identified by researchers for years. Clearly, there is a problem with the model. A system that purports to be objective but sends teachers bouncing from Ineffective to Effective, or from Highly Effective to Developing, in a single year is capricious and inconsistent.

3) Different state measures of effectiveness contradict each other.
My “Developing” rating as a Grades 4-8 principal is based on the performance of our eighth graders on the Common Core examinations administered in April 2013. As Ken Slentz, deputy commissioner, explained in this March memo, only students achieving at Level 3 or 4 on these exams are considered to be on a college and career trajectory. (This is the same memo that informed us that only one-third of New York State students would demonstrate proficiency on exams that were still weeks away from even being taken!) Well, how did our eighth graders do on the mathematics examination? Only 39 percent of them earned a Level 3 or Level 4 designation. Clearly, I must be an ineffective leader to have such low performance among my students!

Not so fast. At our school, nearly 90 percent of our eighth graders also take the high school Integrated Algebra examination. Our overall passing rate on this exam was 97 percent last year; the passing rate of the eighth graders who took the Algebra Regents examination was 99 percent. A few years ago, the state established “College and Career Readiness” passing thresholds for these exams. Despite the fact that these thresholds and the correlational study on which they are based have been discredited, Commissioner King insists that only students who achieve the Aspirational Index score of at least 80 on the Integrated Algebra examination are college and career ready. So how did our eighth graders do against this measure? Fully 73 percent of them scored at this higher threshold. Of even greater concern, nearly 88 percent of the students who earned Level 1 or Level 2 scores on the eighth-grade assessment passed the Integrated Algebra Regents at the “college and career ready” threshold of 80. How do I tell a student that he was not proficient in April but met the commissioner’s College Ready standard for graduation two months later?

State education officials have created a system of contradictions, mixed messages and harmful outcomes. This is what happens when one throws an entire state into a chaotic system of mandates and practices that are not thoughtfully planned and are not grounded in best practices. Yet I’m the one being labeled as a developing leader.


Valerie Strauss covers education and runs The Answer Sheet blog.
