Here are the percentages, by race or ethnicity, of the total number of offers for fall 2019 to these eight specialized schools: 51.1 percent were Asian, 28.5 percent were white, 7 percent were unknown, 6.6 percent were Latino, 4 percent were black, 2.3 percent were multiracial and 0.6 percent were Native American.
According to the New York City Department of Education website, 40.5 percent of the city’s public school students are Hispanic, 26 percent are black, 16.1 percent are Asian and 15 percent are white.
The issue was discussed Wednesday at a New York City Council meeting, where schools Chancellor Richard Carranza made clear that he is not a fan of a single test carrying such high stakes.
The state Assembly is holding a hearing on the issue in a few days.
This post looks at the test that is given to students for admittance to these specialized schools. It was written by Akil Bello, who has spent nearly three decades researching, analyzing and helping students prepare for the test. Bello is the founder of a test prep company called Bell Curves, and is an author and a veteran test prep tutor.
By Akil Bello
New York City was rocked in March (as it has been in previous years) by the appallingly low number of black and Hispanic students admitted to the specialized high schools, which are public selective high schools that in many parts of the country would be called magnet schools.
This year, though, seemed different, with some city officials calling for changes to the admissions policies at these schools to redress the deep segregation in America’s largest school district.
The recent installation of a new schools chancellor, Richard Carranza, who seems willing to change the admissions policy, and Mayor Bill de Blasio’s proposal to change the admissions process have given some people hope. They have also stirred intense fear among those who believe using one test as the sole determinant of “merit” is legitimate.
Two key questions about the Specialized High School Admission Test (SHSAT) have not received enough attention in the current debate.
First, is the SHSAT a good test? Second, is using a test, even if it’s good, as the sole basis for admission a good idea?
The answer to the second question is easy. No.
No one should use a test score in isolation to determine who should be admitted to a school, which is likely why no one but New York’s specialized schools does it. The American Educational Research Association, the American Psychological Association, and the National Council on Measurement in Education all recommend using “multiple sources and types of relevant information” to make educational decisions.
The College Board, which owns the SAT college entrance exam, and ACT Inc., which owns the ACT test, have long insisted that colleges should use test scores as only one valuable piece of information among others. If a holistic approach to admissions is good enough for Stanford, Caltech, Phillips Exeter and Thomas Jefferson, shouldn’t it be for Stuyvesant and Brooklyn Tech?
The harder question to answer is whether the SHSAT is, in fact, a good test.
The New York State Department of Education has released only one validity study in the past 30 years and no technical manual. That study was conducted in 2013, on what can only be described as a different test. The SHSAT has undergone many substantial changes since 1991.
It’s been revised twice in the past two years alone. The findings of the 2013 study, which did show a “strong positive predictive relationship” with student achievement during the first two years of high school, as measured by grades, scores on AP tests, and Regents exams, were for a test that no longer exists. All these revisions point to a bigger question, though: If something ain’t broke, why do you keep fixing it?
Answering that question is hard, too, because the city releases no copies of the exam after they are given. All the major test makers for college and graduate school admissions make retired exams and test questions readily available, but New York City’s Department of Education does not.
The company that writes the SHSAT is a subsidiary of Pearson Inc., and it says it produces up to 18 different test forms for administration on four different days. If it is not reusing exams once they are given, why not release them free so families could have useful practice materials and see what the test is really about?
It takes between 2 1/2 and 3 years for a single SAT question to be written and vetted before it becomes a question on the exam, and, according to the College Board, only about 50 percent of written items are actually used.
The most recent changes to the SHSAT were announced in the fall of 2016, a vendor was selected later that year, and the test was administered in October of 2017. It’s possible no mistakes were made, but that wouldn’t be thanks to the amped-up production schedule for the SHSAT.
Pearson has some history, however, of making mistakes. In 2012, the seventh-grade New York State English Language Assessment included a nonsensical passage about a pineapple racing a hare and asked students to answer questions with no clearly correct answer.
More recently, Pearson included a question on the Massachusetts Comprehensive Assessment System (MCAS) that asked students to write from the point of view of a racist slave owner. Pearson has lost contracts for state assessments in New York, Texas, and North Carolina. It’s hard to imagine that the intense production schedule of the SHSAT has decreased the likelihood of similar errors in the design of the test.
As a parent, I check my children’s work for mistakes. Shouldn’t Pearson be subject to that same level of scrutiny?
And why does anyone believe that a single test score is a more valid predictor of success than the combination of grades students receive from teachers, the results of statewide assessments, and the strength of a school’s curriculum?
To get an understanding of how much the test has changed, here’s a partial history of the Specialized High School Admission Test. Click on the chart for a full view: