Eighth-graders in Chicago just received their high school placements for the next school year, decisions made through a new system that uses an algorithm to match students and schools.

Algorithms — which are step-by-step procedures for solving a problem or completing a task — are used by school districts, governments and other entities to make important decisions. Some have been exposed as racially biased — yet communities often have no idea how they operate and no opportunity to weigh in on their use.

This post looks at this problem and argues for “algorithm transparency.” It was written by Charles Tocci, an assistant professor of social studies education at Loyola University Chicago, a parent and a former high school teacher in Chicago Public Schools. He is also a Public Voices Fellow with The OpEd Project, an effort to expand the range of voices speaking out in the public square on critical issues.

(Clarification: An earlier version said there was no public information on exactly how the student-school matching algorithm works. Chicago Public Schools has information about the process on its website, including the factors that are considered, but it does not show exactly how the algorithm works.)

By Charles Tocci

Nearly 30,000 eighth-graders in Chicago recently learned where they are going to high school next year. The student-school matches were made by a new system called GoCPS, which relies on an algorithm — not educators — to match students to schools.

The idea behind the process is that it will open new opportunities for students who need them most. We don’t know if that will happen. But we also don’t know exactly how the algorithm works — and we should.

Like many of the algorithms used by governments at all levels, there’s little clarity about how GoCPS works and who decided it should work that way. The algorithm is the intellectual property of SchoolMint, a San Francisco-based company specializing in school enrollment software, and it is not disclosing details.
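What is publicly documented is that New York’s high school match, the model for Chicago’s, has been described as a version of the “deferred acceptance” algorithm: students tentatively hold seats at their highest-ranked available school until applicants with higher priority displace them. Whether GoCPS implements it the same way, and with what priorities and tie-breakers, is precisely what has not been disclosed. The sketch below, in Python with invented students, schools, preferences and capacities, shows only how the general technique works, not how GoCPS works:

```python
# A minimal sketch of student-proposing deferred acceptance.
# Every name, preference list and capacity here is invented for
# illustration; this is not SchoolMint's code or CPS's actual rules.

def deferred_acceptance(student_prefs, school_prefs, capacity):
    """student_prefs: {student: [schools, most preferred first]}
    school_prefs:  {school: [students, highest priority first]}
                   (assumes each school ranks every student)
    capacity:      {school: number of seats}"""
    rank = {sch: {stu: i for i, stu in enumerate(prefs)}
            for sch, prefs in school_prefs.items()}
    holds = {sch: [] for sch in school_prefs}      # tentative rosters
    next_pick = {stu: 0 for stu in student_prefs}  # next school to try
    pending = list(student_prefs)

    while pending:
        stu = pending.pop()
        if next_pick[stu] >= len(student_prefs[stu]):
            continue                  # student has run out of choices
        sch = student_prefs[stu][next_pick[stu]]
        next_pick[stu] += 1
        holds[sch].append(stu)
        # Keep only the highest-priority applicants; bump the rest.
        holds[sch].sort(key=lambda s: rank[sch][s])
        while len(holds[sch]) > capacity[sch]:
            pending.append(holds[sch].pop())
    return {stu: sch for sch, roster in holds.items() for stu in roster}

students = {"Ana": ["North", "South"], "Ben": ["North", "South"],
            "Cal": ["North", "South"]}
schools = {"North": ["Ben", "Ana", "Cal"], "South": ["Ana", "Cal", "Ben"]}
print(deferred_acceptance(students, schools, {"North": 1, "South": 1}))
# {'Ben': 'North', 'Ana': 'South'} -- Cal, ranked last everywhere, gets no seat
```

Even in this toy version, the outcome turns entirely on how each school’s priority list is built, which is exactly the part of such systems the public most needs to see.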

The public has a right to know. We need policies that bring the public into the design and oversight of the automated decision-making systems that affect our lives. We need algorithmic transparency.

As a parent of a Chicago Public Schools student, a former high school teacher in the district, and a current education professor, I became concerned about the GoCPS plan.

New York City has been using a similar system for 14 years, and its high schools remain as segregated and as disparate in outcomes as before. Yet Chicago modeled its approach on New York’s and didn’t solicit any public input on the design. Could we really expect it to do any better?

To try to find out, I began to dig into the ways governments use algorithms to automate decision-making. The stories are troubling, especially when it comes to race.

Here in Chicago, the police department employs a highly controversial “strategic subject list” that tries to predict who will be the victim of gang violence as well as who will commit it. The effect has been to profile young Latino and African American men, lumping those with serious criminal records in with those who happen to live on the same block or share a relative. Until recently, the New Orleans Police Department had been secretly using a similar automated program; it abruptly ended the program a few weeks after its existence was made public.

Several states use algorithms to conduct “risk assessments” meant to inform judges of how likely a given convict is to commit future crimes. A ProPublica analysis found that one widely used algorithm systematically misidentified African Americans as far more likely to commit future crimes than whites.
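The disparity ProPublica documented is easy to state in code: among defendants who did not go on to reoffend, black defendants were flagged as high-risk far more often than white defendants. The sketch below, using invented records rather than ProPublica’s data, shows the kind of group-by-group false-positive check such an analysis rests on:

```python
# Group-by-group false-positive rates: the measure at the heart of
# this kind of bias audit. The records here are invented for
# illustration, not drawn from any real risk-assessment tool.
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, flagged_high_risk, reoffended)."""
    flagged_wrongly = defaultdict(int)   # high-risk label, no new crime
    did_not_reoffend = defaultdict(int)
    for group, flagged, reoffended in records:
        if not reoffended:
            did_not_reoffend[group] += 1
            if flagged:
                flagged_wrongly[group] += 1
    return {g: flagged_wrongly[g] / did_not_reoffend[g]
            for g in did_not_reoffend}

records = [("A", True, False), ("A", False, False),
           ("B", False, False), ("B", False, False)]
print(false_positive_rates(records))  # {'A': 0.5, 'B': 0.0}
```

A system can look accurate overall and still fail this check badly for one group, which is why the public needs enough access to compute such error rates at all.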

Political scientist Virginia Eubanks’s new book, “Automating Inequality,” explores what she calls the “digital poorhouse.” From Los Angeles to Indiana to Allegheny County, Pa., Eubanks carefully charts how automated decision-making systems unjustly deprive people of color and people living in poverty of housing, food stamps, and even custody of their own children.

The risk of seriously harming people’s lives has prompted New Zealand and the European Union to strengthen privacy laws in ways that significantly limit the use of algorithms in social programs.

We seem to be handing computers more and more power, sometimes without even knowing it. Given the unintended outcomes and racial discrimination wrought by algorithms, the promise of greater fairness and efficiency just doesn’t hold up.

I believe that the programmers, entrepreneurs and public officials involved all want to tap technology’s potential to create a better society. Unfortunately, in their zeal to do good, they have failed to engage the public in the kinds of democratic deliberation required to sound out underlying biases, protect privacy and craft fair, responsive policies.

There is a tricky balance to strike between the right of companies to protect their intellectual property and the right of the public to be informed. But there are several policy steps we can take immediately to create more transparency for the algorithms our governments use. These include public bidding and, once a contract has been awarded, public forums to solicit residents’ input on how decisions should be made and how outcomes should be evaluated.

After an algorithm is put into use, the company that developed it should be required to create and update a website offering clear, layperson-friendly explanations of how the system works, with illustrative examples. Agencies must also retain the right to audit how companies use the personal data they collect, and to levy fines when privacy rights are trampled. Finally, the provider must be contractually required to change the algorithm to meet the goals generated by public input.

For example, we Chicagoans want stronger, more equitable and less segregated public schools. SchoolMint should be part of the solution by routinely updating the high school application algorithm to help get us there.

I hold out hope that new technologies can get us the schools, police departments and public services we need for a truly equitable society, but we can’t rely on software start-ups to get us there. It will require all of us to be involved, every step along the way.