Late last month, the College Board announced that it was dropping a plan it had introduced a few months earlier to provide colleges with a single numeric rating of the adversity that students who took the SAT faced in their communities.
The authors of the following post argue that despite its problems, the idea that the College Board introduced was a good one, and that some version of it could help improve “an unconscionably unjust process” — that being the college admissions process.
The writers are Brennan Barnard and Richard Weissbourd. Barnard is the college admission program manager at the Making Caring Common project at the Harvard Graduate School of Education and director of college counseling at the private Derryfield School in New Hampshire. Weissbourd is a senior lecturer at the Harvard Graduate School of Education as well as the faculty director of the Making Caring Common project and the co-director of the school’s Human Development and Psychology Program.
By Brennan Barnard and Richard Weissbourd
How do we measure human potential? And how do we do it in a way that’s both accurate and just, taking into account an individual’s advantages and disadvantages? Schools, college admissions offices, businesses and many other professions have always struggled with these questions, and now the College Board, the organization that owns the SAT, Advanced Placement program, and other educational assessments, finds itself in the thick of them.
In May, College Board officials announced an Environmental Context Dashboard (ECD) tool — and were immediately swamped with criticism. As originally designed and piloted, the ECD was a numerical snapshot, drawing on College Board, census, and federal education data, that rated on a scale of 0-100 the potential adversity or disadvantage a student has faced, to give college admissions offices context for evaluating that student.
It distilled a student’s socioeconomic circumstances into one score, using public data on crime, poverty, income, housing, family structure, employment, education resources and other factors. Critics argued that some of this data is problematic and that the concept itself is wrongheaded. They said that some students’ adversity scores would be too high because they are well off but live in neighborhoods, or attend schools, with significant levels of poverty and violence, while other students’ scores would be too low because they face serious family burdens and adversities yet live in low-poverty neighborhoods and attend well-resourced schools.
College Board President David Coleman recently responded with updates to this tool, rebranded as “Landscape.” It will now provide separate scores for a student’s neighborhood and school, and students will also be able to view their scores. Undoubtedly, criticisms will persist, in part because neighborhood and school scores are still rough, flawed measures of any individual student’s challenges.
Yet admissions officers appear to recognize these limitations and will use these scores as data points among a broad array of factors in reviewing applications.
Moreover, lost in the criticism is that the index is right in concept. It works to make an unconscionably unjust process more just, to level the playing field for students facing disadvantages. It’s not fair to ignore these disadvantages when assessing students’ grades and test scores.
And when it comes to the SAT in particular, the deck is wildly stacked. Some students have access to great schools and high-priced SAT tutors starting in the ninth grade, while many other students not only lack access to tutors but are crammed into classrooms with too many students and far too few basic school resources and supports.
Let’s not give up on basic equity. Let’s instead get better at assessing and weighting disadvantage.
How? Admissions offices need better information about the adversities and burdens of individual applicants, and they need good ways of assessing and weighing that information. What if — in addition to using neighborhoods, schools or parental education as rough proxies for level of adversity — we did something more straightforward: ask students in applications about the particular burdens and adversities they face?
Getting B’s and C’s and doing okay on the SAT is impressive when you’re also working 20 hours a week, but most college applications are not capturing that context. It’s also not fair that students can get credit for community service for painting a house in Belize one summer but not get service credit for supervising a younger sibling or taking care of a sick relative 15 hours a week year after year because they’re not prompted to report that information on an application.
Over the last few years, the Turning the Tide initiative at the Harvard Graduate School of Education’s Making Caring Common Project has been working on this problem. We are collaborating with college admissions offices to increase the number of students reporting these responsibilities and burdens. That’s challenging — students may feel stigma in reporting burdens or adversities, may not think to report certain types of adversities or responsibilities, or may not imagine that this information is valued. Furthermore, many admission staffs don’t have clear, carefully considered ways of assessing and weighing these responsibilities and adversities in application review.
But there’s much that college admissions offices can do: underscoring in applications the importance of reporting responsibilities and adversity, offering specific guidance — including concrete examples — for reporting them, and assuring students that their responses will be treated confidentially.
There is the will. Both our Turning the Tide report, endorsed by almost 200 colleges, and our Deans’ Commitment Letter, endorsed by more than 140 college admissions offices, highlight the importance of weighing substantial family commitments and contributions and encouraging students to report these commitments in their applications. Some colleges and universities that have endorsed Turning the Tide, such as Georgia Tech, are taking more concrete steps.
While Georgia Tech is piloting the ECD with the College Board, the Georgia Tech application also prompts students to answer this question:
“Tech’s motto is Progress and Service. We find that students who ultimately have a broad impact first had a significant one at home. What is your role in your immediate or extended family? And how have you seen evidence of your impact on them?”
Our hope is that many more college admissions offices will try out different types of prompts for eliciting this information, track responses, and share their findings with other admissions offices to see what prompts are most effective. We have been collaborating with the Coalition for College to pilot new questions and approaches for students to share information about their unique context, background and responsibilities. We have also begun collaborating with the Common Application and its Reach Higher initiative on short- and long-term changes to advance this work.
To be sure, whatever methods colleges use to elicit and factor this individual information into application review will have flaws. There is no perfect assessment. We are a long way from assessments that faithfully capture the complexities of any individual’s potential. That’s why assessments are easy targets for critics.
It is also very hard to measure and weigh certain burdens and adversities — a parent suffering serious depression, for example, or a sibling with a drug addiction. But let’s not throw the baby out with the bathwater in our criticism of Landscape. Let’s instead continually seek better ways. Equity and justice demand it.