
SAT Scores
With Brian O'Reilly,
Executive Director, College Board SAT Program

Tuesday, Aug. 28, 2001; 1 p.m. EDT

National SAT scores are set to be released today, reflecting how well America's youth perform on one of the standard measures for college admission.

Brian O'Reilly is the director of the College Board's SAT program, which includes the SAT I and II standardized tests.

The transcript follows.

Editor's Note: Washingtonpost.com moderators retain editorial control over Live Online discussions and choose the most relevant questions for guests and hosts; guests and hosts can decline to answer questions.


washingtonpost.com: We would like to welcome Brian O'Reilly, Executive Director of the College Board SAT Program. Brian, with SAT scores on a slight rise over the past ten years, what do you think the scores will look like in the next ten years?

Brian O'Reilly: Thank you, it's a pleasure to be here. It's hard to say what will happen to SAT scores over the next decade. In the early '80s they were declining, and throughout the '90s (and over the past two years) they have been slowly rising. I guess I would expect them to continue going up.

Alexandria, Va.: Compare current scores to what this year's scores would have been had the Board not "recentered" the scoring (in 1995?).

Brian O'Reilly: When the College Board recentered SAT scores, back in 1995, Verbal scores went up 70 points and Math scores about 25. So, if recentering had not occurred, the current Verbal average of 506 would have been about 437 and the current Math average of 514 would have been about 494.

Laurel, Md.: I once talked to someone whose job title suggested he should be knowledgeable about this, who said that the College Board essentially decides a priori each year what it wants the national average score to be, based on estimates of how many and what kind of students it expects to take the test, and then designs the test to produce that average score.

Is this correct? If so, doesn't it make the idea of comparing national average scores across years largely irrelevant?

Brian O'Reilly: Laurel, I'm not sure who told you that, but it's absolutely untrue. The test content changes very infrequently, and the average level of difficulty of the questions has been the same for many years. There are statistical processes, called "equating," that guarantee (as much as you can) that scores from one form of the SAT mean the same thing as scores from any other version. So comparisons across years are meaningful.

Alexandria, Va.: With all due respect, I think the SAT's value as a predictor of educational success is highly questionable. I speak from personal experience. I did very poorly on the SAT both times that I took it, but I went on to attend a top 25 university, graduated Phi Beta Kappa and have enjoyed a very successful professional career.

Why do you continue to insist that the SAT should be the gatekeeper for the nation's universities and colleges?

Brian O'Reilly: There is no indicator (whether it's SAT scores or high school grades or an application essay or an interview) that is 100% accurate at identifying which high school students will do well at which colleges. Thank God. Human nature being what it is, you wouldn't expect otherwise. The two best predictors of success in college, however, are high school grades and SAT scores. They both do a good job at predicting which students will do well, and together they do the best job. Colleges do validity studies to measure how well all the criteria they use in admitting students are working. The SAT works well. That's why they keep using it. But colleges will tell you that they try not to use it (or any other criterion, including grades) as a "gatekeeper." Believe it or not, colleges are looking for reasons to admit students, not to keep them out.

Plano, TX: I'm glad that they are actually reporting the true state of American education and admitting that things have been getting better for a while. As the student population mix broadened beyond people with stronger educational backgrounds, some scores went down; now, as schools and backgrounds improve, the scores are going up. American education could be better, but it's absolutely not falling apart.
Test scores are up, violence is way, way down, and even teenage motherhood is declining. The real story is the success of the modern education movement, not its failure.

Brian O'Reilly: You are correct, in my opinion, that American education is not "falling apart." But you are also correct that it "could be better." Average SAT scores deliver both those messages. Across the country, scores (especially for Math) have gone up in the past decade. However, the average scores for African American and Latino students lag well behind scores for White and Asian American students. That's the "could be better" part and the part that should continue to have us worried.

Washington, DC: Today's Washington Post story about SAT scores stated that there are "nagging disparities" between the average scores of Asians and whites on the one hand, and all other minorities on the other. Why do you think people assume that there should be precisely the same distribution of scores across racial/ethnic lines? Has such a distribution ever been achieved with any standardized test? Is there any evidence to suggest that it can be?

Brian O'Reilly: I think the American people believe strongly in the notion of equal opportunity. Public education is designed (in part) to provide that equality of opportunity. When you see differences among racial/ethnic groups in an educational measure, like the SAT, it's even more disturbing to many people than differences in economic measures. As long as there are differences in educational opportunities, by sub-group, we'll continue to see SAT score differences as well.

Gaithersburg: Are SAT prep courses useful for students all across the score spectrum? That is, if taking a course will raise one kid's score from, say, 950 to 1100, will it raise another's from, say, 1300 to 1450? Or does a kid who already tests well already have the skills the course teaches?

Brian O'Reilly: Coaching appears to work best for students who have the least familiarity with the test. That's not surprising. Students who spend time reading directions, or figuring out how to respond to a certain type of question, are losing the time they need to answer questions. However, after gaining a certain amount of familiarity, the gains associated with coaching seem to fall off pretty dramatically. My advice, to parents who ask, has always been that there are better ways for you to spend your money and for your children to spend their time. All test-takers should be familiar with the SAT before they take it, but that can be done by taking a few practice tests.

Arlington, VA: How do you think the explosion in private SAT tutoring and classroom SAT tutoring has affected and/or will affect the fairness of the SAT?

Brian O'Reilly: I have friends who tell me that their son or daughter benefited greatly from coaching. I have other friends who tell me that the coaching did no good. Remember, most students will see an increase in their scores just by repeating the SAT -- whether they had coaching or not. Scientific studies have shown that, on average, coaching will further increase scores by anywhere from 10 to 30 points. This is within the "standard error of measure" on the test. Admission officers know how precise the SAT is. A difference of 10 to 30 points should not be the difference between getting in and not getting in to a college.

Michael McCabe, Washington, DC: I have two sons (a high school senior and a freshman), and over the years their percentiles have been relatively constant from one standardized test to another. It seems that one could project a likely SAT score from the results of standardized tests taken in grade school. Isn't more being made of the SAT than it deserves?

Brian O'Reilly: Interesting point. You are correct that most students will be consistent across time (elementary school to high school) and across measures (standardized test scores, grades). However, I also think most students and parents will want to be evaluated on something current rather than something from their past. Students do change. Some who are getting only A's in middle school will have difficulty with high school course work and see their grades decline. Others are late bloomers, who will see increases -- in both grades and test scores.

Towson, MD: Is today's technology helping kids do better on the SATs? I was considered above average when I earned a combined score of 1200. Nowadays, more and more average kids are trying for perfect scores and succeeding.

Brian O'Reilly: I'm assuming you earned your 1200 before 1995, when the SAT was recentered. If that is the case, your 1200 is equivalent to about a 1270 today. And, even with recentering, a 1200 is well above average. About 20% of College-Bound Seniors in the class of 2001 earned 1200 or above.

Washington, DC: How important are the SATs? If my son takes advanced placement (AP) classes, has good grades, is active in clubs and organizations, but has only marginal scores, what does that do to the chances for scholarships and acceptance in first-choice schools?

Brian O'Reilly: Every kid is different. Every college is different. The SATs are not unimportant, but they are not as important as the courses students take in high school and the grades they earn.

Washington, D.C.: Since there is no real drop-off in this year's scores, would you say that the tests are getting easier or that students are getting smarter?

Brian O'Reilly: The test is not getting easier. We have statistical methods (called equating) to control for that. Are kids getting smarter? Hard to say. A 1-point increase in Verbal scores reflects a very small change in the performance of the total group. And remember that the SAT measures reading and mathematical problem solving, but it does not measure a lot of other things that go into whether students are "smart." This is just one indicator.

Bethesda, MD: Hi, are you aware that some states (e.g., Georgia) use SAT I scores to let children aged 10 to 13 attend college? The so-called prodigy rule states that any kid younger than 15 who earns a better SAT I score than 95% of high school graduates has the right to attend the college of his choice in Georgia.

What is your opinion about such use of SAT I scores? Do you have any data on how those children actually progress in college?

Brian O'Reilly: I'm not aware of data indicating how such students do in colleges in Georgia. I wasn't even aware of this policy. I know that lots of 7th and 8th graders take the SAT for "talent search" programs run by colleges. (Johns Hopkins in your area.) These children, usually aged 13 or 14, can attend summer programs at colleges, based (in part) on their SAT scores. But they are not enrolled as college students -- just taking a summer program.

Rockville, Md.: My daughter will be taking the SATs for the first time in November. She will be a junior. We have decided to let her take the test the first time cold, without any courses, and see how she does. If necessary, after her initial try, we will decide if she needs additional help. From your experience, do these courses do any good, or do they just teach you how to take a test and recognize the wording, etc.?

Brian O'Reilly: When you say "take the test the first time cold," I hope you don't mean without any practice whatsoever. Has she taken the PSAT/NMSQT to become familiar with the format of the SAT and the kinds of questions asked? There are a couple of question types that your daughter would benefit from knowing about in advance. But, after sufficient practice, I think she will do as well on the SAT as she would after a coaching course. In my opinion, the courses work best for the kids who won't do the practice unless they're in the structure of a classroom setting.

Washington, D.C.: Regarding the extent to which Hispanic scores are too low, I think that the many schools, government agencies, and even private companies that cater to Hispanics by providing services in Spanish are doing the Hispanic community a huge disservice. Many immigrant groups from around the world have come to the U.S. without special language provisions in schools and government, and were forced to adapt through immediate immersion, which I think is far better for getting better test scores, better grades, better jobs, and eventually the best integration into American society. Would you care to comment on the extent to which that might be a factor?

Brian O'Reilly: We don't know which test-takers are in bilingual courses and which are not. We only know which ones grew up with a language other than English. Those students don't do as well on the Verbal part of the SAT as students who grow up speaking English. But they do just about as well on the Math. How much (if any) of this is due to bilingual classes is anyone's guess.

DC: I took the SAT in 1988 and was wondering how that would compare to scores today. How often are the SAT scores recentered?

Brian O'Reilly: The SAT was first put on scale in 1941, and then again in 1995. The scale will probably not be changed again for at least another decade -- or longer. In 1995, Verbal scores increased by about 70 points and Math by about 25. So you would have scored nearly 100 points higher if you'd waited to test until after 1995.

Castleton, VA: When are educators going to quit trying to close the achievement gap and face up to the findings of "The Bell Curve"? Certain races are on average smarter than others. You can't change that.

Brian O'Reilly: If that's true, then you wouldn't expect to find members of that race at the top of the curve, would you? But members of all races do very well on the SAT, and members of all races do poorly on the test. I'm sorry you believe in the myth of eugenics. It's bad science and has been disproved many times.

Rockville, Md.: Re: Taking the test. Yes, she has taken the PSATs and scored quite high. I would expect no child to take what many people have decided is a life-determining test without adequate preparation.

Brian O'Reilly: Good to hear that. As a parent, I totally agree.

washingtonpost.com: Read about the latest numbers and trends on SAT scores.

Arlington, VA: Is the SAT an intelligence test or a measure of the degree of a student's learning, or both combined together?

Brian O'Reilly: The SAT is not an intelligence test. Its origins, back in the '20s, however, were in the intelligence-testing movement. The SAT measures what you have learned, both in and out of school. It is not curriculum-based. That is, it does not reward students who have taken a particular class or used a particular textbook. It measures "developed reasoning ability," mostly by assessing how well you can interpret what you've read and how well you can use math to solve problems.

Vienna, VA: Some say that SAT is just a measure of the ability to take tests and is not measuring how "smart" the child is. Why should this be an indicator in colleges when kids apply?

Brian O'Reilly: The SAT is used by colleges because it works. They look at SAT scores, and then they look to see which students were successful at their college. They compare the two, and understand that the SAT has done a good job of predicting which students would be successful. The grades a student earns in high school, and the rigor of the courses a student has taken, are also good predictors.

Ft. Myers, Florida: Is it not true that the SAT underpredicts the college performance of females and overpredicts grades for boys? Is this not the classic definition of test bias? Why should colleges continue using an exam that its own manufacturer admits is flawed?

Brian O'Reilly: You are correct that SAT scores, on average, slightly overpredict how well boys will do in college and (by definition) slightly underpredict how well girls will do. In other words, if two students earn the same SAT score and go to the same college, (all other things being equal) the girl is likely to earn slightly higher grades than the boy. But grades reflect more than just academic ability. They also reflect whether you show up for class on time, whether you participate in discussions, whether you do the out-of-class work and submit it on time. These are things that girls, on average, do better than boys. And they're not measured on the SAT -- or on any other test, for that matter. Colleges should not use the SAT in isolation, but in conjunction with high school grades.

Boston, MA: Where can we get more information on state-by-state SAT performance for the 50 U.S. states? Is there also a place where we can find how U.S. schools are performing compared to other developed nations?

Brian O'Reilly: State SAT scores are on our website, www.collegeboard.com. As for comparisons with other nations, you could try the U.S. Department of Education's web site, especially the section from the National Center for Education Statistics.

Washington, DC: I worked for a private SAT tutoring company for several years and saw much more than the 10-30 point increases you mentioned previously. We saw anywhere from 100-400 point increases, 100-150 being more typical. How can you say that the test isn't biased towards rich kids, when this is the case?

Brian O'Reilly: When you saw a 100-point increase, what did you use as the base? A previous SAT, or your own test? If it was your own test, you have no way of knowing whether the score on that test was comparable with a real SAT score. If it was a previous SAT, then you have to take into account that all students see an increase on repeating the test, whether they are coached or not. And did your "average" increase take into account the kids whose scores went down? If not, then it wasn't really an average. Finally, no one is saying that "rich kids" don't have advantages in life. They go to good schools. They have educational resources aplenty in their homes. They travel widely. They know (or can be taught) how to take best advantage of the college admission system. But admission officers know all of this. They also take students' backgrounds into account. Does it work perfectly? Is the college admission process totally fair? Of course not. But it works well.

Richmond, Virginia: How can the College Board continue to claim that test prepping services only have minimal success, especially when your organization itself is hawking the same kind of stuff?

Brian O'Reilly: Preparing for the SAT is not unimportant, but it does have minimal effect -- on average. Are there kids whose scores increase by 100 points? Sure. Are there kids whose scores decrease by 100 points? Yes. (You just don't hear from the coaching companies about them.) And the College Board is not "hawking" test prep. We have an obligation, which we take seriously, to prepare students for the SAT. We give every student who registers to take the test a copy of "Taking the SAT I." It contains useful tips for taking the test, as well as a full-length practice test. We sell "10 Real SATs" in bookstores, where students can get additional practice at a reasonable cost. (I believe it's $17.95.) We have additional free test prep on our web site, www.collegeboard.com. We encourage schools to participate in the PSAT/NMSQT, where students can take a low-stakes version of the test, which also serves as practice for the SAT. Practice is important -- but students and parents need to keep things in perspective.

washingtonpost.com: Unfortunately, that is all the time we have for today. We would like to thank our guest, Brian O'Reilly, for joining us.

© Copyright 2001 The Washington Post Company

