Today is the day when Virginia students will start running what teacher Mary Tedrow calls the state’s “testing gauntlet,” which seems to test endurance at least as much as knowledge. In this post, Tedrow looks at sample questions from the Standards of Learning tests and explains what students need to know about these exams to pass — and graduate on time.
Tedrow teaches high school English, co-directs the Northern Virginia Writing Project and is a National Board Certified Teacher as well as a member and past fellow of the Teacher Leaders Network. She blogs at Walking to School.
By Mary Tedrow
At last, it’s May. For some, that means Mother’s Day, graduations, and the official Memorial Day start of summer.
For high school students in Virginia it means running the testing gauntlet.
This year the tests start in earnest on May 5th with two weeks of Advanced Placement tests. On the heels of that are the End-of-Course (EOC) Standards of Learning tests (SOLs). Throughout the month and well into June the library, computer labs, and every available space will be given over to testing.
For many students, the SOLs truly carry the alternative, street-wise definition of the acronym. (When talking to out-of-state educators about SOLs, there is always an eyebrow-raising moment when the person looks closely at you and then begins to mutter, “Where I come from, SOL means…”) Our Virginia students must pass a certain number of End-of-Course tests in order to receive a valid diploma.
For the student, these are crucial scores.
English teachers are well aware of the importance of a passing score. In the eleventh grade, students sit for the EOC in both reading and writing. Students must pass them both in order to graduate. Everyone must pass them — special education students, recent immigrants (in the country for one year), transfer students, migrants — everyone. Many students — I’m sure you can guess which ones — take these tests multiple times before passing and earning their diploma.
When a test is this important to the student, one would hope it is a solid, informative test that represents the students’ real achievements.
Last year, Virginia introduced new “technology-enhanced” tests promising increased rigor. The tests are harder, of that there is no doubt, judging by the results (lower scores across the Commonwealth) and by student observation. I’m not sure that “harder” equates with rigor, however, unless running farther equates with running faster.
I cannot comment on the actual test since merely reading the test can result in the revocation of my license. I can, however, refer to the practice items the state has released to introduce the “technology enhanced” items to the students.
The practice reading test has 30 items. I have taken it, and it takes quite a while. With the proper software (Pearson’s TestNav), you could take this test yourself.
The left-hand side of the computer screen has the text the students must read; the right-hand side presents one test question at a time. In the practice test, students first answer 10 questions related to a fictional story. In order to read the story, students must click through four pages of text.
The second set of 10 questions relates to a non-fiction piece that is another four pages long.
Then students answer ten questions about a paired passage: two related texts. For the practice test, the paired passages are a fictional piece and a non-fiction piece on a similar subject. To move between the two passages, students must click tabs at the top of the left-hand side of the screen. In addition, they must page through each passage by clicking arrows at its bottom. The first passage is six pages long; the other is three. Nine pages of computerized text must be read to answer ten questions.
For those keeping track: to answer 30 questions, we have had to page through 17 pages of text.
For many of the questions, technology enhancement just means a splashy multiple-choice question. Students select a radio-button next to their choice or drag one of four or five choices into a box.
Others reflect more unusual “enhancements.” In one question, students must drag six items into a Venn diagram to compare and contrast items from the text. The student must properly place ALL items or receive no credit for this single test item.
In other items, students must select answers from a list. Sometimes the directions indicate how many choices the student must make (e.g., “choose two”). Other times there is no indication of how many choices make up a correct answer. Again, the student must select ALL of the correct choices or the entire item is wrong.
The question below caused me considerable anxiety, primarily because I did not know how many choices made up the “right” answer. I waffled, hemmed, and hawed before choosing three.
Which words are synonyms for tenacious in paragraph 7?
Please note that the question only says which words are synonyms (plural). Not every student is going to notice the plural and may simply choose the first synonym they come to and move on. The question also does not indicate how many choices would be right. If the student doesn’t choose the three correct answers (unrelenting, persistent, determined), the question is wrong. If the student overshoots and chooses a fourth (would a tenacious person be considered unfailing?), the question is also wrong. And if the student wants to see the word in context, he or she must page through the text to find paragraph seven. (The words are underlined in the text.)
Here is another question that caused similar anxiety: “Select the statements that are supported by details.” OK. I see I need to choose more than one, but how many more than one?
From the testing blueprint, I know that the actual test has 55 questions (10 on word analysis, 18 on fictional text and 27 on non-fiction). Some items are not figured into the student’s score because they are being field-tested. The school analyzes the “data” from a test we cannot see, sometimes trying to adjust instruction for a question type no longer used while guessing at what students struggled with.
For many of our students the very setup of the test can be an issue. Did the student see the tabs for the paired passages? Did the student select TWO synonyms but not the third? For an English Language Learner (ELL) this would be a great achievement. Vocabulary acquisition is a long process. For example, an ELL student once told me, “I knew the poem was about fall, but I could not find the answer, so I guessed.” She did not know the word autumn, one of the choices. As a test of reading comprehension, the test failed her.
Last year, my entire class spent four hours on the untimed, enhanced reading test. Other students, typically English Language Learners, stayed longer; two worked the entire school day. The reading test is only one of as many as four tests a semester. These students likely returned the following day to take another four-hour test on a computer. Our students are on a semester block schedule and could potentially also have taken four tests in January. Students on a year-round calendar frequently face more than four of these marathons.
Is this a test of reading skill or endurance? Are we trying to trick students or find out if they are competent readers? Are we being sold some fancy tools that have no validity? What, exactly, are these tests measuring? As for their reading levels, the school division and classroom teachers have known that long before the students sit for tests.
Taken earlier in the year, the writing test is now also online and “enhanced” with technology. One practice question provides a passage with the punctuation left out. In its place are designated “hot spots.” Students are given a variety of punctuation marks (commas, semicolons, periods, etc.) and must click and drag them into the hot spots. Not every hot spot is used. Students may have to correctly place as many as four different types of punctuation. This item is worth a single point.
The state test does actually require a written piece by the student, which I find to be a positive. But on the computer, the program allows only 52 lines of type. Should a student exceed the 52-line limit, he or she must go back and trim the writing to fit the space. If that is not done, the student risks turning in writing without a discernible conclusion.
We know this scenario has occurred. Remediated students often describe their experience when looking at their test writing, which is returned to the school. Some have indicated that they either spent time cutting text or became flummoxed when the computer wouldn’t let them write any more. Students are told about the limit in advance, but not all hear, or are present, or…. Proctors are not allowed to say anything other than “Read your directions and do the best you can.”
The writing test, by the way, is the only part of the test that must actually be scored by a human being. Pearson hires temporary workers for this, and they don’t even have to be educators.
If you are to graduate on time, understanding the tools and the testing “enhancements” appears to be as crucial as skill in reading and writing.