POLLSTER Lou Harris asked 1,254 adult Americans last January whether they agreed or disagreed with the following statement: "When (Bernhard) Goetz said in his confession that he used dum-dum bullets, that he was sorry he didn't gouge out the eyes of the four he shot, and that if he could have reloaded his gun fast enough" -- you can pause for breath here -- "he would have taken out after them, he looks more like a 'Death Wish' gunman out stalking to kill criminals, not an innocent victim just trying to defend himself."

The wonder is not that a majority agreed with the statement, but that 38 percent of the respondents had the gumption to disagree. Or perhaps they just didn't catch the reference to a years-old Charles Bronson movie.

This is an example of a practice increasingly common in polling -- and in particular in Harris polls -- of telling respondents things they probably don't know in order to see what they think when they do. Sometimes that's useful. More often it's dangerous, because the results can be easily distorted by the order in which questions are asked or by wording questions in a way that is, inadvertently or not, biased.

As an example of the latter, consider the Harris Survey on Nicaragua last month. After an 89-word introduction, the first question asked was: "How concerned are you that the U.S. will end up sending American troops to fight Nicaragua -- highly concerned, somewhat concerned, not very concerned, or not concerned at all?" Mr. Harris reports that 81 percent are concerned, though that includes 31 percent who are "somewhat" concerned. Rep. Michael Barnes (D-Md.) tells colleagues "our constituents clearly recognize where the administration's policies are leading: 81 percent of them say they are concerned that the U.S. will end up sending American troops to fight Nicaragua."

Well, not exactly. More important, Republicans may well be right when they argue that raising the specter of American military involvement early, and never mentioning the Sandinistas' communist ties, biases all the responses to later, apparently neutrally worded questions.

How can people and politicians protect themselves against biased questions and misleading interpretations? First, they should insist on reading the actual question before they take a poll's word for anything. It's not hard to detect a leading question.

Second, they should look askance at poll results that seem to prove too much. Most ordinary citizens don't have a view on the motion to recommit subsection 6(b)(4). And even if you tell them what subsection 6(b)(4) says, their response is, literally, off the top of their heads. It may not be a good indicator of what they would think once they had really thought about the issue.

Finally, they shouldn't expect the public through polls to resolve contradictory views. Voters want a balanced budget and no cuts in Social Security, a high level of defense spending and no cuts in aid to the truly needy. They hire politicians to figure out how they can get all these good things -- or acceptable levels of each of them -- at the same time. Polls provide useful information for politicians and others, but they don't give easy answers.