Several years ago, I had a chance to talk with a friend who had been an intelligence analyst during the Cold War. (He would rather I did not use his name in the paper; many old-timers are reluctant to go public, even in retirement.) We talked about intelligence analyses he had worked on. Some of the assessments were successes. Others, like the forecasts of Soviet strategic nuclear forces published in the 1950s and 1960s, were famously off the mark.

I asked my friend whether he thought the muffed estimates had caused real harm. Sure, he conceded, but one needs to keep the role of intelligence in perspective. After all, he reminded me, "No one ever stopped a policy meeting by waving an intelligence report in the air and shouting, 'But this is what the estimate says!' "

My friend's point was that intelligence is just one ingredient that shapes policy. It is supposed to inform decisions, not dictate them. This is only prudent. Intelligence assessments are rarely clear-cut. Usually there is a judgment call to be made. The responsibility for filling in those gray areas properly belongs to elected officials and their appointees.

I was reminded of that conversation by some of the recent controversies over U.S. intelligence. There have been several intelligence miscues lately--the surprise Indian nuclear test in May 1998, the surprise North Korean missile test last December and the targeting of the Chinese Embassy in Belgrade all come to mind. But the flap over the intelligence used to justify the bombing of the El Shifa Pharmaceutical Industries Co. in Khartoum, Sudan, just over a year ago, stands alone, because the problem seems to have been not just faulty intelligence but also a failure in the judgment of the officials using it. The importance of skillfully filling in the gray areas became clear, first, when U.S. officials decided to strike the plant and, second, when they tried to explain their decision to the public.

The basics of the El Shifa episode are by now well known: On Aug. 7, 1998, terrorists bombed U.S. embassies in Kenya and Tanzania, killing more than 200 people. Information from a suspect apprehended in Dar es Salaam, Tanzania, and other intelligence gathered by investigators implicated the terrorist network led by Saudi expatriate Osama bin Laden. U.S. officials believed that bin Laden was using the El Shifa plant to produce VX, a nerve gas agent. Less than two weeks after the embassy bombings, 13 Tomahawk cruise missiles were fired at the plant. According to news reports, U.S. officials later told congressional committees that intercepted communications and financial transactions linked bin Laden to the plant, and that a soil sample secretly collected outside the plant by U.S. intelligence contained traces of EMPTA (a precursor chemical for VX).

The strike on El Shifa was unusual. The United States has carried out half a dozen similar airstrikes since the Clinton administration took office (not counting the past year's ongoing operations over Iraq). In each of the other cases, though, the operation was part of a larger war, or was direct retaliation against someone who had attacked the United States or its citizens. This time, U.S. forces struck a country simply because a facility in its territory was linked to a terrorist group. The attack could set a precedent for how we deal with terrorist organizations in the future--but it is a dangerous precedent.

The trouble, as we have since discovered, is that the link between bin Laden and El Shifa was not as close as officials first suggested. Critics (including the owner of the plant, Saudi businessman Saleh Idris, who is suing the U.S. government for damages) have presented a substantial case of their own. It turns out that many people--including several U.S. citizens from religious organizations and the business community--were familiar with the plant. Some had even been inside and found nothing other than the activities one would normally expect at a medicine factory. Moreover, the financial links between bin Laden and the owners of El Shifa were at best open to dispute. In other words, there were many gray areas that required judgment, and, in this case, judgment seems to have failed.

Such judgments were not so crucial during the Cold War. Military planners might keep a factory in Czechoslovakia on their strategic target list because it was believed to make parts for Soviet tanks, even though no one could actually prove what was inside. But the planners were not about to launch a missile at the factory unless there was a nuclear war. Today, as the El Shifa case demonstrated, our suspicions--mistaken or not--might actually be what triggers an airstrike. As a result, prudent judgment is more important than ever.

When U.S. officials tried to justify their judgment in approving the raid, more problems emerged. Officials have often used intelligence to account for their actions after the fact. In 1962, U.S. ambassador to the United Nations Adlai Stevenson brought U.S. aerial reconnaissance photographs to the Security Council to confront his Soviet counterpart and justify the U.S. blockade during the Cuban missile crisis. Twenty-one years later, Jeane Kirkpatrick brought to the Security Council intercepted radio transmissions--"The target is destroyed"--as proof that the Soviets had downed an errant Korean airliner near Sakhalin Island. The Reagan administration also released reports of intercepted cables between Tripoli and the Libyan People's Bureau in East Berlin to prove Libya's role in the 1986 bombing of a nightclub frequented by American soldiers, which resulted in a U.S. air raid against Libya.

The problem is that, after the strike on El Shifa, U.S. officials tried to use intelligence as though it were evidence in a court case, and intelligence is usually poorly suited for that task. In a court, standards of proof are stringent--one must show a "preponderance of the evidence" in civil actions, or "evidence beyond a reasonable doubt" in criminal cases. The real-world standard for intelligence more often is "give me what you've got; I need it now." We expect officials to fill in the gaps with surmises and hunches if necessary--something we would never accept in a court proceeding.

What's more, it is particularly hard to use intelligence as evidence supporting a policy decision after the fact without compromising intelligence sources. It is telling that lawyers talk about "full and complete disclosure" and "the process of discovery by the opposing legal team," while intelligence officers talk about "limiting access based on the need to know." Trying to settle public controversies by citing intelligence data will inevitably make collecting intelligence more difficult in the future.

With each round of disclosure in the Sudan incident, more U.S. intelligence sources have been exposed. Now, for example, our adversaries have a greater awareness that U.S. intelligence tracks electronic financial transactions to identify supporters of chemical and biological weapons proliferation, and chemically analyzes soil samples covertly collected near suspected facilities. One can be sure that future would-be proliferators will protect their banking records better and be more careful about the chemical traces they leave around their facilities.

Officials need to use intelligence, make their best judgments--and then accept the public consequences. This really isn't a new issue. For example, it was U.S. intelligence that originally fingered Julius Rosenberg in 1950 as a Soviet spy. But U.S. Justice Department officials never revealed that the real source of their information was intercepted and deciphered cable traffic--a fact that the intelligence community confirmed just a few years ago. Instead, they took the time to build a legal case, independent of the intelligence--and took the heat when critics pointed to the gaps in the evidence.

Even when intelligence has been pivotal to major policy decisions, in the past U.S. officials understood that they should reveal as little as possible--even in the face of public pressure. Take, for example, the event that triggered the U.S. entry into World War I, the Zimmermann Telegram, an intercepted message that exposed a German plot against the United States. (The Germans promised Mexico the land it lost in the mid-1800s in exchange for Mexico's assistance if the United States entered the war.) U.S. officials released the telegram to the press without revealing that it had been intercepted by British intelligence and secretly supplied to the Americans. Even so, the resulting public reaction, combined with political leadership, was enough for a declaration of war.

So, here are a few rules of thumb for officials using intelligence to make national security decisions:

First, do not try to hide behind an intelligence estimate when you are really making a judgment about policy. Intelligence analysts are soft-skinned and make poor human shields. Eventually you will need to deal with the flak yourself.

Second, any time you use intelligence as evidence, you can be sure that it will be subjected to unrelenting scrutiny by the public and the media. They will be able to investigate at their leisure and, operating after the fact, may have better information than the intelligence community did when it made its assessment. Remember that our adversaries will be able to mount their own case to refute our evidence--sometimes using valid data, sometimes bogus. Be prepared to be second-guessed, and be frank in explaining when and why you made your best judgment.

Third, never cite a specific source of intelligence data to support your decision unless you are willing to compromise and lose the source. Bear in mind that unconvincing explanations are bound to be leaked. The intelligence sources used to justify a strike against terrorists are usually the same ones used to detect terrorist strikes, so such practices could make us more vulnerable in the future. Ask yourself: Is the policy objective worth the loss of the source? Will you ever need that source again? Is there a way of making the case almost as well by citing information that is public? Can you make your case with information that is not quite as good by putting your own credibility on the line?

Finally, when you use intelligence as part of any policy decision, remember that the level of certainty required should rise with the severity of the consequences. Bombing to preempt a terrorist attack should usually require evidence of a clear and present danger. Decide in advance the level of certainty required to take a particular kind of action against an adversary. You will be on more solid ground if you do need to explain your actions later--or defend them in court.

Intelligence will always have gray areas where officials must exercise judgment. El Shifa may or may not have been an intelligence failure. But the record certainly suggests a failure by policy makers.

Bruce Berkowitz is co-author of the forthcoming book "Best Truth: Intelligence in the Information Age" (Yale). He was formerly an analyst at the CIA and a staff member for the Senate Intelligence Committee.