President Reagan's dream of an effective "Star Wars" system of antimissile defenses is almost certainly doomed to failure, according to a growing number of top computer programming experts who say there is no conceivable way to write and test the software that would be needed to operate it with adequate reliability.

Officials of the Strategic Defense Initiative, as the program is properly called, have long cited computer software -- the programs that instruct computers how to operate -- as their biggest single technical obstacle, but they insist that with enough time and money, it can be overcome.

Some argue that a high degree of reliability is not necessary -- that less-than-perfect reliability will be sufficient, since the Soviets would never have enough confidence to launch an attack against an America protected by Star Wars defenses, even if they knew there might be some bugs in the defense's computers.

Somewhat more tempered optimism is expressed by a panel of computer experts recruited to advise the Pentagon's SDI Organization (SDIO). Some members concede that it is impossible to eliminate software bugs that could make the hardware malfunction, but argue that it may be possible to design systems that quickly isolate malfunctioning components, limiting the damage they can do. Others hold simply that SDI is a research program in its early stages, and that it is too soon to say it can't be done.

Outside SDIO, on the other hand, leading software engineers are mostly pessimistic. They say SDI officials underestimate the difficulty of the software problem and overestimate the capabilities of software engineering. Many say flatly that SDI's goals are impossible to achieve given the current state of the software writing art and that no foreseeable advance within this century will change that.

While lasers and other beam weapons have dominated much of the public perception of the technical side of SDI, relatively little popular attention has focused on the fact that the entire system would have to operate completely automatically, under the control of a network of computer programs that would, all sides agree, comprise the longest, most complex piece of software ever created.

Because the Star Wars system would have to respond so fast and be so highly effective, there would be no time for human intervention, no time even to "wake the president," as one SDI official put it, before committing the United States to war.

The first engagement of a nuclear war -- and perhaps the last -- would have to be entirely under the control of a computer programmed in advance on the basis of assumptions about how the Soviets would attack and how the United States should respond.

Computers linked to orbiting sensors would have to be the first to detect an attack. Computers would have to discriminate between thousands of real weapons and tens of thousands of decoys meant to waste U.S. firepower. Computers would have to calculate the trajectories of all objects in the "threat cloud." Computers would have to determine the nature of the attack and select an appropriate strategy for responding, selecting the highest priority targets and assigning them to orbiting battle stations armed with lasers or other weapons. Computers would have to aim the weapons. Computers would have to verify that Soviet missiles and warheads had been destroyed. And the computers could not "go down" if the Soviets happened to blow up a hydrogen bomb in their vicinity.
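
To give a sense of how much of that chain would have to be written down in advance, here is a deliberately crude sketch in a modern programming language. Every name, value and rule in it is invented for illustration and bears no relation to any actual SDI design.

```python
# A toy, self-contained sketch of the all-automatic sequence described above.
# Every name, value and rule here is invented; no human appears in the loop.

sensor_reports = [
    {"id": "obj-1", "decoy": False, "position_km": (100.0, 40.0)},
    {"id": "obj-2", "decoy": True,  "position_km": (102.0, 41.0)},
]

# Discriminate apparent warheads from decoys (here by a field no real sensor
# would conveniently supply).
threats = [r for r in sensor_reports if not r["decoy"]]

# Assign each threat to a battle station, "aim" and "fire".
assignments = {r["id"]: "station-%d" % i for i, r in enumerate(threats, start=1)}

# Verify the engagement and report the result -- all before anyone could be woken.
print("threats engaged:", assignments)
```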

In a matter of seconds or, at most, a few minutes, an antimissile system of the sort envisioned by President Reagan and SDI officials would have to make all the decisions that in a conventional war would be made by legions of reconnaissance experts, field commanders, generals, the Joint Chiefs of Staff and the commander-in-chief over a period of days, weeks and months.

"People just don't seem to understand that software isn't like most other engineering problems. There are some fundamental reasons why it can never be made reliable enough that you could have confidence Star Wars would really work," said David L. Parnas, one of the computer world's most respected authorities on large-scale programming. Parnas, a U.S. citizen, is a professor at the University of Victoria in British Columbia. "I'm not saying it's impossible. I'm saying you'll never know how reliable it is."

One reason for this uncertainty, SDI opponents say, is that it will not be possible to test the entire system under realistic conditions. Computer programs invariably contain errors, or bugs, that can be found only by debugging -- running the program, trying to make it perform as intended, seeing where it goes wrong, rewriting the erroneous code and rerunning the program.
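
A small, invented example shows the cycle: a few lines that pass every test their author thought of, fail on a case nobody anticipated, and become reliable only after being run, rewritten and rerun.

```python
# An illustrative (invented) example of the debugging cycle: the code below
# behaves correctly on every case its author thought to try.
def average(values):
    return sum(values) / len(values)

print(average([3, 4, 5]))      # 4.0 -- exactly as intended
# print(average([]))           # crashes with ZeroDivisionError: a case nobody tested

# Only after running it, seeing it fail, and rewriting it does the bug go away:
def average(values):
    if not values:             # the case the original author never anticipated
        return 0.0
    return sum(values) / len(values)

print(average([]))             # 0.0 -- the rerun confirms the fix
```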

SDI advocates say it would not be necessary to test the software of a defensive system under realistic conditions, because programs can be debugged by running them on simulators.

Last June, Parnas was appointed by SDIO to its advisory committee on "battle management software." Parnas, who says he supports Reagan's goal of eliminating the threat of nuclear weapons and who has worked on military aircraft computing problems for many years, attended the panel's first session.

After meeting the other members and hearing SDIO's expectations, he quit in frustration.

"In March 1983," Parnas wrote in his letter of resignation, "the president asked us, as members of the scientific community, to provide the means of rendering nuclear weapons impotent and obsolete. I believe that it is our duty as scientists and engineers to reply that we have no technological magic that will accomplish that."

Parnas' resignation, accompanied by eight technical papers that he said explained why the software could not work as desired, galvanized the software engineering community and set the terms of a debate that continues to rage on campuses, where more and more software specialists are declaring their skepticism. The debate has become a prime topic on certain electronic "bulletin boards," by which many computer professionals communicate.

"I do believe, with Parnas and many others, that the software required simply cannot be produced to the degree of confidence without which it would be a meaningless exercise," Joseph Weizenbaum, a computer expert at the Massachusetts Institute of Technology, told his colleagues via the bulletin board.

"If the physics of the problem permits a good antimissile defense," countered John McCarthy of Stanford University's artificial-intelligence program, "the programs can be written and verified. However, it will be quite difficult and require dedicated work."

Larry Smarr, head of a new federally funded National Center for Supercomputing Applications at the University of Illinois, is among hundreds of physicists and growing numbers of other scientists -- including software engineers -- who have signed a petition refusing to work on SDI research because of its technical dubiousness.

"In my experience as a physicist who has written some pretty large computer codes," Smarr said, "there is no way you could produce a code large enough to handle the job and do it perfectly the first time, which is what you would need. I can't imagine any developments in computer technology that would make it possible in the foreseeable future."

It is generally agreed that the software required for the Star Wars system would consist of at least 10 million lines of code, though some say it would be nearer 100 million. A line of code is an instruction, written in a programming language, telling the computer to carry out one in a series of data processing steps.
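
For readers unfamiliar with the term, a hypothetical fragment -- four lines of code in a modern programming language, with arbitrary numbers -- looks like this:

```python
# Four lines of code: each line is one instruction in a longer series of
# data-processing steps. The figures are arbitrary, chosen only for illustration.
speed_km_per_s = 7.0                                 # store a value
time_to_impact_s = 1500.0                            # store another value
distance_km = speed_km_per_s * time_to_impact_s      # compute their product
print(distance_km)                                   # display the result: 10500.0
```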

SDI advocates note that the space shuttle uses about 3 million lines of code, including the code in the ground-based computers that control the launch and the flight from Houston. The shuttle itself carries about 100,000 lines.

"This is software that's evolved over many years of the space program. It's been tested on the ground many, many times. It's flown the shuttle successfully many times and yet we still have shuttle launches aborted because of software failures," Parnas said. "What happened is that in all the tests they never encountered the exact set of circumstances that revealed a bug that was in there all along.

"The SDI people say they will test all their software before deploying it, but what if they don't anticipate the exact set of circumstances that the software will encounter somewhere down the road when the Soviets decide to attack? You can't go back and fix the bug and start the nuclear war all over again."

Computer specialists know that all programs, even ones sold for commercial use, contain bugs -- many of which are not discovered until years later.

Parnas said it is not unusual for debugging to continue long after new computerized weapons are deployed in the field. "Programmers are transported by helicopter to Navy ships. Debugging notes can be found on the walls of trucks carrying computers that were used in Vietnam," Parnas said. "It is only through such modification that software becomes reliable. Such opportunities will not be available in the 30-minute war to be fought by a strategic defense battle management system.

"The largest program I ever saw that was correct the first time it was run was five lines," Parnas said.

The reason computer programming is so hard is that a human mind must think through every function the computer must perform and break the task down into a complete and flawlessly logical set of small steps. At each step where alternative outcomes are possible, the programmer must anticipate each one and add to the program a full and flawlessly logical set of instructions on how to deal with each of these outcomes.
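
An invented fragment makes the point: at one step alone, the programmer must write out, in advance, what to do for every outcome that step can produce, and any outcome that goes unanticipated is simply not handled.

```python
# A hedged illustration, not drawn from any real system: every alternative
# outcome at this single step needs its own explicitly written instructions.
def next_action(report):
    if report is None:
        return "no data -- reschedule the scan"
    if report["confidence"] < 0.5:
        return "ambiguous object -- look again"
    if report["kind"] == "decoy":
        return "ignore"
    if report["kind"] == "warhead":
        return "hand off to targeting"
    return "unrecognized report -- the branch that is easiest to forget"

print(next_action({"confidence": 0.9, "kind": "warhead"}))   # hand off to targeting
print(next_action({"confidence": 0.3, "kind": "warhead"}))   # ambiguous object -- look again
```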

SDI's programs would be stored in digital form in several places, some in ground-based computers and some in computers aboard orbiting platforms carrying sensors or beam weapons. The components would communicate by radio -- sending, for example, information on an enemy warhead's position from a sensor to a laser platform.
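
The kind of radio message the critics have in mind might look, very roughly, like this invented example; the format, field names and figures are all hypothetical.

```python
# A hedged illustration of a sensor platform passing a track to a weapon platform.
import json

track_message = {
    "from": "sensor-sat-12",
    "to": "laser-platform-4",
    "object_id": "RV-7",
    "position_km": [812.4, -103.9, 410.2],
    "velocity_km_per_s": [6.9, -0.4, -1.1],
}

encoded = json.dumps(track_message).encode()      # encoded for radio transmission
received = json.loads(encoded.decode())           # decoded at the other end
# The receiving program must already contain correct instructions for every
# field it might find here -- or fail to find.
print(received["to"], "engaging", received["object_id"])
```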

For every stage, or layer in the sequence of steps, at which alternative outcomes can occur, the program's complexity multiplies. Like a tree whose trunk divides repeatedly into tens of thousands of branch tips, the sequence of steps a program executes can lead to any of several thousand alternative outcomes. Unlike a tree, computer programs also contain many "branches" that emerge from one "limb" only to arch sideways, reentering some other branch.

Programmers say it is impossible to keep all the pathways clearly in mind, ensuring that the rules of programming logic are never violated and that every branch is prepared to deal properly with the data that may enter it from any other connected branch.
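
A few lines of arithmetic show why. Even a modest, invented program with thirty branch points and only two outcomes at each has more than a billion distinct paths:

```python
# Why no one can hold all the pathways in mind: the number of distinct paths
# through a program multiplies at every decision point. The figures below are
# illustrative, not measurements of any real program.
decision_points = 30      # thirty places where the program branches
alternatives_each = 2     # only two possible outcomes at each one
paths = alternatives_each ** decision_points
print(paths)              # 1073741824 -- more than a billion distinct paths
```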

As programmers like to say, their software usually does exactly what they tell it to do, not what they want it to do.

Typical programs for word processing or spreadsheet analysis, usually no more than a few hundred lines long, contain scores or even hundreds of bugs when first written. Only repeated use, trying out every conceivable combination of maneuvers, can reveal the bugs. Bugs remain even after most software is put on the market -- a situation that causes most manufacturers not only to deny their customers a warranty but to print a specific disclaimer of warranty.

For example, IBM's disk operating system software, the program without which no other program will run on an IBM PC, carries the following disclaimer: "The program is provided 'as is' without warranty of any kind, either expressed or implied . . . . The entire risk as to the quality and performance of the program is with you. Should the program prove defective, you (and not IBM or an authorized personal computer dealer) assume the entire cost of all necessary servicing, repair or correction."

Top software engineers say bugs are not an indication of careless programming but a fact of life that even the best programmers must cope with. Moreover, they note, as programs grow larger, the incidence of bugs increases not in proportion, but much faster.

"You talk to people who write these big programs," Parnas said, "and you think you're talking to sociologists. They'll tell you that when they run their program it does 'funny things' that they can't predict. 'Sometimes it does this; sometimes it does that.' It's like they're trying to predict public opinion. You ask them what their program will do in such and such a situation and they say, 'I don't know. Let's try it.' "

SDIO's panel on computing, while conceding some of Parnas' points, insists that these concerns are not fatal to the long-range goal.

"Perfection is a bit overrated," said panel chairman Danny Cohen of the University of Southern California. "There will always be bugs and malfunctions. But that doesn't mean the thing won't work. You can design the software so that when bugs turn up, they are isolated in the system. You design the system to cope with malfunctions.

"Parnas keeps talking as if there is some fundamental law of nature that says it's impossible. But there isn't. This is not like perpetual motion, where you can show mathematically that it's impossible. It will be very hard to produce this software but as long as it isn't like perpetual motion, it isn't impossible."

A different problem for SDI software, according to the critics, lies in the fact that the program must embody assumptions about the characteristics of Soviet weapons.

For example, when a swarm of warheads is hurtling over the Arctic toward the United States, it is likely to be surrounded by perhaps 10 times as many decoys -- objects designed to look like real warheads to the sensors. If the sensors cannot tell them apart, the beam weapons will have to spend precious time and energy destroying every object. If there are enough decoys, the beam weapons will not have time to destroy all the threatening objects and some warheads will slip through.

If the sensors and their computers are to distinguish the decoys, they must first be programmed to do so. "Unless the Soviets cooperate and tell us what characteristics to look for, the recognition algorithms written into the software could be wrong," Parnas said.
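
A toy version of such a recognition rule shows how the assumption gets baked in. The physics and the threshold here are invented purely for illustration.

```python
# A toy discriminator: it "knows" a decoy only by an assumption written into it
# in advance -- here, that light decoys decelerate more sharply in the thin
# upper atmosphere than heavy warheads do.
ASSUMED_DECOY_DECELERATION = 0.8   # a hypothetical threshold the Soviets never agreed to

def classify(observed_deceleration):
    if observed_deceleration > ASSUMED_DECOY_DECELERATION:
        return "decoy"
    return "warhead"

print(classify(1.2))   # "decoy" -- correct only as long as the assumption holds
# A decoy built to decelerate less, or a warhead dressed up to decelerate more,
# defeats the rule, and nothing in the code can tell that it has been fooled.
```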

"If the Soviets come up with just one special trick to spoof the system and our people didn't happen to design the system to cope with that, it won't work," Smarr says. "It's going to be a Maginot Line in space."

"Parnas is putting his finger on some real technical problems," said Charles Seitz, a computer panel member from California Institute of Technology, "but these are things that SDI is researching. While Parnas is going around debating, we're studying the problems. The honest answer right now is that there is nothing today that assures us it can be done or that it can't be done. Existing software engineering practice has never encountered a problem quite like this before."

For all its optimism, the computer panel has concluded that there are limits to what software can do. The way out, as Seitz and Cohen described the panel's findings, is to rethink the nature of the hardware being considered for an antimissile system, limiting it to something that software could handle. In a formal report being prepared for SDIO, Seitz said, the panel will be "quite critical of the system architecture," the general configuration of the system's hardware that SDIO has been considering.

One change under consideration in the "system architecture" is decentralizing the battle management functions so that each orbiting battle station operates with some autonomy, said Air Force Maj. David Audley of SDIO's computer section.

"Instead of having the whole system operated by one monolithic computer," Audley said, "we're thinking now about a loose federation of battle stations."

Audley said software limitations may force acceptance of a less efficient battle management. As a result, for example, a Star Wars system might end up with two or more battle stations shooting at the same target.
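
A minimal sketch, assuming a hypothetical federation of identical stations rather than any design SDIO has described, shows how the duplication arises:

```python
# Why loosely federated stations can waste shots: each one picks its own target,
# with no central coordinator, so two of them can pick the same warhead.
targets = [{"id": "RV-7", "priority": 9}, {"id": "RV-3", "priority": 5}]

def choose_target(visible_targets):
    # Every autonomous station applies the same rule: shoot the highest priority.
    return max(visible_targets, key=lambda t: t["priority"])["id"]

print(choose_target(targets), choose_target(targets))   # RV-7 RV-7 -- one warhead, two shots
```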

Critics say that while decentralization can overcome some barriers, each semiautonomous software module would still suffer from a lack of debugging under realistic conditions. Also, any bugs or wrong assumptions programmed into one module would be present in all.

For all their pessimism, most critics concede that if the government keeps spending money on SDI, someday there will be a huge computer program that SDIO calls battle management software. "But this software will not have the reliability that you or I would consider to be essential for such a system," said James J. Horning of Digital Systems Research Center in Palo Alto, Calif. "Nor will it be possible to retrofit reliability into it. The country will be faced with a cruel dilemma: deploy a system that cannot be trusted, or scrap it."