Should law enforcement agencies be allowed to use computers to help them determine whether a person ought to be jailed or allowed out on bond? Should the military let computers decide when and on whom nuclear weapons should be used?
Questions like these have displaced concerns about hackers and computer security as the latest issues troubling computer professionals. While theft and computer viruses have not gone away as industry problems, a group of 30 computer engineers and ethicists gathered here yesterday agreed that questions about the proper use of computers are taking center stage. At issue is the degree to which computers should be allowed to make significant decisions that human beings normally make.
Already, judges are consulting computers, which have been programmed to predict how certain personality types will behave, and basing their decisions more on what the computer tells them than on the arrested person's actual history.
Computers are helping doctors decide treatments for patients; they played a major role in the July 1988 downing of an Iranian jetliner by the USS Vincennes; and they are the backbone of this country's Strategic Defense Initiative, or "Star Wars."
"Decision-making systems are taking over and we don't have a solid idea of what is going on," said Jane Robinett, a professor at Polytechnic University in Brooklyn and one of several computer experts gathered yesterday for a conference on computer ethics.
Robinett, along with representatives from other universities, International Business Machines Corp., the Brookings Institution and several Washington theological seminaries, spent the better part of the day at Brookings, a liberal think tank, talking about what they could do to build a conscience in the computer field.
The computer industry has been marked by "creativity and drive for improvement and advancement," not by ethical concerns, said Robert J. Melford, chairman of the computing ethics subcommittee of the Institute of Electrical and Electronics Engineers.
Computer professionals, Melford said, often spend much of their time in solitude, separated from the people affected by their programs who could provide valuable feedback.
Unlike hospitals, computer companies and most organized computer users have no staff ethicists or ethics committees to ponder the consequences of what they do. And few businesses have written policies governing the proper use of computers. But there is evidence that technical schools, at least, are beginning to work an ethical component into their curricula.
The course Robinett teaches has been required for all computer engineering majors at Polytechnic for the past two years. And the Massachusetts Institute of Technology is considering mandating five years of study instead of the current four to include work in ethics.
Conference participants yesterday said they would like to see instruction in computer ethics start with children who are just learning to use computers.
In part, worries about children stem from data showing that computer "crackers," those individuals who break into databases, are usually between the ages of 12 and 20. The number of such cases, though growing, is only about 3 percent of all computer security problems, according to statistics presented by Melford. Dishonest or disgruntled employees account for 19 percent, mistakes for 65 percent. (Water damage and infrastructure problems account for the rest, according to Melford's data.)
It is precisely because computers produce erroneous information so often, based on human error, that users should proceed slowly in assigning them great power, several people said. Ramon Barquin, who with the Washington Consulting Group does computer training for government agencies, predicted that industry excesses may lead to increased licensing, regulation and even a kind of "driver's education" for every computer user.