When FBI agents turn to "Big Floyd" for help on a labor racketeering case this summer, they won't be counting on a squealer to help them crack the case -- they'll be looking at a computer program for advice on what to do next.

A software special agent, Big Floyd will be the first "artificial intelligence" program to assist a criminal investigation, said William A. Bayse, the FBI's assistant director of technical services, who calls it "a terrific opportunity" to enhance the bureau's ability to fight crime.

The FBI's efforts are part of an emerging trend that's capturing both the imagination and the budgets of federal agencies: the use of artificial intelligence software to identify potential lawbreakers.

Floyd is what computer scientists call an "expert system" -- a program composed of rules devised by human experts that's designed to replicate their expertise in a computer. Nonexperts can use these computer systems much as they use human experts for advice and counsel.
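The mechanics described here can be sketched in a few lines. What follows is a hypothetical illustration, not the FBI's actual software: if-then rules fire when their conditions match known facts, and each conclusion can trigger further rules (what computer scientists call forward chaining). The rules and facts are invented for the example.

```python
# A minimal sketch of a rule-based "expert system": each rule is a pair
# (conditions, conclusion); rules fire repeatedly until no new
# conclusions appear (forward chaining). All rules here are invented.

def run_rules(facts, rules):
    """Apply rules to the fact set until no new conclusions are produced."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and all(c in facts for c in conditions):
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical investigative rules, loosely echoing the article's bribe example.
RULES = [
    ({"payment between suspects", "no goods delivered"}, "possible bribe"),
    ({"possible bribe", "suspect holds union office"}, "recommend deeper review"),
]

facts = run_rules(
    {"payment between suspects", "no goods delivered", "suspect holds union office"},
    RULES,
)
print(sorted(facts))
```

Note how the second rule builds on the first rule's conclusion; a nonexpert user supplies only the raw facts, and the chained rules supply the expert judgment.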

The FBI, for one, is so enamored of Big Floyd and the potential of expert systems that it is seeking $12 million to create two more Floyds, one to help catch drug smugglers and the other to target potential terrorists.

These expert systems represent a radical departure from the uses to which government computers have traditionally been put. Used in the past primarily to store, retrieve and manipulate data, they are now being asked to make judgments and determine what people should do next.

"This is the next logical step in how computers will be used," said John McDermott, a Carnegie-Mellon computer scientist who is developing expert systems for the Internal Revenue Service.

While expert systems will no doubt play an important role in aiding the internal operations of federal agencies, their application as a profiler of potential troublemakers raises disturbing questions about privacy and individual rights.

"When Newsweek makes a mistake doing a computer profile of its subscribers, the worst that can happen is that you get junk mail," said Jerry Berman of the American Civil Liberties Union's project on privacy and technology. "When the government does it, you may be subjected to an intrusive investigation."

A recent report from the Office of Technology Assessment pointed out that "important privacy and constitutional implications are raised by computer profiling because people may be treated differently before they have done anything to warrant such treatment."

Indeed, the OTA report emphasized that there are no formal policies governing the creation or use of these expert system profiles, although agencies may have their own internal guidelines.

Nevertheless, the technology is being developed by a broad spectrum of regulatory and law enforcement agencies. The Environmental Protection Agency is examining how expert systems might identify likely polluters; the Treasury Department hopes expert systems can target money-laundering banks; the Energy Department's inspector general wants to use an expert system to check if its contractors are cutting corners; Customs is creating one to target drug smugglers; and the IRS is developing an expert system to spot tax cheats.

A major force behind the expert systems push is cost: Computer time is cheaper than human labor. Faced with shrinking budgets, many federal agencies see expert systems as a cost-effective way to help determine how to allocate their enforcement dollars.

"With our limited surveillance and enforcement budgets, we're always looking for ways to make our enforcement more efficient," said Richard H. Mayes, the EPA's acting assistant administrator for enforcement and compliance monitoring.

Consequently, expert systems developers see a growing federal market for their services.

"The use of expert systems in the federal government will go up by hundreds of percent within the next five years," predicts Daniel A. DeSalvo, a program manager with American Management Systems, the prime contractor for the DOE expert systems contract. "Expert systems profiling will be a real strong area."

The heart of the software is the rules culled from human experts in the relevant topic area.

The FBI interviewed "15 to 20 of our top labor racketeering people over a year and a half," and came up with nearly a thousand investigative tips and rules on how to run a case, said Bayse.

With the help of Defense Department artificial intelligence experts, those rules were fashioned into a computer program that can be updated and modified. The computer screen displays a number of "windows" representing paths of possible inquiries investigators can pursue, and Big Floyd engages his human interlocutors in a "conversation."

"You start with a suspect," said Bayse, and Floyd makes suggestions such as, "Would you like for me to look in greater detail?"

"For example, if data indicated that money changed hands between two suspects, Floyd might suggest the transaction was a bribe," said Bayse. "Floyd recommends and suggests -- sometimes more adamantly than others. Floyd can recommend a wiretap or suggest the existence of a felony violation or possibly recommend that someone else in the investigation should become a suspect."

Floyd's advice is designed to stay within the boundaries of law; Bayse emphasized that a human agent can examine every step of logic Floyd uses in making its recommendations to see if they are valid.

The FBI's National Center for the Analysis of Violent Crimes is also developing expert systems to help identify potential serial killers, arsonists and rapists.

By contrast, the Energy Department inspector general's expert system, AMS's DeSalvo said, is "designed for identifying patterns of contractor behavior that may be something they want to look at."

Programmed with the appropriate rules, an expert system can pick up subtleties and obscure points that human experts often lose amidst a mass of data.

For example, the DOE expert system might examine contractor data and raise a red flag if its rules indicate that a contractor is requesting an unusual amount of overtime or has an unusual way of compensating a subcontractor.

An expert system could evaluate thousands of such contracts and identify the likeliest violators, according to the system's rules, in a fraction of the time it would take human inspectors to do the work.
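The batch-screening idea might look something like the following sketch. The field names, thresholds, and sample contracts are all invented for illustration; the point is only that simple rules, applied uniformly across thousands of records, can rank cases for human inspectors.

```python
# Hypothetical sketch of batch screening: score each contract record
# against simple red-flag rules and surface the highest scorers.
# Field names and thresholds are invented, not DOE's actual criteria.

def red_flags(contract):
    """Return the list of red-flag rules this contract trips."""
    flags = []
    # Rule: overtime exceeding a quarter of total hours is unusual.
    if contract["overtime_hours"] > 0.25 * contract["total_hours"]:
        flags.append("unusual overtime")
    # Rule: subcontractor paid far above the prime contractor's rate.
    if contract["subcontractor_rate"] > 1.5 * contract["prime_rate"]:
        flags.append("unusual subcontractor compensation")
    return flags

contracts = [
    {"id": "C-001", "overtime_hours": 900, "total_hours": 2000,
     "subcontractor_rate": 95.0, "prime_rate": 50.0},
    {"id": "C-002", "overtime_hours": 50, "total_hours": 2000,
     "subcontractor_rate": 40.0, "prime_rate": 50.0},
]

# Rank contracts by number of flags raised, most suspicious first.
ranked = sorted(contracts, key=lambda c: len(red_flags(c)), reverse=True)
for c in ranked:
    print(c["id"], red_flags(c))
```

A human inspector would still review each flagged case; the program only decides where to look first.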

"The payoff comes from matching that profile against the huge amounts of data in the database," said Carnegie-Mellon's McDermott. "That's something that people can't do well -- the tediousness of that is overwhelming."

"Initially, I was pooh-poohing expert systems," said Roger Cooper, deputy assistant secretary for information systems at the Treasury Department. "But there are a lot of things you can use them for -- it does make sense. . . . I'm going through our 1986 budget internally and we're looking for artificial intelligence funding."

Cooper said "the enforcement side of the house" is actively involved in expert system development: The Secret Service is developing a classified expert system to help advise it on potential presidential assassins. Expert systems profile programs also are being developed to target potential counterfeiters and bombers.

The catch, however, is that "if you stick a bum rule in there, you're going to get a bum judgment," AMS's DeSalvo conceded. It may prove difficult to determine which rules in an expert system are flawed.

There are also concerns about which rules should be built into expert systems. For example, should expert systems use criteria involving a person's gender or skin color? Who determines whether computer rules are potentially discriminatory?

To compound the concerns, many expert systems have inherent statistical problems: they often combine probabilities in ways that render their judgments ambiguous.
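One way this ambiguity arises can be shown with the "certainty factor" arithmetic popularized by expert systems of this era (the MYCIN medical system is the best-known example; there is no indication the systems in this article use it, so treat this as an illustration). Two positive certainty factors a and b combine as a + b(1 - a), so several weakly supported rules can together yield what looks like a confident verdict.

```python
# Sketch of how combined evidence can mislead: MYCIN-style certainty
# factors combine as cf = a + b*(1 - a), so three rules that are each
# only weakly supported (0.4) yield a combined factor near 0.8.

def combine(a, b):
    """Combine two positive certainty factors, MYCIN-style."""
    return a + b * (1 - a)

weak_evidence = [0.4, 0.4, 0.4]   # three weakly supported rules fire
cf = 0.0
for e in weak_evidence:
    cf = combine(cf, e)
print(round(cf, 3))   # 0.784 -- looks strong, though no single rule was
```

Whether 0.784 means "probably guilty" or merely "three shaky hints" is exactly the kind of ambiguity critics worry about.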

The reliability of such expert-system-generated profiles in targeting potential lawbreakers has yet to be demonstrated, although the technology has shown great promise in a number of fields.

Even if such systems do prove reliable, there remain unresolved legal concerns. For example, could the recommendation of an expert system create enough of a "probable cause" to justify a search warrant? The law is unclear.

The FBI's Bayse insists that "Privacy is a No. 1" consideration in the design of Big Floyd and that "even the dialogues between man and machine are reviewed by the lawyers" to ensure that Floyd's advice stays within the law.

However, Bayse declined to say whether a defense attorney would be allowed to see the computer-programmed reasons used by Big Floyd to finger his client.

Ultimately, the legal status afforded to expert-system-generated recommendations may be determined in court.