Four years ago, the Consumer Product Safety Commission decided that a simple, noncontroversial way to identify consumer hazards was to monitor the kinds of injuries that showed up in hospital emergency rooms.

The network it created -- the National Electronic Injury Surveillance System (NEISS) -- quickly became the backbone of the agency's injury reporting system. It also proved to be anything but simple and noncontroversial.

The most recent complaints about the system have come from manufacturers of chain saws and the CPSC itself. They have raised new concerns about the validity of the statistics, which the CPSC relies on to help decide when to recall a product or impose a safety standard.

Various government agencies had collected information from hospitals for decades, but until NEISS was created, none of the data was in a standard form.

Under the system, hospitals classify injuries according to more than 1,000 "product codes." For example, an injury caused by a saw could be listed under 10 different codes depending on what kind of saw was involved.

The hospitals' information was processed by the CPSC and extrapolated into national statistics. The agency applied one multiplier to estimate how many injuries were reported at all hospital emergency rooms. It then scaled that figure up a second time to estimate how many injuries occurred nationwide, because the agency figured that only 38 percent of injuries were treated in emergency rooms.
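The two-stage extrapolation described above reduces to simple arithmetic. In this sketch, the sample-to-national weight is a hypothetical illustration (the article does not give the actual multiplier); only the 38 percent emergency-room share comes from the article:

```python
# Sketch of the two-stage NEISS extrapolation described in the article.
# SAMPLE_TO_NATIONAL_ER is a made-up illustrative weight, not a real NEISS figure.
SAMPLE_TO_NATIONAL_ER = 50.0   # hypothetical: scales sample-hospital counts to all ERs
ER_SHARE_OF_INJURIES = 0.38    # article: only ~38% of injuries are treated in ERs

def extrapolate(sample_injuries: int) -> tuple[float, float]:
    """Return (estimated ER-treated injuries nationwide, estimated total injuries)."""
    er_total = sample_injuries * SAMPLE_TO_NATIONAL_ER
    national_total = er_total / ER_SHARE_OF_INJURIES  # scale up past the ER share
    return er_total, national_total

er, nation = extrapolate(1_000)
# 1,000 sample cases -> 50,000.0 ER-treated -> about 131,579 nationwide
```

Under these assumed numbers, each injury recorded at a sample hospital stands in for roughly 132 injuries nationwide, which is why errors in the sample multiply so quickly.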

Originally, 130 hospitals were chosen by size and location to be statistically representative of emergency rooms nationwide. But in 1979, the CPSC trimmed the number of hospitals surveyed to 74 as a way of reducing the $3.25 million annual cost of NEISS. On average, a hospital receives $11,000 for participating in the program; large hospitals receive more because they usually file longer reports.

The CPSC estimates that it spent $1.39 million on NEISS in fiscal 1982; its net cost, however, is expected to be about $864,000 because the CPSC sells NEISS data to other federal agencies.

In July, the Chain Saw Manufacturers Association, which wants to avoid having the agency impose mandatory safety standards on its members, released a study it had commissioned from a Washington consulting firm. The study said the NEISS data base was so small that its estimates were "virtually meaningless."

The study's authors, Heiden, Pittaway Associates Inc., said NEISS' error rate jumped from 26 percent to 56 percent when the CPSC cut its sample size.

If the CPSC used figures from 74 hospitals to estimate that chain saws caused 50,000 injuries, the study said, the actual number of injuries could range between 22,000 and 78,000. The study also noted that not all 74 hospitals reported every month; for example, in one month, it said, only 59 hospitals reported, further weakening the system.

The study analyzed the same data the CPSC had used when it estimated that chain saw-related injuries had increased from 35,000 to 63,000 over the last five years. Yet, by applying the larger error margin, the manufacturers' study concluded that the number of chain saw injuries actually had dropped significantly during that period.

"The chain saw study shows an unfamiliarity with our system," said Michael Stahl, a special assistant to the CPSC's executive director. The CPSC figures that the system has an error rate of about 14 percent, a level acceptable to the National Center for Health Statistics, Stahl said. Thus, if the agency estimated that chain saws caused 50,000 injuries, the actual number could range from 43,028 to 56,972.
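The dueling injury ranges quoted by the study and by the CPSC both come from the same arithmetic: the estimate plus or minus (estimate times error rate). A minimal sketch, with a hypothetical helper name:

```python
def injury_interval(estimate: int, error_rate: float) -> tuple[int, int]:
    """Return (low, high) bounds for an injury estimate at a given error rate."""
    margin = round(estimate * error_rate)
    return estimate - margin, estimate + margin

# Manufacturers' study: a 56% error rate on a 50,000-injury estimate
injury_interval(50_000, 0.56)   # -> (22000, 78000), the study's range
# CPSC's claimed error rate of about 14%
injury_interval(50_000, 0.14)   # -> (43000, 57000)
```

The CPSC's quoted range of 43,028 to 56,972 implies a rate just under 14 percent (a margin of 6,972, or about 13.9 percent of 50,000), consistent with Stahl's "about 14 percent."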

But the CPSC and the chain saw industry have been arguing about the need for a mandatory safety standard since 1976, and the staff sees the manufacturers' study as just another attempt to delay a standard. "We know of 86 deaths related to chain saws for the five-year period from 1976 through 1980," CPSC Chairman Nancy Harvey Steorts said recently. Those figures -- taken from death certificates, not statistical samples -- prove there is a need for a standard, she said.

The chain saw study followed an internal CPSC report earlier this year that also criticized NEISS, but for a different reason. It said the system wasted money by collecting too much data. It recommended dropping six large hospitals from the survey to save $130,000 a year, collecting data only once or twice a week, instead of daily, and cutting the number of NEISS product codes.

The commission voted, 3 to 2, against dropping the six hospitals, but only after much debate.

Commissioner Stuart M. Statler, in a 19-page dissent, called NEISS "an albatross of no small dimensions."

"Does this agency . . . really need to know that a tombstone may topple over and injure a bystander's leg? That a few people each year trip over telephone cords and bump into walls?" he asked. "It is an embarrassment that this commission collects information about frivolous incidents which are only faintly associated with consumer products." He said the CPSC should rely on a common-sense approach to identify problems and should cut the 1,000-product-code list to 100 priority items.

Commissioner R. David Pittle responded with a 29-page defense, calling NEISS a "marvelous tool."

While keeping track of tombstones is silly, Pittle said, a thorough code is needed because "common sense too often misses the mark."

"Common sense would probably lead to a quick dismissal of . . . baby cribs or pacifiers . . . yet, because we keep data on these products, we know . . . cribs can choke infants between too-wide slats and pacifiers can be inhaled," he said.

The CPSC recently agreed to drop 53 product codes, Stahl said, including those covering injuries caused by clothespins, doorstops, corkscrews, books and combs, but it is still reviewing the task force's other recommendations.