Pinpointing the flaws that led to the Challenger's destruction could take anywhere from six months to a year unless there is "breakthrough" information lurking in the reams of data NASA has amassed, according to a spokesman at the Kennedy Space Center in Florida.
But it is also possible, the spokesman warned, that the shuttle's telemetry system -- the elaborate network of sensors and computer technology that beams a millisecond-by-millisecond status report to ground control -- might not unlock the mystery of Tuesday morning's explosion, which killed the seven-member crew and destroyed the spacecraft.
Jim Mizell, a telemetry expert and spokesman for the National Aeronautics and Space Administration, said the problem that touched off the explosion could have occurred at a spot where there was no instrument sending back information.
He said the problem also might have been missed because of the frequency at which the sensors monitored developments on the spacecraft.
"There are always scenarios that can defeat any known sensing ," said John Clark, director of space applications and space technologies for RCA Corp. who sat on the review board for the nearly disastrous Apollo 13 mission in 1970. "It's conceivable that everything was normal on the telemetry until the explosion."
NASA officials said yesterday that an initial review of Challenger's telemetry revealed nothing abnormal. But a close analysis of those readings could be the key to determining what happened to the shuttle.
Clark pointed out that telemetry readings were the "indispensable" part of the Apollo 13 investigation and that, with advances in technology, the shuttle's "nervous system" of sensors should ultimately prove revealing.
Up until the instant the shuttle exploded, the Challenger's telemetry system had sent millions of bits of computer data to the Johnson Space Center in Houston and Kennedy Space Center at Cape Canaveral.
"From the time of liftoff to the termination of flight, we track the temperatures, pressures, vibrations and steering commands of most critical systems of the space shuttle," Mizell said.
The sensors are supposed to determine if something is wrong and trigger the appropriate response from the shuttle's main computer systems to correct it. Just where the sensors should be placed, and how many the shuttle should carry, are design questions that NASA and Rockwell International engineers have long pondered.
Historically, the space agency has preferred to have more rather than fewer sensors. Indeed, many shuttle missions have been delayed because of sensor readings that later proved incorrect.
There are several hundred pressure and temperature sensors along the giant external fuel tank, hundreds along the solid rocket boosters, and thousands of sensors monitoring the space shuttle's main engines.
All of them feed millisecond-by-millisecond updates into special "control units," or computers, that quickly analyze and relay the information to computers inside the shuttle. Those computers in turn relay the data to ground stations across the country.
"Those machines put out more data in an instant than a person can read in a day," Mizell said. "By the time we get it all printed out, it will more than fill a room."
Part of the problem NASA now faces is bringing the data together -- ensuring that information recorded at different NASA sites is properly synchronized for playback on NASA computers, much like editing a movie with hundreds of sound tracks.
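The synchronization task described above -- merging independently recorded, timestamped streams onto one common timeline -- can be sketched in a few lines. This is only an illustration of the general idea; the station names and readings are invented, and NASA's actual 1986 playback systems are not described in the article.

```python
# Illustrative sketch: merge telemetry streams recorded at different
# ground sites into a single timeline ordered by timestamp.
# All values below are invented for illustration.
import heapq

# (seconds after liftoff, channel, reading)
houston = [(0.001, "pressure", 3012.4), (0.003, "pressure", 3011.9)]
kennedy = [(0.002, "temp", -182.5), (0.004, "temp", -182.3)]

# heapq.merge interleaves the already-sorted streams by timestamp --
# the "hundreds of sound tracks" analogy from the article.
merged = list(heapq.merge(houston, kennedy, key=lambda rec: rec[0]))

for t, channel, value in merged:
    print(f"{t:.3f}s  {channel:<8} {value}")
```

The key point the analogy captures is that each stream is internally ordered but recorded on its own clock and medium; only after alignment on a shared time base can analysts play the channels back side by side.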
Even if there were no sensors near the spot where the accident began, a leak or change in pressure or temperature subsequently could show up on other sensor systems.
For example, a change of temperature in a nonsensored section of the external fuel tank could lead to a pressure change somewhere else.
"We're not going to neglect a single" bit of information, Mizell said.
NASA systems and computer specialists face the daunting task of poring over each millisecond of data recorded during the 74-second flight. The teams of specialists will use computers to enhance or magnify the data and to correlate what the various sensors were finding at any particular time.
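Correlating "what the various sensors were finding at any particular time" amounts to taking a cross-channel snapshot at a chosen instant. A minimal sketch of that lookup follows; the channel names, timestamps, and values are hypothetical, not the shuttle's actual instrumentation.

```python
# Illustrative sketch: for a chosen instant, find the latest reading
# each sensor channel had reported. Data are invented for illustration.
from bisect import bisect_right

# channel -> sorted list of (seconds after liftoff, reading)
streams = {
    "tank_pressure": [(0.000, 3012.0), (0.050, 3010.5), (0.073, 2890.1)],
    "booster_temp":  [(0.000, 21.0),   (0.060, 21.4),   (0.073, 24.9)],
}

def reading_at(stream, t):
    """Latest recorded value at or before time t, else None."""
    i = bisect_right([ts for ts, _ in stream], t) - 1
    return stream[i][1] if i >= 0 else None

# Snapshot of every channel at t = 0.065 s.
snapshot = {ch: reading_at(s, 0.065) for ch, s in streams.items()}
```

Repeating such snapshots millisecond by millisecond across thousands of channels, and flagging readings that diverge from expected profiles, is the kind of correlation work the article says the specialist teams face.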
Just getting the tapes ready for certain types of computer enhancement could take weeks.
One concern is that the initial telemetry readings offered no clue to the cause of the explosion.
That is disconcerting to some scientists because the failure matches none of the accident sequences NASA has simulated on its computers. The lack of a match will make it harder to focus the investigation at the outset, although the extensive visual record, including infrared camera footage, may offer clues.
One other possibility is that sensors detected something wrong, but hardware or software problems in the computers that link the sensors may have obscured or misinterpreted the data.