Although preliminary evidence collected by the National Transportation Safety Board is somewhat conflicting, the most plausible explanation for the crash of Northwest Flight 255 in Detroit last Sunday puts the blame for the deaths of at least 156 people squarely in the cockpit.

Specialists who have devoted their careers to studying human performance in aviation said it is too simple just to blame the individual pilots. They suggest that the crash raises the issue of whether flight crews today have lost their discipline and reflexes to complacency nurtured by automation and seniority.

The cockpit voice recording from Flight 255 tells investigators that two experienced pilots were casual at best in running through their preflight checklist. The flight data recorder -- contrary to a Northwest pilot witness -- tells them that the plane's wing flaps were not extended, a procedure vital for successful takeoff. A warning device that should have sounded did not.

Information provided to The Washington Post by the National Aeronautics and Space Administration shows that other pilots have made similar mistakes, but have lived to talk about them.

For the airline industry and federal regulators, the question is what can be done to improve training and equipment to prevent what the specialists call a "human factors" accident.

"The real problem is for us to be smart enough to design systems -- whether they are for airplanes or nuclear power plant control rooms -- that will recognize that people will make blunders and that will help them recognize the blunders before they create an accident," said John K. Lauber, who headed a human factors research group at NASA before he became a member of the safety board.

Safety board statistics show that for the five years ending in 1984, pilot error was at least a factor in 40 percent of airline accidents, whether fatal or not. Errors by controllers, mechanics and others not in the cockpit were a factor in exactly half the accidents.

Marvelous new airplanes -- such as the McDonnell Douglas MD80 that crashed in Detroit and the new Boeing offerings, the 737-300, 757 and 767 -- are so automated that about the only thing left for the pilots to do is to monitor systems.

"The pilots are feeling that they are out of the loop," said Earl L. Weiner, of the University of Miami's department of management science. Weiner has written extensively about flight crews and automation. "What is occurring is that the equipment is so reliable and works so well that often {crew members} are not able to maintain an alertness and awareness of what's going on. It doesn't mean they aren't good pilots and good men," he said.

Henry Duffy, president of the Air Line Pilots Association, said, "You take 18,000 flights a day {nationwide} and get a crew flying six legs and doing the same things time after time after time, it takes a special discipline to make them continue to do that right . . . . It's the kind of thing you never let up on."

History established long before Detroit shows that experienced, senior captains, just like other people, make incredible mistakes:

The world's worst aviation disaster, on March 27, 1977, came when KLM Captain Jacob Louis Veldhuyzen van Zanten, head of the airline's respected flight training department, started to take off without clearance from the air traffic control tower at Los Rodeos Airport in Tenerife, Canary Islands. Halfway down the fog-enshrouded runway he saw -- and tried to avoid -- a taxiing Pan American World Airways plane. The two Boeing 747s collided and 582 people were killed.

Korean Air Lines Flight 007, piloted by a senior crew, was unaccountably more than 300 miles off course over Soviet territory when it was shot down by the Soviets on Sept. 1, 1983, killing all 269 on board. Although the reason it was off course will always remain in dispute, fed by conspiracy theorists, the best guess is that the plane's computerized navigation system was misprogrammed by the crew.

On Dec. 28, 1978, 10 people were killed when a United Airlines DC8 ran out of fuel and crashed while trying to glide into the Portland, Ore., airport. The plane had initially circled the airport while the crew studied a landing gear problem. The flight engineer mentioned, almost casually, that the plane was running out of fuel, but the captain did not hear or react to the message until it was too late. There was no post-crash fire, probably because there was no fuel.

On May 8, 1978, in a heavy fog, a National Airlines flight to Pensacola, Fla., landed in Escambia Bay about three miles short of the runway. An automatic warning device told crew members that they were descending too rapidly, but they disarmed the device and continued the descent. The water was very shallow, and a passing barge saved all but two people on board.

Clay Foushee, a NASA psychologist specializing in human factors, said, "It wouldn't be terribly surprising to see a relationship between seniority and complacency. As you get to the same thing day in and day out, it becomes automatic behavior. When humans are reacting according to their reflexes, they are not necessarily using their higher-level thought processes."

In all of those cases, a failure of the entire flight crew, not just the captain, can be inferred. "One of the most common patterns you see is a situation where subordinate crew members are most hesitant to question a senior crewman," Foushee said. "It's not too much different from what you see in any management situation . . . . Unfortunately, in an air carrier cockpit, where there is not much time to make decisions, mistakes can lead to tragedy."

T. Allan McArtor, the new Federal Aviation Administration chief, thinks that complacency is the wrong word. "I'd prefer to use a term like routine professionalism, where you become so good, so familiar with the task, that there is a lack of vigilance," he said. "Our pilots are professional; they're concerned. What we need is . . . a top to bottom assessment of how are we training our pilots to manage the office."

McArtor has called a meeting Thursday for the chief pilots of the nation's airlines. "We need to make sure the crews in our cockpits work together as crews and are on top of their jobs every day," he said.

Flap settings are computed prior to takeoff based on a number of factors, including the weight of the plane, the length of the runway and the direction and speed of the wind. Then the plane's speed for liftoff is computed assuming certain flap settings. If the flaps are improperly set, the plane will not perform as expected.
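To see why that combination matters, consider a minimal sketch of the lookup. Every number in it is invented for illustration; none comes from an MD80 manual or from the Detroit investigation. The point is only that a liftoff speed computed for one flap setting says nothing about whether the wing will fly at another:

```python
# Illustrative sketch only: the flap settings, weights and speeds below are
# invented for demonstration and are not taken from any aircraft manual.

# Hypothetical rotation speeds (knots), keyed by flap setting (degrees) and
# maximum gross weight (pounds).
ROTATION_SPEEDS = {
    11: {130_000: 142, 140_000: 148, 150_000: 154},
    15: {130_000: 138, 140_000: 144, 150_000: 150},
}

def rotation_speed(flap_setting, gross_weight):
    """Return the rotation speed computed for a given flap setting and weight."""
    column = ROTATION_SPEEDS[flap_setting]
    for max_weight in sorted(column):
        if gross_weight <= max_weight:
            return column[max_weight]
    raise ValueError("weight exceeds the table; takeoff not permitted")

# The crew plans the takeoff assuming flaps at 11 degrees ...
planned = rotation_speed(flap_setting=11, gross_weight=144_000)
print(f"planned rotation speed for flaps 11: {planned} knots")
# ... but if the flaps are actually retracted, no column of the table applies,
# and at the planned speed the wing produces far less lift than expected.
```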

Modern aircraft are equipped with warning devices that will tell the pilot at the start of takeoff if the flaps are not set properly.
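In outline, such a takeoff configuration warning is a simple check: once the throttles are advanced to takeoff power, the system verifies that the flaps (along with slats, trim and other items not shown here) are in a takeoff range, and sounds a horn if they are not. The sketch below assumes invented thresholds rather than the MD80's actual warning logic:

```python
# Minimal sketch of a takeoff configuration warning; the throttle threshold
# and flap range are assumed for illustration, not taken from any aircraft.

TAKEOFF_THROTTLE_FRACTION = 0.7      # assumed: throttles advanced for takeoff
TAKEOFF_FLAP_RANGE = (5.0, 25.0)     # assumed: acceptable flap angles, degrees

def takeoff_warning(throttle_fraction, flap_angle):
    """Return True if the warning horn should sound during the takeoff roll."""
    at_takeoff_power = throttle_fraction >= TAKEOFF_THROTTLE_FRACTION
    flaps_set = TAKEOFF_FLAP_RANGE[0] <= flap_angle <= TAKEOFF_FLAP_RANGE[1]
    return at_takeoff_power and not flaps_set

# Throttles advanced with the flaps still retracted: the horn should sound.
assert takeoff_warning(throttle_fraction=0.9, flap_angle=0.0)
# Flaps extended to a takeoff setting: no warning.
assert not takeoff_warning(throttle_fraction=0.9, flap_angle=11.0)
```

Such a check can protect a crew only if it runs: a warning system that has lost power or been deliberately disarmed, as in some of the incidents described below, never gets the chance to sound.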

If improper flap settings and a dysfunctional or disconnected warning system turn out to be major contributors to the Detroit crash, there are predecessor incidents. Sadly in aviation, there always are.

NASA's Aviation Safety Reporting System, based in Mountain View, Calif., has received voluntary reports of more than 31,000 safety-related incidents from pilots, controllers and others since 1981. Of those, a NASA computer run requested by The Post shows, seven involved mis-set flaps and one involved a malfunctioning and intentionally disarmed flap-warning system.

In all but one case where the flaps were not set, the takeoff warning system sounded and the crew aborted takeoff. In that one case, three of four flap segments were properly set; the fourth was not and the warning did not sound because the sensor was wired to the other flap on the same wing.

In July 1986, a commercial airliner was taking off from Albuquerque, according to the NASA records, when about halfway down the runway, the copilot flying the plane realized that the flaps had been set incorrectly. He was past the point where he could abort, so he extended the flaps and modified the takeoff speed to compensate. The takeoff was successful.

"The biggest cause of this incident was complacency and routine," the copilot told NASA. He said that he and the captain had been flying the aircraft for six months and had made only one takeoff that demanded the flap setting needed in this instance to compensate for a strong crosswind. "The routine which had developed turned into a very tough habit to break," he said.

Another crew described a trouble-plagued flight from Dallas-Fort Worth to Memphis in December 1985. After the plane had lifted off and the flaps started retracting, as is normal, the flap warning sounded. It could not be suppressed, so the crew members pulled a circuit breaker to turn it off.

What they didn't know was that the same circuit breaker also controlled the alert system telling them when pressure in the passenger cabin dropped unexpectedly. Next thing they knew, the oxygen masks had descended for the use of passengers because the pressurization system was not working properly.

"Should I ever have to pull takeoff warning horn circuit breaker again, you can bet I will watch the cabin altitude like a hawk," the reporting pilot told NASA.

The most distressing report -- one that shows a renegade pilot with no respect for the rules or his own neck -- came from a copilot who said that a captain he regularly flew with "insists on occasionally exploring different performance characteristics of the aircraft on revenue flight . . . in this case zero flap takeoffs."

Reports to NASA are submitted anonymously and sanitized so airlines and crew members cannot be identified. The information may not be used by the FAA or an airline for disciplinary proceedings.

"It was believed that automation was going to remove human error at its source," Weiner noted. "That has turned out to be wrong. One of the effects I'm afraid of is that it has detrivialized human error and has relocated it. {Automation} tunes out small errors and potentiates large ones."