Investigations of catastrophes often conclude by assigning blame to "human error," and psychologists and psychiatrists have evolved elaborate theories involving death wishes to explain why, for instance, pilots land with wheels up. But psychologist Donald A. Norman thinks the explanation is a lot simpler. Everyone makes "slips" -- doing one thing when intending to do something else -- which are nothing more than minor malfunctions of our internal information processing machinery. Few technological systems allow for this. "Modern systems of all sorts are inconsiderate of human beings," he writes in the April issue of Psychology Today. ". . . In the design of machines, the machine comes first. People are forced to be the servant of the machine. System designs do not pay enough attention to human functioning."

Norman has made a study of slips, which range from trying to get out of the car without undoing the seat belt to retracting the wings while trying to land an F-111. He notes the importance of two things: "side effects" and "forcing functions." Forcing functions are characteristics of systems that tend to correct slips. Try to drive a car with the emergency brake on and the system lets you know right away that something is wrong. But get out and leave your lights on and there is no such correction. What you get is a side effect: a dead battery.

Norman notes that these simple concepts apply equally to more complex systems. "The accident at Three Mile Island was intensified by just such a side effect slip. Two valves had been closed in the auxiliary feedwater system to allow servicing. They were not reopened, and it took the operators eight minutes to discover the fact, a critical eight minutes at the very start of the incident." What is needed, he suggests, is more attention to forcing functions in technical systems.
"This strategy," he admits, "is much easier to propose than to implement, but, clearly, forcing functions are worth far more effort than we have put in so far."