If you think about it for a moment, the word "ansurge" makes perfectly good sense. Temple G. Porter (as quoted in Washington author Paul Dickson's book) tells us that it means the "irresistible urge to answer a ringing telephone, no matter how inconvenient the hour or the circumstances," and Porter should know because he invented the word.

Inventing words is a game anyone can play, though few of us are as adept as Porter or as Lewis Carroll, who has brightened the lives of devotees with "brillig" and "slithy" and actually managed to land "chortle" (a combination of "chuckle" and "snort") in the standard dictionaries. In Washington, the game is likely to produce such monsters as "deinstitutionalization" and "reprioritization," both of which are held up to scorn in "Words" -- a handy and delightful compendium of words, arranged in all sorts of topical groupings, that lives up precisely to the description in its subtitle.

When we start to meditate on the strange words collected by Dickson (including 2,231 synonyms for "drunk"), we may wonder why some words work ("tintinnabulation," for example) while others (such as "obconic") obviously do not. Here we begin to get into Jeremy Campbell's territory: information theory. While Dickson has been busily collecting words to send them marching past the reader in an odd sort of parade, Campbell, also a Washingtonian, has been looking into some of the rules (unwritten rules -- or written by unknown forces in media much stranger than paper) that underlie our use of words and other tools of communication.

We have followed such rules since life began, but our awareness of them became fully scientific only with the development, in the last generation, of information theory. Messages (including rules, directions, structural principles) seem to be planted everywhere. A sea creature evolving into an amphibian; a concertgoer enjoying a Beethoven piano sonata essentially as Beethoven intended despite wrong notes, coughs, the noise of an air conditioner and the rustling of programs; a child instinctively learning the grammatical structures of a language; a radar apparatus tracking an airplane through the random electronic "noise" in the atmosphere; a scientist developing a hypothesis from fragmentary or confused evidence: all are "reading" such messages and are therefore of interest to this new science. All are going beyond elements of confusion in an informative process to decipher what lies below the noise.

Information theory, which deals with ways of detecting and interpreting meaningful structures in phenomena that can be thought of as messages or packages of information, is affecting the philosophy and practice of many scientific disciplines. Its principles extend to such areas as the deciphering of messages encoded in DNA molecules, Noam Chomsky's theories on linguistics, the development of computers and the study of the human brain.

It is developing new answers to many old questions, including hints of a solution to a problem that has puzzled science for generations:

"Why is the world full of the most improbable kinds of order, when the most probable state for the universe is one of pure chaos?"

Scientific gloom about the probability of chaos is grounded in the second law of thermodynamics, formulated in the last century by one Rudolf Clausius, which describes the process of entropy -- the constant, inexorable tendency of matter to lose its dynamic potentials. A gain of entropy occurs, as Campbell explains, "every time heat flows from a higher to a lower temperature, and since nothing interesting or useful happens unless heat does make this descent, all interesting and useful things are accompanied by an irreversible increase in entropy."
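In its standard textbook form (not necessarily the form Campbell gives), the accounting behind that sentence runs as follows: when a quantity of heat Q passes from a hot body at absolute temperature T_hot to a colder one at T_cold, the total entropy of the pair changes by

\[ \Delta S = \frac{Q}{T_{\mathrm{cold}}} - \frac{Q}{T_{\mathrm{hot}}} > 0 \qquad \text{whenever } T_{\mathrm{hot}} > T_{\mathrm{cold}}, \]

so every useful descent of heat is paid for with an irreversible gain in entropy.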

Chaos, at least in scientific terms, is a state of maximum homogeneity; contrasts are what make things happen in nature, including the processes of life itself, and those contrasts are gradually being worn down in nature's countless thermodynamic interactions.

Entropy has impinged on information theory from its origins, which lie in research on radar guidance and communication systems during World War II. Norbert Wiener, whose contribution to information theory came chiefly through his work on radar guidance for antiaircraft weapons, "would walk into a doctoral candidate's room, puffing at a cigar, and say: 'Information is entropy,' " Campbell reports. "Then he would turn around and walk out again without another word." That was in 1947. A year later, Claude Shannon, then an engineer at the Bell Laboratories, published a much more coherent and comprehensive treatment of the subject, the paper that is the cornerstone of the whole science of information theory and one whose formulas seemed to have strange parallels with the formulas of entropy.
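The parallel can be stated concisely in modern textbook notation (a shorthand, not a quotation from either book): Shannon measured the information of a source whose symbols occur with probabilities p_i by

\[ H = -\sum_i p_i \log_2 p_i , \]

while the entropy of statistical thermodynamics, in Gibbs's formulation, is

\[ S = -k_B \sum_i p_i \ln p_i . \]

Apart from the base of the logarithm and Boltzmann's constant k_B, the two expressions are identical, which is the coincidence behind Wiener's cryptic pronouncement.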

These are complex and abstruse fields, but Campbell's survey of them is a model of clarity -- appropriately, considering his subject matter. He traces his subject patiently from a historical point of view, telling how the underlying theories were developed, and he makes frequent and graceful excursions into the bewildering variety of subjects that relate to his main theme. "Grammatical Man" will introduce the nonspecialized reader easily and enjoyably to a strange and fascinating new world of ideas that is certain to have a major impact on the future of mankind.