- Subject: Information theory, systems theory, cybernetics, linguistics
- Publisher: Simon & Schuster
Grammatical Man: Information, Entropy, Language, and Life is a 1982 book written by the Evening Standard's Washington correspondent, Jeremy Campbell. The book touches on probability, information theory, cybernetics, genetics, and linguistics. It frames and examines existence, from the Big Bang to DNA to human communication to artificial intelligence, in terms of information processes. The text consists of a foreword, twenty-one chapters, and an afterword. It is divided into four parts: Establishing the Theory of Information; Nature as an Information Process; Coding Language, Coding Life; How the Brain Puts It All Together.
Part 1: Establishing the Theory of Information
- The book's first chapter, The Second Law and the Yellow Peril, introduces the concept of entropy and briefly outlines the histories of information theory and cybernetics, examining World War II-era figures such as Claude Shannon and Norbert Wiener.
- The Noise of Heat gives an outline of the history of thermodynamics, focusing on Rudolf Clausius's second law and its relation to order and information.
- In The Demon Possessed, Campbell examines the concept of entropy further, presenting entropy as missing information.
- Chapter Four, A Nest of Subtleties and Traps, takes its name from a critique of one of the earliest theorems in probability theory, the law of large numbers (published by Jacob Bernoulli in 1713). The chapter outlines the history of probability, touching on figures such as Gerolamo Cardano, Antoine Gombaud, Bernoulli, Richard von Mises, and John Maynard Keynes. Campbell examines information and entropy in terms of a probability distribution over possible messages and argues that subjective versus objective interpretations of probability are made largely obsolete by an understanding of the relationship between probability and information.
- Not Too Dull, Not Too Exciting addresses the problem of distinguishing order from disorder in communication by highlighting the role that redundancy plays in information theory.
- In the last chapter of Part 1, The Struggle Against Randomness, Campbell addresses the results published by Shannon in 1948: that a message can be sent from one place to another, even under noisy conditions, and be as free from error as the sender cares to make it, so long as it is coded in the proper form.
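The ideas running through this first part, entropy as a measure of uncertainty, redundancy as unused coding capacity, and Shannon's claim that proper coding can defeat channel noise, can be sketched in a few lines of Python. This is a toy illustration with invented function names, not code or notation from the book:

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, from empirical symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def redundancy(message: str) -> float:
    """Fraction of the maximum possible entropy left unused by the message."""
    h_max = math.log2(len(set(message)))  # entropy if all symbols were equally likely
    return 1 - shannon_entropy(message) / h_max if h_max > 0 else 0.0

# A three-fold repetition code: the crudest example of Shannon-style coding.
# Each bit is sent three times; the receiver takes a majority vote, so any
# single flipped bit per block is corrected.
def encode(bits, k=3):
    return [b for b in bits for _ in range(k)]

def decode(received, k=3):
    return [int(sum(received[i:i + k]) > k // 2)
            for i in range(0, len(received), k)]
```

The repetition code buys reliability by tripling the message length; Shannon's theorem makes the far stronger claim that suitably designed codes can approach error-free transmission without such a drastic sacrifice of rate, up to the channel's capacity.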
Part 2: Nature as an Information Process
- In Arrows in All Directions, Campbell discusses the potential inverse relation between entropy and novelty, invoking such concepts as Laplace's Superman. He quotes David Layzer:
For Laplace's "intelligence," as for the God of Plato, Galileo and Einstein, the past and future coexist on equal terms, like the two rays into which an arbitrarily chosen point divides a straight line. If the theories I have presented are correct, however, not even the ultimate computer, the universe itself, ever contains enough information to specify completely its own future states. The present moment always contains an element of genuine novelty and the future is never wholly predictable. Because biological processes also generate information and because consciousness enables us to experience those processes directly, the intuitive perception of the world as unfolding in time captures one of the most deep-seated properties of the universe.
- Chapter 8, Chemical Word and Chemical Deed, examines the processes of DNA as information processes. Campbell distinguishes between first-order DNA messages and second-order, or structural, DNA messages (e.g., "how to bake a cake" versus "how to read a recipe"). He relates this distinction to the linguistic principles of Noam Chomsky's Universal Grammar.
- In Jumping the Complexity Barrier, Campbell discusses the concept of emergence and notes that Information Theory, thermodynamics, linguistics, and the theory of evolution make significant use of terms and phrases such as "complexity," "novelty," and "constraints on possibilities." Campbell writes:
To understand complex systems, such as a large computer or a living organism, we cannot use ordinary, formal logic, which deals with events that definitely will happen or definitely will not happen. A probabilistic logic is needed, one that makes statements about how likely or unlikely it is that various events will happen.
- Something Rather Subtle
Part 3: Coding Language, Coding Life
- Algorithms and Evolution
- Partly Green Till the Day We Die
- No Need for Ancient Astronauts
- The Clear and the Noisy Messages of Language
- A Mirror of the Mind
Part 4: How the Brain Puts It All Together
- The Brain as Cat on a Hot Tin Roof and Other Fallacies
- The Strategies of Seeing
- The Bottom and Top of Memory
- The Information of Dreams
- The Left and Right of Knowing
- The Second-Theorem Society
Afterword: Aristotle and DNA
See also
- Information Theory
- Claude Shannon
- Norbert Wiener
- Systems Theory
- Noam Chomsky
- Universal Grammar