If information about information is metadata, that's what we have here. As most of us try to drink from a firehose of information that clogs our eyes, ears, and every mailbox, James Gleick (author of "Chaos") helps us step back to view the larger picture from a longer perspective. His history starts with the surprisingly effective transmission of information over distance by African drums, and continues through the Chappe brothers' "telegraph," a method of sending coded messages by a system of pivoting arms mounted on movable beams (a sort of mechanical semaphore).

The scientific study of information seems to have truly blossomed with Claude Shannon's 1948 paper "A Mathematical Theory of Communication," published in the Bell System Technical Journal. A connection with statistical mechanics appears in the identification of information as a kind of negative entropy. It is ironic that Gleick's book about information, which gave us another 544 pages of it, stopped just short of the first political revolution to be organized via social media.
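For readers curious about the mathematics behind Shannon's insight, the entropy at the heart of his 1948 paper can be sketched in a few lines of Python (an illustration of the standard formula, not an example from the book):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin carries less: its outcome is more predictable,
# so each toss resolves less uncertainty.
print(shannon_entropy([0.9, 0.1]))
```

The less predictable a message source is, the higher its entropy and the more information each symbol conveys, which is the sense in which information behaves like the negative of thermodynamic entropy.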