Technology

Book Review

Information Theory and the War on Noise

August 23, 2019

Richard Anuszkiewicz, Temple Series II, Loretta Howard Gallery

With a 1948 research paper, "A Mathematical Theory of Communication," Claude Shannon pretty much single-handedly invented the idea of information. Before then, engineers would design logic circuits, or work to reduce noise in telephone transmission lines, on a case-by-case basis, not realizing their separate efforts could be unified under a single theory, around which an entire body of scientific work would be built.

Shannon was the first to realize that all the various systems for transmitting intelligence — telegraphy, radio, television — had the same fundamental characteristics, no matter the sender, receiver, or content. And this is true not merely in the electronic realm: Shannon was among the first to realize that our genes are carriers of information.

"Building circuits in those days was an art, with all the mess and false starting, and indefinable intuition that 'art' implies," wrote Jimmy Soni and Rob Goodman, in their 2017 biography "A Mind at Play: How Claude Shannon Invented the Information Age." After Shannon, "Designing circuits was no longer an exercise in intuition, it was a science of equations and shortcut rules."

And, fortuitously, his work proved instrumental to the then-emerging science of building computational machines.


Density of Intelligence

At 13, Claude Shannon won first place in a Boy Scout contest for Morse code signaling using only the body. He "had something of the machine" in him, the biographers wrote. His time as a student at the Massachusetts Institute of Technology seemed to cement this tendency. There, "math and engineering were an extension of the metal shop, and the wood shop," the biographers wrote. "A man learns to use the calculus as he learns to use the chisel or the file."

Given a pipe wrench, an engineer should be able to write the words that would recreate that wrench exactly; and given those words, an engineer should be able to build that pipe wrench correctly. That was the introduction to engineering offered by computer science pioneer Vannevar Bush, under whom Shannon did his graduate work.

In Bush's time, early computing machines were all single-purpose devices designed to answer one specific question. They were "barebones miniatures of the processes they described." Bush built computers that embodied calculus: differential analyzers that could tackle problems too complex for humans, such as modeling the earth's magnetic fields. What he created "was not so much a single machine as a huge array of machines in outline, to be rebuilt for every problem, and broken down at every solution," the authors noted. They were not quite computers, but they were heading in that direction.

Shannon's theory of communication came out of his defense work on cryptography during World War II, when he had resettled at Bell Labs. He concerned himself with the challenge of electronic communication and its constant battle against noise. A telephone call could be hampered by failing insulation. Radio transmissions could be plagued by static.

Plus, the telephone company had a vested interest in getting more out of its lines. Telephone frequencies ran from about 200 to 3,200 Hz, giving a voice channel a bandwidth of roughly 3,000 Hz. What could be done in this range?
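
The review leaves the question hanging, but Shannon's 1948 paper supplies the quantitative answer: the capacity of a noisy channel grows with both its bandwidth and its signal-to-noise ratio. In the standard notation (mine, not the book's),

    C = B \log_2\left(1 + \frac{S}{N}\right)

where C is the capacity in bits per second, B is the bandwidth in Hz, and S/N is the ratio of signal power to noise power. Assuming a reasonably clean line with a signal-to-noise ratio near 1,000 (about 30 dB), a 3,000 Hz channel could carry on the order of 30,000 bits per second, roughly where dial-up modems eventually topped out.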

The challenge of communication is reproducing at one point, with as much fidelity as possible, a message that was sent from somewhere else. The more information that is conveyed, the less uncertainty we have about the subject itself.

Matias Faldbakken

We can calculate the "density of intelligence" of a signal by counting the largest number of discrete values the medium can carry. Meaning can only be shared when we agree on the signals we use beforehand. A symbol gains value as its alternatives are killed off.
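
As a rough sketch of that counting (an illustration in Python, not an example from the book): if a medium can be put into N distinguishable states, then one symbol sent through it can carry at most log2(N) bits.

    import math

    def bits_per_symbol(distinct_values: int) -> float:
        # Maximum information one symbol can carry, in bits, when the medium
        # can be distinguished into this many discrete states.
        return math.log2(distinct_values)

    print(bits_per_symbol(2))   # 1.0: a relay that is either open or closed
    print(bits_per_symbol(32))  # 5.0: enough to pick out any one letter of the alphabet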

This is where the idea of the 'bit' came into being: a unit of information defined as a choice between two alternatives. The more equally likely the two alternatives are, the more surprising, and the more informative, the actual choice is. If we can predict what someone will say, for instance in a graduation speech, it carries that much less information.
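
To put a number on that surprise (a standard illustration, not code from the book): the information in an outcome that occurs with probability p is -log2(p) bits, so a near-certain platitude carries almost nothing.

    import math

    def surprisal_bits(p: float) -> float:
        # Information content, in bits, of an outcome that occurs with probability p.
        return -math.log2(p)

    print(surprisal_bits(0.5))    # 1.0 bit: a fair coin flip
    print(surprisal_bits(0.99))   # ~0.01 bits: the line everyone saw coming
    print(surprisal_bits(0.001))  # ~10 bits: a genuine surprise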

In effect, Shannon taught engineers to think about information in terms of probability. Like a drunk person walking down the street, information is a stochastic beast, neither fully predictable nor fully random. What information really measures, Soni and Goodman write, is the uncertainty we overcome in interpreting the message.
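
Averaging that surprise over everything a source might send gives Shannon's measure of that uncertainty, the entropy. A minimal sketch with made-up symbol frequencies (my example, not the authors'):

    import math

    def entropy_bits(probabilities):
        # Average information per symbol, in bits, for a source whose symbols
        # appear with the given probabilities (assumed to sum to 1).
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A source that nearly always sends the same symbol is highly predictable,
    # and so nearly information-free:
    print(entropy_bits([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits per symbol
    # Four equally likely symbols carry the maximum, 2 bits per symbol:
    print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0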

And all communication works like this, "from waves sent over electrical wires, to the letters agreed upon to symbolize words, to the words agreed upon to symbolize things," the authors write. "The real measure of information is not in the symbols we send. It is in the symbols we could have sent but did not."
