The Information: A History, A Theory, A Flood

How information became the fundamental currency of the universe

Before there was data, before there was bandwidth, before there was the internet — before any of that, there was information. James Gleick's masterwork traces how we discovered that information isn't just a thing we exchange. It's fundamental to how the universe works.

The Central Insight

Information has a very specific meaning in science, and it’s not what you might think.

It’s not about meaning or truth. It’s about surprise — the reduction of uncertainty. A message contains more information if it’s less predictable.

"Information is the resolution of uncertainty." — Claude Shannon

Shannon’s insight in 1948: you can measure information mathematically. It has units (bits). It obeys laws. It can be quantified, compressed, transmitted, and stored. A bit — a binary digit — is the answer to a single yes/no question. Everything else is built from bits.
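
To make “surprise” concrete, here is a minimal Python sketch (mine, not Gleick’s) that estimates a message’s entropy from its symbol frequencies. A perfectly predictable message carries zero bits per symbol; a fair coin flip carries exactly one.

    # A minimal sketch (not from the book): Shannon entropy of a message,
    # estimated from its symbol frequencies, in bits per symbol.
    import math
    from collections import Counter

    def entropy_bits(message: str) -> float:
        """Average information per symbol: H = -sum(p * log2(p))."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    print(entropy_bits("aaaaaaaa"))  # 0.0 bits: perfectly predictable, no surprise
    print(entropy_bits("abbaabab"))  # 1.0 bit per symbol: one yes/no question each
    print(entropy_bits("abcdefgh"))  # 3.0 bits: eight equally likely symbols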

This changed everything.


The Historical Journey

Gleick takes you through the entire arc of information’s story:

The African Drums

Long-distance communication in Africa used talking drums — not arbitrary codes, but instruments that mimicked the tonal patterns of speech itself. Europeans heard noise; locals heard language.

The drums had to be redundant to overcome the channel’s limitations. This is information theory avant la lettre.

The Telegraph

Suddenly, information could travel faster than physical objects. Messages could leap across continents in seconds.

This required new thinking: How do you encode language efficiently? How do you deal with noise and errors? These aren’t just engineering problems — they’re fundamental questions about communication itself.
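
Gleick’s telegraph chapters circle exactly this question. Morse’s practical answer was to give common letters like E the shortest signals; Huffman coding later made that intuition optimal. Here is a small Python sketch (my illustration, not something from the book), run on Morse’s famous first telegraph message:

    # A sketch of efficient encoding (my illustration): a Huffman code assigns
    # short codewords to frequent symbols and long ones to rare symbols,
    # the optimal version of Morse's intuition.
    import heapq
    from collections import Counter

    def huffman_codes(text):
        """Prefix-free binary code; frequent symbols get short codewords."""
        heap = [[weight, [symbol, ""]] for symbol, weight in Counter(text).items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            lo, hi = heapq.heappop(heap), heapq.heappop(heap)
            for pair in lo[1:]:
                pair[1] = "0" + pair[1]  # prepend left-branch bit
            for pair in hi[1:]:
                pair[1] = "1" + pair[1]  # prepend right-branch bit
            heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
        return {symbol: code for symbol, code in heapq.heappop(heap)[1:]}

    codes = huffman_codes("what hath god wrought")
    for symbol, code in sorted(codes.items(), key=lambda kv: len(kv[1])):
        print(repr(symbol), code)

Frequent symbols come out with the shortest codewords; rare ones pay for it with longer codes.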

Claude Shannon’s Breakthrough

In 1948, Shannon published “A Mathematical Theory of Communication” — one of the most important papers of the 20th century.

He showed:

  • Information can be quantified (entropy measures average surprise)
  • There’s a maximum rate for error-free transmission (channel capacity)
  • Redundancy helps overcome noise (a toy demonstration follows this list)
  • Meaning is irrelevant to the engineering problem
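
To see the redundancy point concretely: in this toy Python sketch (my illustration, not a construction from Shannon’s paper), repeating every bit three times and taking a majority vote cuts a noisy channel’s error rate from about 10% to about 3%; the price is sending three times as much data.

    # A toy repetition code (my example): trade channel capacity for reliability.
    import random

    def noisy_channel(bits, flip_prob=0.1):
        """Flip each bit independently with probability flip_prob."""
        return [b ^ (random.random() < flip_prob) for b in bits]

    def encode(bits):
        return [b for b in bits for _ in range(3)]  # send every bit three times

    def decode(received):
        return [int(sum(received[i:i + 3]) >= 2)  # majority vote per triple
                for i in range(0, len(received), 3)]

    random.seed(0)
    message = [random.randint(0, 1) for _ in range(10_000)]
    raw = noisy_channel(message)  # no redundancy
    fixed = decode(noisy_channel(encode(message)))  # threefold redundancy

    print(sum(a != b for a, b in zip(message, raw)) / len(message))    # ~0.10
    print(sum(a != b for a, b in zip(message, fixed)) / len(message))  # ~0.03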

Shannon's theory works whether you're transmitting Shakespeare or random gibberish. Information theory cares about the structure of messages, not their content.

The Digital Revolution

Once information became mathematical, it became universal. Everything that can be encoded can be transmitted, stored, copied, compressed.

Music becomes numbers. Images become numbers. DNA is a code. The brain processes information.

We live in the information age not just because we have computers, but because we understand that information is substrate-independent. It doesn’t matter if it’s written in ink, stored in silicon, or encoded in RNA.
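
One way to make substrate independence concrete is to watch the same bit pattern survive a change of representation. A tiny Python sketch (my example, not from the book):

    # The same message in three "substrates" (my example): text, bytes, integer.
    text = "information"
    as_bytes = text.encode("utf-8")                  # the message as numbers 0-255
    as_bits = "".join(f"{b:08b}" for b in as_bytes)  # the message as raw bits
    as_int = int(as_bits, 2)                         # the message as one integer

    print(as_bits[:16], "...")                                    # 0110100101101110 ...
    print(as_int.to_bytes(len(as_bytes), "big").decode("utf-8"))  # back to "information"

The container changes; the pattern, and therefore the information, does not.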


Why This Is Deep

The book connects information theory to:

Thermodynamics: Entropy in physics is deeply related to information. Maxwell’s demon, the heat death of the universe, the black hole information paradox — at bottom, they’re all questions about information.

Biology: DNA is a 3-billion-letter code. Evolution is information transfer across generations. Life is information that wants to persist. Schrödinger’s “What Is Life?” predicted this connection nearly a decade before the structure of DNA was discovered!

Computation: Turing showed that computation is about manipulating symbols. Information theory showed that information has physical limits. Together, they birthed the computer age.

Language: Every human language is a compression algorithm — mapping infinite possible thoughts into finite sounds and symbols.
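
A crude way to test that claim: a general-purpose compressor squeezes English prose dramatically, because the language is redundant at every level (letter frequencies, common words, repeated phrases), while random characters give it almost nothing to remove. A quick Python sketch (my example; the exact ratios will vary):

    # Compression as a redundancy meter (my example, not Gleick's).
    import random
    import string
    import zlib

    english = ("before there was data, before there was bandwidth, "
               "there was information. a message contains more information "
               "if it is less predictable. " * 10).encode()
    random.seed(0)
    noise = "".join(random.choice(string.ascii_lowercase + " ")
                    for _ in range(len(english))).encode()

    print(len(zlib.compress(english)) / len(english))  # small: highly predictable
    print(len(zlib.compress(noise)) / len(noise))      # closer to 1: little structure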

The Information Flood

We went from information scarcity to information overload in the blink of a historical eye.

Gleick explores the paradox: we have more information than ever, but finding signal in the noise is harder than ever.

Google, Wikipedia, social media — we’re drowning in data while starving for wisdom.

The book was published in 2011, and the flood has only accelerated. Gleick’s analysis feels prescient.

What Makes This Book Special

The scope: From African drums to quantum computing. From Babbage to Turing to Shannon. Gleick connects everything.

The depth: This isn’t pop science fluff. Gleick takes you deep into information theory, entropy, coding theory — but always with clarity.

The writing: Gleick is a master. Every chapter is meticulously researched, elegantly structured, and genuinely engaging.

The ambition: Most books tell you about a topic. This book changes how you see the world.


The Hard Parts

I won’t lie: this book is dense. 500+ pages covering mathematics, history, biology, computer science, linguistics, and philosophy.

Some chapters require concentration. The sections on entropy and Maxwell’s demon took me multiple reads.

But Gleick never makes things harder than they need to be. He respects your intelligence while explaining clearly.

What Stuck With Me

Information wants to be free: Not as a political slogan, but as a physical fact. Information can be copied perfectly and infinitely. This broke economics, law, and culture.

Life is information: We’re not just made of atoms. We’re made of information encoded in atoms. DNA is software; proteins are machines built from that software.

The universe computes: Every physical process is information processing. Reality itself might be fundamentally computational.

Meaning is separate from information: Shannon deliberately excluded meaning from information theory. The engineering problem is transmission, not interpretation. But meaning is what we care about.

Who Should Read This

Essential for:

  • Anyone interested in information theory, computer science, or AI
  • People curious about how ideas change the world
  • History buffs who want deep technical history
  • Anyone who wants to understand the digital age at a fundamental level

Skip it if you want:

  • Quick, easy reading
  • Practical applications
  • Simple answers
  • Surface-level explanations

The Big Picture

Before Shannon, information was vague — data, facts, knowledge, news. After Shannon, information is precise: measurable, mathematical, fundamental.

We now understand that:

  • Genes carry information
  • Brains process information
  • Computers manipulate information
  • The universe itself might be made of information

"Information is information, not matter or energy. No materialism which does not admit this can survive at the present day." — Norbert Wiener

Gleick shows how this insight emerged, evolved, and reshaped our world. It’s one of the great intellectual adventures of the modern era.

My Takeaway

We’re living in the consequences of Shannon’s breakthrough. Every time you stream music, send a text, search Google, or use GPS — you’re using information theory.

But more than that: understanding information theory changes how you think about language, biology, physics, and meaning itself.

Information isn’t just what computers process. It’s what you’re reading right now. It’s what DNA encodes. It’s what makes the difference between order and chaos.

We are, fundamentally, information processing systems trying to make sense of an information-rich universe.

That’s beautiful. And that’s what Gleick helps you see.


Also recommended: “A Mind at Play” by Soni & Goodman (Shannon biography), and Shannon’s original 1948 paper (surprisingly readable!).

