
Showing posts with the label information theory

Information Theory - Part 5: Everything or Overhyped?

The most famous thought experiment in quantum theory, Schrödinger’s cat, raised a highly problematic question: At what size does the weirdness of quantum phenomena give way to the “normal” behavior we observe all the time? Nothing in the maths of quantum theory puts a size limit. Nor does the maths explain what constitutes an “observation” of a quantum entity. Could only living entities make an observation? Or did instruments count too? The accepted answer today is something called decoherence, writes Charles Seife in Decoding the Universe. “Decoherence” refers to any interaction between two items in the universe (light, matter, anything else). Every such interaction constitutes a measurement made by nature. An extraction of (there’s that word again) information. The tinier or colder or more isolated something is, the longer it can stay without interacting with any other piece of nature, i.e., the longer it takes before another part of nature can “measure” it. But ev...

Information Theory - Part 4: Relativity

Moving on, Charles Seife next looks at the theory of relativity in Decoding the Universe. One of the most famous dictums of that theory is that “nothing can go faster than the speed of light”. Except that’s not what the theory says. That statement is an oversimplification: “Some things can go faster than the speed of light. Even light itself can break light speed, in a sense.” Huh? Both those statements have been proven in multiple experiments, and no, they don’t necessarily involve quantum mechanics! Even good old non-quantum experiments have shown those two statements to be true. The exact details of those experiments aren’t relevant to this blog, so I won’t get into them. Regardless, don’t such experiments prove that the speed limit imposed by relativity is being violated? It gets a bit murky, but this is what most scientists say the theory really means: “The true rule is that information can’t travel faster than the speed of light. You cannot take a bit of inform...

Information Theory - Part 3: Life

What other questions does information theory answer, Charles Seife asks and answers in Decoding the Universe. How about that question from philosophy: what is the purpose of life? And the answer increasingly seems to be: “Duplicate your (genetic) information. Sure, the programs go about the task in different ways, but the goal is always the same. Reproduction. All else is decoration – decoration that helps the program to achieve its ultimate goal.” Wonder if that’s true? Isn’t the individual trying to reproduce the entire organism? How can one say that the information within is trying to reproduce (only) itself? Are we stuck with a chicken-and-egg question here, each of us left to our own preference with no way to prove or know? Aha, but the evidence from organisms in nature increasingly aligns with one view: “That the organism reproduces is just a by-product of the information duplicating itself… sometimes.” This view explains why an ant colony has only one fertile organism – the queen. By the co...

Information Theory - Part 2: Maxwell's Demon

How can a theory about information and noise be considered as profound as relativity and quantum theory? Sounds ridiculous, right? So let’s check out the first big science-y problem that information theory solved. The second law of thermodynamics forbids the existence of a perpetual motion engine (that’s a machine that can run forever, producing more work than any work you put into it). Why not, asked James Clerk Maxwell, with his famous thought experiment known as Maxwell’s demon, devised in 1871. But first, a recap of some basics. Heat flows from a hotter to a colder place. As that heat flows, you can use (part of) that heat flow to do work. By definition, this means that once the two sides are at the same temperature, heat stops flowing (neither end is hotter anymore). And without heat flow, no work can be done. We also know that temperature is just a measure of the average speed of molecules. And since those molecules...
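To make the demon concrete, here’s a toy numerical sketch of my own (not from the book, and not real physics, just an illustration): molecule speeds in a box are drawn from a single distribution, and the demon sorts faster-than-average molecules into one chamber and slower ones into the other. A temperature difference appears without any work being done on the gas, which is exactly why the demon seems to threaten the second law.

```python
import random

random.seed(42)

# A gas at uniform temperature: all molecule speeds drawn from one distribution.
# The numbers (500 m/s mean, 10,000 molecules) are arbitrary, for illustration only.
speeds = [random.gauss(500.0, 100.0) for _ in range(10_000)]

mean_speed = sum(speeds) / len(speeds)
print(f"Whole box, average speed: {mean_speed:.0f} m/s")

# The demon watches the door and sorts: fast molecules to the left chamber,
# slow ones to the right, using the box's average speed as its threshold.
left = [v for v in speeds if v >= mean_speed]   # the "hot" side
right = [v for v in speeds if v < mean_speed]   # the "cold" side

print(f"Left chamber average:  {sum(left) / len(left):.0f} m/s (hotter)")
print(f"Right chamber average: {sum(right) / len(right):.0f} m/s (colder)")
```

The catch, as the information-theoretic resolution goes, is that the demon has to acquire and store information about each molecule’s speed to do the sorting, and that bookkeeping is not free.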

Information Theory - Part 1: What is it?

In their biography of Claude Shannon (the founder of Information Theory) titled A Mind at Play, Jimmy Soni and Rob Goodman explain the theory in a layman-friendly way. Of course, it needed a genius to come up with the theory: “Turning art into science would be the hallmark of Shannon’s career.” And what could be more art-sy and non-science-sy than an abstract concept like information? Shannon, building on the works of Harry Nyquist and Ralph Hartley, considered an example. If a biased coin always landed heads, you have no uncertainty about how it lands, and so the information conveyed by announcing its state is zero. But if it is a perfectly unbiased coin, then the information conveyed by announcing how it lands is at its maximum. Ergo, Shannon decided that “information” is a measure of the reduction in uncertainty on the subject. Now Shannon needed a unit for this thing called “information”. He realized the smallest piece of information is a Yes/No answer. Further, all information ...
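The excerpt stops short of the formula, but Shannon’s measure for the coin example is the now-standard binary entropy, H(p) = −p·log₂(p) − (1−p)·log₂(1−p), measured in bits (those smallest Yes/No answers). A minimal sketch of the coin example under that formula:

```python
import math

def coin_entropy(p_heads: float) -> float:
    """Shannon entropy (in bits) of a coin that lands heads with probability p_heads."""
    h = 0.0
    for p in (p_heads, 1.0 - p_heads):
        if p > 0:  # the limit of p*log2(p) as p -> 0 is 0, so zero terms contribute nothing
            h -= p * math.log2(p)
    return h

print(coin_entropy(1.0))  # 0.0 bits: always heads, no uncertainty, announcing it conveys nothing
print(coin_entropy(0.5))  # 1.0 bit: fair coin, maximum uncertainty for a single toss
print(coin_entropy(0.9))  # ~0.47 bits: a biased coin sits somewhere in between
```

The fair coin gives exactly 1 bit per toss, which is why the Yes/No answer makes a natural unit: learning the outcome removes all the uncertainty there was.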