This month's Frame: Information theory
A framework that helps us see the value of just enough information.
That we live in an information-rich world is a truism. It’s also true. We inhabit increasingly high-fidelity communication systems that encourage us to both consume and produce rich information. Whether through images, video, avatars, emojis or audio, our one-to-one and one-to-many communications are growing ever richer in content. Is it time to think about how we can be more judicious in the choices we make about information richness?
Richness in communication can be traced to the tendency for technology to support ever greater degrees of verisimilitude. Consider the trajectory from telegraph to telephone to video phone towards VR. Simple text communication has evolved over the span of 150 years into rich, immersive communication worlds. And while it’s the case that non-rich communication tools occupy an important place in our everyday lives—texting being a familiar example—the overall direction of travel is towards ever richer content.
This evolution is in many ways the result of “technology push”. Increases in bandwidth enable more information to be transmitted, leading to more information density: what else to do with fatter pipes but fill them with more information? And while there’s some “society pull” here too—it’s nice to be able to video call your grandma—it’s hard not to conclude that we are surrounded by noisy, over-rich information environments which suck our attention, or leave us Zoom fatigued. Is that why a Huddle on Slack or a plain old telephone conversation often feels like a relief from oversaturated video calls, and more productive to boot?
Is there a way for us to better understand what we mean by richness? And if there is, does it help us consider more carefully where richness is not always a virtue (or even a necessity)? This is important not just because redundant information has the potential to overload us. Reducing information in communication can open up interpretive space and opportunities for meaning making.
Claude Shannon, the father of the Information Age, provided a means to think about communication and information in a new light. He helps us ask about the quantity and quality of information in communication.
The framework
Claude Shannon’s 1948 paper, “A Mathematical Theory of Communication”, began by positing a model of communication. You speak into a phone (source); the phone encodes sound waves into an electric signal (transmitter). The signal passes along a wire (channel), where other signals interfere with it (noise), before being decoded back into sound (receiver) and reaching your ear (destination). This model is universal and applies to all communication, of any sort, across any medium: from smoke signals and text messaging to the Metaverse. That was his first stroke of genius. His second was to recognise the probabilistic nature of information.
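Shannon’s pipeline can be sketched as a toy program. The `transmit` function below is our own illustrative helper, not anything from Shannon’s paper: it encodes a message to bits, randomly flips a few in the “channel”, and decodes what survives.

```python
import random

def transmit(message, flip_prob=0.01, seed=0):
    """A toy version of Shannon's model: source -> transmitter -> noisy
    channel -> receiver -> destination, with noise as random bit flips."""
    rng = random.Random(seed)
    # Transmitter: encode the message into a stream of bits.
    bits = [int(b) for byte in message.encode() for b in f"{byte:08b}"]
    # Channel: noise flips each bit with probability flip_prob.
    noisy = [b ^ (rng.random() < flip_prob) for b in bits]
    # Receiver: decode the (possibly corrupted) bits back into text.
    chars = bytes(int("".join(map(str, noisy[i:i + 8])), 2)
                  for i in range(0, len(noisy), 8))
    return chars.decode(errors="replace")

print(transmit("hello, grandma", flip_prob=0.02))  # a slightly garbled greeting
```

With the noise turned off (`flip_prob=0.0`) the message arrives intact; turn it up and the destination receives something increasingly corrupted.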
Toss a coin and it can give you one of two pieces of information: a head or a tail, a yes or a no, a zero or a one. A bit, as Shannon termed it, is the amount of information that results from choosing between two equally likely options. 0 or 1?
But not all coins are fair—flip a coin with two heads and it tells you nothing you don’t already know. The information it provides resolves no uncertainty. Take a book. How much it can tell us about a topic reflects the degree to which it reduces uncertainty about that subject.
So for Shannon the measure of information is the degree to which it reduces uncertainty, or its ability to tell us something new. The messages that resolve the greatest amount of uncertainty are the richest in information. He took this insight and demonstrated how it is possible to remove information from a message without making its meaning uncertain.
Thinking about language (and the symbols it is made of), Shannon observed its probabilistic, or more accurately stochastic, nature. It is neither entirely unpredictable nor fully determined. See one letter and the one following it can be guessed at: a “t” followed by an “h” is relatively common, and a “q” is rarely seen without a “u”. The same is true of combinations of words.
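That predictability is easy to observe by counting letter pairs. The sample sentence below is contrived to make the point visible in a few lines:

```python
from collections import Counter

sample = "the theory that the thing they thought through was thorough"
letters = sample.replace(" ", "")

# Count every adjacent pair of letters (bigrams).
bigrams = Counter(letters[i:i + 2] for i in range(len(letters) - 1))

print(bigrams.most_common(3))  # "th" tops the list by a wide margin
```

On real English text the effect is the same, just less exaggerated: a handful of pairs like “th”, “he” and “in” account for a disproportionate share of all bigrams.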
For an information theorist, language is predictable, and a consequence of that predictability is redundancy, in letters and words alike. As Shannon observed: “mst ppl hv n dffclty rdng ths sntnce”.
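The vowel-stripping trick can be reproduced in one line (a playful illustration of the redundancy argument, not Shannon’s actual procedure):

```python
def strip_vowels(sentence):
    """Drop the vowels: most of the sentence's meaning survives without them."""
    return "".join(c for c in sentence if c.lower() not in "aeiou")

print(strip_vowels("most people have no difficulty reading this sentence"))
# mst ppl hv n dffclty rdng ths sntnc
```

Roughly a third of the characters are gone, yet the sentence is still readable, because English spelling carries more information than a reader strictly needs.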
So what Shannon was able to conclude was that we don’t need all the information in a message to reduce uncertainty. On the contrary, we can strip information out and still convey what needs to be communicated. This insight enabled the (digital) information age. Compression removes redundant information: an .mp3 file has less fidelity than the uncompressed original, but the meaning of the music is still intact. We can get by without the redundancy.
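The link between redundancy and compression is easy to demonstrate with a general-purpose compressor. Repetitive text, which carries little new information per byte, collapses; patternless bytes, where every byte is a surprise, barely shrink:

```python
import os
import zlib

redundant = b"the theory " * 100   # 1,100 bytes of highly repetitive text
random_bytes = os.urandom(1100)    # 1,100 bytes with no pattern to exploit

# Compression works by stripping out redundancy.
print(len(zlib.compress(redundant)))     # a few dozen bytes
print(len(zlib.compress(random_bytes)))  # still roughly 1,100 bytes
```

In Shannon’s terms, the random bytes are already at maximum entropy: there is no redundancy left for the compressor to remove.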
Using the framework
If we can cope with some informational uncertainty, then it follows that we can communicate without providing all the information we think we need.
Shannon’s theory helps us see why leaving information out might be a positive virtue: it opens interpretive space between the people communicating. The uncertainty in acts of communication, often intentional, sometimes implicit, becomes a space for interpretation and meaning making.
Humans are good at dealing with incomplete information. You could go further and say that acts of human communication and cultural production are about either filling the gaps in the information presented to us, or making interpretive leaps in its absence. In other words, we are highly capable of, and indeed thrive on, bringing useful noise to the communication party.
What this means is that not everything we communicate needs to be fully specified. And yet the irony is that Shannon’s work, which enables vast amounts of information to be stored and transmitted in zeros and ones, has also enabled rich—or, in his view, highly redundant—information exchange between people.
How might we use a framework that allows us to see that redundant communication has the potential to overload?
One answer is for us to throttle (or at least question) our desire to build tools that produce and share ever-richer information. Instead, we might ask: what gaps can we leave in our tools to allow for more acts of meaning making? The mathematically demonstrated fact of informational redundancy should lead us to question informational richness. And that insight is an open invitation to the designers of communication platforms to make more considered choices about how to avoid unnecessary information overload.
News
We're celebrating a year in Frames!
Today's Frame is our 13th monthly issue. Thank you for subscribing, sharing and giving us your feedback. Have you got a favourite framework you'd like us to cover? Let us know in the comments.