In the middle of the twentieth century something happened to the meaning of “meaning.” Until then, in a tradition reaching back at least to Aristotle, meaning had been associated with concepts, definitions, and language—and so associated strongly with the human animals who hold concepts, define things, and speak. But now it came to be connected to a term, information, that was sponsoring revolutions in areas from computation to biology. Meaning was ushered into the Information Age.
There were two consequences of this new connection. First, it pointed to the limitation of the old, human-centered tradition of thinking about meaning. It asked: what if meaning is broader, even vastly broader, than concepts, definitions, and words? This was an exciting development that opened meaning out on wider-than-human horizons.
At the same time, however, the association raised a confusion that has lingered ever since, suggesting to many that information and meaning are the same. But in fact they differ; meaningful information is one kind of information, but there is another kind that conveys no meaning at all.
The idea of information without meaning is challenging and counterintuitive, but it’s mooted clearly enough in our hard-drive dictionaries:
1. facts provided or learned about something or someone
2. what is conveyed or represented by a particular arrangement or sequence of things
The first of these definitions describes our common-usage notion of information, filled with meaning. But the second alludes to information measured as the correspondence between a sequence of things at one time or place and the sequence transmitted to and reproduced at another. This is the information of the age of computation, the information we can quantify as bits and bytes on our laptops, and in the same way we can also quantify the information processed by living things as they transmit genetic sequences to drive their life processes and reproduce their kinds. This sort of information bears no meaning but is instead a measure of quantity of correspondence, correspondence that then drives processes such as the algorithms on our computers or the cellular production of protein molecules.
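The notion of information as sheer correspondence can be made concrete in a few lines of code. The sketch below is illustrative only (the sequences and function name are invented for the example, not drawn from the essay): it measures how faithfully one sequence of symbols is reproduced at another time or place, which is all this second kind of information amounts to.

```python
# Correspondence information: a count of positions at which a reproduced
# sequence matches the sequence that was transmitted. Nothing here refers
# to anything; there is no "aboutness," only a quantity of agreement.

def matching_positions(sent: str, received: str) -> int:
    """Count the positions where the received sequence reproduces the sent one."""
    return sum(1 for a, b in zip(sent, received) if a == b)

sent = "0110100111"
received = "0110100011"  # one symbol corrupted in transmission

print(matching_positions(sent, received))  # 9 of 10 positions correspond
print(matching_positions(sent, sent))      # perfect correspondence: 10
```

The point of the sketch is its poverty: the function can score the fidelity of any transmission, genetic or digital, yet nothing in it connects the sequence to anything beyond itself.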
Missing from this second kind of information is an aspect basic to the first: reference or “aboutness.” Meaning, even in its beyond-human applications, connotes what is intended or referred to by an action, situation, or word. To convey meaning, those actions, gestures, and so forth must connect themselves to something else, referring to it and coming to be about it. This aboutness has always been basic to meaning, but it occurs only in the first of our two kinds of information.
The fundamental unit of aboutness is not a quantity of correspondence, as in bytes and bits, but a quality of relation. It occurs in signs, since signs are always signs of something other, creating a relation of reference or aboutness to that other. The role of a sequence of 0s and 1s in an algorithm enables a process but refers to nothing; a deer crossing sign on a highway refers to a leaping deer entirely separate from the sign itself. Both are informational, but only the second is meaningful.
The distinction of meaningful and meaningless types of information raises fundamental questions. How does referential information, not mere correspondence information, arise? What kinds of organisms produce it, and through what processes? How did these processes evolve?
Among the processes needed to create meaning, memory is crucial, but not merely the simplest forms of storage and retrieval, arguably found in all organisms. Instead something much rarer is necessary: an episodic memory, a process in which memories occur as parts of whole situations and enable some kind of reliving of the situations. The best evidence suggests that a good number of animals aside from humans form episodic memories, especially certain birds and mammals; but the vast majority of animals, and all other organisms, do not.
Learning is also central, but learning of a sort arising from the situations of episodic memory. The situational learning of a song sparrow, able to gauge the varying challenges from the differing songs of several neighbors, is an exemplary case of nonhuman meaning making and depends on the retrieval and processing of earlier episodes from the bird’s life. Simpler kinds of learning occur in many other organisms as non-meaningful correspondence, a function of neural networks involving excitation and inhibition connected with innate reward signals. A well-studied example is the sucrose sensitivity of honeybees, a process forming associations concerning food sources and much else.
Attention, finally, is related to the salience of a stimulus—the way an organism singles out a particular stimulus from all those coming to it. The sensitivity of honeybees to sucrose again provides a good example. But attention, like learning and memory, comes in degrees of complexity and differing kinds, and for meaning making a complex attentional process is required that can analyze incoming stimuli, taking them apart into their several aspects and focusing on them at both comprehensive and partial levels. This part-to-whole complexity reveals the similarity of complex attention to episodic memory and situational learning, also made up of parts and wholes.
Episodic memory, advanced learning, and analytic attention: these foundations for meaning making are not discrete properties of certain animals but developing processes, shaped in those animals’ lifeways through their engagements with the world. They form important aspects of these animals’ niche construction—the web of processes by which all organisms exploit certain aspects of their surroundings, and not others, to enable their thriving. So we can distinguish niche construction involving correspondence information alone from niche construction involving both correspondence and meaningful information. And, at the far horizon of the second kind, we can glimpse the possibility of animal cultures—cultural niche construction replete with meanings.
From the broadest vantage, then, distinct types of informational complexity drive the lifeways all around us. On the one side, the correspondence kind of information underlies all life processes. All life forms, even the simplest, are information processors of awesome complexity. On the other side lives a far smaller group of organisms that, while relying on the foundation of correspondence information, erects on it a superstructure of information managed so as to make perceived things refer to other things and thus to create aboutness.
Today evolutionary biologists are drawn to listing “major transitions,” large changes in the nature of living things that form leaps in the history of life, such as the advent of multicellular organisms in a single-celled world. It is clear that room needs to be made in their lists for the invention of meaning, a novel way of managing information and an imposing solution to the hard problem of life.
Gary Tomlinson is Sterling Professor of Music and Humanities at Yale University and the author of books on music, culture, and evolution, including A Million Years of Music: The Emergence of Human Modernity.