The Nature of Knowledge
This is an excerpt from the book: The Knowing Universe.
Though not often explicitly stated, it is evident from many familiar examples that knowledge is crucial for most forms of existence. This little-noted relationship is best illustrated with specific examples, beginning with one from biology, where there is broad agreement that genetic knowledge brings each generation of organisms into existence. As another example, animals’ neural-based knowledge brings a complex network of ecological interactions into existence. Cultural knowledge, accumulated over many generations, is responsible for existing cultural characteristics, such as rituals and tools. And as a final example, we all know that scientific knowledge brings technologies such as computers, cell phones and spacecraft into existence. We will argue that in all cases, knowledge is essential to existence.
Despite the many familiar examples across all domains of existence, the scientific consensus, as expressed by Wikipedia, does not endorse a universal role for knowledge but instead treats knowledge as a purely human attribute (1):
Knowledge is the primary subject of the field of epistemology, which studies what we know, how we come to know it, and what it means to know something.
This introductory chapter examines the historical basis for a more general view extending knowledge to all things, one in which all forms of existence are due to the playing out of knowledge or, more succinctly, that existing is knowing.
Although we easily accept specific examples demonstrating existence’s reliance on knowledge, this connection is seldom scientifically examined. While science treats information as a universal concept, it neglects the related concept of knowledge outside of a human context.
We are amid an information revolution enhancing many of our cultural processes through computation and providing perhaps the predominant metaphor for our age. This information revolution has progressively reinterpreted scientific understanding during the past seventy years and has expanded to include related concepts such as knowledge and inferential systems. However, the view we develop here, that knowledge is, in general, an essential component of existing entities, is still a relatively novel scientific concept.
Since Shannon introduced the subject (2), information has taken centre stage, but we suggest that information by itself is only a kind of potential for knowledge. Sometimes this is stressed by noting that information is distinct from meaning. Perhaps the first to seriously grapple with this distinction was Charles Sanders Peirce (1839 – 1914), who understood that information, or what he called signs, is meaningless in itself and requires an interpretant to assign meaning. Shannon himself also insisted that information is separate from meaning.
Meaning arises when information or evidence updates an inferential model, sometimes called a generative model. A generative model describes the sources of the information or evidence it receives and uses this evidence to update the model to greater accuracy. It is in this process that information achieves context and meaning. Technically, information is purely a quantitative measure of the probability assigned to a hypothesis and contains no meaning. It is the hypotheses themselves that contain meaning. Information or evidence contributes to meaning only through the degree of support it confers on each of the competing hypotheses.
Hypotheses spell out meaning; the role of information is to select the more probable hypotheses and to weigh the different possible meanings appropriately. In this sense, evidence moves the model towards certainty, and a model accumulates knowledge and specifies meaning to the extent that the evidence moves it towards certainty.
In this manner, model hypotheses and evidence achieve synergy. In the absence of evidence, all model hypotheses are equally likely, and no meaning is discernible. Likewise, there is neither information nor evidence in the absence of hypotheses, and no meaning is discernible. When brought together, evidence assigns a relative weight to hypotheses, and meaning emerges. We may consider the relative weighting of the model’s probability distribution away from the uniform distribution as the emergence of both meaning and knowledge.
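This interplay of hypotheses and evidence can be sketched as a toy Bayesian update; the coin hypotheses and likelihood values below are invented purely for illustration:

```python
# Toy Bayesian update: evidence shifts a uniform prior over hypotheses.
# Weighting the distribution away from uniformity marks the emergence
# of meaning and knowledge in the sense described above.

hypotheses = ["fair coin", "heads-biased coin"]
prior = [0.5, 0.5]        # no evidence: uniform distribution, no meaning discernible

# Likelihood of observing heads under each hypothesis (illustrative values)
p_heads = [0.5, 0.9]

def update(prior, likelihoods):
    """One Bayesian update: posterior is proportional to prior times likelihood."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

posterior = prior
for _ in range(5):        # observe five heads in a row
    posterior = update(posterior, p_heads)

print(posterior)          # weight has shifted toward the biased-coin hypothesis
```

Each observation moves the distribution further from uniform, which in the terms used above is precisely the model accumulating knowledge.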
Restrictive and somewhat anthropomorphic definitions have hampered a proper appreciation of knowledge and its pivotal role in existence. One might have expected that academia, whose raison d'être is the accumulation of knowledge, would have continually updated a cutting-edge definition over its long history. Surprisingly, Plato (424 – 348 BC) provided what is still the consensus philosophic and scientific definition of knowledge:
knowledge is justified true beliefs.
Knowledge was also a central concept for Plato's teacher, Socrates (469 - 399 BC), who went so far as to identify knowledge as the only good (3):
There is only one good, knowledge, and one evil, ignorance.
Socrates and Plato taught that we could distinguish knowledgeable beliefs, hypotheses or forms from ignorant ones using dialectics or a type of rational questioning known as the Socratic method. This form of cooperative argumentative dialogue based on asking and answering questions compares competing ideas in the hope of identifying the best ones. As Wikipedia describes dialectic (4):
a series of questions clarifies a more precise statement of a vague belief, logical consequences of that statement are explored, and a contradiction is discovered. The method is largely destructive, in that false belief is exposed and only constructive in that this exposure may lead to further search for truth.
Complex questions, such as those asked in a Socratic dialogue, may often be broken down into more basic questions having yes/no answers – think of the game of twenty questions. Such questions require one bit of information to answer[1]. Thus, dialectic questioning is amenable to logical calculation, a fact exploited by Plato's student Aristotle to introduce the field of mathematical logic as an aid to the dialectic process, providing a more formal examination of a given belief's logical consequences (5).
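The arithmetic behind twenty questions can be made explicit; a minimal sketch (the function name is ours, not from the text):

```python
import math

# Each yes/no answer carries one bit of information, so n questions can
# distinguish at most 2**n equally likely possibilities.
def questions_needed(n_possibilities):
    """Minimum number of yes/no questions to single out one of n possibilities."""
    return math.ceil(math.log2(n_possibilities))

print(questions_needed(2 ** 20))  # twenty questions suffice for over a million items
```

This is why the game works: twenty well-chosen binary questions carry twenty bits, enough to pick one item out of 2^20 (just over a million).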
All variables in Aristotle's logic are two-valued: true or false. More recently, this system was understood as a limiting form of probability theory, specifically the theorems of Bayesian inference (6), in which the variables, called random variables, may be assigned any of the continuous values between 0 and 1. Bayesian inference subsumes Aristotelian logic as the particular case in which probabilities are constrained to be either 0 or 1, true or false. By processing degrees of probability ranging between 0 and 1, inference extends beyond arguments of true or false, enabling subtle shades of reasoning and providing a continuous approach to certainty, a framework in which knowledge may gradually evolve.
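This limiting relationship is easy to verify for a single connective; a minimal sketch, assuming independent propositions so that the product rule takes its simplest form:

```python
from itertools import product

# In the limiting case where probabilities are restricted to 0 or 1, the
# probability product rule P(A and B) = P(A) * P(B) (for independent A, B)
# reproduces Aristotle's truth table for AND.
truth_table = []
for p_a, p_b in product([0.0, 1.0], repeat=2):
    truth_table.append((bool(p_a), bool(p_b), bool(p_a * p_b)))

for row in truth_table:
    print(row)
```

The same exercise works for the other connectives: constrain the probabilities to the endpoints 0 and 1 and the continuous calculus collapses to binary logic.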
For example, the effect of morning clouds on rain occurring during the day is uncertain. Although observed morning clouds may be assigned probability 1, the clouds’ implication for rain later in the day is uncertain, neither true nor false, and cannot be described with binary Aristotelian logic but is describable with Bayesian probabilities. The Bayesian expression P(rain | clouds), or the probability of rain given clouds, describes this relationship. This conditional probability is now routinely calculated to great accuracy using sophisticated weather forecasting models.
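The conditional probability can be computed with Bayes' theorem; the numbers below are illustrative stand-ins, not meteorological data:

```python
# Bayes' theorem: P(rain | clouds) = P(clouds | rain) * P(rain) / P(clouds)
# All probabilities below are invented for illustration.

p_rain = 0.2                 # prior probability of rain on any given day
p_clouds_given_rain = 0.9    # morning clouds usually precede rain
p_clouds_given_dry = 0.3     # but also occur on many dry days

# Marginal probability of morning clouds (law of total probability)
p_clouds = p_clouds_given_rain * p_rain + p_clouds_given_dry * (1 - p_rain)

p_rain_given_clouds = p_clouds_given_rain * p_rain / p_clouds
print(round(p_rain_given_clouds, 3))
```

With these stand-in numbers, seeing morning clouds roughly doubles the probability of rain relative to the prior, a graded conclusion unavailable to a true/false logic.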
In general, Bayesian inference provides a more nuanced, general and robust framework for evolving knowledgeable beliefs, a process for assigning probabilities that accurately describe a belief's evidence-based plausibility. We find some support, even in its ancient Greek origins, for the notion that the accumulation of knowledge takes place through the process of Bayesian inference, and we develop this line of thought more thoroughly in Part II.
Given its great antiquity, Plato's definition is surprisingly robust, and we offer only a couple of quibbles but ones that may reveal knowledge in a new light. First, we should insist that 'justified true beliefs' means justified by the evidence and that the extent to which beliefs are valid is the extent to which they are justified by the evidence. This slight shift connects knowledge to Bayesian inference, as the Bayesian update relies on pertinent information or evidence. It also connects knowledge with information theory, as information always acts as a form of evidence.
Second, we emphasize the non-anthropomorphic nature of beliefs and suggest that existing entities throughout nature can be said to hold Bayesian beliefs or expectations. For example, organisms’ genomes contain beliefs, make predictions, or have expectations concerning their environments and available actions. Whether we call them beliefs, forms, expectations, predictions or hypotheses (we will most often use hypotheses), these only accumulate knowledge when informed by outcomes. Incorporating these two quibbles, we can suggest a clarification to Plato's definition of knowledge:
Hypotheses contain knowledge to the extent that they are justified by the evidence.
When modern western philosophy finally emerged from the theocratic thought tyranny of the dark ages, dialectics resumed its role as a central philosophical concept. Hegel, Kant, and Marx each offered a reinterpretation of dialectics. Still, its more formal description, mathematical logic, did not evolve much beyond its Aristotelian roots until the middle of the nineteenth century, when it rapidly evolved in the work of George Boole (1815 – 1864), Augustus de Morgan (1806 – 1871), Charles Sanders Peirce (1839 – 1914) and Gottlob Frege (1848 – 1925). In their hands, mathematical logic became more abstract and formal. Although building on the Socratic method, these developments tended to ignore the concept of knowledge, even though Socrates' initial goal for dialectics was to accumulate knowledge.
During the seventeenth century, Western culture began developing science, a new and efficient Bayesian inferential process for accumulating knowledge (6). Still, even here, the concept of knowledge itself was rarely explicitly examined. Only in the twentieth century did researchers begin to pay some attention to this crucial concept. In 1935, the great biologist and statistician R.A. Fisher (1890 – 1962) identified inference as the source of scientific knowledge (7):
Inductive inference is the only process known to us by which essentially new knowledge comes into the world.
Although his observation identifies the crucial role of inference in accumulating knowledge and casts the scientific method as an inferential system, Fisher still regarded knowledge as an exclusively human domain, limiting inferential knowledge accumulation to the scientific method.
Since Fisher's time, some researchers have taken further tentative steps to understand knowledge as one of nature's universal characteristics. Donald T. Campbell (1916 – 1996) introduced the subject of evolutionary epistemology, describing scientific knowledge as evolving through a Darwinian process of (8):
blind variation and selective retention
As Wikipedia explains, he acknowledged the application of this evolutionary paradigm even to forms beyond human knowledge (9):
Campbell added that the same logic of blind variation and selective elimination/retention (BVSR) underlies all knowledge processes, not only scientific ones. Thus, the BVSR mechanism explains creativity, but also the evolution of instinctive knowledge, and of our cognitive abilities in general.
We might note here that as Darwinian processes follow the mathematics of Bayesian inference (10), Campbell's views on knowledge are entirely consistent with the developing definition offered above.
Karl Popper (1902 – 1994), generally regarded as one of the 20th century's most influential philosophers of science (11), also made a compelling argument that knowledge is not restricted to humans but must be a more general natural characteristic because the biological realm also accumulates knowledge. As Wikipedia tells us (11):
For Popper, it is in the interplay between the tentative theories (conjectures) and error elimination (refutation) that scientific knowledge advances toward greater and greater problems; in a process very much akin to the interplay between genetic variation and natural selection.
However, beyond small concessions to biology and our biological roots, few researchers have identified knowledge as a concept applicable beyond an anthropomorphic context. Perhaps the most thorough discussion of knowledge within the recent scientific literature is offered by the physicist David Deutsch (born 1953) (12), a founder of the theory of quantum computation (13) and a philosophical disciple of Karl Popper (14). Indeed, Deutsch has developed a principled interpretation of scientific laws he calls constructor theory, which places knowledge at its core. Deutsch argues that constructor theory can (15):
incorporate into fundamental physics the fact that the most significant quantity affecting whether physical transformations happen or not is knowledge.
He summarizes the importance of knowledge with this remarkable claim (12):
Everything that is not forbidden by the laws of nature is achievable with the right knowledge.
Deutsch puts this compelling and universal claim very nicely. However, like Fisher, he is referring not to knowledge in general but specifically to what is achievable with scientific knowledge. Although he constrains his claim to human knowledge, we will now attempt to demonstrate that it is true in its full generality.
Deutsch raises some arguments for restricting his claim to scientific knowledge. His primary objection to a more general concept is that the evolution of scientific knowledge is unbounded and lays the foundations for creating unbounded technologies, unlike other forms of knowledge found in nature. Following Popper, Deutsch does recognize that knowledge may come in other forms and acknowledges the existence of biological knowledge (12):
Using knowledge to cause automated physical transformations is, in itself, not unique to humans. It is the basic method by which all organisms keep themselves alive: every cell is a chemical factory.
He considers both human and biological knowledge as having cosmic significance as both these forms of knowledge are central to their existing forms, and both accumulate in a similar manner (12):
The two types of information that they respectively evolved to store have a property of cosmic significance in common: once they are physically embodied in a suitable environment, they tend to cause themselves to remain so. Such information - which I call knowledge – is very unlikely to come into existence other than through the error-correcting processes of evolution or thought.
However, Deutsch considers human knowledge to be qualitatively different from other forms of knowledge and unique in its universal status as unbounded and at the beginning of infinity (12):
The difference between humans and other species is in what kind of knowledge they can use (explanatory instead of rule-of-thumb) and in how they create it (conjecture and criticism of ideas, rather than the variation and selection of genes). It is precisely those two differences that explain why every other organism can function only in a certain range of environments that are hospitable to it, while humans transform inhospitable environments like the biosphere into support systems for themselves.
We propose that all knowledge has a common origin in inferential systems operating across the various domains of reality: physical, biological, neurological and cultural. Contrary to Deutsch, this means that virtually every natural system employs this same kind of inferential knowledge. Even though knowledge accumulates through the same underlying mechanism in each domain, its implementation has evolved and has become more streamlined over evolutionary time.
For example, although biological and cultural evolution employ inferential systems as their primary form of knowledge creation (10; 16), cultural knowledge accumulates much more rapidly due to recent evolutionary adaptations such as consciousness and science. As the chair of the Nobel chemistry selection committee said in awarding the 2018 Nobel prize to scientists who developed methods for the directed evolution of proteins (17):
In doing so, they have been able to make evolution many thousand times faster, and they have also been able to direct evolution to create proteins with new and useful properties.
In this view, distinctions between knowledge accumulation in the various natural domains are more quantitative than qualitative. For example, Deutsch attempts to distinguish human knowledge from other forms by claiming that only human knowledge can (12):
transform inhospitable environments like the biosphere into support systems for themselves
But this claim is mistaken, as demonstrated by the theory of biological niche construction (18), which describes a common adaptive strategy by which organisms transform inhospitable environments into hospitable ones. There are many familiar examples of this, such as beavers and their dams or spiders and their webs. In general, niche construction refers to processes in which animals adapted to a particular niche transform other niches to include those features to which they are already adapted.
Indeed, it is plausible that even the cellular structure employed by all life is a form of niche construction. In this view, early life transformed inhospitable environments outside of their native rocky pores, where they may have first evolved, by constructing more familiar and hospitable cellular structures to which they were already adapted. While humans undoubtedly have taken this kind of environmental engineering to new limits, it is still the same general process, and the differences are again more quantitative than qualitative.
Deutsch is also mistaken in his claim that cultural and biological knowledge accumulates through qualitatively different processes (12):
conjecture and criticism of ideas, rather than the variation and selection of genes.
Numerous researchers have noted that cultural (19; 8; 20; 21) and biological (22; 23) knowledge accumulate through a common Darwinian process. Further, the mathematics describing Darwinian processes is the mathematics of Bayesian inference (10; 24). Thus, we may conclude that biological and cultural knowledge (and all other forms) accumulate through essentially the same mechanism of inferential systems.
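The formal identity between Darwinian selection and Bayesian updating can be checked directly; the genotype frequencies and fitness values below are invented for illustration:

```python
# The discrete replicator equation x_i' = x_i * f_i / mean_fitness is
# term-by-term the same computation as a Bayesian update in which current
# frequencies play the role of the prior and fitnesses the role of the
# likelihoods. Values below are illustrative only.

freqs = [0.25, 0.25, 0.25, 0.25]   # genotype frequencies (the "prior")
fitness = [1.0, 1.2, 0.8, 1.1]     # relative fitnesses (the "likelihoods")

def bayes_update(prior, likelihood):
    """Posterior proportional to prior times likelihood, then normalized."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

def replicator_step(x, f):
    """One generation of selection: frequency times fitness over mean fitness."""
    mean_f = sum(xi * fi for xi, fi in zip(x, f))
    return [xi * fi / mean_f for xi, fi in zip(x, f)]

print(bayes_update(freqs, fitness))
print(replicator_step(freqs, fitness))  # the two results coincide
```

The two functions perform the same arithmetic under different names, which is the sense in which selection implements inference.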
Deutsch's constructor theory has its roots in the work of John Von Neumann (1903 – 1957), often considered the greatest mathematician of his era (25), who claimed that existence requires practical rather than abstract knowledge, a knowledge that is self-creating and self-sustaining (autopoietic). Von Neumann began an investigation into the forms of knowledge that could cause entities to exist, knowledge embodied in what he called universal constructors. His research produced some notable results: the knowledge used by universal constructors to produce existing entities must be a two-step algorithmic knowledge. The first step is to copy the algorithmic knowledge itself, and the second is to execute the algorithm, constructing an additional copy of the existing entity (15).
Deutsch has developed Von Neumann's program by observing the additional fact that this knowledge not only creates entities but also maintains these entities' existence (12):
once they are physically embodied in a suitable environment, they tend to cause themselves to remain so.
He hints at mechanisms used to create this algorithmic and autopoietic (self-creating and self-maintaining) knowledge. He claims this knowledge:
is very unlikely to come into existence other than through the error-correcting processes of evolution or thought.
In Part II, our examination of the nature of knowledge fundamental to existence suggests that inferential systems are the single process capable of producing knowledge and that both examples provided by Deutsch (evolution and thought) operate in this manner.
Given the arguments above, we may take Deutsch's claim literally as a universal statement of the relationship between knowledge and existence:
Everything that is not forbidden by the laws of nature is achievable with the right knowledge.
In other words, the laws of nature form a bound beyond which existence may not go – nothing can exist contrary to the laws of nature. But not everything that nature allows yet exists – existence depends on knowledge, and knowledge evolves and accumulates through lengthy evidence-based processes. In the beginning, at the time of the big bang, it is thought that very little existed other than a universal and homogeneous high-energy state. At that time, there was little knowledge for existence other than the laws of nature themselves, but the required knowledge did begin to evolve. Now we are at an intermediate stage where the realms of cosmology, quantum physics, biology, neural-based behaviour and culture have evolved knowledge for a wide range of existing forms but are still a long way from exhausting the knowledge required to produce all possible existing forms allowed by the laws of nature.
Having established its universal applicability, we now probe and clarify Deutsch’s claim and attempt to demonstrate that it is entirely consistent with the definition of existence presented at the beginning of this introduction.
First, we note that a phrase in Deutsch's claim, 'not forbidden by the laws of nature', is redundant, as we accept that things not allowed by the laws of nature do not occur. With this assumption, we may delete this phrase from the claim without loss of meaning. Second, the word 'achievable' is ambiguous and perhaps anthropomorphic; we clarify its meaning by substituting 'may be brought into existence'. Lastly, we accept Deutsch's claim as true of existence in general; things do not come into existence other than by using the right knowledge. For example, scientific knowledge brings technologies into existence. As another example, no life form comes into existence other than through genetic knowledge. Later we discuss in detail why this general statement also applies to domains of reality in addition to biology and culture.
Starting with Deutsch's claim and applying these three plausible clarifications, we arrive at an extension:
The right knowledge brings everything that exists into existence.
A subsequent clarification is to specify what is meant by ‘right knowledge’; what kind of knowledge do we mean? In the tradition of Von Neumann and Deutsch, we conclude that the knowledge capable of achieving existence is algorithmic and autopoietic and use this to again revise the claim:
Algorithmic and autopoietic knowledge brings everything that exists into existence.
At this point, we might ask, what is the source of this algorithmic autopoietic knowledge? The short answer is that this knowledge for existence is the product of inferential systems continually updating their knowledge with the evidence generated in their attempts to exist. For example, in the domain of life, genetic knowledge accumulates through the inferential system called natural selection that updates the genome with the evidence generated by organisms' experience in their attempt to exist. We now add this observation to our claim:
Algorithmic and autopoietic knowledge, generated by inferential systems, brings everything that exists into existence.
Our next move in developing this claim is to shift its subject from knowledge to existence:
Existence is achieved through the algorithmic and autopoietic knowledge generated by inferential systems.
A little further clarification is required to explain the process by which knowledge achieves existence more clearly. In Part II, we develop an understanding of inferential systems that includes updating and accumulating knowledge and turning it into existing forms. Existence is a dynamic process of accumulating knowledge for existence and testing it pragmatically and experimentally, creating existing forms. Inferential systems are two-step cyclical processes where knowledge creates existence, and existence creates knowledge.
Another crucial point concerning the relationship between knowledge and existence is that knowledge controls the system's actions to cause existence. Given the appropriate circumstances, knowledge initiates a causal cascade creating the existing entity from more fundamental forms. With these final clarifications, we arrive, once again, at our previous definition of existence:
Existence is an inferential system in which a generative model containing algorithmic autopoietic knowledge, accumulated by that inferential system, exercises direct causal control over more fundamental forms of matter and energy to create and regulate existing forms.
This brief history of knowledge attempts to connect with the radical view that all existence is due to the expression of knowledge within inferential systems. In this view, existence takes the form of a two-step inferential system. In the first step, algorithmic and autopoietic knowledge or models hypothesize and construct generalized phenotypes. In the second step, evidence of the phenotype’s existence updates the model’s knowledge for existence.
Our introduction ends with a reiteration of the definition of existence offered at its beginning. The remainder of this book consists of clarifications, examples and arguments in its defence.
References
1. Wikipedia. Knowledge. Wikipedia. [Online] [Cited: August 20, 2020.] https://en.wikipedia.org/wiki/Knowledge.
2. A mathematical theory of communication. Shannon, Claude. 1948, Bell System Technical Journal.
3. Laertius, Diogenes. Lives of eminent philosophers. [trans.] Robert Drew Hicks. London : W. Heinemann, 1925.
4. Wikipedia. Dialectic. Wikipedia. [Online] [Cited: September 22, 2018.] https://en.wikipedia.org/wiki/Dialectic.
5. Smith, Robin. Aristotle's Logic. Stanford Encyclopedia of Philosophy. [Online] First published Sat Mar 18, 2000; substantive revision Fri Feb 17, 2017. [Cited: September 22, 2018.] https://plato.stanford.edu/entries/aristotle-logic/.
6. Jaynes, Edwin T. Probability Theory: The Logic of Science. s.l. : University of Cambridge Press, 2003.
7. Fisher, R. A. The Design of Experiments. 9. New York : Macmillan, 1937 (1971). ISBN: 0-02-844690-9.
8. Campbell, Donald T. Evolutionary Epistemology. [book auth.] P. A. Schilpp. In The philosophy of Karl R. Popper. s.l. : Open Court., 1974, pp. 412-463.
9. Wikipedia. Donald T. Campbell. Wikipedia. [Online] [Cited: September 8, 2018.] https://en.wikipedia.org/wiki/Donald_T._Campbell.
10. Universal Darwinism as a process of Bayesian inference. Campbell, John O. s.l. : Front. Syst. Neurosci., 2016, System Neuroscience. doi: 10.3389/fnsys.2016.00049.
11. Wikipedia. Karl Popper. Wikipedia. [Online] [Cited: August 27, 2018.] https://en.wikipedia.org/wiki/Karl_Popper.
12. Deutsch, David. The beginning of infinity. s.l. : Allen Lane (UK); Viking Press (US), 2011.
13. Quantum theory, the Church-Turing principle and the universal quantum computer. Deutsch, David. s.l. : Proceedings of the Royal Society of London, 1985, Vol. A 400, pp. 97 - 117.
14. Deutsch, David. The Fabric of Reality. London : Penguin Books, 1997.
15. Constructor Theory. Deutsch, David. s.l. : Synthese, 2013, Vol. 190, No. 18, p. 4331.
16. How Darwinian is cultural evolution? Claidière, N., Scott-Phillips, T. C., Sperber, D. s.l. : Philos Trans R Soc Lond B Biol Sci., 2014, Vol. 369(1642):20130368. doi:10.1098/rstb.2013.0368.
17. Rennie, John. Three Biochemists Win Chemistry Nobel for Directing Evolution. Quanta Magazine. [Online] October 3, 2018. https://www.quantamagazine.org/frances-arnold-george-smith-and-gregory-winter-win-chemistry-nobel-for-directing-evolution-20181003/.
18. A variational approach to niche construction. Constant, Axel, et al. s.l. : Journal of the Royal Society Interface, 2018.
19. Blind variation and selective retention in creative thought as in other knowledge processes. Campbell, Donald T. 1960, Psychological Review, Vol. 67, pp. 380 - 400.
20. Popper, Karl. Objective Knowledge. s.l. : Clarendon Press, 1972.
21. A framework for the unification of the behavioral sciences. Gintis, Herbert. 2007, Behavioral and Brain Sciences.
22. Darwin, Charles. The Origin of Species. Sixth edition. New York : The New American Library, 1958 (1872). pp. 391 - 392.
23. Dawkins, Richard. The Selfish Gene. s.l. : Oxford University Press, 1976.
24. Simple unity among the fundamental equations of science. Frank, Steven A. s.l. : arXiv preprint, 2019.
25. Wikipedia. John Von Neumann. Wikipedia. [Online] [Cited: August 27, 2018.] https://en.wikipedia.org/wiki/John_von_Neumann.
[1] A bit may be defined as the amount of information contained in the answer to a yes/no question.