Cultural evolution has increased the complexity of entire cultures-societies as well as of individual meanings. As mentioned earlier, the complexity of a meaning is determined by the minimum number of figurae required to reproduce it. Suppose a meaning s is represented as a string of figurae of length L(s). Then the complexity of s is defined by the length of the shortest program s* that can describe it. The length of the program s* is called the algorithmic entropy of s, or Kolmogorov complexity K(s):
“The key concept of algorithmic information theory is that of the entropy of an individual object, also called the (Kolmogorov) complexity of the object. The intuitive meaning of this concept is the minimum amount of information needed to reconstruct a given object” (Vinogradov et al. 1977-1985, vol. 1, p. 220).
We call the program s* the least or minimal action necessary to reproduce the meaning s. The complexity of the meaning s depends on the length (size) of the minimal action required to reproduce it. For example, the string asdfghjkl can be described only by itself: its length is 9 non-repeating figurae. However, if a string s has a pattern, even a non-obvious one, it can be described by a minimal action s* that is much shorter than s itself. For example, the string afjkafjkafjkafjk can be described by the much shorter instruction “repeat afjk four times.”
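Kolmogorov complexity is uncomputable in general, but a general-purpose compressor gives a computable upper bound on it, which is enough to illustrate the contrast between a patterned and a patternless string. The following is a minimal sketch in Python, with zlib compression standing in for the minimal action s*; the specific strings and alphabet are illustrative assumptions:

```python
import random
import zlib

def description_length(s: str) -> int:
    """Length in bytes of a zlib-compressed encoding of s.

    A computable stand-in for K(s): K itself is uncomputable,
    but any compressor's output is an upper bound on it.
    """
    return len(zlib.compress(s.encode("utf-8")))

# A strongly patterned string: "afjk" repeated many times.
patterned = "afjk" * 400

# A patternless string of the same length: randomly chosen figurae.
rng = random.Random(0)
patternless = "".join(rng.choice("asdfghjkl") for _ in range(1600))

# The pattern collapses into a short description; the random string does not.
print(description_length(patterned))
print(description_length(patternless))
```

On this sketch the patterned string compresses to a small fraction of its raw length, while the random string stays close to it: the size of the shortest available description tracks the presence or absence of a pattern.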
“The distinction between simplicity and complexity raises considerable philosophical difficulties when applied to statements. But there seems to exist a fairly easy and adequate way to measure the degree of complexity of different kinds of abstract patterns. The minimum number of elements of which an instance of the pattern must consist in order to exhibit all the characteristic attributes of the class of patterns in question appears to provide an unambiguous criterion” (Hayek 1988-2022, vol. 15, p. 260).
The complexity of a given meaning is determined by the size of the minimal action necessary to reproduce that meaning. As a product of culture, man himself is also a meaning. In order to be able to transmit more and more cultural experiences, he must become more complex. With each generation, the minimal action required to reproduce man as a cultural being grows, and with it the complexity of learning.
The complexity of a minimal action converges to the entropy of its source, that is, the minimal subject. As we saw above, the complexity of a culture-society is determined by the number of alternative meanings (counterfacts) it can generate. At the same time, the complexity of a culture-society is defined by the size of the minimal action necessary for its reproduction. Thus, the entropy of the culture-society considered as a source of messages (counterfacts) is approximately equal to the average complexity of all possible messages from this source:
“Shannon’s entropy does not make sense for a particular string of bits. Entropy is a property of an information source. There are many possible messages, each with its own probability. Entropy measures the size of that universe of possibilities. In contrast, the algorithmic entropy makes sense for any particular string of bits. The strings themselves can have a higher or lower information content, according to whether they require longer or shorter descriptions. The two entropies are related to each other. For a source that produces binary sequences, the Shannon entropy is approximately the average of the algorithmic entropy, taking an average over all the possible sequences that the source might produce: H ≈ ave(K). Shannon entropy is a way to estimate the algorithmic entropy, on average” (Schumacher 2015, p. 231).
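The relation H ≈ ave(K) can be illustrated numerically. In this sketch (Python again, with zlib standing in for the uncomputable K), the Shannon entropy of a biased binary source is compared with the average compressed length of sequences the source actually emits; a general-purpose compressor is far from optimal, so the two figures are expected to agree only roughly, and the particular bias and sequence length are illustrative assumptions:

```python
import math
import random
import zlib

def shannon_entropy(p: float) -> float:
    """Shannon entropy H, in bits per symbol, of a binary source
    that emits "1" with probability p and "0" otherwise."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def average_description_length(p: float, n: int, samples: int, seed: int = 0) -> float:
    """Average zlib-compressed size in bytes over sample sequences
    from the source: a rough stand-in for ave(K)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(samples):
        bits = "".join("1" if rng.random() < p else "0" for _ in range(n))
        total += len(zlib.compress(bits.encode("ascii")))
    return total / samples

p, n = 0.1, 4000
expected_bytes = n * shannon_entropy(p) / 8        # Shannon's prediction, in bytes
observed_bytes = average_description_length(p, n, samples=20)
print(expected_bytes, observed_bytes)
```

The source-level quantity H predicts the typical size of the shortest description of any one message, which is exactly the sense in which Shannon entropy estimates algorithmic entropy on average.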
In other words, the complexity of meaning, when measured in cultural bits, converges on average to the entropy of the subject, be it culture-society as a whole or an individual taken as a source of (counter)facts. As Protagoras said, “man is the measure of all things: of the things which are, that they are, and of the things which are not, that they are not” (Plato 1997, p. 169). The minimal subject is a measure of the complexity of man, that is, of the unpredictability, uncertainty, randomness and surprise of his actions performed and not performed. The minimal subject is both the source and the product of the minimal action, and together they constitute the minimal meaning.
The historical increase in the complexity of a culture-society is reflected in the increase in both the number of (counter)facts it produces and the average size of the minimal action required to reproduce an individual meaning. The transition from cultural selection based on the alternation of human generations to traditional choice based on the alternation of generations of meanings raised both the entropy of the source of (counter)facts and the complexity of the (counter)facts themselves.
When we apply the achievements of information theory to culture, we must note the difference between the terms “information” and “meaning.” Information is determinateness in general, or certainty; its measure is the reduction of uncertainty. The unit of information is the bit: “1” or “0.” In contrast to information, meaning is directed certainty, an act of change in a certain direction. Examples of directed certainty are the evolution of living beings and the evolution of meanings. Humans process information (certainty) into meaning (mediated, that is, directed certainty) by matching information with needs. The unit of meaning is the cultural bit: not just “1” or “0,” but also “+” or “–.” Meaning is information in human action that reproduces the patterns of the world.