Use and Abuse of General Criterion for Discrimination of Causal Correlations from Individual Peculiarities and Provisional Correlations

Abstract

A fundamentally novel approach is put forward to the question of whether there exists a general criterion for the autonomous discrimination of causal correlations from individual peculiarities and provisional correlations in stable complex systems. It is grounded in a decomposition theorem recently proven by the author, whose subject does not overlap with that of the Central Limit Theorem. The fundamental advantage of that criterion lies in its insensitivity to the details of the underlying dynamics and of the hierarchical structure, regardless of the nature of the corresponding system. It holds in an unspecified, ever-changing environment, and it holds when information is incomplete and/or uncertain. Another advantage of the criterion is its ability to forecast a change in a system. Its limitation is substantiated as a ban on predicting whether that change would develop into adaptation or into destruction. It is worth noting that, while the criterion itself holds in the frame of the recently proposed theory of boundedness, the ban on predicting the nature of a change is model-free.

Citation:

Koleva, M. (2020) Use and Abuse of General Criterion for Discrimination of Causal Correlations from Individual Peculiarities and Provisional Correlations. Journal of Modern Physics, 11, 767-778. doi: 10.4236/jmp.2020.116049.

1. Introduction

Complex systems, viewed as a single subject, are a relatively new topic of interdisciplinary science. The field encompasses a huge variety of systems: social, biological, climate, and ecological systems, among others. They all pose the same question: how do the parts of a system give rise to collective behavior, and how does the system interact with its environment? Thus, social systems are constituted by people, brains are constituted by neurons, and weather consists of flows of atoms and molecules. The scope of this study encompasses all fields of traditional science as well as engineering, medicine, and management.

Such a huge variety of systems and behaviors suggests that their common behavioral properties must appear as a general protocol rather than as a law and/or rule. Recall that a law, in its traditional meaning, implies a specific relation among the variables characterizing a given phenomenon, a relation which re-occurs on repetition. What is tacitly presupposed, however, is that the conditions must re-occur the same as well; this implies pre-determination of the environment and its permanent constancy. Thus, if it exists, such a general protocol must hold in an ever-changing environment and must be insensitive to the details of the underlying dynamics and structure of any system. Further, any such protocol must be able to distinguish between the common and the individual properties of the species. For example, such a criterion, applied to DNA, must distinguish the common part, shared by all humans, from the part related to the peculiarities of an individual. A crucial requirement for successfully distinguishing between common and individual properties is whether such a general protocol is able to eliminate provisional correlations. A very illustrative example comes from the weather: when does a short-term weather pattern imply climate change, and when is it just a provisional fluctuation?

The major goal of the present paper is to demonstrate that such a protocol does exist and that it is a direct consequence of a basic theorem, called the decomposition theorem, derived in the frame of the theory of boundedness recently put forward by the author [1]. Its fundamental difference from the Central Limit Theorem is discussed below. The decomposition theorem is derived on account of another common property shared by all complex systems, namely their stability. Indeed, we humans live in an ever-changing environment where changes vary from pico-seconds to days and nights, up to several decades for individuals and more than 100,000 years for the kind. The decomposition theorem proves that there is a presentation basis (the Fourier transform of the autocorrelation function) in which the power spectrum of any bounded irregular sequence (BIS) is decomposed into three parts: a specific discrete pattern; a continuous band of universal shape 1/f^α(f), where f is the current component and α(f) = 1 for the first component and monotonically increases with the number of the component; and a third, irrational component, which always lies in the infra-red (farthest to the left in the power spectrum) and which originates from the highly non-trivial interplay between the discrete band and the continuous one. A decisive property of that decomposition is that it is robust to the details of the variations in the corresponding BIS and maintains constant accuracy of separation between the discrete and the continuous band. This property prompts me to call the discrete band a homeostatic pattern. A highly non-trivial consequence of the additivity of the decomposition is that the specific properties of each and every homeostatic pattern are robust to the individual responses and/or the current environmental impact. Another decisive property of that decomposition is that the provisional correlations drop out automatically, since they are eliminated by definition from the autocorrelation function.

Recall that the autocorrelation function is a measure of the average correlation between any two points separated by an interval, which is scanned over values from zero up to the length of the corresponding time series. An exclusive property of boundedness is that it provides correlations of radius equal to the length of the time series, because every “deviation” from the average inevitably “turns back” within a specific yet bounded interval. Further, a major move forward is the assumption that all time scales contribute uniformly, which yields the lack of any special frequency to be singled out. Put in other words, the power spectrum of such a contribution appears as a continuous band whose envelope smoothly approaches the universal shape known as 1/f noise. It is worth noting that the uniform contribution of all time scales is very different from a random contribution, which is characterized by the lack of any systematic correlations; the corresponding power spectrum then fluctuates randomly around a flat envelope, known as “white noise”.
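
For concreteness, here is a minimal numerical sketch (Python/NumPy) of the presentation basis used throughout. The reflected random walk is an illustrative stand-in for a BIS, not data from [3]; by the Wiener-Khinchin relation, the Fourier transform of the autocorrelation function coincides with the squared magnitude of the Fourier transform of the series itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bounded irregular sequence (BIS): a random walk with bounded
# increments, reflected at fixed thresholds so that every excursion
# away from the average inevitably "turns back".
n = 4096
x = np.zeros(n)
for i in range(1, n):
    x[i] = x[i - 1] + rng.uniform(-1.0, 1.0)
    if abs(x[i]) > 30.0:                      # thresholds of stability
        x[i] = np.sign(x[i]) * 60.0 - x[i]    # reflect back inside
x -= x.mean()

# Route 1: periodogram |FFT(x)|^2 / n.
p1 = np.abs(np.fft.fft(x))**2 / n

# Route 2: Fourier transform of the (circular) autocorrelation function.
acf = np.array([np.dot(x, np.roll(x, k)) for k in range(n)]) / n
p2 = np.fft.fft(acf).real

print(np.allclose(p1, p2))   # True: the two presentations coincide
```

Correlations that do not persist across the series average out of the autocorrelation function, which is the sense in which this presentation eliminates provisional correlations.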

At this point a question arises: how and why does a specific discrete band (homeostasis) appear, and why is it superimposed over a band of 1/f shape? A discrete band appears when the functionality of a system is synchronized so as to produce a stable pattern whose adaptability to an ever-changing environment is consistent with the boundedness of rates and amplitudes. Then, as proven in [1], the general condition for avoiding resonances (which is the condition for avoiding “shaking up” a system beyond its thresholds of stability) yields an additive decomposition of the power spectrum into a specific discrete band and a continuous band of shape 1/f^α(f). A property of that decomposition exclusive to stable systems is that both the discrete band and the shape of the continuous band are insensitive to the statistics of the environmental variations. This is easy to check by taking time series of different lengths and partitioning them into sub-series of appropriate length: if no new line(s) appear, the corresponding system is stable.
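
A minimal sketch of that check follows (Python/NumPy; the two-sinusoid series and the median-threshold line detector are illustrative assumptions, not part of the theory):

```python
import numpy as np

def homeostatic_lines(x, factor=1000.0):
    """Spectral bins whose power towers over the continuous band
    (candidate discrete lines), as fractions of the sampling rate."""
    p = np.abs(np.fft.rfft(x - x.mean()))**2
    bins = np.flatnonzero(p > factor * np.median(p))
    return set(np.round(bins / (2 * (len(p) - 1)), 4))

rng = np.random.default_rng(1)
n = 8192
t = np.arange(n)
# Toy stable system: a fixed discrete pattern plus bounded irregular noise.
x = (np.sin(2 * np.pi * 64 * t / n)
     + 0.5 * np.sin(2 * np.pi * 160 * t / n)
     + 0.2 * rng.uniform(-1.0, 1.0, n))

first_half, second_half = np.split(x, 2)
print("no new lines, pattern intact:",
      homeostatic_lines(first_half) == homeostatic_lines(second_half))
```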

In my previous paper [2] I demonstrated that the correlations present in a homeostatic pattern are precisely the causal relations in the corresponding system, while the presence of a smooth continuous band implies that the corresponding individual peculiarities are consistent with the notion of the corresponding kind. Needless to mention, provisional correlations are eliminated. An illustration of why it is the homeostatic pattern that represents the corresponding causal relations is presented in the next section.

In order to make the above statement clear, let me present three raw time series representing temperature variations in the course of time in an experiment on HCOOH oxidation over a Pt catalyst (Figure 1(a), Figure 2(a), Figure 3(a)) and their power spectra (Figure 1(b), Figure 2(b), Figure 3(b), correspondingly). Details can be found in [3]. It is worth noting that the Fourier transform of the autocorrelation function is applied to the raw time series themselves.


Figure 1. (a) Temperature variations (in relative units) in the time course (in relative units) of catalytic oxidation of HCOOH over a Pt catalyst, example 1; (b) power spectrum of the temperature variations presented in (a).


Figure 2. (a) Temperature variations (in relative units) in the time course (in relative units) of catalytic oxidation of HCOOH over a Pt catalyst, example 2; (b) power spectrum of the temperature variations presented in (a).


Figure 3. (a) Temperature variations (in relative units) in the time course (in relative units) of catalytic oxidation of HCOOH over a Pt catalyst, example 3; (b) power spectrum of the temperature variations presented in (a).

The power spectra are represented in relative units: each component of a power spectrum is divided by the first component of that spectrum. The log-log scale in Figure 1(b), Figure 2(b), and Figure 3(b) makes it apparent that the shape of the continuous band is 1/f^α(f), where α(f) = 1 for the first component and α(f) gradually increases with the frequency.
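
The normalization and the read-off of the local exponent α(f) can be reproduced as follows (a sketch continuing from the bounded series x of the first example; matplotlib and the finite-difference estimate of the exponent are illustrative choices):

```python
import numpy as np
import matplotlib.pyplot as plt

# `x` is the bounded series from the first sketch (an assumption here).
power = np.abs(np.fft.rfft(x - x.mean()))**2
rel = power[1:] / power[1]      # relative units: divide by the first component
k = np.arange(1, len(power))    # component number

# Local exponent alpha(f): minus the local log-log slope.
alpha = -np.gradient(np.log(rel), np.log(k))
print("alpha(f) near the first components:", alpha[:5].round(2))

plt.loglog(k, rel)
plt.xlabel("component (relative units)")
plt.ylabel("power / first component")
plt.show()
```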

The comparison between the above raw time series and their power spectra reveals how inconclusive and deceptive judgement from the time series alone is, and what unexpected behavior the power spectra reveal.

Yet, this is not surprising once the fundamental difference between the Central Limit Theorem (CLT) and the decomposition theorem is taken into account. The difference lies in their subjects: the subject of the Central Limit Theorem is independent (yet unbounded) random variables, while the subject of the decomposition theorem is bounded (yet not independent) irregular variables. Thus, their subjects do not overlap. The Central Limit Theorem serves as grounds for probability theory, which so far is the dominant concept for modeling any behavior exhibiting any form of variability. However, this scenario suffers a common setback: it lacks generality, that is, it requires specific modeling in each and every case, and the result turns out very sensitive to the details of that modeling. On the contrary, the most powerful advantage of the decomposition theorem is its insensitivity to the particularities of the dynamics, provided the system is bounded, that is, the amplitude of variations is confined within specific margins and the rate of exchange of matter/energy/information with the environment is also kept permanently bounded. The most illustrative example of that power comes from application to social networks, where the input information is most probably uncertain: human behavior is rather irrational and, for a variety of reasons, people are reluctant to tell the truth. However, if a network as a whole is stable, it will reveal a steady pattern which can be substantiated by applying the proposed general criterion. Moreover, by taking the inverse Fourier transform of the homeostatic pattern itself, that is, by removing the continuous band from the corresponding power spectrum, one can find the pattern itself, as sketched below. The next section presents the methodology for finding the hierarchy in any such pattern.
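
A minimal sketch of that extraction (Python/NumPy; the median-threshold line detector is the same illustrative heuristic as above, and keeping the complex Fourier coefficients, phases included, is what makes the inversion meaningful):

```python
import numpy as np

def extract_pattern(x, factor=1000.0):
    """Remove the continuous band from the spectrum of a series and
    inverse-transform the surviving discrete lines: the homeostatic
    pattern itself, in the time domain."""
    x = np.asarray(x, dtype=float)
    X = np.fft.rfft(x - x.mean())
    p = np.abs(X)**2
    X[p <= factor * np.median(p)] = 0.0   # drop the continuous band
    return np.fft.irfft(X, n=len(x))      # the pattern itself

# Usage: pattern = extract_pattern(x) for any monitored time series x.
```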

The criterion can be applied to a large variety of systems where the current knowledge is incomplete, uncertain, and/or clumsy or hazardous to obtain. Thus, the criterion is available for archaeological data, meteorological data, medicine (EEG and EKG), urban engineering, financial engineering, DNA sequences, etc. To compare, the method of discrimination between competing models based on probability theory provides no qualitative difference between competing models and says nothing about the stability of any of them. On the contrary, the criterion proposed by the author provides not only a qualitative difference between different types of correlations but reveals their stability as well. One way is to monitor the power spectrum and to see whether the pattern stays intact or some extra line(s) appear. If it stays intact, the corresponding system is stable even in an ever-changing environment; if extra line(s) appear, a change is implied. At this point the major question arises whether these changes yield adaptation or destruction. In Section 3 I will demonstrate that this problem is undecidable by any means of computation, whether by traditional algorithmic computing or by the semantic computing proposed by me in [1].

Thus, the major goal of the present paper is to outline the general theoretical grounds for the existence of that criterion and its limitations.

2. Why Do Homeostatic Patterns Represent Causal Relations? How Is the Hierarchy Revealed in Homeostatic Patterns?

The rigorous considerations of why homeostatic patterns represent steady causal relations are presented in [2]. Here I will present them by means of a comparison with network theory.

Up to now there is no established view on the difference between causal and provisional correlations, but the still predominant view is to associate causality with covariance in all the variety of its forms. The weak point of that idea is the lack of a general criterion allowing autonomous demarcation between causal relations, steady correlations, and provisional ones. Indeed, the same physical mechanisms provide both causality and provisional correlations. Thus, up to now, complex networks, which serve as the major instrument for modeling complex systems, share common features, the major one of which consists in assigning probabilities and weights to both inputs and outputs; the role of the dynamics and the environment is to rearrange those probabilities and/or weights. Yet, a crucial flaw of this scenario is that it does not allow distinguishing causal correlations from provisional ones. Indeed, the probabilistic approach considers causality as a system of binary relations, asymmetric in succession. However, this viewpoint does not allow any discrimination between causal and provisional correlations, since the latter are also supposed binary and there is no general rule about whether a provisional or a causal relation comes next. So, a sequence of provisional and causal correlations exactly matches a random sequence of “0” and “1”, thus justifying the application of the CLT.

However, formal logic, grounded on the idea that there exists a steady state whose characteristics are defined as specific steady quantal relations among certain variables for each and every specific steady environment, is not able to decide whether a given sequence of “0” and “1”, assigned to a sequence of “yes-no” answers, is distinguishable from a random sequence of “0” and “1”: there is no general criterion which guarantees that the answers are always true and, moreover, no general protocol which guarantees that the questions are always posed correctly.

The core of the problem can be traced back to the Central Limit Theorem, where the variables are supposed to be independent random ones. Thus, the very idea of long-range correlations contradicts the Central Limit Theorem. By contrast, boundedness introduces long-range correlations of practically infinite radius, while the correlations consistent with the CLT are only those of bounded radius.
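
The contrast is easy to see numerically (a sketch; the reflected random walk again stands in for a bounded sequence, and the printed values are indicative, not exact):

```python
import numpy as np

def acf(x, lag):
    """Normalized autocorrelation of a series at a given lag."""
    x = x - x.mean()
    return np.dot(x[:len(x) - lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(2)
n = 20000

# Bounded irregular sequence: every excursion is forced to turn back.
walk = np.zeros(n)
for i in range(1, n):
    walk[i] = walk[i - 1] + rng.uniform(-1.0, 1.0)
    if abs(walk[i]) > 30.0:
        walk[i] = np.sign(walk[i]) * 60.0 - walk[i]

iid = rng.uniform(-1.0, 1.0, n)   # CLT territory: independent variables

print("bounded sequence, acf(500):", round(acf(walk, 500), 2))  # stays large
print("independent noise, acf(500):", round(acf(iid, 500), 2))  # ~ 0
```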

It is worth noting that the above setbacks of the CLT do not allow any additive separation of a power spectrum into a specific discrete band and a universal continuous one that is robust to the environmental statistics. Indeed, the unboundedness of the random variables implies a strong sensitivity of the shape of the corresponding power spectrum to the statistics of those variables. Thus, although the provisional correlations are eliminated, no discrimination between causal correlations and individual peculiarities is available. In turn, the notion of an individual and the notion of a kind become blurred. Consequently, pattern recognition (e.g. discrimination between cats and dogs) becomes uncertain, especially around the demarcation line between qualitatively different subjects (cat or dog).

In outline, the additive separation of a power spectrum into a specific discrete band and a universal continuous one, such that both bands are robust to the statistics of variations, is an exclusive property of the concept of boundedness proposed by the author [1].

At this point the power of the discrimination criterion proposed in the present paper becomes evident: it is model-independent, in the sense that there is no need to know, for example, how exactly our bodies work in full detail in order to be certain whether a body is stable or not. Thus, the notion of homeostasis turns out insensitive to all kinds of variations, e.g. daily variations of temperature. Alongside, it turns out that deviations from that homeostasis bounded within specific margins constitute individual peculiarities in a way consistent with the notion of a kind. Moreover, the insensitivity of the shape of the continuous band to the environmental variations (which are associated with the individual peculiarities) implies that all individuals sharing the same homeostatic pattern share the same evolutionary value as well.

A very important matter in the study of complex systems such as societies, ecological systems, and climate is the matter of their hierarchy and its stability. The question is what type of hierarchy is stable and how it is represented in the power spectra. So far, the dominant concept is that of a one-directional hierarchy. The latter serves as grounds for the reductionist approach, which sits at the very core of science: it implies that complexity proceeds from elementary particles to cosmological objects via self-organization. However, the reductionist approach is not able to provide any qualitative difference among the subjects of self-organization: it represents emergent patterns in the same terms (in the example of reaction networks, those are the concentrations of the different sorts of output products compared to the concentrations of the input reagents).

I explicate the qualitative difference between successive hierarchical levels by means of a general mechanism for leveraging the stability of complex systems. Its protocol is grounded on the idea that the environmental impact is distributed over different levels so as to diminish the load on each level, thus increasing the overall stability. Then, since the levels are interconnected by feedbacks (note that the levels are parts of the same system), the hierarchy turns out bidirectional: it goes both bottom-up and top-down. This is in sharp contrast with the reductionist approach, where the hierarchy is unidirectional: it goes only bottom-up. Yet, the question remains: how do different patterns occur on different levels, and what is the condition for their stable consistency?

The matter of self-organization is one of the most challenging problems in modern interdisciplinary science and is still far from resolution. My contribution to the problem consists in showing that the structure of power spectra can reveal the presence of hierarchy and the conditions for its stable consistency, even though different patterns occur on different levels.

In my previous paper [4] I defined a general condition for the stable coexistence of patterns that come from different hierarchical levels. In general, this is again the condition for avoiding resonances. Recall that the notion of a resonance implies that a system is “shaken” up to very large amplitudes by means of a relatively small driving force (not necessarily periodic). So there are two types of resonances: 1) those which keep the system within the current margins of stability (adaptation); 2) those which bring a system beyond the current thresholds of stability (destruction). In order to avoid destruction, the general protocol applies, and it consists in the requirement that the homeostatic patterns of a lower level appear as “satellite” lines superimposed onto each and every line of the higher-level pattern.

Then, stable systems reveal a specific “behavioral” pattern on applying the operation of coarse-graining (that is, partitioning of the corresponding spatio-temporal BIS into cells of almost equal size): the details of the lower-level structure are “smoothed out” and appear rather as a broadening of the higher-level lines, so that the higher-level patterns remain intact when the system is stable, as sketched below.
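
A toy two-level illustration (Python/NumPy; the amplitude-modulated signal and the cell size are illustrative choices): the lower-level pattern produces satellite lines around the higher-level line, and block-averaging over cells longer than the lower-level period removes the satellites while leaving the higher-level line intact.

```python
import numpy as np

n = 2**15
t = np.arange(n)

# Higher-level line at bin 8; a lower-level pattern at bin 1024 rides on it,
# producing satellite lines at bins 1024 +/- 8.
x = (1 + 0.3 * np.sin(2 * np.pi * 1024 * t / n)) * np.sin(2 * np.pi * 8 * t / n)

def coarse_grain(x, cell):
    """Partition the series into cells of equal size and average each cell."""
    m = len(x) // cell
    return x[: m * cell].reshape(m, cell).mean(axis=1)

def strongest_bins(x, k):
    p = np.abs(np.fft.rfft(x - x.mean()))**2
    return sorted(int(b) for b in np.argsort(p)[-k:])

print("fine-grained lines:", strongest_bins(x, 3))   # [8, 1016, 1032]
print("after coarse-graining:", strongest_bins(coarse_grain(x, 64), 1))  # [8]
```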

In outline, stable systems are characterized by power spectra which consist of a specific steady homeostatic pattern and a noise component of universal shape. The homeostatic patterns and the shape of the noise component remain intact on monitoring and/or applying coarse-graining. The latter operation allows the revealing of a stable, self-consistent, bidirectional hierarchy.

Another advantage of the proposed general criterion is that causal relations (homeostatic patterns) are distinguishable from steady correlations (1/f^α(f) noise) regardless of the decoding algorithm. This is in fundamental opposition to traditional algorithmic theory, where the separation of causal, steady, and provisional correlations is impossible to accomplish by any spontaneous natural mechanism in a non-ambiguous way; instead, it is subject to the supervision of the human mind and thus is highly subject to the current beliefs and understandings of the decoding mind. Accordingly, this makes the classification highly sensitive to the ingenuity of the decoder. By contrast, boundedness provides self-sustaining boundedness of the logical and quantal errors [5] in the long run, which is expressed in a non-ambiguous separation of causal, steady, and provisional correlations for each and every BIS, substantiated by means of the general discrimination criterion proposed in the present paper.

Summarizing, a general criterion for discrimination between causal and steady correlations in stable complex systems is proposed. It consists in computing the power spectra of any time series which monitor the behavior of a complex system. The use of power spectra also implies an automatic elimination of the provisional correlations present in the corresponding time series. An exclusive property of that criterion is its robustness to the details of the statistics of variations which appear as a response to an ever-changing environment. Another exclusive property is provided by the accuracy of the separation of the two bands in the power spectra, which is constant in time. In turn, this suggests a new role for a homeostatic pattern, namely that it serves as a bearer of identity for the functionality of the corresponding complex system. This raises the question of how the appearance of new line(s) affects the identity of a complex system, in the sense of whether it implies adaptation or destruction.

3. Ban on Computing a Prediction of Whether New Line(s) Yield Adaptation or Destruction: The Role of the Human Mind

In the Introduction it was mentioned that the decomposition theorem gives rise to three types of components in the power spectrum of a raw time series representing the behavior of a stable complex system. The first two components are a specific discrete pattern and a continuous component whose shape is universal. An exclusive property of the decomposition theorem is that both the pattern and the shape of the continuous band are robust to the length of the time series and/or the statistics of the environmental variations. This gives the criterion its enormous power to discriminate the causal relations, encapsulated in the corresponding discrete pattern, from the steady correlations, encapsulated in the continuous band, which commence from the unique individual response consistent with the “survival of the kind”. The discussion of the role of the 3rd component has been postponed to the present section, since its computability plays a decisive role in the entire issue of the computability of any prediction of whether new line(s) yield destruction or adaptation. The 3rd component comes as a result of a highly non-trivial interplay between the two other bands: it results from the confinement of the variations not to exceed specific margins called thresholds of stability. Thus, the 3rd component appears as a result of that confinement, and it is displayed at a bounded distance to the left of all other parts of the power spectrum. And here comes a conundrum: from the point of view of stability, the condition for avoiding resonances requires that line to be irrational. However, another general aspect of the theory of boundedness claims that the matter/energy/information exchanged with the environment by natural processes is specific yet bounded. An immediate consequence of this claim is that the only computable numbers are those which are represented as bounded sequences of digits, so that the precision is dictated by the boundedness. Another group of computable numbers comprises the roots of unity, i.e. those whose frequency is a rational fraction such as 3/7. The roots of unity are of use for the representation of Ramanujan sums, which lie at the grounds of the Ramanujan-Fourier transform; the latter implements a spontaneous mechanism for processing the Fourier transform by means of the feedback which provides energy dissipation, put forward in Chapter 3 of [1]. Their computability is explicitly provided by the Euclideanity of the functional metrics. These types of numbers are well known under the name of incommensurate numbers. However, if the frequency of the 3rd component is incommensurate, it yields an inevitable resonance, which in turn yields either destruction or adaptation. The same problem arises when a new line appears: it is also incommensurate, and thus one faces the same conundrum.
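
For the reader's convenience, here is a minimal sketch of the Ramanujan sums themselves (Python; this is the direct definition over roots of unity, not the dissipative mechanism of [1]):

```python
from math import gcd, cos, pi

def ramanujan_sum(q, n):
    """c_q(n): the sum of exp(2*pi*i*a*n/q) over 1 <= a <= q with
    gcd(a, q) = 1. The imaginary parts cancel and the result is
    always an integer, hence the rounding."""
    return round(sum(cos(2 * pi * a * n / q)
                     for a in range(1, q + 1) if gcd(a, q) == 1))

# The first rows of the table c_q(n), n = 1..8:
for q in range(1, 5):
    print(q, [ramanujan_sum(q, n) for n in range(1, 9)])
# 1 [1, 1, 1, 1, 1, 1, 1, 1]
# 2 [-1, 1, -1, 1, -1, 1, -1, 1]
# 3 [-1, -1, 2, -1, -1, 2, -1, -1]
# 4 [0, -2, 0, 2, 0, -2, 0, 2]
```

In the Ramanujan-Fourier transform a signal s(n) is expanded over these sums, s(n) = Σ_q s_q c_q(n), so every frequency involved is a rational fraction, i.e. computable in the above sense.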

It is worth noting that my previous result on the self-sustained boundedness of the logical and quantal errors [5] is of no help here, since it is derived for systems stable over an arbitrarily long run. Alongside, it is grounded on the fact that the metrics of the functionality of a stable complex system are self-sustained to be Euclidean. Note that the latter property provides the means of computability, since intervals obtained at different spatio-temporal locations are the same; in turn, this provides grounds for comparing whether different numbers are equal or not. However, the appearance of a new line implies a violation of the Euclideanity, for the reasons provided in [2].

Another strategy is the use of traditional algorithmic computing. However, as will now be demonstrated, it also does not help resolve the conundrum. Indeed, a great advantage of modern computers is that, by means of hardware engineering, the precision can be made arbitrarily large. This comes at the price of enlarging the computation time, but by means of clever engineering of the software an appropriate balance can be achieved. However, this does not solve the major problem posed above, since the ingenuity in mastering the match between software and hardware holds only for stable solutions. The question of whether the truncation error of a Taylor series is insensitive to the round-off error can be answered affirmatively for stable solutions only: around unstable solutions, each and every term counts, in the long run and on repetition. In turn, the consistency between logical and quantal error is violated, and as a result the logical error becomes ill-defined (it can vary from minus infinity to plus infinity) for each and every quantal error, no matter how small!

A notorious example of such behavior is the computation of periodic solutions which are neutral with respect to stability. Indeed, the computation of limit cycles as solutions of differential equations is inevitably bound to degenerate into motion on a spiral (ingoing or outgoing, depending on the current realization of the computation), which produces a qualitatively different result in the long run: instead of a bounded cyclic motion, the computed solution approaches either a steady point or infinity. The inevitability of this behavior lies in the fact that each and every term in the corresponding Taylor series contributes equally in the long run, regardless of its current value. This, however, just confirms the conundrum, since the number of significant terms turns out to be infinite while the precision always comprises only a bounded number of digits.
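
The textbook instance of that degeneration can be reproduced in a few lines (a sketch; explicit Euler is chosen as the simplest scheme, and any fixed-precision, fixed-step scheme exhibits a drift of one sign or the other):

```python
# Neutral periodic orbits: x' = y, y' = -x moves on circles x^2 + y^2 = const.
# Explicit Euler multiplies the squared radius by (1 + h^2) every step, so the
# computed "cycle" is an outgoing spiral; implicit Euler spirals inward.
h, steps = 0.01, 100_000
x, y = 1.0, 0.0
for _ in range(steps):
    x, y = x + h * y, y - h * x
print("radius after", steps, "steps:", (x * x + y * y) ** 0.5)
# ~ (1 + h^2)^(steps/2) = e^(steps * h^2 / 2) ≈ e^5 ≈ 148, instead of 1.
```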

In outline, the conclusion is that modern-day computing is unable to decide with certainty whether a solution subject to a resonance yields adaptation or destruction.

It is worth noting that the self-sustaining of the Euclideanity of the functional metrics renders negligible the error between semantic computing (grounded on the use of the Ramanujan-Fourier transform) and the traditional Fourier transform (grounded on expansion in truncated Taylor series).

Yet, in practice a lot can be done: by means of monitoring and/or appropriate intervention after skillful complementary investigations, a resonance can be postponed or even avoided. Yet, our intervention can be a double-edged sword: we could postpone a resonance at a given hierarchical level, but this could invoke an earlier resonance at other hierarchical levels. Thus everything is in the hands and abilities of the human mind to decide about our own future.

An immediate consequence of the fact that human imagination is able to distinguish between incommensurate and irrational numbers is the suggestion that the question of what intelligence is remains far from resolution.

4. Conclusions

A general criterion for the autonomous discrimination of causal from provisional correlations is established. It is grounded on a completely novel general theorem, proven by the author and called by her the decomposition theorem, which is fundamentally different from the Central Limit Theorem. It also allows autonomous discrimination of the correlations commencing from the individual peculiarities of the current individual, provided the latter belongs to the same kind. The criterion holds in an unspecified environment and is conclusive for stable complex systems in all the variety of their origins: meteorological, ecological, social networks, archaeological, etc. Its power extends over systems where the information about the targeted behavior is uncertain, missing, and/or hazardous to obtain.

Yet, the power of the criterion is limited to stable systems only, because of the general impossibility of computing with certainty whether given new line(s) yield destruction or adaptation. This limitation in turn confirms once again the fundamental difference between human intelligence and all attempts to construct an artificial one. Yet, the considerations presented in the present paper once again confirm the decisive role of the human mind and its ingenuity in solving problems ranging from diseases to climate change and ultimately to our knowledge about the Universe. And all this comes out of the simple question of what a stable system is and how it fits an unspecified environment.

Conflicts of Interest

The author declares no conflicts of interest regarding the publication of this paper.

References

[1] Koleva, M.K. (2012) Boundedness and Self-Organized Semantics: Theory and Applications. IGI-Global, Hershey, PA.
https://doi.org/10.4018/978-1-4666-2202-9
[2] Koleva, M.K. (2019) Journal of Modern Physics, 10, 43-58.
https://doi.org/10.4236/jmp.2019.101005
[3] Koleva, M.K., Elyias, A.E. and Petrov, L.A. (2000) Fractal Power Spectrum at Catalytic Oxidation of HCOOH over Supported Pd Catalyst. In: Russo, N. and Salahub, D.R., Eds., Metal-Ligand Interactions in Chemistry, Physics and Biology, NATO ASI Series C Vol. 546, Kluwer Academic Publishers, Dordrecht, 353-369.
https://doi.org/10.1007/978-94-011-4245-8_15
[4] Koleva, M.K. (2018) Journal of Modern Physics, 9, 335-348.
https://doi.org/10.4236/jmp.2018.93024
[5] Koleva, M.K. (2020) Journal of Modern Physics, 11, 157-167.
