Papers in Physics, vol. 7, art. 070006 (2015)
Received: 20 November 2014, Accepted: 1 April 2015
Edited by: C. A. Condat, G. J. Sibona
Licence: Creative Commons Attribution 3.0
DOI: http://dx.doi.org/10.4279/PIP.070006
www.papersinphysics.org
ISSN 1852-4249

Noise versus chaos in a causal Fisher-Shannon plane

Osvaldo A. Rosso,1,2,∗ Felipe Olivares,3 Angelo Plastino4

We revisit the Fisher-Shannon representation plane H×F, evaluated using the Bandt and Pompe recipe for assigning a probability distribution to a time series. Several stochastic dynamics (noises with f^{-k}, k ≥ 0, power spectrum) and chaotic processes (27 chaotic maps) are analyzed so as to illustrate the approach. Our main achievement is uncovering the informational content of the planar location.

∗ Email: oarosso@gmail.com
1 Instituto Tecnológico de Buenos Aires, Av. Eduardo Madero 399, C1106ACD Ciudad Autónoma de Buenos Aires, Argentina.
2 Instituto de Física, Universidade Federal de Alagoas, Maceió, Alagoas, Brazil.
3 Departamento de Física, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, La Plata, Argentina.
4 Instituto de Física, IFLP-CCT, Universidad Nacional de La Plata, La Plata, Argentina.

I. Introduction

Temporal sequences of measurements (or observations), that is, time series (TS), are the basic elements for investigating natural phenomena, and from them one should judiciously extract information on the underlying dynamical system. TS arising from chaotic systems share with those generated by stochastic processes several properties that make them look very similar: (1) a wide-band power spectrum (PS), (2) a delta-like autocorrelation function, (3) irregular behavior of the measured signals, etc. Irregular and apparently unpredictable behavior is often observed in natural TS, which makes it interesting to establish whether the underlying dynamical process is deterministic or stochastic, in order to (i) model the associated phenomenon and (ii) determine which quantifiers are relevant.

Chaotic systems display "sensitivity to initial conditions" and lead to non-periodic motion (chaotic time series). Long-term unpredictability arises despite the deterministic character of the trajectories: two neighboring points in phase space move apart exponentially fast. Let x_1(t) and x_2(t) be two such points, located within a ball of radius R at time t. Further, assume that these two points cannot be resolved within the ball owing to poor instrumental resolution. At some later time t', the distance between the points will typically have grown to

  |x_1(t') - x_2(t')| \approx |x_1(t) - x_2(t)| \exp(\lambda\,|t' - t|),

with λ > 0 for chaotic dynamics, λ being the largest Lyapunov exponent. When this distance exceeds R at time t', the two points become experimentally distinguishable. This implies that the instability reveals some information about the phase-space population that was not available at earlier times [1]. One can then think of chaos as an information source, whose rate of information generation can be cast in precise fashion via the Kolmogorov-Sinai entropy [2, 3].
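As a purely illustrative aside (not part of the original paper), the exponential separation governed by λ can be checked numerically on the fully chaotic logistic map x_{n+1} = 4 x_n (1 - x_n), whose largest Lyapunov exponent is known to be ln 2 ≈ 0.693. A minimal Python sketch, assuming only numpy:

import numpy as np

def logistic_series(x0, n):
    """Iterate the logistic map x -> 4 x (1 - x) for n steps."""
    out = np.empty(n)
    x = x0
    for i in range(n):
        x = 4.0 * x * (1.0 - x)
        out[i] = x
    return out

# Two trajectories that start experimentally indistinguishable (separation 1e-10)
a = logistic_series(0.2, 60)
b = logistic_series(0.2 + 1e-10, 60)
d = np.abs(a - b)

# While the separation is still small it grows roughly as exp(lambda * n);
# the slope of log(d) versus n therefore approximates lambda (about ln 2 here).
n_lin = np.argmax(d > 1e-3)                      # end of the small-separation regime
slope = np.polyfit(np.arange(n_lin), np.log(d[:n_lin]), 1)[0]
print(f"estimated largest Lyapunov exponent: {slope:.2f}")

Once the separation becomes of the order of the attractor size, the growth saturates and the two series are, for all practical purposes, unrelated.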
One question often emerges: is the system chaotic (low-dimensional deterministic) or stochastic? If one is able to show that the system is dominated by low-dimensional deterministic chaos, then only a few (nonlinear and collective) modes are required to describe the pertinent dynamics [4]. If not, then the complex behavior can be modeled as a system dominated by a very large number of excited modes, which is in general better described by stochastic or statistical approaches.

Several methodologies for the evaluation of Lyapunov exponents and Kolmogorov-Sinai entropies from time series have been proposed (see Ref. [5]), but their applicability involves constraints (stationarity, time-series length, choice of parameter values for the methodology, etc.) which in general render the ensuing results inconclusive. Thus, one wishes for new tools able to distinguish chaos (determinism) from noise (stochasticity), and this leads to our present interest in quantifiers based on Information Theory, for instance, "entropy", "statistical complexity", "Fisher information", etc.

These quantifiers can be used to detect determinism in time series [6-11]. Different Information Theory based measures (normalized Shannon entropy, statistical complexity, Fisher information) allow for a better distinction between deterministic chaotic and stochastic dynamics whenever "causal" information is incorporated via the Bandt and Pompe (BP) methodology [12]. For a review of the BP methodology and its applications to physics, biomedical and econophysics signals, see [13].

Here we revisit, for the purposes detailed above, the so-called causality Fisher-Shannon entropy plane, H×F [14], which allows one to quantify the global versus local characteristics of the time series generated by the dynamical process under study. The two functionals H and F are evaluated using the Bandt and Pompe permutation approach. Several stochastic dynamics (noises with f^{-k}, k ≥ 0, power spectrum) and chaotic processes (27 chaotic maps) are analyzed so as to illustrate the methodology. We will find that significant information is provided by the planar location.

II. Shannon entropy and Fisher information measure

Given a continuous probability distribution function (PDF) f(x) with x ∈ ∆ ⊂ R and ∫_∆ f(x) dx = 1, its associated Shannon entropy S [15] is

  S[f] = - \int_\Delta f \ln(f)\, dx ,    (1)

a measure of "global character" that is not too sensitive to strong changes in the distribution taking place in a small-sized region. Such is not the case with Fisher's information measure (FIM) F [16, 17], which constitutes a measure of the gradient content of the distribution f(x) and is thus quite sensitive even to tiny localized perturbations. It reads

  F[f] = \int_\Delta \frac{1}{f(x)} \left[ \frac{df(x)}{dx} \right]^2 dx = 4 \int_\Delta \left[ \frac{d\psi(x)}{dx} \right]^2 dx .    (2)

FIM can be variously interpreted as a measure of the ability to estimate a parameter, as the amount of information that can be extracted from a set of measurements, and also as a measure of the state of disorder of a system or phenomenon [17]. In the definition of FIM (Eq. (2)), the division by f(x) is inconvenient if f(x) → 0 at certain x-values. We avoid this by working with real probability amplitudes f(x) = ψ²(x) [16, 17], which yields a simpler form (no divisors) and shows that F simply measures the gradient content of ψ(x). The gradient operator strongly enhances the contribution of minute local f-variations to FIM's value. Accordingly, this quantifier is called a "local" one [17].
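A standard textbook example (added here for illustration, not taken from the paper) makes the global/local contrast explicit. For a Gaussian density of variance σ², f(x) = (2πσ²)^{-1/2} exp[-(x-μ)²/(2σ²)], Eqs. (1) and (2) give

  S[f] = \tfrac{1}{2} \ln\!\left( 2\pi e\, \sigma^{2} \right), \qquad F[f] = \frac{1}{\sigma^{2}} .

A narrow distribution (small σ) thus has low Shannon entropy but large Fisher information, because its steep local gradients dominate F, whereas broadening the distribution raises S while F decays: S responds to the overall spread, F to the local structure.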
Let now P = {p_i; i = 1, ..., N} be a discrete probability distribution, with N the number of possible states of the system under study. The concomitant problem of information loss due to discretization has been thoroughly studied; in particular, discretization entails the loss of FIM's shift-invariance, which is of no importance for our present purposes [10, 11]. In the discrete case, we define a "normalized" Shannon entropy as

  H[P] = \frac{S[P]}{S_{max}} = \frac{1}{S_{max}} \left\{ - \sum_{i=1}^{N} p_i \ln(p_i) \right\} ,    (3)

where the denominator S_{max} = S[P_e] = \ln N is the value attained by the uniform probability distribution P_e = {p_i = 1/N, ∀ i = 1, ..., N}. For the FIM, we take the expression in terms of real probability amplitudes as a starting point; a discrete normalized FIM convenient for our present purposes is then given by

  F[P] = F_0 \sum_{i=1}^{N-1} \left[ \sqrt{p_{i+1}} - \sqrt{p_i} \right]^2 .    (4)

It has been extensively discussed that this discretization is the best behaved in a discrete environment [18]. Here, the normalization constant F_0 reads

  F_0 = \begin{cases} 1 & \text{if } p_{i^*} = 1 \text{ for } i^* = 1 \text{ or } i^* = N, \text{ and } p_i = 0 \ \forall i \neq i^*, \\ 1/2 & \text{otherwise.} \end{cases}    (5)

If our system lies in a very ordered state, which occurs when almost all the p_i-values are zero, we have a normalized Shannon entropy H ∼ 0 and a normalized Fisher information measure F ∼ 1. On the other hand, when the system is in a very disordered state, that is, when all the p_i-values oscillate around the same value, we obtain H ∼ 1 while F ∼ 0. One can state that the general behavior of the present discrete FIM (Eq. (4)) is opposite to that of the Shannon entropy, except for periodic motions [10, 11]. The local sensitivity of FIM for discrete PDFs is reflected in the fact that the specific "i-ordering" of the discrete values p_i must be taken into account when evaluating the sum in Eq. (4); this point was extensively discussed in our previous works [10, 11]. The summands can be regarded as a kind of "distance" between two contiguous probabilities, so a different ordering of the summands would in general lead to a different FIM value, whence its local nature. In the present work, we follow the lexicographic order described by Lehmer [22] in the generation of the Bandt-Pompe PDF.

III. Description of our chaotic and stochastic systems

Here we study both chaotic and stochastic systems, selected as illustrative examples of different classes of signals, namely (a) 27 chaotic dynamic maps [9, 19] and (b) truly stochastic processes, noises with f^{-k} power spectrum [9].

i. Chaotic maps

In the present work we consider the 27 chaotic maps described by J. C. Sprott in the appendix of his book [19]. These chaotic maps are grouped as:

a) Noninvertible maps: (1) Logistic map; (2) Sine map; (3) Tent map; (4) Linear congruential generator; (5) Cubic map; (6) Ricker's population model; (7) Gauss map; (8) Cusp map; (9) Pinchers map; (10) Spence map; (11) Sine-circle map;

b) Dissipative maps: (12) Hénon map; (13) Lozi map; (14) Delayed logistic map; (15) Tinkerbell map; (16) Burgers' map; (17) Holmes cubic map; (18) Dissipative standard map; (19) Ikeda map; (20) Sinai map; (21) Discrete predator-prey map;

c) Conservative maps: (22) Chirikov standard map; (23) Hénon area-preserving quadratic map; (24) Arnold's cat map; (25) Gingerbreadman map; (26) Chaotic web map; (27) Lorenz three-dimensional chaotic map.

Even though the present list of chaotic maps is not exhaustive, it can be taken as representative of common chaotic systems [19].
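As an illustration of how the quantifiers of Sec. II are obtained from one of the maps listed above, the following Python sketch (an addition of this edit, not the authors' code) builds an ordinal-pattern (Bandt-Pompe) PDF from a scalar time series and evaluates H[P] and F[P] according to Eqs. (3)-(5). It assumes numpy; the pattern ordering used here, plain lexicographic sorting of the permutation tuples, is only a stand-in for the Lehmer enumeration [22] followed in the paper.

import numpy as np
from itertools import permutations

def bandt_pompe_pdf(x, D=6, tau=1):
    """Ordinal-pattern (Bandt-Pompe) probabilities of the 1-D series x."""
    counts = {p: 0 for p in permutations(range(D))}   # all D! patterns, zero counts kept
    n = len(x) - (D - 1) * tau
    for i in range(n):
        window = x[i : i + (D - 1) * tau + 1 : tau]
        pattern = tuple(int(j) for j in np.argsort(window, kind="stable"))
        counts[pattern] += 1
    # probability vector in lexicographic order of the patterns
    return np.array([counts[p] for p in sorted(counts)]) / n

def shannon_fisher(P):
    """Normalized Shannon entropy, Eq. (3), and discrete Fisher measure, Eqs. (4)-(5)."""
    N = len(P)
    nz = P[P > 0]
    H = -np.sum(nz * np.log(nz)) / np.log(N)
    F0 = 1.0 if np.count_nonzero(P) == 1 and (P[0] == 1 or P[-1] == 1) else 0.5
    F = F0 * np.sum((np.sqrt(P[1:]) - np.sqrt(P[:-1])) ** 2)
    return H, F

# Example: x-coordinate of the Hénon map (map 12), standard parameters a = 1.4, b = 0.3
n = 100_000
x = np.empty(n); y = np.empty(n)
x[0], y[0] = 0.0, 0.0
for i in range(1, n):
    x[i] = 1.0 - 1.4 * x[i - 1] ** 2 + y[i - 1]
    y[i] = 0.3 * x[i - 1]
H, F = shannon_fisher(bandt_pompe_pdf(x[1_000:], D=6, tau=1))   # transient discarded
print(H, F)

For D = 6 there are 6! = 720 possible patterns, so the series must be much longer than 720 for the pattern statistics to be reliable; the paper uses N = 10^7 points per map (Sec. IV).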
ii. Noises with f^{-k} power spectrum

The corresponding time series are generated as follows [20]: (1) using the Mersenne twister generator [21] through the Matlab RAND function, we generate pseudo-random numbers y_i^{(0)} in the interval (-0.5, 0.5), with (a) an almost flat power spectrum (PS), (b) a uniform PDF, and (c) zero mean value; (2) the Fast Fourier Transform (FFT) y_i^{(1)} of the sequence is then obtained and multiplied by f^{-k/2}, yielding y_i^{(2)}; (3) y_i^{(2)} is then symmetrized so as to obtain a real function, and the pertinent inverse FFT is taken, after discarding the small imaginary components produced by the numerical approximations. The resulting time series η^{(k)} exhibits the desired power spectrum and, by construction, is representative of non-Gaussian noises.
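A minimal Python sketch of this recipe follows (again an illustrative addition; the paper relies on the Matlab program noisefk.m [20] with uniform deviates from the Mersenne twister [21], for which numpy's default generator stands in here). Using the real-input FFT pair rfft/irfft makes the symmetrization of step (3) implicit, so the output is real by construction:

import numpy as np

def fk_noise(n, k, rng=None):
    """Time series with an approximately f^{-k} power spectrum."""
    rng = rng or np.random.default_rng()
    y0 = rng.uniform(-0.5, 0.5, n)   # step 1: flat spectrum, uniform PDF, (near) zero mean
    Y = np.fft.rfft(y0)              # step 2: FFT of the white sequence ...
    f = np.fft.rfftfreq(n)           # ... over the nonnegative frequencies
    f[0] = 1.0                       # guard against division by zero at f = 0
    Y *= f ** (-k / 2.0)             # shape the amplitudes so that |Y(f)|^2 ~ f^{-k}
    Y[0] = 0.0                       # remove the zero-frequency (mean) component
    return np.fft.irfft(Y, n)        # step 3: back to a real time series

eta = fk_noise(10**6, k=2.0)         # e.g., one realization of a k = 2 noise

Averaging the H and F values over several realizations with different seeds, as done in Sec. IV with ten seeds per k, smooths out the seed-to-seed fluctuations.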
IV. Results and discussion

For all the chaotic maps we took (see Sec. III.i) the initial conditions and parameter values detailed by Sprott [19]. The initial values lie in the basin of attraction, or near the attractor, for the dissipative systems, and in the chaotic sea for the conservative systems [19]. For each map's TS, we discarded the first 10^5 iterations as transients and, after that, N = 10^7 data points were generated.

Figure 1: Localization in the causality Fisher-Shannon plane (normalized Shannon entropy H on the abscissa, Fisher information measure F on the ordinate) of the 27 chaotic maps considered in the present work. The Bandt-Pompe PDF was evaluated following the lexicographic order [22], with pattern length D = 6, time lag τ = 1 and time-series length N = 10^7 data points (initial conditions given by Sprott [19]). The numbers label the chaotic maps as enumerated at the beginning of Sec. III.i; the suffixes "x" and "y" identify the coordinate of the time series for those maps whose coordinates have clearly distinguishable planar locations. The open-circle dashed line represents the planar localization (average values over ten realizations with different seeds) of the stochastic processes, i.e., the noises with f^{-k} power spectrum.

Stochastic dynamics represented by time series of noises with f^{-k} power spectrum (0 ≤ k ≤ 3.5, in steps ∆k = 0.25) were also considered. For each value of k, ten series with different seeds and total length N = 10^6 data points were generated (see Sec. III.ii), and the corresponding average values are reported, both for uncorrelated (k = 0) and for correlated (k > 0) noises. The BP-PDF was evaluated for each TS of N data points, stochastic and chaotic, following the lexicographic pattern ordering proposed by Lehmer [22], with pattern length D = 6 and time lag τ = 1. The corresponding localizations in the causality Fisher-Shannon plane are shown in Fig. 1. One can use any of these TS for evaluating the dynamical system's invariants (correlation dimension, Lyapunov exponents, etc.) by appealing to a time-lag reconstruction [19]. Here we analyzed the TS generated by each of a map's coordinates whenever the corresponding map is bi- or multi-dimensional. Since the BP-PDF is not a dynamical invariant (nor are the other quantifiers derived from Information Theory), some variation in the quantifiers' values is to be expected, depending on which of the TS generated by the coordinates of a multidimensional system is used.

From Fig. 1 we clearly see that the chaotic maps under study are localized mainly in the entropic region between 0.35 and 0.9, reaching FIM values from 0.4 to almost 1. A second group of chaotic maps, constituted by the Gauss map (7), the linear congruential generator (4), the dissipative standard map (18), the Sinai map (20) and Arnold's cat map (24), is localized near the lower-right corner of the H×F plane, that is, in the range 0.95 ≤ H ≤ 1.0 and 0 ≤ F ≤ 0.3. Their localization can be understood by noting that their 2D graphical representations (i.e., plots of X_n versus X_{n+1} for one-dimensional maps, or of X_n versus Y_n for two-dimensional maps) tend to fill the available space, resembling the behavior of stochastic dynamics. They are nevertheless chaotic and display a clear structure when the dynamics are represented in a higher-dimensional embedding.

Noises with f^{-k} power spectrum (0 ≤ k ≤ 3.5) exhibit a wide range of entropic values (0.1 ≤ H ≤ 1), with FIM values lying between 0 and 0.5. A smooth transition of the planar location is observed in the passage from uncorrelated noise (k = 0, with H ∼ 1 and F ∼ 0) to correlated noise (k > 0), the degree of correlation growing with k. From Fig. 1 we gather that, for stochastic time series with increasing degree of correlation, the associated entropic values H decrease while the Fisher values F increase. Taking into account that other stochastic processes, such as fBm and fGn (not shown), behave quite similarly to the k-noises analyzed here (see Ref. [11]), we may regard the open-circle dashed line as dividing the plane: all the chaotic maps are localized above this line. It is also interesting to note that qualitatively the same results are obtained when the evaluations are made with pattern lengths D = 4 and D = 5, as well as when different discretizations of the Fisher information measure are used.

Summing up, we have presented an extensive series of numerical simulations and have contrasted the characterizations of deterministic chaotic and noisy-stochastic dynamics, as represented by time series of finite length. Surprisingly enough, one just has to look at the planar locations of the two dynamical regimes: the planar location is able to tell us whether we are dealing with a chaotic or a stochastic time series.

Acknowledgements - O. A. Rosso and A. Plastino were supported by Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Argentina. O. A. Rosso acknowledges support as a FAPEAL fellow, Brazil. F. Olivares is supported by Departamento de Física, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, Argentina.

[1] H D I Abarbanel, Analysis of observed chaotic data, Springer-Verlag, New York (1996).

[2] A N Kolmogorov, A new metric invariant for transitive dynamical systems and automorphisms in Lebesgue spaces, Dokl. Akad. Nauk. (USSR) 119, 861 (1959).

[3] Y G Sinai, On the concept of entropy for a dynamical system, Dokl. Akad. Nauk. (USSR) 124, 768 (1959).

[4] A R Osborne, A Provenzale, Finite correlation dimension for stochastic systems with power-law spectra, Physica D 35, 357 (1989).
[5] H Kantz, T Schreiber, Nonlinear time series analysis, Cambridge University Press, Cambridge, UK (2002).

[6] O A Rosso, H A Larrondo, M T Martín, A Plastino, M A Fuentes, Distinguishing noise from chaos, Phys. Rev. Lett. 99, 154102 (2007).

[7] O A Rosso, L C Carpi, P M Saco, M Gómez Ravetti, A Plastino, H A Larrondo, Causality and the entropy-complexity plane: Robustness and missing ordinal patterns, Physica A 391, 42 (2012).

[8] O A Rosso, L C Carpi, P M Saco, M Gómez Ravetti, H A Larrondo, A Plastino, The Amigó paradigm of forbidden/missing patterns: A detailed analysis, Eur. Phys. J. B 85, 419 (2012).

[9] O A Rosso, F Olivares, L Zunino, L De Micco, A L L Aquino, A Plastino, H A Larrondo, Characterization of chaotic maps using the permutation Bandt-Pompe probability distribution, Eur. Phys. J. B 86, 116 (2013).

[10] F Olivares, A Plastino, O A Rosso, Ambiguities in Bandt-Pompe's methodology for local entropic quantifiers, Physica A 391, 2518 (2012).

[11] F Olivares, A Plastino, O A Rosso, Contrasting chaos with noise via local versus global information quantifiers, Phys. Lett. A 376, 1577 (2012).

[12] C Bandt, B Pompe, Permutation entropy: A natural complexity measure for time series, Phys. Rev. Lett. 88, 174102 (2002).

[13] M Zanin, L Zunino, O A Rosso, D Papo, Permutation entropy and its main biomedical and econophysics applications: A review, Entropy 14, 1553 (2012).

[14] C Vignat, J F Bercher, Analysis of signals in the Fisher-Shannon information plane, Phys. Lett. A 312, 27 (2003).

[15] C Shannon, W Weaver, The mathematical theory of communication, University of Illinois Press, Champaign, USA (1949).

[16] R A Fisher, On the mathematical foundations of theoretical statistics, Philos. Trans. R. Soc. Lond. Ser. A 222, 309 (1922).

[17] B R Frieden, Science from Fisher information: A unification, Cambridge University Press, Cambridge, UK (2004).

[18] P Sánchez-Moreno, R J Yáñez, J S Dehesa, Discrete densities and Fisher information, In: Proceedings of the 14th International Conference on Difference Equations and Applications, Eds. M Bohner et al., p. 291, Uğur-Bahçeşehir University Publishing Company, Istanbul, Turkey (2009).

[19] J C Sprott, Chaos and time series analysis, Oxford University Press, New York, USA (2003).

[20] H A Larrondo, Matlab program: noisefk.m, http://www.mathworks.com/matlabcentral/fileexchange/35381 (2012).

[21] M Matsumoto, T Nishimura, Mersenne twister: A 623-dimensionally equidistributed uniform pseudo-random number generator, ACM T. Model. Comput. S. 8, 3 (1998).

[22] http://www.keithschwarz.com/interesting/code/factoradic-permutation/FactoradicPermutation.hh.html