INTERNATIONAL JOURNAL OF COMPUTERS COMMUNICATIONS & CONTROL Online ISSN 1841-9844, ISSN-L 1841-9836, Volume: 15, Issue: 6, Month: December, Year: 2020 Article Number: 3983, https://doi.org/10.15837/ijccc.2020.6.3983 CCC Publications

Information Volume of Mass Function

Yong Deng*

1. Institute of Fundamental and Frontier Science, University of Electronic Science and Technology of China, Chengdu, 610054, China
2. School of Education, Shaanxi Normal University, Xi'an, 710062, China
3. Department of Biostatistics, Medical Center, Vanderbilt University, Nashville, TN 37235, USA
*Corresponding author: dengentropy@uestc.edu.cn, prof.deng@hotmail.com

Abstract

Given a probability distribution, its corresponding information volume is Shannon entropy. However, how to determine the information volume of a given mass function is still an open issue. Based on Deng entropy, the information volume of a mass function is presented in this paper. Given a mass function, the corresponding information volume is larger than its uncertainty measured by Deng entropy. In addition, when the cardinality of the frame of discernment is identical, the total uncertainty case and the BPA distribution of the maximum Deng entropy have the same information volume. Some numerical examples are given to show the efficiency of the proposed information volume of a mass function.

Keywords: information volume, mass function, Shannon entropy, Deng entropy.

1 Introduction

In the past decades, plenty of theories have been developed for expressing and dealing with uncertainty in uncertain environments, for instance, the extended probability theory [26], probabilistic linguistics [11, 28], fuzzy linguistics [21, 22], fuzzy logic [10], fuzzy set theory [67], intuitionistic fuzzy sets [20, 36, 48], soft sets [1, 12], Dempster-Shafer evidence theory [6, 46], rough sets [15, 44], Z numbers [23, 25, 35, 59] and D numbers [7, 30, 33].
Because these theories can handle uncertainty well in many kinds of situations, they have been widely applied in various fields, including decision making [13, 19, 40, 69], reliability analysis [27, 66], medical diagnosis [2, 14], multi-source information fusion [29, 55], causal analysis [32], vehicle systems [5], and fault detection [45].

The entropy function is very important in uncertainty modelling [9]. Since entropy was first derived in thermodynamics, different kinds of entropy have been proposed, such as Shannon entropy [47], Tsallis entropy [50], nonadditive entropy [51], interval-valued entropies [61], and fuzzy entropy [3, 4]. Recently, a new entropy, called Deng entropy [8], was presented for measuring the uncertainty in evidence theory. Deng entropy is a generalization of Shannon entropy. Compared with traditional methods, Deng entropy is more reasonable, and it takes both discord and non-specificity into account [49].

Given a probability distribution, its corresponding information volume can be measured by Shannon entropy. However, how to determine the information volume of a mass function in evidence theory is still an open issue. In this paper, an information volume of a mass function based on Deng entropy is presented. The information volume of a mass function is constructed based on the BPA distribution of the maximum Deng entropy. If the mass function degenerates into a probability distribution, the proposed information volume is the same as Shannon entropy. In addition, when the cardinality of the frame of discernment is constant, the total uncertainty case and the BPA distribution of the maximum Deng entropy have the same information volume.

The rest of this paper is organized as follows. In Section 2, some preliminaries are briefly reviewed. In Section 3, based on Deng entropy, the information volume of a mass function is proposed. In Section 4, numerical examples are expounded to illustrate the proposed definition.
In Section 5, a brief conclusion is given.

2 Preliminaries

Several preliminaries are briefly introduced in this section, including Dempster-Shafer evidence theory, the mass function, Shannon entropy, Deng entropy and the maximum Deng entropy.

2.1 Dempster-Shafer evidence theory

Uncertainty modelling is still an open issue [9, 52]. Dempster-Shafer evidence theory [6, 46] can be used to deal with uncertainty. Besides, evidence theory satisfies weaker conditions than probability theory, which provides it with the ability to express uncertain information directly [39]. Therefore, evidence theory has been well studied, including evidence reasoning [38, 64, 65, 68], belief rules [57, 58], complex mass functions [53, 54], and generalized Dempster–Shafer structures [60, 62], and it has been applied in many areas, such as risk analysis [42, 43], classification [37], data fusion [56], and heuristic representation learning [16]. Some basic concepts of evidence theory are given as follows:

Definition 2.1: Frame of discernment and its power set
Let Θ, called the frame of discernment, denote an exhaustive nonempty set of hypotheses, where the elements are mutually exclusive. Let the set Θ have N elements, which can be expressed as:

Θ = {θ1, θ2, θ3, · · · , θN} (1)

The power set of Θ, denoted as 2^Θ, contains all possible subsets of Θ and has 2^N elements; 2^Θ is represented by

2^Θ = {A1, A2, A3, · · · , A_{2^N}} = { ∅, {θ1}, {θ2}, · · · , {θN}, {θ1, θ2}, {θ1, θ3}, · · · , {θ1, θN}, · · · , Θ } (2)

where the element Ak is called a focal element of Θ if Ak is nonempty.

Definition 2.2: Mass function
A mass function, also called a basic probability assignment (BPA), is a mapping m from 2^Θ to [0, 1], defined as follows:

m : 2^Θ → [0, 1] (3)

which is constrained by the following conditions:

∑_{A∈2^Θ} m(A) = 1 (4)

m(∅) = 0 (5)

Compared with a probability distribution in probability theory, the mass function provides a more efficient manner to handle uncertainty [63].
In addition, some other extensions of the mass function in quantum information have received attention recently [17].

2.2 Shannon entropy

Entropy plays an important role in measuring uncertainty [34, 41]. In the field of classical probability theory, Shannon entropy [47] is often used to measure the uncertainty of a probability distribution. Consider a probability distribution P defined on the set Θ = {H1, H2, H3, · · · , HN}.

Definition 2.3: Shannon entropy
Shannon entropy Hs(P) is defined as follows:

Hs(P) = ∑_{θ∈Θ} P(θ) log( 1 / P(θ) ) (6)

where ∑_{θ∈Θ} P(θ) = 1 and P(θ) ∈ [0, 1]. Usually, the base of the logarithm is 2, and entropy then has the unit of bits. It is not hard to find that Hs(P) lies on the scale [0, log N].

2.3 Deng entropy

In information theory, entropy can be used to measure the uncertainty of a system. Recently, a novel entropy, named Deng entropy [8], was proposed to measure the uncertainty in evidence theory.

Definition 2.4: Deng entropy
Deng entropy is defined as:

HDE(m) = − ∑_{A∈2^Θ} m(A) log( m(A) / (2^|A| − 1) ) (7)

where |A| is the cardinality of a focal element A. Deng entropy is a generalization of Shannon entropy. When every focal element is a singleton, Deng entropy degenerates into Shannon entropy [18]. Through a simple transformation, Eq. (7) can be rewritten as follows:

HDE(m) = ∑_{A∈2^Θ} m(A) log(2^|A| − 1) − ∑_{A∈2^Θ} m(A) log m(A) (8)

where ∑_{A∈2^Θ} m(A) log(2^|A| − 1) and − ∑_{A∈2^Θ} m(A) log m(A) are measurements of nonspecificity and discord, respectively. As a result, Deng entropy is a composite measurement of nonspecificity and discord, which means that it is a tool for measuring total uncertainty [31].

2.4 The maximum Deng entropy

Assume A is a focal element of a certain frame of discernment Θ and m(A) is the BPA for A.
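As an illustrative sketch (a hypothetical helper, not code from the paper), Eq. (7) translates directly into Python; for a BPA whose focal elements are all singletons it reproduces Shannon entropy:

```python
from math import log2

def deng_entropy(bpa):
    """Deng entropy of a BPA given as {frozenset focal element: mass}."""
    return -sum(m * log2(m / (2 ** len(A) - 1))
                for A, m in bpa.items() if m > 0)

# Singleton focal elements: Deng entropy degenerates into Shannon entropy.
uniform = {frozenset({i}): 1 / 3 for i in range(3)}
print(deng_entropy(uniform))  # ≈ 1.584963 = log2(3)
```

For a BPA with multi-element focal elements, the denominator 2^|A| − 1 adds the nonspecificity contribution of Eq. (8).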
According to [24], the analytic solution of the maximum Deng entropy is as follows:

Theorem 2.1: The analytic solution of the maximum Deng entropy
If and only if m(A) = (2^|A| − 1) / ∑_{A∈2^Θ} (2^|A| − 1), Deng entropy reaches its maximum value. The maximum Deng entropy is

HMDE(m) = log ∑_{A∈2^Θ} (2^|A| − 1) (9)

3 Information volume of mass function

Given a probability distribution, the associated information volume can be measured by Shannon entropy. However, how to measure the information volume of a given mass function is still an open issue. In this section, based on Deng entropy, the information volume of a mass function is defined.

Definition 3.1: Information volume of mass function
Let the frame of discernment be Θ = {θ1, θ2, θ3, · · · , θN}. Use the index i to denote the iteration count of the loop below, and use m(Ai) to denote the mass functions obtained at different iterations. Based on Deng entropy, the information volume of a mass function can be calculated by the following steps:

step 1: Input the mass function m(A0).

step 2: Continuously separate the mass assigned to each element whose cardinality is larger than 1 until convergence. Concretely, repeat the loop from step 2-1 to step 2-3 until Deng entropy converges.

step 2-1: Focus on each element whose cardinality is larger than 1, namely |Ai−1| > 1, and separate its mass among the nonempty subsets of Ai−1 in the proportions of the BPA of the maximum Deng entropy on Ai−1:

m(B) = m(Ai−1) · (2^|B| − 1) / ∑_{∅≠B′⊆Ai−1} (2^|B′| − 1), for each nonempty B ⊆ Ai−1 (10)

For example, given a focal element Ai−1 = {θx, θy} and its mass function m(Ai−1), the separating proportion is 1/5 : 1/5 : 3/5. The ith separation divides m(Ai−1) and yields the following new mass functions: m(Xi), m(Yi), m(Zi), where Xi = {θx}, Yi = {θy} and Zi = {θx, θy}.
In addition, they satisfy these equations:

m(Xi) + m(Yi) + m(Zi) = m(Ai−1) (11)

m(Xi) : m(Yi) : m(Zi) = 1/5 : 1/5 : 3/5 (12)

step 2-2: Based on Deng entropy, calculate the uncertainty of all the current mass functions except those that have already been divided. The result is denoted as Hi(m).

step 2-3: Calculate ∆i = Hi(m) − Hi−1(m). When ∆i satisfies the following condition, jump out of the loop:

∆i = Hi(m) − Hi−1(m) < ε (13)

where ε is the allowable error.

step 3: Output HIV−mass(m) = Hi(m), which is the information volume of the mass function.

4 Numerical examples and discussions

In this section, some examples are expounded to better explain the proposed definition of the information volume of a mass function, and a discussion follows every example. In the following examples, the base of the logarithmic function is 2, and the allowable error is 0.001.

Example 4.1:
Consider the focal elements X = {θ1}, Y = {θ2} and Z = {θ3}. Let the mass function be m0(X) = m0(Y) = m0(Z) = 1/3. Because there is no focal element whose cardinality is larger than 1, step 2-1 is skipped in every iteration of the loop. Then, in step 2-2, Deng entropy is used to calculate the uncertainty of this mass function:

Hi(m) = −(1/3) log2(1/3) − (1/3) log2(1/3) − (1/3) log2(1/3) = 1.584963 (14)

After going through the loop again, the new Hi(m) is also 1.584963 since step 2-1 is always skipped. As a result, the loop terminates and the information volume of this mass function is HIV−mass(m) = 1.584963. Actually, this form of mass function is the probability distribution P1 = P2 = P3 = 1/3. Hence, when the mass function degenerates into a probability distribution, the value of HIV−mass(m) is identical to the Shannon entropy.

Example 4.2:
Consider the frame of discernment Θ = {θ1, θ2, θ3}. Let the mass function be m0({θ1}) = m0({θ2}) = m0({θ3}) = m0({θ1,θ2}) = m0({θ1,θ3}) = m0({θ2,θ3}) = m0({θ1,θ2,θ3}) = 1/7.
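The loop of Definition 3.1 can be sketched in Python (an illustrative reading of the algorithm, not code from the paper, assuming each split piece is tracked as a separate leaf so that only its cardinality and mass matter for Deng entropy):

```python
from math import comb, log2

def deng_entropy(leaves):
    # Deng entropy, Eq. (7), over (cardinality, mass) leaf pairs.
    return -sum(m * log2(m / (2 ** k - 1)) for k, m in leaves if m > 0)

def split_once(leaves):
    # step 2-1: divide every leaf with |A| > 1 among its nonempty subsets
    # in the maximum-Deng-entropy proportions (2^|B| - 1) / (3^|A| - 2^|A|).
    new = []
    for k, m in leaves:
        if k == 1:
            new.append((k, m))
            continue
        total = 3 ** k - 2 ** k  # sum of 2^|B| - 1 over nonempty B subset of A
        for j in range(1, k + 1):         # subset cardinality
            for _ in range(comb(k, j)):   # number of subsets of that size
                new.append((j, m * (2 ** j - 1) / total))
    return new

def information_volume(leaves, eps=0.001):
    h_prev = deng_entropy(leaves)         # H_1(m)
    while True:
        leaves = split_once(leaves)       # step 2-1
        h = deng_entropy(leaves)          # step 2-2
        if h - h_prev < eps:              # step 2-3
            return h
        h_prev = h

# Example 4.1: three singletons with mass 1/3 each -> Shannon entropy.
print(information_volume([(1, 1/3)] * 3))  # ≈ 1.584963
# Example 4.2: mass 1/7 on every nonempty subset of a 3-element frame.
print(round(information_volume([(1, 1/7)] * 3 + [(2, 1/7)] * 3 + [(3, 1/7)]), 6))
```

With ε = 0.001 the second call stops once ∆i < 0.001; the paper's Table 1 reports the converged value 5.199486.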
The information volume of this mass function can be calculated by Definition 3.1. The convergence procedure of Hi(m) is listed in Table 1.

Table 1: The convergence procedure of Hi(m)
i	Hi(m)	i	Hi(m)
1	3.887675	9	5.178227
2	4.409314	10	5.187146
3	4.724509	11	5.192498
4	4.914440	12	5.195709
5	5.028700	13	5.197636
6	5.097366	14	5.198792
7	5.138606	15	5.199486
8	5.163366

According to Table 1, when i = 15, Hi(m) − Hi−1(m) < 0.001, which means that Hi(m) finally converges to 5.199486. Hence the information volume of this mass function is HIV−mass(m) = 5.199486. If Deng entropy is used to measure the uncertainty of this mass function, the result is as follows:

HDE(m) = −(1/7) log2( (1/7) / (2^1 − 1) ) × 3 − (1/7) log2( (1/7) / (2^2 − 1) ) × 3 − (1/7) log2( (1/7) / (2^3 − 1) ) = 3.887675 (15)

Comparing HIV−mass(m) with HDE(m) in this example, HIV−mass(m) is larger than HDE(m), which shows that, given a mass function, the corresponding information volume is larger than its uncertainty measured by Deng entropy.

Example 4.3:
Consider the frame of discernment U = {θ1, θ2}, and let X = {θ1} and Y = {θ2} be singletons. Let the mass function be m0(X) = m0(Y) = 1/5 and m0(U) = 3/5, which is the BPA distribution of the maximum Deng entropy when the cardinality of the frame of discernment is 2.

The information volume of this mass function can be calculated by Definition 3.1, whose calculating procedure is illustrated in Figure 1. For the convenience of comprehension, the calculating procedure can be abstracted as the directed acyclic graphical model shown in Figure 2.

Figure 1: The calculating procedure of Example 4.3

Figure 2: The directed acyclic graphical model of Example 4.3

Table 2: The convergence procedure of Hi(m)
i	Hi(m)	i	Hi(m)
1	2.321928	8	3.396431
2	2.764107	9	3.408809
3	3.029415	10	3.416236
4	3.188600	11	3.420692
5	3.284110	12	3.423366
6	3.341417	13	3.424970
7	3.375801	14	3.425933

Then, the convergence procedure of Hi(m) is listed in Table 2.
According to Table 2, as the BPA of each element whose cardinality is larger than 1 is continuously separated, the ∆i of Deng entropy becomes smaller and smaller. When i = 14, Hi(m) − Hi−1(m) < 0.001, which means that Hi(m) finally converges to 3.425933. Hence, when the cardinality of the frame of discernment is 2, the information volume of the BPA distribution of the maximum Deng entropy is HIV−mass(m) = 3.425933.

Example 4.4:
Consider the frame of discernment Θ = {θ1, θ2}. Let the mass function be m0(Θ) = m0({θ1,θ2}) = 1, which is called the total uncertainty case when the cardinality of the frame of discernment is 2.

The information volume of this total uncertainty case can be calculated by Definition 3.1. The calculating procedure can be abstracted as the directed acyclic graphical model shown in Figure 3, and the convergence procedure of Hi(m) is listed in Table 3.

Figure 3: The directed acyclic graphical model of Example 4.4

Table 3: The convergence procedure of Hi(m)
i	Hi(m)	i	Hi(m)
1	1.584963	9	3.396431
2	2.321928	10	3.408809
3	2.764107	11	3.416236
4	3.029415	12	3.420692
5	3.188600	13	3.423366
6	3.284110	14	3.424970
7	3.341417	15	3.425933
8	3.375801

According to Table 3, when i = 15, Hi(m) − Hi−1(m) < 0.001, which means that Hi(m) finally converges to 3.425933. Hence, when the cardinality of the frame of discernment is 2, the information volume of the total uncertainty case is HIV−mass(m) = 3.425933.

Comparing Example 4.3 with Example 4.4, we can find that, although the directed acyclic graphical models of the two examples are not the same, their HIV−mass(m) is identical. It can be concluded that, when the cardinality of the frame of discernment is identical, the BPA distribution of the maximum Deng entropy and the total uncertainty case have identical information volume. The remaining examples further illustrate this conclusion.

Example 4.5:
Consider the frame of discernment Θ = {θ1, θ2, θ3}.
Let the mass function be m0({θ1}) = m0({θ2}) = m0({θ3}) = 1/19, m0({θ1,θ2}) = m0({θ1,θ3}) = m0({θ2,θ3}) = 3/19, m0({θ1,θ2,θ3}) = 7/19, which is the BPA distribution of the maximum Deng entropy when the cardinality of the frame of discernment is 3.

The information volume of this mass function can be calculated by Definition 3.1. The convergence procedure of Hi(m) is listed in Table 4.

Table 4: The convergence procedure of Hi(m)
i	Hi(m)	i	Hi(m)
1	4.247928	9	6.432107
2	5.127754	10	6.447290
3	5.661354	11	6.456402
4	5.983615	12	6.461869
5	6.177746	13	6.465150
6	6.294510	14	6.467119
7	6.364674	15	6.468300
8	6.406810	16	6.469009

According to Table 4, when i = 16, Hi(m) − Hi−1(m) < 0.001, which means that Hi(m) finally converges to 6.469009. Hence, when the cardinality of the frame of discernment is 3, the information volume of the BPA distribution of the maximum Deng entropy is HIV−mass(m) = 6.469009.

Example 4.6:
Consider the frame of discernment Θ = {θ1, θ2, θ3}. Let the mass function be m0(Θ) = m0({θ1,θ2,θ3}) = 1, which is called the total uncertainty case when the cardinality of the frame of discernment is 3.

The information volume of the total uncertainty case can be calculated by Definition 3.1. The convergence procedure of Hi(m) is listed in Table 5.

Table 5: The convergence procedure of Hi(m)
i	Hi(m)	i	Hi(m)
1	2.807355	10	6.432107
2	4.247928	11	6.447290
3	5.127754	12	6.456402
4	5.661354	13	6.461869
5	5.983615	14	6.465150
6	6.177746	15	6.467119
7	6.294510	16	6.468300
8	6.364674	17	6.469009
9	6.406810

According to Table 5, when i = 17, Hi(m) − Hi−1(m) < 0.001, which means that Hi(m) finally converges to 6.469009. Hence the information volume of this total uncertainty case is HIV−mass(m) = 6.469009.
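The equality observed in Examples 4.3 through 4.6 can be cross-checked numerically with a compact sketch (illustrative Python, not code from the paper). The BPAs of Examples 4.3 and 4.5 follow from Theorem 2.1, since ∑_{A∈2^Θ}(2^|A| − 1) = ∑_k C(N,k)(2^k − 1) = 3^N − 2^N by the binomial theorem; running the Definition 3.1 loop on the total uncertainty case and on the maximum-Deng-entropy BPA then returns the same value:

```python
from math import comb, log2

def max_deng_masses(N):
    # Theorem 2.1: one (cardinality, mass) pair per nonempty subset; the
    # normalizer, the sum of (2^|A| - 1) over nonempty A, equals 3^N - 2^N.
    total = 3 ** N - 2 ** N
    return [(k, (2 ** k - 1) / total)
            for k in range(1, N + 1) for _ in range(comb(N, k))]

def info_volume(leaves, eps=0.001):
    # The Definition 3.1 loop over (cardinality, mass) leaves.
    h_prev = -sum(m * log2(m / (2 ** k - 1)) for k, m in leaves)
    while True:
        leaves = [p for k, m in leaves for p in (
            [(k, m)] if k == 1 else
            [(j, m * (2 ** j - 1) / (3 ** k - 2 ** k))
             for j in range(1, k + 1) for _ in range(comb(k, j))])]
        h = -sum(m * log2(m / (2 ** k - 1)) for k, m in leaves)
        if h - h_prev < eps:
            return h
        h_prev = h

# N = 3: total uncertainty case vs. maximum-Deng-entropy BPA (1/19, 3/19, 7/19).
print(round(info_volume([(3, 1.0)]), 6))          # Table 5 reports 6.469009
print(round(info_volume(max_deng_masses(3)), 6))  # Table 4 reports 6.469009
```

The equality is no coincidence: one split of the total uncertainty case produces exactly the maximum-Deng-entropy BPA, so the two Hi(m) sequences coincide up to a shift of one iteration, as Tables 4 and 5 show.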
Example 4.5 and Example 4.6 further illustrate that, when the cardinality of the frame of discernment is identical, the information volume of the BPA distribution of the maximum Deng entropy is the same as that of the total uncertainty case. This point is consistent with intuition.

5 Conclusion

In this paper, we define the information volume of a given mass function based on Deng entropy. In addition, some examples are shown for a better understanding of the proposed information volume of a mass function. Some concluding remarks are as follows.
1) If the mass function degenerates into a probability distribution, the information volume is the same as Shannon entropy.
2) Given a mass function, the corresponding information volume is larger than its uncertainty measured by Deng entropy.
3) One interesting point is that, when the cardinality of the frame of discernment is identical, the BPA distribution of the maximum Deng entropy and the total uncertainty case have the same information volume. This point is consistent with intuition.

Acknowledgment

The author greatly appreciates the Chinese Academy of Engineering academicians, Professor Shan Zhong and Professor You He, for their encouragement of this research. The author greatly appreciates Professor Yugeng Xi of Shanghai Jiao Tong University for supporting this work. Research assistant Jixiang Deng and Ph.D. student Tao Wen discussed the idea and did a lot of editorial work. The author greatly appreciates the continuous funding for the past nearly twenty years, including the National Natural Science Foundation of China (Grant Nos. 30400067, 60874105, 61174022, 61573290 and 61973332), the Program for New Century Excellent Talents in University (Grant No. NCET-08-0345), the Shanghai Rising-Star Program (Grant No. 09QA1402900), and the Chongqing Natural Science Foundation for Distinguished Scientists (Grant No. CSCT, 2010BA2003).

References

[1] Alcantud, J.C.; Feng, F.; Yager, R. (2020).
An N-soft set approach to rough sets. IEEE Transactions on Fuzzy Systems, 28(11), 2996-3007, 2020.
[2] Cao, Z.; Chuang, C.H.; King, J.K.; Lin, C.T. (2019). Multi-channel EEG recordings during a sustained-attention driving task. Scientific Data, 6, 19, 2019.
[3] Cao, Z.; Ding, W.; Wang, Y.K.; Hussain, F.K.; Al-Jumaily, A.; Lin, C.T. (2020). Effects of Repetitive SSVEPs on EEG Complexity using Multiscale Inherent Fuzzy Entropy. Neurocomputing, 389, 198-206, 2020.
[4] Cao, Z.; Lin, C.T.; Lai, K.L.; Ko, L.W.; King, J.T.; Liao, K.K.; Fuh, J.L.; Wang, S.J. (2020). Extraction of SSVEPs-based Inherent fuzzy entropy using a wearable headband EEG in migraine patients. IEEE Transactions on Fuzzy Systems, 28(1), 14-27, 2020.
[5] Cavaliere, D.; Morente-Molinera, J.A.; Loia, V.; Senatore, S.; Herrera-Viedma, E. (2020). Collective scenario understanding in a multi-vehicle system by consensus decision making. IEEE Transactions on Fuzzy Systems, 28(9), 1984-1995, 2020.
[6] Dempster, A.P. (1967). Upper and lower probabilities induced by a multivalued mapping. The Annals of Mathematical Statistics, 38(2), 325–339, 1967.
[7] Deng, X.; Jiang, W. (2019). A total uncertainty measure for D numbers based on belief intervals. International Journal of Intelligent Systems, 34(12), 3302–3316, 2019.
[8] Deng, Y. (2016). Deng entropy. Chaos, Solitons & Fractals, 91, 549–553, 2016.
[9] Deng, Y. (2020). Uncertainty measure in evidence theory. Science China Information Sciences, 63(11), 210201, 2020.
[10] Dzitac, I.; Filip, F.G.; Manolescu, M.J. (2017). Fuzzy logic is not fuzzy: World-renowned computer scientist Lotfi A. Zadeh. International Journal of Computers Communications & Control, 12(6), 748–789, 2017.
[11] Fang, R.; Liao, H.; Yang, J.B.; Xu, D.L. (2020). Generalised probabilistic linguistic evidential reasoning approach for multi-criteria decision-making under uncertainty. Journal of the Operational Research Society, (2), 1–15, 2020.
[12] Feng, F.; Xu, Z.; Fujita, H.; Liang, M. (2020).
Enhancing PROMETHEE method with intuitionistic fuzzy soft sets. International Journal of Intelligent Systems, 35(7), 1071-1104, 2020.
[13] Fu, C.; Chang, W.; Xue, M.; Yang, S. (2019). Multiple criteria group decision making with belief distributions and distributed preference relations. European Journal of Operational Research, 273(2), 623–633, 2019.
[14] Fu, C.; Liu, W.; Chang, W. (2020). Data-driven multiple criteria decision making for diagnosis of thyroid cancer. Annals of Operations Research, 293, 833–862, 2020.
[15] Fujita, H.; Gaeta, A.; Loia, V.; Orciuoli, F. (2020). Hypotheses analysis and assessment in counter-terrorism activities: a method based on OWA and fuzzy probabilistic rough sets. IEEE Transactions on Fuzzy Systems, 28(5), 831-845, 2020.
[16] Fujita, H.; Ko, Y.C. (2020). A heuristic representation learning based on evidential memberships: Case study of UCI-SPECTF. International Journal of Approximate Reasoning, 120, 125-137, 2020.
[17] Gao, X.; Deng, Y. (2020). Quantum Model of Mass Function. International Journal of Intelligent Systems, 35(2), 267–282, 2020.
[18] Gao, X.; Deng, Y. (2020). The pseudo-Pascal triangle of maximum Deng entropy. International Journal of Computers Communications & Control, 15(1), 1006, 2020.
[19] Garg, H.; Chen, S. (2020). Multiattribute group decision making based on neutrality aggregation operators of q-rung orthopair fuzzy sets. Information Sciences, 517, 427–447, 2020.
[20] Garg, H.; Kumar, K. (2019). Linguistic interval-valued Atanassov intuitionistic fuzzy sets and their applications to group decision making problems. IEEE Transactions on Fuzzy Systems, 27(12), 2302–2311, 2019.
[21] Gou, X.; Liao, H.; Xu, Z.; Min, R.; Herrera, F. (2019). Group decision making with double hierarchy hesitant fuzzy linguistic preference relations: consistency based measures, index and repairing algorithms and decision model. Information Sciences, 489, 93–112, 2019.
[22] Gou, X.; Xu, Z.; Herrera, F. (2018). Consensus reaching process for large-scale group decision making with double hierarchy hesitant fuzzy linguistic preference relations. Knowledge-Based Systems, 157, 20–33, 2018.
[23] Jiang, W.; Cao, Y.; Deng, X. (2020). A Novel Z-network Model Based on Bayesian Network and Z-number. IEEE Transactions on Fuzzy Systems, 28(8), 1585-1599, 2020.
[24] Kang, B.; Deng, Y. (2019). The maximum Deng entropy. IEEE Access, 7, 120758-120765, 2019.
[25] Kang, B.; Zhang, P.; Gao, Z.; Chhipi-Shrestha, G.; Hewage, K.; Sadiq, R. (2020). Environmental assessment under uncertainty using Dempster–Shafer theory and Z-numbers. Journal of Ambient Intelligence and Humanized Computing, 11, 2041–2060, 2020.
[26] Lee, P. (1980). Probability theory. Bulletin of the London Mathematical Society, 12(4), 318–319, 1980.
[27] Li, H.; Yuan, R.; Fu, J. (2019). A reliability modeling for multi-component systems considering random shocks and multistate degradation. IEEE Access, 7, 168805–168814, 2019.
[28] Liao, H.; Mi, X.; Xu, Z. (2020). A survey of decision-making methods with probabilistic linguistic information: Bibliometrics, preliminaries, methodologies, applications and future directions. Fuzzy Optimization and Decision Making, 19, 81–134, 2020.
[29] Huang, L.; Liu, Z.; Pan, Q. et al. (2020). Evidential combination of augmented multi-source of information based on domain adaptation. Science China Information Sciences, 63, 210203, 2020.
[30] Liu, B.; Deng, Y. (2019). Risk evaluation in failure mode and effects analysis based on D numbers theory. International Journal of Computers Communications & Control, 14(5), 672-691, 2019.
[31] Liu, F.; Gao, X.; Zhao, J.; Deng, Y. (2019). Generalized belief entropy and its application in identifying conflict evidence. IEEE Access, 7(1), 126625–126633, 2019.
[32] Liu, H.; Dzitac, I.; Guo, S. (2018). Reduction of conditional factors in causal analysis.
International Journal of Computers Communications & Control, 13(3), 383–390, 2018.
[33] Liu, P.; Zhang, X. (2020). A novel approach to multi-criteria group decision-making problems based on linguistic D numbers. Computational and Applied Mathematics, 39, 132, 2020.
[34] Liu, P.; Zhang, X.; Wang, Z. (2020). An extended VIKOR method for multiple attribute decision making with linguistic D numbers based on fuzzy entropy. International Journal of Information Technology & Decision Making, 19(1), 143-167, 2020.
[35] Liu, Q.; Tian, Y.; Kang, B. (2019). Derive knowledge of Z-number from the perspective of Dempster-Shafer evidence theory. Engineering Applications of Artificial Intelligence, 85, 754–764, 2019.
[36] Liu, Y.; Jiang, W. (2020). A new distance measure of interval-valued intuitionistic fuzzy sets and its application in decision making. Soft Computing, 24, 6987–7003, 2020.
[37] Liu, Z.; Zhang, X.; Niu, J.; Dezert, J. (2020). Combination of classifiers with different frames of discernment based on belief functions. IEEE Transactions on Fuzzy Systems, doi: 10.1109/TFUZZ.2020.2985332, 2020.
[38] Liu, Z.G.; Pan, Q.; Dezert, J.; Martin, A. (2018). Combination of classifiers with optimal weight based on evidential reasoning. IEEE Transactions on Fuzzy Systems, 6(3), 1217–1230, 2018.
[39] Luo, Z.; Deng, Y. (2020). A vector and geometry interpretation of basic probability assignment in Dempster-Shafer theory. International Journal of Intelligent Systems, 35(6), 944–962, 2020.
[40] Morente-Molinera, J.; Wu, X.; Morfeq, A.; Al-Hmouz, R.; Herrera-Viedma, E. (2020). A novel multi-criteria group decision-making method for heterogeneous and dynamic contexts using multi-granular fuzzy linguistic modelling and consensus measures. Information Fusion, 53, 240–250, 2020.
[41] Pan, L.; Deng, Y. (2020). Probability transform based on the ordered weighted averaging and entropy difference. International Journal of Computers Communications & Control, 15(4), 3743, 2020.
[42] Pan, Y.; Zhang, L.; Li, Z.; Ding, L. (2020). Improved fuzzy Bayesian network-based risk analysis with interval-valued fuzzy sets and D-S evidence theory. IEEE Transactions on Fuzzy Systems, 28(9), 2063-2077, 2020.
[43] Pan, Y.; Zhang, L.; Wu, X.; Skibniewski, M.J. (2020). Multi-classifier information fusion in risk analysis. Information Fusion, 60, 121–136, 2020.
[44] Pawlak, Z. (1982). Rough sets. International Journal of Computer & Information Sciences, 11(5), 341–356, 1982.
[45] Rong, H.; Ge, M.; Zhang, G.; Zhu, M. (2018). An approach for detecting fault lines in a small current grounding system using fuzzy reasoning spiking neural P systems. International Journal of Computers Communications & Control, 13(4), 521–536, 2018.
[46] Shafer, G. (1976). A Mathematical Theory of Evidence, vol. 1. Princeton University Press, Princeton, 1976.
[47] Shannon, C.E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27(4), 379–423, 1948.
[48] Song, Y.; Fu, Q.; Wang, Y.F.; Wang, X. (2019). Divergence-based cross entropy and uncertainty measures of Atanassov's intuitionistic fuzzy sets with their application in decision making. Applied Soft Computing, 84, 105703, 2019.
[49] Song, Y.; Wang, X.; Wu, W.; Quan, W.; Huang, W. (2018). Evidence combination based on credibility and non-specificity. Pattern Analysis and Applications, 21(1), 167–180, 2018.
[50] Tsallis, C. (1988). Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics, 52(1-2), 479–487, 1988.
[51] Tsallis, C. (2009). Nonadditive entropy: The concept and its use. The European Physical Journal A, 40(3), 257, 2009.
[52] Wang, H.; Fang, Y.P.; Zio, E. (2019). Risk assessment of an electrical power system considering the influence of traffic congestion on a hypothetical scenario of electrified transportation system in New York State. IEEE Transactions on Intelligent Transportation Systems, 2019.
[53] Xiao, F. (2019).
Generalization of Dempster–Shafer theory: A complex mass function. Applied Intelligence, DOI: 10.1007/s10489-019-01617-y, 2019.
[54] Xiao, F. (2020). CED: A distance for complex mass functions. IEEE Transactions on Neural Networks and Learning Systems, DOI: 10.1109/TNNLS.2020.2984918, 2020.
[55] Xiao, F. (2020). GIQ: A generalized intelligent quality-based approach for fusing multi-source information. IEEE Transactions on Fuzzy Systems, DOI: 10.1109/TFUZZ.2020.2991296, 2020.
[56] Xiao, F. (2020). A new divergence measure for belief functions in D-S evidence theory for multi-sensor data fusion. Information Sciences, 514, 462–483, 2020.
[57] Xu, X.; Xu, H.; Wen, C.; Li, J.; Hou, P.; Zhang, J. (2018). A belief rule-based evidence updating method for industrial alarm system design. Control Engineering Practice, 81, 73–84, 2018.
[58] Xu, X.B.; Ma, X.; Wen, C.L.; Huang, D.R.; Li, J.N. (2018). Self-tuning method of PID parameters based on belief rule base inference. Information Technology and Control, 47(3), 551–563, 2018.
[59] Yager, R.R. (2012). On Z-valuations using Zadeh's Z-numbers. International Journal of Intelligent Systems, 27(3), 259–278, 2012.
[60] Yager, R.R. (2018). Fuzzy rule bases with generalized belief structure inputs. Engineering Applications of Artificial Intelligence, 72, 93–98, 2018.
[61] Yager, R.R. (2018). Interval valued entropies for Dempster–Shafer structures. Knowledge-Based Systems, 161, 390–397, 2018.
[62] Yager, R.R. (2019). Generalized Dempster–Shafer structures. IEEE Transactions on Fuzzy Systems, 27(3), 428–435, 2019.
[63] Yan, H.; Deng, Y. (2020). An improved belief entropy in evidence theory. IEEE Access, 8(1), 57505–57516, 2020.
[64] Yang, J.; Xu, D. (2002). On the evidential reasoning algorithm for multiple attribute decision analysis under uncertainty. IEEE Transactions on Systems, Man, and Cybernetics, Part A, 32(3), 289–304, 2002.
[65] Yang, J.; Xu, D. (2013). Evidential reasoning rule for evidence combination.
Artificial Intelligence, 205, 1–29, 2013.
[66] Yuan, R.; Tang, M.; Wang, H.; Li, H. (2019). A reliability analysis method of accelerated performance degradation based on Bayesian strategy. IEEE Access, 7, 169047–169054, 2019.
[67] Zadeh, L.A. (1965). Fuzzy sets. Information and Control, 8(3), 338–353, 1965.
[68] Zhou, M.; Liu, X.; Yang, J. (2017). Evidential reasoning approach for MADM based on incomplete interval value. Journal of Intelligent & Fuzzy Systems, 33(6), 3707–3721, 2017.
[69] Zhou, M.; Liu, X.B.; Chen, Y.W.; Yang, J.B. (2018). Evidential reasoning rule for MADM with both weights and reliabilities in group decision making. Knowledge-Based Systems, 143, 142–161, 2018.

Copyright ©2020 by the authors. Licensee Agora University, Oradea, Romania. This is an open access article distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International License. Journal's webpage: http://univagora.ro/jour/index.php/ijccc/

This journal is a member of, and subscribes to the principles of, the Committee on Publication Ethics (COPE). https://publicationethics.org/members/international-journal-computers-communications-and-control

Cite this paper as: Deng, Y. (2020). Information Volume of Mass Function, International Journal of Computers Communications & Control, 15(6), 3983, 2020. https://doi.org/10.15837/ijccc.2020.6.3983