Open access peer-reviewed chapter

Fuzzy Information Measures with Multiple Parameters

Written By

Anjali Munde

Submitted: 12 April 2018 Reviewed: 18 May 2018 Published: 31 October 2018

DOI: 10.5772/intechopen.78803

From the Edited Volume

Fuzzy Logic Based in Optimization Methods and Control Systems and Its Applications

Edited by Ali Sadollah


Abstract

Information theory deals with the study of problems concerning any system: information processing, information storage, information retrieval and decision making. It studies all theoretical problems connected with the transmission of information over communication channels, including uncertainty (information) measures and practical, economical methods of coding information for transmission. In this chapter, a new generalised measure of fuzzy information involving two real parameters is introduced. The proposed measure satisfies all the necessary properties of a measure, and some of its additional properties are also studied. Further, the monotonic nature of the generalised fuzzy information measure with respect to its parameters is studied, and its validity is checked by constructing computed tables and plots for different fuzzy sets and different values of the parameters. Finally, a new generalised fuzzy information measure involving three parameters is introduced.

Keywords

  • fuzzy set theory
  • entropy
  • fuzzy information measures
  • monotonicity
  • symmetry

1. Background

Shannon [1] introduced the concept of entropy in communication theory and founded the subject of information theory. Entropy, an important property of stochastic systems, has since been widely used in various fields.

Further, the second law of thermodynamics, which states that the entropy of a system cannot decrease spontaneously, describes the tendency of systems to become more disordered over time. Information theory has thus found wide application in statistics, information processing and computing, rather than being concerned with communication systems only.

If we consider entropy as equivalent to uncertainty, an enormous deal of insight can be obtained. Zadeh [2] introduced and developed a generalised theory of vagueness or ambiguity.

Uncertainty plays a significant role in our perceptions about the external world. Any discipline that can assist us in understanding it, measuring it, regulating it, maximizing or minimizing it and ultimately controlling it to the extent possible, should certainly be considered an important contribution to our scientific understanding of complex phenomena.

Uncertainty is not a single monolithic concept. It can appear in several guises. It can arise in what we normally consider a probabilistic phenomenon. On the other hand, it can also appear in a deterministic phenomenon where we know that the outcome is not a chance event, but we are fuzzy about the possibility of the specific outcome. This type of uncertainty arising out of fuzziness is the subject of investigation of the relatively new discipline of fuzzy set theory.

We shall first take up the case of probabilistic uncertainty. Probabilistic uncertainty is related to the uncertainty connected with the probability of outcomes.

Consider a set of events $E = (E_1, E_2, \ldots, E_n)$ with probability distribution $P = (p_1, p_2, \ldots, p_n)$, where $p_i \geq 0$ and $\sum_{i=1}^{n} p_i = 1$.

Then the Shannon [1] entropy associated with P is given by,

$H(P) = -\sum_{i=1}^{n} p_i \log p_i$   (E1)

The base of the logarithm is taken as 2. It is also assumed that $0 \log 0 = 0$.
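As a quick numerical check (a minimal Python sketch added for illustration; it is not part of the original chapter), (E1) can be computed directly, with the convention $0 \log 0 = 0$ handled explicitly:

    import math

    def shannon_entropy(p):
        """Shannon entropy (E1): H(P) = -sum p_i log2 p_i, with 0 log 0 = 0."""
        assert all(pi >= 0 for pi in p) and abs(sum(p) - 1.0) < 1e-9
        return -sum(pi * math.log2(pi) for pi in p if pi > 0)

    print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: maximum for n = 2
    print(shannon_entropy([1.0, 0.0]))   # 0.0: a degenerate distribution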

Shannon [1] obtained (Eq. (1)) on the basis of the following postulates:

  1. H(P) should be a continuous, permutationally symmetric function of p1, p2,…, pn; that is, uncertainty should change only by a slight amount if the pi change slightly, and should remain unchanged if the pi are permuted among themselves.

  2. H(p1, p2,…, pn, 0) = H(p1, p2,…, pn), that is, uncertainty should not change when an impossible outcome is added to the scheme.

  3. H(P) should be minimum when P is any one of the n degenerate distributions $\Delta_1 = (1, 0, \ldots, 0)$, $\Delta_2 = (0, 1, \ldots, 0)$, …, $\Delta_n = (0, 0, \ldots, 1)$, and the minimum value should be zero because in all these cases there is no uncertainty about the outcome.

  4. H(P) should be maximum when p1 = p2 = … = pn = 1/n because in this case the uncertainty is maximum.

  5. H(P*Q) = H(P) + H(Q), that is, the uncertainty of two independent probability distributions is the sum of the uncertainties of the two probability distributions.

  6. H(p1, p2,…, pn) = H(p1 + p2, p3,…, pn) + (p1 + p2) H(p1/(p1 + p2), p2/(p1 + p2))

The measure in (Eq. (1)) not only measures uncertainty; it also measures the equality of p1, p2,…, pn, since it attains its maximum value when p1, p2,…, pn are all equal and its minimum value when the pi are most unequal. In fact, the pi can be regarded as proportions rather than probabilities.

After Shannon’s [1] entropy, various other measures of entropy have been proposed.

Renyi [3] defined the entropy of order α as:

$H_{\alpha}(P) = \frac{1}{1-\alpha} \log \frac{\sum_{i=1}^{n} p_i^{\alpha}}{\sum_{i=1}^{n} p_i}, \quad \alpha \neq 1, \ \alpha > 0$   (E2)

Kapur defined the entropy of order α and type β as:

$H_{\alpha,\beta}(P) = \frac{1}{1-\alpha} \log \frac{\sum_{i=1}^{n} p_i^{\alpha+\beta-1}}{\sum_{i=1}^{n} p_i^{\beta}}, \quad \alpha \neq 1, \ \alpha > 0, \ \beta > 0, \ \alpha+\beta-1 > 0$   (E3)

Havrda and Charvát [4] gave the first nonadditive measure of entropy; it is used in the modified form

$H_{\alpha}(P) = \frac{1}{1-\alpha} \left( \sum_{i=1}^{n} p_i^{\alpha} - 1 \right), \quad \alpha \neq 1, \ \alpha > 0.$   (E4)

Behara and Chawla [5] defined the nonadditive τ-entropy as

$H_{\tau}(P) = \frac{1 - \left( \sum_{i=1}^{n} p_i^{1/\tau} \right)^{\tau}}{1 - 2^{\tau-1}}, \quad \tau \neq 1, \ \tau > 0.$   (E5)

Kapur [6] gave the following nonadditive measures of entropy:

$H_a(P) = -\sum_{i=1}^{n} p_i \log p_i + \frac{1}{a} \sum_{i=1}^{n} \left[ (1 + a p_i) \log (1 + a p_i) - a p_i \right], \quad a > 0$   (E6)

$H_b(P) = -\sum_{i=1}^{n} p_i \log p_i + \frac{1}{b} \sum_{i=1}^{n} \left[ (1 + b p_i) \log (1 + b p_i) - (1 + b) \log (1 + b)\, p_i \right], \quad b > 0$   (E7)

$H_c(P) = -\sum_{i=1}^{n} p_i \log p_i + \frac{1}{c^2} \sum_{i=1}^{n} \left[ (1 + c p_i) \log (1 + c p_i) - c p_i \right], \quad c > 0$   (E8)

$H_k(P) = -\sum_{i=1}^{n} p_i \log p_i + \frac{1}{k^2} \sum_{i=1}^{n} \left[ (1 + k p_i) \log (1 + k p_i) - (1 + k) \log (1 + k)\, p_i \right], \quad k > 0$   (E9)
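To make the behaviour of these generalised entropies concrete, the following Python sketch (an illustration added here, using the reconstructed forms of (E2) and (E4)) computes the Renyi and Havrda-Charvat entropies; both approach the Shannon entropy as α → 1:

    import math

    def renyi_entropy(p, alpha):
        """Renyi entropy of order alpha (E2); the denominator sum(p) equals 1
        for a complete probability distribution."""
        return math.log2(sum(pi ** alpha for pi in p) / sum(p)) / (1 - alpha)

    def havrda_charvat_entropy(p, alpha):
        """Havrda-Charvat nonadditive entropy (E4)."""
        return (sum(pi ** alpha for pi in p) - 1) / (1 - alpha)

    p = [0.5, 0.25, 0.25]
    for a in (0.99, 1.01, 2.0):
        # Renyi tends to the Shannon entropy in bits as alpha -> 1;
        # Havrda-Charvat tends to the Shannon entropy in natural units.
        print(a, renyi_entropy(p, a), havrda_charvat_entropy(p, a))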

2. Technical aspects of fuzzy measures

Zadeh [2] introduced fuzzy set theory, which is associated with the vagueness arising in human cognitive processes. The transition for an element between membership and nonmembership is abrupt in the universe of classical sets, whereas in the universe of fuzzy sets it is gradual. Thus, the membership function describes the vagueness and ambiguity of an element and takes values in the interval [0, 1].

Kapur [7] explained the concept of fuzzy entropy by considering the vector $(\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n))$.

If $\mu_A(x_i) = 0$, the ith element does not belong to set A; if $\mu_A(x_i) = 1$, the ith element belongs to set A; and if $\mu_A(x_i) = 0.5$, there is maximum ambiguity as to whether the ith element belongs to A or not. Thus $(\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n))$ is termed the fuzzy vector and the set A is identified as a fuzzy set. Crisp sets are those in which the membership value of each element is 0 or 1, so that no uncertainty arises in these sets, whereas sets in which some membership values lie between 0 and 1 are entitled fuzzy sets. A fuzzy set A is represented as $A = \{(x_i, \mu_A(x_i)) : i = 1, 2, \ldots, n\}$, where $\mu_A(x_i)$ gives the degree of belongingness of the element $x_i$ to A. We explain the concept of the membership function $\mu_A : X \to [0, 1]$ as follows:

$\mu_A(x_i) = \begin{cases} 0, & \text{if } x \notin A \text{ and there is no ambiguity} \\ 1, & \text{if } x \in A \text{ and there is no ambiguity} \\ 0.5, & \text{if there is maximum ambiguity whether } x \in A \text{ or } x \notin A \end{cases}$   (E10)

Further, if $\mu_B(x_i) = \mu_A(x_i)$ or $\mu_B(x_i) = 1 - \mu_A(x_i)$ for every i, then the fuzzy sets A and B are characterised as fuzzy equivalent sets. Fuzzy equivalent sets necessarily have identical entropy, although two sets can have the same entropy without being fuzzy equivalent. The member of a class of fuzzy equivalent sets whose membership values are all less than or equal to 0.5 is defined as the standard fuzzy set.

For any fuzzy set A* to be a sharpened version of set A, the following requirements have to be fulfilled:

$\mu_{A^*}(x_i) \geq \mu_A(x_i), \quad \text{if } \mu_A(x_i) \geq 0.5, \ \forall i$   (E11)

and

$\mu_{A^*}(x_i) \leq \mu_A(x_i), \quad \text{if } \mu_A(x_i) \leq 0.5, \ \forall i$   (E12)

Thus, when $x_1, x_2, \ldots, x_n$ are the elements of the universe of discourse, the values $\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n)$ are positioned between 0 and 1, but since their sum is not necessarily unity they cannot be considered probabilities. However,

$\phi_A(x_i) = \frac{\mu_A(x_i)}{\sum_{i=1}^{n} \mu_A(x_i)}, \quad i = 1, 2, \ldots, n$   (E13)

gives a probability distribution.
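For instance (a small Python sketch added for illustration), the normalisation in (E13) converts any nonzero membership vector into a genuine probability distribution:

    def to_probability(mu):
        """Normalisation (E13): phi_i = mu_i / sum of all memberships."""
        total = sum(mu)
        return [m / total for m in mu]

    mu = [0.2, 0.5, 0.9, 0.4]        # membership values; their sum is not 1
    print(to_probability(mu))        # normalised values summing to 1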

The fuzzy uncertainty associated with the ith element is defined as $f(\mu_A(x_i))$, where f has the following properties:

  1. f(x) = 0 when x = 0 or 1.

  2. f(x) increases as x goes from 0 to 0.5.

  3. f(x) decreases as x goes from 0.5 to 1.

  4. f(x) = f(1 − x).

The total fuzzy uncertainty defined as fuzzy entropy for n independent elements is given by,

$H(A) = \sum_{i=1}^{n} f(\mu_A(x_i))$   (E14)

This is called fuzzy entropy.
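As an illustration (a hedged Python sketch added here; the chapter itself does not fix a particular f), one admissible choice of f satisfying properties 1-4 above is the Shannon-type function, which yields the De Luca-Termini entropy of (E16) below up to normalisation:

    import math

    def f_shannon(x):
        """One admissible f: zero at x = 0 and x = 1, increasing on [0, 0.5],
        decreasing on [0.5, 1], and symmetric, f(x) = f(1 - x)."""
        if x in (0.0, 1.0):
            return 0.0
        return -x * math.log2(x) - (1 - x) * math.log2(1 - x)

    def fuzzy_entropy(mu, f=f_shannon):
        """Total fuzzy uncertainty (E14): H(A) = sum of f(mu_A(x_i))."""
        return sum(f(m) for m in mu)

    print(fuzzy_entropy([0.5, 0.5]))   # 2.0: the most fuzzy two-element set
    print(fuzzy_entropy([0.0, 1.0]))   # 0.0: a crisp set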

Uncertainty due to fuzziness of information is measured by fuzzy entropy, whereas uncertainty due to information available in terms of a probability distribution is measured by probabilistic entropy. The following similarities and dissimilarities exist between fuzzy entropy measures and probabilistic entropy measures:

  1. $0 \leq p_i \leq 1$ for each i. Also $0 \leq \mu_A(x_i) \leq 1$ for each i.

  2. $\sum_{i=1}^{n} p_i = 1$ for all probability distributions, but $\sum_{i=1}^{n} \mu_A(x_i)$ need not be equal to unity, and it need not even be the same for all fuzzy sets.

  3. A probabilistic uncertainty measure measures how close the probability distribution (p1, p2, …, pn) is to the uniform distribution (1/n, 1/n, …, 1/n) and how far away it is from the degenerate distributions. A fuzzy uncertainty measure measures how close the fuzzy vector is to the most fuzzy vector (1/2, 1/2, …, 1/2) and how far it is from the vectors of crisp sets.

  4. $\mu_A(x_i)$ gives the same degree of fuzziness as $1 - \mu_A(x_i)$ because both are equidistant from 1/2 and from the crisp values 0 and 1. However, probabilities p and 1 − p make different contributions to probabilistic uncertainty. As such, while most measures of fuzzy entropy are of the form $\sum_{i=1}^{n} f(\mu_A(x_i)) + \sum_{i=1}^{n} f(1 - \mu_A(x_i))$, most measures of probabilistic entropy are of the form $\sum_{i=1}^{n} f(p_i)$. However, some measures of probabilistic entropy can also be of the form $\sum_{i=1}^{n} f(p_i) + \sum_{i=1}^{n} f(1 - p_i)$.

  5. Similarly, while many measures of fuzzy directed divergence are of the form $\sum_{i=1}^{n} f(\mu_A(x_i), \mu_B(x_i)) + \sum_{i=1}^{n} f(1 - \mu_A(x_i), 1 - \mu_B(x_i))$, most of the probabilistic measures are of the form $\sum_{i=1}^{n} f(p_i, q_i)$. For each measure of probabilistic entropy or directed divergence, we have a corresponding measure of fuzzy entropy and fuzzy directed divergence, and vice versa.

  6. The common properties arise from the consideration that both types of measures are based on measures of distance from (1/n, 1/n, …, 1/n) in one case and from (1/2, 1/2, …, 1/2) in the other.

  7. The dissimilarity arises because while $\sum_{i=1}^{n} p_i = 1$, $\sum_{i=1}^{n} \mu_A(x_i)$ is not 1. The probabilities of n − 1 outcomes determine the probability of the nth outcome. However, the fuzziness values of the n elements of a fuzzy set are quite independent, and our knowledge of the fuzziness of n − 1 elements gives us no information about the fuzziness of the nth element.

  8. Conceptually the two types of uncertainty are poles apart. One deals with probabilities or relative frequencies and repeated experiments, while the other deals with the estimation of fuzzy values. Probabilities can be determined objectively and experimentally and should naturally be the same for everyone. Fuzziness is one's perception of the membership of an element of a set and can be subjective. However, after finding the fuzzy value of every member of the set, everything else is objective. In probability theory, likewise, after assigning probabilities everything is objective.

  9. Fuzzy and probabilistic entropies are concave functions of $(\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n))$ and (p1, p2, …, pn) respectively. If we start with any value of $(\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n))$ and approach the vector (1/2, 1/2, …, 1/2), the fuzzy entropy will increase. Similarly, if we start with any probability vector (p1, p2, …, pn) and approach the vector (1/n, 1/n, …, 1/n), the probabilistic entropy will increase. Thus, $Z = F(\mu_A(x_1), \mu_A(x_2), \ldots, \mu_A(x_n))$, where F is a fuzzy entropy, is a concave surface with maximum value at (1/2, 1/2, …, 1/2). Similarly, Z = G(p1, p2, …, pn), where G is a probabilistic entropy, is a concave surface with maximum value at (1/n, 1/n, …, 1/n).

Fuzziness can be found in information processing, in decision making and in our language. Many real-world objects and much of human thinking have uncertainty and fuzziness as their fundamental nature. Uncertainty and fuzziness are removed by the utilization of information: the amount of information is the quantity of uncertainty eliminated, whereas the quantity of fuzziness is the degree of vagueness and ambiguity of uncertainties.

The theory of fuzziness is related to various areas of research such as statistics, information theory, clustering and decision analysis, medical and socio-economic prediction, and image processing. The design and analysis of information-processing methods are applications of the mathematical models related to systems research.

The theories and applications that have been developed from the concept of fuzziness form an extremely large field, of which only a small area is needed here to deal with fuzziness.

We pose problems in the form of decision, management and prediction, and we find their solutions by analysing, understanding and utilising information. Thus, a significant quantity of information, together with a significant quantity of uncertainty, lies at the root of many problems.

As we become aware of how much we know and how much we do not know, as information and uncertainty themselves become the focus of our concern, we begin to see our problems as centring on the issue of complexity.

Thus, ambiguity due to fuzziness of information is calculated by fuzzy entropy, whereas vagueness due to information accessible in terms of a probability distribution is computed by probabilistic entropy.

Entropy theory was developed to measure uncertainty of a probability distribution and therefore it was natural for researchers in fuzzy set theory to make use of entropy concepts in measuring fuzziness.

The entropy of a fuzzy set A having n support points was characterised by Kaufmann [8] as

$H_k(A) = -\frac{1}{\log n} \sum_{i=1}^{n} \phi_A(x_i) \log \phi_A(x_i)$   (E15)

De Luca and Termini [9] suggested the measure

$H_D(A) = -\frac{1}{n \log 2} \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \mu_A(x_i) + (1 - \mu_A(x_i)) \log (1 - \mu_A(x_i)) \right]$   (E16)

Bhandari and Pal [10] suggested the following measure:

$H_e(A) = \frac{1}{n(\sqrt{e} - 1)} \sum_{i=1}^{n} \left[ \mu_A(x_i)\, e^{1 - \mu_A(x_i)} + (1 - \mu_A(x_i))\, e^{\mu_A(x_i)} - 1 \right]$   (E17)
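The two classical measures (E16) and (E17) can be implemented in a few lines of Python (a sketch added here, using the reconstructed forms above); both are normalised so that the most fuzzy set attains the value 1:

    import math

    def deluca_termini(mu):
        """De Luca-Termini fuzzy entropy (E16), normalised by n log 2."""
        def term(m):
            if m in (0.0, 1.0):
                return 0.0
            return m * math.log(m) + (1 - m) * math.log(1 - m)
        return -sum(term(m) for m in mu) / (len(mu) * math.log(2))

    def bhandari_pal(mu):
        """Bhandari-Pal exponential fuzzy entropy (E17)."""
        s = sum(m * math.exp(1 - m) + (1 - m) * math.exp(m) - 1 for m in mu)
        return s / (len(mu) * (math.sqrt(math.e) - 1))

    mu = [0.5, 0.5, 0.5]                 # the most fuzzy set
    print(deluca_termini(mu))            # 1.0
    print(bhandari_pal(mu))              # 1.0, by choice of the normaliser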

Some other measures of fuzzy entropy are:

  1. Corresponding to Sharma and Taneja’s [11] measure of entropy of degree α, β

    $H_{\alpha}^{\beta}(P) = \frac{1}{\beta - \alpha} \left( \sum_{i=1}^{n} p_i^{\alpha} - \sum_{i=1}^{n} p_i^{\beta} \right), \quad \alpha \neq \beta$   (E18)

    we get the measure

    $H_{\alpha}^{\beta}(A) = \frac{1}{\beta - \alpha} \sum_{i=1}^{n} \left[ \left( \mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha} \right) - \left( \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right) \right]$   (E19)

    where either α ≥ 1, β ≤ 1 or α ≤ 1, β ≥ 1 and α = β only if both are unity.

  2. Corresponding to Kapur’s measure of entropy of degree α, β

    $H_{\alpha}^{\beta}(P) = \frac{1}{\alpha + \beta - 2} \left( \sum_{i=1}^{n} p_i^{\alpha} + \sum_{i=1}^{n} p_i^{\beta} - 2 \right)$   (E20)

    we get the measure

    $H_{\alpha}^{\beta}(A) = \frac{1}{\alpha + \beta - 2} \sum_{i=1}^{n} \left[ \mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha} + \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} - 2 \right]$   (E21)

  3. Corresponding to Kapur’s [12] measure of entropy

    $H_a(P) = -\sum_{i=1}^{n} p_i \log p_i + \frac{1}{a} \sum_{i=1}^{n} (1 + a p_i) \log (1 + a p_i) - \frac{1}{a} (1 + a) \log (1 + a), \quad a > 0$   (E22)

    we get the measure

    $H_a(A) = -\sum_{i=1}^{n} \left[ \mu_A(x_i) \log \mu_A(x_i) + (1 - \mu_A(x_i)) \log (1 - \mu_A(x_i)) \right] + \frac{1}{a} \sum_{i=1}^{n} (1 + a \mu_A(x_i)) \log (1 + a \mu_A(x_i)) + \frac{1}{a} \sum_{i=1}^{n} (1 + a - a \mu_A(x_i)) \log (1 + a - a \mu_A(x_i)) - \frac{1}{a} (1 + a) \log (1 + a)$   (E23)

  4. Corresponding to Kapur’s [4] measure of entropy of degree α and type β

    $H_{\alpha,\beta}(P) = \frac{1}{\beta - \alpha} \log \frac{\sum_{i=1}^{n} p_i^{\alpha}}{\sum_{i=1}^{n} p_i^{\beta}}, \quad \alpha \neq \beta$   (E24)

    we get the measure

    $H_{\alpha,\beta}(A) = \frac{1}{\beta - \alpha} \log \frac{\sum_{i=1}^{n} \left[ \mu_A^{\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha} \right]}{\sum_{i=1}^{n} \left[ \mu_A^{\beta}(x_i) + (1 - \mu_A(x_i))^{\beta} \right]}, \quad \alpha \geq 1, \beta \leq 1 \ \text{or} \ \alpha \leq 1, \beta \geq 1.$   (E25)
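A brief Python sketch (an illustration added here, following the reconstructed forms of (E19) and (E25)) shows how these fuzzy analogues are evaluated; the parameter choice α = 2, β = 0.5 satisfies the stated conditions:

    import math

    def sharma_taneja_fuzzy(mu, alpha, beta):
        """Fuzzy entropy of degree (alpha, beta) corresponding to (E19)."""
        s = sum((m ** alpha + (1 - m) ** alpha)
                - (m ** beta + (1 - m) ** beta) for m in mu)
        return s / (beta - alpha)

    def kapur_fuzzy(mu, alpha, beta):
        """Kapur's fuzzy entropy of order alpha and type beta, as in (E25)."""
        num = sum(m ** alpha + (1 - m) ** alpha for m in mu)
        den = sum(m ** beta + (1 - m) ** beta for m in mu)
        return math.log(num / den) / (beta - alpha)

    mu = [0.2, 0.5, 0.7]
    print(sharma_taneja_fuzzy(mu, 2.0, 0.5))
    print(kapur_fuzzy(mu, 2.0, 0.5))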

Kosko [13] introduced fuzzy entropy and conditioning. Pal and Pal [14] gave object-background segmentation using a new definition of entropy. Parkash [15] proposed new measures of weighted fuzzy entropy and their applications for the study of the maximum weighted fuzzy entropy principle. Parkash and Gandhi [16] suggested new generalised measures of fuzzy entropy and their properties. Parkash and Singh [17] gave a characterization of useful information-theoretic measures. Taneja [18] introduced generalised information measures and their applications. Taneja and Tuteja [19] gave a characterization of the quantitative-qualitative measure of relative information. Tuteja [20] introduced a characterization of nonadditive measures of relative information and inaccuracy. Tuteja and Hooda [21] proposed a generalised useful information measure of type α and degree β. Tuteja and Jain [22, 23] gave a characterization of relative useful information having utilities as monotone functions and an axiomatic characterization of relative useful information. Tahayori [24] presented a universal methodology for generating an interval type-2 fuzzy set membership function from a collection of type-1 fuzzy sets. Kumar and Bajaj [25] introduced NTV-metric-based entropies of interval-valued intuitionistic fuzzy sets and their applications in decision making.

Mishra [26] introduced two exponential fuzzy information measures and characterised them axiomatically. To show the effectiveness of the proposed measures, they are compared with the existing measures. Further, two fuzzy discrimination and symmetric discrimination measures are defined and their validity is checked. Important properties of the new measures are studied, and their applications in pattern recognition and in the diagnosis problem of crop disease are discussed. Hooda and Mishra [27] gave sine and cosine trigonometric measures of fuzzy information and obtained new measures of trigonometric fuzzy information.


3. A new parametric measure of fuzzy information involving two parameters α and β

A new generalised fuzzy information measure of order α and type β is suggested and its necessary and required properties are examined. Thereafter, its validity is verified, and the monotonic behaviour of the fuzzy information measure of order α and type β is discussed.

The generalised measure of fuzzy information of order α and type β is given by,

$H_{\alpha}^{\beta}(A) = \frac{1}{(1 - \alpha)\beta} \sum_{i=1}^{n} \left\{ \left[ \mu_A(x_i)^{\alpha \mu_A(x_i)} + (1 - \mu_A(x_i))^{\alpha (1 - \mu_A(x_i))} \right]^{\beta} - 2^{\beta} \right\}, \quad \alpha > 0, \ \alpha \neq 1, \ \beta > 0.$   (E26)
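The following Python sketch (an illustration added here, assuming the reconstruction of (E26) above) implements the proposed measure and reproduces the symmetry and maximum-at-1/2 behaviour established in the properties below; note that Python's ** operator already follows the convention $0^0 = 1$:

    def H(mu, alpha, beta):
        """Proposed two-parameter fuzzy information measure (E26)."""
        s = sum((m ** (alpha * m) + (1 - m) ** (alpha * (1 - m))) ** beta
                - 2 ** beta for m in mu)
        return s / ((1 - alpha) * beta)

    print(H([0.5], 2, 1))   # 1.0: the per-element maximum for alpha = 2, beta = 1
    print(H([0.1], 2, 1))   # ~0.5418, cf. Table 1(a) up to rounding
    print(H([0.9], 2, 1))   # the same value: H is unchanged under mu -> 1 - mu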

3.1. Properties of $H_{\alpha}^{\beta}(A)$

Adopting the convention $0^0 = 1$ and taking $\alpha \neq 1$, we study the following properties:

Property 1: $H_{\alpha}^{\beta}(A) \geq 0$, i.e. $H_{\alpha}^{\beta}(A)$ is nonnegative.

Property 2: $H_{\alpha}^{\beta}(A)$ is minimum if A is a non-fuzzy set.

For $\mu_A(x_i) = 0$ we have $H_{\alpha}^{\beta}(A) = 0$, and for $\mu_A(x_i) = 1$ we likewise have $H_{\alpha}^{\beta}(A) = 0$.

Property 3: $H_{\alpha}^{\beta}(A)$ is maximum if A is the most fuzzy set.

We have

$\frac{\partial H_{\alpha}^{\beta}(A)}{\partial \mu_A(x_i)} = \frac{\alpha}{1 - \alpha} \left[ \mu_A(x_i)^{\alpha \mu_A(x_i)} + (1 - \mu_A(x_i))^{\alpha (1 - \mu_A(x_i))} \right]^{\beta - 1} \left[ \mu_A(x_i)^{\alpha \mu_A(x_i)} \left( 1 + \log \mu_A(x_i) \right) - (1 - \mu_A(x_i))^{\alpha (1 - \mu_A(x_i))} \left( 1 + \log (1 - \mu_A(x_i)) \right) \right].$

Setting $\frac{\partial H_{\alpha}^{\beta}(A)}{\partial \mu_A(x_i)} = 0$, which is possible only if $\mu_A(x_i) = 1 - \mu_A(x_i)$, that is, if $\mu_A(x_i) = \frac{1}{2}$.

Now we evaluate $\frac{\partial^2 H_{\alpha}^{\beta}(A)}{\partial \mu_A(x_i)^2}$ at $\mu_A(x_i) = \frac{1}{2}$:

$\frac{\partial^2 H_{\alpha}^{\beta}(A)}{\partial \mu_A(x_i)^2} \bigg|_{\mu_A(x_i) = \frac{1}{2}} = \frac{\alpha}{1 - \alpha} \left( 2^{1 - \frac{\alpha}{2}} \right)^{\beta - 1} \left[ 2^{2 - \frac{\alpha}{2}} + \alpha \, 2^{1 - \frac{\alpha}{2}} \left( 1 - \log 2 \right)^2 \right] < 0 \quad \text{for } \alpha > 1.$

Hence, the maximum value exists at $\mu_A(x_i) = \frac{1}{2}$.

Property 4: $H_{\alpha}^{\beta}(A^*) \leq H_{\alpha}^{\beta}(A)$, where $A^*$ is a sharpened version of A.

When $\mu_A(x_i) = \frac{1}{2}$,

$H_{\alpha}^{\beta}(A) = \frac{n \, 2^{\beta}}{(1 - \alpha)\beta} \cdot \frac{1 - 2^{\frac{\alpha\beta}{2}}}{2^{\frac{\alpha\beta}{2}}}.$

When $\mu_A(x_i)$ lies between 0 and 1/2, $H_{\alpha}^{\beta}(A)$ is an increasing function of $\mu_A(x_i)$, whereas when $\mu_A(x_i)$ lies between 1/2 and 1, $H_{\alpha}^{\beta}(A)$ is a decreasing function of $\mu_A(x_i)$.

Let $A^*$ be a sharpened version of A, which means that

  1. If $\mu_A(x_i) < 0.5$ then $\mu_{A^*}(x_i) \leq \mu_A(x_i)$ for all i = 1, 2, …, n

  2. If $\mu_A(x_i) > 0.5$ then $\mu_{A^*}(x_i) \geq \mu_A(x_i)$ for all i = 1, 2, …, n

Since $H_{\alpha}^{\beta}(A)$ is an increasing function of $\mu_A(x_i)$ for $0 \leq \mu_A(x_i) \leq \frac{1}{2}$ and a decreasing function of $\mu_A(x_i)$ for $\frac{1}{2} \leq \mu_A(x_i) \leq 1$, therefore

  1. $\mu_{A^*}(x_i) \leq \mu_A(x_i)$ implies $H_{\alpha}^{\beta}(A^*) \leq H_{\alpha}^{\beta}(A)$ in [0, 0.5]

  2. $\mu_{A^*}(x_i) \geq \mu_A(x_i)$ implies $H_{\alpha}^{\beta}(A^*) \leq H_{\alpha}^{\beta}(A)$ in [0.5, 1]

Hence, $H_{\alpha}^{\beta}(A^*) \leq H_{\alpha}^{\beta}(A)$.

Property 5: $H_{\alpha}^{\beta}(A) = H_{\alpha}^{\beta}(\bar{A})$, where $\bar{A}$ is the complement of A, i.e. $\mu_{\bar{A}}(x_i) = 1 - \mu_A(x_i)$.

Thus, when $\mu_A(x_i)$ is changed to $1 - \mu_A(x_i)$, $H_{\alpha}^{\beta}(A)$ does not change.

Under the above conditions, the generalised measure proposed in (E26) is a valid measure of fuzzy information.

3.2. Monotonic behaviour of fuzzy information measure

In this section we study the monotonic behaviour of the fuzzy information measure. For this, values of $H_{\alpha}^{\beta}(A)$ have been calculated for various values of α and β, and the generalised measure has been presented graphically.

Case I: For α > 1, β = 1, we have compiled the values of $H_{\alpha}^{\beta}(A)$ in Table 1(a) and presented the fuzzy entropy in Figure 1(a), which clearly illustrates that the fuzzy information measure is a concave function.

μ_A(x_i)   (a) α = 2, β = 1   (b) α = 1.5, β = 0.1   (c) α = 1.5, β = 2.5
0.0        0.0                0.0                    0.0
0.1        0.5419             0.5074                 2.0337
0.2        0.7749             0.7763                 2.7289
0.3        0.9075             0.9534                 3.0732
0.4        0.9778             1.0517                 3.2411
0.5        1.0                1.0844                 3.2916
0.6        0.9778             1.0517                 3.2411
0.7        0.9075             0.9534                 3.0732
0.8        0.7749             0.7763                 2.7289
0.9        0.5419             0.5074                 2.0337
1.0        0.0                0.0                    0.0

Table 1.

Values of the fuzzy information measure for (a) α = 2 and β = 1; (b) α = 1.5 and β = 0.1; (c) α = 1.5 and β = 2.5.

Figure 1.

Representation of the monotonic behaviour of the fuzzy information measure for (a) α = 2 and β = 1; (b) α = 1.5 and β = 0.1; (c) α = 1.5 and β = 2.5.

For α = 2 and β = 1, the values of $H_{\alpha}^{\beta}(A)$ have been represented graphically, which shows that the proposed measure is a concave function. Similarly, for other values of α and β we get different concave curves.

Case II: For α > 1, 0 < β < 1, we have compiled the values of $H_{\alpha}^{\beta}(A)$ in Table 1(b) and presented the fuzzy entropy in Figure 1(b), which clearly illustrates that the fuzzy entropy is a concave function.

For α = 1.5 and β = 0.1, the values of $H_{\alpha}^{\beta}(A)$ have been represented graphically, which shows that the proposed measure is a concave function. Similarly, for other values of α and β we get different concave curves.

Case III: For α > 1, β > 1, we have compiled the values of $H_{\alpha}^{\beta}(A)$ in Table 1(c) and presented the fuzzy entropy in Figure 1(c), which clearly illustrates that the fuzzy entropy is a concave function.

For α = 1.5 and β = 2.5, the values of $H_{\alpha}^{\beta}(A)$ have been represented graphically, which shows that the proposed measure is a concave function. Similarly, for other values of α and β we get different concave curves.
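The entries of Table 1 can be regenerated with a short Python loop (a sketch added here, assuming the reconstruction of (E26) above); the symmetric rise to a maximum at $\mu_A(x_i) = 0.5$ is visible directly in the printed columns:

    def H(m, alpha, beta):
        """Per-element value of the proposed measure (E26)."""
        s = (m ** (alpha * m) + (1 - m) ** (alpha * (1 - m))) ** beta - 2 ** beta
        return s / ((1 - alpha) * beta)

    cases = [(2.0, 1.0), (1.5, 0.1), (1.5, 2.5)]   # columns (a), (b), (c)
    for m in [i / 10 for i in range(11)]:
        print(f"{m:3.1f}", *(f"{H(m, a, b):8.4f}" for a, b in cases))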


4. A new parametric measure of fuzzy information involving three parameters α, β and γ

Further, a new generalised fuzzy information measure involving three parameters α, β and γ is suggested and its necessary and required properties are examined. Thereafter, its validity is verified, and the monotonic behaviour of the fuzzy information measure of order α, β and γ is discussed.

The generalised measure of fuzzy information involving three parameters α, β and γ is given by,

$H_{\alpha,\beta,\gamma}(A) = \frac{1}{1 - \alpha} \sum_{i=1}^{n} \left\{ \left[ \mu_A(x_i)^{(\alpha+\beta)\mu_A(x_i)} + (1 - \mu_A(x_i))^{(\alpha+\beta)(1 - \mu_A(x_i))} \right]^{\gamma} - 2^{\gamma} \right\}, \quad \alpha > 0, \ \alpha \neq 1, \ \beta > 0, \ \gamma > 0.$   (E27)
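As with (E26), a brief Python sketch (added for illustration, assuming the reconstruction of (E27) above) evaluates the three-parameter measure and can be checked against Table 2:

    def H3(mu, alpha, beta, gamma):
        """Proposed three-parameter fuzzy information measure (E27)."""
        s = sum((m ** ((alpha + beta) * m)
                 + (1 - m) ** ((alpha + beta) * (1 - m))) ** gamma
                - 2 ** gamma for m in mu)
        return s / (1 - alpha)

    print(H3([0.5], 1.5, 2, 3))   # ~15.5795 per element, cf. Table 2(a)
    print(H3([0.3], 1.5, 2, 3))   # ~15.3147, again matching Table 2(a)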

4.1. Properties of $H_{\alpha,\beta,\gamma}(A)$

Adopting the convention $0^0 = 1$ and taking $\alpha \neq 1$, we study the following properties:

Property 1: $H_{\alpha,\beta,\gamma}(A) \geq 0$, i.e. $H_{\alpha,\beta,\gamma}(A)$ is nonnegative.

Property 2: $H_{\alpha,\beta,\gamma}(A)$ is minimum if A is a non-fuzzy set.

For $\mu_A(x_i) = 0$ we have $H_{\alpha,\beta,\gamma}(A) = 0$, and for $\mu_A(x_i) = 1$ we likewise have $H_{\alpha,\beta,\gamma}(A) = 0$.

Property 3: $H_{\alpha,\beta,\gamma}(A^*) \leq H_{\alpha,\beta,\gamma}(A)$, where $A^*$ is a sharpened version of A.

When $\mu_A(x_i)$ lies between 0 and 1/2, $H_{\alpha,\beta,\gamma}(A)$ is an increasing function of $\mu_A(x_i)$, whereas when $\mu_A(x_i)$ lies between 1/2 and 1, $H_{\alpha,\beta,\gamma}(A)$ is a decreasing function of $\mu_A(x_i)$.

Let $A^*$ be a sharpened version of A, which means that

  1. If $\mu_A(x_i) < 0.5$ then $\mu_{A^*}(x_i) \leq \mu_A(x_i)$ for all i = 1, 2, …, n

  2. If $\mu_A(x_i) > 0.5$ then $\mu_{A^*}(x_i) \geq \mu_A(x_i)$ for all i = 1, 2, …, n

Since $H_{\alpha,\beta,\gamma}(A)$ is an increasing function of $\mu_A(x_i)$ for $0 \leq \mu_A(x_i) \leq \frac{1}{2}$ and a decreasing function of $\mu_A(x_i)$ for $\frac{1}{2} \leq \mu_A(x_i) \leq 1$, therefore

  1. $\mu_{A^*}(x_i) \leq \mu_A(x_i)$ implies $H_{\alpha,\beta,\gamma}(A^*) \leq H_{\alpha,\beta,\gamma}(A)$ in [0, 0.5]

  2. $\mu_{A^*}(x_i) \geq \mu_A(x_i)$ implies $H_{\alpha,\beta,\gamma}(A^*) \leq H_{\alpha,\beta,\gamma}(A)$ in [0.5, 1]

Hence, $H_{\alpha,\beta,\gamma}(A^*) \leq H_{\alpha,\beta,\gamma}(A)$.

Property 4: $H_{\alpha,\beta,\gamma}(A) = H_{\alpha,\beta,\gamma}(\bar{A})$, where $\bar{A}$ is the complement of A, i.e. $\mu_{\bar{A}}(x_i) = 1 - \mu_A(x_i)$.

Thus, when $\mu_A(x_i)$ is changed to $1 - \mu_A(x_i)$, $H_{\alpha,\beta,\gamma}(A)$ does not change.

Under the above conditions, the generalised measure proposed in (E27) is a valid measure of fuzzy information.

4.2. Monotonic behaviour of fuzzy information measure

In this section we study the monotonic behaviour of the fuzzy information measure. For this, values of $H_{\alpha,\beta,\gamma}(A)$ have been calculated for various values of α, β and γ, and the generalised measure has been presented graphically.

Case I: For α > 1, β = 2, γ = 3, we have compiled the values of $H_{\alpha,\beta,\gamma}(A)$ in Table 2(a)-(e) and presented the fuzzy entropy in Figure 2(a)-(e), which clearly illustrates that the fuzzy information measure is a concave function. For α = 1.5, β = 2 and γ = 3, the values of $H_{\alpha,\beta,\gamma}(A)$ have been represented graphically, which shows that the proposed measure is a concave function. Similarly, for α = 2, α = 2.5, α = 3 and α = 3.5 (each with β = 2 and γ = 3), the values of $H_{\alpha,\beta,\gamma}(A)$ have been represented graphically, and we get different concave curves.

μ_A(x_i)   (a) α = 1.5   (b) α = 2   (c) α = 2.5   (d) α = 3   (e) α = 3.5
0.0        0.0           0.0         0.0           0.0         0.0
0.1        12.8444       6.7318      4.6517        3.5865      2.9316
0.2        14.7302       7.5514      5.1212        3.8867      3.1353
0.3        15.3147       7.7794      5.2385        3.9540      3.1762
0.4        15.5250       7.8559      5.2750        3.9734      3.1870
0.5        15.5795       7.8750      5.2837        3.9779      3.1894
0.6        15.5250       7.8559      5.2750        3.9734      3.1870
0.7        15.3147       7.7794      5.2385        3.9540      3.1762
0.8        14.7302       7.5514      5.1212        3.8867      3.1353
0.9        12.8444       6.7318      4.6517        3.5865      2.9316
1.0        0.0           0.0         0.0           0.0         0.0

(all columns with β = 2 and γ = 3)

Table 2.

Values of the fuzzy information measure for β = 2, γ = 3 and (a) α = 1.5; (b) α = 2; (c) α = 2.5; (d) α = 3; (e) α = 3.5.

Figure 2.

Representation of the monotonic behaviour of the fuzzy information measure for β = 2, γ = 3 and (a) α = 1.5; (b) α = 2; (c) α = 2.5; (d) α = 3; (e) α = 3.5.

Further, it has been shown that $H_{\alpha,\beta,\gamma}(A)$ is a concave function attaining its maximum value at $\mu_A(x_i) = \frac{1}{2}$. Hence $H_{\alpha,\beta,\gamma}(A)$ is an increasing function of $\mu_A(x_i)$ on the interval [0, 0.5) and a decreasing function of $\mu_A(x_i)$ on the interval (0.5, 1].
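The entries of Table 2 can likewise be regenerated with a few lines of Python (a sketch added here, assuming the reconstruction of (E27) above); the printed columns make both the concavity in $\mu_A(x_i)$ and the decrease with α visible:

    def H3(m, alpha, beta=2.0, gamma=3.0):
        """Per-element value of (E27), with beta = 2 and gamma = 3 as in Table 2."""
        s = (m ** ((alpha + beta) * m)
             + (1 - m) ** ((alpha + beta) * (1 - m))) ** gamma - 2 ** gamma
        return s / (1 - alpha)

    alphas = [1.5, 2.0, 2.5, 3.0, 3.5]             # panels (a)-(e) of Table 2
    for m in [i / 10 for i in range(11)]:
        print(f"{m:3.1f}", *(f"{H3(m, a):9.4f}" for a in alphas))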


5. Conclusions

In this chapter, after reviewing some of the literature on measures of information for fuzzy sets, a new generalised fuzzy information measure involving two parameters α and β has been introduced.

The necessary properties of the proposed measure have been verified, and the study of its monotonic behaviour has further shown that the proposed measure is a concave function.

Further, a new generalised fuzzy information measure involving three parameters α, β and γ has been suggested and its necessary and required properties examined. Thereafter, its validity has been verified, and the monotonic behaviour of the fuzzy information measure of order α, β and γ has been discussed.

Fuzzy sets are indispensable in fuzzy system modelling and fuzzy system design, and the measurement of fuzziness in fuzzy sets is the fuzzy entropy, or fuzzy information measure. Therefore, fuzzy information measures occupy an important place in system design. There are likewise numerous applications of fuzzy information in the design of neural network classifiers.


Conflict of interest

I declare that I have no conflict of interest.

References

  1. Shannon C. A mathematical theory of communication. Bell System Technical Journal. 1948;27:379-423, 623-656
  2. Zadeh L. Fuzzy sets. Information and Control. 1965;8:338-353
  3. Renyi A. On measures of entropy and information. In: Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability; University of California Press; 1961. pp. 547-561
  4. Havrda J, Charvát F. Quantification method of classification processes: Concept of structural α-entropy. Kybernetika. 1967;3:30-35
  5. Behara M, Chawla J. Generalized γ-entropy. Selecta Statistica Canadiana. 1974;2:15-38
  6. Kapur J. A comparative assessment of various measures of directed divergence. Advances in Management Studies. 1984;3:1-16
  7. Kapur J. Measures of Fuzzy Information. New Delhi: Mathematical Sciences Trust Society; 1997
  8. Kaufmann A. Fuzzy Subsets: Fundamental Theoretical Elements. New York: Academic Press; 1980
  9. De Luca A, Termini S. A definition of a non-probabilistic entropy in the setting of fuzzy sets. Information and Control. 1972;20:301-312
  10. Bhandari D, Pal N. Some new information measures for fuzzy sets. Information Sciences. 1993;67:209-228
  11. Sharma B, Taneja I. Entropies of type (α, β) and other generalized measures of information theory. Metrika. 1975;22:202-215
  12. Kapur J. Measures of Information and their Applications. New York: Wiley Eastern; 1995
  13. Kosko B. Fuzzy entropy and conditioning. Information Sciences. 1986;40:165-174
  14. Pal N, Pal S. Object background segmentation using new definition of entropy. IEEE Proceedings. 1989;136(A):284-295
  15. Parkash O, Gandhi C. New measures of information based upon measures of central tendency and dispersion. International Review of Pure and Applied Mathematics. 2008;4:161-172
  16. Parkash O, Gandhi C. New generalized measures of fuzzy entropy and their properties. Journal of Informatics and Mathematical Sciences. 2011;3:1-9
  17. Parkash O, Singh R. On characterization of useful information theoretic measures. Kybernetika. 1987;23:245-251
  18. Taneja I. On generalized information measures and their applications. Advances in Electronics and Electron Physics. 1989;76:327-413
  19. Taneja H, Tuteja R. Characterization of quantitative-qualitative measure of relative information. Information Sciences. 1984;33:217-222
  20. Tuteja R. On characterization of nonadditive measures of relative information and inaccuracy. Bulletin of the Calcutta Mathematical Society. 1983;77:363-369
  21. Tuteja R, Hooda D. On a generalized useful information measure of type α and degree β. Journal of the Indian Society of Statistics and Operations Research. 1985;6:1-11
  22. Tuteja R, Jain P. Characterization of relative useful information having utilities as monotone functions. Aplikace Matematiky. 1986;31:10-18
  23. Tuteja R, Jain P. An axiomatic characterization of relative useful information. Journal of Information and Optimization Sciences. 1986;7:49-57
  24. Tahayori H, Livi L, Sadeghian A, Rizzi A. Interval type-2 fuzzy set reconstruction based on fuzzy information-theoretic kernels. IEEE Transactions on Fuzzy Systems. 2015;23(4):1014-1029
  25. Kumar T, Bajaj R. NTV metric based entropies of interval-valued intuitionistic fuzzy sets and their applications in decision making. Annals of Fuzzy Mathematics and Informatics. 2015;9(1):1-21
  26. Mishra A, Hooda D, Jain D. On exponential fuzzy measures of information and discrimination. International Journal of Computer Applications. 2015;119(23):1-7
  27. Hooda D, Mishra A. On trigonometric fuzzy information measures. ARPN Journal of Science and Technology. 2015;5(3):145-152
