Abstract

We give an optimality characterization of nonadditive generalized mean-value entropies from suitable nonadditive and generalized mean-value properties of the measure of average code length. The results cover, as particular cases, many results obtained by other authors, as well as the ordinary length due to Shannon (1948). The main instrument is the function of the individual word lengths used in forming the average length of the code.

1. Introduction

Given a discrete random variable $X$ taking a finite number of values $x_1, x_2, \ldots, x_N$ with probabilities $p_1, p_2, \ldots, p_N$, $p_i \ge 0$, $\sum_{i=1}^{N} p_i = 1$, Shannon's entropy of the probability distribution $P = (p_1, p_2, \ldots, p_N)$ is given by
$$H(P) = -\sum_{i=1}^{N} p_i \log p_i, \tag{1.1}$$
where the base of the logarithm is in general arbitrary.

Shannon entropy is a very useful and powerful measure with a rich operational meaning. Entropy has an important connection with noiseless coding. If $X$ represents an information source with messages $x_1, \ldots, x_N$ and input probabilities $p_1, \ldots, p_N$, as given above, that is encoded into words of lengths $n_1, n_2, \ldots, n_N$ forming an instantaneous code, then
$$\sum_{i=1}^{N} D^{-n_i} \le 1, \tag{1.2}$$
where $D$ is the size of the code alphabet.

The average length
$$L = \sum_{i=1}^{N} p_i n_i \tag{1.3}$$
for the instantaneous code is such that
$$L \ge \frac{H(P)}{\log D}, \tag{1.4}$$
with equality if and only if $p_i = D^{-n_i}$ for each $i = 1, 2, \ldots, N$.

Result (1.4) (refer Shannon [5]) characterizes Shannon's entropy as a measure of optimality of a linear function, namely, $L = \sum_{i=1}^{N} p_i n_i$, under the relation (1.2).
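The quantities in (1.2)–(1.4) are easy to check numerically. The sketch below (the function names are ours, not standard) computes Shannon's entropy, the Kraft sum, and the mean length for a dyadic distribution, for which the optimal lengths $n_i = -\log_2 p_i$ are integers and the bound (1.4) is attained exactly.

```python
import math

def shannon_entropy(p, base=2):
    # Shannon's entropy; terms with p_i = 0 contribute nothing.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def kraft_sum(lengths, D=2):
    # Left-hand side of the Kraft inequality (1.2): sum of D^(-n_i).
    return sum(D ** (-ni) for ni in lengths)

def mean_length(p, lengths):
    # Ordinary mean length (1.3): sum of p_i n_i.
    return sum(pi * ni for pi, ni in zip(p, lengths))

# Dyadic distribution: the optimal lengths n_i = -log2(p_i) are integers.
p = [0.5, 0.25, 0.125, 0.125]
n = [round(-math.log2(pi)) for pi in p]
```

For this distribution the Kraft sum equals $1$ and the mean length coincides with the entropy, so (1.4) holds with equality.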

Several generalizations of Shannon's entropy have been studied by many authors in different ways. Here we will need the nonadditive generalized mean-value measures
$$H_\beta(P) = \left(2^{1-\beta} - 1\right)^{-1}\left[\sum_{i=1}^{N} p_i^{\beta} - 1\right], \qquad \beta \ne 1, \tag{1.5}$$
$$H_\beta^\alpha(P) = \left(2^{1-\beta} - 1\right)^{-1}\left[\left(\sum_{i=1}^{N} p_i^{\alpha}\right)^{(1-\beta)/(1-\alpha)} - 1\right], \qquad \alpha \ne 1, \; \beta \ne 1. \tag{1.6}$$
These quantities satisfy the "nonadditivity"
$$H(P * Q) = H(P) + H(Q) + \left(2^{1-\beta} - 1\right) H(P)\, H(Q). \tag{1.7}$$
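A numerical check of the nonadditivity satisfied by these quantities, assuming the standard Havrda–Charvát (Tsallis-type) form for the type-$\beta$ entropy (1.5); `hc_entropy` and `product_dist` are our own names:

```python
def hc_entropy(p, beta):
    # Type-beta entropy, assuming the Havrda-Charvat form for (1.5):
    # H_beta(P) = (2^(1-beta) - 1)^(-1) * (sum p_i^beta - 1), beta != 1.
    c = 2 ** (1 - beta) - 1
    return (sum(pi ** beta for pi in p) - 1) / c

def product_dist(p, q):
    # Distribution P * Q of two independent sources.
    return [pi * qj for pi in p for qj in q]

p, q, beta = [0.5, 0.3, 0.2], [0.6, 0.4], 2.0
c = 2 ** (1 - beta) - 1
lhs = hc_entropy(product_dist(p, q), beta)
rhs = (hc_entropy(p, beta) + hc_entropy(q, beta)
       + c * hc_entropy(p, beta) * hc_entropy(q, beta))
```

With this normalization $H_\beta(1/2, 1/2) = 1$ for every $\beta$, and the two sides of the nonadditivity relation agree up to rounding.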

In this paper, we give an optimality characterization of the entropies (1.5) and (1.6) from suitable nonadditive and generalized mean-value properties of the measure of average length. The results cover, as particular cases, many results obtained by other authors, as well as the result (1.3). The main instrument is the function of the individual word lengths used in forming the average length of the code.

2. Nonadditive Measure of Code Length

Let us consider two independent sources $X = (x_1, \ldots, x_N)$ and $Y = (y_1, \ldots, y_M)$ with associated probability distributions $P = (p_1, \ldots, p_N)$ and $Q = (q_1, \ldots, q_M)$. Then the probability distribution of the product source $X * Y$ is $P * Q = (p_i q_j)$, $i = 1, \ldots, N$, $j = 1, \ldots, M$. Let the source $X$ be encoded with a code of length sequence $(n_1, \ldots, n_N)$ and the source $Y$ with $(m_1, \ldots, m_M)$, and let the pair $(x_i, y_j)$ be represented by the sequences for $x_i$ and $y_j$ put side by side, so that the product source has code length sequence
$$n_i + m_j, \qquad i = 1, \ldots, N, \; j = 1, \ldots, M. \tag{2.1}$$
The additive measure of mean length $L$ is required to satisfy the requirement, refer Campbell [2],
$$L(P * Q) = L(P) + L(Q), \tag{2.2}$$
where
$$L = \psi^{-1}\left(\sum_{i=1}^{N} p_i\,\psi(n_i)\right), \tag{2.3}$$
$\psi$ being a continuous strictly monotonic increasing function.

Campbell [3] proved a noiseless coding theorem for Renyi's entropy of order $\alpha$ in terms of the mean length (2.3) with $\psi(x) = D^{tx}$, defined for $t = (1-\alpha)/\alpha$.

The mean length (1.3) concerning Shannon's entropy and the mean length of order $t$ concerning Renyi's entropy of order $\alpha$ are both additive, as they satisfy additivity of the type (2.2).
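The additivity (2.2) of the mean length (2.3) can be checked directly on a product source. A minimal sketch, assuming the exponential generating function $\psi(x) = D^{tx}$, which yields Campbell's mean length of order $t$:

```python
import math

def campbell_length(p, lengths, t, D=2):
    # Quasi-arithmetic mean (2.3) with psi(x) = D^(t x):
    # (1/t) * log_D( sum p_i D^(t n_i) ).
    s = sum(pi * D ** (t * ni) for pi, ni in zip(p, lengths))
    return math.log(s, D) / t

# Two independent sources and their product source.
p, n = [0.7, 0.3], [1, 2]
q, m = [0.6, 0.4], [2, 3]
pq = [pi * qj for pi in p for qj in q]
nm = [ni + mj for ni in n for mj in m]
```

As $t \to 0$ this length tends to the ordinary mean length $\sum_i p_i n_i$.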

Here we deal with nonadditive measures of length, denoted by $\mathcal{L}$, which satisfy the "nonadditivity relation"
$$\mathcal{L}(P * Q) = \mathcal{L}(P) + \mathcal{L}(Q) + \left(D^{1-\beta} - 1\right)\mathcal{L}(P)\,\mathcal{L}(Q) \tag{2.4}$$
and the mean value property
$$\mathcal{L} = \psi^{-1}\left(\sum_{i=1}^{N} p_i\,\psi\big(l(n_i)\big)\right), \tag{2.5}$$
where $l(n_i)$ is the function of length of a single element with code word length $n_i$, which is nonadditive.

3. Characterization of Nonadditive Measures of Code Length

We take the mean-value nonadditive measures of length (2.5) to satisfy the relation (2.4), where the expressions and notation used there have the meanings explained earlier.

First of all we will determine the nonadditive length function $l(n_i)$ of the code word length $n_i$ in (2.5) satisfying the nonadditivity relation
$$l(n_i + m_j) = l(n_i) + l(m_j) + c\,l(n_i)\,l(m_j). \tag{3.1}$$
This, by taking $g(n) = 1 + c\,l(n)$, gives
$$g(n_i + m_j) = g(n_i)\,g(m_j). \tag{3.2}$$
The most general nonzero continuous solution of (3.2) is
$$g(n) = D^{kn}, \tag{3.3}$$
where $k$, $c$ are arbitrary constants (we have taken base $D$ with a purpose here), so that
$$l(n_i) = \frac{D^{k n_i} - 1}{c}. \tag{3.4}$$
At this stage we make a proper choice of the constants. By analogy with the nonadditivity (2.4), the value of $c$ is dictated as
$$c = D^{1-\beta} - 1, \qquad \beta \ne 1. \tag{3.5}$$

Another purpose served by this choice is that when the exponent constant $k = 1-\beta$ tends to zero, that is, when $\beta \to 1$, the function of length reduces to the additive one, which is $l(n_i) = n_i$. This value of $k$ can be obtained by imposing the boundary condition $l(1) = 1$ also.
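Assuming the form of the length function $l$ given in (3.6), its defining properties can be verified numerically: the single-letter nonadditivity with constant $D^{1-\beta} - 1$, the boundary condition $l(1) = 1$, and the limit $l(n) \to n$ as $\beta \to 1$. A sketch under these assumptions:

```python
def l_beta(n, beta, D=2):
    # Nonadditive length function, form (3.6) assumed:
    # l(n) = (D^((1-beta) n) - 1) / (D^(1-beta) - 1), beta != 1.
    c = D ** (1 - beta) - 1
    return (D ** ((1 - beta) * n) - 1) / c

beta, D = 0.5, 2
c = D ** (1 - beta) - 1
n1, n2 = 3, 4
# Single-letter nonadditivity: l(n1 + n2) = l(n1) + l(n2) + c l(n1) l(n2).
lhs = l_beta(n1 + n2, beta, D)
rhs = (l_beta(n1, beta, D) + l_beta(n2, beta, D)
       + c * l_beta(n1, beta, D) * l_beta(n2, beta, D))
```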

So that finally we have
$$l(n_i) = \frac{D^{(1-\beta) n_i} - 1}{D^{1-\beta} - 1}, \qquad \beta \ne 1. \tag{3.6}$$
Next we proceed to determine $\mathcal{L}$ by first evaluating the form of $\psi$. To achieve this we put the value of $l(n_i)$ from (3.6) in (2.5) and then use the relation (2.4) with $Q = (1)$, a one-message source encoded by a single word of length $m$, for which $\mathcal{L}(Q) = l(m)$, to get
$$\psi^{-1}\left(\sum_{i=1}^{N} p_i\,\psi\big(l(n_i + m)\big)\right) = l(m) + D^{(1-\beta) m}\,\psi^{-1}\left(\sum_{i=1}^{N} p_i\,\psi\big(l(n_i)\big)\right). \tag{3.7}$$
Now let us take
$$\varphi(u) = \psi\big(l(u)\big) \tag{3.8}$$
and
$$l(n_i + m) = D^{(1-\beta) m}\,l(n_i) + l(m), \tag{3.9}$$
the latter being immediate from (3.6), so that (3.7) gives, after some simplification, for every fixed $m$,
$$\varphi^{-1}\left(\sum_{i=1}^{N} p_i\,\varphi(n_i + m)\right) = m + \varphi^{-1}\left(\sum_{i=1}^{N} p_i\,\varphi(n_i)\right), \tag{3.10}$$
where $\varphi$ is continuous and strictly monotonic. Now, refer Hardy et al. [4], two continuous strictly monotonic functions generate the same quasi-arithmetic mean if and only if they are connected by a linear relation; applied to $\varphi(\cdot + m)$ and $\varphi(\cdot)$, (3.10) shows that there must be a linear relation between them, that is,
$$\varphi(u + m) = A(m)\,\varphi(u) + B(m), \tag{3.11}$$
where $A(m) \ne 0$ and $B(m)$ are independent of $u$.

Using (3.10) and (3.11), we have, for all real $m_1$, $m_2$,
$$\varphi(u + m_1 + m_2) = A(m_1 + m_2)\,\varphi(u) + B(m_1 + m_2), \tag{3.12}$$
while applying (3.11) twice gives
$$\varphi(u + m_1 + m_2) = A(m_2)\,A(m_1)\,\varphi(u) + A(m_2)\,B(m_1) + B(m_2), \tag{3.13}$$
so that
$$A(m_1 + m_2) = A(m_1)\,A(m_2), \qquad B(m_1 + m_2) = A(m_2)\,B(m_1) + B(m_2). \tag{3.14}$$
From the symmetry of (3.14) in $m_1$ and $m_2$, we get
$$B(m_1)\big[A(m_2) - 1\big] = B(m_2)\big[A(m_1) - 1\big]. \tag{3.15}$$
Thus, from the first part of (3.14), $A(m) = A(m/2)^2 > 0$ for all real values of $m$.

There are two cases, namely, $A(m) \equiv 1$ and $A(m) \not\equiv 1$.

If $A(m) \equiv 1$, then $B(m_1 + m_2) = B(m_1) + B(m_2)$, and (3.14) gives the most general continuous solution
$$B(m) = a\,m, \tag{3.16}$$
where $a \ne 0$ is an arbitrary constant.

This, by (3.11) and (3.16), gives $\varphi(u) = a u + b$, which, by (3.8) and (3.6), gives
$$\psi(x) = \frac{a}{1-\beta}\,\log_D\!\big(1 + (D^{1-\beta} - 1)\,x\big) + b, \tag{3.21}$$
$b$ being an arbitrary constant. Again, if $A(m) \not\equiv 1$, then (3.15) gives $B(m) = d\big[A(m) - 1\big]$ for some constant $d$, and from (3.14) we have the Cauchy equation $A(m_1 + m_2) = A(m_1)\,A(m_2)$, the general continuous solutions of which are
$$A(m) = D^{m(1-\alpha)/\alpha}, \qquad B(m) = d\,\big[D^{m(1-\alpha)/\alpha} - 1\big], \qquad \alpha \ne 0, 1,$$
where $(1-\alpha)/\alpha$ is an arbitrary nonzero constant.

This, by using these solutions in (3.11), gives $\varphi(u) = a\,D^{u(1-\alpha)/\alpha} + d$, which, by (3.8) and (3.6), gives
$$\psi(x) = a\,\big(1 + (D^{1-\beta} - 1)\,x\big)^{(1-\alpha)/(\alpha(1-\beta))} + d. \tag{3.25}$$
The values of $\psi$ given by (3.21) and (3.25) determine, through (2.5) and (3.6), the following two nonadditive measures of length, that is,
$$\mathcal{L}_\beta^1 = \frac{D^{(1-\beta)\sum_{i=1}^{N} p_i n_i} - 1}{D^{1-\beta} - 1}, \tag{3.26}$$
$$\mathcal{L}_\beta^\alpha = \frac{\left(\sum_{i=1}^{N} p_i\,D^{n_i(1-\alpha)/\alpha}\right)^{\alpha(1-\beta)/(1-\alpha)} - 1}{D^{1-\beta} - 1}, \qquad \alpha \ne 1. \tag{3.27}$$
These code lengths, denoted by $\mathcal{L}_\beta^1$ and $\mathcal{L}_\beta^\alpha$, may be named the nonadditive type $\beta$ lengths of order $1$ and $\alpha$, respectively.
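Assuming the closed forms displayed for (3.26) and (3.27), the nonadditivity relation (2.4) can be verified numerically on a product source; `L1` and `La` are our own names for the two lengths:

```python
def L1(p, n, beta, D=2):
    # Nonadditive length of order 1 and type beta, form (3.26) assumed.
    c = D ** (1 - beta) - 1
    avg = sum(pi * ni for pi, ni in zip(p, n))
    return (D ** ((1 - beta) * avg) - 1) / c

def La(p, n, alpha, beta, D=2):
    # Nonadditive length of order alpha and type beta, form (3.27) assumed.
    c = D ** (1 - beta) - 1
    s = sum(pi * D ** (ni * (1 - alpha) / alpha) for pi, ni in zip(p, n))
    return (s ** (alpha * (1 - beta) / (1 - alpha)) - 1) / c

# Product source built from two independent sources.
p, n = [0.7, 0.3], [1, 2]
q, m = [0.6, 0.4], [2, 3]
pq = [pi * qj for pi in p for qj in q]
nm = [ni + mj for ni in n for mj in m]
beta = 2.0
c = 2 ** (1 - beta) - 1
```

Both lengths satisfy (2.4) exactly, and the order-$\alpha$ length tends to the order-$1$ length as $\alpha \to 1$.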

These results are contained in the following theorem.

Theorem 3.1. The mean length $\mathcal{L}$ given by (2.5) of a sequence of lengths $(n_1, n_2, \ldots, n_N)$ formed of the code alphabet of size $D$ for a probability distribution $P = (p_1, \ldots, p_N)$, satisfying $l(1) = 1$ and the nonadditivity relation (2.4), can be only of one of the two forms given in (3.26) and (3.27).

3.1. Limiting and Particular Cases

It is immediate to see the following.
(1) For $\alpha \to 1$, the expression (3.27) reduces to the expression (3.26).
(2) For $\beta \to 1$, the expression (3.26) reduces to $L = \sum_{i=1}^{N} p_i n_i$, the ordinary mean length due to Shannon [5].
(3) For $\beta \to 1$, the expression (3.27) reduces to $\frac{\alpha}{1-\alpha}\log_D \sum_{i=1}^{N} p_i D^{n_i(1-\alpha)/\alpha}$, the length of order $t = (1-\alpha)/\alpha$ defined by Campbell [2].
(4) For $n_1 = n_2 = \cdots = n_N = n$, both the expressions for length given by (3.26) and (3.27) reduce to $l(n)$ given by (3.6), which in the limiting case, when $\beta$ approaches unity, reduces to $n$.
(5) For $n_i = -\log_D p_i$, the expression (3.26) becomes
$$\frac{D^{(1-\beta) H_D(P)} - 1}{D^{1-\beta} - 1}, \qquad H_D(P) = -\sum_{i=1}^{N} p_i \log_D p_i. \tag{3.32}$$
(6) For $n_i = -\log_D\big(p_i^{\alpha}/\sum_{j=1}^{N} p_j^{\alpha}\big)$, the expression (3.27) becomes
$$\frac{\left(\sum_{i=1}^{N} p_i^{\alpha}\right)^{(1-\beta)/(1-\alpha)} - 1}{D^{1-\beta} - 1}. \tag{3.33}$$
(7) For $\beta \to 1$, the expressions (3.32) and (3.33) reduce to $H_D(P)$ and $\frac{1}{1-\alpha}\log_D \sum_{i=1}^{N} p_i^{\alpha}$ (Renyi's entropy of order $\alpha$), respectively.
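The limiting cases (2) and (3) above can be checked numerically by taking $\beta$ close to unity (the forms of (3.26) and (3.27) are assumed as displayed):

```python
import math

def L1(p, n, beta, D=2):
    # Form (3.26) assumed.
    c = D ** (1 - beta) - 1
    avg = sum(pi * ni for pi, ni in zip(p, n))
    return (D ** ((1 - beta) * avg) - 1) / c

def La(p, n, alpha, beta, D=2):
    # Form (3.27) assumed.
    c = D ** (1 - beta) - 1
    s = sum(pi * D ** (ni * (1 - alpha) / alpha) for pi, ni in zip(p, n))
    return (s ** (alpha * (1 - beta) / (1 - alpha)) - 1) / c

def shannon_mean(p, n):
    # Ordinary mean length, case (2).
    return sum(pi * ni for pi, ni in zip(p, n))

def campbell_length(p, n, t, D=2):
    # Campbell's mean length of order t, case (3).
    return math.log(sum(pi * D ** (t * ni) for pi, ni in zip(p, n)), D) / t

p, n = [0.5, 0.3, 0.2], [1, 2, 3]
alpha = 0.4
t = (1 - alpha) / alpha
```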

Thus, we have shown that $\mathcal{L}_\beta^1$ and $\mathcal{L}_\beta^\alpha$ are nonadditive type $\beta$ generalizations of the mean lengths of Shannon and of Campbell, respectively. We now prove the following theorem.

Theorem 3.2. If $n_1, n_2, \ldots, n_N$ denote the lengths of an instantaneous/uniquely decipherable code formed of a code alphabet of size $D$, then
(i)
$$\mathcal{L}_\beta^1 \ge \frac{D^{(1-\beta) H_D(P)} - 1}{D^{1-\beta} - 1}, \tag{3.35}$$
with equality if and only if $n_i = -\log_D p_i$ for all $i$;
(ii)
$$\mathcal{L}_\beta^\alpha \ge \frac{\left(\sum_{i=1}^{N} p_i^{\alpha}\right)^{(1-\beta)/(1-\alpha)} - 1}{D^{1-\beta} - 1}, \tag{3.36}$$
with equality if and only if $n_i = -\log_D\big(p_i^{\alpha}/\sum_{j=1}^{N} p_j^{\alpha}\big)$, where $H_D(P) = -\sum_{i=1}^{N} p_i \log_D p_i$ and $0 < \alpha \ne 1$.

Proof. As in Shannon's case, refer Feinstein [6], the Kraft inequality (1.2) gives
$$\sum_{i=1}^{N} p_i n_i \ge -\sum_{i=1}^{N} p_i \log_D p_i = H_D(P).$$
Now $1 - \beta \gtrless 0$ according as $\beta \lessgtr 1$, while the function $l$ of (3.6) is strictly increasing in either case.
Therefore, from the above, after suitable manipulation, we get the inequality
$$\mathcal{L}_\beta^1 = l\!\left(\sum_{i=1}^{N} p_i n_i\right) \ge l\big(H_D(P)\big) = \frac{D^{(1-\beta) H_D(P)} - 1}{D^{1-\beta} - 1},$$
which is the result (3.35).
The case of equality can be discussed as for Shannon's, and it holds only when $n_i = -\log_D p_i$ for each $i$.
We now proceed to part (ii). As $\alpha \to 1$, the result reduces to the one proved in part (i). For other values of $\alpha$, we use Hölder's inequality
$$\sum_{i=1}^{N} x_i y_i \ge \left(\sum_{i=1}^{N} x_i^{p}\right)^{1/p} \left(\sum_{i=1}^{N} y_i^{q}\right)^{1/q}, \tag{3.40}$$
where $1/p + 1/q = 1$, $p < 1$ ($p \ne 0$), and $x_i > 0$, $y_i > 0$ for all $i$.
Making the substitutions $p = (\alpha - 1)/\alpha$, $q = 1 - \alpha$, $x_i = p_i^{\alpha/(\alpha-1)} D^{-n_i}$, and $y_i = p_i^{\alpha/(1-\alpha)}$ in (3.40), we get, after suitable manipulations with the Kraft inequality (1.2),
$$\sum_{i=1}^{N} p_i D^{n_i(1-\alpha)/\alpha} \gtrless \left(\sum_{i=1}^{N} p_i^{\alpha}\right)^{1/\alpha} \quad \text{according as } \alpha \lessgtr 1. \tag{3.41}$$
Raising both sides of (3.41) to the power $\alpha(1-\beta)/(1-\alpha)$ and using that $D^{1-\beta} - 1 \gtrless 0$ according as $\beta \lessgtr 1$, we get the result (3.36) after simple manipulation.
Hence, the theorem is proved.
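The bounds of Theorem 3.2 admit a direct numerical check, assuming the forms displayed in (3.35) and (3.36). A dyadic distribution attains equality in part (i); a distribution for which $p_i^{\alpha}/\sum_j p_j^{\alpha}$ is dyadic (here $\alpha = 1/2$) attains equality in part (ii). All names below are ours:

```python
import math

def L1(p, n, beta, D=2):
    # Form (3.26) assumed.
    c = D ** (1 - beta) - 1
    avg = sum(pi * ni for pi, ni in zip(p, n))
    return (D ** ((1 - beta) * avg) - 1) / c

def La(p, n, alpha, beta, D=2):
    # Form (3.27) assumed.
    c = D ** (1 - beta) - 1
    s = sum(pi * D ** (ni * (1 - alpha) / alpha) for pi, ni in zip(p, n))
    return (s ** (alpha * (1 - beta) / (1 - alpha)) - 1) / c

def h_D(p, D=2):
    # Shannon's entropy in base D.
    return -sum(pi * math.log(pi, D) for pi in p if pi > 0)

def bound_i(p, beta, D=2):
    # Right-hand side of (3.35), form assumed.
    c = D ** (1 - beta) - 1
    return (D ** ((1 - beta) * h_D(p, D)) - 1) / c

def bound_ii(p, alpha, beta, D=2):
    # Right-hand side of (3.36), form assumed.
    c = D ** (1 - beta) - 1
    s = sum(pi ** alpha for pi in p)
    return (s ** ((1 - beta) / (1 - alpha)) - 1) / c

p = [0.5, 0.25, 0.125, 0.125]
n_opt = [1, 2, 3, 3]             # n_i = -log2 p_i: optimal for part (i)
n_sub = [2, 2, 2, 3]             # Kraft sum 7/8 <= 1: admissible, suboptimal
p2 = [8/11, 2/11, 1/22, 1/22]    # sqrt(p_i)/sum_j sqrt(p_j) dyadic: optimal for part (ii)
```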