Journal of Inequalities in Pure and Applied Mathematics

Volume 2, Issue 1, Article 5, 2001.
Received 12 April, 2000; accepted 06 October, 2000.
Communicated by: L.-E. Persson

FURTHER REVERSE RESULTS FOR JENSEN'S DISCRETE INEQUALITY AND APPLICATIONS IN INFORMATION THEORY

I. BUDIMIR, S.S. DRAGOMIR AND J.E. PEČARIĆ

Department of Mathematics, Faculty of Textile Technology, University of Zagreb, CROATIA.
EMail: ivanb@zagreb.tekstil.hr

School of Communications and Informatics, Victoria University of Technology, PO Box 14428, Melbourne City MC 8001, Victoria, AUSTRALIA.
EMail: sever.dragomir@vu.edu.au
URL: http://rgmia.vu.edu.au/SSDragomirWeb.html

Department of Mathematics, Faculty of Textile Technology, University of Zagreb, CROATIA.
EMail: pecaric@mahazu.hazu.hr
URL: http://mahazu.hazu.hr/DepMPCS/indexJP.html

© 2000 School of Communications and Informatics, Victoria University of Technology
ISSN (electronic): 1443-5756
007-00

J. Ineq. Pure and Appl. Math. 2(1) Art. 5, 2001

http://jipam.vu.edu.au

Abstract

Some new inequalities which counterpart Jensen's discrete inequality and improve the recent results from [4] and [5] are given. A related result for generalized means is established. Applications in Information Theory are also provided.

2000 Mathematics Subject Classification: 26D15, 94Xxx.

Key words: Convex functions, Jensen's Inequality, Entropy Mappings.

Contents

1. Introduction
2. Some New Counterparts for Jensen's Discrete Inequality
3. A Converse Inequality for Convex Mappings Defined on $\mathbb{R}^n$
4. Some Related Results
5. Applications in Information Theory
References


1. Introduction

Let $f : X \to \mathbb{R}$ be a convex mapping defined on the linear space $X$ and $x_i \in X$, $p_i \ge 0$ $(i = 1, \dots, m)$ with $P_m := \sum_{i=1}^m p_i > 0$.

The following inequality is well known in the literature as Jensen's inequality:

(1.1) $f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i).$

There are many well known inequalities which are particular cases of Jensen's inequality, such as the weighted arithmetic mean-geometric mean-harmonic mean inequality, the Ky Fan inequality, the Hölder inequality, etc. For a comprehensive list of recent results on Jensen's inequality, see the book [25] and the papers [9]-[15], where further references are given.

In 1994, Dragomir and Ionescu [18] proved the following inequality, which counterparts (1.1) for real mappings of a real variable.

Theorem 1.1. Let $f : I \subseteq \mathbb{R} \to \mathbb{R}$ be a differentiable convex mapping on $\mathring{I}$ ($\mathring{I}$ is the interior of $I$), $x_i \in \mathring{I}$, $p_i \ge 0$ $(i = 1, \dots, n)$ and $\sum_{i=1}^n p_i = 1$. Then we have the inequality

(1.2) $0 \le \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \le \sum_{i=1}^n p_i x_i f'(x_i) - \sum_{i=1}^n p_i x_i \sum_{i=1}^n p_i f'(x_i),$


where $f'$ is the derivative of $f$ on $\mathring{I}$.

Using this result and the discrete version of the Grüss inequality for weighted sums, S.S. Dragomir obtained the following simple counterpart of Jensen's inequality [5]:

Theorem 1.2. With the above assumptions for $f$, and if $m, M \in I$ and $m \le x_i \le M$ $(i = 1, \dots, n)$, then we have

(1.3) $0 \le \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \le \dfrac{1}{4}(M - m)\left(f'(M) - f'(m)\right).$

This was subsequently applied in Information Theory for Shannon's and Rényi's entropy.

In this paper we point out some other counterparts of Jensen’s inequality that are similar to (1.3), some of which are better than the above inequalities.
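The bound (1.3) is easy to probe numerically. The following sketch is ours, not part of the paper; the choice $f(x) = e^x$ on $[0,2]$, the random sampling, and the tolerances are illustrative assumptions:

```python
import math
import random

def jensen_gap(p, x, f):
    """Jensen difference: sum_i p_i f(x_i) - f(sum_i p_i x_i), for sum_i p_i = 1."""
    mean = sum(pi * xi for pi, xi in zip(p, x))
    return sum(pi * f(xi) for pi, xi in zip(p, x)) - f(mean)

random.seed(0)
m, M = 0.0, 2.0
for _ in range(1000):
    n = random.randint(2, 8)
    x = [random.uniform(m, M) for _ in range(n)]
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]                         # normalised weights
    # RHS of (1.3) for f = exp, whose derivative is exp as well
    bound = 0.25 * (M - m) * (math.exp(M) - math.exp(m))
    assert -1e-12 <= jensen_gap(p, x, math.exp) <= bound + 1e-12
```

Replacing `math.exp` by any other differentiable convex function (with the matching derivative in `bound`) exercises the same inequality.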


2. Some New Counterparts for Jensen’s Discrete Inequality

The following result holds.

Theorem 2.1. Let $f : I \subseteq \mathbb{R} \to \mathbb{R}$ be a differentiable convex mapping on $\mathring{I}$ and $x_i \in I$ with $x_1 \le x_2 \le \dots \le x_n$ and $p_i \ge 0$ $(i = 1, \dots, n)$ with $\sum_{i=1}^n p_i = 1$. Then we have

(2.1) $0 \le \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \le (x_n - x_1)\left(f'(x_n) - f'(x_1)\right) \max_{1 \le k \le n-1}\left[P_k \bar{P}_{k+1}\right] \le \dfrac{1}{4}(x_n - x_1)\left(f'(x_n) - f'(x_1)\right),$

where $P_k := \sum_{i=1}^k p_i$ and $\bar{P}_{k+1} := 1 - P_k$.

Proof. We use the following Grüss type inequality due to J.E. Pečarić (see for example [25]):

(2.2) $\left|\dfrac{1}{Q_n}\sum_{i=1}^n q_i a_i b_i - \dfrac{1}{Q_n}\sum_{i=1}^n q_i a_i \cdot \dfrac{1}{Q_n}\sum_{i=1}^n q_i b_i\right| \le |a_n - a_1|\,|b_n - b_1| \max_{1 \le k \le n-1} \dfrac{Q_k \bar{Q}_{k+1}}{Q_n^2},$


provided that $a, b$ are two monotonic $n$-tuples, $q$ is a positive one, $Q_n := \sum_{i=1}^n q_i > 0$, $Q_k := \sum_{i=1}^k q_i$ and $\bar{Q}_{k+1} := Q_n - Q_k$.

If in (2.2) we choose $q_i = p_i$, $a_i = x_i$, $b_i = f'(x_i)$ (and then $a_i, b_i$ are monotonic nondecreasing), we may state that

(2.3) $\left|\sum_{i=1}^n p_i x_i f'(x_i) - \sum_{i=1}^n p_i x_i \sum_{i=1}^n p_i f'(x_i)\right| \le (x_n - x_1)\left(f'(x_n) - f'(x_1)\right) \max_{1 \le k \le n-1}\left[P_k \bar{P}_{k+1}\right].$

Now, using (1.2) and (2.3), we obtain the first inequality in (2.1).

For the second inequality, we observe that
$P_k \bar{P}_{k+1} = P_k(1 - P_k) \le \dfrac{1}{4}\left(P_k + 1 - P_k\right)^2 = \dfrac{1}{4}$
for all $k \in \{1, \dots, n-1\}$, and then
$\max_{1 \le k \le n-1} P_k \bar{P}_{k+1} \le \dfrac{1}{4},$
which proves the last part of (2.1).

Remark 2.1. It is obvious that the inequality (2.1) is an improvement of (1.3) if we assume that the $x_i$ are ordered as in the statement of Theorem 2.1.
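The improvement noted in Remark 2.1 can be observed numerically: for ordered $x_i$, the middle term of (2.1) sits between the Jensen gap and the $\frac{1}{4}$-bound of (1.3). In this sketch (ours; the choice $f(t) = t^2$ and the sampling scheme are illustrative assumptions):

```python
import random

def max_PkPbar(p):
    """max over k = 1..n-1 of P_k * (1 - P_k), where P_k = p_1 + ... + p_k."""
    Pk, best = 0.0, 0.0
    for pk in p[:-1]:
        Pk += pk
        best = max(best, Pk * (1.0 - Pk))
    return best

f, fp = (lambda t: t * t), (lambda t: 2.0 * t)   # convex test function and its derivative

random.seed(1)
for _ in range(500):
    n = random.randint(2, 10)
    x = sorted(random.uniform(0.0, 1.0) for _ in range(n))  # ordered as in Theorem 2.1
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]
    gap = sum(pi * f(xi) for pi, xi in zip(p, x)) - f(sum(pi * xi for pi, xi in zip(p, x)))
    width = (x[-1] - x[0]) * (fp(x[-1]) - fp(x[0]))
    # Jensen gap <= middle term of (2.1) <= (1/4)-bound of (1.3)
    assert gap <= width * max_PkPbar(p) + 1e-12 <= width / 4.0 + 1e-12
```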

Another result is embodied in the following theorem.


Theorem 2.2. Let $f : I \subseteq \mathbb{R} \to \mathbb{R}$ be a differentiable convex mapping on $\mathring{I}$ and $m, M \in I$ with $m \le x_i \le M$ $(i = 1, \dots, n)$ and $p_i \ge 0$ $(i = 1, \dots, n)$ with $\sum_{i=1}^n p_i = 1$. If $S$ is a subset of the set $\{1, \dots, n\}$ minimizing the expression

(2.4) $\left|\sum_{i \in S} p_i - \dfrac{1}{2}\right|,$

then we have the inequality

(2.5) $0 \le \sum_{i=1}^n p_i f(x_i) - f\left(\sum_{i=1}^n p_i x_i\right) \le Q(M - m)\left(f'(M) - f'(m)\right) \le \dfrac{1}{4}(M - m)\left(f'(M) - f'(m)\right),$

where
$Q = \sum_{i \in S} p_i \left(1 - \sum_{i \in S} p_i\right).$

Proof. We use the following Grüss type inequality due to Andrica and Badea [2]:

(2.6) $\left|Q_n \sum_{i=1}^n q_i a_i b_i - \sum_{i=1}^n q_i a_i \cdot \sum_{i=1}^n q_i b_i\right| \le (M_1 - m_1)(M_2 - m_2) \sum_{i \in S} q_i \left(Q_n - \sum_{i \in S} q_i\right),$


provided that $m_1 \le a_i \le M_1$, $m_2 \le b_i \le M_2$ for $i = 1, \dots, n$, and $S$ is the subset of $\{1, \dots, n\}$ which minimises the expression
$\left|\sum_{i \in S} q_i - \dfrac{1}{2}Q_n\right|.$

Choosing $q_i = p_i$, $a_i = x_i$, $b_i = f'(x_i)$, then we may state that

(2.7) $0 \le \sum_{i=1}^n p_i x_i f'(x_i) - \sum_{i=1}^n p_i x_i \sum_{i=1}^n p_i f'(x_i) \le (M - m)\left(f'(M) - f'(m)\right) \sum_{i \in S} p_i \left(1 - \sum_{i \in S} p_i\right).$

Now, using (1.2) and (2.7), we obtain the first inequality in (2.5). For the last part, we observe that
$Q \le \dfrac{1}{4}\left(\sum_{i \in S} p_i + 1 - \sum_{i \in S} p_i\right)^2 = \dfrac{1}{4},$
and the theorem is thus proved.
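Finding the subset $S$ that minimizes (2.4) is a small subset-sum problem. The following brute-force sketch (ours, exponential in $n$ and meant only for small illustrative sizes) computes the resulting constant $Q$ of (2.5):

```python
from itertools import combinations

def best_Q(p):
    """Q = s * (1 - s), where s is the nonempty-subset sum of p closest to 1/2.

    Brute force over all subsets; exponential in len(p), so only for small n.
    """
    best_s, best_dev = 0.0, 0.5
    for r in range(1, len(p) + 1):
        for S in combinations(range(len(p)), r):
            s = sum(p[i] for i in S)
            if abs(s - 0.5) < best_dev:
                best_s, best_dev = s, abs(s - 0.5)
    return best_s * (1.0 - best_s)

# For p = (0.7, 0.2, 0.1), the subset sums closest to 1/2 are 0.7 and 0.3,
# so Q = 0.7 * 0.3 = 0.21 < 1/4: here (2.5) strictly improves on (1.3).
assert abs(best_Q([0.7, 0.2, 0.1]) - 0.21) < 1e-12
```

When some subset sums to exactly $\frac{1}{2}$, $Q = \frac{1}{4}$ and (2.5) reduces to (1.3).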

The following inequality is well known in the literature as the arithmetic mean-geometric mean-harmonic mean inequality:

(2.8) $A_n(p, x) \ge G_n(p, x) \ge H_n(p, x),$


where
$A_n(p, x) := \sum_{i=1}^n p_i x_i$ (the arithmetic mean),
$G_n(p, x) := \prod_{i=1}^n x_i^{p_i}$ (the geometric mean),
$H_n(p, x) := \dfrac{1}{\sum_{i=1}^n \frac{p_i}{x_i}}$ (the harmonic mean),
and $\sum_{i=1}^n p_i = 1$, $p_i \ge 0$ $(i = 1, \dots, n)$.

Using the above two theorems, we are able to point out the following reverse of the AGH-inequality.

Proposition 2.3. Let $x_i > 0$ $(i = 1, \dots, n)$ and $p_i \ge 0$ with $\sum_{i=1}^n p_i = 1$.

(i) If $x_1 \le x_2 \le \dots \le x_{n-1} \le x_n$, then we have

(2.9) $1 \le \dfrac{A_n(p, x)}{G_n(p, x)} \le \exp\left[\dfrac{(x_n - x_1)^2}{x_1 x_n} \max_{1 \le k \le n-1} P_k \bar{P}_{k+1}\right] \le \exp\left[\dfrac{1}{4} \cdot \dfrac{(x_n - x_1)^2}{x_1 x_n}\right].$


(ii) If the set $S \subseteq \{1, \dots, n\}$ minimizes the expression (2.4), and $0 < m \le x_i \le M < \infty$ $(i = 1, \dots, n)$, then

(2.10) $1 \le \dfrac{A_n(p, x)}{G_n(p, x)} \le \exp\left[Q \cdot \dfrac{(M - m)^2}{mM}\right] \le \exp\left[\dfrac{1}{4} \cdot \dfrac{(M - m)^2}{mM}\right].$

The proof goes by the inequalities (2.1) and (2.5), choosing $f(x) = -\ln x$.
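The crude bound of (2.10) can be checked numerically. The sketch below is ours (the interval $[m, M] = [0.5, 3]$ and the sampling are illustrative assumptions):

```python
import math
import random

def ag_ratio(p, x):
    """A_n(p, x) / G_n(p, x) for a probability vector p and positive x."""
    A = sum(pi * xi for pi, xi in zip(p, x))
    G = math.exp(sum(pi * math.log(xi) for pi, xi in zip(p, x)))
    return A / G

random.seed(2)
m, M = 0.5, 3.0
for _ in range(500):
    n = random.randint(2, 8)
    x = [random.uniform(m, M) for _ in range(n)]
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]
    # 1 <= A/G <= exp((1/4)(M-m)^2 / (mM)), the outer bounds of (2.10)
    assert 1.0 - 1e-12 <= ag_ratio(p, x) <= math.exp(0.25 * (M - m) ** 2 / (m * M)) + 1e-12
```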

A similar result can be stated for $G_n$ and $H_n$.

Proposition 2.4. Let $p \ge 1$ and $x_i > 0$, $p_i \ge 0$ $(i = 1, \dots, n)$ with $\sum_{i=1}^n p_i = 1$.

(i) If $x_1 \le x_2 \le \dots \le x_{n-1} \le x_n$, then we have

(2.11) $0 \le \sum_{i=1}^n p_i x_i^p - \left(\sum_{i=1}^n p_i x_i\right)^p \le p(x_n - x_1)\left(x_n^{p-1} - x_1^{p-1}\right) \max_{1 \le k \le n-1} P_k \bar{P}_{k+1} \le \dfrac{p}{4}(x_n - x_1)\left(x_n^{p-1} - x_1^{p-1}\right).$

(ii) If the set $S \subseteq \{1, \dots, n\}$ minimizes the expression (2.4), and $0 < m \le x_i \le M < \infty$ $(i = 1, \dots, n)$, then

(2.12) $0 \le \sum_{i=1}^n p_i x_i^p - \left(\sum_{i=1}^n p_i x_i\right)^p \le pQ(M - m)\left(M^{p-1} - m^{p-1}\right) \le \dfrac{p}{4}(M - m)\left(M^{p-1} - m^{p-1}\right).$

Remark 2.2. The above results are improvements of the corresponding inequalities obtained in [5].

Remark 2.3. Similar inequalities can be stated if we choose other convex functions such as $f(x) = x\ln x$, $x > 0$, or $f(x) = \exp(x)$, $x \in \mathbb{R}$. We omit the details.
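As with the earlier results, the crude bound of (2.11) is easy to exercise numerically; the exponent $p = 3$ and sampling below are our illustrative assumptions (the exponent is renamed `P` in the code to avoid clashing with the weight vector):

```python
import random

def power_gap(p, x, P):
    """Left side of (2.11): sum p_i x_i^P - (sum p_i x_i)^P."""
    return sum(pi * xi ** P for pi, xi in zip(p, x)) - sum(pi * xi for pi, xi in zip(p, x)) ** P

random.seed(3)
P = 3.0  # the exponent p >= 1 of Proposition 2.4
for _ in range(500):
    n = random.randint(2, 8)
    x = sorted(random.uniform(0.1, 2.0) for _ in range(n))
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]
    # (p/4)(x_n - x_1)(x_n^{p-1} - x_1^{p-1}), the last bound of (2.11)
    top = (P / 4.0) * (x[-1] - x[0]) * (x[-1] ** (P - 1) - x[0] ** (P - 1))
    assert -1e-10 <= power_gap(p, x, P) <= top + 1e-10
```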


3. A Converse Inequality for Convex Mappings Defined on $\mathbb{R}^n$

In 1996, Dragomir and Goh [15] proved the following converse of Jensen's inequality for convex mappings on $\mathbb{R}^n$.

Theorem 3.1. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping on $\mathbb{R}^n$ and
$(\nabla f)(x) := \left(\dfrac{\partial f(x)}{\partial x_1}, \dots, \dfrac{\partial f(x)}{\partial x_n}\right),$
the vector of the partial derivatives, $x = (x_1, \dots, x_n) \in \mathbb{R}^n$. If $x_i \in \mathbb{R}^n$ $(i = 1, \dots, m)$, $p_i \ge 0$, $i = 1, \dots, m$, with $P_m := \sum_{i=1}^m p_i > 0$, then

(3.1) $0 \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le \dfrac{1}{P_m}\sum_{i=1}^m p_i \langle \nabla f(x_i), x_i \rangle - \left\langle \dfrac{1}{P_m}\sum_{i=1}^m p_i \nabla f(x_i), \dfrac{1}{P_m}\sum_{i=1}^m p_i x_i \right\rangle.$

The result was applied to different problems in Information Theory by providing different counterpart inequalities for Shannon's entropy, conditional entropy, mutual information, conditional mutual information, etc.

For generalizations of (3.1) in Normed Spaces and other applications in Information Theory, see Matić's Ph.D. dissertation [23].

Recently, Dragomir [4] provided an upper bound for Jensen's difference

(3.2) $\Delta(f, p, x) := \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right),$


which, even though it is not as sharp as (3.1), provides a simpler and, for applications, better way of estimating the Jensen difference $\Delta$. His result is embodied in the following theorem.

Theorem 3.2. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping and $x_i \in \mathbb{R}^n$, $i = 1, \dots, m$. Suppose that there exist vectors $\phi, \Phi \in \mathbb{R}^n$ such that

(3.3) $\phi \le x_i \le \Phi$ (the order is considered on the co-ordinates)

and $m, M \in \mathbb{R}^n$ are such that

(3.4) $m \le \nabla f(x_i) \le M$

for all $i \in \{1, \dots, m\}$. Then for all $p_i \ge 0$ $(i = 1, \dots, m)$ with $P_m > 0$, we have the inequality

(3.5) $0 \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le \dfrac{1}{4}\|\Phi - \phi\|\,\|M - m\|,$

where $\|\cdot\|$ is the usual Euclidean norm on $\mathbb{R}^n$.

He applied this inequality to obtain different upper bounds for Shannon's and Rényi's entropies.

In this section, we point out another counterpart for Jensen's difference, assuming that the $\nabla$-operator is of Hölder's type, as follows.

Theorem 3.3. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping and $x_i \in \mathbb{R}^n$, $p_i \ge 0$ $(i = 1, \dots, m)$ with $P_m > 0$. Suppose that the $\nabla$-operator satisfies a condition of $r$-$H$-Hölder type, i.e.,

(3.6) $\|\nabla f(x) - \nabla f(y)\| \le H\|x - y\|^r$, for all $x, y \in \mathbb{R}^n$,


where $H > 0$, $r \in (0, 1]$ and $\|\cdot\|$ is the Euclidean norm.

Then we have the inequality:

(3.7) $0 \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le \dfrac{H}{P_m^2}\sum_{1 \le i < j \le m} p_i p_j \|x_i - x_j\|^{r+1}.$

Proof. We recall Korkine’s identity, 1

Pm

m

X

i=1

pihyi, xii −

* 1 Pm

m

X

i=1

piyi, 1 Pm

m

X

i=1

pixi +

= 1 2Pm2

n

X

i,j=1

pipjhyi−yj, xi−xji, x, y ∈Rn,

and simply write 1

Pm

m

X

i=1

pih∇f(xi), xii −

* 1 Pm

m

X

i=1

pi∇f(xi), 1 Pm

m

X

i=1

pixi +

= 1 2Pm2

n

X

i,j=1

pipjh∇f(xi)− ∇f(xj), xi−xji.


Using (3.1) and the properties of the modulus, we have
\begin{align*}
0 &\le \frac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\frac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \\
&\le \frac{1}{2P_m^2}\sum_{i,j=1}^m p_i p_j \left|\left\langle \nabla f(x_i) - \nabla f(x_j), x_i - x_j \right\rangle\right| \\
&\le \frac{1}{2P_m^2}\sum_{i,j=1}^m p_i p_j \left\|\nabla f(x_i) - \nabla f(x_j)\right\| \left\|x_i - x_j\right\| \\
&\le \frac{H}{2P_m^2}\sum_{i,j=1}^m p_i p_j \left\|x_i - x_j\right\|^{r+1} = \frac{H}{P_m^2}\sum_{1 \le i < j \le m} p_i p_j \left\|x_i - x_j\right\|^{r+1},
\end{align*}
and the inequality (3.7) is proved.
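The bound (3.7) can be probed for a concrete mapping. In the sketch below (ours; the choice $f(x) = \|x\|^2$, whose gradient $2x$ satisfies (3.6) with $H = 2$, $r = 1$, is an illustrative assumption):

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def vec_jensen_gap(p, xs, f):
    """(1/P_m) sum p_i f(x_i) - f((1/P_m) sum p_i x_i) for vector samples xs."""
    Pm = sum(p)
    bary = [sum(pi * xi[k] for pi, xi in zip(p, xs)) / Pm for k in range(len(xs[0]))]
    return sum(pi * f(xi) for pi, xi in zip(p, xs)) / Pm - f(bary)

f = lambda v: dot(v, v)   # f(x) = ||x||^2; grad f = 2x satisfies (3.6) with H = 2, r = 1

random.seed(4)
for _ in range(200):
    m = random.randint(2, 6)
    xs = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(m)]
    p = [random.uniform(0.1, 1.0) for _ in range(m)]       # unnormalised: P_m > 0 suffices
    Pm, H, r = sum(p), 2.0, 1.0
    rhs = H / Pm ** 2 * sum(
        p[i] * p[j] * sum((a - b) ** 2 for a, b in zip(xs[i], xs[j])) ** ((r + 1) / 2)
        for i in range(m) for j in range(i + 1, m))
    assert -1e-12 <= vec_jensen_gap(p, xs, f) <= rhs + 1e-12
```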

Corollary 3.4. With the assumptions of Theorem 3.3 and if $\Delta := \max_{1 \le i < j \le m} \|x_i - x_j\|$, then we have the inequality

(3.8) $0 \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le \dfrac{H\Delta^{r+1}}{2P_m^2}\left(P_m^2 - \sum_{i=1}^m p_i^2\right).$

Proof. Indeed, as
$\sum_{1 \le i < j \le m} p_i p_j \|x_i - x_j\|^{r+1} \le \Delta^{r+1} \sum_{1 \le i < j \le m} p_i p_j.$


However,
$\sum_{1 \le i < j \le m} p_i p_j = \dfrac{1}{2}\left(\sum_{i,j=1}^m p_i p_j - \sum_{i=j} p_i p_j\right) = \dfrac{1}{2}\left(P_m^2 - \sum_{i=1}^m p_i^2\right),$
and the inequality (3.8) is proved.

The case of Lipschitzian mappings is embodied in the following corollary.

Corollary 3.5. Let $f : \mathbb{R}^n \to \mathbb{R}$ be a differentiable convex mapping and $x_i \in \mathbb{R}^n$, $p_i \ge 0$ $(i = 1, \dots, m)$ with $P_m > 0$. Suppose that the $\nabla$-operator is Lipschitzian with the constant $L > 0$, i.e.,

(3.9) $\|\nabla f(x) - \nabla f(y)\| \le L\|x - y\|$, for all $x, y \in \mathbb{R}^n$,

where $\|\cdot\|$ is the Euclidean norm. Then

(3.10) $0 \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le L\left[\dfrac{1}{P_m}\sum_{i=1}^m p_i \|x_i\|^2 - \left\|\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right\|^2\right].$

Proof. The argument is obvious by Theorem 3.3, taking into account that for $r = 1$,
$\sum_{1 \le i < j \le m} p_i p_j \|x_i - x_j\|^2 = P_m \sum_{i=1}^m p_i \|x_i\|^2 - \left\|\sum_{i=1}^m p_i x_i\right\|^2,$
and $\|\cdot\|$ is the Euclidean norm.
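The Lipschitz-gradient bound (3.10) can be illustrated with the log-sum-exp function, whose gradient (the softmax map) is 1-Lipschitz; this function choice and the sampling are our illustrative assumptions, not part of the paper:

```python
import math
import random

def lse(v):
    """log-sum-exp: a smooth convex function whose gradient (softmax) is 1-Lipschitz."""
    M = max(v)
    return M + math.log(sum(math.exp(a - M) for a in v))

random.seed(5)
for _ in range(200):
    m = random.randint(2, 6)
    xs = [[random.uniform(-2, 2) for _ in range(4)] for _ in range(m)]
    p = [random.uniform(0.1, 1.0) for _ in range(m)]
    Pm = sum(p)
    bary = [sum(pi * xi[k] for pi, xi in zip(p, xs)) / Pm for k in range(4)]
    gap = sum(pi * lse(xi) for pi, xi in zip(p, xs)) / Pm - lse(bary)
    L = 1.0  # Lipschitz constant of the gradient of lse
    rhs = L * (sum(pi * sum(a * a for a in xi) for pi, xi in zip(p, xs)) / Pm
               - sum(b * b for b in bary))
    assert -1e-12 <= gap <= rhs + 1e-12
```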


Moreover, if we assume more about the vectors $(x_i)_{i=1,\dots,m}$, we can obtain a simpler result that is similar to the one in [4].

Corollary 3.6. Assume that $f$ is as in Corollary 3.5. If

(3.11) $\phi \le x_i \le \Phi$ (on the co-ordinates), $\phi, \Phi \in \mathbb{R}^n$ $(i = 1, \dots, m)$,

then we have the inequality

(3.12) $0 \le \dfrac{1}{P_m}\sum_{i=1}^m p_i f(x_i) - f\left(\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right) \le \dfrac{1}{4} \cdot L \cdot \|\Phi - \phi\|^2.$

Proof. It follows by the fact that in $\mathbb{R}^n$, we have the following Grüss type inequality (as proved in [4]):

(3.13) $\dfrac{1}{P_m}\sum_{i=1}^m p_i \|x_i\|^2 - \left\|\dfrac{1}{P_m}\sum_{i=1}^m p_i x_i\right\|^2 \le \dfrac{1}{4}\|\Phi - \phi\|^2,$

provided that (3.11) holds.

Remark 3.1. For some Grüss type inequalities in Inner Product Spaces, see [7].


4. Some Related Results

We start with the following definitions from [3].

Definition 4.1. Let $-\infty < a < b < \infty$. Then $CM[a,b]$ denotes the set of all functions with domain $[a,b]$ that are continuous and strictly monotonic there.

Definition 4.2. Let $-\infty < a < b < \infty$, and let $f \in CM[a,b]$. Then, for each positive integer $n$, each $n$-tuple $x = (x_1, \dots, x_n)$, where $a \le x_j \le b$ $(j = 1, 2, \dots, n)$, and each $n$-tuple $p = (p_1, p_2, \dots, p_n)$, where $p_j > 0$ $(j = 1, 2, \dots, n)$ and $\sum_{j=1}^n p_j = 1$, let $M_f(x, p)$ denote the (weighted) mean
$f^{-1}\left\{\sum_{j=1}^n p_j f(x_j)\right\}.$

We may now state the following result.

Theorem 4.1. Let $S$ be the subset of $\{1, \dots, n\}$ which minimizes the expression $\left|\sum_{i \in S} p_i - 1/2\right|$. If $f, g \in CM[a,b]$, then
$\sup_x \left\{\left|M_f(x, p) - M_g(x, p)\right|\right\} \le Q \cdot \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \cdot \left|g(b) - g(a)\right|^2,$
provided that the right-hand side of the inequality is finite, where, as above,
$Q = \left(\sum_{i \in S} p_i\right)\left(1 - \sum_{i \in S} p_i\right),$
and $\|\cdot\|_\infty$ is the usual sup-norm.


Proof. Let, as in [3], $h = f \circ g^{-1}$, $n > 1$,
$x = (x_1, x_2, \dots, x_n) \quad \text{and} \quad p = (p_1, p_2, \dots, p_n)$
be as in Definition 4.2, and $y_j = g(x_j)$ $(j = 1, 2, \dots, n)$. By the mean-value theorem, for some $\alpha$ in the open interval joining $f(a)$ to $f(b)$, we have
\begin{align*}
M_f(x, p) - M_g(x, p) &= f^{-1}\left\{\sum_{j=1}^n p_j f(x_j)\right\} - f^{-1}\left[h\left\{\sum_{j=1}^n p_j g(x_j)\right\}\right] \\
&= \left(f^{-1}\right)'(\alpha)\left[\sum_{j=1}^n p_j f(x_j) - h\left\{\sum_{j=1}^n p_j g(x_j)\right\}\right] \\
&= \left(f^{-1}\right)'(\alpha)\left[\sum_{j=1}^n p_j h(y_j) - h\left\{\sum_{j=1}^n p_j y_j\right\}\right] \\
&= \left(f^{-1}\right)'(\alpha)\left[\sum_{j=1}^n p_j \left\{h(y_j) - h\left(\sum_{k=1}^n p_k y_k\right)\right\}\right].
\end{align*}
Using the mean-value theorem a second time, we conclude that there exist points $z_1, z_2, \dots, z_n$ in the open interval joining $g(a)$ to $g(b)$, such that
\begin{align*}
M_f&(x, p) - M_g(x, p) \\
&= \left(f^{-1}\right)'(\alpha)\Big[p_1\left\{(1 - p_1)y_1 - p_2 y_2 - \dots - p_n y_n\right\}h'(z_1) \\
&\qquad + p_2\left\{-p_1 y_1 + (1 - p_2)y_2 - \dots - p_n y_n\right\}h'(z_2) \\
&\qquad + \dots \\
&\qquad + p_n\left\{-p_1 y_1 - p_2 y_2 - \dots + (1 - p_n)y_n\right\}h'(z_n)\Big]
\end{align*}


\begin{align*}
&= \left(f^{-1}\right)'(\alpha)\Big[p_1\left\{p_2(y_1 - y_2) + \dots + p_n(y_1 - y_n)\right\}h'(z_1) \\
&\qquad + p_2\left\{p_1(y_2 - y_1) + \dots + p_n(y_2 - y_n)\right\}h'(z_2) \\
&\qquad + \dots \\
&\qquad + p_n\left\{p_1(y_n - y_1) + \dots + p_{n-1}(y_n - y_{n-1})\right\}h'(z_n)\Big] \\
&= \left(f^{-1}\right)'(\alpha)\sum_{1 \le i < j \le n} p_i p_j (y_i - y_j)\left\{h'(z_i) - h'(z_j)\right\}.
\end{align*}
Using the mean-value theorem a third time, we conclude that there exist points $\omega_{ij}$ $(1 \le i < j \le n)$ in the open interval joining $g(a)$ to $g(b)$, such that
$\left(f^{-1}\right)'(\alpha)\sum_{1 \le i < j \le n} p_i p_j (y_i - y_j)\left\{h'(z_i) - h'(z_j)\right\} = \left(f^{-1}\right)'(\alpha)\sum_{1 \le i < j \le n} p_i p_j (y_i - y_j)(z_i - z_j) h''(\omega_{ij}).$
Consequently,
\begin{align*}
\left|M_f(x, p) - M_g(x, p)\right| &\le \left|\left(f^{-1}\right)'(\alpha)\right| \sum_{1 \le i < j \le n} p_i p_j \left|y_i - y_j\right| \cdot \left|z_i - z_j\right| \cdot \left|h''(\omega_{ij})\right| \\
&\le \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|h''\right\|_\infty \cdot \sum_{1 \le i < j \le n} p_i p_j \left|y_i - y_j\right| \cdot \left|z_i - z_j\right| \\
&\le \text{(by the Cauchy-Bunyakovsky-Schwarz inequality)}
\end{align*}


\begin{align*}
&\phantom{\le} \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \cdot \sqrt{\sum_{1 \le i < j \le n} p_i p_j \left|y_i - y_j\right|^2} \cdot \sqrt{\sum_{1 \le i < j \le n} p_i p_j \left|z_i - z_j\right|^2} \\
&\le \text{(by the Andrica and Badea result)} \\
&\phantom{\le} \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \cdot \sqrt{\left(\sum_{i \in S} p_i\right)\left(1 - \sum_{i \in S} p_i\right)\left|g(b) - g(a)\right|^2} \\
&\qquad \cdot \sqrt{\left(\sum_{i \in S} p_i\right)\left(1 - \sum_{i \in S} p_i\right)\left|g(b) - g(a)\right|^2} \\
&= Q \left\|\left(f^{-1}\right)'\right\|_\infty \cdot \left\|\left(f \circ g^{-1}\right)''\right\|_\infty \cdot \left|g(b) - g(a)\right|^2,
\end{align*}
and the theorem is proved.
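Theorem 4.1 can be probed for a concrete pair of means. Taking $f(t) = t^2$ and $g(t) = t$ on $[a,b] = [1,2]$ (our illustrative choice), $M_f$ is the weighted quadratic mean and $M_g$ the arithmetic mean; then $\|(f^{-1})'\|_\infty = 1/2$ on $[f(a), f(b)] = [1,4]$, $h = f \circ g^{-1} = t^2$ gives $\|h''\|_\infty = 2$, and $|g(b) - g(a)|^2 = 1$, so the bound is exactly $Q$:

```python
import math
import random

def Mf(x, p):
    """Weighted quadratic mean: f^{-1}(sum p_j f(x_j)) with f(t) = t^2."""
    return math.sqrt(sum(pj * xj * xj for pj, xj in zip(p, x)))

def Mg(x, p):
    """Weighted arithmetic mean: g = identity."""
    return sum(pj * xj for pj, xj in zip(p, x))

def best_subset_sum(p):
    """Brute-force subset sum of p closest to 1/2 (the set S of (2.4))."""
    n = len(p)
    return min((abs(sum(p[i] for i in range(n) if (mask >> i) & 1) - 0.5),
                sum(p[i] for i in range(n) if (mask >> i) & 1))
               for mask in range(2 ** n))[1]

random.seed(9)
a, b = 1.0, 2.0
for _ in range(500):
    n = random.randint(2, 6)
    x = [random.uniform(a, b) for _ in range(n)]
    w = [random.random() for _ in range(n)]
    p = [wi / sum(w) for wi in w]
    s = best_subset_sum(p)
    Q = s * (1.0 - s)
    # bound of Theorem 4.1: Q * (1/2) * 2 * 1 = Q for this pair of means
    assert abs(Mf(x, p) - Mg(x, p)) <= Q + 1e-12
```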

Corollary 4.2. If $f, g \in CM[a,b]$, then
$\sup_x \left\{\left|M_f(x, p) - M_g(x, p)\right|\right\} \le Q \cdot \left\|\dfrac{1}{f'}\right\|_\infty \cdot \left\|\dfrac{1}{g'}\left(\dfrac{f'}{g'}\right)'\right\|_\infty \cdot \left|g(b) - g(a)\right|^2,$
provided that the right-hand side of the inequality exists.

Proof. This follows at once from the fact that
$\left(f^{-1}\right)' = \dfrac{1}{f' \circ f^{-1}}$


and
$\left(f \circ g^{-1}\right)'' = \dfrac{\left(g' \circ g^{-1}\right)\left(f'' \circ g^{-1}\right) - \left(f' \circ g^{-1}\right)\left(g'' \circ g^{-1}\right)}{\left(g' \circ g^{-1}\right)^3} = \left[\dfrac{1}{g'}\left(\dfrac{f'}{g'}\right)'\right] \circ g^{-1}.$

Remark 4.1. This establishes Theorem 4.3 from [3] and replaces the multiplicative factor $\frac{1}{4}$ by $Q$. In Corollary 4.2, we also replaced the multiplicative factor $\frac{1}{4}$ by $Q$.


5. Applications in Information Theory

We give some new applications for Shannon's entropy
$H_b(X) := \sum_{i=1}^r p_i \log_b \dfrac{1}{p_i},$
where $X$ is a random variable with the probability distribution $(p_i)_{i=1,\dots,r}$.

Theorem 5.1. Let $X$ be as above and assume that $p_1 \ge p_2 \ge \dots \ge p_r$ or $p_1 \le p_2 \le \dots \le p_r$. Then we have the inequality

(5.1) $0 \le \log_b r - H_b(X) \le \dfrac{(p_1 - p_r)^2}{\ln b \cdot p_1 p_r} \max_{1 \le k \le r-1} P_k \bar{P}_{k+1}.$

Proof. We choose in Theorem 2.1, $f(x) = -\log_b x$, $x > 0$, $x_i = \frac{1}{p_i}$ $(i = 1, \dots, r)$, assuming $p_1 \ge p_2 \ge \dots \ge p_r$. Then $x_1 \le x_2 \le \dots \le x_r$ and, since $f'(x) = -\frac{1}{x \ln b}$, by (2.1) we obtain
$0 \le \log_b r - H_b(X) \le \left(\dfrac{1}{p_r} - \dfrac{1}{p_1}\right)\dfrac{p_1 - p_r}{\ln b} \max_{1 \le k \le r-1} P_k \bar{P}_{k+1},$
which is equivalent to (5.1). The same inequality is obtained if $p_1 \le p_2 \le \dots \le p_r$.
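The entropy bound (5.1) is straightforward to verify numerically; the base $b = 2$ and sampling below are our illustrative assumptions:

```python
import math
import random

def entropy_gap_bound(p, b):
    """Right side of (5.1) for a probability vector sorted nonincreasingly."""
    Pk, best = 0.0, 0.0
    for pk in p[:-1]:
        Pk += pk
        best = max(best, Pk * (1.0 - Pk))
    return (p[0] - p[-1]) ** 2 / (math.log(b) * p[0] * p[-1]) * best

random.seed(6)
b = 2.0
for _ in range(500):
    r = random.randint(2, 8)
    w = sorted((random.uniform(0.05, 1.0) for _ in range(r)), reverse=True)
    p = [wi / sum(w) for wi in w]             # p_1 >= ... >= p_r
    H = -sum(pi * math.log(pi, b) for pi in p)  # Shannon entropy H_b(X)
    assert -1e-10 <= math.log(r, b) - H <= entropy_gap_bound(p, b) + 1e-10
```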

Theorem 5.2. Let $X$ be as above and suppose that
$p_M := \max\{p_i \mid i = 1, \dots, r\} \quad \text{and} \quad p_m := \min\{p_i \mid i = 1, \dots, r\}.$


If $S$ is a subset of the set $\{1, \dots, r\}$ minimizing the expression $\left|\sum_{i \in S} p_i - 1/2\right|$, then we have the estimation

(5.2) $0 \le \log_b r - H_b(X) \le Q \cdot \dfrac{(p_M - p_m)^2}{\ln b \cdot p_M p_m}.$

Proof. We shall choose in Theorem 2.2,
$f(x) = -\log_b x, \quad x > 0, \quad x_i = \dfrac{1}{p_i} \quad (i = 1, \dots, r).$
Then $m = \frac{1}{p_M}$, $M = \frac{1}{p_m}$, $f'(x) = -\frac{1}{x \ln b}$, and the inequality (2.5) becomes:
$0 \le \log_b r - \sum_{i=1}^r p_i \log_b \dfrac{1}{p_i} \le Q \cdot \dfrac{1}{\ln b}\left(\dfrac{1}{p_m} - \dfrac{1}{p_M}\right)(p_M - p_m) = Q \cdot \dfrac{1}{\ln b} \cdot \dfrac{(p_M - p_m)^2}{p_M p_m},$
hence the estimation (5.2) is proved.

Consider the Shannon entropy

(5.3) $H(X) := H_e(X) = \sum_{i=1}^r p_i \ln \dfrac{1}{p_i}$


and Rényi’s entropy of orderα(α∈(0,∞)\ {1})

(5.4) H[α](X) := 1

1−αln

r

X

i=1

pαi

! .

Using the classical Jensen’s discrete inequality for convex mappings, i.e.,

(5.5) f

r

X

i=1

pixi

!

r

X

i=1

pif(xi),

where f : I ⊆ R→R is a convex mapping on I, xi ∈ I (i= 1, ..., r) and (pi)i=1,r is a probability distribution, for the convex mapping f(x) = −lnx, we have

(5.6) ln

r

X

i=1

pixi

!

r

X

i=1

pilnxi.

Choosexi =pα−1i (i= 1, ..., r)in (5.6) to obtain ln

r

X

i=1

pαi

!

≥(α−1)

r

X

i=1

pilnpi,

which is equivalent to

(1−α)

H[α](X)−H(X)

≥0.


Now, if $\alpha \in (0,1)$, then $H^{[\alpha]}(X) \ge H(X)$, and if $\alpha > 1$, then $H^{[\alpha]}(X) \le H(X)$. Equality holds iff $(p_i)_{i=1,\dots,r}$ is a uniform distribution, and this fact follows by the strict convexity of $-\ln(\cdot)$. This inequality also follows as a special case of the following well known fact: $H^{[\alpha]}(X)$ is a nonincreasing function of $\alpha$. See for example [26] or [22].

Theorem 5.3. Under the above assumptions, given that $p_m = \min_{i=1,\dots,r} p_i$, $p_M = \max_{i=1,\dots,r} p_i$, then we have the inequality

(5.7) $0 \le (1 - \alpha)\left(H^{[\alpha]}(X) - H(X)\right) \le Q \cdot \dfrac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}},$

for all $\alpha \in (0,1) \cup (1,\infty)$.

Proof. If $\alpha \in (0,1)$, then
$x_i := p_i^{\alpha-1} \in \left[p_M^{\alpha-1},\, p_m^{\alpha-1}\right],$
and if $\alpha \in (1,\infty)$, then
$x_i = p_i^{\alpha-1} \in \left[p_m^{\alpha-1},\, p_M^{\alpha-1}\right],$
for $i \in \{1, \dots, r\}$.

Applying Theorem 2.2 for $x_i := p_i^{\alpha-1}$ and $f(x) = -\ln x$, and taking into account that $f'(x) = -\frac{1}{x}$, we obtain
$(1 - \alpha)\left(H^{[\alpha]}(X) - H(X)\right) \le \begin{cases} Q\left(p_m^{\alpha-1} - p_M^{\alpha-1}\right)\left(\dfrac{1}{p_M^{\alpha-1}} - \dfrac{1}{p_m^{\alpha-1}}\right) & \text{if } \alpha \in (0,1), \\[2mm] Q\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)\left(\dfrac{1}{p_m^{\alpha-1}} - \dfrac{1}{p_M^{\alpha-1}}\right) & \text{if } \alpha \in (1,\infty) \end{cases}$


$= \begin{cases} Q \cdot \dfrac{\left(p_m^{\alpha-1} - p_M^{\alpha-1}\right)^2}{p_m^{\alpha-1} p_M^{\alpha-1}} & \text{if } \alpha \in (0,1), \\[2mm] Q \cdot \dfrac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}} & \text{if } \alpha \in (1,\infty) \end{cases} \; = \; Q \cdot \dfrac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}}$

for all $\alpha \in (0,1) \cup (1,\infty)$, and the theorem is proved.
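The Rényi-Shannon bound (5.7) can also be exercised numerically; the grid of $\alpha$ values, the sampling, and the brute-force computation of $S$ below are our illustrative assumptions:

```python
import math
import random

def renyi(p, alpha):
    """Renyi entropy of order alpha, as in (5.4)."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def shannon(p):
    """Shannon entropy, as in (5.3)."""
    return -sum(pi * math.log(pi) for pi in p)

def best_split(p):
    """Brute-force the subset sum of p closest to 1/2 (the set S of (2.4))."""
    return min((abs(sum(p[i] for i in range(len(p)) if (m >> i) & 1) - 0.5),
                sum(p[i] for i in range(len(p)) if (m >> i) & 1))
               for m in range(2 ** len(p)))[1]

random.seed(7)
for alpha in (0.5, 2.0, 3.5):
    for _ in range(200):
        r = random.randint(2, 7)
        w = [random.uniform(0.05, 1.0) for _ in range(r)]
        p = [wi / sum(w) for wi in w]
        s = best_split(p)
        Q = s * (1.0 - s)
        u, v = max(p) ** (alpha - 1), min(p) ** (alpha - 1)
        lhs = (1.0 - alpha) * (renyi(p, alpha) - shannon(p))
        assert -1e-10 <= lhs <= Q * (u - v) ** 2 / (u * v) + 1e-10
```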

Using a similar argument to the one in Theorem 5.3, we can state the following direct application of Theorem 2.2.

Theorem 5.4. Let $(p_i)_{i=1,\dots,r}$ be as in Theorem 5.3. Then we have the inequality

(5.8) $0 \le (1 - \alpha)H^{[\alpha]}(X) - \ln r - \alpha \ln G_r(p) \le Q \cdot \dfrac{\left(p_M^{\alpha-1} - p_m^{\alpha-1}\right)^2}{p_M^{\alpha-1} p_m^{\alpha-1}},$

for all $\alpha \in (0,1) \cup (1,\infty)$.

Remark 5.1. The above results improve the corresponding results from [5] and [4], since the constant $Q$ is at most $\frac{1}{4}$.

Acknowledgement 1. The authors would like to thank the anonymous referee for valuable comments and for the references [26] and [22].


References

[1] A. RÉNYI, On measures of entropy and information, Proc. Fourth Berkeley Symp. Math. Statist. Prob., 1 (1961), 547–561, Univ. of California Press, Berkeley.

[2] D. ANDRICA AND C. BADEA, Grüss' inequality for positive linear functionals, Periodica Math. Hung., 19(2) (1988), 155–167.

[3] G.T. CARGO AND O. SHISHA, A metric space connected with general means, J. Approx. Th., 2 (1969), 207–222.

[4] S.S. DRAGOMIR, A converse of the Jensen inequality for convex mappings of several variables and applications. (Electronic Preprint: http://matilda.vu.edu.au/~rgmia/InfTheory/ConverseJensen.dvi)

[5] S.S. DRAGOMIR, A converse result for Jensen's discrete inequality via Grüss' inequality and applications in information theory, Analele Univ. Oradea, Fasc. Math., 7 (1999-2000), 178–189.

[6] S.S. DRAGOMIR, A further improvement of Jensen's inequality, Tamkang J. Math., 25(1) (1994), 29–36.

[7] S.S. DRAGOMIR, A generalisation of Grüss's inequality in inner product spaces and applications, J. Math. Anal. Appl., 237 (1999), 74–82.

[8] S.S. DRAGOMIR, A new improvement of Jensen's inequality, Indian J. Pure Appl. Math., 26(10) (1995), 959–968.
