Marcinkiewicz–Zygmund inequality

In mathematics, the Marcinkiewicz–Zygmund inequality, named after Józef Marcinkiewicz and Antoni Zygmund, gives relations between moments of a collection of independent random variables. It generalizes the rule for the sum of variances of independent random variables to moments of arbitrary order. It is also the special case of the Burkholder–Davis–Gundy inequality for discrete-time martingales.

Statement of the inequality

Theorem. If $X_i$, $i = 1, \ldots, n$, are independent random variables such that $E(X_i) = 0$ and $E(|X_i|^p) < +\infty$ for some $1 \leq p < +\infty$, then

$$A_p \, E\left( \left( \sum_{i=1}^{n} |X_i|^2 \right)^{p/2} \right) \;\leq\; E\left( \left| \sum_{i=1}^{n} X_i \right|^p \right) \;\leq\; B_p \, E\left( \left( \sum_{i=1}^{n} |X_i|^2 \right)^{p/2} \right)$$

where $A_p$ and $B_p$ are positive constants, which depend only on $p$ and not on the underlying distribution of the random variables involved.
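
As a quick numerical illustration (a sketch, not part of the original article), the following Python snippet estimates both sides of the inequality by Monte Carlo for an arbitrary choice of $p = 4$ and standard normal summands; the theorem guarantees that the printed ratio stays bounded between $A_4$ and $B_4$ no matter what $n$ or the distribution is.

```python
import numpy as np

# Monte Carlo illustration of the Marcinkiewicz–Zygmund inequality
# for p = 4 (an arbitrary choice). This only estimates the two
# quantities involved; it is a sanity check, not a proof.
rng = np.random.default_rng(0)
n, p, trials = 10, 4.0, 100_000

# Independent, mean-zero variables; standard normals are an assumption
# made for this example only.
X = rng.standard_normal((trials, n))

middle = np.mean(np.abs(X.sum(axis=1)) ** p)        # E|sum X_i|^p
bracket = np.mean((X ** 2).sum(axis=1) ** (p / 2))  # E(sum |X_i|^2)^(p/2)

# The theorem asserts A_p * bracket <= middle <= B_p * bracket,
# so the ratio below is bounded away from 0 and infinity uniformly in n.
print(f"E|S_n|^p / E(sum X_i^2)^(p/2) = {middle / bracket:.4f}")
```

For this particular setup the exact ratio is $3n^2 / (n^2 + 2n)$, so the printed value should be near $2.5$ for $n = 10$, approaching $3$ as $n$ grows.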

The second-order case

In the case $p = 2$, the inequality holds with $A_2 = B_2 = 1$, and it reduces to the rule for the sum of variances of independent random variables with zero mean, known from elementary statistics: if $E(X_i) = 0$ and $E(|X_i|^2) < +\infty$, then

$$\mathrm{Var}\left( \sum_{i=1}^{n} X_i \right) = E\left( \left| \sum_{i=1}^{n} X_i \right|^2 \right) = \sum_{i=1}^{n} \sum_{j=1}^{n} E\left( X_i \overline{X}_j \right) = \sum_{i=1}^{n} E\left( |X_i|^2 \right) = \sum_{i=1}^{n} \mathrm{Var}(X_i),$$

where the cross terms $E(X_i \overline{X}_j)$ with $i \neq j$ vanish because independence and the zero-mean assumption give $E(X_i \overline{X}_j) = E(X_i)\,E(\overline{X}_j) = 0$.
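
A short simulation (a sketch with an arbitrary distribution choice, not from the original article) confirms this identity empirically:

```python
import numpy as np

# Empirical check of Var(sum X_i) = sum Var(X_i) for independent,
# zero-mean variables; uniform(-1, 1) samples are an arbitrary choice.
rng = np.random.default_rng(1)
n, trials = 5, 1_000_000

X = rng.uniform(-1.0, 1.0, size=(trials, n))
lhs = X.sum(axis=1).var()   # Var(sum X_i), estimated
rhs = X.var(axis=0).sum()   # sum Var(X_i), estimated
print(f"Var of sum: {lhs:.4f}, sum of Vars: {rhs:.4f}")
```

Both estimates should be close to $n/3 \approx 1.667$, since a uniform variable on $(-1, 1)$ has variance $1/3$.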

See also

Several similar moment inequalities are known, such as the Khintchine inequality and the Rosenthal inequalities, and there are also extensions to more general symmetric statistics of independent random variables.

Notes

  1. J. Marcinkiewicz and A. Zygmund. Sur les fonctions indépendantes. Fund. Math., 28:60–90, 1937. Reprinted in Józef Marcinkiewicz, Collected papers, edited by Antoni Zygmund, Państwowe Wydawnictwo Naukowe, Warsaw, 1964, pp. 233–259.
  2. Yuan Shih Chow and Henry Teicher. Probability theory. Independence, interchangeability, martingales. Springer-Verlag, New York, second edition, 1988.
  3. R. Ibragimov and Sh. Sharakhmetov. Analogues of Khintchine, Marcinkiewicz–Zygmund and Rosenthal inequalities for symmetric statistics. Scandinavian Journal of Statistics, 26(4):621–633, 1999.