
Schur product theorem

In mathematics, particularly in linear algebra, the Schur product theorem states that the Hadamard product of two positive definite matrices is also a positive definite matrix. The result is named after Issai Schur (Schur 1911, p. 14, Theorem VII); note that Schur signed as J. Schur in Journal für die reine und angewandte Mathematik.

Proof using the trace formula

For any matrices $M$ and $N$, the Hadamard product $M \circ N$ considered as a bilinear form acts on vectors $a, b$ as

$$ a^{*}(M \circ N)b = \operatorname{tr}\left(M^{\mathsf{T}} \operatorname{diag}(a^{*})\, N \operatorname{diag}(b)\right), $$

where $\operatorname{tr}$ is the matrix trace and $\operatorname{diag}(a)$ is the diagonal matrix having as diagonal entries the elements of $a$.

Suppose $M$ and $N$ are positive definite, and so Hermitian. We can consider their square roots $M^{\frac{1}{2}}$ and $N^{\frac{1}{2}}$, which are also Hermitian, and write

$$ \operatorname{tr}\left(M^{\mathsf{T}} \operatorname{diag}(a^{*})\, N \operatorname{diag}(b)\right) = \operatorname{tr}\left(\overline{M}^{\frac{1}{2}}\, \overline{M}^{\frac{1}{2}} \operatorname{diag}(a^{*})\, N^{\frac{1}{2}} N^{\frac{1}{2}} \operatorname{diag}(b)\right) = \operatorname{tr}\left(\overline{M}^{\frac{1}{2}} \operatorname{diag}(a^{*})\, N^{\frac{1}{2}} N^{\frac{1}{2}} \operatorname{diag}(b)\, \overline{M}^{\frac{1}{2}}\right). $$

Then, for $a = b$, this is written as $\operatorname{tr}\left(A^{*}A\right)$ for $A = N^{\frac{1}{2}} \operatorname{diag}(a)\, \overline{M}^{\frac{1}{2}}$, and thus is strictly positive for $A \neq 0$, which occurs if and only if $a \neq 0$. This shows that $M \circ N$ is a positive definite matrix.

Proof using Gaussian integration

Case of $M = N$: Let $X$ be an $n$-dimensional centered Gaussian random variable with covariance $\langle X_{i} X_{j} \rangle = M_{ij}$. Then the covariance matrix of $X_{i}^{2}$ and $X_{j}^{2}$ is

$$ \operatorname{Cov}\left(X_{i}^{2}, X_{j}^{2}\right) = \left\langle X_{i}^{2} X_{j}^{2} \right\rangle - \left\langle X_{i}^{2} \right\rangle \left\langle X_{j}^{2} \right\rangle. $$

Using Wick's theorem to develop $\left\langle X_{i}^{2} X_{j}^{2} \right\rangle = 2 \left\langle X_{i} X_{j} \right\rangle^{2} + \left\langle X_{i}^{2} \right\rangle \left\langle X_{j}^{2} \right\rangle$, we have

$$ \operatorname{Cov}\left(X_{i}^{2}, X_{j}^{2}\right) = 2 \left\langle X_{i} X_{j} \right\rangle^{2} = 2 M_{ij}^{2}. $$

Since a covariance matrix is positive definite, this proves that the matrix with elements $M_{ij}^{2}$ is a positive definite matrix.

Case of $M \neq N$: Let $X$ and $Y$ be $n$-dimensional centered Gaussian random variables with covariances $\langle X_{i} X_{j} \rangle = M_{ij}$, $\langle Y_{i} Y_{j} \rangle = N_{ij}$, independent from each other, so that we have

$$ \langle X_{i} Y_{j} \rangle = 0 \quad \text{for any } i, j. $$

Then the covariance matrix of $X_{i} Y_{i}$ and $X_{j} Y_{j}$ is

$$ \operatorname{Cov}\left(X_{i} Y_{i}, X_{j} Y_{j}\right) = \left\langle X_{i} Y_{i} X_{j} Y_{j} \right\rangle - \left\langle X_{i} Y_{i} \right\rangle \left\langle X_{j} Y_{j} \right\rangle = \left\langle X_{i} X_{j} \right\rangle \left\langle Y_{i} Y_{j} \right\rangle = M_{ij} N_{ij}. $$

Since a covariance matrix is positive definite, this proves that the matrix with elements $M_{ij} N_{ij}$ is a positive definite matrix.
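As a quick illustrative check (not part of the original article), the following NumPy sketch verifies the statements above numerically: it confirms that the Hadamard product of two randomly generated positive definite matrices has strictly positive eigenvalues, checks the trace identity used in the first proof, and estimates the covariance of the vector $(X_i Y_i)_i$ by sampling to compare with $M \circ N$. The matrix size, sample count, and all variable names are illustrative assumptions.

```python
# Illustrative sketch (not from the article): numerically checks the Schur
# product theorem and the identities used in the two proofs above.
import numpy as np

rng = np.random.default_rng(0)
n = 5

def random_pd(n, complex_entries=True):
    """Random Hermitian positive definite n x n matrix: A A* is positive
    semi-definite, and the added multiple of the identity makes it safely
    positive definite."""
    A = rng.standard_normal((n, n))
    if complex_entries:
        A = A + 1j * rng.standard_normal((n, n))
    return A @ A.conj().T + n * np.eye(n)

M = random_pd(n)
N = random_pd(n)
H = M * N                      # Hadamard (entrywise) product M ∘ N

# 1) Schur product theorem: every eigenvalue of M ∘ N is strictly positive.
eigs = np.linalg.eigvalsh(H)   # H is Hermitian, so eigvalsh applies
assert np.all(eigs > 0)

# 2) Trace formula from the first proof:
#    a* (M ∘ N) b = tr(M^T diag(a*) N diag(b)).
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)
lhs = a.conj() @ H @ b
rhs = np.trace(M.T @ np.diag(a.conj()) @ N @ np.diag(b))
assert np.isclose(lhs, rhs)

# 3) Gaussian-integration viewpoint (real case): for independent centered
#    Gaussians X ~ N(0, Mr) and Y ~ N(0, Nr), the vector (X_i Y_i)_i has
#    covariance Mr ∘ Nr, so its sample covariance should approach Mr * Nr.
Mr = random_pd(n, complex_entries=False)
Nr = random_pd(n, complex_entries=False)
X = rng.multivariate_normal(np.zeros(n), Mr, size=200_000)
Y = rng.multivariate_normal(np.zeros(n), Nr, size=200_000)
sample_cov = np.cov(X * Y, rowvar=False)

print("smallest eigenvalue of M ∘ N:", eigs.min())
print("max sampling error vs. Mr ∘ Nr:", np.abs(sample_cov - Mr * Nr).max())
```

The eigenvalue check mirrors the theorem's statement, while the trace and sampling checks mirror the two proofs; any deviation reported in the last line reflects Monte Carlo error from the finite sample, not a failure of the theorem.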

[ "Schur decomposition" ]