User:Hussein Alasadi/Notebook/stephens/2013/10/14
Analyzing pooled sequenced data with selection
Mean & Covariance of random vectors

Define [math]\displaystyle{ \vec{X} = (X_1, X_2, \ldots, X_m) }[/math] and [math]\displaystyle{ \vec{Y} = (Y_1, Y_2, \ldots, Y_n) }[/math]. Then [math]\displaystyle{ E[\vec{X}] = (E[X_1], E[X_2], \ldots, E[X_m]) }[/math], and [math]\displaystyle{ Cov(X,Y) = E[(X-E[X])(Y-E[Y])^T] }[/math] is the [math]\displaystyle{ m \times n }[/math] matrix whose [math]\displaystyle{ i,j }[/math] element equals [math]\displaystyle{ Cov(X_i,Y_j) }[/math]. Suppose [math]\displaystyle{ A }[/math] is a [math]\displaystyle{ p \times m }[/math] matrix, [math]\displaystyle{ b \in R^p }[/math], [math]\displaystyle{ C }[/math] is a [math]\displaystyle{ q \times n }[/math] matrix, and [math]\displaystyle{ d \in R^q }[/math], with [math]\displaystyle{ A = (a_{i,j}), b = (b_i) }[/math].
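The elementwise definition of the covariance matrix can be sanity-checked against the matrix formula with a small numpy sketch (sizes, seed, and variable names here are illustrative choices, not from the notebook):

```python
import numpy as np

# Hypothetical example: m = 3, n = 2, 100 draws of each random vector.
rng = np.random.default_rng(3)
X = rng.normal(size=(100, 3))
Y = rng.normal(size=(100, 2))

# Matrix form: Cov(X, Y) = E[(X - E[X])(Y - E[Y])^T], estimated from the sample
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
C = Xc.T @ Yc / (len(X) - 1)   # m x n = 3 x 2

# Element (i, j) agrees with the scalar covariance Cov(X_i, Y_j)
for i in range(3):
    for j in range(2):
        assert np.isclose(C[i, j], np.cov(X[:, i], Y[:, j])[0, 1])
```

The `(n-1)` denominator matches numpy's default `ddof=1` in `np.cov`, so the two computations agree exactly.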
Claim: [math]\displaystyle{ E[AX + b] = AE[X] + b }[/math]. Proof: the ith element of [math]\displaystyle{ AX + b }[/math] is [math]\displaystyle{ \sum_{j=1}^m a_{i,j}X_j + b_i }[/math], so by linearity of expectation [math]\displaystyle{ E[\sum_{j=1}^m a_{i,j}X_j + b_i] = \sum_{j=1}^m a_{i,j}E[X_j] + b_i = }[/math] the ith element of [math]\displaystyle{ AE[X] + b }[/math].
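The linearity identity above can be verified exactly on a small discrete distribution, where expectations are finite sums (a sketch; the support points, probabilities, and matrices below are made up for illustration):

```python
import numpy as np

# Hypothetical discrete distribution: X takes 3 values in R^2 with given probabilities.
support = np.array([[1.0, 2.0], [0.0, -1.0], [3.0, 1.0]])  # rows are outcomes of X
probs = np.array([0.2, 0.5, 0.3])

A = np.array([[1.0, 2.0], [0.0, 1.0], [3.0, -1.0]])  # p x m with p = 3, m = 2
b = np.array([1.0, -2.0, 0.5])                        # b in R^p

EX = probs @ support                  # E[X] = sum_k p_k x_k
lhs = probs @ (support @ A.T + b)     # E[AX + b], computed outcome by outcome
rhs = A @ EX + b                      # A E[X] + b
assert np.allclose(lhs, rhs)
```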
Claim: [math]\displaystyle{ Cov(AX+b, CY+d) = AVC^T }[/math], where [math]\displaystyle{ V = Cov(X,Y) }[/math]. The proof first considers the ith element of [math]\displaystyle{ AX+b }[/math] and the jth element of [math]\displaystyle{ CY+d }[/math], then takes the covariance of these two scalars. By the properties of one-dimensional covariances (bilinearity, and constant shifts dropping out), the [math]\displaystyle{ i,j }[/math] element works out to [math]\displaystyle{ (AVC^T)_{i,j} }[/math].
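Because sample cross-covariance is also bilinear and shift-invariant, the identity holds exactly (up to float rounding) for sample estimates as well, which gives a quick check (a sketch with arbitrary sizes and seed):

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, m, n = 500, 3, 2
X = rng.normal(size=(n_samples, m))
Y = rng.normal(size=(n_samples, n))

def cross_cov(U, V):
    # Sample estimate of Cov(U, V) = E[(U - E[U])(V - E[V])^T]
    Uc = U - U.mean(axis=0)
    Vc = V - V.mean(axis=0)
    return Uc.T @ Vc / (len(U) - 1)

A = rng.normal(size=(4, m)); b = rng.normal(size=4)   # p = 4
C = rng.normal(size=(5, n)); d = rng.normal(size=5)   # q = 5

lhs = cross_cov(X @ A.T + b, Y @ C.T + d)   # Cov(AX + b, CY + d), a 4 x 5 matrix
rhs = A @ cross_cov(X, Y) @ C.T             # A V C^T
assert np.allclose(lhs, rhs)
```

The shifts `b` and `d` cancel in the centering step, exactly as in the proof.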
Claim: [math]\displaystyle{ \Sigma = Cov(X,X) }[/math] is symmetric and positive semi-definite. Proof: [math]\displaystyle{ \forall b \in R^n }[/math], [math]\displaystyle{ 0 \le Var(b^TX) = Cov(b^TX, b^TX) = b^T Cov(X,X) (b^T)^T = b^T \Sigma b }[/math]. And [math]\displaystyle{ \Sigma }[/math] is symmetric since [math]\displaystyle{ Cov(X_i,X_j) = Cov(X_j,X_i) }[/math]; thus by definition [math]\displaystyle{ \Sigma }[/math] is positive semi-definite.

Useful results of multivariate normals
[math]\displaystyle{ \vec{X} = (X_1, X_2, \ldots, X_n) }[/math] is multivariate normal if [math]\displaystyle{ \vec{X} }[/math] can be written as [math]\displaystyle{ \vec{X} = AZ + b }[/math] for some non-random matrix [math]\displaystyle{ A }[/math] and non-random vector [math]\displaystyle{ b }[/math], where [math]\displaystyle{ \vec{Z} = (Z_1, Z_2, \ldots, Z_n) }[/math] with [math]\displaystyle{ Z_1, \ldots, Z_n }[/math] i.i.d. [math]\displaystyle{ N(0,1) }[/math].
Defining [math]\displaystyle{ \Sigma = AA^T }[/math] and assuming [math]\displaystyle{ A }[/math] is invertible, we have [math]\displaystyle{ |\Sigma| = |AA^T| = |A||A^T| = |A|^2 }[/math] and [math]\displaystyle{ \Sigma^{-1} = (AA^T)^{-1} = (A^{-1})^T A^{-1} }[/math]. The density is [math]\displaystyle{ f_X(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left( -\frac{1}{2} (x-b)^T \Sigma^{-1} (x-b) \right) }[/math].
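These identities can be checked numerically with a change-of-variables argument: since [math]\displaystyle{ X = AZ + b }[/math], the densities must satisfy [math]\displaystyle{ f_X(Az+b) = f_Z(z)/|\det A| }[/math]. A small numpy sketch (dimension, seed, and function names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A = rng.normal(size=(n, n))    # invertible with probability 1
b = rng.normal(size=n)
Sigma = A @ A.T                # covariance of X = AZ + b

# |Sigma| = |A|^2, and Sigma is positive semi-definite
assert np.isclose(np.linalg.det(Sigma), np.linalg.det(A) ** 2)
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-10)

def f_X(x):
    # (2*pi)^(-n/2) |Sigma|^(-1/2) exp(-(x - b)^T Sigma^{-1} (x - b) / 2)
    diff = x - b
    quad = diff @ np.linalg.solve(Sigma, diff)
    return np.exp(-0.5 * quad) / np.sqrt((2 * np.pi) ** n * abs(np.linalg.det(Sigma)))

def f_Z(z):
    # standard normal density in R^n (iid N(0,1) coordinates)
    return np.exp(-0.5 * (z @ z)) / (2 * np.pi) ** (n / 2)

# Change of variables: f_X(Az + b) = f_Z(z) / |det A|
z = rng.normal(size=n)
assert np.isclose(f_X(A @ z + b), f_Z(z) / abs(np.linalg.det(A)))
```

The quadratic form collapses because [math]\displaystyle{ (Az)^T \Sigma^{-1} (Az) = z^T A^T (A^{-1})^T A^{-1} A z = z^T z }[/math], matching the inverse formula above.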