# Analyzing pooled sequenced data with selection

## Mean & covariance of random vectors

Define: $\displaystyle \vec{X} = (X_1, X_2,...,X_m)$ and $\displaystyle \vec{Y} = (Y_1, Y_2,...,Y_n)$

$\displaystyle E[\vec{X}] = (E[X_1], E[X_2], ..., E[X_m])$

$\displaystyle Cov(\vec{X},\vec{Y}) = E[(\vec{X}-E[\vec{X}])(\vec{Y}-E[\vec{Y}])^T]$, an $\displaystyle m \times n$ matrix with $\displaystyle i,j$ element equal to $\displaystyle Cov(X_i,Y_j)$

Suppose $\displaystyle A$ is a $\displaystyle p \times m$ matrix, $\displaystyle b \in R^p$, $\displaystyle C$ is a $\displaystyle q \times n$ matrix, and $\displaystyle d \in R^q$, with $\displaystyle A = (a_{i,j})$ and $\displaystyle b = (b_i)$.

• Claim: $\displaystyle E[AX + b] = AE[X] + b$

Proof: The $i$th element of $\displaystyle AX + b$ is $\displaystyle \sum_{j=1}^m a_{i,j}X_j + b_i$, so

$\displaystyle E[\sum_{j=1}^m a_{i,j}X_j + b_i] = \sum_{j=1}^m a_{i,j}E[X_j] + b_i =$ ith element of $\displaystyle AE[X] + b$

• Claim: Suppose $\displaystyle Cov(X,Y) = V = (v_{i,j})$; then $\displaystyle Cov(AX+b, CY+d) = AVC^T$

Proof: Consider the $\displaystyle i$th element of $\displaystyle AX+b$ and the $\displaystyle j$th element of $\displaystyle CY+d$, then take the covariance of the pair. By bilinearity of the one-dimensional covariance (the constants $\displaystyle b_i$ and $\displaystyle d_j$ drop out), this works out to $\displaystyle \sum_k \sum_l a_{i,k} v_{k,l} c_{j,l} = (AVC^T)_{i,j}$.
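
This identity is easy to check numerically. The sketch below (Python/NumPy; all names and dimensions are illustrative) exploits the fact that the *sample* cross-covariance is bilinear in exactly the same way, so the two sides agree up to floating-point rounding:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw correlated samples: rows of X and Y are paired observations.
m, n, N = 3, 2, 10_000
X = rng.standard_normal((N, m))
Y = X[:, :n] + 0.5 * rng.standard_normal((N, n))  # correlated with X

A = rng.standard_normal((4, m))  # p x m, with p = 4
b = rng.standard_normal(4)
C = rng.standard_normal((5, n))  # q x n, with q = 5
d = rng.standard_normal(5)

def cross_cov(U, V):
    """Sample cross-covariance: (i, j) entry estimates Cov(U_i, V_j)."""
    Uc = U - U.mean(axis=0)
    Vc = V - V.mean(axis=0)
    return Uc.T @ Vc / (len(U) - 1)

V = cross_cov(X, Y)                        # sample version of Cov(X, Y)
lhs = cross_cov(X @ A.T + b, Y @ C.T + d)  # sample Cov(AX + b, CY + d)
rhs = A @ V @ C.T
err = np.abs(lhs - rhs).max()
print(err)  # tiny: the shifts b, d cancel and A, C factor out exactly
```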

• Claim: The covariance matrix $\displaystyle \Sigma = Cov(X,X)$ is positive semi-definite

Proof: $\displaystyle \forall b \in R^n,$

$\displaystyle 0 \le Var(b^TX) = Cov(b^TX, b^TX) = b^TCov(X,X)(b^T)^T = b^T \Sigma b$

Since $\displaystyle \Sigma$ is also symmetric (because $\displaystyle Cov(X_i,X_j) = Cov(X_j,X_i)$), $\displaystyle \Sigma$ is by definition positive semi-definite.
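
A minimal numerical sanity check of this claim (Python/NumPy, illustrative names): any matrix of the form $\displaystyle \Sigma = AA^T$ is symmetric psd, since $\displaystyle b^T\Sigma b = \|A^Tb\|^2 \ge 0$, and psd is equivalent to all eigenvalues being nonnegative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sigma = A A^T is symmetric and b^T Sigma b = ||A^T b||^2 >= 0.
A = rng.standard_normal((4, 4))
Sigma = A @ A.T

eigvals = np.linalg.eigvalsh(Sigma)  # symmetric matrix => real eigenvalues
psd = eigvals.min() >= -1e-12        # psd <=> all eigenvalues nonnegative
print(psd)

# Spot-check the quadratic form directly for random b:
quad_ok = all(b @ Sigma @ b >= -1e-12
              for b in rng.standard_normal((1000, 4)))
print(quad_ok)
```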

## Useful results of multivariate normals

• Definition of Multivariate normal

$\displaystyle \vec{X} = (X_1, X_2,...,X_n)$ is multivariate normal if $\displaystyle \vec{X}$ can be written as:

$\displaystyle \vec{X} = AZ+b$ for some non-random matrix $\displaystyle A$ and non-random vector $\displaystyle b$, where $\displaystyle \vec{Z} = (Z_1, Z_2, ..., Z_n)$ with $\displaystyle Z_1, ..., Z_n$ iid $\displaystyle N(0,1)$.

From previous results, $\displaystyle E[X] = A\vec{0} + b = b$ and $\displaystyle Cov(X,X) = AIA^T = AA^T = \Sigma$ .
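
Both facts can be confirmed by simulation; the sketch below (Python/NumPy, all parameters illustrative) builds $\displaystyle X = AZ + b$ from iid standard normals and compares the sample mean and covariance against $\displaystyle b$ and $\displaystyle AA^T$.

```python
import numpy as np

rng = np.random.default_rng(2)

n, N = 3, 200_000
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

Z = rng.standard_normal((N, n))  # each row is an iid N(0, I) vector
X = Z @ A.T + b                  # row-wise X = AZ + b

mean_err = np.abs(X.mean(axis=0) - b).max()    # E[X] = b
cov_err = np.abs(np.cov(X.T) - A @ A.T).max()  # Cov(X, X) = A A^T
print(mean_err, cov_err)  # both shrink like 1/sqrt(N)
```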

• Density of MVN

Defining $\displaystyle \Sigma = AA^T$ (and assuming $\displaystyle A$ is invertible), we have $\displaystyle |\Sigma| = |AA^T| = |A||A^T| = |A|^2$ and $\displaystyle \Sigma^{-1} = (AA^T)^{-1} = (A^{-1})^TA^{-1}$

$\displaystyle f_X(x) = \frac{1}{(2\pi)^{n/2}|\Sigma|^{1/2}} \exp\left(-\frac{1}{2}(x-b)^T \Sigma^{-1} (x-b)\right)$
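
As a quick check of this density formula (Python/NumPy; the helper `mvn_pdf` is an illustrative name, not from the text): for $\displaystyle n = 1$, $\displaystyle b = 0$, $\displaystyle \Sigma = 1$ it must reduce to the standard normal density $\displaystyle 1/\sqrt{2\pi}$ at $\displaystyle x = 0$.

```python
import numpy as np

def mvn_pdf(x, mean, cov):
    """f_X(x) = (2 pi)^(-n/2) |Sigma|^(-1/2) exp(-(x - b)^T Sigma^{-1} (x - b) / 2)."""
    n = len(mean)
    diff = x - mean
    quad = diff @ np.linalg.solve(cov, diff)  # (x - b)^T Sigma^{-1} (x - b)
    norm = np.sqrt((2 * np.pi) ** n * np.linalg.det(cov))
    return np.exp(-0.5 * quad) / norm

# n = 1, b = 0, Sigma = 1 reduces to the standard normal density at 0:
val = mvn_pdf(np.array([0.0]), np.array([0.0]), np.array([[1.0]]))
print(np.isclose(val, 1 / np.sqrt(2 * np.pi)))  # True
```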