User:Timothee Flutre/Notebook/Postdoc/2011/06/28

From OpenWetWare


Calculate OLS estimates with summary statistics for simple linear regression

We obtained data from n individuals. Let y_1,\ldots,y_n be the (quantitative) phenotypes (e.g. expression levels at a given gene), and g_1,\ldots,g_n the genotypes at a given SNP.

We want to assess the linear relationship between phenotype and genotype. For this we use a simple linear regression:

y_i = \mu + \beta g_i + \epsilon_i with \epsilon_i \sim N(0,\sigma^2), for i \in \{1,\ldots,n\}

In vector-matrix notation:

y = X\theta + \epsilon with \epsilon \sim N_n(0,\sigma^2 I_n) and \theta^T = (\mu,\beta)
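As an illustrative sketch (in Python with NumPy; the sample size, allele frequency, and parameter values below are arbitrary assumptions of mine, not from the notebook), we can simulate data under this model:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 500                # number of individuals (arbitrary)
mu, beta = 4.0, 0.5    # intercept and genotype effect (arbitrary)
sigma = 1.0            # error standard deviation (arbitrary)

# genotypes coded as minor-allele counts 0/1/2, allele frequency 0.3 (arbitrary)
g = rng.binomial(2, 0.3, size=n).astype(float)

# phenotypes from the linear model y_i = mu + beta*g_i + eps_i, eps_i ~ N(0, sigma^2)
y = mu + beta * g + rng.normal(0.0, sigma, size=n)
```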

Here is the ordinary least squares (OLS) estimator of θ:

\hat{\theta} = (X^T X)^{-1} X^T y

\begin{bmatrix} \hat{\mu} \\ \hat{\beta} \end{bmatrix} =
\left( \begin{bmatrix} 1 & \ldots & 1 \\ g_1 & \ldots & g_n \end{bmatrix}
\begin{bmatrix} 1 & g_1 \\ \vdots & \vdots \\ 1 & g_n \end{bmatrix} \right)^{-1}
\begin{bmatrix} 1 & \ldots & 1 \\ g_1 & \ldots & g_n \end{bmatrix}
\begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}

\begin{bmatrix} \hat{\mu} \\ \hat{\beta} \end{bmatrix} =
\begin{bmatrix} n & \sum_i g_i \\ \sum_i g_i & \sum_i g_i^2 \end{bmatrix}^{-1}
\begin{bmatrix} \sum_i y_i \\ \sum_i g_i y_i \end{bmatrix}

\begin{bmatrix} \hat{\mu} \\ \hat{\beta} \end{bmatrix} =
\frac{1}{n \sum_i g_i^2 - (\sum_i g_i)^2}
\begin{bmatrix} \sum_i g_i^2 & - \sum_i g_i \\ - \sum_i g_i & n \end{bmatrix}
\begin{bmatrix} \sum_i y_i \\ \sum_i g_i y_i \end{bmatrix}

\begin{bmatrix} \hat{\mu} \\ \hat{\beta} \end{bmatrix} =
\frac{1}{n \sum_i g_i^2 - (\sum_i g_i)^2}
\begin{bmatrix} \sum_i g_i^2 \sum_i y_i - \sum_i g_i \sum_i g_i y_i \\ - \sum_i g_i \sum_i y_i + n \sum_i g_i y_i \end{bmatrix}
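The closed-form expressions above can be checked numerically against a generic least-squares solver. A minimal sketch (simulated data and variable names are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
g = rng.binomial(2, 0.3, size=n).astype(float)
y = 4.0 + 0.5 * g + rng.normal(size=n)

# closed-form 2x2 inverse, term by term as in the derivation above
denom = n * np.sum(g**2) - np.sum(g)**2
mu_hat = (np.sum(g**2) * np.sum(y) - np.sum(g) * np.sum(g * y)) / denom
beta_hat = (-np.sum(g) * np.sum(y) + n * np.sum(g * y)) / denom

# generic solver on the design matrix X = [1 g]
X = np.column_stack([np.ones(n), g])
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Both routes should agree up to floating-point error.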

Let's now define four summary statistics, all very easy to compute:

\bar{y} = \frac{1}{n} \sum_{i=1}^n y_i

\bar{g} = \frac{1}{n} \sum_{i=1}^n g_i

g^T g = \sum_{i=1}^n g_i^2

g^T y = \sum_{i=1}^n g_i y_i

These allow us to obtain the estimate of the effect size with only the summary statistics available:

\hat{\beta} = \frac{g^T y - n \bar{g} \bar{y}}{g^T g - n \bar{g}^2}
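A minimal sketch of computing β̂ from the four summary statistics alone, checked against a full-data fit (simulated data and variable names are my assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
g = rng.binomial(2, 0.3, size=n).astype(float)
y = 4.0 + 0.5 * g + rng.normal(size=n)

# the four summary statistics
y_bar = y.mean()          # \bar{y}
g_bar = g.mean()          # \bar{g}
gtg = np.sum(g * g)       # g^T g
gty = np.sum(g * y)       # g^T y

# effect-size estimate from summary statistics only
beta_hat = (gty - n * g_bar * y_bar) / (gtg - n * g_bar**2)

# agrees with the full-data OLS fit
full = np.linalg.lstsq(np.column_stack([np.ones(n), g]), y, rcond=None)[0]
```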

The same works for the estimate of the error variance (here r = 2, the number of parameters), provided we also record y^T y = \sum_i y_i^2; expanding the residual sum of squares gives:

\hat{\sigma}^2 = \frac{1}{n-r}(y - X\hat{\theta})^T(y - X\hat{\theta}) = \frac{1}{n-2}\left(y^T y - n \hat{\mu} \bar{y} - \hat{\beta} g^T y\right)
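As a sketch, the error-variance estimate can be recovered from summary statistics too, provided y^T y = \sum_i y_i^2 is recorded in addition to the four statistics above (the expansion of the residual sum of squares, and the simulated data, are my own additions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
g = rng.binomial(2, 0.3, size=n).astype(float)
y = 4.0 + 0.5 * g + rng.normal(size=n)

# summary statistics (y^T y is needed in addition to the four defined above)
y_bar, g_bar = y.mean(), g.mean()
gtg, gty, yty = np.sum(g * g), np.sum(g * y), np.sum(y * y)

beta_hat = (gty - n * g_bar * y_bar) / (gtg - n * g_bar**2)
mu_hat = y_bar - beta_hat * g_bar   # from the first normal equation

# RSS = y^T y - n*mu_hat*y_bar - beta_hat*g^T y, i.e. (y - X theta_hat)^T (y - X theta_hat)
sigma2_hat = (yty - n * mu_hat * y_bar - beta_hat * gty) / (n - 2)

# check against the explicit residuals
resid = y - (mu_hat + beta_hat * g)
```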

We can also benefit from this for the variances (hence standard errors) of the parameter estimates:

V(\hat{\theta}) = \hat{\sigma}^2 (X^T X)^{-1}

V(\hat{\theta}) = \hat{\sigma}^2 \frac{1}{n g^T g - n^2 \bar{g}^2}
\begin{bmatrix} g^Tg & -n\bar{g} \\ -n\bar{g} & n \end{bmatrix}

V(\hat{\beta}) = \frac{\hat{\sigma}^2}{g^Tg - n\bar{g}^2}
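A final sketch: the standard error of β̂ from the same summary statistics, checked against the generic covariance σ̂² (X^T X)^{-1} (simulated data and names are my assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
g = rng.binomial(2, 0.3, size=n).astype(float)
y = 4.0 + 0.5 * g + rng.normal(size=n)

# summary statistics
y_bar, g_bar = y.mean(), g.mean()
gtg, gty, yty = np.sum(g * g), np.sum(g * y), np.sum(y * y)

beta_hat = (gty - n * g_bar * y_bar) / (gtg - n * g_bar**2)
mu_hat = y_bar - beta_hat * g_bar
sigma2_hat = (yty - n * mu_hat * y_bar - beta_hat * gty) / (n - 2)

# V(beta_hat) = sigma2_hat / (g^T g - n*g_bar^2); standard error is its square root
se_beta = np.sqrt(sigma2_hat / (gtg - n * g_bar**2))

# check against the generic covariance matrix sigma2_hat * (X^T X)^{-1}
X = np.column_stack([np.ones(n), g])
cov = sigma2_hat * np.linalg.inv(X.T @ X)
```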
