User:Jarle Pahr/SVD
Revision as of 06:59, 15 February 2014
Notes on Singular Value Decomposition (SVD):
See also http://en.wikipedia.org/wiki/Singular_value_decomposition
Eigenvectors and eigenvalues:
An eigenvector of a matrix A is a non-zero vector [math]\displaystyle{ \overrightarrow v }[/math] that satisfies the equation
[math]\displaystyle{ A\overrightarrow v = \lambda \overrightarrow v }[/math]
where [math]\displaystyle{ \lambda }[/math] is a scalar, called an eigenvalue of A.
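As a quick numerical check of this definition, NumPy's `numpy.linalg.eig` returns the eigenvalues and eigenvectors of a matrix (the 2×2 matrix below is an arbitrary example, not one from the text):

```python
import numpy as np

# An arbitrary symmetric 2x2 example matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues and a matrix whose *columns* are the eigenvectors
eigvals, eigvecs = np.linalg.eig(A)

# Verify A v = lambda v for each eigenpair
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(eigvals)  # eigenvalues 3 and 1 (order not guaranteed)
```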
Singular values:
Any m×n matrix A can be factored as:
[math]\displaystyle{ A = U\Sigma {V^T} }[/math] where:
- [math]\displaystyle{ U }[/math] is an m×m orthogonal matrix whose columns are the eigenvectors of [math]\displaystyle{ AA^T }[/math]
- [math]\displaystyle{ V }[/math] is an n×n orthogonal matrix whose columns are the eigenvectors of [math]\displaystyle{ A^{T}A }[/math]
- [math]\displaystyle{ \Sigma }[/math] is an m×n diagonal matrix whose first [math]\displaystyle{ r }[/math] diagonal elements (r being the rank of A) are the square roots of the nonzero eigenvalues of [math]\displaystyle{ A^{T}A }[/math]; these are the singular values of A. Singular values are always real and positive.
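The factorization and the properties above can be verified with NumPy on a small example matrix (chosen here purely for illustration):

```python
import numpy as np

# A small 3x2 example matrix (m=3, n=2), chosen for illustration
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

# full_matrices=True gives U (m x m), s (min(m,n) singular values), Vt (n x n)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n diagonal matrix Sigma from the singular values
Sigma = np.zeros(A.shape)
np.fill_diagonal(Sigma, s)

# A = U Sigma V^T
assert np.allclose(A, U @ Sigma @ Vt)

# U and V are orthogonal
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(Vt @ Vt.T, np.eye(2))

# The singular values are square roots of the eigenvalues of A^T A
assert np.allclose(sorted(s**2), sorted(np.linalg.eigvalsh(A.T @ A)))
```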
For any SVD, the following facts apply:
- The rank of a matrix A equals the number of nonzero singular values of A.
- The column space of A is spanned by the first r columns of U.
- The null space of A is spanned by the last n − r columns of V.
- The row space of A is spanned by the first r columns of V.
- The null space of [math]\displaystyle{ A^T }[/math] is spanned by the last m − r columns of U.
The columns of [math]\displaystyle{ U }[/math] are called the left singular vectors, and the columns of [math]\displaystyle{ V }[/math] the right singular vectors.
SVD in NumPy
http://docs.scipy.org/doc/numpy/reference/generated/numpy.linalg.svd.html
To find the rank of a matrix A, assign u, s, vh = numpy.linalg.svd(A) and count the singular values larger than a small tolerance. Note that len(s) is always min(m, n), so it does not give the rank when A is rank-deficient.
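A sketch of this on a deliberately rank-deficient example matrix, using the same tolerance convention as NumPy's built-in `numpy.linalg.matrix_rank`:

```python
import numpy as np

# A rank-deficient 3x3 example: the third row is the sum of the first two
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

u, s, vh = np.linalg.svd(A)

# len(s) is min(m, n) = 3, but one singular value is numerically zero;
# count only values above a scale-dependent tolerance
tol = max(A.shape) * np.finfo(A.dtype).eps * s[0]
r = int(np.sum(s > tol))

print(len(s), r)                      # prints: 3 2
assert r == np.linalg.matrix_rank(A)  # NumPy's built-in uses the same idea
```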
Applications to metabolic networks
Given a metabolic network with stoichiometric matrix [math]\displaystyle{ S }[/math], the change in metabolite concentrations [math]\displaystyle{ x }[/math] as a function of the reaction rates [math]\displaystyle{ v }[/math] is given by the equation:
[math]\displaystyle{ Sv = \frac{{dx}}{{dt}} }[/math]
Applying the SVD to S, we get:
[math]\displaystyle{ {U^T}\frac{{dx}}{{dt}} = \Sigma {V^T}v }[/math]
[math]\displaystyle{ \frac{{d(u_k^Tx)}}{{dt}} = {\sigma _k}(v_k^Tv),\,\,k = 1,...,r }[/math]
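This decoupling into independent modes can be checked numerically. The stoichiometric matrix below is a made-up toy network (a simple linear pathway), not one taken from the references:

```python
import numpy as np

# Toy stoichiometric matrix S (2 metabolites x 3 reactions), made up for illustration:
# a linear pathway  -> x1 -> x2 ->
S = np.array([[ 1.0, -1.0,  0.0],
              [ 0.0,  1.0, -1.0]])

# An arbitrary flux vector v
v = np.array([1.0, 0.5, 0.25])

U, s, Vt = np.linalg.svd(S, full_matrices=True)
Sigma = np.zeros(S.shape)
np.fill_diagonal(Sigma, s)

dxdt = S @ v

# Mode k obeys  d(u_k^T x)/dt = sigma_k (v_k^T v):
# the left side is U^T dx/dt, the right side is Sigma V^T v
lhs = U.T @ dxdt
rhs = Sigma @ Vt @ v
assert np.allclose(lhs, rhs)
```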
The columns of U, called the left singular vectors and denoted [math]\displaystyle{ u_k }[/math], form linear combinations of the concentration variables. Similarly, the columns of V, called the right singular vectors and denoted [math]\displaystyle{ v_k }[/math], form linear combinations of the fluxes.
[math]\displaystyle{ u_k^Tx = {u_{k1}}{x_1} + {u_{k2}}{x_2} + ... + {u_{km}}{x_m} }[/math]
[math]\displaystyle{ v_k^Tv = {v_{k1}}{v_1} + {v_{k2}}{v_2} + ... + {v_{kn}}{v_n} }[/math]
Each mode can be interpreted as a pseudo-reaction that converts the combination of metabolites with negative coefficients into the combination with positive coefficients, with forward and reverse rates given by the corresponding combinations of fluxes:
[math]\displaystyle{ \sum_{\text{for } u_{ki} \lt 0} u_{ki}x_{i} \underset{\sum\limits_{\text{for } v_{kj} \lt 0} v_{kj}v_{j}}{\overset{\sum\limits_{\text{for } v_{kj} \gt 0} v_{kj}v_{j}}{\rightleftharpoons}} \sum_{\text{for } u_{ki} \gt 0} u_{ki}x_{i} }[/math]
Attention: The right singular vectors [math]\displaystyle{ v_k }[/math] should not be confused with the flux vector [math]\displaystyle{ v }[/math].
Links
http://www.uwlax.edu/faculty/will/svd/
math.stackexchange.com/questions/261801/how-can-you-explain-the-singular-value-decomposition-to-non-specialists
http://www.ams.org/samplings/feature-column/fcarc-svd
http://campar.in.tum.de/twiki/pub/Chair/TeachingWs05ComputerVision/3DCV_svd_000.pdf
http://langvillea.people.cofc.edu/DISSECTION-LAB/Emmie%27sLSI-SVDModule/p4module.html
Finding the null space of a matrix
http://www.math.odu.edu/~bogacki/cgi-bin/lat.cgi
https://github.com/amilsted/evoMPS/blob/master/evoMPS/nullspace.py
http://wiki.scipy.org/Cookbook/RankNullspace
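The SVD-based approach used in the links above amounts to taking the rows of [math]\displaystyle{ V^T }[/math] beyond the numerical rank, since they span the null space of A. A minimal sketch (the function name and tolerance defaults are this sketch's own choices):

```python
import numpy as np

def nullspace(A, atol=1e-13, rtol=0.0):
    """Return an orthonormal basis (as columns) for the null space of A via SVD."""
    A = np.atleast_2d(A)
    u, s, vh = np.linalg.svd(A)
    tol = max(atol, rtol * s[0])
    # Rows of vh beyond the numerical rank span the null space of A
    rank = int(np.sum(s > tol))
    return vh[rank:].conj().T

# Example: a rank-1 matrix with 3 columns, so the null space has dimension 2
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
N = nullspace(A)
assert N.shape == (3, 2)
assert np.allclose(A @ N, 0)
```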
Bibliography
Palsson, B.Ø., 2006. Systems Biology: Properties of Reconstructed Networks. Cambridge University Press.
Famili, I. & Palsson, B.Ø., 2003. Journal of Theoretical Biology 224, 87–96.