In mathematics, given a field $\mathbb{F}$, non-negative integers $m, n$, and a matrix $A \in \mathbb{F}^{m \times n}$, a rank decomposition or rank factorization of $A$ is a factorization of $A$ of the form $A = CF$, where $C \in \mathbb{F}^{m \times r}$ and $F \in \mathbb{F}^{r \times n}$, where $r$ is the rank of $A$.
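For example, the rank-one matrix
$$\begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix} = \begin{bmatrix} 1 \\ 2 \end{bmatrix} \begin{bmatrix} 1 & 2 \end{bmatrix}$$
is a rank factorization with $r = 1$: here $C$ is $2 \times 1$ and $F$ is $1 \times 2$.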
Existence
Every finite-dimensional matrix has a rank decomposition: Let $A$ be an $m \times n$ matrix whose column rank is $r$. Therefore, there are $r$ linearly independent columns in $A$; equivalently, the dimension of the column space of $A$ is $r$. Let $c_1, c_2, \ldots, c_r$ be any basis for the column space of $A$ and place them as column vectors to form the $m \times r$ matrix $C = \begin{bmatrix} c_1 & c_2 & \cdots & c_r \end{bmatrix}$. Therefore, every column vector of $A$ is a linear combination of the columns of $C$. To be precise, if $A = \begin{bmatrix} a_1 & a_2 & \cdots & a_n \end{bmatrix}$ is an $m \times n$ matrix with $a_j$ as the $j$-th column, then
$$a_j = f_{1j} c_1 + f_{2j} c_2 + \cdots + f_{rj} c_r,$$
where the $f_{ij}$'s are the scalar coefficients of $a_j$ in terms of the basis $c_1, c_2, \ldots, c_r$. This implies that $A = CF$, where $f_{ij}$ is the $(i, j)$-th element of $F$.
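This construction can be carried out directly in a computer algebra system. The following SymPy sketch (the matrix is the one from the Example section below; the use of `columnspace` and `pinv` is an illustrative choice, not prescribed by the text) builds $C$ from a column-space basis and recovers the coefficient matrix $F$:

```python
import sympy as sp

# The 4x4 matrix from the Example section below.
A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

# Stack a basis of the column space as the columns of C (m x r, here r = 3).
C = sp.Matrix.hstack(*A.columnspace())

# Every column of A lies in the column space of C, so C^+ A reproduces A
# exactly; the entries of F are the coefficients f_ij in that basis.
F = C.pinv() * A

assert C * F == A  # exact equality in rational arithmetic
```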
Non-uniqueness
If $A = C_1 F_1$ is a rank factorization, taking $C_2 = C_1 R$ and $F_2 = R^{-1} F_1$ gives another rank factorization for any invertible matrix $R$ of compatible dimensions.

Conversely, if $A = C_1 F_1 = C_2 F_2$ are two rank factorizations of $A$, then there exists an invertible matrix $R$ such that $C_1 = C_2 R$ and $F_1 = R^{-1} F_2$.
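This non-uniqueness is easy to check numerically; here is a minimal NumPy sketch (the random factors and the particular invertible $R$ are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)
C1 = rng.standard_normal((4, 2))   # 4x2 factor, full column rank almost surely
F1 = rng.standard_normal((2, 5))   # 2x5 factor, full row rank almost surely
A = C1 @ F1                        # a 4x5 matrix of rank 2

R = np.array([[2.0, 1.0],
              [0.0, 1.0]])         # any invertible 2x2 matrix
C2 = C1 @ R
F2 = np.linalg.solve(R, F1)        # R^{-1} F1, computed without forming R^{-1}

assert np.allclose(C2 @ F2, A)     # same product, different factors
```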
Construction
In practice, we can construct one specific rank factorization as follows: we can compute $B$, the reduced row echelon form of $A$. Then $C$ is obtained by removing from $A$ all non-pivot columns (which can be determined by looking for columns in $B$ which do not contain a pivot), and $F$ is obtained by eliminating any all-zero rows of $B$.

Note: For a full-rank square matrix (i.e. when $n = m = r$), this procedure will yield the trivial result $C = A$ and $F = B = I_n$ (the $n \times n$ identity matrix).
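This procedure is straightforward to implement. A minimal SymPy sketch (using the matrix from the example below) relies on `Matrix.rref`, which returns both the reduced row echelon form and the pivot column indices:

```python
import sympy as sp

A = sp.Matrix([[1, 3, 1, 4],
               [2, 7, 3, 9],
               [1, 5, 3, 1],
               [1, 2, 0, 8]])

B, pivots = A.rref()        # reduced row echelon form and pivot column indices
C = A[:, list(pivots)]      # keep only the pivot columns of A
F = B[:len(pivots), :]      # drop the all-zero rows of B

assert C * F == A
```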
Example
Consider the matrix
$$A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix} \sim \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \end{bmatrix} = B.$$
$B$ is in reduced echelon form.

Then $C$ is obtained by removing the third column of $A$, the only one which is not a pivot column, and $F$ by getting rid of the last row of zeroes from $B$, so
$$C = \begin{bmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{bmatrix}, \qquad F = \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}.$$
It is straightforward to check that
$$A = \begin{bmatrix} 1 & 3 & 1 & 4 \\ 2 & 7 & 3 & 9 \\ 1 & 5 & 3 & 1 \\ 1 & 2 & 0 & 8 \end{bmatrix} = \begin{bmatrix} 1 & 3 & 4 \\ 2 & 7 & 9 \\ 1 & 5 & 1 \\ 1 & 2 & 8 \end{bmatrix} \begin{bmatrix} 1 & 0 & -2 & 0 \\ 0 & 1 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} = CF.$$
Proof
Let $P$ be an $n \times n$ permutation matrix such that $AP = (C, D)$ in block partitioned form, where the columns of $C$ are the $r$ pivot columns of $A$. Every column of $D$ is a linear combination of the columns of $C$, so there is a matrix $G$ such that $D = CG$, where the columns of $G$ contain the coefficients of each of those linear combinations. So $AP = (C, CG) = C(I_r, G)$, $I_r$ being the $r \times r$ identity matrix. We will show now that $(I_r, G) = FP$.

Transforming $A$ into its reduced row echelon form $B$ amounts to left-multiplying by a matrix $E$ which is a product of elementary matrices, so $EAP = BP = (EC, ECG)$, where $EC = \begin{pmatrix} I_r \\ 0 \end{pmatrix}$. We then can write $BP = \begin{pmatrix} I_r & G \\ 0 & 0 \end{pmatrix}$, which allows us to identify $(I_r, G) = FP$, i.e. the nonzero $r$ rows of the reduced echelon form, with the same permutation on the columns as we did for $A$. We thus have $AP = C(I_r, G) = CFP$, and since $P$ is invertible this implies $A = CF$, and the proof is complete.
Singular value decomposition
If $\mathbb{F} \in \{\mathbb{R}, \mathbb{C}\}$, then one can also construct a full-rank factorization of $A$ via a singular value decomposition
$$A = U \Sigma V^* = \begin{bmatrix} U_1 & U_2 \end{bmatrix} \begin{bmatrix} \Sigma_r & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} V_1^* \\ V_2^* \end{bmatrix} = U_1 \left( \Sigma_r V_1^* \right).$$
Since $U_1$ is a full-column-rank matrix and $\Sigma_r V_1^*$ is a full-row-rank matrix, we can take $C = U_1$ and $F = \Sigma_r V_1^*$.
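In floating-point arithmetic, the same factorization can be read off from NumPy's SVD. A minimal sketch (the rank tolerance is an illustrative choice):

```python
import numpy as np

A = np.array([[1., 3., 1., 4.],
              [2., 7., 3., 9.],
              [1., 5., 3., 1.],
              [1., 2., 0., 8.]])

U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-10 * s[0]))  # numerical rank: singular values above tolerance
C = U[:, :r]                       # U_1, full column rank
F = s[:r, None] * Vh[:r, :]        # Sigma_r V_1^*, full row rank

assert np.allclose(C @ F, A)
```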
Consequences
$\operatorname{rank}(A) = \operatorname{rank}(A^T)$
An immediate consequence of rank factorization is that the rank of $A$ is equal to the rank of its transpose $A^T$. Since the columns of $A$ are the rows of $A^T$, the column rank of $A$ equals its row rank.

Proof: To see why this is true, let us first define rank to mean column rank. Since $A = CF$, it follows that $A^T = F^T C^T$. From the definition of matrix multiplication, this means that each column of $A^T$ is a linear combination of the columns of $F^T$. Therefore, the column space of $A^T$ is contained within the column space of $F^T$ and, hence, $\operatorname{rank}(A^T) \leq \operatorname{rank}(F^T)$.

Now, $F^T$ is $n \times r$, so there are $r$ columns in $F^T$ and, hence, $\operatorname{rank}(A^T) \leq r = \operatorname{rank}(A)$. This proves that $\operatorname{rank}(A^T) \leq \operatorname{rank}(A)$.

Now apply the result to $A^T$ to obtain the reverse inequality: since $(A^T)^T = A$, we can write $\operatorname{rank}(A) = \operatorname{rank}((A^T)^T) \leq \operatorname{rank}(A^T)$. This proves $\operatorname{rank}(A) \leq \operatorname{rank}(A^T)$.

We have, therefore, proved $\operatorname{rank}(A^T) \leq \operatorname{rank}(A)$ and $\operatorname{rank}(A) \leq \operatorname{rank}(A^T)$, so $\operatorname{rank}(A) = \operatorname{rank}(A^T)$.
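The equality is also easy to confirm numerically as a spot check (not a substitute for the proof), using the matrix from the example above:

```python
import numpy as np

A = np.array([[1., 3., 1., 4.],
              [2., 7., 3., 9.],
              [1., 5., 3., 1.],
              [1., 2., 0., 8.]])

# Both ranks equal 3, matching the rank factorization in the example above.
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T) == 3
```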