In mathematics, given a field 𝔽, nonnegative integers m and n, and a matrix A ∈ 𝔽^(m×n), a rank decomposition or rank factorization of A is a factorization of A of the form A = CF, where C ∈ 𝔽^(m×r) and F ∈ 𝔽^(r×n), where r is the rank of A.

Every finite-dimensional matrix has a rank decomposition: let A be an m × n matrix whose column rank is r. Then there are r linearly independent columns in A; equivalently, the dimension of the column space of A is r. Let c_1, c_2, ..., c_r be any basis for the column space of A and place them as column vectors to form the m × r matrix C = [c_1 c_2 ... c_r]. Therefore, every column vector of A is a linear combination of the columns of C. To be precise, if A = [a_1 a_2 ... a_n] is an m × n matrix with a_j as the j-th column, then

    a_j = f_(1j) c_1 + f_(2j) c_2 + ... + f_(rj) c_r,

where the f_(ij) are the scalar coefficients of a_j in terms of the basis c_1, c_2, ..., c_r. This implies that A = CF, where f_(ij) is the (i, j)-th element of F.

Rank factorizations are not unique: if A = C_1 F_1 is a rank factorization, taking C_2 = C_1 R and F_2 = R^(-1) F_1 gives another rank factorization for any invertible r × r matrix R. Conversely, if A = C_1 F_1 = C_2 F_2 are two rank factorizations of A, then there exists an invertible r × r matrix R such that C_1 = C_2 R and F_1 = R^(-1) F_2.

In practice, we can construct one specific rank factorization as follows: we compute B, the reduced row echelon form of A. Then C is obtained by removing from A all non-pivot columns (which can be identified by looking for columns in B that do not contain a pivot), and F is obtained by eliminating all zero rows of B.

Note: For a full-rank square matrix (i.e. when n = m = r), this procedure yields the trivial result C = A and F = I_n (the n × n identity matrix).

Consider the matrix

    A = [ 1  3  1  4
          2  7  3  9
          1  5  3  1
          1  2  0  8 ],

whose reduced row echelon form is

    B = [ 1  0 -2  0
          0  1  1  0
          0  0  0  1
          0  0  0  0 ].

Then C is obtained by removing the third column of A, the only one which is not a pivot column, and F is obtained by getting rid of the last row of zeros of B, so

    C = [ 1  3  4          F = [ 1  0 -2  0
          2  7  9                0  1  1  0
          1  5  1                0  0  0  1 ].
          1  2  8 ],

It is straightforward to check that A = CF.

To see why this construction works, let P be an n × n permutation matrix such that AP = (C, D) in block partitioned form, where the columns of C are the pivot columns of A. Every column of D is a linear combination of the columns of C, so there is a matrix G such that D = CG, where the columns of G contain the coefficients of each of those linear combinations. So AP = (C, CG) = C(I_r, G), I_r being the r × r identity matrix. We will show now that (I_r, G)P^(-1) = F.
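The RREF-based construction can be sketched in Python with NumPy. The `rref` helper below is a plain Gauss–Jordan elimination written for this illustration (it is not a library routine); it returns the reduced row echelon form together with the pivot column indices, from which C and F are read off exactly as described above:

```python
import numpy as np

def rref(M, tol=1e-10):
    """Reduced row echelon form of M via Gauss-Jordan elimination.
    Returns (B, pivot_cols). Floating-point, so `tol` decides rank."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    pivot_cols = []
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        # partial pivoting: pick the largest entry in column c at/below row r
        p = r + int(np.argmax(np.abs(M[r:, c])))
        if abs(M[p, c]) < tol:
            continue                      # no pivot in this column
        M[[r, p]] = M[[p, r]]             # swap rows
        M[r] /= M[r, c]                   # scale pivot row to make pivot 1
        for i in range(rows):
            if i != r:
                M[i] -= M[i, c] * M[r]    # clear column c elsewhere
        pivot_cols.append(c)
        r += 1
    return M, pivot_cols

def rank_factorization(A):
    """Rank factorization A = C F: C = pivot columns of A,
    F = nonzero rows of rref(A)."""
    B, pivots = rref(A)
    C = A[:, pivots]
    F = B[:len(pivots), :]
    return C, F

# The matrix from the example above
A = np.array([[1, 3, 1, 4],
              [2, 7, 3, 9],
              [1, 5, 3, 1],
              [1, 2, 0, 8]])
C, F = rank_factorization(A)
assert np.allclose(C @ F, A)

# Non-uniqueness: any invertible r x r matrix R gives another factorization
R = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 1.0]])
assert np.allclose((C @ R) @ (np.linalg.inv(R) @ F), A)
```

For exact arithmetic over the rationals, a computer algebra system (e.g. SymPy's `Matrix.rref`) would avoid the floating-point tolerance used here.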