Suppose that $A\in\mathcal M_{n}(\mathbb R)$ with $\text{Rank}(A)=k$. Prove that $A$ is the sum of $k$ matrices $X_{1},\dots,X_{k}$ with $\text{Rank}(X_{i})=1$.
Source: Iranian National Olympiad (3rd Round) 2006
Tags: linear algebra, matrix, linear algebra unsolved
21.09.2006 21:27
Isn't this clear? Let $v_{1},\dots,v_{k}$ be a basis of the image of the linear map. Then each $Ax$ is of the form $A_{1}(x)v_{1}+\dots+A_{k}(x)v_{k}$, where the coefficients $A_{i}(x)$ depend linearly on $x$. This gives the desired linear maps: $B_{i}(x)=A_{i}(x)v_{i}$, each of which has rank $1$, and which add up to $A$.
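For anyone who wants to see this numerically, here is a minimal NumPy sketch of the same idea (my own illustration, not from the post above; it uses the SVD to obtain an orthonormal basis $v_1,\dots,v_k$ of the image, so the coefficient functionals become $A_i(x) = v_i^{\top}(Ax)$):

```python
import numpy as np

def rank_one_decomposition(A, tol=1e-10):
    """Split A into rank(A) rank-one pieces B_i(x) = A_i(x) v_i, as above.
    (Illustrative sketch: the basis of the image is taken from the SVD.)"""
    U, s, _ = np.linalg.svd(A)
    k = int(np.sum(s > tol))      # numerical rank
    V = U[:, :k]                  # columns v_1, ..., v_k span the image
    C = V.T @ A                   # row i lists the coefficients A_i(e_j)
    # Each outer product v_i c_i^T has rank 1, and they sum to V @ C = A.
    return [np.outer(V[:, i], C[i]) for i in range(k)]

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])   # rank 2
parts = rank_one_decomposition(A)
print(len(parts))                                 # 2
print(np.allclose(sum(parts), A))                 # True
print([np.linalg.matrix_rank(p) for p in parts])  # [1, 1]
```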
11.08.2008 18:57
Hello. Let $A$ be the $n \times n$ matrix associated with the linear map. Now $A$ having rank $k$ means that $A$ has $k$ linearly independent rows. So by performing elementary row operations on the rows of $A$ we can form a matrix $A'$ whose first $k$ rows are linearly independent and whose remaining rows have all zero entries. We may then express $A'$ as the sum of $k$ matrices $A'_i$ of rank $1$ - namely, $A'_i$ is the matrix which shares the same $i$'th row as $A'$ but has all other entries $0$.

Now let $B$ be the "elementary row operation" matrix which transforms $A$ to $A'$. Thus $BA = A' \Rightarrow A = B^{-1}A' \Rightarrow A = \sum_{i=1}^{k} B^{-1}A'_i$. Here $B^{-1}$ is also an "elementary row operation" matrix, and performing row operations on the matrix $A'_i$, which has only one non-zero row, still leaves it with one linearly independent row! Hence each $A_i = B^{-1}A'_i$ is a rank-one matrix, and summing over $i$ gives us $A$. I may include a further short post for those who do not know how to construct $B$, and also show why the existence of $B^{-1}$ is trivial.
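As a concrete illustration of this argument, here is a short NumPy/SciPy sketch (my own, not from the post). It uses an LU factorisation with partial pivoting, $A = PLU$, so that $U$ plays the role of $A'$ and $PL$ plays the role of $B^{-1}$. Note that for a general floating-point matrix $U$ need not have exactly $\text{rank}(A)$ non-zero rows, though it does for this example:

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])   # rank 2

# A = P @ L @ U: U is an echelon form reached by row operations,
# and Binv = P @ L is invertible (the "inverse row operations").
P, L, U = lu(A)
Binv = P @ L

# Split U into single-row matrices U_i; each non-zero one has rank 1,
# and multiplying by the invertible Binv preserves that rank.
parts = []
for i in range(U.shape[0]):
    if np.linalg.norm(U[i]) > 1e-10:
        Ui = np.zeros_like(U)
        Ui[i] = U[i]
        parts.append(Binv @ Ui)

print(len(parts))                                 # 2 = rank(A)
print(np.allclose(sum(parts), A))                 # True
print([np.linalg.matrix_rank(p) for p in parts])  # [1, 1]
```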
12.08.2008 07:04
As I promised: elementary row operations can be broken into two different classes: those which exchange two rows, and those which add a multiple of one row onto another.

Say we have a square $n \times n$ matrix $A$ and we wish to exchange the $a$'th and $b$'th rows, where $a \neq b$. Premultiplying by the following matrix does the trick: $B_{ii} = 1$ whenever $i \neq a,b$, $B_{ab} = 1$, $B_{ba} = 1$, and all other entries are $0$. We can verify this with suffix notation (I hope it is clear where the summation convention does or does not apply):

When $i \neq a,b$: $(BA)_{ij} = B_{ik}A_{kj} = B_{ii}A_{ij} = A_{ij}$.
When $i = a$: $(BA)_{aj} = B_{ak}A_{kj} = B_{ab}A_{bj} = A_{bj}$.
When $i = b$: $(BA)_{bj} = B_{bk}A_{kj} = B_{ba}A_{aj} = A_{aj}$.

Thus it does the required job. Note that $B$ is its own inverse: multiplying by $B$ again swaps the rows back into their initial positions.

Now we find the matrix which adds a multiple $m$ of the $b$'th row onto the $a$'th row. This matrix is given by $B_{ii} = 1 \,\forall i$, $B_{ab} = m$, and all other entries zero. Again we check by suffix notation:

When $i \neq a$: $(BA)_{ij} = B_{ik}A_{kj} = B_{ii}A_{ij} = A_{ij}$.
When $i = a$: $(BA)_{aj} = B_{ak}A_{kj} = B_{aa}A_{aj} + B_{ab}A_{bj} = A_{aj} + mA_{bj}$.

As required. The inverse is simply the matrix which adds the multiple $-m$ of the $b$'th row to the $a$'th row. Since each row operation has an inverse, the full composition of row operation matrices is also invertible. Hence we have shown the existence of $B$ and $B^{-1}$. It should also be clear why $B^{-1}$ is a row operation matrix: the inverse of each row operation is itself a row operation.

Finally, swapping two rows of a matrix which has only one non-zero row leaves it with one non-zero row. Adding a multiple of one row onto another only changes the matrix if the row being added is the non-zero row; the result may then have two non-zero rows, but this causes no problem, since those rows are multiples of each other and the rank is still $1$. I hope this clarifies any difficulties with the explanation.
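For readers who want to experiment, here is a minimal NumPy sketch of the two elementary matrices described above (the function names are my own; the constructions follow the entry-wise definitions in the post):

```python
import numpy as np

def swap_matrix(n, a, b):
    """Elementary matrix exchanging rows a and b (0-indexed, a != b)."""
    B = np.eye(n)
    B[[a, b]] = B[[b, a]]   # swap the two rows of the identity
    return B

def add_multiple_matrix(n, a, b, m):
    """Elementary matrix adding m times row b onto row a (a != b)."""
    B = np.eye(n)
    B[a, b] = m
    return B

n = 4
S = swap_matrix(n, 0, 2)
T = add_multiple_matrix(n, 1, 3, 2.5)
A = np.arange(16.).reshape(n, n)

print(np.allclose(S @ S, np.eye(n)))   # True: the swap is its own inverse
print(np.allclose(T @ add_multiple_matrix(n, 1, 3, -2.5), np.eye(n)))  # True
print((S @ A)[0], A[2])                # row 0 of S@A is the old row 2
```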