A fun problem! (Problem 2)
So I was preparing for my Algebra qual recently when I came across this pretty nice problem in one of the earlier quals from the university.
Question statement
Let $X$ be a subspace of $M_n(\mathbb{C})$, the $\mathbb{C}$-vector space of all $n\times n$ complex matrices. Assume that every non-zero matrix in $X$ is invertible. Prove that $\mathrm{dim}_{\mathbb{C}}(X) \leq 1$.
Of course, as always, you are encouraged to close the blog post right now and have a go at this problem yourself!
Solution
So the first thing to do upon reading this question is to momentarily rejoice that you are working in a vector space, because you are being asked to do linear algebra in an abstract algebra qual, and that's a win already. The next thing to do is to really understand what the problem is asking. We have a space where every point (or vector, if you insist on being a purist) is an $n\times n$ complex matrix, and we have a subspace $X$ of said vector space with the property that every non-zero point in $X$ is invertible (you see now why calling the points vectors is slightly troublesome).
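To get a feel for the statement, here is a quick example (my own, not part of the problem) of a subspace satisfying the hypothesis: the span of the identity,
$$X = \mathrm{span}_{\mathbb{C}}\{I_n\} = \{cI_n : c \in \mathbb{C}\},$$
in which every non-zero element $cI_n$ (so $c \neq 0$) is invertible, and $\mathrm{dim}_{\mathbb{C}}(X) = 1$. The same works for the span of any single invertible matrix, so the bound in the problem is the best one can hope for.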
Whenever you are working with vector spaces, the first thing to do is to pompously declare how you are planning to denote your basis elements. (You can't really do this with modules, which need not have a basis at all, but hey, that's not our problem.) So suppose $X$ has basis $A_1, \dots, A_r$. Recall that we have to prove that $\mathrm{dim}_{\mathbb{C}}(X) \leq 1$, which is simply equivalent to proving $r \leq 1$ (since the dimension of a vector space is just the cardinality of a basis).
Cool beans, but how do we do this? Proving this directly seems like a hassle (one approach would be to take a generic point $A$ in $X$, write it in terms of $A_1, \dots, A_r$, and then prove that it has to be a multiple of some $A_i$). It is simply easier to assume the opposite of the statement, get all the benefits of that assumption, and then show that something goes awfully wrong.
So we suppose $A_1, \dots, A_r$ is a basis of the subspace $X$, with $r \geq 2$. The only additional tool this gives us is that any element of $X$ can be written as some linear combination of the $A_i$. Now, since we are looking for a contradiction, we want something that goes against the hypothesis. The only realistic option is to look for some non-zero element of $X$ that is not invertible. So our game plan is this:
Prove that there exists some non-zero $A$ in $X$ such that $A$ is not invertible.
Let's start from a simpler place, assuming $r = 2$. The reason is that if we can find some non-zero, non-invertible $A$ as a linear combination of $A_1$ and $A_2$, then we also have it as a linear combination of $A_1, \dots, A_r$, with the coefficient of $A_i$ being $0$ for every $i \geq 3$. (Of course, if this doesn't work, we can cut our losses and go back to the general case where $r=r$.)
So, we consider elements of $X$ of the form
$$A = \alpha A_1 + \beta A_2$$
We want to choose $\alpha$ and $\beta$, not both zero, so that $A$ is non-invertible (we need $A \neq 0$ to contradict the hypothesis, and linear independence of $A_1$ and $A_2$ will take care of that). There is a plethora of equivalent conditions for a matrix to be invertible, but the most straightforward one is that a matrix $M$ is invertible if and only if $\mathrm{det}(M) \neq 0$. We also know that $\mathrm{det}$ is multiplicative (i.e., $\mathrm{det}(MN) = \mathrm{det}(M)\,\mathrm{det}(N)$), but sadly not additive.
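For instance, additivity already fails for the friendliest matrix around: with $n = 2$,
$$\mathrm{det}(I + I) = \mathrm{det}(2I) = 4, \qquad \text{while} \qquad \mathrm{det}(I) + \mathrm{det}(I) = 1 + 1 = 2.$$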
So is there anything at all we can do here, given that we have a sum in the above equation? Well, there is a way to force a product: $A_1$ is a non-zero element of $X$, hence invertible by hypothesis, so we can factor it out to get
$$A = A_1(\alpha I + \beta A_1^{-1}A_2)$$
Now, we want $\mathrm{det}(A) = 0$, so the only way this can happen is if $$\mathrm{det}(\alpha I + \beta A_1^{-1}A_2) = 0$$
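To spell that step out: multiplicativity of the determinant gives
$$\mathrm{det}(A) = \mathrm{det}(A_1)\,\mathrm{det}(\alpha I + \beta A_1^{-1}A_2),$$
and $\mathrm{det}(A_1) \neq 0$ since $A_1$ is invertible, so $\mathrm{det}(A) = 0$ exactly when the second factor vanishes.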
So can we find a suitable $\alpha$ and $\beta$ that would give us that?
Hm, the above expression seems eerily familiar: where have we ever taken the determinant of something plus or minus a constant multiple of the identity matrix? That's right: to compute the characteristic polynomial of a matrix, from which we extract the eigenvalues. That is usually done by solving the equation
$$\mathrm{det}(\lambda I - M) = 0$$
and any solution $\lambda$ to the above equation is an eigenvalue of $M$. But hey, wait a second: let $M = A_1^{-1}A_2$. Of course $M$ is invertible (it is a product of invertible matrices), but the really useful fact is that we are working over $\mathbb{C}$: the characteristic polynomial of $M$ has a root by the fundamental theorem of algebra, so $M$ has at least one eigenvalue $\lambda$. So we can take such a $\lambda$ and let $\alpha = \lambda$ and $\beta = -1$, and we will get
$$\mathrm{det}(\alpha I + \beta A_1^{-1}A_2) = \mathrm{det}(\lambda I - M) = 0$$
which is exactly what we wanted!
So what we have essentially done is shown that whenever $X$ contains two linearly independent matrices $A_1$ and $A_2$, there is a linear combination of them, namely $A = \lambda A_1 - A_2$ (where $\lambda$ is an eigenvalue of $A_1^{-1}A_2$), that is not invertible! Moreover, $A$ is non-zero, since $A_1$ and $A_2$ are linearly independent and the coefficient of $A_2$ is $-1 \neq 0$; so $A$ is a non-zero, non-invertible element of $X$, contradicting the hypothesis. Thus $X$ cannot have a basis of cardinality greater than $1$ (in particular, there is no need to go back to the general $r$: any $r \geq 2$ is already doomed). So $\mathrm{dim}_{\mathbb{C}}(X) \leq 1$ and we win!
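As a quick sanity check (a toy example of my own, not from the qual), take $n = 2$, $A_1 = I_2$ and $A_2 = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$, both invertible. Then $M = A_1^{-1}A_2 = A_2$ has eigenvalues $\pm 1$, and taking $\lambda = 1$ gives
$$\lambda A_1 - A_2 = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix},$$
a non-zero matrix with determinant $0$, exactly as the argument predicts. (Of course, this just confirms that the span of $I_2$ and $A_2$ was never a subspace of the kind the problem describes; the whole point of the proof is that no two-dimensional subspace can be.)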