We will need the following technical Lemma.
Proof of Theorem 1.4.
Throughout the proof, let , , be fixed. Let be a continuous injective map which preserves commutativity and spectrum. By Lemma 4.1, we first conclude that also preserves the algebraic multiplicities of the eigenvalues.
For easier comprehension, the proof will now be divided into several steps.
Step 1.
There exists an invertible matrix such that, for all diagonal matrices .
Proof.
Since the matrix is diagonalizable with eigenvalues , there exists an such that . For any we have , since . So, for some diagonal . By continuity of , the argument from [30, Lemma 2.1] gives that , which confirms the claim. ∎
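The conjugation behaviour underlying Step 1 can be checked numerically. The following is a small NumPy sanity check, not part of the proof: conjugation by a fixed invertible matrix preserves both the spectrum and commutativity, and a matrix with distinct eigenvalues is conjugate to a diagonal one. All names here are illustrative choices, not notation from the paper.

```python
import numpy as np

# Numeric illustration (not the proof): conjugation X -> S X S^{-1} by a fixed
# invertible S preserves the spectrum and preserves commutativity.
rng = np.random.default_rng(0)
n = 3
S = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Sinv = np.linalg.inv(S)

D = np.diag([1.0, 2.0, 3.0])   # diagonal matrix with distinct eigenvalues
A = S @ D @ Sinv               # conjugate of D: same spectrum, up to ordering
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)), np.diag(D))

# Commutativity is preserved: [S X S^{-1}, S Y S^{-1}] = S [X, Y] S^{-1}
X, Y = np.diag([1.0, 2.0, 3.0]), np.diag([4.0, 5.0, 6.0])
lhs = (S @ X @ Sinv) @ (S @ Y @ Sinv) - (S @ Y @ Sinv) @ (S @ X @ Sinv)
assert np.allclose(lhs, 0)
```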
Without loss of generality, we can assume further that is the identity on all diagonal matrices.
Step 2.
is a homogeneous map. Moreover, if and are diagonalizable matrices in , with the similarity matrix in , satisfying , then .
Proof.
For any there exists a matrix such that . Applying Step 1 for the map provides that for every diagonal matrix . Replacing by , gives that is homogeneous on the set of all diagonalizable matrices in . By density of such matrices (a consequence of Lemma 2.2) we conclude that is a homogeneous map on .
To prove the second assertion, we first show that any pair of matrices is simultaneously diagonalizable in . Assume that for some and . If or is invertible, we are done. So, we can assume that the index set is a nonempty proper subset of . We observe that and, since it is diagonalizable, there exists a commuting with all , , such that is diagonal and . Now, for we have and .
To close the proof of this step, note that there exists an invertible matrix such that . It follows that , . Hence . ∎
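The simultaneous-diagonalization fact used in Step 2 admits a quick numeric check. This is an illustration under an assumption the proof also uses in the generic case: if a matrix has distinct eigenvalues, any matrix commuting with it is diagonalized by the same eigenbasis.

```python
import numpy as np

# Sanity check (not the proof): two commuting matrices, one with distinct
# eigenvalues, are simultaneously diagonalizable.
rng = np.random.default_rng(1)
S = rng.standard_normal((3, 3))
Sinv = np.linalg.inv(S)
A = S @ np.diag([1.0, 2.0, 3.0]) @ Sinv   # distinct eigenvalues
B = S @ np.diag([7.0, 5.0, 9.0]) @ Sinv   # commutes with A by construction
assert np.allclose(A @ B, B @ A)

# The eigenvector matrix of A also diagonalizes B.
w, V = np.linalg.eig(A)
Bdiag = np.linalg.inv(V) @ B @ V
assert np.allclose(Bdiag - np.diag(np.diag(Bdiag)), 0, atol=1e-7)
```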
Step 3.
Fix a . Let and be invertible matrices such that for all . Then .
Proof.
We know that for some invertible which implies that for every diagonal matrix . It follows that , and so,
| | |
∎
Step 4 (Base of induction, ).
Now we prove Theorem 1.4 completely for .
It suffices to prove the theorem for or . Indeed, the case is covered by Theorem 1.1, while if is a continuous injective commutativity and spectrum preserving map, then
| | |
is also such a map and thus the theorem follows from the case.
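The reduction above rests on two elementary facts that can be verified numerically: the transpose map preserves the spectrum (a matrix and its transpose share a characteristic polynomial) and preserves commutativity, since [Aᵀ, Bᵀ] = −[A, B]ᵀ. A small check, purely illustrative:

```python
import numpy as np

# Illustration (not the proof): X -> X^T preserves spectrum and commutativity.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(A.T)))

B = A @ A + 2.0 * A            # a polynomial in A, hence [A, B] = 0
assert np.allclose(A @ B, B @ A)
assert np.allclose(A.T @ B.T, B.T @ A.T)   # transposes still commute
```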
Step 4.1 (Matrix units).
We show that we can, without loss of generality, assume that there exist constants such that
| | |
We have so by our assumption that fixes all diagonal matrices, we have . By the fact that is injective and preserves spectrum, the matrix is a nonzero nilpotent so we conclude
| | |
for some not both equal to . Analogously we get
| | |
for some not both equal to . Since we have
| | |
If , then implies so . Now implies and therefore
| | |
If however , then implies so . Now implies and therefore
| | |
Without loss of generality, we can suppose that we are in the first case, as otherwise we can pass to the map .
As above, we obtain
| | |
for some not both equal to . By using it follows that and therefore .
This settles the matter for . In the case of , similarly as above using we obtain for some .
Notice that
| | |
Therefore, by passing to the map
| | |
without loss of generality we can further assume that .
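The nilpotency argument in Step 4.1 uses a standard fact about 2×2 matrices: spectrum {0} forces trace 0 and determinant 0, and then Cayley–Hamilton gives A² = tr(A)·A − det(A)·I = 0. A concrete check with an illustrative matrix:

```python
import numpy as np

# Illustration: in M_2, spectrum {0} implies nilpotency (A^2 = 0) by
# Cayley-Hamilton, even when A itself is nonzero.
A = np.array([[2.0, 4.0],
              [-1.0, -2.0]])    # nonzero, trace 0, determinant 0
assert np.allclose(np.linalg.eigvals(A), 0, atol=1e-6)   # spectrum {0}
assert np.allclose(A @ A, 0)                             # nilpotent
```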
Step 4.2 (Rank-one matrices).
We show that for all non-nilpotent rank-one matrices .
When we refer to orthogonality in we mean orthogonality with respect to the standard inner product, i.e. if and only if .
Every rank-one matrix in is either of the form
| | |
We start with case (a) and introduce
| | |
A careful inspection of [29, Lemma 5] shows that the same arguments can be carried out entirely within the algebra in order to obtain continuous functions such that
| | |
with the property and for all . In particular, for all holds
| | | |
| | | |
However, the rest of the argument from [29, Lemma 5] is not applicable as it leaves the algebra . Denote . From Lemma 4.2 we can write for . Then
| | |
We see that and it commutes with . Applying also the spectrum preserving property, we get that for some complex-valued continuous function . Similarly, by applying orthogonality with and commuting with one gets that for some continuous function . Then we have
| | |
and
| | |
By Step 3 we get that
| | |
Let us next consider the matrix , being orthogonal to and commuting with . The matrix commutes with , but this observation is not directly applicable here. However, it also commutes with . So, there exist functions and , not both equal to zero at any non-zero , such that
| | |
We use this observation with the relation and thus obtain that giving for some nonzero constant . By continuity and homogeneity of we have
| | |
By our reduction above and hence which proves the case (a).
Let us proceed with the case (b). We can write any such rank-one (possibly nilpotent) matrix as where , and . Suppose (otherwise, we are in the case (a)). We claim that for some nonzero vectors , . Indeed, if is non-nilpotent, then it is diagonalizable, and we use the fact that the multiplicities of all eigenvalues are preserved. If is nilpotent, it can be obtained as a limit of rank-one non-nilpotent matrices; by lower semicontinuity of the rank, the rank cannot increase in the limit. Obviously, commutes with for any vector orthogonal to . From (a) we already know that . So,
(4.3) | | | |
We claim that and . Suppose the contrary. Then, since both and are nonzero, (4.3) shows that either both and are zero or both are nonzero. If they are nonzero, we have , which contradicts injectivity. Therefore, and , and so . Without loss of generality, we can assume , and the scalar factor can be absorbed in . So far we have obtained that for some vector which depends on and . On the other hand, commutes with (which is possibly nilpotent), where and . From the previous argument, we know that for some . Then . This gives
| | |
At this point assume that is not nilpotent, i.e. is not orthogonal to . Then (defined up to a scalar factor) and are linearly independent, hence . This implies that for some scalar . From the spectrum preserving property, comparing the traces, we get that , and we are done.
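The rank-one facts used throughout case (b) can be summarized and checked numerically: for A = xy*, the trace equals ⟨x, y⟩, the spectrum is {y*x, 0}, and A is nilpotent precisely when y is orthogonal to x, since A² = (y*x)·A. The vectors below are illustrative choices:

```python
import numpy as np

# Illustration of the rank-one facts: A = x y^* has trace y^* x and
# spectrum {y^* x, 0}; A is nilpotent iff y is orthogonal to x.
x = np.array([1.0, 2.0, 0.0])
y = np.array([3.0, 1.0, 5.0])
A = np.outer(x, y.conj())
assert np.isclose(np.trace(A), np.vdot(y, x))            # trace = y^* x
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   [0.0, 0.0, np.vdot(y, x)])

z = np.array([2.0, -1.0, 7.0])                           # z orthogonal to x
N = np.outer(x, z.conj())
assert np.isclose(np.vdot(z, x), 0)
assert np.allclose(N @ N, 0)                             # nilpotent rank one
```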
Step 4.3 (Conclusion of the base step).
for all matrices .
By density, it suffices to prove that is the identity map on the set of matrices in with distinct eigenvalues. By Lemma 2.2 for every matrix of this kind there exists a matrix and a diagonal matrix such that . Applying Step 1 for the map we get that there exists a such that
| | |
By Step 4.2 we also have
| | |
so by linearity and consequently,
| | |
for all diagonal matrices . We conclude that is the identity map.
Step 5 (Induction step).
By way of induction, suppose that and that Theorem 1.4 holds for .
By Step 1, we can assume, without loss of generality, that for any diagonal matrix and, in particular, for all .
Step 5.1.
There exists a diagonal matrix such that for all , or, for all , .
Proof.
By continuity of it suffices to consider only diagonalizable matrices. By Step 2, for any diagonalizable matrix we have that . By the induction hypothesis, for each there exists an invertible matrix commuting with such that the map satisfies
(4.4) | | | |
or
| | |
The matrix is diagonal by our assumption preceding this step, and so, let . We can set the -th diagonal entry of to be equal to for every . We may, and do, assume that is of the form (4.4) when . Replacing by a map we can assume that . Let . Using the fact that for every holds, and that contains at least for some as , we conclude that is of the form (4.4) on since otherwise the restriction of to would be simultaneously multiplicative and anti-multiplicative. Comparing the action of on different subspaces, one observes that all , , possibly differ from only in the first term. Additionally, for every , , we have
| | |
so, is in fact independent of . Hence . Replacing by does not affect the action of on , so we have obtained that for every . ∎
By the previous step, we may assume that for every matrix , .
Step 5.2.
for every rank-one matrix .
Proof.
It suffices to consider only idempotent matrices due to the continuity and homogeneity of .
We first show that for every matrix supported in the first row. By Lemma 4.2, such a matrix can be written as for some . Applying Step 3 we observe that for every and when and . Therefore, for every and thus, by Step 3, . A very similar argument works in the case that is supported only in the last column as it can then be represented as for some .
Let further be supported in the first block-row but not in the first row. Then, for a block diagonal unitary matrix , satisfying , we have . Indeed, by Schur’s triangularization, there exists a unitary matrix which upper-triangularizes the first block of such that the diagonal is exactly . Then we can set . For consider
| | |
It is not difficult to see that for every , we have , and so, . On the other hand, for , observe that is supported in the last column and hence, by the argument in the previous paragraph, . Then Step 3 closes this part. In the same spirit we handle the case when is supported in the last block-column. We are done if .
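The Schur triangularization used in this step asserts that any complex square matrix is unitarily similar to an upper triangular matrix carrying the eigenvalues on its diagonal. A numeric check (illustrative only; assumes SciPy is available):

```python
import numpy as np
from scipy.linalg import schur

# Illustration of Schur triangularization: A = U T U^* with U unitary and
# T upper triangular whose diagonal consists of the eigenvalues of A.
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
T, U = schur(A, output='complex')
assert np.allclose(U @ T @ U.conj().T, A)      # A = U T U^*
assert np.allclose(np.tril(T, -1), 0)          # T is upper triangular
assert np.allclose(np.sort_complex(np.diag(T)),
                   np.sort_complex(np.linalg.eigvals(A)), atol=1e-7)
```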
Further, suppose that the number of diagonal blocks in is greater than and that is supported in the -th block-row or block-column for some . By unitary triangularization, we have that for some , a unitary block-diagonal matrix satisfying and , where and are as in Lemma 4.2. Similarly as before, for consider
| | |
For every , by a direct computation we obtain that and so, . If , then and hence, . Finally, Step 3 gives as desired. ∎
Step 6 (Conclusion of the inductive step).
for all matrices .
Proof.
The verification of this step is essentially the same as in Step 4.3 for the base case. ∎
∎