THE SEMI-SQUARE OPERATOR AND SIMILAR MATRIX-VALUED

OPERATORS ON THE SPECTRUM-EQUIVALENCE CLASS

CONTAINING THE PAULI MATRICES, AS SPECIAL CASES OF

A MORE GENERAL THEOREM ABOUT MATRIX-POWER TRANSFORMS

Donald R. Burleson, Ph.D.

December 2006

The purpose of this paper is to show that for the class of matrices including the Pauli particle-spin matrices of quantum theory, a number of interesting transforms (two of which I call the semi-inversion operator and the semi-square operator) are possible, and that the computations involved may be greatly simplified in view of a more general theorem, special cases of which lead to the matrix-valued operators mentioned. I will present the more general theorem first.

THEOREM: For any 2-by-2 matrix A with eigenvalues λ1 ≠ λ2 and for any complex number z,

A^z = [ (λ1^z - λ2^z)A + (λ1·λ2^z - λ2·λ1^z)I ] / (λ1 - λ2)

where A^z is the matrix defined by principal complex-valued exponentiations λj^z on the eigenvalues in the relation

A^z vj = λj^z vj   (j = 1, 2)

for representative eigenvectors vj corresponding respectively to the eigenvalues λj, provided the eigenvalue exponentiations do not lead to division by zero.
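Before proceeding to the proof, the statement can be spot-checked numerically. The sketch below (the sample matrix and the exponent are my own illustrative choices, not taken from the paper) compares the closed form of the theorem against the direct definition A^z = M·diag(λ1^z, λ2^z)·M^(-1):

```python
# Numerical sanity check of the theorem: for a sample 2-by-2 matrix A with
# distinct eigenvalues, the closed form
#   A^z = [ (l1^z - l2^z) A + (l1*l2^z - l2*l1^z) I ] / (l1 - l2)
# must agree with the diagonalization M diag(l1^z, l2^z) M^{-1}.
# The matrix and exponent below are arbitrary illustrative choices.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # distinct eigenvalues (5 +/- sqrt(5))/2
z = 0.7 + 0.3j                      # an arbitrary complex exponent

l, M = np.linalg.eig(A)             # eigenvalues and modal matrix M
l1, l2 = l[0], l[1]

# Closed form from the theorem (principal powers l_j^z):
Az_formula = ((l1**z - l2**z) * A
              + (l1 * l2**z - l2 * l1**z) * np.eye(2)) / (l1 - l2)

# Direct definition: A^z = M diag(l1^z, l2^z) M^{-1}
Az_direct = M @ np.diag([l1**z, l2**z]) @ np.linalg.inv(M)

assert np.allclose(Az_formula, Az_direct)
```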

PROOF: Let

    A = [ a  b ]
        [ c  d ]

have distinct eigenvalues λ1 and λ2. We first consider the case b ≠ 0. To determine eigenvectors (x y)^T we let A(x y)^T = λ1(x y)^T and A(x y)^T = λ2(x y)^T.

These equations imply representative eigenvectors v1 = (b, λ1 - a)^T and v2 = (b, λ2 - a)^T, so that for the diagonalizing similarity relation A = M·diag(λ1, λ2)·M^(-1) the modal matrix and its inverse are

    M = [ b        b      ]     and     M^(-1) = 1/(b(λ2 - λ1)) · [ λ2 - a      -b ]
        [ λ1 - a   λ2 - a ]                                       [ -(λ1 - a)    b ].

Thus we have

A^z = M·diag(λ1^z, λ2^z)·M^(-1)

    = 1/(b(λ2 - λ1)) · [ b·λ1^z         b·λ2^z       ] · [ λ2 - a      -b ]
                       [ (λ1-a)·λ1^z    (λ2-a)·λ2^z  ]   [ -(λ1 - a)    b ]

    = 1/(λ2 - λ1) · [ λ1^z(λ2-a) - λ2^z(λ1-a)           -b(λ1^z - λ2^z)             ]
                    [ (λ1-a)(λ2-a)(λ1^z - λ2^z)/b       (λ2-a)·λ2^z - (λ1-a)·λ1^z   ].

Since (λ1-a)(λ2-a) = λ1λ2 - (λ1+λ2)a + a^2, and since the characteristic polynomial of A gives det A = ad - bc = λ1λ2 and trace A = a + d = λ1 + λ2, this product equals ad - bc - (a+d)a + a^2 = ad - bc - a^2 - ad + a^2 = -bc, so that the element in row 2, column 1 equals -c(λ1^z - λ2^z)/(λ2 - λ1) = c(λ1^z - λ2^z)/(λ1 - λ2). The element in row 2, column 2 is equal (since, again, a + d = λ1 + λ2, the trace of A) to

    [ λ1^z(λ2 - d) + λ2^z(d - λ1) ] / (λ2 - λ1) = [ d(λ2^z - λ1^z) + λ1^z·λ2 - λ2^z·λ1 ] / (λ2 - λ1),

so that the matrix may be written as

    A^z = 1/(λ1 - λ2) · [ a(λ1^z - λ2^z) + λ1·λ2^z - λ2·λ1^z      b(λ1^z - λ2^z)                        ]
                        [ c(λ1^z - λ2^z)                           d(λ1^z - λ2^z) + λ1·λ2^z - λ2·λ1^z   ]

        = [ (λ1^z - λ2^z)A + (λ1·λ2^z - λ2·λ1^z)I ] / (λ1 - λ2),

as needed, for the case b ≠ 0.

As for the remaining case, if b = 0 then the matrix A is triangular, with its eigenvalues on the diagonal, so in this case it can be written as

    A = [ λ1   0  ]
        [ c    λ2 ].

To determine eigenvectors (x y)^T we again examine the relations A(x y)^T = λj(x y)^T to find that this time the representative eigenvectors may be taken to have the form v1 = (λ1 - λ2, c)^T and v2 = (0, 1)^T, so that the modal matrix M in the similarity relation is

    M = [ λ1 - λ2   0 ]
        [ c         1 ]

with inverse

    M^(-1) = 1/(λ1 - λ2) · [ 1     0       ]
                           [ -c    λ1 - λ2 ].

Then

A^z = M·diag(λ1^z, λ2^z)·M^(-1) = [ λ1^z                           0    ]
                                  [ c(λ1^z - λ2^z)/(λ1 - λ2)      λ2^z ]

    = [ (λ1^z - λ2^z)A + (λ1·λ2^z - λ2·λ1^z)I ] / (λ1 - λ2),

as required to establish the case b = 0. This completes the proof. █
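The triangular case just established can also be checked numerically. The sketch below (sample entries are my own illustrative choices) compares the closed form for the b = 0 case against direct diagonalization:

```python
# Check of the b = 0 (triangular) case: for A = [[l1, 0], [c, l2]] the proof
# yields A^z = [[l1^z, 0], [c(l1^z - l2^z)/(l1 - l2), l2^z]].
# The sample values below are arbitrary illustrative choices.
import numpy as np

l1, l2, c, z = 3.0, 2.0, 5.0, 0.4

A = np.array([[l1, 0.0],
              [c,  l2]])

Az_claimed = np.array([[l1**z, 0.0],
                       [c * (l1**z - l2**z) / (l1 - l2), l2**z]])

# Independent computation via diagonalization:
l, V = np.linalg.eig(A)
Az_direct = V @ np.diag(l**z) @ np.linalg.inv(V)

assert np.allclose(Az_claimed, Az_direct)
```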

Among many other things, this theorem provides an easy way to compute matrix powers A^z when z is an integer. For example, for any matrix A with eigenvalues λ1 = 5 and λ2 = 2, taking z = 3 gives

    A^3 = [ (5^3 - 2^3)A + (5·2^3 - 2·5^3)I ] / (5 - 2) = (117A - 210I)/3 = 39A - 70I.

However, the most interesting special cases are those involving other kinds of exponents z.
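The integer-power identity for eigenvalues 5 and 2 can be verified on any concrete matrix with trace 7 and determinant 10; the particular matrix below is a hypothetical example of my own choosing, not one from the paper:

```python
# Worked instance of the integer-power case: for any 2-by-2 matrix with
# eigenvalues 5 and 2 the theorem gives
#   A^3 = [ (5^3 - 2^3) A + (5*2^3 - 2*5^3) I ] / (5 - 2) = 39 A - 70 I.
# The matrix below (trace 7, det 10, hence eigenvalues 5 and 2) is my own
# illustrative choice.
import numpy as np

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])          # char. poly. x^2 - 7x + 10 = (x-5)(x-2)

A_cubed_formula = 39 * A - 70 * np.eye(2)
A_cubed_direct = A @ A @ A

assert np.allclose(A_cubed_formula, A_cubed_direct)
```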

THE SEMI-INVERSION OPERATOR

When z = i, the theorem implies what I call the semi-inversion operator, so called because (A^i)^i = A^(i·i) = A^(-1), the regular multiplicative inverse of A:

    A^i = [ (λ1^i - λ2^i)A + (λ1·λ2^i - λ2·λ1^i)I ] / (λ1 - λ2).

(It is assumed throughout that for complex exponentiations on the eigenvalues, where multivalued functions are concerned, we choose the principal complex value.) In particular, for the class U of 2-by-2 matrices having eigenvalues 1 and -1, i.e. the spectrum-equivalence class containing the Pauli matrices

    σx = [ 0  1 ]     σy = [ 0  -i ]     σz = [ 1   0 ]
         [ 1  0 ]          [ i   0 ]          [ 0  -1 ]

from quantum theory (the matrices giving vector-component projections, along various axes, of spin angular momentum), we have:

COROLLARY 1: The (principal) semi-inverse of any matrix A belonging to the class U (of two-by-two matrices having trace 0 and determinant -1) is given by evaluating, at A, the first-degree polynomial function

    f(x) = (1/2)(1 - e^(-π))x + (1/2)(1 + e^(-π)).

PROOF: This follows immediately from the main theorem, as 1^i = 1 and (-1)^i = e^(i·Log(-1)) = e^(i·iπ) = e^(-π) in principal complex value, so that A^i = [ (1 - e^(-π))A + (e^(-π) + 1)I ] / 2. █
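Corollary 1 can be checked numerically on one of the Pauli matrices; the sketch below uses σx (my choice of example) and also confirms the defining property that semi-inverting twice returns the ordinary inverse:

```python
# Check of Corollary 1 on the Pauli matrix sigma_x: the semi-inverse A^i of a
# matrix with eigenvalues 1 and -1 is the first-degree polynomial
#   f(A) = (1/2)(1 - e^{-pi}) A + (1/2)(1 + e^{-pi}) I,
# since 1^i = 1 and (-1)^i = e^{i Log(-1)} = e^{i * i pi} = e^{-pi} in
# principal value.  The choice of sigma_x as the test matrix is mine.
import numpy as np

sigma_x = np.array([[0.0, 1.0],
                    [1.0, 0.0]])
c = np.exp(-np.pi)                   # (-1)^i in principal value

semi_inverse = 0.5 * (1 - c) * sigma_x + 0.5 * (1 + c) * np.eye(2)

# Independent computation: diagonalize and raise the eigenvalues to the i.
l, V = np.linalg.eig(sigma_x)
direct = V @ np.diag((l + 0j) ** 1j) @ np.linalg.inv(V)
assert np.allclose(semi_inverse, direct)

# Applying semi-inversion twice yields the multiplicative inverse
# (here sigma_x is its own inverse):
l2, V2 = np.linalg.eig(semi_inverse)
twice = V2 @ np.diag((l2 + 0j) ** 1j) @ np.linalg.inv(V2)
assert np.allclose(twice, np.linalg.inv(sigma_x))
```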


THE SEMI-SQUARE OPERATOR

With z = √2, the main theorem implies an operator that I call the semi-square operator, since (A^√2)^√2 = A^(√2·√2) = A^2. The (principal) semi-square of A, then, is computed by

    A^√2 = [ (λ1^√2 - λ2^√2)A + (λ1·λ2^√2 - λ2·λ1^√2)I ] / (λ1 - λ2).

By this method one may compute, for example (for a matrix with eigenvalues λ1 = 4 and λ2 = 1):

    A^√2 = [ (4^√2 - 1)A + (4 - 4^√2)I ] / 3,

a result that is component-wise between A and A^2.
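A minimal numerical sketch of the semi-square, assuming a sample matrix of my own choosing with eigenvalues 4 and 1 (trace 5, determinant 4); it also confirms that semi-squaring twice produces the ordinary square:

```python
# Check of the semi-square operator on a sample matrix with eigenvalues 4 and
# 1 (the matrix itself is my own illustrative choice).  The theorem gives
#   A^{sqrt 2} = [ (4^{sqrt 2} - 1) A + (4 - 4^{sqrt 2}) I ] / 3,
# and applying the sqrt(2)-power twice should square the matrix.
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 2.0]])          # char. poly. x^2 - 5x + 4 = (x-4)(x-1)
s = np.sqrt(2)
p = 4.0 ** s                         # 4^{sqrt 2}

semi_square = ((p - 1) * A + (4 - p) * np.eye(2)) / 3

def mat_power(M, z):
    """M^z via diagonalization, with principal powers on the eigenvalues."""
    l, V = np.linalg.eig(M)
    return V @ np.diag((l + 0j) ** z) @ np.linalg.inv(V)

assert np.allclose(semi_square, mat_power(A, s))
# Semi-squaring twice yields the ordinary square (the eigenvalues here are
# positive reals, so the principal branch causes no trouble):
assert np.allclose(mat_power(semi_square, s), A @ A)
```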

In particular, as before, if A is a matrix spectrum-equivalent to the Pauli matrices, with eigenvalues 1 and –1, then we have:

COROLLARY 2: For A any matrix in the class U of 2-by-2 matrices with trace 0 and determinant -1, the (principal) semi-square of A may be computed by evaluating, at A, the first-degree polynomial function

    g(x) = (1/2)(1 - e^(iπ√2))x + (1/2)(1 + e^(iπ√2)).

PROOF: The eigenvalues of A are 1 and -1, and in principal complex value we have 1^√2 = 1 and (-1)^√2 = e^(√2·Log(-1)) = e^(iπ√2), so that the corollary follows immediately from the main theorem. █
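As a numerical sketch of Corollary 2 (the choice of σz as the test matrix is mine), the first-degree polynomial with coefficients built from e^(iπ√2) can be compared against the principal √2-power computed by diagonalization:

```python
# Check of Corollary 2 on the Pauli matrix sigma_z: for eigenvalues 1 and -1,
# the (principal) semi-square is
#   g(A) = (1/2)(1 - e^{i pi sqrt 2}) A + (1/2)(1 + e^{i pi sqrt 2}) I,
# since (-1)^{sqrt 2} = e^{sqrt 2 * Log(-1)} = e^{i pi sqrt 2} in principal
# value.  The choice of sigma_z as the test matrix is mine.
import numpy as np

sigma_z = np.array([[1.0, 0.0],
                    [0.0, -1.0]])
s = np.sqrt(2)
w = np.exp(1j * np.pi * s)           # (-1)^{sqrt 2} in principal value

semi_sq = 0.5 * (1 - w) * sigma_z + 0.5 * (1 + w) * np.eye(2)

# Independent computation via principal powers on the eigenvalues:
l, V = np.linalg.eig(sigma_z)
direct = V @ np.diag((l + 0j) ** s) @ np.linalg.inv(V)

assert np.allclose(semi_sq, direct)
```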

Clearly one may also use the above techniques to compute such results as A^√3 (the principal "semi-cube" of A, since (A^√3)^√3 = A^3) and A^(3^(1/4)) (the "demi-semi-cubes" of A), etc., as well as an infinitude of other matrix-valued operators based upon complex exponentiations, e.g. exponentiations by the nth roots of -1 to produce arbitrarily protracted chains of matrices "between" A and A^(-1), generalizing the notion of the semi-inverse into "hemi-demi-semi-inverses" and so on.

The remarkable fact is that all these matrix forms are generated by first-degree polynomial functions of the matrix argument, in which the coefficients are simple functions of the eigenvalues. It is precisely this sort of thing that makes eigenvalues central to matrix theory. I conclude with one other quick result:

COROLLARY 3: For all the above matrix transforms A^z the eigenvalues are λ1^z and λ2^z.

PROOF: Since λ being an eigenvalue of a matrix A implies, for any polynomial f(x), that f(λ) is an eigenvalue of f(A), one merely applies this idea to the generating function

    h(x) = [ (λ1^z - λ2^z)x + (λ1·λ2^z - λ2·λ1^z) ] / (λ1 - λ2),

for which, as we may verify by direct computation, h(λj) = λj^z, j = 1, 2. █