In this chapter, the linear transformations are from a given finite dimensional vector space $V$ over $\mathbb{F}$ to itself. Observe that in this case, the matrix of the linear transformation is a square matrix. So, in this chapter, all the matrices are square matrices and a vector $\mathbf{x}$ means $\mathbf{x} = (x_1, x_2, \ldots, x_n)^t$ for some positive integer $n$.
Let $A$ be a real symmetric matrix. Consider the problem of finding the extrema of $\mathbf{x}^t A \mathbf{x}$ subject to $\mathbf{x} \in \mathbb{R}^n$ and $\mathbf{x}^t \mathbf{x} = 1$. To solve this, consider the Lagrangian
$$L(\mathbf{x}, \lambda) = \mathbf{x}^t A \mathbf{x} - \lambda(\mathbf{x}^t \mathbf{x} - 1) = \sum_{i=1}^{n}\sum_{j=1}^{n} a_{ij} x_i x_j - \lambda\Big(\sum_{i=1}^{n} x_i^2 - 1\Big).$$
Partially differentiating $L(\mathbf{x},\lambda)$ with respect to $x_1, x_2, \ldots, x_n$, we get
$$\frac{\partial L}{\partial x_1} = 2 a_{11} x_1 + 2 a_{12} x_2 + \cdots + 2 a_{1n} x_n - 2\lambda x_1,$$
and so on, till
$$\frac{\partial L}{\partial x_n} = 2 a_{n1} x_1 + 2 a_{n2} x_2 + \cdots + 2 a_{nn} x_n - 2\lambda x_n.$$
Therefore, to get the points of extrema, we solve for
$$(0,0,\ldots,0)^t = \Big(\frac{\partial L}{\partial x_1}, \frac{\partial L}{\partial x_2}, \ldots, \frac{\partial L}{\partial x_n}\Big)^t = 2(A\mathbf{x} - \lambda\mathbf{x}).$$
We therefore need to find a $\lambda \in \mathbb{R}$ and a non-zero vector $\mathbf{x} \in \mathbb{R}^n$ such that $A\mathbf{x} = \lambda\mathbf{x}$.
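As a quick numerical sanity check of this derivation, the following sketch (a minimal illustration, with a small symmetric matrix chosen arbitrarily for the example) compares the maximum of $\mathbf{x}^t A \mathbf{x}$ over random unit vectors with the largest eigenvalue returned by `numpy.linalg.eigh`.

```python
import numpy as np

# Hypothetical 3x3 symmetric matrix, chosen only for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Eigen-decomposition of a symmetric matrix: eigh returns ascending eigenvalues.
eigvals, eigvecs = np.linalg.eigh(A)

# Sample many random unit vectors and evaluate the quadratic form x^t A x.
rng = np.random.default_rng(0)
xs = rng.normal(size=(10000, 3))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
values = np.einsum('ij,jk,ik->i', xs, A, xs)

print("largest eigenvalue     :", eigvals[-1])
print("max of x^t A x sampled :", values.max())  # close to, never above, eigvals[-1]
```

The sampled maximum stays below the largest eigenvalue and approaches it as the sample grows, which is what the condition $A\mathbf{x} = \lambda\mathbf{x}$ at the extrema predicts.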
Let $A$ be a matrix of order $n$. In general, we ask the question:

For what values of $\lambda \in \mathbb{F}$ does there exist a non-zero vector $\mathbf{x} \in \mathbb{F}^n$ such that
$$A\mathbf{x} = \lambda\mathbf{x}? \tag{6.1.4}$$

Equation (6.1.4) is equivalent to the homogeneous linear system $(A - \lambda I)\mathbf{x} = \mathbf{0}$. By Theorem 2.5.1, this system of linear equations has a non-zero solution if and only if
$$\det(A - \lambda I) = 0.$$
So, to solve (6.1.4), we are forced to choose those values of $\lambda \in \mathbb{F}$ for which $\det(A - \lambda I) = 0$. Observe that $\det(A - \lambda I)$ is a polynomial in $\lambda$ of degree $n$; it is called the CHARACTERISTIC POLYNOMIAL of $A$ and is denoted by $p(\lambda)$. The equation $p(\lambda) = 0$ is called the CHARACTERISTIC EQUATION of $A$, and its roots are called characteristic values of $A$. Some books use the term EIGENVALUE in place of characteristic value.
In particular, if $\lambda_0 \in \mathbb{F}$ is a root of the characteristic equation, then $\det(A - \lambda_0 I) = 0$, so the matrix $A - \lambda_0 I$ is singular and the linear system $(A - \lambda_0 I)\mathbf{x} = \mathbf{0}$ has a non-zero solution. If the linear system $A\mathbf{x} = \lambda\mathbf{x}$ has a non-zero solution $\mathbf{x} \in \mathbb{F}^n$ for some $\lambda \in \mathbb{F}$, then $\lambda$ is called an EIGENVALUE of $A$, the non-zero vector $\mathbf{x}$ is called an EIGENVECTOR of $A$ corresponding to $\lambda$, and the pair $(\lambda, \mathbf{x})$ is called an EIGENPAIR.
Consider the matrix $A = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$. Then the characteristic polynomial of $A$ is $p(\lambda) = \lambda^2 + 1$, which has no root in $\mathbb{R}$. So, viewed as a real matrix, $A$ has the characteristic values $\pm i$ but no eigenvalue. Given the same matrix over $\mathbb{C}$, the roots $i$ and $-i$ do lie in the field, and $A$ has $\bigl(i, (1, i)^t\bigr)$ and $\bigl(-i, (i, 1)^t\bigr)$ as eigenpairs.
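To see these eigenpairs concretely, here is a small check (a sketch only, using the $2\times 2$ matrix from the example above) with `numpy.linalg.eig`, which works over the complex numbers and therefore recovers $\pm i$ even though the entries of the matrix are real.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# Over the complex numbers, eig finds the roots of the characteristic polynomial.
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                            # approximately [ i, -i ]

# Verify A x = lambda x for each returned eigenpair.
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))    # True, True
```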
Note that if $(\lambda, \mathbf{x})$ is an eigenpair of $A$, then for any non-zero $c \in \mathbb{F}$, $(\lambda, c\mathbf{x})$ is also an eigenpair. More generally, if $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_r$ are eigenvectors of $A$ corresponding to the eigenvalue $\lambda$, then for any scalars $c_1, c_2, \ldots, c_r$ with $c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_r\mathbf{x}_r \neq \mathbf{0}$, the vector $c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_r\mathbf{x}_r$ is also an eigenvector of $A$ corresponding to $\lambda$.
Suppose $\lambda_0 \in \mathbb{F}$ is a root of the characteristic equation $\det(A - \lambda_0 I) = 0$. Then $A - \lambda_0 I$ is singular and $\text{rank}(A - \lambda_0 I) < n$. Suppose $\text{rank}(A - \lambda_0 I) = r < n$. Then by Corollary 4.3.9, the linear system $(A - \lambda_0 I)\mathbf{x} = \mathbf{0}$ has $n - r$ linearly independent solutions. That is, $A$ has $n - r$ linearly independent eigenvectors corresponding to the eigenvalue $\lambda_0$ whenever $\text{rank}(A - \lambda_0 I) = r < n$.
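The count $n - r$ can be checked numerically. The sketch below (an illustration only, with a hypothetical $3\times 3$ matrix that has a repeated eigenvalue) computes $\text{rank}(A - \lambda_0 I)$ with `numpy.linalg.matrix_rank` and reads off the number of independent eigenvectors.

```python
import numpy as np

# Hypothetical matrix with eigenvalue 2 of algebraic multiplicity 2.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam0 = 2.0
n = A.shape[0]

r = np.linalg.matrix_rank(A - lam0 * np.eye(n))
print("rank(A - lam0*I) =", r)             # 1
print("independent eigenvectors:", n - r)  # 2, matching the remark above
```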
The following examples illustrate these ideas.

1. Let $A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$. Then $\det(A - \lambda I) = (1-\lambda)^2$, so $\lambda = 1$ is a repeated eigenvalue. That is, the system $(A - I)\mathbf{x} = \mathbf{0}$ is equivalent to the equation $x_2 = 0$. And this has the solution $\mathbf{x} = (x_1, 0)^t$. Hence, from the above remark, $\bigl(1, (1,0)^t\bigr)$ is an eigenpair, and every eigenvector of $A$ is a multiple of $(1,0)^t$.
2. Let $A = I_2$, the $2\times 2$ identity matrix. Then $\det(A - \lambda I) = (1-\lambda)^2$, so $\lambda = 1$ is again a repeated eigenvalue, but now every non-zero vector of $\mathbb{R}^2$ is an eigenvector; from the above remark, $(1, \mathbf{e}_1)$ and $(1, \mathbf{e}_2)$ are eigenpairs. In general, if $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n$ are linearly independent vectors in $\mathbb{R}^n$, then $(1, \mathbf{x}_1), (1, \mathbf{x}_2), \ldots, (1, \mathbf{x}_n)$ are eigenpairs for the identity matrix, $I_n$.
3. Let $A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}$. Then $\det(A - \lambda I) = (\lambda - 3)(\lambda + 1)$. Now check that the eigenpairs are $\bigl(3, (1,1)^t\bigr)$ and $\bigl(-1, (1,-1)^t\bigr)$. In this case, we have TWO DISTINCT EIGENVALUES AND THE CORRESPONDING EIGENVECTORS ARE ALSO LINEARLY INDEPENDENT. The reader is required to prove the linear independence of the two eigenvectors.
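The requested linear-independence check can also be made numerically; the sketch below (illustrative only, using the eigenvectors from item 3) confirms that the matrix whose columns are the two eigenvectors is non-singular.

```python
import numpy as np

# Eigenvectors from item 3 above, placed as the columns of P.
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])

# A non-zero determinant means the columns are linearly independent.
print(np.linalg.det(P))   # -2.0, so the two eigenvectors are linearly independent
```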
Let $A$ and $B$ be similar matrices. Then prove that $A$ and $B$ have the same characteristic polynomial, and hence the same set of eigenvalues. [Hint: Recall that if the matrices $A$ and $B$ are similar, then there exists a non-singular matrix $P$ such that $B = PAP^{-1}$.]
Let $A = [a_{ij}]$ be an $n\times n$ matrix with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, not necessarily distinct. Then prove that $\det(A) = \prod_{i=1}^{n}\lambda_i$ and $\text{tr}(A) = \sum_{i=1}^{n} a_{ii} = \sum_{i=1}^{n}\lambda_i$.

Proof. Since $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the $n$ roots of the characteristic polynomial, by definition,
$$\det(A - \lambda I_n) = p(\lambda) = (-1)^n(\lambda - \lambda_1)(\lambda - \lambda_2)\cdots(\lambda - \lambda_n). \tag{6.1.5}$$
This is an identity in $\lambda$ as polynomials, so substituting $\lambda = 0$ in (6.1.5) gives $\det(A) = (-1)^n(-1)^n\prod_{i=1}^{n}\lambda_i = \prod_{i=1}^{n}\lambda_i$. Also, expanding the determinant directly,
$$\det(A - \lambda I_n) = a_0 - a_1\lambda + a_2\lambda^2 - \cdots + (-1)^{n-1}a_{n-1}\lambda^{n-1} + (-1)^n\lambda^n \tag{6.1.7}$$
for some $a_0, a_1, \ldots, a_{n-1}\in\mathbb{F}$. Here the coefficient of $(-1)^{n-1}\lambda^{n-1}$ arises only from the product $(a_{11}-\lambda)(a_{22}-\lambda)\cdots(a_{nn}-\lambda)$. So, $a_{n-1} = \sum_{i=1}^{n} a_{ii} = \text{tr}(A)$, by definition of trace. But, from (6.1.5) and (6.1.7), we get, on comparing the coefficients of $(-1)^{n-1}\lambda^{n-1}$, that $\text{tr}(A) = a_{n-1} = \sum_{i=1}^{n}\lambda_i$. Hence, we get the required result. $\blacksquare$
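A quick numerical confirmation of both identities (a sketch with an arbitrary matrix, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4))          # arbitrary real 4x4 matrix
eigvals = np.linalg.eigvals(A)       # eigenvalues, possibly complex

print(np.isclose(np.linalg.det(A), np.prod(eigvals).real))  # det(A) = product of eigenvalues
print(np.isclose(np.trace(A), np.sum(eigvals).real))        # tr(A)  = sum of eigenvalues
```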
Let $A$ be a skew symmetric matrix of odd order. Then prove that $0$ is an eigenvalue of $A$. Let $A$ be a $3\times 3$ orthogonal matrix, so that $AA^t = I$. If $\det(A) = 1$, then prove that there exists a non-zero vector $\mathbf{v}\in\mathbb{R}^3$ such that $A\mathbf{v} = \mathbf{v}$.
Let $A$ be an $n\times n$ matrix. Then in the proof of the above theorem, we observed that the characteristic equation $\det(A - \lambda I) = 0$ is a polynomial equation of degree $n$ in $\lambda$. Also, for some numbers $a_0, a_1, \ldots, a_{n-1}\in\mathbb{F}$, it has the form
$$\lambda^n + a_{n-1}\lambda^{n-1} + a_{n-2}\lambda^{n-2} + \cdots + a_1\lambda + a_0 = 0.$$
Note that, in the expression $\det(A - \lambda I) = 0$, $\lambda$ is an element of $\mathbb{F}$, so we may substitute only scalars for $\lambda$. It turns out, however, that the corresponding matrix expression
$$A^n + a_{n-1}A^{n-1} + a_{n-2}A^{n-2} + \cdots + a_1 A + a_0 I = \mathbf{0}$$
holds true as a matrix identity. This is a celebrated theorem called the Cayley Hamilton Theorem. We state this theorem without proof and give some implications.

Cayley Hamilton Theorem: Let $A$ be a square matrix of order $n$. Then $A$ satisfies its characteristic equation. That is,
$$A^n + a_{n-1}A^{n-1} + a_{n-2}A^{n-2} + \cdots + a_1 A + a_0 I = \mathbf{0}$$
holds true as a matrix identity.
Some of the implications of the Cayley Hamilton Theorem are as follows.
Let $A$ be a square matrix of order $n$. Then its characteristic polynomial is $p(\lambda) = \lambda^n + a_{n-1}\lambda^{n-1} + \cdots + a_1\lambda + a_0$, where $a_0, a_1, \ldots, a_{n-1}\in\mathbb{F}$ are as above. Suppose we want to compute $A^{\ell}$ for some $\ell$ that is large compared to $n$. By the division algorithm, there exist numbers $\alpha_0, \alpha_1, \ldots, \alpha_{n-1}$ and a polynomial $f(\lambda)$ such that
$$\lambda^{\ell} = f(\lambda)\, p(\lambda) + \alpha_0 + \alpha_1\lambda + \cdots + \alpha_{n-1}\lambda^{n-1}.$$
Hence, by the Cayley Hamilton Theorem, $p(A) = \mathbf{0}$ and therefore
$$A^{\ell} = \alpha_0 I + \alpha_1 A + \cdots + \alpha_{n-1}A^{n-1}.$$
That is, we just need to compute the powers of $A$ up to $A^{n-1}$.
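The sketch below (illustrative only; the matrix is arbitrary) verifies the Cayley Hamilton identity with `numpy.poly`, and computes a high power of $A$ through the remainder polynomial rather than by repeated multiplication.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]

# Coefficients of the characteristic polynomial, highest degree first:
# for this 2x2 matrix, p(lambda) = lambda^2 - 5*lambda - 2.
coeffs = np.poly(A)

# Cayley Hamilton: p(A) = A^2 - 5A - 2I should be the zero matrix.
p_of_A = coeffs[0] * A @ A + coeffs[1] * A + coeffs[2] * np.eye(n)
print(np.allclose(p_of_A, 0))             # True

# Compute A^10 via the remainder of lambda^10 modulo p(lambda):
# A^10 = alpha_1 * A + alpha_0 * I, so only powers up to A^(n-1) are needed.
ell = 10
remainder = np.polydiv(np.poly1d([1.0] + [0.0] * ell), np.poly1d(coeffs))[1]
alpha = remainder.coeffs                  # highest degree first, degree <= n-1
A_ell = sum(c * np.linalg.matrix_power(A, len(alpha) - 1 - i)
            for i, c in enumerate(alpha))
print(np.allclose(A_ell, np.linalg.matrix_power(A, ell)))   # True
```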
In the language of graph theory, this implication says the following: ``Let $G$ be a graph on $n$ vertices. Suppose there is no path of length $n-1$ or less from a vertex $v$ to a vertex $u$ of $G$. Then there is no path from $v$ to $u$ of any length. That is, the graph $G$ is disconnected, and $v$ and $u$ are in different components.''
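As a small illustration of that statement (a sketch only, using a hypothetical adjacency matrix), the $(v,u)$ entry of $A^k$ counts walks of length $k$, so if the $(v,u)$ entries of $A, A^2, \ldots, A^{n-1}$ are all zero, then $v$ and $u$ lie in different components.

```python
import numpy as np

# Adjacency matrix of a hypothetical graph on 4 vertices:
# component {0, 1} and component {2, 3}.
A = np.array([[0, 1, 0, 0],
              [1, 0, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]])
n = A.shape[0]

# Sum of A + A^2 + ... + A^(n-1): entry (v, u) is zero exactly when
# there is no path of length n-1 or less from v to u.
reach = sum(np.linalg.matrix_power(A, k) for k in range(1, n))
print(reach[0, 2] == 0)   # True: vertices 0 and 2 are in different components
```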
Suppose $A$ is non-singular. Then $a_0 \neq 0$ (indeed $a_0 = (-1)^n\det(A)$), and multiplying the Cayley Hamilton identity by $A^{-1}$ gives
$$A^{-1} = \frac{-1}{a_0}\bigl[A^{n-1} + a_{n-1}A^{n-2} + \cdots + a_2 A + a_1 I\bigr].$$
This matrix identity can be used to calculate the inverse. Note also that $A^{-1}$ is a linear combination of the matrices $I, A, \ldots, A^{n-1}$.
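Here is a brief numerical sketch of that formula (with an arbitrary invertible matrix, not a worked exercise from the text), again using `numpy.poly` for the coefficients $a_i$:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])
n = A.shape[0]

# np.poly returns the monic characteristic polynomial, highest degree first:
# [1, a_{n-1}, ..., a_1, a_0].
c = np.poly(A)
a0 = c[-1]

# A^{-1} = (-1/a0) * (A^{n-1} + a_{n-1} A^{n-2} + ... + a_1 I).
A_inv = (-1.0 / a0) * sum(c[i] * np.linalg.matrix_power(A, n - 1 - i)
                          for i in range(n))
print(np.allclose(A_inv, np.linalg.inv(A)))   # True
```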
Let $\lambda_1, \lambda_2, \ldots, \lambda_k$ be distinct eigenvalues of a matrix $A$, and let $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_k$ be corresponding eigenvectors. Then the set $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_k\}$ is linearly independent.

Proof. We proceed by induction on the number $m$ of eigenvectors under consideration. The result holds for $m = 1$, since an eigenvector is non-zero and a single non-zero vector is linearly independent. Let the result be true for $m$, with $1 \le m < k$. We prove the result for $m+1$. We consider the equation
$$c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_{m+1}\mathbf{x}_{m+1} = \mathbf{0} \tag{6.1.9}$$
for the unknowns $c_1, c_2, \ldots, c_{m+1}$. We have
$$\mathbf{0} = A\mathbf{0} = A\bigl(c_1\mathbf{x}_1 + \cdots + c_{m+1}\mathbf{x}_{m+1}\bigr) = c_1\lambda_1\mathbf{x}_1 + c_2\lambda_2\mathbf{x}_2 + \cdots + c_{m+1}\lambda_{m+1}\mathbf{x}_{m+1}. \tag{6.1.10}$$
From Equations (6.1.9) and (6.1.10), multiplying (6.1.9) by $\lambda_1$ and subtracting from (6.1.10), we get
$$c_2(\lambda_2 - \lambda_1)\mathbf{x}_2 + c_3(\lambda_3 - \lambda_1)\mathbf{x}_3 + \cdots + c_{m+1}(\lambda_{m+1} - \lambda_1)\mathbf{x}_{m+1} = \mathbf{0}.$$
This is an equation in the $m$ eigenvectors $\mathbf{x}_2, \ldots, \mathbf{x}_{m+1}$, so by the induction hypothesis $c_i(\lambda_i - \lambda_1) = 0$ for $2 \le i \le m+1$. But since the eigenvalues are distinct, $\lambda_i - \lambda_1 \neq 0$, and hence $c_i = 0$ for $2 \le i \le m+1$. Equation (6.1.9) then reduces to $c_1\mathbf{x}_1 = \mathbf{0}$, and since $\mathbf{x}_1 \neq \mathbf{0}$, we also get $c_1 = 0$. Thus, we have the required result. $\blacksquare$
We are thus led to the following important corollary: the eigenvectors corresponding to distinct eigenvalues of an $n\times n$ matrix $A$ are linearly independent.
As an exercise, prove the following for an $n\times n$ matrix $A$: the matrices $A$ and $A^t$ have the same set of eigenvalues, and if $\lambda$ is an eigenvalue of $A$, then $\lambda^k$ is an eigenvalue of $A^k$ for any positive integer $k$. In each case, what can you say about the eigenvectors?
Let $\lambda_1, \lambda_2, \ldots, \lambda_n$ be distinct non-zero eigenvalues of an $n\times n$ matrix $A$, with corresponding eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n$. Then show that $\{\mathbf{u}_1, \mathbf{u}_2, \ldots, \mathbf{u}_n\}$ forms a basis of $\mathbb{F}^n$, and that if $\mathbf{b} = c_1\mathbf{u}_1 + c_2\mathbf{u}_2 + \cdots + c_n\mathbf{u}_n$, then the system $A\mathbf{x} = \mathbf{b}$ has the unique solution
$$\mathbf{x} = \frac{c_1}{\lambda_1}\mathbf{u}_1 + \frac{c_2}{\lambda_2}\mathbf{u}_2 + \cdots + \frac{c_n}{\lambda_n}\mathbf{u}_n.$$
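The following sketch (illustrative only, with a hypothetical diagonalizable matrix) checks this recipe numerically: expand $\mathbf{b}$ in the eigenvector basis, divide each coefficient by the corresponding eigenvalue, and compare with a direct solve.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])          # distinct non-zero eigenvalues 4 and 2
b = np.array([1.0, 5.0])

eigvals, U = np.linalg.eig(A)       # columns of U are the eigenvectors u_i
c = np.linalg.solve(U, b)           # coordinates of b in the eigenvector basis

x = U @ (c / eigvals)               # x = sum_i (c_i / lambda_i) u_i
print(np.allclose(A @ x, b))                   # True
print(np.allclose(x, np.linalg.solve(A, b)))   # True
```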
A K Lal 2007-09-12