Markov Matrices and Fourier Series¶
In this lecture we look at Markov matrices and Fourier series, two applications of eigenvalues and projections.
Eigenvalues of A^T¶
The eigenvalues of $A$ and the eigenvalues of $A^T$ are the same:
$$(A - \lambda I)^T = A^T - \lambda I.$$
So property 10 of determinants ($\det A = \det A^T$) tells us that $\det(A - \lambda I) = \det(A^T - \lambda I)$. If $\lambda$ is an eigenvalue of $A$, then $\det(A - \lambda I) = 0$, so $\det(A^T - \lambda I) = 0$ and $\lambda$ is also an eigenvalue of $A^T$.
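As a quick numerical illustration (a minimal NumPy sketch, not from the lecture, using an arbitrary matrix), we can check that $A$ and $A^T$ have the same eigenvalues:

```python
import numpy as np

# Any square matrix works for this check; the entries here are arbitrary.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

eig_A  = np.sort_complex(np.linalg.eigvals(A))
eig_AT = np.sort_complex(np.linalg.eigvals(A.T))

print(np.allclose(eig_A, eig_AT))  # True: A and A^T have the same eigenvalues
```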
Markov Matrices¶
A matrix like
$$A = \begin{bmatrix} .1 & .01 & .3 \\ .2 & .99 & .3 \\ .7 & 0 & .4 \end{bmatrix},$$
in which all entries are non-negative and each column adds to 1, is called a Markov matrix. These requirements come from Markov matrices' use in probability. Squaring or raising a Markov matrix to a power gives us another Markov matrix.
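Here is a small NumPy sketch (assuming the example matrix above) that checks the column sums and that powers of a Markov matrix stay Markov:

```python
import numpy as np

# The 3x3 Markov matrix from the text: nonnegative entries, columns sum to 1.
A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.00, 0.4]])

print(A.sum(axis=0))                              # each column sums to 1
print((A @ A).sum(axis=0))                        # A^2 is again a Markov matrix
print(np.linalg.matrix_power(A, 5).sum(axis=0))   # so is any power of A
```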
When dealing with systems of differential equations, eigenvectors with eigenvalue 0 represented steady states. Here we're dealing with powers of matrices, and we get a steady state when $\lambda = 1$ is an eigenvalue.
The constraint that the columns add to 1 guarantees that 1 is an eigenvalue. All other eigenvalues have absolute value less than or equal to 1. Remember that (if $A$ has $n$ independent eigenvectors) the solution to $u_{k+1} = A u_k$ is
$$u_k = A^k u_0 = c_1 \lambda_1^k x_1 + c_2 \lambda_2^k x_2 + \cdots + c_n \lambda_n^k x_n.$$
If $\lambda_1 = 1$ and all the other eigenvalues have absolute value strictly less than 1, then as $k$ increases the other terms die out and $u_k$ approaches the steady state $c_1 x_1$.
Why does the fact that the columns sum to 1 guarantee that 1 is an eigenvalue? If 1 is an eigenvalue of $A$, then
$$A - I = \begin{bmatrix} -.9 & .01 & .3 \\ .2 & -.01 & .3 \\ .7 & 0 & -.6 \end{bmatrix}$$
should be singular. Since we've subtracted 1 from each diagonal entry, the sum of the entries in each column of $A - I$ is zero. But then the rows of $A - I$ add to the zero row, so the rows are linearly dependent and $A - I$ is indeed singular. The eigenvector $x_1$ corresponding to $\lambda = 1$ is in the nullspace of $A - I$.
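A hedged numerical check of this argument (again assuming the example matrix above): the columns of $A - I$ sum to zero, its determinant is essentially zero, and 1 shows up among the eigenvalues of $A$.

```python
import numpy as np

A = np.array([[0.1, 0.01, 0.3],
              [0.2, 0.99, 0.3],
              [0.7, 0.00, 0.4]])
I = np.eye(3)

print((A - I).sum(axis=0))    # each column of A - I sums to 0
print(np.linalg.det(A - I))   # ~0, so A - I is singular
print(np.linalg.eigvals(A))   # one eigenvalue is 1 (up to rounding)
```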
We're studying the equation $u_{k+1} = A u_k$, where the entries of $u_k$ describe the state of the system after $k$ steps.
For example:
$$u_{k+1} = \begin{bmatrix} u_{\text{Cal}} \\ u_{\text{Mass}} \end{bmatrix}_{k+1} = \begin{bmatrix} .9 & .2 \\ .1 & .8 \end{bmatrix} \begin{bmatrix} u_{\text{Cal}} \\ u_{\text{Mass}} \end{bmatrix}_{k}$$
assumes that there's a 90% chance that a person in California will stay in California and only a 10% chance that she or he will move, while there's a 20% chance that a Massachusetts resident will move to California. If our initial conditions are $u_0 = \begin{bmatrix} 0 \\ 1000 \end{bmatrix}$, then $u_1 = A u_0 = \begin{bmatrix} 200 \\ 800 \end{bmatrix}$.
For the next few values of k, the Massachusetts population will decrease and the California population will increase while the total population remains constant at 1000.
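The iteration itself is easy to simulate. The following NumPy sketch (with the transition matrix and initial populations read off from the description above) prints the first few states:

```python
import numpy as np

# Transition matrix: 90% of Californians stay, 20% of Massachusetts residents move to CA.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
u = np.array([0.0, 1000.0])    # (California, Massachusetts) at k = 0

for k in range(5):
    u = A @ u
    print(k + 1, u, u.sum())   # total stays at 1000 while CA grows and MA shrinks
```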
To understand the long term behavior of this system we'll need the eigenvalues and eigenvectors of
$$A = \begin{bmatrix} .9 & .2 \\ .1 & .8 \end{bmatrix}.$$
Because $A$ is a Markov matrix, $\lambda_1 = 1$ is an eigenvalue. The trace of $A$ is $.9 + .8 = 1.7 = \lambda_1 + \lambda_2$, so the second eigenvalue is $\lambda_2 = .7$.

Next we calculate the eigenvectors. For $\lambda_1 = 1$,
$$A - I = \begin{bmatrix} -.1 & .2 \\ .1 & -.2 \end{bmatrix},$$
so we choose $x_1 = \begin{bmatrix} 2 \\ 1 \end{bmatrix}$. To know how the population is distributed after a finite number of steps we look for an eigenvector corresponding to $\lambda_2 = .7$:
$$A - .7I = \begin{bmatrix} .2 & .2 \\ .1 & .1 \end{bmatrix},$$
so let $x_2 = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$.

From what we learned about difference equations we know that:
$$u_k = c_1 \lambda_1^k x_1 + c_2 \lambda_2^k x_2 = c_1 \begin{bmatrix} 2 \\ 1 \end{bmatrix} + c_2 (.7)^k \begin{bmatrix} -1 \\ 1 \end{bmatrix}.$$
When $k = 0$ this must match the initial condition:
$$c_1 \begin{bmatrix} 2 \\ 1 \end{bmatrix} + c_2 \begin{bmatrix} -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1000 \end{bmatrix},$$
so $c_1 = 1000/3$ and $c_2 = 2000/3$. As $k$ increases, the $(.7)^k$ term dies out and the population approaches the steady state $c_1 x_1 = \begin{bmatrix} 2000/3 \\ 1000/3 \end{bmatrix}$.
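A short NumPy sketch (assuming the eigenvectors chosen above) confirms the coefficients and the steady state:

```python
import numpy as np

A = np.array([[0.9, 0.2],
              [0.1, 0.8]])
u0 = np.array([0.0, 1000.0])

# Solve u0 = c1*[2,1] + c2*[-1,1] for the coefficients c1, c2.
X = np.array([[2.0, -1.0],
              [1.0,  1.0]])        # columns are the eigenvectors x1, x2
c = np.linalg.solve(X, u0)
print(c)                           # [1000/3, 2000/3]

# The steady state is c1 * x1; a large power of A gives the same answer.
print(c[0] * X[:, 0])                        # ~[666.67, 333.33]
print(np.linalg.matrix_power(A, 100) @ u0)   # ~[666.67, 333.33]
```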
In some applications Markov matrices are defined differently: their rows add to 1 rather than their columns. In this case, the calculations are the transpose of everything we've done here.
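A minimal sketch of that row-form convention, assuming NumPy and reusing the two-state example above with rows and columns swapped:

```python
import numpy as np

# A row-Markov matrix: rows (not columns) sum to 1.
M = np.array([[0.9, 0.1],
              [0.2, 0.8]])

print(M.sum(axis=1))     # rows sum to 1
print(M.T.sum(axis=0))   # so M.T is a (column) Markov matrix

# The steady state is now a row vector pi with pi @ M = pi,
# i.e. an eigenvector of M.T with eigenvalue 1.
w, V = np.linalg.eig(M.T)
pi = np.real(V[:, np.argmax(np.real(w))])
print(pi / pi.sum())     # ~[2/3, 1/3], matching the earlier example
```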
Fourier Series and Projections¶
Expansion with an orthonormal basis¶
If we have an orthonormal basis $q_1, q_2, \ldots, q_n$, then we can expand any vector $v$ as
$$v = x_1 q_1 + x_2 q_2 + \cdots + x_n q_n,$$
where the coefficient $x_i$ is found by taking the inner product of both sides with $q_i$:
$$q_i^T v = x_1 q_i^T q_1 + \cdots + x_n q_i^T q_n = x_i,$$
because $q_i^T q_j = 0$ for $i \neq j$ and $q_i^T q_i = 1$. Since $Q^T Q = I$, we can say the same thing with matrices: $x = Q^{-1} v = Q^T v$.
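A small NumPy sketch (using an arbitrary orthonormal basis produced by a QR factorization) illustrates recovering the coefficients as $x = Q^T v$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build an arbitrary orthonormal basis of R^4 via a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
v = rng.standard_normal(4)

x = Q.T @ v                     # coefficients x_i = q_i^T v
print(np.allclose(Q @ x, v))    # True: the expansion reconstructs v
```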
Fourier series¶
The key idea above was that the basis of vectors $q_1, \ldots, q_n$ was orthonormal. Fourier series are built on the same idea: we describe a function $f(x)$ in terms of trigonometric functions,
$$f(x) = a_0 + a_1 \cos x + b_1 \sin x + a_2 \cos 2x + b_2 \sin 2x + \cdots.$$
This Fourier series is an infinite sum and the previous example was finite, but the two are related by the fact that the cosines and sines in the Fourier series are orthogonal.
We're now working in an infinite dimensional vector space. The vectors in this space are functions and the (orthogonal) basis vectors are $1, \cos x, \sin x, \cos 2x, \sin 2x, \ldots$
What does "orthogonal" mean in this context? How do we compute a dot product or inner product in this vector space? For vectors in $\mathbb{R}^n$ the inner product is $v^T w = v_1 w_1 + v_2 w_2 + \cdots + v_n w_n$. Functions are described by their values $f(x)$ at infinitely many points $x$, so the natural analogue replaces the sum by an integral of the product of the two functions:
$$f^T g = \int f(x)\, g(x)\, dx.$$
We integrate from 0 to $2\pi$ because the basis functions are periodic with period $2\pi$:
$$f^T g = \int_0^{2\pi} f(x)\, g(x)\, dx.$$
The inner product of two different basis vectors is 0, as desired. For example,
$$\int_0^{2\pi} \sin x \cos x \, dx = \frac{1}{2}\sin^2 x \,\Big|_0^{2\pi} = 0.$$
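We can also check this orthogonality numerically. The sketch below (a midpoint-rule approximation of the integral, not part of the lecture) treats the inner product as a finite sum:

```python
import numpy as np

# Midpoint-rule approximation of the inner product f^T g = ∫_0^{2π} f(x) g(x) dx.
def inner(f, g, n=200000):
    x = (np.arange(n) + 0.5) * (2 * np.pi / n)
    return np.sum(f(x) * g(x)) * (2 * np.pi / n)

print(inner(np.sin, np.cos))                  # ~0: sin x and cos x are orthogonal
print(inner(np.cos, lambda x: np.cos(2*x)))   # ~0: cos x and cos 2x are orthogonal
print(inner(np.cos, np.cos))                  # ~π: cos x is orthogonal to the others but not unit length
```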
How do we find the Fourier coefficients $a_k$ and $b_k$ of a function $f(x)$ in this space? The constant term $a_0$ is the average value of $f(x)$. To find $a_1$, we take the inner product of both sides of the expansion $f(x) = a_0 + a_1 \cos x + b_1 \sin x + \cdots$ with $\cos x$. Because the basis functions are orthogonal, every term on the right except the $a_1$ term vanishes:
$$\int_0^{2\pi} f(x) \cos x \, dx = a_1 \int_0^{2\pi} \cos^2 x \, dx = a_1 \pi.$$
We conclude that $a_1 = \frac{1}{\pi} \int_0^{2\pi} f(x) \cos x \, dx$. Similar calculations give the other coefficients.
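A final NumPy sketch (with a test function chosen so that its Fourier coefficients are known exactly, an assumption of this example rather than part of the lecture) recovers $a_0$, $a_1$, $b_1$, and $a_2$ from these integral formulas:

```python
import numpy as np

def inner(f, g, n=200000):
    # Midpoint-rule approximation of ∫_0^{2π} f(x) g(x) dx.
    x = (np.arange(n) + 0.5) * (2 * np.pi / n)
    return np.sum(f(x) * g(x)) * (2 * np.pi / n)

# Test function built from the basis, so we know its coefficients exactly:
# a0 = 2, a1 = 1, b1 = 3, a2 = 0.5.
f = lambda x: 2.0 + 1.0 * np.cos(x) + 3.0 * np.sin(x) + 0.5 * np.cos(2 * x)

a0 = inner(f, lambda x: np.ones_like(x)) / (2 * np.pi)   # average value: ~2
a1 = inner(f, np.cos) / np.pi                            # ~1
b1 = inner(f, np.sin) / np.pi                            # ~3
a2 = inner(f, lambda x: np.cos(2 * x)) / np.pi           # ~0.5
print(a0, a1, b1, a2)
```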