Suppose we have a Markov chain: a sequence of random states in which the next state depends only on the current state, not on any past steps. One example is a frog jumping around on the x-axis, moving one unit in either direction at each step. At any position the frog doesn’t care how it got there; it simply moves left or right based on where it currently is. For another example, imagine that the axis is now bounded. This is still a Markov chain, because even at the endpoints the frog doesn’t care about its past; it just happens that only one direction is available.
We can represent a Markov chain using a matrix by letting the entry in the $i$th row and $j$th column be $p_{ij}$, the probability of getting to state $j$ from $i$. This matrix $P$ is called the transition matrix, and by repeatedly multiplying $P$ by itself we get the probabilities $p_{ij}^{(t)}$, the probability of getting to state $j$ from $i$ in $t$ steps: these are exactly the entries of $P^t$.
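As a concrete illustration (a small sketch of my own, not from the text), here is the transition matrix of the bounded frog on three positions, with its two-step probabilities obtained by squaring:

```python
import numpy as np

# Transition matrix for a frog on the bounded axis {0, 1, 2}:
# at the ends it must move inward; in the middle it moves
# left or right with probability 1/2 each.
P = np.array([
    [0.0, 1.0, 0.0],
    [0.5, 0.0, 0.5],
    [0.0, 1.0, 0.0],
])

# Entry (i, j) of P^t is the probability of going from state i
# to state j in exactly t steps.
P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 0])  # probability of returning to 0 in two steps: 0.5
```

Note that each row of $P$ (and of any power of $P$) sums to 1, since the frog must go somewhere.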
Exercise: Verify the above statement.
As $t \to \infty$, the rows of $P^t$ approach the stationary distribution $\pi$, which records the long-run probability of being in each state no matter where we started. If the chain is ergodic this limit exists, and for the symmetric walks we consider it is uniform, so every entry of $P^t$ tends to $1/n$. However, we would like to measure how long this convergence to the stationary distribution takes. Taken literally this time would just be infinite, since the entries approach $\pi$ without ever exactly reaching it, so instead we consider how long it takes until the variation distance is sufficiently small. Mathematicians usually consider that to be when the largest possible difference between an entry of the current transition matrix $P^t$ and the stationary distribution is not greater than $1/4$. The time it takes for this to occur, i.e. the smallest such exponent $t$ of the transition matrix, is called the mixing time.
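To make the definition concrete, here is a sketch (function name and example chain are my own; the bounded three-state frog is periodic and never converges, so I use a small ergodic two-state chain instead) that finds the smallest exponent $t$ for which every entry of $P^t$ is within the conventional threshold $1/4$ of the stationary distribution:

```python
import numpy as np

def mixing_time(P, pi, eps=0.25):
    """Smallest t with max |P^t[i, j] - pi[j]| <= eps over all i, j.
    (An entrywise criterion matching the definition in the text;
    eps = 1/4 is the usual convention.)"""
    Pt = P.copy()
    t = 1
    while np.max(np.abs(Pt - pi)) > eps:
        Pt = Pt @ P
        t += 1
    return t

# A small ergodic chain with stationary distribution (2/3, 1/3).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2/3, 1/3])
print(mixing_time(P, pi))  # prints 3
```

The subtraction `Pt - pi` broadcasts $\pi$ against every row, so the loop stops exactly when all rows are uniformly close to the stationary distribution.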
Mixing Times on Cyclic Groups
In this section, we consider a reversible ergodic Markov chain with a generating set $S$ on the cyclic group $\mathbb{Z}/n\mathbb{Z}$. What this means is that on the residues modulo $n$, we have a frog start from 0, and at every step it picks some $s \in S$ and jumps forward or backward $s$ units. For example, if 1 is a generator, the frog could move to $1$ or $n-1$ on the first step. Our question is:
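The walk just described can be built explicitly (a sketch with my own naming):

```python
import numpy as np

def cycle_walk(n, gens):
    """Transition matrix of the reversible walk on Z/nZ: from x,
    jump to x + s or x - s (mod n) for s chosen uniformly from
    gens, each direction with equal weight."""
    P = np.zeros((n, n))
    for x in range(n):
        for s in gens:
            P[x, (x + s) % n] += 1 / (2 * len(gens))
            P[x, (x - s) % n] += 1 / (2 * len(gens))
    return P

P = cycle_walk(5, [1])
print(P[0])  # from 0 the frog moves to 1 or 4, each with prob 1/2
```

Because every jump $+s$ from $x$ to $y$ is matched by a jump $-s$ from $y$ to $x$, this matrix is symmetric, which is the reversibility in matrix form.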
What is the generating set that gives us the highest and lowest mixing times?
It turns out that the mixing time is bounded in terms of the second-largest eigenvalue $\lambda_2$ of the transition matrix. Specifically, this bound is

$$t_{\mathrm{mix}}(\epsilon) \le \frac{1}{1 - \lambda_2} \log\left(\frac{n}{\epsilon}\right),$$

where $\epsilon$ is usually $1/4$. In order to find the best and worst mixing times, we just find the generating sets that minimize and maximize $\lambda_2$. This isn’t exactly correct, but it works for the most part because the bound is quite tight.
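To explore this question numerically, here is a brute-force sketch (all names are mine, and the choice of $n = 11$ is arbitrary) that computes the second-largest eigenvalue for every 2-element generating set of a small cyclic group:

```python
import itertools
import numpy as np

def second_eigenvalue(n, gens):
    """lambda_2 of the reversible walk on Z/nZ with the given
    generating set: jump +/- s (mod n), s uniform over gens."""
    P = np.zeros((n, n))
    for x in range(n):
        for s in gens:
            P[x, (x + s) % n] += 1 / (2 * len(gens))
            P[x, (x - s) % n] += 1 / (2 * len(gens))
    eig = np.linalg.eigvalsh(P)   # ascending; P is symmetric
    return eig[-2]                # largest eigenvalue below the trivial 1

# Brute force over 2-element generating sets of Z/11Z to see
# which choices give small (fast mixing) or large (slow mixing)
# second eigenvalues.
n = 11
results = {S: second_eigenvalue(n, S)
           for S in itertools.combinations(range(1, n), 2)}
best = min(results, key=results.get)
print(best, round(results[best], 4))
```

As a sanity check, the set $\{1, 10\}$ collapses to the simple $\pm 1$ walk (since $10 \equiv -1 \bmod 11$), whose second eigenvalue is $\cos(2\pi/11)$.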
Now how do we find the eigenvalues of the transition matrix $P$? Well, if we try the eigenfunction $f_j(x) = \omega^{jx}$, then since $(P f_j)(x) = \frac{1}{2|S|} \sum_{s \in S} \left( f_j(x+s) + f_j(x-s) \right)$, we get

$$(P f_j)(x) = \frac{1}{2|S|} \sum_{s \in S} \left( \omega^{js} + \omega^{-js} \right) f_j(x),$$

where $\omega = e^{2\pi i/n}$ is a primitive $n$th root of unity. But note that

$$\omega^{js} + \omega^{-js} = 2\cos\left(\frac{2\pi js}{n}\right),$$

so if we define

$$\lambda_j = \frac{1}{|S|} \sum_{s \in S} \cos\left(\frac{2\pi js}{n}\right),$$

then we have $P f_j = \lambda_j f_j$, which gives us our eigenvalues. Noting that $\cos\theta = 1 - 2\sin^2(\theta/2)$, so that $\lambda_j = 1 - \frac{2}{|S|} \sum_{s \in S} \sin^2\left(\frac{\pi js}{n}\right)$, it suffices to find the maximum and minimum values of the sum of squares $\sum_{s \in S} \sin^2\left(\frac{\pi js}{n}\right)$ over $j \neq 0$ for large enough $n$.
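As a sanity check, the standard eigenvalue formula for these circulant walks, $\lambda_j = \frac{1}{|S|}\sum_{s \in S} \cos(2\pi js/n)$, can be compared against eigenvalues computed directly from the matrix (an illustrative sketch with an arbitrary choice of $n$ and generators):

```python
import numpy as np

n, gens = 12, [1, 5]

# Transition matrix of the walk: jump +/- s (mod n), s uniform.
P = np.zeros((n, n))
for x in range(n):
    for s in gens:
        P[x, (x + s) % n] += 1 / (2 * len(gens))
        P[x, (x - s) % n] += 1 / (2 * len(gens))

# lambda_j = (1/|S|) * sum over s of cos(2*pi*j*s/n)
predicted = sorted(
    sum(np.cos(2 * np.pi * j * s / n) for s in gens) / len(gens)
    for j in range(n))
actual = sorted(np.linalg.eigvalsh(P))
print(np.allclose(predicted, actual))  # True
```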
Let’s assume that this sum can equal 1, and we have the relation
for . This may be a little strange when $n$ is composite, but for now we’ll ignore that and carry along an extra term. Then we have
so in order to have a solution we need the coefficient to be a quadratic residue. Naturally, this is also hard to pin down, because quadratic residues modulo a general $n$ are difficult to characterize. Therefore we turn to an easy alternative: if this coefficient is itself a perfect square, then it is obviously a quadratic residue!
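As an aside, for a prime modulus one can at least test quadratic residuosity quickly with Euler’s criterion (a standalone sketch, not tied to the specific coefficient above):

```python
def is_quadratic_residue(a, p):
    """Euler's criterion: for an odd prime p and a not divisible
    by p, a is a square mod p iff a^((p-1)/2) == 1 (mod p)."""
    return pow(a, (p - 1) // 2, p) == 1

# The nonzero squares mod 11 are {1, 3, 4, 5, 9}.
print(sorted(a for a in range(1, 11) if is_quadratic_residue(a, 11)))
```

This only settles the prime case; for composite moduli the problem is genuinely harder, which is why the perfect-square shortcut above is attractive.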
Suppose that this square is . Substituting gives
so we immediately get integer solutions
And substituting this value of back gives
which gives us a direct solution. Some finer details still need to be hammered out, such as the solution possibly not being coprime to $n$, but notice that we can always choose other rational values to produce different solutions. This concludes the construction of a generating set that yields the minimum. Further extensions can be done similarly, and are listed below as exercises for the reader.
Exercise: Find a generating set that maximizes the sum of squares.
Exercise: Here we tackled the reversible case. What are the minimum and maximum for the general case, where we can jump forward but not backward?
My research focuses on the general case, and if you wish to talk more about this topic, feel free to use the email provided on the team page.
Thanks for reading!