By Kevin Xu
Introduction
This article covers the basics of random walks on groups, and may skip or skim over prerequisite information. It is advisable to look up articles on those topics before reading this one.
Recall the definition of a group:
Definition 1.1: A group $G$ is a set closed under a binary operation $\cdot$ satisfying
- Identity: There exists $e \in G$ such that $e \cdot g = g \cdot e = g$ for all $g \in G$.
- Inverse: For every $g \in G$, there exists $g^{-1} \in G$ such that $g \cdot g^{-1} = g^{-1} \cdot g = e$.
- Associativity: For every $a, b, c \in G$, we have $(a \cdot b) \cdot c = a \cdot (b \cdot c)$.
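To make the axioms concrete, here is a small Python sketch (my own illustration, not from any particular library) that checks closure and the three axioms for a finite set with a given operation:

```python
from itertools import product

def is_group(elements, op):
    """Check closure and the three group axioms for a finite set
    with binary operation op."""
    # Closure: op(a, b) stays inside the set.
    if any(op(a, b) not in elements for a, b in product(elements, repeat=2)):
        return False
    # Identity: some e with e*g == g == g*e for all g.
    e = next((c for c in elements
              if all(op(c, g) == g == op(g, c) for g in elements)), None)
    if e is None:
        return False
    # Inverse: every g has some h with g*h == h*g == e.
    if not all(any(op(g, h) == e == op(h, g) for h in elements)
               for g in elements):
        return False
    # Associativity: (a*b)*c == a*(b*c) for all triples.
    return all(op(op(a, b), c) == op(a, op(b, c))
               for a, b, c in product(elements, repeat=3))

n = 6
print(is_group(set(range(n)), lambda a, b: (a + b) % n))  # True for Z/6Z
```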
Subsets of groups that are also groups are called subgroups. For example, the set containing just the identity is the trivial subgroup, and a group is a subgroup of itself. Because subgroups are closed under the group operation, we can split a group $G$ into cosets of a subgroup $H$, where each coset is $gH = \{gh : h \in H\}$ for elements $g \in G$. Different elements $g$ can produce the same coset, but cosets never partially overlap: two cosets are either mutually exclusive or equal as sets.
Exercise 1.2: For $g_1, g_2 \in G$, prove that for any subgroup $H$ the cosets $g_1H$ and $g_2H$ are either the same set or contain no overlap.
Hint: What happens when we have $g_1h_1 = g_2h_2$ for $h_1, h_2 \in H$?
In fact, $gH$ is actually called a left coset, and $Hg$ would be a right coset. However, we usually talk about cosets of normal subgroups, for which $gH$ and $Hg$ are the same set. Note that this is equivalent to $H$ being closed under conjugation (i.e. subgroup $H$ is normal if and only if $gHg^{-1} = H$ for all $g \in G$). It turns out that the set of cosets of a normal subgroup forms another group called the quotient group. One familiar quotient group is the group of remainders when dividing by $n$, or $\mathbb{Z}/n\mathbb{Z}$, which is a quotient group of the integers under addition.
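As a quick sanity check of the coset picture (a sketch of my own, using $\mathbb{Z}/6\mathbb{Z}$ and its subgroup $\{0, 3\}$), the cosets below come out disjoint and cover the group, and together they form a quotient group of order $3$:

```python
n, H = 6, [0, 3]                     # subgroup {0, 3} of Z/6Z under addition
op = lambda a, b: (a + b) % n

# Collect the left cosets g + H; equal cosets collapse inside the set.
cosets = {frozenset(op(g, h) for h in H) for g in range(n)}
print(sorted(sorted(c) for c in cosets))
# [[0, 3], [1, 4], [2, 5]] -- disjoint cosets partitioning Z/6Z, and the
# quotient group Z/6Z / {0, 3} of order 3 behaves like Z/3Z.
```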
Random Walks
Now that we’ve introduced groups, let’s talk about probability. When we usually think of probability, we think of events such as rolling a die or flipping a coin, and associate a value from $0$ to $1$ to that event. We can do the same with groups, by using a function to associate a probability to each element of a group.
Definition 1.3: A probability distribution on a group $G$ is a function $P : G \to [0, 1]$ such that $\sum_{g \in G} P(g) = 1$.
Essentially, the probability of picking element $g$ is $P(g)$. Below are several examples of probability distributions:
Flipping a fair coin can be represented as a distribution on $\mathbb{Z}/2\mathbb{Z}$, or the remainders modulo $2$, where both probabilities are $\frac{1}{2}$. Rolling a fair die is a distribution on $\mathbb{Z}/6\mathbb{Z}$, again with equal probabilities for each element. When we have equal probabilities, or $P(g) = \frac{1}{|G|}$ for every $g \in G$, we say the distribution is uniform. Here, $|G|$ is the order of $G$, which is the number of elements in the group.
A game where you can move forward or backward at every step with equal probability is best represented by $\mathbb{Z}$, the integers under addition, with distribution $P(1) = P(-1) = \frac{1}{2}$ and $P(g) = 0$ otherwise. In these non-uniform distributions, the elements that have probability zero are not important as they can never occur, so in many cases we want to work with the support of $P$, or the set of elements that have positive probability.
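In code, a distribution on a group with finitely many relevant elements can simply map elements to probabilities; below is a minimal sketch (names are my own) covering the coin flip and the forward/backward step, together with the support:

```python
from fractions import Fraction

# Fair coin on Z/2Z, and the forward/backward step distribution on Z
# (every unlisted element implicitly has probability 0).
coin = {0: Fraction(1, 2), 1: Fraction(1, 2)}
step = {+1: Fraction(1, 2), -1: Fraction(1, 2)}

def support(P):
    """The set of elements with positive probability."""
    return {g for g, p in P.items() if p > 0}

assert sum(coin.values()) == 1 and sum(step.values()) == 1
print(support(step))  # {1, -1}
```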
Now, just like we can construct a sequence of coin flips or dice rolls, we can construct a sequence of group elements by starting from the identity and then multiplying by an element $g$ with probability $P(g)$ at every step.
Definition 1.4: A random walk on a group $G$ along with its associated probability distribution $P$ is a sequence of elements $X_0, X_1, X_2, \ldots$ starting from $X_0 = e$ such that the probability of $X_t$ moving to $X_{t+1} = X_tg$ is $P(g)$ for each $g \in G$.
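Definition 1.4 translates directly into a simulation. A sketch under the dictionary representation above (the helper name `random_walk` is my own):

```python
import random

def random_walk(P, op, identity, steps):
    """One trajectory: start at the identity and multiply by a
    P-distributed element at each step."""
    elements, weights = list(P), [float(p) for p in P.values()]
    x, trajectory = identity, [identity]
    for _ in range(steps):
        g = random.choices(elements, weights)[0]  # draw g with probability P(g)
        x = op(x, g)                              # X_{t+1} = X_t * g
        trajectory.append(x)
    return trajectory

# The forward/backward walk on the integers.
print(random_walk({+1: 0.5, -1: 0.5}, lambda a, b: a + b, 0, steps=10))
```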
A random walk is called irreducible if the probability of getting to $y$ from $x$ is positive for all $x, y \in G$, meaning we can always get from any element to any other element. A random walk is called aperiodic if for every time $t \geq T$, where $T$ is some fixed threshold, the probability of reaching any given element at time $t$ is positive, meaning we can reach any state at any time, provided the time is large enough. When both conditions are satisfied, we say that the walk is ergodic. Below are a few examples of random walks on groups:
Example 1.5: A random walk on the plane moving one unit in any direction parallel to the axes can be represented as a random walk on $\mathbb{Z}^2$ with the distribution defined by $P((\pm 1, 0)) = P((0, \pm 1)) = \frac{1}{4}$.
Example 1.6: Shuffling cards is a random walk on $S_{52}$, the symmetric group on the $52$ cards. The probability distribution depends on what method is used to shuffle the cards.
One easy albeit inefficient way to shuffle a deck is by transposition, or switching the positions of two not necessarily distinct cards in the deck. Choosing the two positions independently and uniformly, this can be represented by the distribution
$$P(\sigma) = \begin{cases} \frac{1}{52} & \text{if } \sigma = e, \\ \frac{2}{52^2} & \text{if } \sigma \text{ is a transposition}, \\ 0 & \text{otherwise.} \end{cases}$$
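A sampling sketch of this shuffle (my own illustration): picking two positions independently and uniformly gives the identity with probability $\frac{1}{52}$ and each transposition with probability $\frac{2}{52^2}$, matching the distribution above.

```python
import random

def transposition_step(deck):
    """One step of the random-transposition walk on S_52: swap two
    independently chosen positions (which may coincide)."""
    i = random.randrange(len(deck))
    j = random.randrange(len(deck))   # i == j leaves the deck fixed
    deck[i], deck[j] = deck[j], deck[i]

deck = list(range(52))
for _ in range(200):                  # many steps to mix the deck
    transposition_step(deck)
print(deck[:10])                      # first ten cards of the shuffled deck
```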
Exercise 1.7: Are Examples 1.5 and 1.6 ergodic random walks?
In standard probability, as long as we repeat an experiment a large number of times, our sequence will “converge” to some value. For example, a fair coin will always converge to a 50-50 distribution after a large number of flips, no matter what the first few flips are.
In the same way, as long as we take a large number of steps in any random walk, we also approach a “stationary distribution” consisting of the probabilities of landing on each element. But to find a stationary distribution, we need to find the probability of being at element $g$ after a long time $t$, which is quite hard to compute directly. Thus we define a convolution recursively by
$$P^{*2}(g) = (P * P)(g) = \sum_{h \in G} P(h)P(h^{-1}g)$$
and
$$P^{*(t+1)}(g) = (P^{*t} * P)(g) = \sum_{h \in G} P^{*t}(h)P(h^{-1}g).$$
Basically, $P^{*2}(g)$ is the sum over all instances of first rolling an element $h$ and then rolling $h^{-1}g$, which results in being at element $g$ after two steps. The recursion then follows inductively: $P^{*t}(g)$ is the probability of being at $g$ after $t$ steps.
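On a finite group the convolution is a direct double sum. Here is a sketch for $\mathbb{Z}/n\mathbb{Z}$ (the helper is my own; under addition, $h^{-1}g$ becomes $(g - h) \bmod n$):

```python
from fractions import Fraction

def convolve(P, Q, n):
    """(P * Q)(g) = sum over h of P(h) Q(h^{-1} g) on Z/nZ,
    where h^{-1} g is (g - h) mod n."""
    return {g: sum(P.get(h, 0) * Q.get((g - h) % n, 0) for h in range(n))
            for g in range(n)}

n = 6
P = {1: Fraction(1, 2), 5: Fraction(1, 2)}  # step +1 or -1 on Z/6Z
Pt = P
for _ in range(3):                          # compute P^{*2}, P^{*3}, P^{*4}
    Pt = convolve(Pt, P, n)
print(Pt)                                   # distribution after four steps
```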
The convolutions $P^{*t}$ will converge to a unique stationary distribution $\pi$ when the random walk is ergodic (and in fact, $\pi$ will be uniform on $G$). However, ergodicity is also somewhat hard to establish directly.
Theorem 1.8: For a group $G$ and a probability distribution $P$ on it, the random walk on $G$ will be ergodic if and only if the support of $P$ is not contained in a proper subgroup of $G$ or in a coset of a proper normal subgroup of $G$.
Proof:
Here, a proper subgroup means one not equal to $G$ itself. Let the support of $P$ be $\Sigma$. First suppose that the random walk is irreducible, so starting from $e$ we can arrive at any element of $G$. Because we can only multiply by elements of $\Sigma$, in order to reach every element the support must contain a generating set for $G$. Thus $\Sigma$ generates $G$. But now, if $\Sigma$ were contained in a proper subgroup of $G$, then due to closure the set generated by $\Sigma$, or $\langle\Sigma\rangle$, would be contained within that subgroup. This is a contradiction, so $\Sigma$ cannot be contained within a proper subgroup.
Conversely, suppose $\Sigma$ is not contained in any proper subgroup. The set generated by $\Sigma$ is closed and contains inverses, so it must be a subgroup. However, it cannot be a proper subgroup, so it must be all of $G$. Thus $\Sigma$ generates $G$. Then any element can be written as a product of elements of $\Sigma$, so we can always get from an element $x$ to an element $y$ by multiplying by the corresponding representation of $x^{-1}y$. Therefore the walk is irreducible.
Now suppose that $\Sigma$ is contained in some coset $gN$ of a proper normal subgroup $N$. Then the first $t$ steps can be expressed as multiplying by $gn_1, gn_2, \ldots, gn_t$ in order. Thus we arrive at their product $gn_1gn_2 \cdots gn_t$ after $t$ steps. Now note that
$$gn_1gn_2 \cdots gn_t = g^2(g^{-1}n_1g)n_2 \cdot gn_3 \cdots gn_t$$
because of associativity and normality ($g^{-1}n_1g \in N$). By carrying out this process multiple times, we see that the product becomes $g^tn$ for some product $n$ of elements of $N$. Closure tells us that $n$ must lie in $N$, and therefore the product lies in the coset $g^tN$. Since $N$ is a proper normal subgroup, we can take an element of the nonempty set $G \setminus g^tN$, and observe that we cannot arrive at this element in $t$ steps. Because $t$ was chosen arbitrarily, the random walk cannot be aperiodic.
Conversely, suppose the random walk is not aperiodic. Since the walk is irreducible, this means that we will arrive at any given element periodically, with some interval $d > 1$. We can express this formally as the common factor
$$d = \gcd\{t \geq 1 : P^{*t}(e) > 0\} > 1.$$
This is because two sequences will arrive at the same element at different steps only if one of them multiplies by a representation of the identity along the way. Now define
$$N_i = \{x \in G : P^{*t}(x) > 0 \text{ for some } t \equiv i \pmod{d}\}.$$
We prove that these sets do not overlap with each other. First note that if $x \in N_i$ and $y \in N_j$, then $xy$ is the product of $t \equiv i + j \pmod{d}$ elements of $\Sigma$ and thus lies in $N_{i+j}$, which further implies $x^{-1} \in N_{-i}$, since a path from $e$ to $x$ followed by a path from $x$ back to $e$ is a representation of the identity. Now suppose we have overlap between two sets, i.e. $x \in N_i \cap N_j$. Rearranging, this is
$$e = x^{-1}x \in N_{-i}N_j \subseteq N_{j-i}.$$
By the definition of $d$ this product must lie in $N_0$, so we have $j \equiv i \pmod{d}$, which implies $N_i = N_j$. Therefore $N_i$ and $N_j$ have no common elements unless $i$ and $j$ are equivalent modulo $d$. Lastly, note that $N_0$ is a normal subgroup of $G$: if $x, y \in N_0$, then $xy \in N_0$ and $x^{-1} \in N_0$, and for any element $g \in G$ we have $gN_0g^{-1} \subseteq N_0$. Since $d > 1$, the disjoint sets $N_0$ and $N_1$ show that $N_0$ is proper. Therefore, since $\Sigma$ is a subset of $N_1 = xN_0$ for any fixed $x \in N_1$, it is contained in a coset of the proper normal subgroup $N_0$. $\square$
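Theorem 1.8 can be sanity-checked numerically with the convolution sketch above: on $\mathbb{Z}/6\mathbb{Z}$, a support inside the proper subgroup $\{0, 2, 4\}$ breaks irreducibility, a support inside its coset $\{1, 3, 5\}$ breaks aperiodicity, and a support fitting in neither mixes to uniform (the helper name is mine):

```python
def power_convolve(P, t, n):
    """P^{*t} on Z/nZ by repeated convolution."""
    Pt = dict(P)
    for _ in range(t - 1):
        Pt = {g: sum(Pt.get(h, 0) * P.get((g - h) % n, 0) for h in range(n))
              for g in range(n)}
    return Pt

n = 6
for supp in [{2}, {1, 5}, {1, 2}]:
    P = {g: 1 / len(supp) for g in supp}
    Pt = power_convolve(P, 50, n)
    print(sorted(supp), {g: round(p, 3) for g, p in Pt.items() if p > 0})
# {2}:    mass never leaves the subgroup {0, 2, 4}      (not irreducible)
# {1, 5}: mass alternates between cosets of {0, 2, 4}   (not aperiodic)
# {1, 2}: every element approaches 1/6 = 0.167          (ergodic)
```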
Besides wanting to know what a distribution will converge to, we also want to find out how many steps we need to get a close approximation of the stationary distribution.
Definition 1.9: Total variation measures how much a distribution deviates from its stationary distribution after a certain number of steps, and is calculated by
$$\|P^{*t} - \pi\|_{\mathrm{TV}} = \frac{1}{2}\sum_{g \in G} |P^{*t}(g) - \pi(g)|.$$
When the total variation is small, commonly measured against some small threshold $\varepsilon$, we say that the random walk is sufficiently mixed. The mixing time, $t_{\mathrm{mix}}$, is the minimum time $t$ for which a random walk is mixed.
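Using the `power_convolve` sketch above, total variation against the uniform stationary distribution and the resulting mixing time can be estimated directly; the threshold $\varepsilon = 0.25$ below is a conventional choice of mine, not a value from this article:

```python
def total_variation(P, n):
    """(1/2) * sum over g of |P(g) - 1/n|, against the uniform distribution."""
    return 0.5 * sum(abs(P.get(g, 0) - 1 / n) for g in range(n))

def mixing_time(P, n, eps=0.25):
    """Smallest t with || P^{*t} - uniform ||_TV < eps."""
    t = 1
    while total_variation(power_convolve(P, t, n), n) >= eps:
        t += 1
    return t

n = 6
print(mixing_time({1: 0.5, 2: 0.5}, n))  # steps until sufficiently mixed
```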
Conclusion
It is always nice to find the mixing time, or better bounds on it, for a random walk in order to have more efficient processes. For example, Persi Diaconis and Mehrdad Shahshahani proved in 1980 that it takes on the order of $\frac{1}{2}n\log n$ transpositions on an $n$-card deck (roughly $100$ for a $52$-card deck) to make the total variation small. In comparison, the popular riffle shuffle takes about $7$ shuffles to ensure the same “randomness.” Many real-life events can be expressed as random walks on groups, and I encourage you to explore them yourself and find their stationary distributions and mixing times!
References
- [Dia87] Persi Diaconis. Random Walks on Groups: Characters and Geometry. Stanford University, 1987.
- [Dia12] Persi Diaconis. Group Representations in Probability and Statistics. Harvard University, 2012.
- [DS80] Persi Diaconis and Mehrdad Shahshahani. Generating a random permutation with random transpositions. Stanford University, 1980.
- [Out15] Nolan Outlaw. Markov Chains, Random Walks, and Card Shuffling. Illinois Institute of Technology, 2015.