Lecture 32: Markov Chains Continued | Statistics 110 | Summary and Q&A

TL;DR
Random walks on undirected networks are reversible Markov chains, and their stationary distribution is proportional to the degrees of the nodes in the network.
Key Insights
- 🏃 Reversible Markov chains have the property that if the chain is run in reverse, it will look the same as running it forward in time.
- 🚶 Random walks on undirected networks are an example of reversible Markov chains.
- ⛓️ For a random walk on an undirected network, the stationary distribution is proportional to the degrees of the nodes.
Transcript
Okay, so I'll remind you what a Markov chain is, because it's been a week. A Markov chain, in a certain sense, is kind of a memoryless type of thing, in that you're imagining this particle bouncing around from state to state, and it doesn't remember how it got there. It's memoryless, not in quite the same sense as the exponential, but it's memoryless in the sen...
Questions & Answers
Q: What is a reversible Markov chain?
A reversible Markov chain is one that, when started from its stationary distribution and run backwards in time, looks statistically the same as the chain run forward in time. This property is also known as reversibility or time reversibility.
Q: How can we determine if a Markov chain is reversible?
A Markov chain is reversible if there exists a probability vector s such that s_i q_ij = s_j q_ji for all states i and j, where q_ij is the probability of transitioning from state i to state j. This is called the detailed-balance condition, and any s satisfying it is automatically a stationary distribution.
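The detailed-balance condition can be checked numerically. A minimal sketch, assuming NumPy; the 3-state matrix `Q` below is a hypothetical example, not one from the lecture:

```python
import numpy as np

# Hypothetical 3-state transition matrix (symmetric, so reversible w.r.t. uniform s).
Q = np.array([[0.5, 0.25, 0.25],
              [0.25, 0.5, 0.25],
              [0.25, 0.25, 0.5]])

def is_reversible(s, Q, tol=1e-12):
    """Check the detailed-balance condition s_i q_ij = s_j q_ji for all i, j."""
    balance = s[:, None] * Q     # entry (i, j) is s_i * q_ij
    return np.allclose(balance, balance.T, atol=tol)

s = np.array([1/3, 1/3, 1/3])    # uniform probability vector
print(is_reversible(s, Q))       # → True
```

The check works because detailed balance for all pairs (i, j) is exactly the statement that the matrix with entries s_i q_ij is symmetric.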
Q: What is the significance of the stationary distribution in reversible Markov chains?
For a random walk on an undirected network, a key example of a reversible Markov chain, the stationary distribution is proportional to the degrees of the nodes. This means that nodes with higher degrees will have a higher probability of being visited in the long run.
Q: How can we compute the stationary distribution for a reversible Markov chain?
For a random walk on an undirected network, the stationary distribution is s_i = d_i / Σ_j d_j, where d_i is the degree of node i and the sum is taken over all nodes in the network. Dividing by the total degree normalizes the vector so that it is a valid probability distribution.
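The degree formula takes only a few lines to compute. A sketch, assuming NumPy; the adjacency list is a hypothetical 4-node network, not one from the lecture:

```python
import numpy as np

# Hypothetical undirected network given as an adjacency list.
adjacency = {
    0: [1, 2],        # node 0 has degree 2
    1: [0, 2, 3],     # node 1 has degree 3
    2: [0, 1, 3],     # node 2 has degree 3
    3: [1, 2],        # node 3 has degree 2
}

degrees = np.array([len(nbrs) for nbrs in adjacency.values()])
stationary = degrees / degrees.sum()   # s_i = d_i / sum_j d_j

print(stationary)   # → [0.2 0.3 0.3 0.2]
```

Note that the total degree is twice the number of edges, since each edge contributes to the degree of both its endpoints.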
Summary
In this video, the concept of Markov chains is introduced. Markov chains are memoryless processes where the future and the past are conditionally independent given the present state. Pictures of various Markov chains are shown to provide intuition for how these chains behave. The concepts of irreducibility, recurrence, transience, and periodicity are defined and illustrated with examples. The video also introduces stationary distributions and shows how to determine whether a Markov chain has one. For irreducible Markov chains with finitely many states, a unique stationary distribution always exists. Reversible Markov chains are then introduced, and it is shown that any distribution satisfying the reversibility (detailed-balance) condition is stationary.
Questions & Answers
Q: What is a Markov chain?
A Markov chain is a memoryless process where the future and the past are conditionally independent given the present state.
Q: How does the concept of conditional independence apply to Markov chains?
In a Markov chain, given the present state, the future and the past are conditionally independent. The chain does not remember or care how it got to the current state, only the current state matters.
Q: What is irreducibility in a Markov chain?
In a Markov chain, irreducibility means that it is possible to get from any state to any other state with positive probability in some finite number of steps.
Q: What is recurrence in a Markov chain?
In a Markov chain, a state is recurrent if, starting from that state, the chain returns to it with probability one.
Q: What is transience in a Markov chain?
In a Markov chain, a state is transient if, starting from that state, there is a positive probability that the chain never returns. With probability one, a transient state is visited only finitely many times.
Q: What is periodicity in a Markov chain?
In a Markov chain, the period of a state is the greatest common divisor of all times at which the chain can return to that state. A chain is periodic if some state has period greater than 1, meaning returns to that state can occur only at multiples of the period.
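The period can be computed directly from this definition as a gcd of possible return times. A sketch, assuming NumPy, for a hypothetical two-state chain that alternates deterministically between its states:

```python
import numpy as np
from math import gcd
from functools import reduce

# Hypothetical 2-state chain that flips deterministically: 0 -> 1 -> 0 -> 1 ...
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])

def period(Q, state, max_steps=20):
    """Period of a state: gcd of all n <= max_steps with positive n-step return probability."""
    P = np.eye(len(Q))
    return_times = []
    for n in range(1, max_steps + 1):
        P = P @ Q                      # P is now the n-step transition matrix Q^n
        if P[state, state] > 0:
            return_times.append(n)
    return reduce(gcd, return_times)

print(period(Q, state=0))   # returns only at even times, so this prints 2
```

Capping the search at `max_steps` is a simplification for this sketch; it suffices here because return times at 2, 4, 6, ... already pin down the gcd.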
Q: What is a stationary distribution?
In a Markov chain, a stationary distribution is a probability distribution that remains unchanged by the transition matrix of the chain. It represents the long-run fraction of time that the chain spends in each state.
Q: How do you determine if a Markov chain has a stationary distribution?
For an irreducible Markov chain with finitely many states, a stationary distribution exists and is unique. It can be found by solving the system of linear equations sQ = s together with the normalization Σ_i s_i = 1, where s is the stationary distribution written as a row vector and Q is the transition matrix of the chain.
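The equation sQ = s says that s is a left eigenvector of Q with eigenvalue 1, which gives a direct numerical method. A minimal sketch, assuming NumPy and a hypothetical 2-state transition matrix:

```python
import numpy as np

def stationary_distribution(Q):
    """Solve sQ = s with s summing to 1, via the left eigenvector of Q for eigenvalue 1."""
    vals, vecs = np.linalg.eig(Q.T)      # left eigenvectors of Q are right eigenvectors of Q^T
    idx = np.argmin(np.abs(vals - 1.0))  # pick the eigenvalue closest to 1
    s = np.real(vecs[:, idx])
    return s / s.sum()                   # normalize to a probability vector

# Hypothetical 2-state transition matrix.
Q = np.array([[0.9, 0.1],
              [0.5, 0.5]])
s = stationary_distribution(Q)
print(s)   # → [5/6, 1/6] up to floating point
```

For this Q the balance equation 0.1 s_0 = 0.5 s_1 forces s_0 = 5 s_1, and normalizing gives s = (5/6, 1/6).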
Q: What is reversibility in a Markov chain?
In a Markov chain, reversibility means that if the chain is started with a certain probability distribution, and a video recording of the chain's behavior is played backwards in time, it will be indistinguishable from the original forward video. This is also known as time reversibility.
Q: What is a random walk on an undirected network?
A random walk on an undirected network is a Markov chain where the transitions between states are determined by randomly choosing an available edge with equal probabilities. It represents the movement of a particle on an undirected graph.
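Such a walk is easy to simulate, and the long-run visit fractions can be compared with the degree-based stationary distribution. A sketch using only the Python standard library, on a hypothetical 4-node network:

```python
import random
from collections import Counter

# Hypothetical undirected network as an adjacency list.
adjacency = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}

def random_walk(adjacency, start, steps, seed=0):
    """Simulate a random walk: at each step, move along a uniformly chosen edge."""
    rng = random.Random(seed)
    visits = Counter()
    state = start
    for _ in range(steps):
        state = rng.choice(adjacency[state])
        visits[state] += 1
    return visits

steps = 100_000
visits = random_walk(adjacency, start=0, steps=steps)
fractions = {v: visits[v] / steps for v in adjacency}
print(fractions)   # long-run fractions close to d_i / sum_j d_j = {0: 0.2, 1: 0.3, 2: 0.3, 3: 0.2}
```

With 100,000 steps the empirical fractions typically match the degree-based prediction to within a percent or so, illustrating the long-run interpretation of the stationary distribution.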
Q: How can the stationary distribution be determined for a reversible Markov chain?
For a random walk on an undirected network, the stationary distribution is found by dividing each node's degree by the sum of all node degrees; this normalization makes the resulting vector a probability distribution. More generally, for any reversible chain a guessed stationary distribution can be verified directly through the detailed-balance equations.
Takeaways
Markov chains are an important concept in probability theory with applications in fields such as physics, finance, and computer science. They model systems whose future depends only on the present state, not on the path taken to reach it. Understanding irreducibility, recurrence, transience, and periodicity is crucial for analyzing the long-term behavior of a Markov chain. Stationary distributions capture the long-run fraction of time spent in each state and can be found by solving a set of linear equations. Reversible Markov chains are a special class for which stationarity can be verified directly; in particular, for random walks on undirected networks the stationary distribution follows immediately from the node degrees.
Summary & Key Takeaways
- Markov chains are memoryless systems where the future and past of the chain are conditionally independent given the current state.
- Reversible Markov chains have the property that if the chain is run in reverse, it looks the same as running it forward in time.
- Random walks on undirected networks are an example of reversible Markov chains.
- The stationary distribution of a random walk on an undirected network is proportional to the degrees of the nodes.