What is a Markov chain?
A discrete-time stochastic process with a (finite or countable) state space where the next state depends only on the current state (the Markov property), not on past states.
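The Markov property can be sketched in a few lines: the next state is sampled using only the current state's transition probabilities, never the earlier history. The two-state "weather" chain and its probabilities below are illustrative assumptions, not part of the definition.

```python
import numpy as np

# Illustrative two-state chain (0 = sunny, 1 = rainy); rows sum to 1.
P = np.array([[0.9, 0.1],   # from state 0: stay 0 w.p. 0.9, go to 1 w.p. 0.1
              [0.5, 0.5]])  # from state 1: go to 0 w.p. 0.5, stay 1 w.p. 0.5

rng = np.random.default_rng(seed=0)

def step(state):
    # Markov property: the draw uses only P[state], i.e. the current
    # state's row -- no dependence on how we got here.
    return rng.choice(2, p=P[state])

# Simulate a short trajectory starting from state 0.
trajectory = [0]
for _ in range(10):
    trajectory.append(step(trajectory[-1]))
```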
What is a transition matrix and how is it used?
A square matrix P where P_ij is the probability of moving from state i to state j in one step; rows sum to 1. Powers P^n give n-step transition probabilities.
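A quick numerical sketch of both facts, rows summing to 1 and P^n giving n-step probabilities, using an assumed 2x2 matrix:

```python
import numpy as np

# Illustrative transition matrix; each row is a probability distribution.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Rows of a transition matrix must each sum to 1.
assert np.allclose(P.sum(axis=1), 1.0)

# P^3 gives 3-step transition probabilities:
# P3[i, j] = probability of being in state j after 3 steps, starting from i.
P3 = np.linalg.matrix_power(P, 3)

# P3 is itself a valid transition matrix (rows still sum to 1).
assert np.allclose(P3.sum(axis=1), 1.0)
```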
What is a stationary distribution in a Markov chain?
A probability distribution π over states that remains unchanged after one transition: π P = π, with entries summing to 1. For an irreducible, aperiodic chain it also describes the long-run proportion of time spent in each state, since the chain's distribution converges to π.
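One way to compute π in practice is as a left eigenvector of P for eigenvalue 1 (π P = π means π^T is an eigenvector of P^T), normalized to sum to 1. A minimal sketch with an assumed matrix:

```python
import numpy as np

# Illustrative transition matrix.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# pi P = pi  <=>  P^T pi^T = pi^T, so look at eigenvectors of P^T.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))  # eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                      # normalize into a probability vector

# pi is unchanged by one transition: pi @ P == pi.
```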
What is the Chapman-Kolmogorov equation and why is it useful?
It relates multi-step transitions to shorter steps: P^(n+m) = P^n P^m, and P^(n+m)(i,j) = sum_k P^n(i,k) P^m(k,j). It lets you compute n-step probabilities from smaller, known probabilities.
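The identity is easy to check numerically: multiplying the n-step and m-step matrices sums over all intermediate states k, exactly as in the entrywise form above. A sketch with an assumed chain and arbitrary n, m:

```python
import numpy as np

# Illustrative transition matrix.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n, m = 2, 3

# Chapman-Kolmogorov: P^(n+m) = P^n P^m.
lhs = np.linalg.matrix_power(P, n + m)
rhs = np.linalg.matrix_power(P, n) @ np.linalg.matrix_power(P, m)

# Entrywise, the matrix product is the sum over intermediate states k:
# lhs[i, j] == sum_k P^n[i, k] * P^m[k, j]
```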