

## 11.3 Stationary distribution
The concepts of recurrence and transience are important for understanding the long-run behavior of a Markov chain. At first, the chain may spend time in transient states. Eventually though, the chain will spend all its time in recurrent states. But what fraction of the time will it spend in each of the recurrent states? This question is answered by the stationary distribution of the chain, also known as the steady-state distribution. We will learn in this section that for irreducible and aperiodic Markov chains, the stationary distribution describes the long-run behavior of the chain, regardless of its initial conditions. It will tell us both the long-run probability of being in any particular state, and the long-run proportion of time that the chain spends in that state.

Definition 11.3.1 (Stationary distribution). A row vector $\mathrm{s}=\left(s_{1}, \ldots, s_{M}\right)$ such that $s_{i} \geq 0$ and $\sum_{i} s_{i}=1$ is a stationary distribution for a Markov chain with transition matrix $Q$ if
$$\sum_{i} s_{i} q_{i j}=s_{j}$$
for all $j$. This system of linear equations can be written as one matrix equation:
$$\mathrm{s} Q=\mathrm{s}$$
Recall that if $\mathrm{s}$ is the distribution of $X_{0}$, then $\mathrm{s} Q$ is the marginal distribution of $X_{1}$. Thus the equation $\mathrm{s} Q=\mathrm{s}$ means that if $X_{0}$ has distribution $\mathrm{s}$, then $X_{1}$ also has distribution $\mathrm{s}$. But then $X_{2}$ also has distribution $\mathrm{s}$, as does $X_{3}$, etc. That is, a Markov chain whose initial distribution is the stationary distribution $\mathrm{s}$ will stay in the stationary distribution forever.
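The equation $\mathrm{s} Q = \mathrm{s}$ says that $\mathrm{s}$ is a left eigenvector of $Q$ with eigenvalue 1, which gives a direct way to compute and check a stationary distribution numerically. Here is a minimal sketch; the particular 3-state transition matrix is an illustrative assumption, not from the text:

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1).
Q = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# s Q = s means s is a left eigenvector of Q with eigenvalue 1,
# i.e. a right eigenvector of Q.T with eigenvalue 1.
evals, evecs = np.linalg.eig(Q.T)
k = np.argmin(np.abs(evals - 1))   # index of the eigenvalue closest to 1
s = np.real(evecs[:, k])
s = s / s.sum()                    # normalize so the entries sum to 1

print(s)
print(np.allclose(s @ Q, s))       # True: if X_0 ~ s, then X_1 ~ s as well
```

Solving via the transpose works because left eigenvectors of $Q$ are right eigenvectors of $Q^{T}$; the normalization step makes the eigenvector a valid PMF.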

One way to think about the stationary distribution of a Markov chain intuitively is to imagine a large number of particles, each independently bouncing from state to state according to the transition probabilities. After a while, the system of particles will approach an equilibrium where, at each time period, the number of particles leaving a state will be counterbalanced by the number of particles entering that state, and this will be true for all states. At this equilibrium, the system as a whole will appear to be stationary, and the proportion of particles in each state will be given by the stationary distribution. We will say more about stationary distributions after Definition 11.4.1.
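The particle picture can be simulated directly: start many independent particles in one state, let each bounce according to the transition probabilities, and watch the proportions settle. This is only a sketch, and the 3-state transition matrix is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state transition matrix (each row sums to 1).
Q = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
M = Q.shape[0]

# Start 100,000 particles all in state 0 and let each bounce independently.
n_particles = 100_000
states = np.zeros(n_particles, dtype=int)

for _ in range(50):                    # 50 steps is plenty for this chain to equilibrate
    u = rng.random(n_particles)
    cum = Q.cumsum(axis=1)[states]     # cumulative row of Q for each particle's state
    states = (u[:, None] < cum).argmax(axis=1)   # inverse-CDF sampling of next state

# At equilibrium, the proportion of particles in each state
# approximates the stationary distribution.
proportions = np.bincount(states, minlength=M) / n_particles
print(proportions)
```

At equilibrium the flow in and out of each state balances, so applying one more step of $Q$ to the vector of proportions leaves it (approximately) unchanged.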
11.3.2 (Stationary distribution is marginal, not conditional). When a Markov chain is at the stationary distribution, the unconditional PMF of $X_{n}$ equals s for all $n$, but the conditional PMF of $X_{n}$ given $X_{n-1}=i$ is still encoded by the $i$ th row of the transition matrix $Q$.

If a Markov chain starts at the stationary distribution, then all of the $X_{n}$ are identically distributed (since they have the same marginal distribution s), but they are not necessarily independent, since the conditional distribution of $X_{n}$ given $X_{n-1}=i$ is, in general, different from the marginal distribution of $X_{n}$.
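This distinction between the marginal and conditional PMFs can be checked numerically. The two-state chain below is a hypothetical illustration, with stationary distribution $(2/3, 1/3)$:

```python
import numpy as np

# Hypothetical two-state chain; s = (2/3, 1/3) satisfies s Q = s.
Q = np.array([[0.8, 0.2],
              [0.4, 0.6]])
s = np.array([2/3, 1/3])

print(np.allclose(s @ Q, s))   # True: every X_n has the same marginal PMF s

# But the conditional PMF of X_n given X_{n-1} = i is row i of Q,
# and neither row equals s, so the X_n are dependent.
print(Q[0], Q[1])
```

So starting at the stationary distribution makes the $X_n$ identically distributed, while the rows of $Q$, which differ from $\mathrm{s}$, show they are not independent.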
11.3.3 (Sympathetic magic). If a Markov chain starts at the stationary distribution, then the marginal distributions of the $X_{n}$ are all equal. This is not the same as saying that the $X_{n}$ themselves are all equal; confusing the random variables $X_{n}$ with their distributions is an example of sympathetic magic.

For very small Markov chains, we may solve for the stationary distribution by hand, using the definition. The next example illustrates this for a two-state chain.
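A numeric version of such a hand computation might look like the following sketch. The general two-state matrix $Q = \begin{pmatrix} 1-a & a \\ b & 1-b \end{pmatrix}$ has stationary distribution $\mathrm{s} = \left(\frac{b}{a+b}, \frac{a}{a+b}\right)$; the specific values $a = 0.1$, $b = 0.4$ are an illustrative assumption:

```python
import numpy as np

def two_state_stationary(a, b):
    """Stationary distribution of Q = [[1-a, a], [b, 1-b]].

    From s Q = s: -a*s1 + b*s2 = 0, combined with s1 + s2 = 1,
    which gives s = (b/(a+b), a/(a+b)).
    """
    return np.array([b / (a + b), a / (a + b)])

# Check the hand-derived formula against the definition for one choice of a, b.
a, b = 0.1, 0.4
Q = np.array([[1 - a, a],
              [b, 1 - b]])
s = two_state_stationary(a, b)
print(s)                        # [0.8 0.2]
print(np.allclose(s @ Q, s))    # True
```

The derivation in the docstring is exactly the by-hand approach: write out $\mathrm{s} Q = \mathrm{s}$ as linear equations, note that one equation is redundant, and close the system with the normalization $s_1 + s_2 = 1$.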
