
4.7 Poisson
The last discrete distribution that we’ll introduce in this chapter is the Poisson, which is an extremely popular distribution for modeling discrete data. We’ll introduce its PMF, mean, and variance, and then discuss its story in more detail.
Definition 4.7.1 (Poisson distribution). An r.v. $X$ has the Poisson distribution with parameter $\lambda$, where $\lambda>0$, if the PMF of $X$ is
$$P(X=k)=\frac{e^{-\lambda} \lambda^{k}}{k !}, \quad k=0,1,2, \ldots$$
We write this as $X \sim \operatorname{Pois}(\lambda)$.
This is a valid PMF because of the Taylor series $\sum_{k=0}^{\infty} \frac{\lambda^{k}}{k !}=e^{\lambda} .$
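This validity claim is easy to check numerically. The following Python sketch (not from the text; the function name `pois_pmf` and the choice $\lambda = 2$ are illustrative assumptions) sums the PMF over enough terms that the truncation error is negligible:

```python
from math import exp, factorial

def pois_pmf(k, lam):
    """P(X = k) for X ~ Pois(lam): e^{-lam} * lam^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

# For lam = 2, the tail beyond k = 99 is negligible, so a truncated
# sum of the PMF should equal 1 up to floating-point error.
total = sum(pois_pmf(k, 2.0) for k in range(100))
```
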
Example 4.7.2 (Poisson expectation and variance). Let $X \sim \operatorname{Pois}(\lambda)$. We will show that the mean and variance are both equal to $\lambda$. For the mean, we have
$$\begin{aligned} E(X) &=e^{-\lambda} \sum_{k=0}^{\infty} k \frac{\lambda^{k}}{k !} \\ &=e^{-\lambda} \sum_{k=1}^{\infty} k \frac{\lambda^{k}}{k !} \\ &=\lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1) !} \\ &=\lambda e^{-\lambda} e^{\lambda}=\lambda. \end{aligned}$$
First we dropped the $k=0$ term because it was 0 . Then we took a $\lambda$ out of the sum so that what was left inside was just the Taylor series for $e^{\lambda}$.
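The conclusion $E(X) = \lambda$ can also be checked numerically by truncating the defining sum. This minimal Python sketch is an illustrative check, not part of the text ($\lambda = 3.5$ is an arbitrary choice):

```python
from math import exp, factorial

lam = 3.5
# E(X) = sum_k k * P(X = k); terms beyond k = 99 are negligible here,
# so the truncated sum should agree with lam to floating-point accuracy.
mean = sum(k * exp(-lam) * lam**k / factorial(k) for k in range(100))
```
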
To get the variance, we first find $E\left(X^{2}\right)$. By LOTUS,
$$E\left(X^{2}\right)=\sum_{k=0}^{\infty} k^{2} P(X=k)=e^{-\lambda} \sum_{k=0}^{\infty} k^{2} \frac{\lambda^{k}}{k !}$$
From here, the derivation is very similar to that of the variance of the Geometric. Differentiate the familiar series
$$\sum_{k=0}^{\infty} \frac{\lambda^{k}}{k !}=e^{\lambda}$$
with respect to $\lambda$ and replenish (multiply both sides by $\lambda$ to restore the power of $\lambda$):
$$\begin{aligned} \sum_{k=1}^{\infty} k \frac{\lambda^{k-1}}{k !} &= e^{\lambda}, \\ \sum_{k=1}^{\infty} k \frac{\lambda^{k}}{k !} &= \lambda e^{\lambda}. \end{aligned}$$
Rinse and repeat:
$$\begin{aligned} \sum_{k=1}^{\infty} k^{2} \frac{\lambda^{k-1}}{k !} &= e^{\lambda}+\lambda e^{\lambda}=e^{\lambda}(1+\lambda), \\ \sum_{k=1}^{\infty} k^{2} \frac{\lambda^{k}}{k !} &= e^{\lambda} \lambda(1+\lambda). \end{aligned}$$
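The two series identities just derived can be verified numerically. In this Python sketch ($\lambda = 2$ and the cutoff at $k = 199$ are illustrative choices), the terms $\lambda^k/k!$ are built iteratively to avoid computing huge factorials:

```python
from math import exp

lam = 2.0
term = 1.0      # term = lam^k / k!, starting at k = 0
s1 = s2 = 0.0   # accumulators for sum k*term and sum k^2*term
for k in range(1, 200):
    term *= lam / k     # update term to lam^k / k!
    s1 += k * term      # sum_{k>=1} k   * lam^k / k!
    s2 += k * k * term  # sum_{k>=1} k^2 * lam^k / k!
# Expected: s1 close to lam * e^lam, s2 close to e^lam * lam * (1 + lam)
```
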
Finally,
$$E\left(X^{2}\right)=e^{-\lambda} \sum_{k=0}^{\infty} k^{2} \frac{\lambda^{k}}{k !}=e^{-\lambda} e^{\lambda} \lambda(1+\lambda)=\lambda(1+\lambda)$$
so
$$\operatorname{Var}(X)=E\left(X^{2}\right)-(E X)^{2}=\lambda(1+\lambda)-\lambda^{2}=\lambda$$
Thus, the mean and variance of a Pois$(\lambda)$ r.v. are both equal to $\lambda$.
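As a final sanity check, both moments can be recovered numerically from the PMF. This Python sketch ($\lambda = 4$ is an arbitrary illustrative choice) implements exactly the quantities used in the derivation above:

```python
from math import exp, factorial

lam = 4.0
ks = range(100)  # the tail beyond k = 99 is negligible for lam = 4
pmf = [exp(-lam) * lam**k / factorial(k) for k in ks]
EX  = sum(k * p for k, p in zip(ks, pmf))       # E(X)
EX2 = sum(k * k * p for k, p in zip(ks, pmf))   # E(X^2)
var = EX2 - EX**2                               # Var(X) = E(X^2) - (EX)^2
# Both EX and var should be approximately equal to lam.
```
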
Figure 4.7 shows the PMF and CDF of the Pois(2) and Pois(5) distributions from $k=0$ to $k=10$. It appears that the mean of the Pois(2) is around 2 and the mean of the Pois(5) is around 5, consistent with our findings above. The PMF of the Pois(2) is highly skewed, but as $\lambda$ grows larger, the skewness is reduced and the PMF becomes more bell-shaped.
