
3.3 Bernoulli and Binomial
Some distributions are so ubiquitous in probability and statistics that they have their own names. We will introduce these named distributions throughout the book, starting with a very simple but useful case: an r.v. that can take on only two possible values, 0 and 1.

Definition 3.3.1 (Bernoulli distribution). An r.v. $X$ is said to have the Bernoulli distribution with parameter $p$ if $P(X=1)=p$ and $P(X=0)=1-p$, where $0<p<1$. We write this as $X \sim \operatorname{Bern}(p)$. The symbol $\sim$ is read “is distributed as”.
Any r.v. whose possible values are 0 and 1 has a $\operatorname{Bern}(p)$ distribution, with $p$ the probability of the r.v. equaling 1. This number $p$ in $\operatorname{Bern}(p)$ is called the parameter of the distribution; it determines which specific Bernoulli distribution we have. Thus there is not just one Bernoulli distribution, but rather a family of Bernoulli distributions, indexed by $p$. For example, if $X \sim \operatorname{Bern}(1 / 3)$, it would be correct but incomplete to say "$X$ is Bernoulli"; to fully specify the distribution of $X$, we should both say its name (Bernoulli) and its parameter value $(1 / 3)$, which is the point of the notation $X \sim \operatorname{Bern}(1 / 3)$.
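As a quick illustration (a Python sketch, not from the text; the function name `bern` is our own), we can sample from a $\operatorname{Bern}(p)$ distribution with a single uniform random number and check that the long-run frequency of 1s approaches the parameter $p$:

```python
import random

random.seed(0)

def bern(p):
    """One draw of a Bernoulli(p) r.v.: 1 with probability p, 0 otherwise."""
    return 1 if random.random() < p else 0

# Long-run frequency of 1s approaches the parameter p.
p = 1 / 3
n = 100_000
frequency = sum(bern(p) for _ in range(n)) / n
print(frequency)  # close to 1/3
```

Different choices of `p` pick out different members of the Bernoulli family.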
Any event has a Bernoulli r.v. that is naturally associated with it, equal to 1 if the event happens and 0 otherwise. This is called the indicator random variable of the event; we will see that such r.v.s are extremely useful.

Definition 3.3.2 (Indicator random variable). The indicator random variable of an event $A$ is the r.v. which equals 1 if $A$ occurs and 0 otherwise. We will denote the indicator r.v. of $A$ by $I_{A}$ or $I(A)$. Note that $I_{A} \sim \operatorname{Bern}(p)$ with $p=P(A)$.
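To make the definition concrete, here is a hypothetical example in Python (the event and function names are ours): take $A$ to be the event that a fair die lands on 5 or 6, so $P(A) = 1/3$ and $I_A \sim \operatorname{Bern}(1/3)$.

```python
import random

random.seed(1)

def indicator(a_occurs):
    """Indicator r.v. of an event A: 1 if A occurs, 0 otherwise."""
    return 1 if a_occurs else 0

# Event A: a fair die lands on 5 or 6, so P(A) = 1/3 and I_A ~ Bern(1/3).
trials = 100_000
mean_indicator = sum(indicator(random.randint(1, 6) >= 5) for _ in range(trials)) / trials
print(mean_indicator)  # close to P(A) = 1/3
```

The average of the indicator over many repetitions estimates $P(A)$, which previews why indicator r.v.s are so useful.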
We often imagine Bernoulli r.v.s using coin tosses, but this is just convenient language for discussing the following general story.
Story 3.3.3 (Bernoulli trial). An experiment that can result in either a “success” or a “failure” (but not both) is called a Bernoulli trial. A Bernoulli random variable can be thought of as the indicator of success in a Bernoulli trial: it equals 1 if success occurs and 0 if failure occurs in the trial.

Because of this story, the parameter $p$ is often called the success probability of the $\operatorname{Bern}(p)$ distribution. Once we start thinking about Bernoulli trials, it’s hard not to start thinking about what happens when we have more than one trial.
Story 3.3.4 (Binomial distribution). Suppose that $n$ independent Bernoulli trials are performed, each with the same success probability $p$. Let $X$ be the number of successes. The distribution of $X$ is called the Binomial distribution with parameters $n$ and $p$. We write $X \sim \operatorname{Bin}(n, p)$ to mean that $X$ has the Binomial distribution with parameters $n$ and $p$, where $n$ is a positive integer and $0<p<1$.
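The story can be simulated directly (a Python sketch under our own naming, not from the text): a Binomial draw is just a count of successes across $n$ independent Bernoulli trials.

```python
import random

random.seed(2)

def binomial_draw(n, p):
    """One draw of X ~ Bin(n, p): the number of successes in n independent Bern(p) trials."""
    return sum(1 for _ in range(n) if random.random() < p)

# Every draw lies in {0, 1, ..., n}; here n = 10 trials with success probability 0.3.
draws = [binomial_draw(10, 0.3) for _ in range(50_000)]
sample_mean = sum(draws) / len(draws)
print(sample_mean)  # about 3 successes per experiment on average
```

Note that the simulation never touches a PMF; it works purely from the story.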
Thinking of the Binomial distribution in terms of its story means thinking about the type of experiment that could give rise to a random variable with a Binomial distribution. The most famous distributions in statistics all have stories which explain why they are so often used as models for data, or as the building blocks for more complicated distributions.

Thinking about the named distributions first and foremost in terms of their stories has many benefits. It facilitates pattern recognition, allowing us to see when two problems are essentially identical in structure; it often leads to cleaner solutions that avoid PMF calculations altogether; and it helps us understand how the named distributions are connected to one another. Here it is clear that $\operatorname{Bern}(p)$ is the same distribution as $\operatorname{Bin}(1, p)$: the Bernoulli is a special case of the Binomial.
Using the story definition of the Binomial, let’s find its PMF.
Theorem 3.3.5 (Binomial PMF). If $X \sim \operatorname{Bin}(n, p)$, then the PMF of $X$ is
$$P(X=k)=\binom{n}{k} p^{k}(1-p)^{n-k}$$
for $k=0,1, \ldots, n$ (and $P(X=k)=0$ otherwise).
3.3.6. To save writing, it is often left implicit that a PMF is zero wherever it is not specified to be nonzero, but in any case it is important to understand what the support of a random variable is, and good practice to check that PMFs are valid. If two discrete r.v.s have the same PMF, then they also must have the same support. So we sometimes refer to the support of a discrete distribution; this is the support of any r.v. with that distribution.

Proof. An experiment consisting of $n$ independent Bernoulli trials produces a sequence of successes and failures. The probability of any specific sequence of $k$ successes and $n-k$ failures is $p^{k}(1-p)^{n-k}$. There are $\binom{n}{k}$ such sequences, since we just need to select where the successes are. Therefore, letting $X$ be the number of successes,
$$P(X=k)=\binom{n}{k} p^{k}(1-p)^{n-k}$$
for $k=0,1, \ldots, n$, and $P(X=k)=0$ otherwise. This is a valid PMF because it is nonnegative and it sums to 1 by the binomial theorem.
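The validity check in the proof is easy to carry out numerically. Below is an illustrative Python sketch (our own function name, not from the text) that evaluates the Binomial PMF and confirms it sums to 1 over the support:

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p); zero outside the support {0, 1, ..., n}."""
    if k < 0 or k > n:
        return 0.0
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Validity check: the PMF is nonnegative and sums to 1 over k = 0, ..., n,
# as guaranteed by the binomial theorem.
n, p = 10, 0.3
total = sum(binom_pmf(k, n, p) for k in range(n + 1))
print(total)  # 1, up to floating-point rounding
```

Returning 0 outside $\{0, 1, \ldots, n\}$ mirrors the convention that a PMF is implicitly zero wherever it is not specified to be nonzero.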

Figure 3.6 shows plots of the Binomial PMF for various values of $n$ and $p$. Note that the PMF of the $\operatorname{Bin}(10, 1/2)$ distribution is symmetric about 5, but when the success probability is not $1/2$, the PMF is skewed. For a fixed number of trials $n$, $X$ tends to be larger when the success probability is high and lower when the success probability is low, as we would expect from the story of the Binomial distribution. Also recall that in any PMF plot, the sum of the heights of the vertical bars must be 1.
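The symmetry visible in the plots can be checked directly from the PMF (an illustrative Python sketch with our own function name): $P(X=k) = P(X=10-k)$ for $\operatorname{Bin}(10, 1/2)$, while the symmetry breaks for $p \neq 1/2$.

```python
from math import comb, isclose

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p), for k in the support."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Bin(10, 1/2) is symmetric about 5: P(X = k) = P(X = 10 - k) for every k.
symmetric = all(isclose(binom_pmf(k, 10, 0.5), binom_pmf(10 - k, 10, 0.5))
                for k in range(11))

# When p != 1/2 the PMF is skewed: for Bin(10, 0.9), large counts are far
# more likely than small counts.
left, right = binom_pmf(2, 10, 0.9), binom_pmf(8, 10, 0.9)
print(symmetric, left < right)
```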

We’ve used Story 3.3.4 to find the $\operatorname{Bin}(n, p)$ PMF. The story also gives us a straightforward proof of the fact that if $X$ is Binomial, then $n-X$ is also Binomial.
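The story-based argument is that $n-X$ counts failures, and each trial fails with probability $1-p$, so $n-X \sim \operatorname{Bin}(n, 1-p)$. A quick numerical check of the corresponding PMF identity (a Python sketch with our own function name, not from the text):

```python
from math import comb, isclose

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p), for k in the support."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# If X ~ Bin(n, p) counts successes, then n - X counts failures, and each
# trial fails with probability 1 - p, so n - X ~ Bin(n, 1 - p).
# PMF check: P(X = n - j) equals the Bin(n, 1 - p) PMF at j, for every j.
n, p = 7, 0.2
match = all(isclose(binom_pmf(n - j, n, p), binom_pmf(j, n, 1 - p))
            for j in range(n + 1))
print(match)
```

Algebraically, this is the identity $\binom{n}{n-j} p^{n-j}(1-p)^{j} = \binom{n}{j}(1-p)^{j} p^{n-j}$.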
