
# Functions of random variables


3.7 Functions of random variables
In this section we will discuss what it means to take a function of a random variable, and we will build understanding for why a function of a random variable is a random variable. That is, if $X$ is a random variable, then $X^{2}, e^{X}$, and $\sin (X)$ are also random variables, as is $g(X)$ for any function $g: \mathbb{R} \rightarrow \mathbb{R}$.

For example, imagine that two basketball teams (A and B) are playing a seven-game match, and let $X$ be the number of wins for team A (so $X \sim \operatorname{Bin}(7,1/2)$ if the teams are evenly matched and the games are independent). Let $g(x)=7-x$, and let $h(x)=1$ if $x \geq 4$ and $h(x)=0$ if $x<4$. Then $g(X)=7-X$ is the number of wins for team B, and $h(X)$ is the indicator of team A winning the majority of the games. Since $X$ is an r.v., both $g(X)$ and $h(X)$ are also r.v.s.
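As a quick illustration, here is a minimal Python sketch of the basketball example (the simulation and seed are our own additions, not part of the text): we sample $X \sim \operatorname{Bin}(7, 1/2)$ once and compute $g(X)$ and $h(X)$ from the same outcome.

```python
import random

random.seed(0)

# X ~ Bin(7, 1/2): team A's wins in 7 independent, evenly matched games
X = sum(random.random() < 0.5 for _ in range(7))

g_of_X = 7 - X                  # g(X) = 7 - X: team B's wins
h_of_X = 1 if X >= 4 else 0     # h(X): indicator that A wins the majority

print(X, g_of_X, h_of_X)
```

Since $g(X)$ and $h(X)$ are computed from the realized value of $X$, a single draw of $X$ determines all three values at once, which is exactly what it means for them to be random variables on the same sample space.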

To see how to define functions of an r.v. formally, let’s rewind a bit. At the beginning of this chapter, we considered a random variable $X$ defined on a sample space with 6 elements. Figure $3.1$ used arrows to illustrate how $X$ maps each pebble in the sample space to a real number, and the left half of Figure $3.2$ showed how we can equivalently imagine $X$ writing a real number inside each pebble.

Now we can, if we want, apply the same function $g$ to all the numbers inside the pebbles. Instead of the numbers $X\left(s_{1}\right)$ through $X\left(s_{6}\right)$, we now have the numbers $g\left(X\left(s_{1}\right)\right)$ through $g\left(X\left(s_{6}\right)\right)$, which gives a new mapping from sample outcomes to real numbers – we’ve created a new random variable, $g(X)$.
Definition 3.7.1 (Function of an r.v.). For an experiment with sample space $S$, an r.v. $X$, and a function $g: \mathbb{R} \rightarrow \mathbb{R}, g(X)$ is the r.v. that maps $s$ to $g(X(s))$ for all $s \in S$.
Taking $g(x)=\sqrt{x}$ for concreteness, Figure $3.9$ shows that $g(X)$ is the composition of the functions $X$ and $g$: "first apply $X$, then apply $g$". Figure $3.10$ represents $g(X)$ more succinctly by directly labeling the sample outcomes. Both figures show us that $g(X)$ is an r.v.; if $X$ crystallizes to 4, then $g(X)$ crystallizes to 2.

Given a discrete r.v. $X$ with a known PMF, how can we find the PMF of $Y=g(X)$ ? In the case where $g$ is a one-to-one function, the answer is straightforward: the support of $Y$ is the set of all $g(x)$ with $x$ in the support of $X$, and
$$P(Y=g(x))=P(g(X)=g(x))=P(X=x) .$$
FIGURE 3.9: The r.v. $X$ is defined on a sample space with 6 elements, and has possible values 0, 1, and 4. The function $g$ is the square root function. Composing $X$ and $g$ gives the random variable $g(X)=\sqrt{X}$, which has possible values 0, 1, and 2.
The case where $Y=g(X)$ with $g$ one-to-one is illustrated in the following tables; the idea is that if the distinct possible values of $X$ are $x_{1}, x_{2}, \ldots$ with probabilities $p_{1}, p_{2}, \ldots$ (respectively), then the distinct possible values of $Y$ are $g\left(x_{1}\right), g\left(x_{2}\right), \ldots$, with the same list $p_{1}, p_{2}, \ldots$ of probabilities.
| $x$ | $P(X=x)$ |  | $y$ | $P(Y=y)$ |
|:---:|:---:|---|:---:|:---:|
| $x_{1}$ | $p_{1}$ |  | $g\left(x_{1}\right)$ | $p_{1}$ |
| $x_{2}$ | $p_{2}$ |  | $g\left(x_{2}\right)$ | $p_{2}$ |
| $x_{3}$ | $p_{3}$ |  | $g\left(x_{3}\right)$ | $p_{3}$ |
| $\vdots$ | $\vdots$ |  | $\vdots$ | $\vdots$ |
This suggests a strategy for finding the PMF of an r.v. with an unfamiliar distribution: try to express the r.v. as a one-to-one function of an r.v. with a known distribution. The next example illustrates this method.
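The relabeling idea can be sketched in a few lines of Python (the choice of $X \sim \operatorname{Bin}(3, 1/2)$ and the particular one-to-one $g$ are our own illustrative assumptions): for one-to-one $g$, the PMF of $Y=g(X)$ keeps the same probabilities and only relabels the support.

```python
from math import comb

# PMF of X ~ Bin(3, 1/2), stored as a dict {x: P(X = x)}
pmf_X = {x: comb(3, x) * 0.5**3 for x in range(4)}

def g(x):
    # A one-to-one function on the support of X (illustrative choice)
    return 2 * x - 3

# For one-to-one g, the PMF of Y = g(X) just relabels the support:
# P(Y = g(x)) = P(X = x)
pmf_Y = {g(x): p for x, p in pmf_X.items()}

print(pmf_Y)
```

Note that this dictionary comprehension would silently merge probabilities if $g$ mapped two support values to the same output, which is exactly why the one-to-one assumption matters here.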

Example 3.7.2 (Random walk). A particle moves $n$ steps on a number line. The particle starts at 0, and at each step it moves 1 unit to the right or to the left, with equal probabilities. Assume all steps are independent. Let $Y$ be the particle's position after $n$ steps. Find the PMF of $Y$.
Solution:
Consider each step to be a Bernoulli trial, where right is considered a success and left is considered a failure. Then the number of steps the particle takes to the right is a $\operatorname{Bin}(n, 1/2)$ random variable, which we can name $X$. If $X=j$, then the particle has taken $j$ steps to the right and $n-j$ steps to the left, giving a final position of $j-(n-j)=2j-n$. So we can express $Y$ as a one-to-one function of $X$, namely, $Y=2X-n$. Since $X$ takes values in $\{0,1,2,\ldots,n\}$, $Y$ takes values in $\{-n, 2-n, 4-n, \ldots, n\}$.
The PMF of $Y$ can then be found from the PMF of $X$ :
$$P(Y=k)=P(2X-n=k)=P\left(X=\frac{n+k}{2}\right)=\binom{n}{\frac{n+k}{2}}\left(\frac{1}{2}\right)^{n}$$
if $k$ is an integer between $-n$ and $n$ (inclusive) such that $n+k$ is an even number.
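A short Python sketch can sanity-check this PMF against simulation (the choice $n=4$, the seed, and the trial count are our own assumptions): the formula above, with the parity constraint on $n+k$, should match the empirical frequencies of the walk's final position.

```python
from math import comb
import random

random.seed(42)
n = 4

def pmf_Y(k, n):
    # P(Y = k) via Y = 2X - n with X ~ Bin(n, 1/2);
    # zero unless -n <= k <= n and n + k is even
    if k < -n or k > n or (n + k) % 2 != 0:
        return 0.0
    return comb(n, (n + k) // 2) * 0.5**n

# Monte Carlo check: simulate many n-step walks and tally final positions
trials = 100_000
counts = {}
for _ in range(trials):
    pos = sum(random.choice([-1, 1]) for _ in range(n))
    counts[pos] = counts.get(pos, 0) + 1

for k in range(-n, n + 1, 2):
    print(k, pmf_Y(k, n), counts.get(k, 0) / trials)
```

The parity check in `pmf_Y` mirrors the condition in the text: positions with $n+k$ odd are unreachable in exactly $n$ steps.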
If $g$ is not one-to-one, then for a given $y$, there may be multiple values of $x$ such that $g(x)=y$. To compute $P(g(X)=y)$, we need to sum up the probabilities of $X$ taking on any of these candidate values of $x$.

Theorem 3.7.3 (PMF of $g(X)$ ). Let $X$ be a discrete r.v. and $g: \mathbb{R} \rightarrow \mathbb{R}$. Then the support of $g(X)$ is the set of all $y$ such that $g(x)=y$ for at least one $x$ in the support of $X$, and the PMF of $g(X)$ is
$$P(g(X)=y)=\sum_{x: g(x)=y} P(X=x),$$
for all $y$ in the support of $g(X)$.
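Theorem 3.7.3 can be sketched directly in Python (the uniform distribution on $\{-2,-1,0,1,2\}$ and $g(x)=x^2$ are our own illustrative choices, picked because this $g$ is not one-to-one on that support): we accumulate $P(X=x)$ over all $x$ with $g(x)=y$.

```python
from fractions import Fraction

# X uniform on {-2, -1, 0, 1, 2}; g(x) = x^2 is not one-to-one here
pmf_X = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}

def g(x):
    return x * x

# P(g(X) = y) = sum of P(X = x) over all x in the support with g(x) = y
pmf_Y = {}
for x, p in pmf_X.items():
    pmf_Y[g(x)] = pmf_Y.get(g(x), 0) + p

print(pmf_Y)
```

Here $x=-1$ and $x=1$ both map to $y=1$ (and likewise $\pm 2$ map to $4$), so their probabilities are summed, while $y=0$ receives only the single term from $x=0$.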
