# Law of the unconscious statistician (LOTUS)


## 4.5 Law of the unconscious statistician (LOTUS)
As we saw in the St. Petersburg paradox, $E(g(X))$ does not equal $g(E(X))$ in general if $g$ is not linear. So how do we correctly calculate $E(g(X))$ ? Since $g(X)$ is an r.v., one way is to first find the distribution of $g(X)$ and then use the definition of expectation. Perhaps surprisingly, it turns out that it is possible to find $E(g(X))$ directly using the distribution of $X$, without first having to find the distribution of $g(X)$. This is done using the law of the unconscious statistician (LOTUS).

Theorem 4.5.1 (LOTUS). If $X$ is a discrete r.v. and $g$ is a function from $\mathbb{R}$ to $\mathbb{R}$, then
$$E(g(X))=\sum_{x} g(x) P(X=x)$$
where the sum is taken over all possible values of $X$.
This means that we can get the expected value of $g(X)$ knowing only $P(X=x)$, the PMF of $X$; we don't need to know the PMF of $g(X)$. The name comes from the fact that in going from $E(X)$ to $E(g(X))$ it is tempting just to change $x$ to $g(x)$ in the definition, which can be done very easily and mechanically, perhaps in a state of unconsciousness. On second thought, it may sound too good to be true that finding the distribution of $g(X)$ is not needed for this calculation, but LOTUS says it is true.
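To make the theorem concrete, here is a minimal Python sketch (the fair-die example and the helper name `lotus` are illustrative choices, not from the text). It computes $E(g(X))$ directly from the PMF of $X$, exactly as the theorem states, without ever constructing the distribution of $g(X)$:

```python
from fractions import Fraction

def lotus(pmf, g):
    """E(g(X)) computed directly from the PMF of X, per LOTUS:
    sum of g(x) * P(X = x) over the support of X."""
    return sum(g(x) * p for x, p in pmf.items())

# Illustrative example: X is the result of a fair six-sided die roll.
die = {x: Fraction(1, 6) for x in range(1, 7)}

e_x = lotus(die, lambda x: x)        # E(X)   = 7/2
e_x2 = lotus(die, lambda x: x * x)   # E(X^2) = 91/6
print(e_x, e_x2, e_x2 - e_x ** 2)    # the last value is Var(X) = 35/12
```

Note that `lotus(die, lambda x: x * x)` is not `lotus(die, lambda x: x) ** 2`: as the St. Petersburg discussion warned, $E(g(X)) \neq g(E(X))$ for nonlinear $g$.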
Before proving LOTUS in general, let's see why it is true in some special cases. Let $X$ have support $0,1,2,\ldots$ with probabilities $p_{0}, p_{1}, p_{2}, \ldots$, so the PMF is $P(X=n)=p_{n}$. Then $X^{3}$ has support $0^{3}, 1^{3}, 2^{3}, \ldots$ with probabilities $p_{0}, p_{1}, p_{2}, \ldots$, so
$$\begin{aligned} E(X) &= \sum_{n=0}^{\infty} n p_{n}, \\ E\left(X^{3}\right) &= \sum_{n=0}^{\infty} n^{3} p_{n}. \end{aligned}$$
As claimed by LOTUS, to edit the expression for $E(X)$ into an expression for $E\left(X^{3}\right)$, we can just change the $n$ in front of the $p_{n}$ to an $n^{3}$. This was an easy example since the function $g(x)=x^{3}$ is one-to-one. But LOTUS holds much more generally. The key insight needed for the proof of LOTUS for general $g$ is the same as the one we used for the proof of linearity: the expectation of $g(X)$ can be written in ungrouped form as
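To see that LOTUS also works when $g$ is not one-to-one, the following sketch (the four-point PMF is a made-up example, not from the text) compares LOTUS against the longer route of first finding the PMF of $Y = g(X)$ and then applying the definition of expectation, with the non-injective $g(x) = x^2$:

```python
from fractions import Fraction
from collections import defaultdict

# Made-up PMF whose support contains negatives, so g(x) = x^2 is NOT one-to-one.
pmf_x = {-1: Fraction(1, 4), 0: Fraction(1, 4),
          1: Fraction(1, 4), 2: Fraction(1, 4)}
g = lambda x: x * x

# Route 1 (LOTUS): never find the PMF of g(X).
lotus_value = sum(g(x) * p for x, p in pmf_x.items())

# Route 2: first build the PMF of Y = g(X) by grouping, then take E(Y).
pmf_y = defaultdict(Fraction)
for x, p in pmf_x.items():
    pmf_y[g(x)] += p           # P(Y = y) = sum of P(X = x) over x with g(x) = y
direct_value = sum(y * p for y, p in pmf_y.items())

print(lotus_value, direct_value)   # both equal 3/2
```

Here $x = -1$ and $x = 1$ collapse to the same value $y = 1$ of $g(X)$, yet both routes give the same answer, as LOTUS guarantees.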
$$E(g(X))=\sum_{s} g(X(s)) P(\{s\})$$
where the sum is over all the pebbles in the sample space, but we can also group the pebbles into super-pebbles according to the value that $X$ assigns to them. Within the super-pebble $X=x, g(X)$ always takes on the value $g(x)$. Therefore,
$$\begin{aligned} E(g(X)) &=\sum_{s} g(X(s)) P(\{s\}) \\ &=\sum_{x} \sum_{s: X(s)=x} g(X(s)) P(\{s\}) \\ &=\sum_{x} g(x) \sum_{s: X(s)=x} P(\{s\}) \\ &=\sum_{x} g(x) P(X=x). \end{aligned}$$
In the last step, we used the fact that $\sum_{s: X(s)=x} P(\{s\})$ is the weight of the super-pebble $X=x$.
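The grouping argument can be mimicked directly in code. In this sketch the four pebbles, their probabilities, and the map $X$ are all invented for illustration; the point is that the ungrouped sum over pebbles equals the grouped sum over super-pebbles:

```python
from fractions import Fraction

# Invented pebble world: four equally likely outcomes, and X assigns
# values to pebbles in a non-one-to-one way ('b' and 'c' share X = 1).
P = {'a': Fraction(1, 4), 'b': Fraction(1, 4),
     'c': Fraction(1, 4), 'd': Fraction(1, 4)}
X = {'a': 0, 'b': 1, 'c': 1, 'd': 2}
g = lambda x: x ** 3

# Ungrouped form: sum over pebbles s of g(X(s)) * P({s}).
ungrouped = sum(g(X[s]) * P[s] for s in P)

# Grouped form: for each value x, the super-pebble X = x has weight
# P(X = x) = sum of P({s}) over pebbles s with X(s) = x.
values = set(X.values())
grouped = sum(g(x) * sum(P[s] for s in P if X[s] == x) for x in values)

print(ungrouped, grouped)   # the two sums agree, as in the proof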
