Jan 16

## Mean plus deviation-from-mean decomposition

This is about separating the deterministic and random parts of a random variable. The topic can look difficult or easy, depending on how you approach it, and the right way to think about it is theoretical.

Everything starts with a simple question: What can you do to a random variable $X$ to obtain a new variable, say, $Y$, whose mean is equal to zero? Intuitively, when you subtract the mean from $X$, the distribution moves to the left or right, depending on the sign of $EX$, so that the distribution of $Y$ is centered on zero. One of my students used this intuition to guess that you should subtract the mean: $Y=X-EX$. The guess should be confirmed by algebra: from this definition

$EY=E(X-EX)=EX-E(EX)=EX-EX=0$

(here we distributed the expectation operator and used the property that the mean of a constant ($EX$) is that constant). By the way, subtracting the mean from a variable is called centering or demeaning.
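The algebra above can be checked numerically. Below is a minimal sketch in plain Python, with a made-up sample standing in for $X$; the function name `demean` is just an illustrative choice.

```python
def demean(xs):
    """Center (demean) a sample: return u_i = x_i - mean(x)."""
    mu = sum(xs) / len(xs)
    return [x - mu for x in xs]

sample = [2.0, 5.0, 11.0, 4.0]   # hypothetical sample of X
u = demean(sample)               # deviations from the mean
print(sum(u) / len(u))           # mean of the centered sample: 0.0
```

As the algebra predicts, the centered sample always averages to zero, whatever the original mean was.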

If you understand the above, you can represent $X$ as

$X = EX+(X-EX).$

Here $\mu=EX$ is the mean and $u=X-EX$ is the deviation from the mean. As was shown above, $Eu=0$. Thus, we obtain the mean plus deviation-from-mean decomposition $X=\mu+u.$ Simple, isn't it? It is so simple that students don't pay attention to it. In fact, it is omnipresent in Statistics because $Var(X)=Var(\mu+u)=Var(u)$: adding the constant $\mu$ shifts the distribution but does not change its spread. The analysis of $Var(X)$ is reduced to that of $Var(u)$!
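The variance claim can also be verified numerically. A quick sketch with a hypothetical sample, using the population variance for simplicity:

```python
def variance(xs):
    """Population variance: average squared deviation from the mean."""
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

sample = [2.0, 5.0, 11.0, 4.0]        # hypothetical sample of X
mu = sum(sample) / len(sample)        # estimate of EX
u = [x - mu for x in sample]          # deviation from the mean

# Shifting by the constant mu leaves the variance unchanged:
print(variance(sample) == variance(u))   # True
```

This is exactly the reduction mentioned above: to study the variance of $X$, it suffices to study the variance of the zero-mean part $u$.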