Mean plus deviation-from-mean decomposition
This post is about separating the deterministic and random parts of a random variable. The topic can look difficult or easy, depending on how you approach it; the right way to think about it is theoretical.
Everything starts with a simple question: what can you do to a random variable $X$ to obtain a new variable, say $Y$, whose mean is equal to zero? Intuitively, when you subtract the mean from $X$, the distribution moves to the left or right, depending on the sign of $EX$, so that the distribution of $Y$ is centered on zero. One of my students used this intuition to guess that you should subtract the mean: $Y=X-EX$. The guess should be confirmed by algebra: from this definition
$$EY=E(X-EX)=EX-E(EX)=EX-EX=0$$
(here we distributed the expectation operator and used the property that the mean of a constant ($EX$) is that constant). By the way, subtracting the mean from a variable is called centering or demeaning.
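The centering step is easy to check numerically. Below is a minimal sketch in Python with NumPy (the sample and variable names are my own, not from the text): for a sample, the analogue of $EX$ is the sample mean, and subtracting it produces a variable whose mean is (numerically) zero.

```python
import numpy as np

# Draw a sample from a distribution with a nonzero mean (illustrative values).
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=1000)

# Centering (demeaning): the sample analogue of Y = X - EX.
y = x - x.mean()

# The mean of the centered variable is zero up to floating-point error.
print(abs(y.mean()))
```

Whatever the original location of the distribution, the same one-line subtraction recenters it at zero.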
If you understand the above, you can represent $X$ as
$$X=EX+(X-EX).$$
Here $EX$ is the mean and $X-EX$ is the deviation from the mean. As was shown above, $E(X-EX)=0$. Thus, we obtain the mean plus deviation-from-mean decomposition
$$X=\underbrace{EX}_{\text{mean}}+\underbrace{(X-EX)}_{\text{deviation from mean}}.$$
Simple, isn't it? It is so simple that students don't pay attention to it. In fact, it is omnipresent in Statistics because $E(X-EX)=0$. The analysis of $X$ is reduced to that of $X-EX$!
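The decomposition can be sketched numerically as well. In this illustration (again using sample analogues of my own choosing, not code from the text), the constant part carries the mean and the deviation part has mean zero, and together they reconstruct the original variable exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, size=500)

# Deterministic part: the (sample) mean, repeated for every observation.
mean_part = np.full_like(x, x.mean())

# Random part: the deviation from the mean, which has mean zero.
deviation = x - x.mean()

# mean + deviation-from-mean reconstructs x exactly.
print(np.allclose(mean_part + deviation, x))
print(abs(deviation.mean()))
```

This is the sense in which the analysis of $X$ reduces to the analysis of its deviation from the mean: the deterministic part is a known constant, and all the randomness sits in the zero-mean part.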