
Law of iterated expectations: geometric aspect

There will be a separate post on projectors. In the meantime, we'll have a look at simple examples that explain a lot about conditional expectations.

Examples of projectors

The name "projector" is almost self-explanatory. Imagine a point and a plane in three-dimensional space. Drop a perpendicular from the point to the plane. The intersection of the perpendicular with the plane is the point's projection onto that plane. Note that if the point already belongs to the plane, its projection equals the point itself. Instead of projecting onto a plane, we can also project onto a straight line.

The above description translates into the following equations. For any x\in R^3 define

(1) P_2x=(x_1,x_2,0) and P_1x=(x_1,0,0).

P_2 projects R^3 onto the plane L_2=\{(x_1,x_2,0):x_1,x_2\in R\} (which is two-dimensional) and P_1 projects R^3 onto the straight line L_1=\{(x_1,0,0):x_1\in R\} (which is one-dimensional).
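For readers who like to check things numerically, here is a minimal numpy sketch of (1). The function names P1 and P2 are mine; this is an illustration, not a formal construction.

```python
import numpy as np

def P2(x):
    """Project a 3-vector onto the plane L_2 = {(x_1, x_2, 0)}."""
    return np.array([x[0], x[1], 0.0])

def P1(x):
    """Project a 3-vector onto the line L_1 = {(x_1, 0, 0)}."""
    return np.array([x[0], 0.0, 0.0])

x = np.array([1.0, 2.0, 3.0])
print(P2(x))                 # [1. 2. 0.]
print(P1(x))                 # [1. 0. 0.]
# A point already in the plane is its own projection:
print(P2([1.0, 2.0, 0.0]))   # [1. 2. 0.]
```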

Property 1. Double application of a projector amounts to single application.

Proof. We do this just for one of the projectors. Using (1) three times we get

(2) P_2[P_2x]=P_2(x_1,x_2,0)=(x_1,x_2,0)=P_2x.

Property 2. A successive application of two projectors yields the projection onto the subspace of smaller dimension.

Proof. If we apply first P_2 and then P_1, the result is

(3) P_1[P_2x]=P_1(x_1,x_2,0)=(x_1,0,0)=P_1x.

If we change the order of projectors, we have

(4) P_2[P_1x]=P_2(x_1,0,0)=(x_1,0,0)=P_1x.
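Before turning to the exercises, here is a quick numerical check of (2)-(4), using the coordinate definitions in (1) (again a sketch of mine, not part of the argument):

```python
import numpy as np

def P2(x):
    return np.array([x[0], x[1], 0.0])

def P1(x):
    return np.array([x[0], 0.0, 0.0])

rng = np.random.default_rng(0)
x = rng.normal(size=3)  # an arbitrary point in R^3

# (2): double application equals single application
assert np.allclose(P2(P2(x)), P2(x))
# (3) and (4): composition in either order projects onto the smaller subspace L_1
assert np.allclose(P1(P2(x)), P1(x))
assert np.allclose(P2(P1(x)), P1(x))
print("Properties 1 and 2 verified for", x)
```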

Exercise 1. Show that both projectors are linear.

Exercise 2. Like any other linear operator in a Euclidean space, these projectors are given by some matrices. What are they?

The simple truth about conditional expectation

In the time series setup, we have a sequence of information sets ...\subset I_t\subset I_{t+1}\subset... (it's natural to assume that the amount of available information increases with time). Denote by

E_tX=E(X|I_t)

the expectation of X conditional on I_t. For each t,

E_t is a projector onto the space of random functions that depend only on the information set I_t.
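To make the word "projector" concrete here, consider a toy example of my own (not part of the original argument): a sample space of four equally likely outcomes, where the information set I_1 tells you only whether the outcome fell in {1,2} or in {3,4}. A random variable is then just a vector of four values, and conditioning on I_1 averages those values within each block, which is literally multiplication by an idempotent matrix:

```python
import numpy as np

# Sample space {1,2,3,4}, each outcome with probability 1/4.
# I_1 partitions it into blocks {1,2} and {3,4}; E_1 averages within blocks.
E1 = np.array([[0.5, 0.5, 0.0, 0.0],
               [0.5, 0.5, 0.0, 0.0],
               [0.0, 0.0, 0.5, 0.5],
               [0.0, 0.0, 0.5, 0.5]])

X = np.array([1.0, 3.0, 2.0, 6.0])  # a random variable = the vector of its values
print(E1 @ X)  # [2. 2. 4. 4.] -- constant on each block, i.e. a function of I_1
# E1 is idempotent, the hallmark of a projector:
assert np.allclose(E1 @ E1, E1)
```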

Property 1. Double application of conditional expectation gives the same result as single application:

(5) E_t(E_tX)=E_tX

(E_tX is already a function of I_t, so conditioning it on I_t doesn't change it).

Property 2. Successive conditioning on two different information sets is the same as conditioning on the smaller one:

(6) E_tE_{t+1}X=E_tX,

(7) E_{t+1}E_tX=E_tX.

Property 3. Conditional expectation is a linear operator: for any variables X,Y and numbers a,b

E_t(aX+bY)=aE_tX+bE_tY.
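Properties 1-3 can all be verified in the same toy setting. The sketch below (self-contained, with nested partitions of my own choosing) takes I_1 as above and a finer I_2 with blocks {1}, {2}, {3,4}, so that I_1\subset I_2:

```python
import numpy as np

# Sample space {1,2,3,4}, equal probabilities 1/4.
# I_1: blocks {1,2}, {3,4}.  I_2 (finer): blocks {1}, {2}, {3,4}.
E1 = np.array([[0.5, 0.5, 0.0, 0.0],
               [0.5, 0.5, 0.0, 0.0],
               [0.0, 0.0, 0.5, 0.5],
               [0.0, 0.0, 0.5, 0.5]])
E2 = np.array([[1.0, 0.0, 0.0, 0.0],
               [0.0, 1.0, 0.0, 0.0],
               [0.0, 0.0, 0.5, 0.5],
               [0.0, 0.0, 0.5, 0.5]])

rng = np.random.default_rng(1)
X, Y = rng.normal(size=4), rng.normal(size=4)
a, b = 2.0, -3.0

assert np.allclose(E1 @ (E1 @ X), E1 @ X)  # (5): projector property
assert np.allclose(E1 @ (E2 @ X), E1 @ X)  # (6): Law of Iterated Expectations
assert np.allclose(E2 @ (E1 @ X), E1 @ X)  # (7): E_1 X is already known at t=2
assert np.allclose(E1 @ (a * X + b * Y),
                   a * (E1 @ X) + b * (E1 @ Y))  # Property 3: linearity
print("Properties 1-3 verified; the smaller information set wins.")
```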

It's easy to see that (5)-(7) are similar to (2)-(4), respectively, but I prefer to use different names for (5)-(7). I call (5) a projector property. (6) is known as the Law of Iterated Expectations; see my post on the informational aspect for more intuition. (7) holds simply because at time t+1 the expectation E_tX is already known and behaves like a constant.

Summary. (5)-(7) are easy to remember as one property: the smaller information set wins, E_sE_tX=E_{\min\{s,t\}}X.
