Aug 18

## Summary and questions for repetition

I was planning to cover the Moore-Penrose inverse, which allows one to solve the equation $Ax=y$ for any $A$ (not necessarily square). Now my feeling is that it would be too much for a standard linear algebra course. This is the most easily accessible source.
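For the curious, the idea behind the Moore-Penrose inverse can be sketched numerically with NumPy's `pinv`; the matrix and right-hand side below are made up for illustration and are not from the course:

```python
import numpy as np

# A hypothetical non-square system: 3 equations, 2 unknowns.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])

# The Moore-Penrose pseudoinverse gives the least-squares solution
# of Ax = y even when A is not square (or not invertible).
A_plus = np.linalg.pinv(A)
x = A_plus @ y

# This system is consistent (y = A @ [1, 2]), so x solves it exactly.
print(x)  # approximately [1., 2.]
```

When the system has no exact solution, the same formula returns the vector minimizing $\|Ax-y\|$, which is why the pseudoinverse is the natural generalization of $A^{-1}$.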

1. Give the definition and example of an orthonormal system. Prove that elements of such a system are linearly independent.
2. "To know how a matrix acts on vectors, it is enough to know how it acts on the elements of an orthonormal basis." Explain.
3. How can you recover the elements of a matrix $A$ from $(Ax)\cdot y$?
4. A linear mapping from one Euclidean space to another generates a matrix. Prove.
5. Prove that the inverse of a linear mapping, when it exists, is linear.
6. What is a linear mapping in the one-dimensional case?
7. In the case of a square matrix, what are the four equivalent conditions for the equation $Ax=y$ to be good (uniquely solvable for all $y$)?
8. Give two equivalent definitions of linear independence.
9. List the simple facts about linear dependence that students need to learn first.
10. Prove the criterion of linear independence.
11. Let the vectors $x^{(1)},...,x^{(k)}$ be linearly dependent and consider the regression model $y=\beta_1x^{(1)}+...+\beta_kx^{(k)}+u.$ Show that here the coefficients $\beta_1,...,\beta_k$ cannot be uniquely determined (this is a purely algebraic fact, you don't need to know anything about multiple regression).
12. Define a basis. Prove that if $x^{(1)},...,x^{(k)}$ is a basis and $x$ is decomposed as $x=a_1x^{(1)}+...+a_kx^{(k)},$ then the coefficients $a_1,...,a_k$ are unique. Prove further that they are linear functions of $x.$
13. Prove that the terms in the orthogonal sum of two subspaces have intersection zero.
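Question 3 can be checked numerically: taking $x=e_j$ and $y=e_i$ (standard basis vectors) in $(Ax)\cdot y$ recovers the entry $A_{ij}$. A small sketch with a random matrix (the matrix itself is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))

# Columns of the identity matrix are the standard basis vectors e_i.
I = np.eye(3)

# The entry A[i, j] equals (A e_j) . e_i: evaluating the bilinear
# form on basis vectors recovers every matrix element.
recovered = np.array([[(A @ I[:, j]) @ I[:, i] for j in range(3)]
                      for i in range(3)])
print(np.allclose(recovered, A))  # True
```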
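Questions 8 and 12 combine nicely in a numerical sketch: for an orthonormal basis, the unique coefficients in the decomposition of $x$ are simply the dot products $a_i = x\cdot e^{(i)}$. The basis below is a made-up example in $\mathbb{R}^2$:

```python
import numpy as np

# A hypothetical orthonormal basis of R^2.
e1 = np.array([1.0, 1.0]) / np.sqrt(2)
e2 = np.array([1.0, -1.0]) / np.sqrt(2)

# Decompose x = a1*e1 + a2*e2; orthonormality lets us recover the
# (unique) coefficients as dot products: a_i = x . e_i.
x = np.array([3.0, 1.0])
a1, a2 = x @ e1, x @ e2

# Reassembling from the coefficients gives back x exactly.
print(np.allclose(a1 * e1 + a2 * e2, x))  # True
```

Note that each coefficient $a_i$ is a dot product with a fixed vector, which also illustrates the claim that the coefficients are linear functions of $x$.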
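The non-identifiability in question 11 is easy to see numerically: with linearly dependent regressors, two different coefficient vectors produce identical fitted values, so no amount of data can distinguish them. The regressors below are invented for illustration:

```python
import numpy as np

# Two linearly dependent regressors: x2 = 2 * x1.
x1 = np.array([1.0, 2.0, 3.0])
x2 = 2.0 * x1
X = np.column_stack([x1, x2])

# Two different coefficient vectors...
beta_a = np.array([1.0, 1.0])  # x1 + x2     = 3*x1
beta_b = np.array([3.0, 0.0])  # 3*x1 + 0*x2 = 3*x1

# ...give exactly the same systematic part X @ beta, so the
# coefficients cannot be uniquely determined from y.
print(np.allclose(X @ beta_a, X @ beta_b))  # True
```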