7 Aug 18

Summary and questions for repetition

I was planning to cover the Moore-Penrose inverse, which allows one to solve the equation Ax=y for any A (not necessarily square). My feeling now is that it would be too much for a standard linear algebra course. This is the most easily accessible source.
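For readers curious about the topic that was left out: the Moore-Penrose inverse gives a least-squares solution of Ax=y even when A is not square. A minimal numerical sketch using NumPy's `pinv` (the matrix and right-hand side below are made-up illustrative data, not from the course):

```python
import numpy as np

# Solve Ax = y in the least-squares sense for a non-square A
# using the Moore-Penrose inverse (numpy.linalg.pinv).
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])      # 3x2: more equations than unknowns
y = np.array([1.0, 2.0, 3.0])

x = np.linalg.pinv(A) @ y       # minimizes ||Ax - y|| in the 2-norm
print(x)                        # here the system happens to be consistent
```

In this particular example the three equations are consistent, so the least-squares solution solves the system exactly; for an inconsistent y, `pinv` would return the best approximation instead.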

  1. Give the definition and example of an orthonormal system. Prove that elements of such a system are linearly independent.
  2. "To know how a matrix acts on vectors, it is enough to know how it acts on the elements of an orthonormal basis." Explain.
  3. How can you recover the elements of a matrix A from (Ax)\cdot y?
  4. A linear mapping from one Euclidean space to another generates a matrix. Prove.
  5. Prove that an inverse of a linear mapping is linear.
  6. What is a linear mapping in the one-dimensional case?
  7. In case of a square matrix, what are the four equivalent conditions for the equation Ax=y to be good (uniquely solvable for all y)?
  8. Give two equivalent definitions of linear independence.
  9. List the simple facts about linear dependence that students need to learn first.
  10. Prove the criterion of linear independence.
  11. Let the vectors x^{(1)},...,x^{(k)} be linearly dependent and consider the regression model y=\beta_1x^{(1)}+...+\beta_kx^{(k)}+u. Show that here the coefficients \beta_1,...,\beta_k cannot be uniquely determined (this is a purely algebraic fact, you don't need to know anything about multiple regression).
  12. Define a basis. Prove that if x^{(1)},...,x^{(k)} is a basis and x is decomposed as x=a_1x^{(1)}+...+a_kx^{(k)}, then the coefficients a_1,...,a_k are unique. Prove further that they are linear functions of x.
  13. Prove that the terms in the orthogonal sum of two subspaces have intersection zero.
  14. Prove dimension additivity.
  15. Prove that a matrix and its adjoint have the same rank.
  16. Prove the rank-nullity theorem.
  17. Prove the upper bound on the matrix rank in terms of the matrix dimensions.
  18. Vectors are linearly dependent if and only if one of them can be expressed as a linear combination of the others. Prove.
  19. What can be said about linear (in)dependence if some vectors are added to or removed from the system of vectors?
  20. Prove that if the number of vectors in a system is larger than the space dimension, then such a system is linearly dependent.
  21. Give a list of all properties of rank that you've learned so far.
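Several of the rank facts above (items 15 and 17 in particular) can be checked numerically. A small sketch with NumPy, on a made-up matrix whose second row is a multiple of the first:

```python
import numpy as np

# Illustrate two rank facts: a real matrix and its transpose
# (its adjoint, in the real case) have the same rank, and
# rank(A) <= min(m, n) for an m x n matrix.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # = 2 * first row, so rank < 3
              [0.0, 1.0, 1.0]])

r = np.linalg.matrix_rank(A)
print(r)                            # rank of A
print(np.linalg.matrix_rank(A.T))   # same rank for the transpose
print(r <= min(A.shape))            # upper bound from item 17
```

Of course, a numerical check is no substitute for the proofs requested above; it only helps to see the statements in action.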
