## Gaussian elimination method

Consider the system

$$\begin{aligned}
a_{11}x_1+a_{12}x_2+a_{13}x_3&=b_1,\\
a_{21}x_1+a_{22}x_2+a_{23}x_3&=b_2,\\
a_{31}x_1+a_{32}x_2+a_{33}x_3&=b_3.
\end{aligned}$$

If the first equation is nontrivial, then at least one of its coefficients is different from zero. Suppose it is $a_{11}$. Adding the first equation multiplied by $-a_{21}/a_{11}$ to the second one, we eliminate $x_1$ from it. Similarly, adding the first equation multiplied by $-a_{31}/a_{11}$ to the third one, we eliminate $x_1$ from it. The system becomes

$$\begin{aligned}
a_{11}x_1+a_{12}x_2+a_{13}x_3&=b_1,\\
a'_{22}x_2+a'_{23}x_3&=b'_2,\\
a'_{32}x_2+a'_{33}x_3&=b'_3,
\end{aligned}$$

where $a'$ and $b'$ with indexes stand for the new numbers obtained after the elimination.

If $a'_{22}\neq 0$, we can use the second equation to eliminate $x_2$ from the third equation, and the result will be

$$\begin{aligned}
a_{11}x_1+a_{12}x_2+a_{13}x_3&=b_1,\\
a'_{22}x_2+a'_{23}x_3&=b'_2,\\
a''_{33}x_3&=b''_3,
\end{aligned}$$

with some new $a''_{33}$ and $b''_3$. If $a''_{33}\neq 0$, we can solve the system backwards, finding first $x_3$ from the third equation, then $x_2$ from the second and, finally, $x_1$ from the first.

Notice that for the method to work it does not matter what happens to the vector $b$ on the right side of the system. It only matters what happens to the matrix $A$. That's why we focus on transformations of $A$.
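The elimination-and-back-substitution procedure described above can be sketched in code. Below is a minimal illustration in plain Python (the function name `solve` is mine; no row exchanges are performed, so the sketch assumes that every pivot it meets is nonzero, exactly as in the discussion above):

```python
def solve(A, b):
    """Solve the square system A x = b by forward elimination
    followed by back substitution (no row exchanges; assumes
    all pivots encountered are nonzero)."""
    n = len(A)
    A = [row[:] for row in A]   # work on copies
    b = b[:]
    # Forward elimination: adding row k multiplied by -A[i][k]/A[k][k]
    # to row i zeroes out the entries below each pivot.
    for k in range(n):
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
            b[i] -= m * b[k]
    # Back substitution: find x_n, then x_{n-1}, ..., x_1.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x
```

For instance, `solve([[2.0, 1.0, 1.0], [4.0, -6.0, 0.0], [-2.0, 7.0, 2.0]], [5.0, -2.0, 9.0])` returns `[1.0, 1.0, 2.0]`.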

## Theoretical treatment

**Exercise 1**. Denote

$$\Delta_1=a_{11},\quad
\Delta_2=\det\begin{pmatrix}a_{11}&a_{12}\\a_{21}&a_{22}\end{pmatrix},\quad\ldots,\quad
\Delta_n=\det A$$

the **leading principal minors** of $A$. If all of them are different from zero, then the Gaussian method reduces $A$ (by way of premultiplication of $A$ by elementary matrices) to the triangular matrix

$$T=\begin{pmatrix}t_{11}&\ast&\cdots&\ast\\0&t_{22}&\cdots&\ast\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&t_{nn}\end{pmatrix},$$

where

$$t_{11}=\Delta_1,\quad t_{22}=\frac{\Delta_2}{\Delta_1},\quad\ldots,\quad t_{nn}=\frac{\Delta_n}{\Delta_{n-1}}.$$

**Proof by induction**. Let $n=2$. Premultiplying $A$ by the elementary matrix

$$E=\begin{pmatrix}1&0\\-a_{21}/a_{11}&1\end{pmatrix}$$

we get the desired result:

$$EA=\begin{pmatrix}a_{11}&a_{12}\\0&a_{22}-\frac{a_{21}}{a_{11}}a_{12}\end{pmatrix}
=\begin{pmatrix}\Delta_1&a_{12}\\0&\Delta_2/\Delta_1\end{pmatrix}.$$

Now let the statement hold for $n-1$. By the induction assumption we can start with the matrix of the form

$$\begin{pmatrix}t_{11}&\ast&\cdots&\ast&\ast\\0&t_{22}&\cdots&\ast&\ast\\\vdots&\vdots&\ddots&\vdots&\vdots\\0&0&\cdots&t_{n-1,n-1}&\ast\\0&0&\cdots&\beta&\alpha\end{pmatrix},$$

where $t_{kk}=\Delta_k/\Delta_{k-1}$ for $k\leq n-1$ (the same row operations that triangularize the leading $(n-1)\times(n-1)$ block also create zeros in the first $n-2$ positions of the last row).

Using the condition that all $\Delta_k$ are different from zero (in particular, $t_{n-1,n-1}=\Delta_{n-1}/\Delta_{n-2}\neq 0$), we can make zero the element $\beta$ in position $(n,n-1)$. The result is:

$$T=\begin{pmatrix}t_{11}&\ast&\cdots&\ast\\0&t_{22}&\cdots&\ast\\\vdots&\vdots&\ddots&\vdots\\0&0&\cdots&t_{nn}\end{pmatrix}$$

with some number $t_{nn}$ that remains to be identified.

The determinant of a triangular matrix equals the product of its diagonal elements because of the cross-out rule:

$$\det T=t_{11}t_{22}\cdots t_{nn}\qquad(1)$$

(for example, if a product in the expansion of the determinant contains $t_{1j}$ with $j>1$, you should cross out the first row and the $j$-th column, and then the product should contain one of the zeros below $t_{11}$).
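As a minimal worked instance of (1), expanding a $3\times 3$ upper triangular determinant along the first column crosses out exactly the zeros below $t_{11}$:

$$\det\begin{pmatrix}t_{11}&t_{12}&t_{13}\\0&t_{22}&t_{23}\\0&0&t_{33}\end{pmatrix}
=t_{11}\det\begin{pmatrix}t_{22}&t_{23}\\0&t_{33}\end{pmatrix}
=t_{11}t_{22}t_{33}.$$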

On the other hand, $T$ is the result of premultiplication of $A$ by elementary matrices: $T=E_s\cdots E_1A$, where $\det E_j=1$ for all $j$ (each $E_j$ adds a multiple of one row to another row, an operation that does not change the determinant). Hence,

$$\det T=\det E_s\cdots\det E_1\cdot\det A=\det A=\Delta_n.\qquad(2)$$

Combining (1) and (2) we get $t_{11}t_{22}\cdots t_{nn}=\Delta_n$. Since by the induction assumption $t_{11}t_{22}\cdots t_{n-1,n-1}=\Delta_1\cdot\frac{\Delta_2}{\Delta_1}\cdots\frac{\Delta_{n-1}}{\Delta_{n-2}}=\Delta_{n-1}$, it follows that $t_{nn}=\Delta_n/\Delta_{n-1}$, which concludes the inductive argument.
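Exercise 1 is easy to check numerically. The sketch below is plain Python (the helper names `leading_minors` and `pivots` are mine, and a $3\times 3$ example is hard-coded for simplicity); it compares the diagonal produced by elimination with the ratios $\Delta_1,\ \Delta_2/\Delta_1,\ \Delta_3/\Delta_2$:

```python
def leading_minors(A):
    """Leading principal minors Delta_1, Delta_2, Delta_3 of a 3x3 matrix."""
    d1 = A[0][0]
    d2 = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    d3 = (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
          - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
          + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))
    return d1, d2, d3

def pivots(A):
    """Diagonal of the triangular matrix T obtained by premultiplying A
    with elementary matrices (row additions only, no row exchanges)."""
    A = [row[:] for row in A]          # work on a copy
    n = len(A)
    for k in range(n):
        for i in range(k + 1, n):
            m = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= m * A[k][j]
    return [A[i][i] for i in range(n)]

A = [[2.0, 1.0, 1.0], [4.0, -6.0, 0.0], [-2.0, 7.0, 2.0]]
d1, d2, d3 = leading_minors(A)
p = pivots(A)
# p should equal [d1, d2 / d1, d3 / d2]
```

For this matrix the minors are $\Delta_1=2$, $\Delta_2=-16$, $\Delta_3=-16$, and the pivots come out as $2,\ -8,\ 1$, that is, $\Delta_1,\ \Delta_2/\Delta_1,\ \Delta_3/\Delta_2$, as Exercise 1 predicts.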
