# Introductory Linear Algebra/System of linear equations


## Systems of linear equations in matrix form

We should define what a system of linear equations is before expressing it in matrix form.

Definition. (System of linear equations) A system of linear equations (SLE) in ${\displaystyle n}$ unknowns ${\displaystyle x_{1},x_{2},\ldots ,x_{n}}$ is a family of equations of the form

${\displaystyle {\begin{cases}a_{11}x_{1}+a_{12}x_{2}+\cdots +a_{1n}x_{n}=b_{1},\\a_{21}x_{1}+a_{22}x_{2}+\cdots +a_{2n}x_{n}=b_{2},\\\quad \qquad \qquad \qquad \vdots \\a_{m1}x_{1}+a_{m2}x_{2}+\cdots +a_{mn}x_{n}=b_{m},\end{cases}}}$
in which ${\displaystyle a_{ij}}$'s and ${\displaystyle b_{k}}$'s are some constants.

Remark.

• in some other definitions, a single linear equation can be regarded as a system of linear equations, but since we can already easily solve a single linear equation, we are not interested in such a case, and thus we do not include this possibility

We often use the terms 'consistent' and 'inconsistent' to describe the number of solutions of a system of linear equations.

Definition.

(Consistency of system of linear equations)

A system of linear equations is consistent if it has at least one solution. Otherwise, it is inconsistent (i.e., it is inconsistent if it has no solutions).

Remark.

• as we will see, a SLE can either have no solutions, one unique solution, or infinitely many solutions. Thus, a SLE is consistent if and only if it has either a unique solution or infinitely many solutions

Example. (Consistent SLE) Consider the SLE

${\displaystyle {\begin{cases}x+2y=3&\quad (1)\\3x+6y=9&\quad (2)\end{cases}}.}$
Since
${\displaystyle (2)\iff 3x+6y=9\iff (3/3)x+(6/3)y=(9/3)\iff x+2y=3\iff (1),}$
the solutions are ${\displaystyle (x,y)}$ such that ${\displaystyle x+2y=3}$. Thus, this SLE has infinitely many solutions, and the SLE is consistent.

Example. (Inconsistent SLE) Consider the SLE

${\displaystyle {\begin{cases}x+2y=3&\quad (1)\\x+2y=4&\quad (2)\end{cases}}.}$
Putting ${\displaystyle (1)}$ into ${\displaystyle (2)}$, we get
${\displaystyle 3=4}$
which is false, and thus there does not exist ${\displaystyle (x,y)}$ such that both equations are satisfied. That is, the SLE has no solutions, and thus it is inconsistent.

Example. (Application of SLE) Suppose ten bottles of orange juice have been allocated to a chicken, a duck and a goose. Given that the chicken and the duck have the same number of bottles of orange juice, and the goose has one more bottle of orange juice than the chicken, how many bottles of orange juice have been allocated to each animal?

Solution: Denote the number of orange juice bottles allocated to the chicken, duck and goose by ${\displaystyle c,d}$ and ${\displaystyle g}$ respectively. Then, from the situation and the given conditions, we have the following SLE:

${\displaystyle {\begin{cases}c+d+g=10&\quad (1)\\c=d&\quad (2)\\g=c+1&\quad (3)\end{cases}}}$
in which ${\displaystyle c,d,g}$ are all nonnegative integers.

Putting ${\displaystyle (2)}$ and ${\displaystyle (3)}$ into ${\displaystyle (1)}$, we have

${\displaystyle c+c+c+1=10\implies c=3.}$
It follows that ${\displaystyle d=c=3}$, and ${\displaystyle g=c+1=4}$.

Thus, three bottles of orange juice have been allocated to each of the chicken and duck, and four bottles of orange juice have been allocated to the goose.
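As a sanity check of the hand computation above, the same system can be solved numerically; a minimal sketch using NumPy, rewriting ${\displaystyle (2)}$ as ${\displaystyle c-d=0}$ and ${\displaystyle (3)}$ as ${\displaystyle -c+g=1}$:

```python
import numpy as np

# Rows encode c + d + g = 10, c - d = 0, -c + g = 1 in the unknowns (c, d, g)
A = np.array([[1.0, 1.0, 1.0],
              [1.0, -1.0, 0.0],
              [-1.0, 0.0, 1.0]])
b = np.array([10.0, 0.0, 1.0])

c, d, g = np.linalg.solve(A, b)  # approximately (3, 3, 4), matching the hand computation
```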

Exercise.

1 Choose SLE(s) from the following.

 ${\displaystyle {\begin{cases}x^{2}+y^{2}=1\\2x+3y=4\end{cases}}}$ ${\displaystyle {\begin{cases}x=1\\y=2\end{cases}}}$ ${\displaystyle {\begin{cases}x=1\\x=2\end{cases}}}$ ${\displaystyle {\begin{cases}\pi x+{\sqrt {\pi }}y=\ln 2\\x+y{\sqrt {2}}=2\end{cases}}}$

2 The final score of each student (the highest possible score is ${\displaystyle 100}$) is a weighted average of the score of the student in test 1 and test 2 (full marks of test 1 and test 2 are both ${\displaystyle 100}$). Student A gets ${\displaystyle 90}$ and ${\displaystyle 80}$ marks from test 1 and test 2 respectively, and student B gets ${\displaystyle 75}$ and ${\displaystyle 100}$ marks from test 1 and test 2 respectively. Suppose the weighting on the scores of test 1 and test 2 are ${\displaystyle w_{1}}$ and ${\displaystyle w_{2}}$ respectively. It is given that the final score of student A is ${\displaystyle 85.23}$ exactly. Which of the following is (are) true?

 the given information is not sufficient to compute ${\displaystyle w_{1}}$ and ${\displaystyle w_{2}}$ one possible allocation of weightings is given by ${\displaystyle (w_{1},w_{2})=(0.6,0.390375)}$ the final score of student B is the same as that of student A the final score of student B is strictly higher than that of student A the final score of student B is strictly lower than that of student A we do not know whether the final score of student B is higher, lower than, or the same as that of student A

After defining a system of linear equations, we can express it in matrix form in multiple ways, as in the following definition.

Definition. (Coefficient and augmented matrix) Let

${\displaystyle {\begin{cases}a_{11}x_{1}+a_{12}x_{2}+\cdots +a_{1n}x_{n}=b_{1},\\a_{21}x_{1}+a_{22}x_{2}+\cdots +a_{2n}x_{n}=b_{2},\\\quad \qquad \qquad \qquad \vdots \\a_{m1}x_{1}+a_{m2}x_{2}+\cdots +a_{mn}x_{n}=b_{m}\end{cases}}}$
be a system of linear equations in the unknowns ${\displaystyle x_{1},x_{2},\ldots ,x_{n}}$, in which ${\displaystyle a_{ij}}$'s and ${\displaystyle b_{k}}$'s are some constants.

The matrix ${\displaystyle {\color {green}(a_{ij})_{m\times n}}}$ is the coefficient matrix of the system, and the matrix

${\displaystyle \left({\begin{array}{cccc|c}a_{11}&a_{12}&\cdots &a_{1n}&b_{1}\\a_{21}&a_{22}&\cdots &a_{2n}&b_{2}\\\vdots &\vdots &\ddots &\vdots &\vdots \\a_{m1}&a_{m2}&\cdots &a_{mn}&b_{m}\end{array}}\right)}$
is the augmented matrix of the system.

Remark.

• the vertical bar in the augmented matrix is optional; it is put there to separate the constants on the left of the '${\displaystyle =}$' sign from the constants on the right of the '${\displaystyle =}$' sign in the system of linear equations.
• this system is equivalent to

${\displaystyle \underbrace {\begin{pmatrix}a_{11}&a_{12}&\cdots &a_{1n}\\a_{21}&a_{22}&\cdots &a_{2n}\\\vdots &\vdots &\ddots &\vdots \\a_{m1}&a_{m2}&\cdots &a_{mn}\end{pmatrix}} _{{\text{denoted by}}\;A}\underbrace {\begin{pmatrix}x_{1}\\x_{2}\\\vdots \\x_{n}\end{pmatrix}} _{{\text{denoted by}}\;\mathbf {x} }=\underbrace {\begin{pmatrix}b_{1}\\b_{2}\\\vdots \\b_{m}\end{pmatrix}} _{{\text{denoted by}}\;\mathbf {b} },}$
which can be rewritten as ${\displaystyle A\mathbf {x} =\mathbf {b} }$

• the augmented matrix contains all the information needed to solve a system, since the names of the unknowns are not needed for solving it.

Example. (Coefficient matrix and augmented matrix) Consider the SLE

${\displaystyle {\begin{cases}w+x+y=1\\w+z=3\\x+z=8\end{cases}}}$
which can alternatively be expressed as
${\displaystyle {\begin{cases}w+x+y+0z=1\\w+0x+0y+z=3\\0w+x+0y+z=8\end{cases}}.}$

The coefficient matrix of this SLE is

${\displaystyle {\begin{pmatrix}1&1&1&0\\1&0&0&1\\0&1&0&1\end{pmatrix}},}$
and the augmented matrix of this SLE is
${\displaystyle \left({\begin{array}{cccc|c}1&1&1&0&1\\1&0&0&1&3\\0&1&0&1&8\end{array}}\right).}$
We may also express this SLE as
${\displaystyle {\begin{pmatrix}1&1&1&0\\1&0&0&1\\0&1&0&1\end{pmatrix}}{\begin{pmatrix}w\\x\\y\\z\end{pmatrix}}={\begin{pmatrix}1\\3\\8\end{pmatrix}},}$
which is in the form of ${\displaystyle A\mathbf {x} =\mathbf {b} }$.
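In code, the coefficient matrix, the augmented matrix, and the matrix form ${\displaystyle A\mathbf {x} =\mathbf {b} }$ can be written down directly; a sketch in NumPy (the particular solution ${\displaystyle (w,x,y,z)=(-2,3,0,5)}$ used below is an assumption for illustration, one of many solutions of this SLE):

```python
import numpy as np

# Coefficient matrix and right-hand side of the example SLE
A = np.array([[1, 1, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 0, 1]])
b = np.array([1, 3, 8])

# The augmented matrix is A with b appended as an extra column
augmented = np.hstack([A, b.reshape(-1, 1)])

# Verify one particular solution (w, x, y, z) = (-2, 3, 0, 5) of A x = b
x = np.array([-2, 3, 0, 5])
assert (A @ x == b).all()
```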

Exercise.

1 Choose SLE(s) from the following.

 ${\displaystyle (a_{ij})_{3\times 3}}$ ${\displaystyle \left({\begin{array}{cc|c}1&2&3\\4&5&6\end{array}}\right)}$ ${\displaystyle {\begin{pmatrix}1&2\\4&5\end{pmatrix}}{\begin{pmatrix}x_{1}\\x_{2}\end{pmatrix}}={\begin{pmatrix}3\\6\end{pmatrix}}}$ ${\displaystyle A\mathbf {x} =\mathbf {b} }$ in which ${\displaystyle A}$ is a ${\displaystyle 3\times 2}$ matrix, ${\displaystyle \mathbf {x} ,\mathbf {b} }$ are ${\displaystyle 3\times 1}$ matrices.

2 A SLE is represented by the augmented matrix

${\displaystyle {\begin{pmatrix}1&0&0&0&0&0&0\\0&1&0&0&0&0&0\\0&0&1&0&0&0&6\\0&0&0&0&1&0&3\\0&0&0&1&0&0&8\\0&0&0&0&0&1&2\end{pmatrix}}.}$
Choose correct statement(s).

 one possible solution is ${\displaystyle (x_{1},x_{2},x_{3},x_{4},x_{5},x_{6})=(0,0,6,3,8,2)}$ one possible solution is ${\displaystyle (x_{1},x_{2},x_{3},x_{4},x_{5},x_{6})=(0,0,6,8,3,2)}$ the size of the coefficient matrix of this SLE is ${\displaystyle 6\times 7}$ the SLE in the question has unique solution

3 Which of the following is (are) the augmented matrix (matrices) representing the SLE

${\displaystyle {\begin{cases}x+2y=3\\6x+8y+2z=0\end{cases}}}$
?

 the augmented matrix of this SLE does not exist ${\displaystyle {\begin{pmatrix}1&2&0&3\\6&8&2&0\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}1&2&3\\6&8&2\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}1&2&0\\6&8&2\end{pmatrix}}}$

4 Choose augmented matrix (matrices) that represents an inconsistent SLE from the following.

 ${\displaystyle {\begin{pmatrix}1&1&1\\6&6&0\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}1&1\\2&3\end{pmatrix}}}$ ${\displaystyle I_{2}}$ ${\displaystyle I_{3}}$ ${\displaystyle O_{2\times 2}}$

## Gauss-Jordan algorithm

Definition. (Elementary row operations) There are three types of elementary row operations (EROs) that we may perform on a matrix, as follows:

• (Type I) interchanging two different rows
• (Type II) multiplying a row by a nonzero scalar
• (Type III) adding a scalar multiple of a row to another row

Remark. We use the following notations for EROs:

• ${\displaystyle \mathbf {r} _{1},\mathbf {r} _{2},\ldots ,\mathbf {r} _{m}}$: rows of an ${\displaystyle m\times n}$ matrix (boldface is used since the rows are essentially row vectors)
• ${\displaystyle \mathbf {r} _{\color {blue}i}{\color {green}\leftrightarrow }\mathbf {r} _{\color {red}j}}$: interchanging the ${\displaystyle \color {blue}i}$th row and the ${\displaystyle \color {red}j}$th row
• ${\displaystyle {\color {green}k}\mathbf {r} _{i}\to \mathbf {r} _{i}}$: multiplying the ${\displaystyle i}$th row by a nonzero scalar ${\displaystyle \color {green}k}$
• ${\displaystyle {\color {purple}k}\mathbf {r} _{\color {red}j}{\color {green}+}\mathbf {r} _{\color {blue}i}\to \mathbf {r} _{\color {blue}i}}$: adding ${\displaystyle \color {purple}k}$ times the ${\displaystyle \color {red}j}$th row to the ${\displaystyle \color {blue}i}$th row

Definition. (Row equivalence) Two matrices of the same size are row equivalent to each other if one matrix can be obtained from the other matrix by performing some EROs.

Remark.

• since EROs are reversible (by the following proposition), if matrix ${\displaystyle A}$ can be obtained from matrix ${\displaystyle B}$ by performing some EROs, then ${\displaystyle B}$ can also be obtained from ${\displaystyle A}$ by performing some EROs (each of them reverses one of the EROs performed to obtain ${\displaystyle A}$ from ${\displaystyle B}$, in suitable order)
• thus, for two matrices ${\displaystyle A}$ and ${\displaystyle B}$ of the same size, it suffices to prove that ${\displaystyle A}$ can be obtained from ${\displaystyle B}$ by some EROs (or vice versa; either direction suffices) to prove that ${\displaystyle A}$ and ${\displaystyle B}$ are row equivalent

Example. (Demonstration of three types of EROs) Consider the matrix

${\displaystyle A={\begin{pmatrix}{\color {red}1}&{\color {red}2}&{\color {red}3}\\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {green}7}&{\color {green}8}&{\color {green}9}\end{pmatrix}}.}$
We perform some EROs as follows:
{\displaystyle {\begin{aligned}{\begin{pmatrix}{\color {red}1}&{\color {red}2}&{\color {red}3}\\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {green}7}&{\color {green}8}&{\color {green}9}\end{pmatrix}}&{\overset {\mathbf {r} _{1}\leftrightarrow \mathbf {r} _{3}}{\to }}{\begin{pmatrix}{\color {green}7}&{\color {green}8}&{\color {green}9}\\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {red}1}&{\color {red}2}&{\color {red}3}\end{pmatrix}}\\&{\overset {\pi \mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}{\color {green}7}\pi &{\color {green}8}\pi &{\color {green}9}\pi \\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {red}1}&{\color {red}2}&{\color {red}3}\end{pmatrix}}\\&{\overset {2\mathbf {r} _{2}+\mathbf {r} _{3}\to \mathbf {r} _{3}}{\to }}{\begin{pmatrix}{\color {green}7}\pi &{\color {green}8}\pi &{\color {green}9}\pi \\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\2({\color {blue}4})+{\color {red}1}&2({\color {blue}5})+{\color {red}2}&2({\color {blue}6})+{\color {red}3}\\\end{pmatrix}}\\&{\overset {-2\mathbf {r} _{2}+\mathbf {r} _{3}\to \mathbf {r} _{3}}{\to }}{\begin{pmatrix}{\color {green}7}\pi &{\color {green}8}\pi &{\color {green}9}\pi \\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {red}1}&{\color {red}2}&{\color {red}3}\\\end{pmatrix}}\\&{\overset {(1/\pi )\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}{\color {green}7}&{\color {green}8}&{\color {green}9}\\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {red}1}&{\color {red}2}&{\color {red}3}\\\end{pmatrix}}\\&{\overset {\mathbf {r} _{1}\leftrightarrow \mathbf {r} _{3}}{\to }}{\begin{pmatrix}{\color {red}1}&{\color {red}2}&{\color {red}3}\\{\color {blue}4}&{\color {blue}5}&{\color {blue}6}\\{\color {green}7}&{\color {green}8}&{\color {green}9}\\\end{pmatrix}}=A.\end{aligned}}}
Each matrix shown here is row equivalent to ${\displaystyle A}$, since each of them can be obtained from ${\displaystyle A}$ by performing some EROs, and has the same size as ${\displaystyle A}$ (it also shows how to perform EROs in suitable order to reverse the EROs performed previously).
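The ERO sequence above (and its reversal) can be reproduced with row operations on a NumPy array; a minimal sketch:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
M = A.copy()

M[[0, 2]] = M[[2, 0]]    # Type I:   r1 <-> r3
M[0] *= np.pi            # Type II:  pi * r1 -> r1
M[2] += 2 * M[1]         # Type III: 2 * r2 + r3 -> r3

# Reverse each ERO, in the opposite order
M[2] -= 2 * M[1]         # -2 * r2 + r3 -> r3
M[0] /= np.pi            # (1/pi) * r1 -> r1
M[[0, 2]] = M[[2, 0]]    # r1 <-> r3

assert np.allclose(M, A)  # all EROs undone: back to the original matrix
```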

Exercise.

1 Which of the following is (are) row equivalent to ${\displaystyle I_{3}}$, the identity matrix with size ${\displaystyle 3\times 3}$?

 ${\displaystyle I_{4}}$ ${\displaystyle {\begin{pmatrix}0&1&0\\1&0&0\\0&0&1\\\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}9&0&0\\0&1&0\\0&0&1\\\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}1&0&0\\-7&1&12\\-8&0&1\\\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}0&1&0\\1&0&0\\0&0&1\\\end{pmatrix}}{\begin{pmatrix}9&0&0\\0&1&0\\0&0&1\\\end{pmatrix}}}$

2 Choose correct statement(s).

 after performing EROs ${\displaystyle \mathbf {r} _{1}\leftrightarrow \mathbf {r} _{2},\mathbf {r} _{2}\leftrightarrow \mathbf {r} _{3},\mathbf {r} _{3}\leftrightarrow \mathbf {r} _{1}}$, in this order, on a matrix with at least three rows, the resultant matrix is the same as the original matrix after performing the ERO ${\displaystyle 3\mathbf {r} _{1}+\mathbf {r} _{2}\to \mathbf {r} _{2}}$ on ${\displaystyle I_{2}}$, we get ${\displaystyle {\begin{pmatrix}3&0\\0&1\end{pmatrix}}}$ given two arbitrary EROs, performing them on the same matrix in different orders gives the same resultant matrix given two arbitrary EROs, performing them on the same matrix in different orders gives different resultant matrices

Proposition. (EROs are reversible) If matrix ${\displaystyle A}$ can be obtained from matrix ${\displaystyle B}$ by performing some EROs, then ${\displaystyle B}$ can also be obtained from ${\displaystyle A}$ by performing some EROs (which may be different from the EROs for obtaining ${\displaystyle A}$ from ${\displaystyle B}$).

Proof. Outline: Each type of ERO has a reverse process (i.e., performing the ERO together with its reverse process, in either order, has no effect on the matrix), and this reverse process is itself an ERO, as illustrated below:

• the reverse process of type I ERO ${\displaystyle \mathbf {r} _{i}\leftrightarrow \mathbf {r} _{j}}$ is also ${\displaystyle \mathbf {r} _{i}\leftrightarrow \mathbf {r} _{j}}$
• the reverse process of type II ERO ${\displaystyle k\mathbf {r} _{i}\to \mathbf {r} _{i}}$ is type II ERO ${\displaystyle {\frac {1}{k}}\mathbf {r} _{i}\to \mathbf {r} _{i}}$ (if ${\displaystyle k=0}$, this reverse ERO is undefined; this is why ${\displaystyle k}$ must be nonzero for a type II ERO, so that it is reversible)
• the reverse process of type III ERO ${\displaystyle k\mathbf {r} _{j}+\mathbf {r} _{i}\to \mathbf {r} _{i}}$ is type III ERO ${\displaystyle -k\mathbf {r} _{j}+\mathbf {r} _{i}\to \mathbf {r} _{i}}$

${\displaystyle \Box }$

Example. (Illustration of reverse process of each type of ERO)

• Type I:

${\displaystyle {\begin{pmatrix}{\color {red}1}&{\color {red}2}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}{\overset {\mathbf {r} _{1}\leftrightarrow \mathbf {r} _{2}}{\to }}{\begin{pmatrix}{\color {blue}3}&{\color {blue}4}\\{\color {red}1}&{\color {red}2}\\\end{pmatrix}}{\overset {\mathbf {r} _{1}\leftrightarrow \mathbf {r} _{2}}{\to }}{\begin{pmatrix}{\color {red}1}&{\color {red}2}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}}$

• Type II:

${\displaystyle {\begin{pmatrix}{\color {red}1}&{\color {red}2}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}{\overset {{\color {green}k}\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}{\color {red}1}{\color {green}k}&{\color {red}2}{\color {green}k}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}{\overset {{\color {green}{\frac {1}{k}}}\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}{\color {red}1}{\color {green}{\cancel {k}}}\cdot {\color {green}{\frac {1}{\cancel {k}}}}&{\color {red}2}{\color {green}{\cancel {k}}}\cdot {\color {green}{\frac {1}{\cancel {k}}}}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}={\begin{pmatrix}{\color {red}1}&{\color {red}2}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}\quad (k\neq 0)}$

• Type III:

${\displaystyle {\begin{pmatrix}{\color {red}1}&{\color {red}2}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}{\overset {{\color {green}k}\mathbf {r} _{2}+\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}{\color {red}1}+{\color {blue}3}{\color {green}k}&{\color {red}2}+{\color {blue}4}{\color {green}k}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}{\overset {{\color {green}-k}\mathbf {r} _{2}+\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}{\color {red}1}{\cancel {+{\color {blue}3}{\color {green}k}-{\color {blue}3}{\color {green}k}}}&{\color {red}2}{\cancel {+{\color {blue}4}{\color {green}k}-{\color {blue}4}{\color {green}k}}}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}={\begin{pmatrix}{\color {red}1}&{\color {red}2}\\{\color {blue}3}&{\color {blue}4}\\\end{pmatrix}}}$

Exercise.

1 Choose correct ERO(s) that is (are) the reverse process of the ERO ${\displaystyle 2^{-1}\mathbf {r} _{3}\to \mathbf {r} _{3}}$.

 ${\displaystyle 2\mathbf {r} _{1}\to \mathbf {r} _{1}}$ ${\displaystyle 2\mathbf {r} _{3}\to \mathbf {r} _{3}}$ ${\displaystyle 2^{-1}\mathbf {r} _{3}\to \mathbf {r} _{3}}$ ${\displaystyle 2^{-1}\mathbf {r} _{1}\to \mathbf {r} _{1}}$

2 Choose correct ERO(s) that is (are) the reverse process of the ERO ${\displaystyle 2^{-1}\mathbf {r} _{3}+\mathbf {r} _{1}\to \mathbf {r} _{1}}$.

 ${\displaystyle -2^{-1}\mathbf {r} _{1}+\mathbf {r} _{3}\to \mathbf {r} _{3}}$ ${\displaystyle -2\mathbf {r} _{1}+\mathbf {r} _{3}\to \mathbf {r} _{3}}$ ${\displaystyle -2^{-1}\mathbf {r} _{3}+\mathbf {r} _{1}\to \mathbf {r} _{1}}$ ${\displaystyle -2\mathbf {r} _{3}+\mathbf {r} _{1}\to \mathbf {r} _{1}}$

Proposition. (Same solution set by row equivalence) Let ${\displaystyle A\mathbf {x} =\mathbf {b} }$ and ${\displaystyle C\mathbf {y} =\mathbf {d} }$ be two systems of linear equations with the same number of equations and the same number of variables. If the augmented matrices ${\displaystyle (A|\mathbf {b} )}$ and ${\displaystyle (C|\mathbf {d} )}$ are row equivalent, then the two systems have the same solution set.

Proof. Outline: It suffices to show that the solution set is unchanged if we perform one ERO. E.g.

• Type I ERO:

${\displaystyle {\text{Compare }}{\begin{cases}x+2y=3\\4x+5y=6\\\end{cases}}{\text{ and }}{\begin{cases}4x+5y=6\\x+2y=3\\\end{cases}}}$

• Type II ERO:

${\displaystyle {\text{Compare }}{\begin{cases}x+2y=3\\4x+5y=6\\\end{cases}}{\text{ and }}{\begin{cases}kx+2ky=3k\\4x+5y=6\\\end{cases}}\Leftrightarrow {\begin{cases}x+2y=3\\4x+5y=6\\\end{cases}}\quad (k\neq 0)}$

• Type III ERO:

${\displaystyle {\text{Compare }}{\begin{cases}x+2y=3\\4x+5y=6\\\end{cases}}{\text{ and }}{\begin{cases}x+2y=3&\quad (1)\\(4+k)x+(5+2k)y=6+3k&\quad (2)\\\end{cases}}}$
${\displaystyle (2)-k(1):}$
${\displaystyle (4+k-k)x+(5+2k-2k)y=6+3k-3k\iff 4x+5y=6}$

${\displaystyle \Box }$
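The invariance can also be checked numerically for the type III case above; a sketch with NumPy, taking ${\displaystyle k=2}$ (an arbitrary choice for illustration):

```python
import numpy as np

# Original system: x + 2y = 3, 4x + 5y = 6
A1 = np.array([[1., 2.],
               [4., 5.]])
b1 = np.array([3., 6.])

# Type III ERO with k = 2: add 2 * (row 1) to row 2, applied to the augmented data
A2, b2 = A1.copy(), b1.copy()
A2[1] += 2 * A2[0]
b2[1] += 2 * b2[0]

s1 = np.linalg.solve(A1, b1)
s2 = np.linalg.solve(A2, b2)
assert np.allclose(s1, s2)  # same solution set: (x, y) = (-1, 2)
```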

Exercise. Consider three row equivalent matrices

${\displaystyle A={\begin{pmatrix}7&5&3&4\\6&2&3&2\\4&8&1&6\\\end{pmatrix}},B={\begin{pmatrix}1&0&0&-1\\0&1&0&1\\0&0&1&2\\\end{pmatrix}},C={\begin{pmatrix}7&5&3&4\\0&16&-3&10\\0&0&1&2\end{pmatrix}}.}$

1 Solve the SLE

${\displaystyle {\begin{cases}7x+5y+3z&=4\\6x+2y+3z&=2\\4x+8y-z&=6\end{cases}}.}$

 Its unique solution: ${\displaystyle x=}$ ${\displaystyle y=}$ ${\displaystyle z=}$

2 Solve the SLE

${\displaystyle {\begin{cases}3x+5y+7z&=4\\-3x+16y&=10\\x&=2\end{cases}}.}$

 Its unique solution: ${\displaystyle x=}$ ${\displaystyle y=}$ ${\displaystyle z=}$

Definition. (Leading entry) The leading entry of a row of a matrix is the leftmost nonzero entry.

Example. The leading entries of the 1st, 2nd and 3rd rows of the matrix

${\displaystyle {\begin{pmatrix}0&{\color {blue}2}&3\\0&0&{\color {blue}3}\\{\color {blue}8}&6&2\\\end{pmatrix}}}$
are ${\displaystyle 2,3,8}$ respectively.

Exercise.

1 What is the leading entry of the first row of ${\displaystyle O_{3\times 3}}$?

 0 1 2 3 it does not exist

2 What is the leading entry of the first row of ${\displaystyle I_{3}}$?

 0 1 2 3 it does not exist

Definition. (Row echelon form) A matrix is in row echelon form (REF) if

1. All the zero rows (if any) lie at the bottom of the matrix, and
2. the leading entry of each nonzero row is always strictly on the right of the leading entries of the rows above.

Definition. (Reduced row echelon form) A matrix is in reduced row echelon form (RREF) if

1. It is in row echelon form (REF),
2. the leading entry of each nonzero row equals ${\displaystyle {\color {green}1}}$ (which is called a leading one), and
3. for each leading one, all other entries in the same column are zero.

Example. (Examples of REF and RREF)

• the following matrices are in REF, but not in RREF:

${\displaystyle {\begin{pmatrix}{\color {blue}3}&2&1&1\\0&{\color {blue}2}&9&4\\0&0&0&{\color {blue}7}\\0&0&0&0\\0&0&0&0\\\end{pmatrix}},{\begin{pmatrix}0&0&{\color {blue}3}&2&1&0\\0&0&0&0&0&{\color {blue}2}\\0&0&0&0&0&0\\\end{pmatrix}}}$

• the following matrices are in RREF (and thus are also in REF):

${\displaystyle {\begin{pmatrix}{\color {blue}1}&{\color {green}0}&7&{\color {green}0}&1\\{\color {green}0}&{\color {blue}1}&9&{\color {green}0}&2\\{\color {green}0}&{\color {green}0}&0&{\color {blue}1}&9\\{\color {green}0}&{\color {green}0}&0&{\color {green}0}&0\\\end{pmatrix}},{\begin{pmatrix}{\color {blue}1}&{\color {green}0}&7&{\color {green}0}&1\\{\color {green}0}&{\color {blue}1}&9&{\color {green}0}&2\\{\color {green}0}&{\color {green}0}&0&{\color {blue}1}&9\\\end{pmatrix}},{\begin{pmatrix}{\color {blue}1}&{\color {green}0}&7&{\color {green}0}\\{\color {green}0}&{\color {blue}1}&9&{\color {green}0}\\{\color {green}0}&{\color {green}0}&0&{\color {blue}1}\\\end{pmatrix}},{\begin{pmatrix}0&{\color {blue}1}&{\color {green}0}&{\color {green}0}\\0&{\color {green}0}&{\color {blue}1}&{\color {green}0}\\0&{\color {green}0}&{\color {green}0}&{\color {blue}1}\\\end{pmatrix}},I,O}$

• the following matrices are not in REF (and thus are also not in RREF):

${\displaystyle {\begin{pmatrix}1&2&1&1\\{\color {red}0}&{\color {red}0}&{\color {red}0}&{\color {red}0}\\0&1&9&4\\0&0&0&1\\0&0&0&0\\\end{pmatrix}},{\begin{pmatrix}0&0&0&0&0&{\color {red}1}\\0&0&{\color {red}1}&2&1&0\\0&0&0&0&0&0\\\end{pmatrix}},{\begin{pmatrix}{\color {red}1}&2&3\\{\color {red}4}&5&6\\{\color {red}7}&8&9\\\end{pmatrix}},{\begin{pmatrix}0&0&{\color {red}1}\\0&{\color {red}1}&0\\{\color {red}1}&0&0\\\end{pmatrix}},{\begin{pmatrix}{\color {red}1}&0&0\\0&{\color {red}1}&0\\0&{\color {red}1}&0\\\end{pmatrix}}}$
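The defining conditions can be turned into small checker functions; a sketch in Python, written directly from the conditions above (the helper names `leading_index`, `is_ref` and `is_rref` are hypothetical, not standard library functions):

```python
import numpy as np

def leading_index(row):
    """Column index of the leading (leftmost nonzero) entry, or None for a zero row."""
    nz = np.nonzero(row)[0]
    return nz[0] if nz.size else None

def is_ref(M):
    """REF: zero rows at the bottom, leading entries strictly move to the right."""
    last = -1
    seen_zero_row = False
    for row in M:
        lead = leading_index(row)
        if lead is None:
            seen_zero_row = True
        elif seen_zero_row or lead <= last:
            return False
        else:
            last = lead
    return True

def is_rref(M):
    """RREF: REF, leading entries equal 1, zeros elsewhere in each pivot column."""
    if not is_ref(M):
        return False
    for i, row in enumerate(M):
        lead = leading_index(row)
        if lead is not None:
            if row[lead] != 1:
                return False
            if any(M[j, lead] != 0 for j in range(M.shape[0]) if j != i):
                return False
    return True

# The first REF (but not RREF) example and one RREF example from the text
assert is_ref(np.array([[3, 2, 1, 1], [0, 2, 9, 4], [0, 0, 0, 7], [0, 0, 0, 0]]))
assert not is_rref(np.array([[3, 2, 1, 1], [0, 2, 9, 4], [0, 0, 0, 7], [0, 0, 0, 0]]))
assert is_rref(np.array([[0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]))
```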

Exercise.

1 Choose RREF(s) from the following.

 ${\displaystyle {\begin{pmatrix}1&0\\1&0\\0&0\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}1&0\\1&0\\0&1\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}2&0&2&2\\0&2&0&0\\0&0&0&0\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}2&0\\0&2\\0&0\end{pmatrix}}}$ ${\displaystyle {\begin{pmatrix}1&2&0\\0&0&1\\0&2&0\end{pmatrix}}}$

2 How many possible values are there for ${\displaystyle x}$ such that the matrix

${\displaystyle {\begin{pmatrix}x&0&0&1\\0&x&2x&0\\x&x&1&3\end{pmatrix}}}$
is in RREF?

 0 1 2 3 infinitely many

3 Consider the matrix

${\displaystyle {\begin{pmatrix}y&0&0&0&0&1&0&0\\0&0&0&0&0&0&1&0\\0&0&0&0&0&0&0&0\\0&0&0&0&y&0&y&0\\0&0&0&y&0&0&0&y\\0&0&0&0&0&0&y&y\\\end{pmatrix}}}$
How many possible values are there for ${\displaystyle y}$ such that the matrix is in REF?

 0 1 2 3 infinitely many

4 Consider the matrix

${\displaystyle {\begin{pmatrix}0&x&0&0\\0&0&y&0\\0&0&0&z\end{pmatrix}}}$
How many possible values are there for ${\displaystyle (x,y,z)}$ such that the matrix is in RREF?

 6 7 8 9 infinitely many

Definition. (Gauss-Jordan Algorithm) The Gauss-Jordan algorithm is a process of performing EROs on a matrix to convert it to a matrix in RREF. Its steps are as follows:

1. Consider the leftmost nonzero column, say column ${\displaystyle i}$. Swap rows (if necessary) to make the 1st entry ${\displaystyle a}$ of column ${\displaystyle i}$ nonzero.
2. Multiply the 1st row by ${\displaystyle a^{-1}}$ so that the 1st entry of column ${\displaystyle i}$ is ${\displaystyle 1}$.
3. For each other row with a nonzero entry ${\displaystyle b}$ in column ${\displaystyle i}$, add ${\displaystyle -b}$ times the 1st row to this row to make the entry ${\displaystyle b=0}$.
4. If all rows except the 1st are zero rows, we are done. Otherwise, consider the leftmost column that contains a nonzero entry outside the 1st row, say column ${\displaystyle j}$. Swap the rows below the 1st row (if necessary) to make the 2nd entry ${\displaystyle c}$ of column ${\displaystyle j}$ nonzero.
5. Multiply the 2nd row by ${\displaystyle c^{-1}}$ to make the 2nd entry of column ${\displaystyle j}$ equal to ${\displaystyle 1}$.
6. For each other row with a nonzero entry ${\displaystyle d}$ in column ${\displaystyle j}$, add ${\displaystyle -d}$ times the 2nd row to this row to make the entry ${\displaystyle d=0}$.
7. Repeat the process by considering each of the rows below until all rows or columns are used, or the remaining rows are all zero rows. Then, the resultant matrix is in RREF.

Remark.

• the RREF of a matrix ${\displaystyle A}$ is a matrix in RREF obtained from ${\displaystyle A}$ by performing some EROs
• it follows that every matrix has an RREF, since we can apply the Gauss-Jordan algorithm to every matrix
• the RREF of a matrix is unique (the proof is complicated, and so is skipped)
• in some other definitions of Gauss-Jordan algorithm, some steps may differ, but we should also be able to convert the matrix to its RREF in those ways

Example. (Illustration of Gauss-Jordan Algorithm)

{\displaystyle {\begin{aligned}{\begin{pmatrix}0&0&1\\3&3&2\\8&2&6\\\end{pmatrix}}&{\overset {\mathbf {r} _{1}\leftrightarrow \mathbf {r} _{2}}{\to }}{\begin{pmatrix}3&3&2\\0&0&1\\8&2&6\\\end{pmatrix}}&{\text{step 1}}\\&{\overset {{\frac {1}{3}}\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}1&1&2/3\\0&0&1\\8&2&6\\\end{pmatrix}}&{\text{step 2}}\\&{\overset {-8\mathbf {r} _{1}+\mathbf {r} _{3}\to \mathbf {r} _{3}}{\to }}{\begin{pmatrix}1&1&2/3\\0&0&1\\0&-6&2/3\\\end{pmatrix}}&{\text{step 3}}\\&{\overset {\mathbf {r} _{2}\leftrightarrow \mathbf {r} _{3}}{\to }}{\begin{pmatrix}1&1&2/3\\0&-6&2/3\\0&0&1\\\end{pmatrix}}&{\text{step 4}}\\&{\overset {-{\frac {1}{6}}\mathbf {r} _{2}\to \mathbf {r} _{2}}{\to }}{\begin{pmatrix}1&1&2/3\\0&1&-1/9\\0&0&1\\\end{pmatrix}}&{\text{step 5}}\\&{\overset {-\mathbf {r} _{2}+\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}{\begin{pmatrix}1&0&7/9\\0&1&-1/9\\0&0&1\\\end{pmatrix}}&{\text{step 6}}\\&{\overset {{\frac {1}{9}}\mathbf {r} _{3}+\mathbf {r} _{2}\to \mathbf {r} _{2}}{\overset {-{\frac {7}{9}}\mathbf {r} _{3}+\mathbf {r} _{1}\to \mathbf {r} _{1}}{\to }}}{\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\\\end{pmatrix}}&{\text{step 7}}\end{aligned}}}
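The steps above can be collected into a small routine; a simplified sketch of the Gauss-Jordan algorithm in Python (floating-point issues and pivot selection for numerical stability are ignored, so this is for illustration, not production use):

```python
import numpy as np

def rref(M):
    """Convert a matrix to reduced row echelon form by the Gauss-Jordan algorithm."""
    M = M.astype(float)
    rows, cols = M.shape
    pivot_row = 0
    for col in range(cols):
        # Find a row at or below pivot_row with a nonzero entry in this column
        candidates = [r for r in range(pivot_row, rows) if M[r, col] != 0]
        if not candidates:
            continue
        M[[pivot_row, candidates[0]]] = M[[candidates[0], pivot_row]]  # Type I: swap
        M[pivot_row] /= M[pivot_row, col]                # Type II: scale to leading one
        for other in range(rows):
            if other != pivot_row and M[other, col] != 0:
                M[other] -= M[other, col] * M[pivot_row]  # Type III: eliminate
        pivot_row += 1
        if pivot_row == rows:
            break
    return M

# The worked example above: this matrix reduces to the identity
A = np.array([[0, 0, 1],
              [3, 3, 2],
              [8, 2, 6]])
assert np.allclose(rref(A), np.eye(3))
```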

Exercise.

Solve the SLE

${\displaystyle {\begin{cases}x+2y=3\\4x+5y=6\end{cases}}}$
using Gauss-Jordan algorithm.

 Its unique solution: ${\displaystyle x=}$ ${\displaystyle y=}$

Prove that the RREF of the matrix

${\displaystyle {\begin{pmatrix}1&4&2&1&0&0\\3&1&5&0&1&0\\0&1&0&0&0&1\end{pmatrix}}}$
is
${\displaystyle {\begin{pmatrix}1&0&0&-5&2&18\\0&1&0&0&0&1\\0&0&1&3&-1&-11\\\end{pmatrix}},}$

and verify that

${\displaystyle {\begin{pmatrix}1&4&2\\3&1&5\\0&1&0\\\end{pmatrix}}{\begin{pmatrix}-5&2&18\\0&0&1\\3&-1&-11\\\end{pmatrix}}={\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\\\end{pmatrix}}=I_{3}}$

Proposition. (Determining number of solutions of a system of linear equations) Let ${\displaystyle A\mathbf {x} =\mathbf {b} }$ be a system of ${\displaystyle m}$ linear equations in ${\displaystyle n}$ unknowns. Let ${\displaystyle R}$ be the RREF of the augmented matrix ${\displaystyle (A|\mathbf {b} )}$, which is of size ${\displaystyle m\times (n+1)}$. Then, the following hold.

• if ${\displaystyle R}$ has a leading one in the ${\displaystyle {\color {green}(n+1)}}$st column, then the system is inconsistent
• if ${\displaystyle R}$ has a leading one in each of the first ${\displaystyle {\color {green}n}}$ columns, but not the ${\displaystyle {\color {green}(n+1)}}$st column, then the system has a unique solution.
• if ${\displaystyle R}$ has no leading one in the ${\displaystyle {\color {green}(n+1)}}$st column, and not all of the first ${\displaystyle {\color {green}n}}$ columns contain leading ones, then the system has infinitely many solutions

Remark. Since ${\displaystyle R}$ satisfies exactly one of these three conditions, it follows that a system of linear equations has either no solutions, exactly one solution, or infinitely many solutions.

Example. The SLE represented by the ${\displaystyle 3\times 4}$ augmented matrix

${\displaystyle {\begin{pmatrix}1&2&0&0\\0&0&1&0\\{\color {green}0}&{\color {green}0}&{\color {green}0}&{\color {green}1}\\\end{pmatrix}}}$
which is in RREF, is inconsistent, since it has a leading one in the 4th column.

The SLE represented by the ${\displaystyle 4\times 4}$ augmented matrix

${\displaystyle {\begin{pmatrix}{\color {green}1}&0&0&2\\0&{\color {green}1}&0&3\\0&0&{\color {green}1}&3\\0&0&0&0\\\end{pmatrix}}}$
which is in RREF, has a unique solution, since it has a leading one in each of the first 3 columns, but not the 4th column.

The SLE represented by the ${\displaystyle 3\times 4}$ augmented matrix

${\displaystyle {\begin{pmatrix}{\color {green}1}&3&0&2\\0&0&{\color {green}1}&3\\0&0&0&0\\\end{pmatrix}}}$
which is in RREF, has infinitely many solutions, since it has no leading one in the 4th column, and not all of the first 3 columns contain leading ones (the 2nd column does not contain a leading one). This matrix can represent the SLE
${\displaystyle {\begin{cases}x+3y=2\\z=3\\\end{cases}}}$
If we let ${\displaystyle y=t}$ be the independent unknown, then ${\displaystyle x=2-3t}$ and ${\displaystyle z=3}$.

Remark.

• independent unknowns (or free variables) are unknowns corresponding to columns without leading ones
• dependent unknowns (or basic variables) are unknowns corresponding to columns with leading ones
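The three cases of the proposition can be read off programmatically from an augmented matrix in RREF; a sketch in Python (the function name `classify` is a hypothetical helper, written directly from the three bullet points above):

```python
import numpy as np

def classify(R):
    """Classify an augmented matrix R already in RREF: 'none', 'unique' or 'infinite'."""
    n = R.shape[1] - 1  # number of unknowns
    pivot_cols = set()
    for row in R:
        nz = np.nonzero(row)[0]
        if nz.size:
            pivot_cols.add(int(nz[0]))  # column of the leading one
    if n in pivot_cols:                 # leading one in the (n+1)st column
        return 'none'
    if pivot_cols == set(range(n)):     # leading one in each of the first n columns
        return 'unique'
    return 'infinite'

# The three example augmented matrices above
assert classify(np.array([[1, 2, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])) == 'none'
assert classify(np.array([[1, 0, 0, 2], [0, 1, 0, 3], [0, 0, 1, 3], [0, 0, 0, 0]])) == 'unique'
assert classify(np.array([[1, 3, 0, 2], [0, 0, 1, 3], [0, 0, 0, 0]])) == 'infinite'
```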

Exercise. It is given that the RREF of the matrix

${\displaystyle {\begin{pmatrix}1&4&5&6&8\\4&6&7&4&4\\4&2&3&6&7\\1&7&5&3&6\\\end{pmatrix}}}$
is
${\displaystyle {\begin{pmatrix}1&0&0&0&{\frac {-46}{141}}\\0&1&0&0&{\frac {241}{282}}\\0&0&1&0&{\frac {-119}{141}}\\0&0&0&1&{\frac {143}{94}}\\\end{pmatrix}}.}$
Denote the SLE's
${\displaystyle {\begin{cases}5w+4x+y+6z&=8\\7w+6x+4y+4z&=4\\3w+2x+4y+6z&=7\\5w+7x+y+3z&=6\\\end{cases}},{\begin{cases}5w+4x+y+6z&=8\\7w+6x+4y+4z&=4\\3w+2x+4y+6z&=7\\5w+7x+y+3z&=6\\0w+0x+0y+0z&=0\\\end{cases}},{\begin{cases}5w+4x+y+6z&=8\\7w+6x+4y+4z&=4\\3w+2x+4y+6z&=7\\5w+7x+y+3z&=6\\w+x+y+z&=1\\\end{cases}},}$
by ${\displaystyle A}$, ${\displaystyle B}$ and ${\displaystyle C}$ respectively.

1 Does ${\displaystyle A}$ have no solutions, a unique solution, or infinitely many solutions?


2 Does ${\displaystyle B}$ have no solutions, a unique solution, or infinitely many solutions?


3 Does ${\displaystyle C}$ have no solutions, a unique solution, or infinitely many solutions?

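After attempting the exercise, the pivot columns can be double-checked by machine. A sketch assuming sympy, with each system entered as its augmented matrix in the unknown order ${\displaystyle w,x,y,z}$; recall that a leading one in the last column signals an inconsistent system:

```python
from sympy import Matrix

# Augmented matrix of SLE A; B appends a zero row, C appends w + x + y + z = 1.
rows = [[5, 4, 1, 6, 8], [7, 6, 4, 4, 4], [3, 2, 4, 6, 7], [5, 7, 1, 3, 6]]
for name, extra in [("A", []), ("B", [[0, 0, 0, 0, 0]]), ("C", [[1, 1, 1, 1, 1]])]:
    R, pivots = Matrix(rows + extra).rref()
    # pivots lists the (0-based) columns that contain leading ones.
    print(name, "-> leading ones in columns", pivots)
```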

Definition. (Homogeneous system of linear equations) A system of linear equations is homogeneous if it is of the form ${\displaystyle A\mathbf {x} =\mathbf {0} }$. A homogeneous system must be consistent, since it always has the solution in which each unknown equals zero, which is defined to be the trivial solution. Other solutions, if they exist, are called nontrivial solutions.

Remark. By the proposition about determining the number of solutions of a system of linear equations, a homogeneous system of linear equations has either a unique solution or infinitely many solutions: the possibility of having no solutions is excluded, since a homogeneous system must be consistent.

Example. (Examples of homogeneous systems of linear equations) The SLE

${\displaystyle {\begin{cases}x+y+z&=0\\2x+8y+3z&=0\\2x+4y+6z&=0\\\end{cases}}}$
is homogeneous, and thus is consistent.

Indeed, the RREF of the augmented matrix representing this SLE is

${\displaystyle {\begin{pmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\\end{pmatrix}},}$
and we can see that the unique solution of this SLE is ${\displaystyle (x,y,z)=(0,0,0)}$, which is the trivial solution.

The SLE

${\displaystyle {\begin{cases}x+4y+5z&=0\\2x+3y+5z&=0\\x+2y+3z&=0\\\end{cases}}}$
is homogeneous, and thus is consistent.

Indeed, the RREF of the augmented matrix representing this SLE is

${\displaystyle {\begin{pmatrix}1&0&1&0\\0&1&1&0\\0&0&0&0\\\end{pmatrix}},}$
and therefore this SLE has infinitely many solutions, since not all of the first 3 columns contain leading ones, and there is no leading one in the 4th column in this matrix.
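The contrast between these two homogeneous examples can also be seen from their nullspaces. A sketch assuming sympy, whose `nullspace()` returns a basis of the solution set of ${\displaystyle A\mathbf {x} =\mathbf {0} }$:

```python
from sympy import Matrix

# Coefficient matrices of the two homogeneous systems above.
A1 = Matrix([[1, 1, 1], [2, 8, 3], [2, 4, 6]])
A2 = Matrix([[1, 4, 5], [2, 3, 5], [1, 2, 3]])

# An empty nullspace means only the trivial solution exists.
print(A1.nullspace())   # []
# A nonempty basis means nontrivial (hence infinitely many) solutions exist.
print(A2.nullspace())
```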

Proposition. (Sufficient condition for homogeneous system of linear equations to have nontrivial solutions) A homogeneous system of ${\displaystyle m}$ linear equations in ${\displaystyle n}$ unknowns must have a nontrivial solution if ${\displaystyle {\color {green}m<n}}$.

Proof. The augmented matrix of the homogeneous system ${\displaystyle A\mathbf {x} =\mathbf {0} }$ of ${\displaystyle m}$ linear equations in ${\displaystyle n}$ unknowns is ${\displaystyle (A|\mathbf {0} )}$, and thus its RREF has the form ${\displaystyle (R|\mathbf {0} )}$, in which ${\displaystyle R}$ is of size ${\displaystyle m\times n}$, since there are ${\displaystyle m}$ linear equations and ${\displaystyle n}$ unknowns.

If ${\displaystyle R}$ has a leading one in each of the first ${\displaystyle n}$ columns, then, since each row of ${\displaystyle R}$ contains at most one leading one, ${\displaystyle R}$ has at least ${\displaystyle n}$ rows. However, ${\displaystyle R}$ has only ${\displaystyle m<n}$ rows, a contradiction. Thus, the homogeneous SLE does not have a unique solution.

Since a SLE can either have no solutions (which is impossible for a homogeneous SLE), a unique solution (which is impossible in this case), or infinitely many solutions, it follows that the homogeneous SLE must have infinitely many solutions, and thus have a nontrivial solution.

${\displaystyle \Box }$

Remark.

• this proposition does not claim the converse: if a homogeneous SLE has a nontrivial solution, the number of linear equations need not be strictly less than the number of unknowns
• e.g., in the second homogeneous example above, the number of linear equations equals the number of unknowns, yet the SLE still has a nontrivial solution

Example. The homogeneous SLE

${\displaystyle {\begin{cases}x+y+z=0\\2x+5y+z=0\end{cases}}}$
must have a nontrivial solution.

Indeed, the RREF of the augmented matrix representing this SLE is

${\displaystyle {\begin{pmatrix}1&0&{\frac {4}{3}}&0\\0&1&-{\frac {1}{3}}&0\end{pmatrix}}.}$
Letting ${\displaystyle z=t}$ be the independent unknown, we have ${\displaystyle x=-{\frac {4}{3}}t,y={\frac {1}{3}}t}$, which gives a nontrivial solution for every ${\displaystyle t\neq 0}$.
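The guarantee of the proposition (here ${\displaystyle m=2<3=n}$) can likewise be checked with sympy (an assumption): the nullspace of the coefficient matrix is nonempty, and every nonzero multiple of its basis vector is a nontrivial solution.

```python
from sympy import Matrix, zeros

A = Matrix([[1, 1, 1], [2, 5, 1]])   # 2 equations in 3 unknowns: m < n
basis = A.nullspace()                # a basis of the solution set of A x = 0
print(basis)

# Every multiple t * basis[0] solves the homogeneous system.
assert A * basis[0] == zeros(2, 1)
```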