There are 2 perspectives from which to view a system of linear equations: the row view and the column view. They are equivalent; sometimes the row view is more intuitive, sometimes the column view is easier to understand.

## Linear Equations (n = 2)

| View | Interpretation |
| --- | --- |
| By row | Each equation represents a straight line in the x-y plane. The intersection is the only point on both lines, and therefore the solution. |
| By column | The 2 equations form 1 vector equation. On the left-hand side, a combination of the column vectors produces the vector on the right-hand side. |
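To make the column view concrete, here is a tiny sketch in Python with a made-up 2 × 2 system (the numbers are chosen purely for illustration):

```python
# A hypothetical system:
#   1x + 2y = 4
#   3x + 4y = 10
# Row view: two lines intersecting at (x, y) = (2, 1).
# Column view: find weights x, y such that
#   x * [1, 3] + y * [2, 4] = [4, 10]

col1 = [1, 3]   # coefficients of x (first column)
col2 = [2, 4]   # coefficients of y (second column)
b = [4, 10]     # right-hand side

x, y = 2, 1     # the solution of this particular system
combination = [x * c1 + y * c2 for c1, c2 in zip(col1, col2)]
print(combination)  # [4, 10], matching b
```

The same pair (x, y) that sits on both lines in the row view is exactly the pair of weights that combines the columns into b.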

## Linear Equations (n = 3)

| View | Interpretation |
| --- | --- |
| By row | Each equation forms a plane. We are looking for the intersection of 3 different planes. In 3 dimensions, the intersection of 2 planes is a line; the intersection of 3 planes is a point. |
| By column | We still look for a way to combine the columns on the left-hand side of the equation to form the column on the right-hand side. |

## Singularity

A linear system is singular if there is no unique solution: there are either multiple (infinitely many) solutions or no solutions at all. Singularity can also be interpreted from both perspectives:

| View | Interpretation |
| --- | --- |
| By row | The n (hyper)planes do not intersect at exactly one point. |
| By column | The n vectors do not span the complete n-dimensional space. |

## Gaussian Elimination to Solve `A x = b`

Gaussian elimination is the most widely adopted method for solving linear systems. The idea is to eliminate variables.

Step 1: **Forward elimination**. Use the i-th equation to eliminate the i-th variable from equations i+1 through n. This process essentially transforms the linear system into a smaller one, finally producing a triangular system. The numbers on the diagonal are called pivots.

Step 2: **Back substitution**. From the last equation to the first, use the already-known values to derive the values of all variables.
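The two steps can be sketched in Python. This is a minimal illustration that assumes the system is non-singular and that no zero pivot is encountered (so no row exchanges are needed):

```python
# A minimal sketch of Gaussian elimination (no pivoting), assuming the
# system is non-singular and every pivot is non-zero.

def gaussian_elimination(A, b):
    """Solve A x = b, with A a list of rows and b a list."""
    n = len(b)
    A = [row[:] for row in A]   # work on copies
    b = b[:]

    # Step 1: forward elimination -> upper-triangular system.
    for i in range(n):
        for k in range(i + 1, n):
            m = A[k][i] / A[i][i]          # pivot used as denominator
            for j in range(i, n):
                A[k][j] -= m * A[i][j]
            b[k] -= m * b[i]

    # Step 2: back substitution, from the last equation to the first.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = sum(A[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / A[i][i]
    return x

# Example: x + 2y = 4, 3x + 4y = 10  ->  x = 2, y = 1
print(gaussian_elimination([[1, 2], [3, 4]], [4, 10]))
```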

### Singular cases

Assume we have n equations. If we obtain n non-zero pivots, we say the system is non-singular and there is one unique solution. It is also possible that there are zeros at some pivot positions; first try to exchange rows to fix it. If a triangular system can still be created, the system is non-singular.

If a triangular system cannot be created by exchanging rows, the system is singular. This leads to problems, because each pivot is used as a denominator in later calculations. There are 2 possible situations:

- If the system is inconsistent, there is no solution.
- If the system is consistent, there are infinitely many solutions.
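A hedged illustration of the two outcomes, using a made-up system whose second row is a multiple of the first (so no row exchange can repair the pivot):

```python
# Hypothetical singular system:
#   x + 2y = 3
#   2x + 4y = c
# Eliminating x from the second equation leaves 0*x + 0*y = c - 6.

def classify(c):
    # Subtract 2 times the first equation from the second;
    # only the right-hand side residual remains.
    residual = c - 2 * 3
    if residual != 0:
        return "singular, inconsistent: no solution"
    return "singular, consistent: infinitely many solutions"

print(classify(7))  # 0 = 1 is impossible -> no solution
print(classify(6))  # 0 = 0 always holds  -> infinitely many solutions
```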

Whichever case you end up with (non-singular, singular inconsistent, or singular consistent), Gaussian elimination reaches the correct conclusion.

### Complexity

How many elementary arithmetic operations are required to solve a system with n equations and n variables?

Forward elimination:

- For the first row, we want to eliminate the first variable in all other n-1 rows.
- For each of those n-1 rows: 1 division and n multiplication-subtractions, n + 1 operations in total, are required.
- In total, for the first row, `(n-1)(n+1) = n^2 - 1` operations are required.
- For the second row, `(n-1)^2 - 1` operations are required.
- For the third row, `(n-2)^2 - 1` operations are required.
- …
- For the n-th row, `(n-(n-1))^2 - 1 = 0` operations are required.

Upon the completion of forward elimination:

`(n^2 + ... + 1^2) - (1 + ... + 1) = n(n+1)(2n+1)/6 - n = n^3/3 + n^2/2 - 5n/6`

When n is large, this is roughly `n^3 / 3`.

Back substitution:

`1 + 2 + ... + n = n(n+1)/2 ≈ n^2 / 2`

In total, the number of operations needed is `n^3/3 + n^2/2`, so the complexity of Gaussian elimination is `O(n^3)`.
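As a sanity check on the arithmetic, a few lines of Python can compare the per-row counts with the closed form (`forward_ops` is just a name chosen here):

```python
# Sum the per-step costs (n^2 - 1) + ((n-1)^2 - 1) + ... + (1^2 - 1)
# and compare with the closed form n(n+1)(2n+1)/6 - n.

def forward_ops(n):
    return sum((n - k) ** 2 - 1 for k in range(n))

for n in (3, 10, 100):
    closed = n * (n + 1) * (2 * n + 1) // 6 - n
    assert forward_ops(n) == closed
    print(n, forward_ops(n))
```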

## Gauss-Jordan Elimination to Find `A^{-1}`

A matrix A is invertible if there exists a matrix B such that A B = B A = I. Then B is unique, called the inverse of A, and denoted by A^{-1}. To find A^{-1} from A, apply the transformation below; as long as you can obtain the identity matrix on the left, the remaining columns are A^{-1}.

`[ A | I ] -> [ I | A^{-1} ]`

If A can be converted to I, then A is non-singular and invertible; otherwise A is singular and non-invertible.

Assume `A A^{-1} = I`. This process is essentially doing n separate Gaussian eliminations, one for each of the n equations `A x_i = e_i`, where x_i is the i-th column of A^{-1} and e_i is the i-th column of I.

`[ A | e_i ] -> [ I | x_i ]`

Gauss-Jordan elimination is essentially a combination of several Gaussian eliminations.
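A minimal sketch of this idea in Python, assuming A is non-singular and, for brevity, that no row exchanges are needed:

```python
# Gauss-Jordan elimination on the augmented matrix [A | I]:
# reduce the left half to I; the right half becomes A^{-1}.

def gauss_jordan_inverse(A):
    n = len(A)
    # Build the augmented matrix [A | I].
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]

    for i in range(n):
        # Scale row i so the pivot becomes 1.
        p = M[i][i]
        M[i] = [v / p for v in M[i]]
        # Eliminate column i from every other row (above and below).
        for k in range(n):
            if k != i:
                m = M[k][i]
                M[k] = [vk - m * vi for vk, vi in zip(M[k], M[i])]

    # The right half is now A^{-1}.
    return [row[n:] for row in M]

# Example: the inverse of [[1, 2], [3, 4]] is [[-2, 1], [1.5, -0.5]]
print(gauss_jordan_inverse([[1.0, 2.0], [3.0, 4.0]]))
```

Unlike plain Gaussian elimination, entries *above* each pivot are eliminated too, which is what turns the left half into I rather than just a triangular matrix.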

### Complexity

Because the complexity of Gaussian elimination is `O(n^3)`, and Gauss-Jordan elimination is based on Gaussian elimination, the complexity of Gauss-Jordan elimination is also `O(n^3)`.

## Linear Dependence and Independence

A set of m n-dimensional vectors x_1, x_2, …, x_m is **linearly dependent** if there exists a *non-zero* vector `w ∈ R^m` such that

`w_1 x_1 + w_2 x_2 + ... + w_m x_m = 0`

A set of vectors is **linearly independent** if it is not linearly dependent.

To tell whether a set of vectors is dependent or independent, put all the vectors into the columns of a matrix, then use Gaussian elimination to find the **number of pivots**, which is the **number of linearly independent vectors**.

| number of pivots | conclusion |
| --- | --- |
| equals m | linearly independent |
| less than m | linearly dependent |

This immediately implies that m n-dimensional vectors must be linearly dependent if m > n.
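A sketch of this pivot-counting test in Python (the function name and the tolerance `eps` are choices made here, not part of the original notes):

```python
# Put the vectors into the columns of a matrix and count pivots with
# Gaussian elimination (with row exchanges).

def count_pivots(columns, eps=1e-12):
    """columns: a list of m vectors, each of dimension n."""
    n, m = len(columns[0]), len(columns)
    # Matrix whose j-th column is the j-th vector.
    M = [[columns[j][i] for j in range(m)] for i in range(n)]

    pivots = 0
    row = 0
    for col in range(m):
        # Find a row at or below `row` with a non-zero entry in this column.
        pivot_row = next((r for r in range(row, n) if abs(M[r][col]) > eps), None)
        if pivot_row is None:
            continue  # no pivot in this column
        M[row], M[pivot_row] = M[pivot_row], M[row]  # row exchange
        for r in range(row + 1, n):
            f = M[r][col] / M[row][col]
            M[r] = [a - f * b for a, b in zip(M[r], M[row])]
        pivots += 1
        row += 1
    return pivots

# Three 2-dimensional vectors (m = 3 > n = 2): at most 2 pivots,
# so they must be linearly dependent.
v1, v2, v3 = [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]
print(count_pivots([v1, v2, v3]))  # 2 pivots < m = 3 -> dependent
```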

## My Certificate

For more on **Linear Algebra in Operations Research**, please refer to the wonderful course here https://www.coursera.org/learn/operations-research-algorithms


*I am Kesler Zhu, thank you for visiting my website. Check out more course reviews at https://KZHU.ai*
