There are two perspectives for looking at systems of linear equations: the row view and the column view. The two are equivalent; sometimes the row view is more intuitive, and sometimes the column view is easier to understand.

Linear Equations (n = 2)

By row: Each equation represents a straight line on the x-y plane. The intersection is the only point on both lines and therefore the solution.
By column: The two equations form one vector equation. On the left-hand side, a combination of the column vectors produces the vector on the right-hand side.
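
As a concrete illustration, here is a minimal NumPy sketch with a made-up 2x2 system (not from the course). It solves the system once, then checks that the same solution also gives the combination of the columns that produces the right-hand side.

```python
import numpy as np

# Invented example system:
#     x + 2y = 3
#    4x + 5y = 6
A = np.array([[1.0, 2.0],
              [4.0, 5.0]])
b = np.array([3.0, 6.0])

# Row view: the solution is the intersection point of the two lines.
x = np.linalg.solve(A, b)
print(x)                                   # -> [-1.  2.]

# Column view: the same numbers are the weights of the column combination
# that produces the right-hand side: x[0] * col1 + x[1] * col2 = b.
print(x[0] * A[:, 0] + x[1] * A[:, 1])     # -> [3. 6.]
```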


Linear Equations (n = 3)

By row: Each equation forms a plane, and we are looking for the intersection of 3 different planes. In 3 dimensions, the intersection of 2 planes is a line; the intersection of 3 planes is a point.
By column: We still look for a way to combine the columns on the left-hand side of the equation to form the column on the right-hand side.

Singularity

A linear system is singular if there is no unique solution: there are either multiple (infinitely many) solutions or no solution at all. Singularity can also be interpreted from both perspectives:

By row: The n (hyper)planes do not intersect at exactly one point.
By column: The n vectors do not span a complete n-dimensional space.
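
The sketch below uses an invented 2x2 example to show both symptoms at once: the rows are proportional (parallel lines), and the columns are proportional (they span only a line, not the plane), so NumPy reports rank 1 and refuses to solve the system.

```python
import numpy as np

# Singular example (my own): the second row is twice the first,
# and the second column is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.matrix_rank(A))            # -> 1, less than n = 2
try:
    np.linalg.solve(A, np.array([3.0, 6.0]))
except np.linalg.LinAlgError as e:
    print(e)                               # -> Singular matrix
```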

Gaussian Elimination to Solve A x = b

Gaussian elimination is the most widely adopted method for solving linear systems. The idea is to eliminate variables.

Step 1: Forward elimination. Use the i-th equation to eliminate the i-th variable from equations i+1 through n. This process essentially transforms the linear system into smaller and smaller ones, and finally yields a triangular system. The numbers on the diagonal are called pivots.

Step 2: Back substitution. From the last equation up to the first equation, use the already-known values to derive the values of all variables.
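
Below is a minimal Python sketch of the two steps, assuming a non-singular system that needs no row exchanges; a practical solver would add partial pivoting.

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by forward elimination followed by back substitution."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    n = len(b)

    # Step 1: forward elimination -- use row i to zero out column i below the pivot.
    for i in range(n):
        for j in range(i + 1, n):
            factor = A[j, i] / A[i, i]        # pivot A[i, i] assumed non-zero
            A[j, i:] -= factor * A[i, i:]
            b[j] -= factor * b[i]

    # Step 2: back substitution -- from the last equation up to the first.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# Invented example: 2x + y = 3, x + 3y = 5.
print(gaussian_elimination([[2, 1], [1, 3]], [3, 5]))   # -> [0.8 1.4]
```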

Singular cases

Assume we have n equations. If we obtain n non-zero pivots, the system is non-singular and there is one unique solution. It is also possible that there are zeros at some pivot positions; first try to exchange rows to fix this, and if a triangular system can still be created, the system remains non-singular.

If a triangular system cannot be created even by exchanging rows, the system is singular. This causes a problem because each pivot is used as a denominator in later calculations. There are two possible situations:

  1. Inconsistent system, then there is no solution.
  2. Consistent system, then there are infinitely many solutions.

Whichever case you end up with, non-singular, singular inconsistent, or singular consistent, Gaussian elimination reaches the correct conclusion.
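
The sketch below illustrates the two singular outcomes on an invented 2x2 example. It uses NumPy's matrix_rank as a shortcut for counting pivots in the coefficient matrix and in the augmented matrix, rather than carrying out the elimination by hand: equal ranks mean the degenerate row reduces to 0 = 0, a larger augmented rank means it reduces to 0 = c with c non-zero.

```python
import numpy as np

# Singular coefficient matrix (my own example): second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

b_consistent = np.array([3.0, 6.0])    # elimination leaves 0 = 0
b_inconsistent = np.array([3.0, 7.0])  # elimination leaves 0 = 1

for b in (b_consistent, b_inconsistent):
    rank_A = np.linalg.matrix_rank(A)
    rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))
    if rank_A == rank_Ab:
        print("singular but consistent: infinitely many solutions")
    else:
        print("singular and inconsistent: no solution")
```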

Complexity

How many elementary arithmetic operations are required to solve a system with n equations and n variables?

Forward elimination:

  1. For the first row, we want to eliminate the first variable from all other n - 1 rows.
    • For each of the n - 1 rows: 1 division and n multiplication-subtractions, i.e. n + 1 operations in total, are required.
    • In total, for the first row, (n - 1)(n + 1) = n² - 1 operations are required.
  2. For the second row, (n - 1)² - 1 operations are required.
  3. For the third row, (n - 2)² - 1 operations are required.
  4. For the n-th row, (n - (n - 1))² - 1 = 0 operations are required.

Upon the completion of forward elimination: (n² + ... + 1²) - (1 + ... + 1) = n(n + 1)(2n + 1) / 6 - n = n³ / 3 + n² / 2 - 5n / 6. When n is large, it is roughly n³ / 3.

Backward substitution:

1 + 2 + ... + n = n(n + 1) / 2 ≈ n² / 2

In total, roughly n³ / 3 + n² / 2 operations are needed, so the complexity of Gaussian elimination is O(n³).
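
The closed-form count can be sanity-checked numerically. The short script below (my own, not part of the course material) compares the term-by-term sum of (k² - 1) with the formula n(n + 1)(2n + 1) / 6 - n.

```python
def forward_ops_sum(n):
    # Eliminating below the k-th pivot costs (n - k + 1)^2 - 1 operations:
    # n^2 - 1 for the first pivot, (n - 1)^2 - 1 for the second, ..., 0 for the last.
    return sum((n - k + 1) ** 2 - 1 for k in range(1, n + 1))

def forward_ops_formula(n):
    # n(n + 1)(2n + 1)/6 - n  =  n^3/3 + n^2/2 - 5n/6 (always an integer)
    return n * (n + 1) * (2 * n + 1) // 6 - n

for n in (3, 10, 100):
    assert forward_ops_sum(n) == forward_ops_formula(n)
    print(n, forward_ops_sum(n))
```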



Gauss-Jordan Elimination to Find A⁻¹

A matrix A is invertible if there exists a matrix B such that A B = B A = I. Then B is unique, is called the inverse of A, and is denoted by A⁻¹. To find A⁻¹ from A, simply perform the following transformation; as long as you can obtain the identity matrix on the left, the remaining columns are A⁻¹.

[ A | I ] -> [ I | A⁻¹ ]

If A can be converted to I, then A is non-singular and invertible; otherwise A is singular and non-invertible.

Assuming A A⁻¹ = I, this process is essentially doing n separate Gaussian eliminations, one for each of the n equations A xᵢ = eᵢ, where xᵢ is the i-th column of A⁻¹ and eᵢ is the i-th column of I.

[ A | eᵢ ] -> [ I | xᵢ ]

Gauss-Jordan elimination is pretty much a combination of several Gaussian eliminations.
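
Here is a minimal Python sketch of the [ A | I ] -> [ I | A⁻¹ ] transformation, with a row exchange when a pivot is zero. It is an illustration of the idea, not a production-grade inverse routine.

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^-1]."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])          # the augmented matrix [A | I]

    for i in range(n):
        # Exchange rows if needed so the pivot is non-zero (largest entry in the column).
        p = i + np.argmax(np.abs(M[i:, i]))
        if np.isclose(M[p, i], 0.0):
            raise ValueError("A is singular and not invertible")
        M[[i, p]] = M[[p, i]]

        M[i] /= M[i, i]                    # scale the pivot row so the pivot becomes 1
        for j in range(n):                 # eliminate column i in every other row
            if j != i:
                M[j] -= M[j, i] * M[i]

    return M[:, n:]                        # the right half is now A^-1

A = np.array([[2.0, 1.0], [1.0, 3.0]])
print(gauss_jordan_inverse(A) @ A)         # -> identity matrix (up to rounding)
```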

Complexity

Because the complexity of Gaussian elimination is O(n³), and Gauss-Jordan elimination is based on Gaussian elimination, the complexity of Gauss-Jordan elimination is also O(n³).

Linear Dependence and Independence

A set of m n-dimensional vectors x₁, x₂, …, xₘ is linearly dependent if there exists a non-zero vector w ∈ Rᵐ such that w₁x₁ + w₂x₂ + ... + wₘxₘ = 0. A set of vectors is linearly independent if it is not linearly dependent.

To tell if a set of vectors is dependent or independent, we just put all vectors into the columns of a matrix, then use Gaussian elimination to find the number of pivots, which is the number of linearly independent vectors.

Number of pivots equals m: the vectors are linearly independent.
Number of pivots less than m: the vectors are linearly dependent.

This immediately implies that m n-dimensional vectors must be linearly dependent if m > n.
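
A short NumPy sketch with invented vectors: stack them as the columns of a matrix and compare the rank, which equals the number of pivots, with m.

```python
import numpy as np

# My own example vectors: v3 = v1 + v2, so the set is linearly dependent.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])

M = np.column_stack([v1, v2, v3])          # vectors as columns
m = M.shape[1]
rank = np.linalg.matrix_rank(M)            # number of pivots
print("independent" if rank == m else "dependent")   # -> dependent
```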




For more on Linear Algebra in Operations Research, please refer to the wonderful course here https://www.coursera.org/learn/operations-research-algorithms



I am Kesler Zhu, thank you for visiting my website. Check out more course reviews at https://KZHU.ai