Gram-Schmidt Process, Orthogonal Projections, The least-squares problem

Vector Space & Fundamental Subspaces

Posted on October 8, 2021 (updated November 10, 2022) by keslerzhu

Table of Contents

  • Linear independence
  • Gram-Schmidt process
  • Null Space
  • Column Space
  • Row Space
  • Left Null Space
  • Relationship between fundamental subspaces
  • Orthogonal Projections
  • Least Squares Problem
  • My Certificate
  • Related Quick Recap

A vector space consists of a set of vectors (column matrices) and a set of scalars (real numbers), and is closed under vector addition and scalar multiplication. For example, if u, v are 3-by-1 matrices and a, b are real numbers, then w = a u + b v is again a 3-by-1 matrix.
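A quick NumPy sanity check of this closure property (the vectors and scalars here are chosen arbitrarily for illustration):

import numpy as np

u = np.array([[1.0], [2.0], [3.0]])   # 3-by-1 column vectors
v = np.array([[0.0], [1.0], [1.0]])
a, b = 2.0, -1.0

w = a * u + b * v                     # closure: w is again a 3-by-1 column vector
print(w.shape)                        # (3, 1)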



Linear independence

A set of vectors {u1, u2, …, un} is linearly independent if the equation c1 u1 + c2 u2 + … + cn un = 0 has only the trivial solution c1 = c2 = … = cn = 0. This means you cannot write any vector in the set as a linear combination of the other vectors in the set.

If you are given a set of vectors, you can generate a vector space by forming all linear combinations of those vectors; we then say the set of vectors spans that vector space. A basis of a vector space is a minimal set of vectors that spans the space. The dimension of a vector space is the number of vectors in a basis; this number is unique, even though the basis itself is not.

Gram-Schmidt process

Suppose we have a basis {v1, v2, …, vn} for a vector space. An ordinary basis may be neither orthogonal nor normalized. The Gram-Schmidt process gives us an algorithm for converting an ordinary basis into an orthonormal basis {u1, u2, …, un}. The algorithm consists of 2 steps:

  1. Find an orthogonal basis
  2. Normalize
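Below is a minimal NumPy sketch of these two steps (the example vectors are made up for illustration; this is the "modified" variant, which subtracts each projection from the running vector and is numerically more stable):

import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        w = v.astype(float)
        # Step 1: make w orthogonal to every basis vector found so far
        for u in basis:
            w = w - np.dot(w, u) * u
        # Step 2: normalize
        basis.append(w / np.linalg.norm(w))
    return basis

v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
u1, u2 = gram_schmidt([v1, v2])
print(np.dot(u1, u2))        # ~0: the basis vectors are orthogonal
print(np.linalg.norm(u1))    # 1.0: and normalized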

Null Space

The null space of a matrix A is the vector space of all column vectors x such that A x = 0. Since the null space is a vector space, we would like to find its basis. The basis spans the null space and also tells us its dimension. The reduced row echelon form of matrix A will help us.

rref(A) x = 0 has the same solution set as A x = 0

The variables xi associated with pivot columns are called basic variables; the others are called free variables. rref(A) expresses each basic variable in terms of the free variables. For example, if x1 and x2 are the basic variables:

x1 = c13 x3 + c14 x4 + ... + c1n xn
x2 = c23 x3 + c24 x4 + ... + c2n xn

Setting each free variable to 1 in turn (with the other free variables 0) and collecting the coefficients cij, we obtain the basis vectors of the null space.
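A minimal SymPy sketch of this procedure (the matrix is a made-up example; SymPy's rref and nullspace do the row reduction and free-variable bookkeeping for us):

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])          # rank 1, so the null space has dimension 2

R, pivots = A.rref()             # reduced row echelon form and pivot columns
print(R)                         # rref(A): x1 = -2*x2 - 3*x3; x2, x3 are free
print(pivots)                    # (0,): x1 is the only basic variable

basis = A.nullspace()            # basis vectors built from the free variables
for v in basis:
    print(v.T, (A * v).T)        # A*v = 0 for every basis vector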

One can use the null space to determine the general solution (not a unique solution) of an under-determined system of equations A x = b, where there are fewer equations than unknowns, i.e. the number of rows of matrix A is less than the number of columns of matrix A.

  1. Let u be a general vector of the null space of matrix A, i.e. u is a linear combination of the basis vectors of the null space of A.
  2. Let v be any vector that solves the equation A x = b.

The general solution to A x = b is then the sum of u and v, i.e. x = u + v:

A x = A (u + v) = A u + A v = 0 + b = b

Because the null space may have a large dimension, the set of possible solutions can be correspondingly large.
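A small SymPy sketch of this construction, with a made-up consistent system (here v is a particular solution found by inspection):

from sympy import Matrix, symbols

A = Matrix([[1, 2, 3],
            [2, 4, 6]])
b = Matrix([6, 12])              # a consistent right-hand side

v = Matrix([1, 1, 1])            # one particular solution: A*v = b
t1, t2 = symbols('t1 t2')
u = t1 * A.nullspace()[0] + t2 * A.nullspace()[1]   # general null space vector

x = u + v
print((A * x).expand())          # simplifies to [6, 12]: every such x solves A x = b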

Column Space

The column space is the vector space spanned by the columns of a matrix. Multiplication of a matrix by a column vector is just a linear combination of the columns of the matrix. When we write A x = b and assume there is a solution, then b (i.e. A x) has to be in the column space of the matrix A. We want to find the basis of the column space Col(A) and its dimension.

The process of row reduction (finding the reduced row echelon form) does not change the linear dependence relationships among the columns. The column space is NOT spanned by the pivot columns of rref(A), but by the corresponding columns of the original matrix A. The overall procedure is:

  1. Find rref(A).
  2. Find the columns in the original matrix A corresponding to the pivot columns in rref(A). These form the basis of Col(A).
  3. The dimension of Col(A) is equal to the number of pivot columns of rref(A).
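A short SymPy sketch of this procedure (the matrix is made up; note that the basis is taken from A itself, not from rref(A)):

from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])          # column 2 = 2 * column 1

R, pivots = A.rref()
print(pivots)                    # (0, 2): the first and third columns are pivot columns

# Basis of Col(A): the pivot columns of the ORIGINAL matrix, not of rref(A)
basis = [A.col(j) for j in pivots]
print(basis)                     # dimension of Col(A) = number of pivot columns = 2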

Row Space

The vector space spanned by the rows of a matrix is called the row space; another way of looking at it is that it is the column space of the transpose of the matrix.

Row(A) = Col(Aᵀ)


Left Null Space

The left null space of a matrix A is the vector space of all column vectors x such that xᵀ A = 0. This is just the whole equation viewed transposed, so it is the set of vectors x such that Aᵀ x = 0.

LeftNull(A) = Null(Aᵀ)

Relationship between fundamental subspaces

For a matrix A of size m by n:

Subspace       Subspace of all … matrices   Dimension
Null(A)        n by 1                       # of non-pivot columns of rref(A)
Col(A)         m by 1                       # of pivot columns of rref(A)
Row(A)         n by 1                       # of pivot columns of rref(A)
LeftNull(A)    m by 1                       # of non-pivot columns of rref(Aᵀ), i.e. m - Rank(A)
  1. Null(A) and Row(A) are orthogonal complements. Because A x = 0 for every x in Null(A), every row of A is orthogonal to the null space Null(A). So the vectors in the row space are orthogonal to the vectors in the null space.
  2. The rank of a matrix is the number of linearly independent columns it has, which also equals the number of linearly independent rows it has.

Rank(A) = the dimension of Row(A) = the dimension of Col(A)
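These relationships are easy to verify numerically; a small SymPy sketch with a made-up 3-by-3 matrix of rank 2:

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])          # m = 3, n = 3, rank 2

m, n = A.shape
r = A.rank()
print(len(A.nullspace()), n - r)     # dim Null(A)     = n - rank = 1
print(r)                             # dim Col(A) = dim Row(A) = rank = 2
print(len(A.T.nullspace()), m - r)   # dim LeftNull(A) = m - rank = 1

# Row(A) and Null(A) are orthogonal: A times any null space vector is zero
for v in A.nullspace():
    print((A * v).T)                 # zero vector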

Orthogonal Projections

We have a vector v in a big vector space V, and we want to project it down into a subspace W of V. The reason we want to do this is that the projected vector v_proj_W is the vector in W closest to v, which is exactly what we need to solve the least-squares problem below. If {u1, u2, …, uk} is an orthonormal basis of W, then v_proj_W = (v · u1) u1 + (v · u2) u2 + … + (v · uk) uk.
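A minimal NumPy sketch of this formula (the subspace and vector are made up; the orthonormal basis of W is built with Gram-Schmidt as above):

import numpy as np

# W is the subspace of R^3 spanned by two (not necessarily orthogonal) vectors
w1 = np.array([1.0, 1.0, 0.0])
w2 = np.array([1.0, 0.0, 1.0])

# Build an orthonormal basis {u1, u2} of W with Gram-Schmidt
u1 = w1 / np.linalg.norm(w1)
u2 = w2 - np.dot(w2, u1) * u1
u2 = u2 / np.linalg.norm(u2)

v = np.array([3.0, 4.0, 5.0])
v_proj = np.dot(v, u1) * u1 + np.dot(v, u2) * u2      # projection of v onto W

print(v_proj)
print(np.dot(v - v_proj, u1), np.dot(v - v_proj, u2)) # residual is orthogonal to W (~0, ~0)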

Least Squares Problem

Usually in statistics, you find the distance between each data point and the line you draw, square those distances, sum them up, and then try to find the line that minimizes the sum.

But from the perspective of matrices, we construct the least-squares problem in the form A x = b. There are many more equations than there are unknowns, so the system is over-determined. If A x = b is over-determined, b is generally not in the column space of A. (Recall that the column space is the vector space spanned by the columns of a matrix.) So there is no exact solution for the unknown x in the equation A x = b.

But to find the best approximate solution for this equation, we can project b onto the column space of A. So, we will solve

A x = b_proj_Col(A)

The normal equation is the key to solving it:

Aᵀ A x = Aᵀ b

If Aᵀ A is invertible, the solution of the least-squares problem is x = (Aᵀ A)⁻¹ Aᵀ b.
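A minimal NumPy sketch of line fitting via the normal equation (the data points are made up; in practice the library routine np.linalg.lstsq is numerically preferable):

import numpy as np

# Fit a line y = c0 + c1*x to points that are not exactly collinear
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 2.0, 4.0])

A = np.column_stack([np.ones_like(x), x])   # over-determined: 4 equations, 2 unknowns

# Solve the normal equation  A^T A c = A^T y
c = np.linalg.solve(A.T @ A, A.T @ y)
print(c)                                     # intercept and slope of the best-fit line

# Same answer from the library routine
print(np.linalg.lstsq(A, y, rcond=None)[0])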


My Certificate

For more on Vector Space & Fundamental Subspaces, please refer to the wonderful course here https://www.coursera.org/learn/matrix-algebra-engineers

My #72 course certificate from Coursera

Related Quick Recap

Matrices & Systems of Linear Equations

I am Kesler Zhu, thank you for visiting my website. Check out more course reviews at https://KZHU.ai
