Partial derivatives, Least squares, Chain rule, Triple product rule

Partial derivatives

A partial derivative differentiates a function of multiple variables with respect to one of them. Given a function f = f(x, y), differentiating f with respect to x uses the usual definition of the derivative of a function of one variable, but with y held constant. There are also mixed partials, which do not depend on the order in which you take the derivatives. One important use is the Taylor series: we can expand functions in terms of their partial derivatives.

Another important application of partial derivatives is finding the minimum of a function, e.g. minimizing a sum of squares (least squares). We set the partial derivatives with respect to each variable to 0, which gives a system of equations that we can then solve by any method we like.
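
For example, fitting a straight line y ≈ a + b x by least squares: setting the partial derivatives of the summed squared error with respect to a and b to zero yields the normal equations. A minimal Python sketch (the data points here are made up for illustration):

import numpy as np

# Hypothetical data: fit y ≈ a + b*x by least squares.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Setting the partial derivatives of sum((a + b*x - y)**2)
# with respect to a and b to zero gives the normal equations:
#   n*a      + sum(x)*b    = sum(y)
#   sum(x)*a + sum(x**2)*b = sum(x*y)
A = np.array([[len(x),      x.sum()],
              [x.sum(), (x**2).sum()]])
rhs = np.array([y.sum(), (x*y).sum()])

a, b = np.linalg.solve(A, rhs)
print(f"intercept a = {a:.4f}, slope b = {b:.4f}")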



Chain Rule

The total differential tells you how a function f changes when you change both of its variables, say x and y.

df = f(x + dx, y + dy) - f(x, y)

The total differential can also be written in terms of partial derivatives.

df = ∂f/∂x dx + ∂f/∂y dy

Suppose x and y are functions of t; then we get the chain rule:

df/dt = ∂f/∂x dx/dt + ∂f/∂y dy/dt
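
A quick way to check this identity symbolically: a small sketch with sympy, using an arbitrary f(x, y) and paths x(t), y(t) chosen purely for illustration:

import sympy as sp

t = sp.symbols('t')
x, y = sp.symbols('x y')
f = x**2 * y          # an arbitrary example function f(x, y)

# Hypothetical paths x(t) and y(t)
xt = sp.cos(t)
yt = sp.sin(t)

# Left side: substitute, then differentiate with respect to t
lhs = sp.diff(f.subs({x: xt, y: yt}), t)

# Right side: df/dt = ∂f/∂x dx/dt + ∂f/∂y dy/dt
rhs = (sp.diff(f, x).subs({x: xt, y: yt}) * sp.diff(xt, t)
       + sp.diff(f, y).subs({x: xt, y: yt}) * sp.diff(yt, t))

print(sp.simplify(lhs - rhs))  # prints 0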

Triple Product Rule

Assume we have a function of three variables x, y, and z satisfying f(x, y, z) = 0, so that each variable can be regarded as a function of the other two: x = x(y, z), y = y(x, z), z = z(x, y). Then:

∂x/∂z ∂z/∂x = 1           # reciprocity relation
∂x/∂y ∂y/∂z ∂z/∂x = -1    # triple product rule
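
As a concrete check, the ideal gas law pV = nRT (the gas law of the next section's title) relates p, V, and T in exactly this way. A sketch with sympy, treating n and R as constants:

import sympy as sp

p, V, T, n, R = sp.symbols('p V T n R', positive=True)

# Ideal gas law pV = nRT: solve for each variable in terms of the other two
p_of = n*R*T/V          # p(V, T)
V_of = n*R*T/p          # V(p, T)
T_of = p*V/(n*R)        # T(p, V)

product = (sp.diff(p_of, V)          # (∂p/∂V) at constant T
           * sp.diff(V_of, T)        # (∂V/∂T) at constant p
           * sp.diff(T_of, p))       # (∂T/∂p) at constant V

# Evaluate on the constraint surface pV = nRT
print(sp.simplify(product.subs(p, n*R*T/V)))   # prints -1
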
Gas law, Gradient, Divergence

Gradient

The gradient takes the partial derivatives of a scalar field and puts them in vector form. It shows up in all the fundamental equations of nature. The gradient is intrinsically a three-dimensional object.

f = f(x, y, z)
df = ∂f/∂x dx + ∂f/∂y dy + ∂f/∂z dz
= (∂f/∂x i + ∂f/∂y j + ∂f/∂z k) ∙ (dx i + dy j + dz k)
= ∇f ∙ dr

df is maximum when ∇f is parallel to dr. If you want the maximum change of f, you have to move in the direction of the gradient. So the direction of the gradient gives the direction of the maximum change of f when you move a little bit dr.
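
A numerical sketch of this claim, using an arbitrary example field and central-difference derivatives (the field and step sizes are chosen purely for illustration): stepping the same small distance in several unit directions, the gradient direction produces the largest df.

import numpy as np

def f(x, y, z):
    return x**2 + 3*y**2 + z**2   # an arbitrary example scalar field

def grad_f(x, y, z, h=1e-6):
    """Numerical gradient via central differences."""
    return np.array([
        (f(x + h, y, z) - f(x - h, y, z)) / (2*h),
        (f(x, y + h, z) - f(x, y - h, z)) / (2*h),
        (f(x, y, z + h) - f(x, y, z - h)) / (2*h),
    ])

p = np.array([1.0, 1.0, 1.0])
g = grad_f(*p)

# Step a small distance in several unit directions; the gradient
# direction gives the largest increase df = ∇f ∙ dr.
eps = 1e-3
for name, d in [("gradient", g / np.linalg.norm(g)),
                ("x-axis", np.array([1.0, 0.0, 0.0])),
                ("diagonal", np.array([1.0, 1.0, 1.0]) / np.sqrt(3))]:
    df = f(*(p + eps*d)) - f(*p)
    print(f"{name:8s} df = {df:.6f}")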

Del operator ∇

Differential operator "del" (also called the nabla symbol):

∇ = i ∂/∂x + j ∂/∂y + k ∂/∂z

Divergence

Divergence is fundamental to Maxwell's equations. A positive divergence at a point means there is a source of something flowing outward there. Consider a vector field:

u = u1(x, y, z) i + u2(x, y, z) j + u3(x, y, z) k

The divergence of u is:

∇ ∙ u = (i ∂/∂x + j ∂/∂y + k ∂/∂z) ∙ (i u1 + j u2 + k u3)
= ∂u1/∂x + ∂u2/∂y + ∂u3/∂z

An important example is that the divergence of the electric field of a point charge is zero away from the charge:

∇ ∙ (r / |r|³) = 0, |r| ≠ 0
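
This can be verified symbolically. A sketch with sympy (physical constants dropped, so u is the point-charge field up to a factor):

import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
r = sp.Matrix([x, y, z])
rmag = sp.sqrt(x**2 + y**2 + z**2)
u = r / rmag**3          # the point-charge field, up to constants

# Divergence: sum of ∂u_i/∂x_i
div_u = sum(sp.diff(u[i], v) for i, v in enumerate((x, y, z)))
print(sp.simplify(div_u))   # prints 0 (valid for |r| ≠ 0)
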
Curl, Laplacian, Vector derivative identities

Curl

If a vector field has a non-zero curl, it usually means there is some swirling motion, some vorticity, in the vector field. The curl of a vector field is del cross u, the vector product, which can be calculated using a 3 × 3 determinant.

∇ × u = (i ∂/∂x + j ∂/∂y + k ∂/∂z) × (i u1 + j u2 + k u3)
= (∂u3/∂y - ∂u2/∂z) i + (∂u1/∂z - ∂u3/∂x) j + (∂u2/∂x - ∂u1/∂y) k

An important example is that the curl of a gradient is zero:

∇ × (∇f) = 0

and the divergence of a curl is zero:

∇ ∙ (∇ × u) = 0
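
Both identities can be checked symbolically. A sketch with sympy, using arbitrary example fields:

import sympy as sp

x, y, z = sp.symbols('x y z', real=True)

def grad(f):
    return sp.Matrix([sp.diff(f, v) for v in (x, y, z)])

def div(u):
    return sp.diff(u[0], x) + sp.diff(u[1], y) + sp.diff(u[2], z)

def curl(u):
    return sp.Matrix([sp.diff(u[2], y) - sp.diff(u[1], z),
                      sp.diff(u[0], z) - sp.diff(u[2], x),
                      sp.diff(u[1], x) - sp.diff(u[0], y)])

# Arbitrary example fields
f = sp.sin(x*y) + z**3
u = sp.Matrix([x*y*z, sp.exp(x)*y, sp.cos(y*z)])

print(sp.simplify(curl(grad(f))))   # zero vector: ∇ × (∇f) = 0
print(sp.simplify(div(curl(u))))    # 0: ∇ ∙ (∇ × u) = 0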

Laplacian operator ∇²

The Laplacian operator is defined below. It can act on either a scalar field or a vector field, and it shows up in a lot of PDEs.

∇² = ∇ ∙ ∇ = ∂²/∂x² + ∂²/∂y² + ∂²/∂z²
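
A small sympy sketch confirming that the direct definition agrees with ∇ ∙ (∇f), for an arbitrary example field:

import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
f = sp.exp(x) * sp.sin(y) + z**2    # an arbitrary example scalar field

# ∇²f computed two ways: directly, and as ∇ ∙ (∇f)
lap_direct = sp.diff(f, x, 2) + sp.diff(f, y, 2) + sp.diff(f, z, 2)
grad_f = [sp.diff(f, v) for v in (x, y, z)]
lap_as_div_grad = sum(sp.diff(g, v) for g, v in zip(grad_f, (x, y, z)))

print(sp.simplify(lap_direct - lap_as_div_grad))   # prints 0
print(lap_direct)   # prints 2: the exp(x)*sin(y) part is harmonic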

Electromagnetic waves

Light waves, radio waves, and x-rays are all waves consisting of electric fields E and magnetic fields B. Free space means there are no charges and no currents. Maxwell's equations in free space can be written:

∇ ∙ E = 0
∇ ∙ B = 0
∇ × E = - ∂B/∂t
∇ × B = μ0ε0 ∂E/∂t

To derive electromagnetic waves, we condense these equations into a single equation for the electric field.
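
Sketching the standard derivation: take the curl of Faraday's law, exchange the curl with ∂/∂t, substitute the Ampère-Maxwell law, and use the identity ∇ × (∇ × E) = ∇(∇ ∙ E) - ∇²E together with ∇ ∙ E = 0:

∇ × (∇ × E) = - ∂/∂t (∇ × B) = - μ0ε0 ∂²E/∂t²
∇ × (∇ × E) = ∇(∇ ∙ E) - ∇²E = - ∇²E

Equating the two right-hand sides gives the wave equation

∂²E/∂t² = c² ∇²E,    c = 1/√(μ0ε0)

whose solutions are waves traveling at the speed of light c.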



My Certificate

For more on From Partial Derivatives to Maxwell's Equations, please refer to the wonderful course here https://www.coursera.org/learn/vector-calculus-engineers


