


Jacobians and their applications

by Joseph Ruan


Basic Definition

The Jacobian matrix is simply the matrix of partial derivatives of each element of a transformation. In general, the Jacobian matrix of a transformation F looks like this:

$ J(x_1, x_2, \ldots, x_n)=\begin{bmatrix} \frac{\partial F_1}{\partial x_1} & \frac{\partial F_1}{\partial x_2} & \cdots & \frac{\partial F_1}{\partial x_n} \\ \frac{\partial F_2}{\partial x_1} & \frac{\partial F_2}{\partial x_2} & \cdots & \frac{\partial F_2}{\partial x_n} \\ \vdots & \vdots & \ddots & \vdots \\ \frac{\partial F_m}{\partial x_1} & \frac{\partial F_m}{\partial x_2} & \cdots & \frac{\partial F_m}{\partial x_n} \end{bmatrix} $

Here $ F_1, F_2, F_3, \ldots $ are the elements of the output vector and $ x_1, x_2, x_3, \ldots $ are the elements of the input vector.

For example, in the 2-dimensional case, let T be a transformation such that T(u,v) = <x,y>. Then the Jacobian matrix of this transformation looks like this:

$ J(u,v)=\begin{bmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{bmatrix} $

Notice that this Jacobian matrix holds all of the partial derivatives of the transformation with respect to each of the variables. Each row describes how a particular output element changes with respect to each of the input elements. In other words, the Jacobian matrix contains the vectors that describe how a change in any of the input elements affects the output elements.

To help illustrate making Jacobian matrices, let's do some examples:

Example #1:

Let's take the Transformation:

$ T(u,v) = <u*\cos v, u*\sin v> $ .

What would be the Jacobian Matrix of this Transformation?

Solution:

$ x=u*\cos v \longrightarrow \frac{\partial x}{\partial u}= \cos v , \; \frac{\partial x}{\partial v} = -u*\sin v $

$ y=u*\sin v \longrightarrow \frac{\partial y}{\partial u}= \sin v , \; \frac{\partial y}{\partial v} = u*\cos v $

$ J(u,v)=\begin{bmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{bmatrix}= \begin{bmatrix} \cos v & -u*\sin v \\ \sin v & u*\cos v \end{bmatrix} $

This example's transformation T is, in fact, the change from polar coordinates into Cartesian coordinates.
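As a quick sanity check, here is a minimal sketch of the same computation done symbolically. It assumes Python with the SymPy library, which is not part of the original write-up; the names u and v simply mirror the variables used above.

<pre>
# Minimal SymPy sketch: Jacobian of T(u,v) = <u*cos v, u*sin v> (Example #1)
import sympy as sp

u, v = sp.symbols('u v')
x = u * sp.cos(v)          # first output element
y = u * sp.sin(v)          # second output element

# Rows correspond to the output elements, columns to the input variables
J = sp.Matrix([x, y]).jacobian([u, v])
print(J)   # Matrix([[cos(v), -u*sin(v)], [sin(v), u*cos(v)]])
</pre>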


Example #2:

Let's take the Transformation:

$ T(u,v) = <u, v, u+v> $ .

What would be the Jacobian Matrix of this Transformation?

Solution:

Notice that this matrix will not be square because the input and output have different dimensions: the transformation maps R^2 into R^3, so the Jacobian is a 3x2 matrix.

$ x=u \longrightarrow \frac{\partial x}{\partial u}= 1 , \; \frac{\partial x}{\partial v} = 0 $

$ y=v \longrightarrow \frac{\partial y}{\partial u}=0 , \; \frac{\partial y}{\partial v} = 1 $

$ z=u+v \longrightarrow \frac{\partial z}{\partial u}= 1 , \; \frac{\partial z}{\partial v} = 1 $

$ J(u,v)=\begin{bmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \\ \frac{\partial z}{\partial u} & \frac{\partial z}{\partial v} \end{bmatrix}= \begin{bmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1\end{bmatrix} $


Example #3:

Let's take the Transformation:

$ T(u,v) = <uv> $ .

What would be the Jacobian Matrix of this Transformation?

Solution:

Notice that this matrix will not be square either: the transformation maps R^2 into R, so the Jacobian is a 1x2 row matrix.

$ x=u*v \longrightarrow \frac{\partial x}{\partial u}= v , \; \frac{\partial x}{\partial v} = u $

$ J(u,v)= \begin{bmatrix}\frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \end{bmatrix}=\begin{bmatrix}v & u \end{bmatrix} $
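The same symbolic approach handles these non-square cases without any changes. The sketch below (again assuming Python with SymPy, an addition not in the original examples) reproduces the Jacobians of Examples #2 and #3:

<pre>
# Non-square Jacobians (Examples #2 and #3) with SymPy
import sympy as sp

u, v = sp.symbols('u v')

# Example #2: T(u,v) = <u, v, u+v>  ->  3x2 Jacobian
J2 = sp.Matrix([u, v, u + v]).jacobian([u, v])
print(J2)  # Matrix([[1, 0], [0, 1], [1, 1]])

# Example #3: T(u,v) = <u*v>  ->  1x2 Jacobian
J3 = sp.Matrix([u * v]).jacobian([u, v])
print(J3)  # Matrix([[v, u]])
</pre>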


Application: Jacobian Determinants

Taking the determinant of the Jacobian from Example #1 gives:

$ \left|\begin{matrix} \cos v & -u*\sin v \\ \sin v & u*\cos v \end{matrix}\right| = u \cos^2 v + u \sin^2 v = u $
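A one-line symbolic check that this determinant really simplifies to u (again a sketch assuming SymPy is available):

<pre>
# Determinant of the Example #1 Jacobian should simplify to u
import sympy as sp

u, v = sp.symbols('u v')
J = sp.Matrix([u * sp.cos(v), u * sp.sin(v)]).jacobian([u, v])
print(sp.simplify(J.det()))  # u
</pre>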

Notice that, when changing an integral from Cartesian coordinates (dxdy) to polar coordinates $ (drd\theta) $, the area elements are related by

$ dxdy=r*drd\theta $

In this case, since $ u=r $ and $ v=\theta $, this is the same as

$ dxdy=u*dudv $

which is exactly the determinant of the Jacobian from Example #1 multiplying the new area element.

It is easy to extrapolate, then, that the change of variables from one set of coordinates to another is simply

$ dC1=det(J(T))dC2 $

where C1 is the first set of coordinates, C2 is the second set of coordinates, T is the transformation taking C2 to C1, and det(J(T)) is the determinant of the Jacobian matrix of T.

It is important to notice several aspects. First, the determinant is assumed to exist and to be non-zero, and therefore the Jacobian matrix must be square and invertible. This makes sense because, when changing coordinates, it should be possible to change back.
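To make the change-of-variables relation concrete, here is a small numerical sketch (assuming Python with NumPy; the grid resolution is an arbitrary choice) that estimates the area of the unit disk two ways: directly on a Cartesian grid, and in polar coordinates with the Jacobian determinant r included. Both estimates should approach $ \pi $.

<pre>
# Numerical check of dxdy = r*drdtheta: area of the unit disk two ways
import numpy as np

n = 1000                                   # arbitrary grid resolution

# Cartesian estimate: count grid cells inside x^2 + y^2 <= 1
x = np.linspace(-1, 1, n)
y = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, y)
dx = dy = 2.0 / (n - 1)
area_cartesian = np.sum(X**2 + Y**2 <= 1) * dx * dy

# Polar estimate: integrate the Jacobian determinant r over [0,1] x [0,2*pi]
r = np.linspace(0, 1, n)
theta = np.linspace(0, 2 * np.pi, n)
R, _ = np.meshgrid(r, theta)
dr = 1.0 / (n - 1)
dtheta = 2 * np.pi / (n - 1)
area_polar = np.sum(R * dr * dtheta)       # the factor R is det(J) = r

print(area_cartesian, area_polar, np.pi)   # both close to 3.14159...
</pre>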




