Jacobian matrix and determinant

In vector calculus, the Jacobian matrix of a vector-valued function of several variables is the matrix of all its first-order partial derivatives. If this matrix is square, that is, if the number of variables equals the number of components of function values, then its determinant is called the Jacobian determinant. Both the matrix and (if applicable) the determinant are often referred to simply as the Jacobian. They are named after Carl Gustav Jacob Jacobi. The Jacobian matrix is the natural generalization to vector-valued functions of several variables of the derivative and the differential of a function of a single variable. This generalization includes generalizations of the inverse function theorem and the implicit function theorem, where the non-nullity of the derivative is replaced by the non-nullity of the Jacobian determinant, and the multiplicative inverse of the derivative is replaced by the inverse of the Jacobian matrix. The Jacobian determinant is fundamentally used for changes of variables in multiple integrals.


Definition

Let \mathbf{f}: \mathbb{R}^n \to \mathbb{R}^m be a function such that each of its first-order partial derivatives exists on \mathbb{R}^n. The Jacobian matrix of \mathbf{f}, denoted \mathbf{J}, is the m \times n matrix whose (i,j) entry is \dfrac{\partial f_i}{\partial x_j}; explicitly,

\mathbf{J} = \begin{bmatrix} \dfrac{\partial \mathbf{f}}{\partial x_1} & \cdots & \dfrac{\partial \mathbf{f}}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \nabla^{\mathsf{T}} f_1 \\ \vdots \\ \nabla^{\mathsf{T}} f_m \end{bmatrix} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix}

where \nabla^{\mathsf{T}} f_i is the transpose (row vector) of the gradient of the i-th component.

The Jacobian matrix, whose entries are functions of \mathbf{x}, is denoted in various ways; other common notations include D\mathbf{f}, \nabla \mathbf{f}, and \dfrac{\partial(f_1, \ldots, f_m)}{\partial(x_1, \ldots, x_n)}. Some authors define the Jacobian as the transpose of the form given above.

The Jacobian matrix represents the differential of \mathbf{f} at every point where \mathbf{f} is differentiable. In detail, if \mathbf{h} is a displacement vector represented by a column matrix, the matrix product \mathbf{J}(\mathbf{x}) \cdot \mathbf{h} is another displacement vector, which is the best linear approximation of the change of \mathbf{f} in a neighborhood of \mathbf{x}, if \mathbf{f} is differentiable at \mathbf{x}. This means that the function that maps \mathbf{y} to \mathbf{f}(\mathbf{x}) + \mathbf{J}(\mathbf{x}) \cdot (\mathbf{y} - \mathbf{x}) is the best linear approximation of \mathbf{f}(\mathbf{y}) for all points \mathbf{y} close to \mathbf{x}. The linear map \mathbf{h} \mapsto \mathbf{J}(\mathbf{x}) \cdot \mathbf{h} is known as the ''derivative'' or the ''differential'' of \mathbf{f} at \mathbf{x}.

When m = n, the Jacobian matrix is square, so its determinant is a well-defined function of \mathbf{x}, known as the Jacobian determinant of \mathbf{f}. It carries important information about the local behavior of \mathbf{f}. In particular, the function \mathbf{f} has a differentiable inverse function in a neighborhood of a point \mathbf{x} if and only if the Jacobian determinant is nonzero at \mathbf{x} (see inverse function theorem for an explanation of this and Jacobian conjecture for a related problem of ''global'' invertibility). The Jacobian determinant also appears when changing the variables in multiple integrals (see substitution rule for multiple variables).

When m = 1, that is when f: \mathbb{R}^n \to \mathbb{R} is a scalar-valued function, the Jacobian matrix reduces to the row vector \nabla^{\mathsf{T}} f; this row vector of all first-order partial derivatives of f is the transpose of the gradient of f, i.e. \mathbf{J}_f = \nabla^{\mathsf{T}} f. Specializing further, when m = n = 1, that is when f: \mathbb{R} \to \mathbb{R} is a scalar-valued function of a single variable, the Jacobian matrix has a single entry; this entry is the derivative of the function f.

These concepts are named after the mathematician Carl Gustav Jacob Jacobi (1804–1851).
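
The entry-wise definition translates directly into a numerical approximation: each column of the Jacobian can be estimated by a central difference in the corresponding variable. The following minimal Python sketch illustrates this; the example map and the step size h are illustrative choices, not part of the definition.

```python
import numpy as np

def jacobian_fd(f, x, h=1e-6):
    """Approximate J_ij = d f_i / d x_j by central finite differences."""
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x), dtype=float)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x)
        e[j] = h
        J[:, j] = (np.asarray(f(x + e)) - np.asarray(f(x - e))) / (2 * h)
    return J

# Illustrative map from R^2 to R^3 (chosen only to show an m-by-n Jacobian)
f = lambda v: [v[0] * v[1], np.sin(v[0]), v[1] ** 2]
print(jacobian_fd(f, [1.0, 2.0]))   # 3x2 matrix of first-order partial derivatives
```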


Jacobian matrix

The Jacobian of a vector-valued function in several variables generalizes the gradient of a scalar-valued function in several variables, which in turn generalizes the derivative of a scalar-valued function of a single variable. In other words, the Jacobian matrix of a scalar-valued function of several variables is (the transpose of) its gradient and the gradient of a scalar-valued function of a single variable is its derivative.

At each point where a function is differentiable, its Jacobian matrix can also be thought of as describing the amount of "stretching", "rotating" or "transforming" that the function imposes locally near that point. For example, if (x', y') = \mathbf{f}(x, y) is used to smoothly transform an image, the Jacobian matrix \mathbf{J}_{\mathbf{f}}(x, y) describes how the image in the neighborhood of (x, y) is transformed.

If a function is differentiable at a point, its differential is given in coordinates by the Jacobian matrix. However, a function does not need to be differentiable for its Jacobian matrix to be defined, since only its first-order partial derivatives are required to exist.

If \mathbf{f} is differentiable at a point \mathbf{p} in \mathbb{R}^n, then its differential is represented by \mathbf{J}_{\mathbf{f}}(\mathbf{p}). In this case, the linear transformation represented by \mathbf{J}_{\mathbf{f}}(\mathbf{p}) is the best linear approximation of \mathbf{f} near the point \mathbf{p}, in the sense that

\mathbf{f}(\mathbf{x}) - \mathbf{f}(\mathbf{p}) = \mathbf{J}_{\mathbf{f}}(\mathbf{p})(\mathbf{x} - \mathbf{p}) + o(\|\mathbf{x} - \mathbf{p}\|) \quad (\text{as } \mathbf{x} \to \mathbf{p}),

where o(\|\mathbf{x} - \mathbf{p}\|) is a quantity that approaches zero much faster than the distance between \mathbf{x} and \mathbf{p} does as \mathbf{x} approaches \mathbf{p}. This approximation specializes to the approximation of a scalar function of a single variable by its Taylor polynomial of degree one, namely

f(x) - f(p) = f'(p)(x - p) + o(x - p) \quad (\text{as } x \to p).

In this sense, the Jacobian may be regarded as a kind of "first-order derivative" of a vector-valued function of several variables. In particular, this means that the gradient of a scalar-valued function of several variables may too be regarded as its "first-order derivative".

Composable differentiable functions \mathbf{f}: \mathbb{R}^n \to \mathbb{R}^m and \mathbf{g}: \mathbb{R}^m \to \mathbb{R}^k satisfy the chain rule, namely \mathbf{J}_{\mathbf{g} \circ \mathbf{f}}(\mathbf{x}) = \mathbf{J}_{\mathbf{g}}(\mathbf{f}(\mathbf{x})) \, \mathbf{J}_{\mathbf{f}}(\mathbf{x}) for \mathbf{x} in \mathbb{R}^n.

The Jacobian of the gradient of a scalar function of several variables has a special name: the Hessian matrix, which in a sense is the "second derivative" of the function in question.


Jacobian determinant

If m = n, then \mathbf{f} is a function from \mathbb{R}^n to itself and the Jacobian matrix is a square matrix. We can then form its determinant, known as the Jacobian determinant. The Jacobian determinant is sometimes simply referred to as "the Jacobian".

The Jacobian determinant at a given point gives important information about the behavior of \mathbf{f} near that point. For instance, the continuously differentiable function \mathbf{f} is invertible near a point \mathbf{p} if the Jacobian determinant at \mathbf{p} is non-zero. This is the inverse function theorem. Furthermore, if the Jacobian determinant at \mathbf{p} is positive, then \mathbf{f} preserves orientation near \mathbf{p}; if it is negative, \mathbf{f} reverses orientation. The absolute value of the Jacobian determinant at \mathbf{p} gives us the factor by which the function \mathbf{f} expands or shrinks volumes near \mathbf{p}; this is why it occurs in the general substitution rule.

The Jacobian determinant is used when making a change of variables when evaluating a multiple integral of a function over a region within its domain. To accommodate the change of coordinates, the magnitude of the Jacobian determinant arises as a multiplicative factor within the integral. This is because the n-dimensional volume element is in general a parallelepiped in the new coordinate system, and the n-volume of a parallelepiped is the determinant of its edge vectors.

The Jacobian can also be used to determine the stability of equilibria for systems of differential equations by approximating behavior near an equilibrium point.
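
The volume-scaling interpretation can be illustrated by pushing a tiny square through a map and comparing the area of its image with |\det \mathbf{J}| times the original area. The planar map used below is a hypothetical example chosen only for this check.

```python
import numpy as np

# Illustrative planar map (hypothetical example): f(x, y) = (x + y^2, x*y)
def f(v):
    x, y = v
    return np.array([x + y**2, x * y])

def J(v):
    x, y = v
    return np.array([[1.0, 2 * y],
                     [y,   x]])

p = np.array([1.0, 2.0])
h = 1e-4
# Edge vectors of a tiny square at p, pushed through f
u = f(p + np.array([h, 0.0])) - f(p)
w = f(p + np.array([0.0, h])) - f(p)
mapped_area = abs(u[0] * w[1] - u[1] * w[0])      # area of the image parallelogram
print(mapped_area / h**2)                          # ~ |det J(p)|
print(abs(np.linalg.det(J(p))))                    # |1*1 - 2*2*2| = 7
```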


Inverse

According to the inverse function theorem, the matrix inverse of the Jacobian matrix of an invertible function \mathbf{f}: \mathbb{R}^n \to \mathbb{R}^n is the Jacobian matrix of the ''inverse'' function. That is, the Jacobian matrix of the inverse function \mathbf{f}^{-1} at a point \mathbf{p} is

\mathbf{J}_{\mathbf{f}^{-1}}(\mathbf{p}) = \left[ \mathbf{J}_{\mathbf{f}}\left( \mathbf{f}^{-1}(\mathbf{p}) \right) \right]^{-1},

and the Jacobian determinant is

\det\left( \mathbf{J}_{\mathbf{f}^{-1}}(\mathbf{p}) \right) = \frac{1}{\det\left( \mathbf{J}_{\mathbf{f}}\left( \mathbf{f}^{-1}(\mathbf{p}) \right) \right)}.

If the Jacobian is continuous and nonsingular at the point \mathbf{p} in \mathbb{R}^n, then \mathbf{f} is invertible when restricted to some neighbourhood of \mathbf{p}. In other words, if the Jacobian determinant is not zero at a point, then the function is ''locally invertible'' near this point.

The (unproved) Jacobian conjecture is related to global invertibility in the case of a polynomial function, that is a function defined by ''n'' polynomials in ''n'' variables. It asserts that, if the Jacobian determinant is a non-zero constant (or, equivalently, that it does not have any complex zero), then the function is invertible and its inverse is a polynomial function.
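
This identity can be verified numerically for the polar-to-Cartesian map of Example 2 below, which is invertible for r > 0. The sketch compares a finite-difference Jacobian of the inverse map with the matrix inverse of the forward Jacobian; the test point is an arbitrary choice.

```python
import numpy as np

# Polar-to-Cartesian map (Example 2) and its inverse on a domain where it is invertible
def f(v):          # (r, phi) -> (x, y)
    r, phi = v
    return np.array([r * np.cos(phi), r * np.sin(phi)])

def f_inv(w):      # (x, y) -> (r, phi)
    x, y = w
    return np.array([np.hypot(x, y), np.arctan2(y, x)])

def J_f(v):
    r, phi = v
    return np.array([[np.cos(phi), -r * np.sin(phi)],
                     [np.sin(phi),  r * np.cos(phi)]])

def jacobian_fd(g, x, h=1e-6):
    x = np.asarray(x, float)
    g0 = np.asarray(g(x), float)
    J = np.zeros((g0.size, x.size))
    for j in range(x.size):
        e = np.zeros_like(x); e[j] = h
        J[:, j] = (np.asarray(g(x + e)) - np.asarray(g(x - e))) / (2 * h)
    return J

p = np.array([2.0, 0.7])          # a point (r, phi) with r > 0
q = f(p)
print(jacobian_fd(f_inv, q))      # Jacobian of the inverse at f(p)
print(np.linalg.inv(J_f(p)))      # matches the matrix inverse of J_f(p)
```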


Critical points

If \mathbf{f}: \mathbb{R}^n \to \mathbb{R}^m is a differentiable function, a ''critical point'' of \mathbf{f} is a point where the rank of the Jacobian matrix is not maximal. This means that the rank at the critical point is lower than the rank at some neighbouring point. In other words, let k be the maximal dimension of the open balls contained in the image of \mathbf{f}; then a point is critical if all minors of rank k of \mathbf{J}_{\mathbf{f}} are zero. In the case where m = n = k, a point is critical if the Jacobian determinant is zero.
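
For a concrete check, a computer algebra system can compute the Jacobian symbolically and inspect its rank. The map below is a hypothetical example (the planar squaring map (x, y) \mapsto (x^2 - y^2, 2xy)), whose only critical point is the origin.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
# Hypothetical example: the planar squaring map
F = sp.Matrix([x**2 - y**2, 2 * x * y])
J = F.jacobian([x, y])
print(sp.simplify(J.det()))           # 4*x**2 + 4*y**2, zero only at the origin
print(J.subs({x: 0, y: 0}).rank())    # 0: rank not maximal, so (0, 0) is a critical point
print(J.subs({x: 1, y: 0}).rank())    # 2: full rank away from the origin
```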


Examples


Example 1

Consider the function \mathbf{f}: \mathbb{R}^2 \to \mathbb{R}^2, with (x, y) \mapsto (f_1(x, y), f_2(x, y)), given by

\mathbf{f}\left( \begin{bmatrix} x \\ y \end{bmatrix} \right) = \begin{bmatrix} f_1(x, y) \\ f_2(x, y) \end{bmatrix} = \begin{bmatrix} x^2 y \\ 5x + \sin y \end{bmatrix}.

Then we have f_1(x, y) = x^2 y and f_2(x, y) = 5x + \sin y. The Jacobian matrix of \mathbf{f} is

\mathbf{J}_{\mathbf{f}}(x, y) = \begin{bmatrix} \dfrac{\partial f_1}{\partial x} & \dfrac{\partial f_1}{\partial y} \\[1em] \dfrac{\partial f_2}{\partial x} & \dfrac{\partial f_2}{\partial y} \end{bmatrix} = \begin{bmatrix} 2xy & x^2 \\ 5 & \cos y \end{bmatrix}

and the Jacobian determinant is

\det(\mathbf{J}_{\mathbf{f}}(x, y)) = 2xy \cos y - 5x^2.
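
The same matrix and determinant can be reproduced with a computer algebra system; the short sketch below uses SymPy's jacobian method.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Matrix([x**2 * y, 5 * x + sp.sin(y)])
J = f.jacobian([x, y])
print(J)                       # Matrix([[2*x*y, x**2], [5, cos(y)]])
print(sp.simplify(J.det()))    # 2*x*y*cos(y) - 5*x**2
```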


Example 2: polar-Cartesian transformation

The transformation from polar coordinates (r, \varphi) to Cartesian coordinates (''x'', ''y'') is given by the function \mathbf{F} with components

\begin{align} x &= r \cos \varphi ; \\ y &= r \sin \varphi . \end{align}

The Jacobian matrix is

\mathbf{J}_{\mathbf{F}}(r, \varphi) = \begin{bmatrix} \dfrac{\partial x}{\partial r} & \dfrac{\partial x}{\partial \varphi} \\[0.5ex] \dfrac{\partial y}{\partial r} & \dfrac{\partial y}{\partial \varphi} \end{bmatrix} = \begin{bmatrix} \cos\varphi & -r\sin\varphi \\ \sin\varphi & r\cos\varphi \end{bmatrix}.

The Jacobian determinant is equal to r. This can be used to transform integrals between the two coordinate systems:

\iint_{\mathbf{F}(A)} f(x, y) \,dx \,dy = \iint_A f(r \cos \varphi, r \sin \varphi) \, r \, dr \, d\varphi .
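
As a numerical check of this change of variables, the sketch below integrates f(x, y) = x^2 + y^2 over the unit disk both directly in Cartesian coordinates and in polar coordinates with the extra factor r; the integrand and region are illustrative choices.

```python
import numpy as np
from scipy.integrate import dblquad

# Directly in Cartesian coordinates (y runs between the circle boundaries):
cart, _ = dblquad(lambda y, x: x**2 + y**2,
                  -1, 1,
                  lambda x: -np.sqrt(1 - x**2),
                  lambda x:  np.sqrt(1 - x**2))

# In polar coordinates, with the Jacobian determinant r as the extra factor:
polar, _ = dblquad(lambda phi, r: (r**2) * r,   # f(r cos phi, r sin phi) * r
                   0, 1,
                   lambda r: 0,
                   lambda r: 2 * np.pi)

print(cart, polar, np.pi / 2)   # all three agree (pi/2 ~ 1.5708)
```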


Example 3: spherical-Cartesian transformation

The transformation from spherical coordinates (\rho, \varphi, \theta) to Cartesian coordinates (''x'', ''y'', ''z'') is given by the function \mathbf{F} with components

\begin{align} x &= \rho \sin\varphi \cos\theta ; \\ y &= \rho \sin\varphi \sin\theta ; \\ z &= \rho \cos\varphi . \end{align}

The Jacobian matrix for this coordinate change is

\mathbf{J}_{\mathbf{F}}(\rho, \varphi, \theta) = \begin{bmatrix} \dfrac{\partial x}{\partial \rho} & \dfrac{\partial x}{\partial \varphi} & \dfrac{\partial x}{\partial \theta} \\[1em] \dfrac{\partial y}{\partial \rho} & \dfrac{\partial y}{\partial \varphi} & \dfrac{\partial y}{\partial \theta} \\[1em] \dfrac{\partial z}{\partial \rho} & \dfrac{\partial z}{\partial \varphi} & \dfrac{\partial z}{\partial \theta} \end{bmatrix} = \begin{bmatrix} \sin\varphi\cos\theta & \rho\cos\varphi\cos\theta & -\rho\sin\varphi\sin\theta \\ \sin\varphi\sin\theta & \rho\cos\varphi\sin\theta & \rho\sin\varphi\cos\theta \\ \cos\varphi & -\rho\sin\varphi & 0 \end{bmatrix}.

The determinant is \rho^2 \sin\varphi. Since dV = dx \, dy \, dz is the volume of a rectangular differential volume element (because the volume of a rectangular prism is the product of its sides), we can interpret dV = \rho^2 \sin\varphi \, d\rho \, d\varphi \, d\theta as the volume of the spherical differential volume element. Unlike the rectangular differential volume element's volume, this differential volume element's volume is not a constant, and varies with the coordinates (\rho and \varphi). It can be used to transform integrals between the two coordinate systems:

\iiint_{\mathbf{F}(U)} f(x, y, z) \,dx \,dy \,dz = \iiint_U f(\rho \sin\varphi \cos\theta, \rho \sin\varphi \sin\theta, \rho \cos\varphi) \, \rho^2 \sin\varphi \, d\rho \, d\varphi \, d\theta .
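
The determinant \rho^2 \sin\varphi can be confirmed symbolically with SymPy:

```python
import sympy as sp

rho, phi, theta = sp.symbols('rho phi theta', positive=True)
F = sp.Matrix([rho * sp.sin(phi) * sp.cos(theta),
               rho * sp.sin(phi) * sp.sin(theta),
               rho * sp.cos(phi)])
J = F.jacobian([rho, phi, theta])
print(sp.simplify(J.det()))   # rho**2*sin(phi)
```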


Example 4

The Jacobian matrix of the function \mathbf{F}: \mathbb{R}^3 \to \mathbb{R}^4 with components

\begin{align} y_1 &= x_1 \\ y_2 &= 5 x_3 \\ y_3 &= 4 x_2^2 - 2 x_3 \\ y_4 &= x_3 \sin x_1 \end{align}

is

\mathbf{J}_{\mathbf{F}}(x_1, x_2, x_3) = \begin{bmatrix} \dfrac{\partial y_1}{\partial x_1} & \dfrac{\partial y_1}{\partial x_2} & \dfrac{\partial y_1}{\partial x_3} \\[1em] \dfrac{\partial y_2}{\partial x_1} & \dfrac{\partial y_2}{\partial x_2} & \dfrac{\partial y_2}{\partial x_3} \\[1em] \dfrac{\partial y_3}{\partial x_1} & \dfrac{\partial y_3}{\partial x_2} & \dfrac{\partial y_3}{\partial x_3} \\[1em] \dfrac{\partial y_4}{\partial x_1} & \dfrac{\partial y_4}{\partial x_2} & \dfrac{\partial y_4}{\partial x_3} \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 5 \\ 0 & 8 x_2 & -2 \\ x_3 \cos x_1 & 0 & \sin x_1 \end{bmatrix}.

This example shows that the Jacobian matrix need not be a square matrix.
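
A quick symbolic check of the shape (and entries) of this rectangular Jacobian:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
F = sp.Matrix([x1, 5 * x3, 4 * x2**2 - 2 * x3, x3 * sp.sin(x1)])
J = F.jacobian([x1, x2, x3])
print(J.shape)   # (4, 3): four components, three variables, so J is not square
print(J)
```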


Example 5

The Jacobian determinant of the function \mathbf{F}: \mathbb{R}^3 \to \mathbb{R}^3 with components

\begin{align} y_1 &= 5 x_2 \\ y_2 &= 4 x_1^2 - 2 \sin(x_2 x_3) \\ y_3 &= x_2 x_3 \end{align}

is

\begin{vmatrix} 0 & 5 & 0 \\ 8 x_1 & -2 x_3 \cos(x_2 x_3) & -2 x_2 \cos(x_2 x_3) \\ 0 & x_3 & x_2 \end{vmatrix} = -8 x_1 \begin{vmatrix} 5 & 0 \\ x_3 & x_2 \end{vmatrix} = -40 x_1 x_2.

From this we see that \mathbf{F} reverses orientation near those points where x_1 and x_2 have the same sign; the function is ''locally invertible'' everywhere except near points where x_1 = 0 or x_2 = 0. Intuitively, if one starts with a tiny object around a point (x_1, x_2, x_3) and applies \mathbf{F} to that object, one will get a resulting object with approximately 40 |x_1 x_2| times the volume of the original one, with orientation reversed whenever x_1 and x_2 have the same sign.
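
The determinant -40 x_1 x_2 can be reproduced symbolically in the same way:

```python
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3')
F = sp.Matrix([5 * x2, 4 * x1**2 - 2 * sp.sin(x2 * x3), x2 * x3])
print(sp.simplify(F.jacobian([x1, x2, x3]).det()))   # -40*x1*x2
```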


Other uses


Dynamical systems

Consider a dynamical system of the form \dot{\mathbf{x}} = F(\mathbf{x}), where \dot{\mathbf{x}} is the (component-wise) derivative of \mathbf{x} with respect to the evolution parameter t (time), and F \colon \mathbb{R}^n \to \mathbb{R}^n is differentiable. If F(\mathbf{x}_0) = 0, then \mathbf{x}_0 is a stationary point (also called a steady state). By the Hartman–Grobman theorem, the behavior of the system near a stationary point is related to the eigenvalues of \mathbf{J}_F(\mathbf{x}_0), the Jacobian of F at the stationary point. Specifically, if the eigenvalues all have real parts that are negative, then the system is stable near the stationary point. If any eigenvalue has a real part that is positive, then the point is unstable. If the largest real part of the eigenvalues is zero, the Jacobian matrix does not allow for an evaluation of the stability.
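
As a minimal sketch, consider a hypothetical damped oscillator \dot{x}_1 = x_2, \dot{x}_2 = -x_1 - 0.5 x_2 (an illustrative choice). Its only stationary point is the origin, and the eigenvalues of the Jacobian there decide stability:

```python
import numpy as np

# Hypothetical system (a damped oscillator, chosen for illustration):
#   x1' = x2
#   x2' = -x1 - 0.5*x2
# Stationary point: x0 = (0, 0). The Jacobian of the right-hand side is constant here.
J = np.array([[0.0,  1.0],
              [-1.0, -0.5]])
eigvals = np.linalg.eigvals(J)
print(eigvals)                        # complex pair with real part -0.25
print(np.all(eigvals.real < 0))       # True: all real parts negative -> stable
```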


Newton's method

A square system of coupled nonlinear equations can be solved iteratively by Newton's method. This method uses the Jacobian matrix of the system of equations.
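
A minimal sketch of Newton's method for a square system, assuming a hypothetical pair of equations x^2 + y^2 = 4 and xy = 1 (illustrative choices): at each step the Jacobian is used to solve the linearized system.

```python
import numpy as np

# Newton's method for a square nonlinear system F(v) = 0, using the Jacobian.
def F(v):
    x, y = v
    return np.array([x**2 + y**2 - 4, x * y - 1])

def J(v):
    x, y = v
    return np.array([[2 * x, 2 * y],
                     [y,     x]])

v = np.array([2.0, 0.5])                        # initial guess
for _ in range(20):
    step = np.linalg.solve(J(v), F(v))          # solve J * step = F(v)
    v = v - step
    if np.linalg.norm(step) < 1e-12:
        break

print(v, F(v))   # a root satisfying x^2 + y^2 = 4 and x*y = 1
```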


Regression and least squares fitting

The Jacobian serves as a linearized design matrix in statistical regression and curve fitting; see non-linear least squares. The Jacobian is also used in random matrices, moments, local sensitivity and statistical diagnostics.
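
A minimal Gauss-Newton sketch illustrates the Jacobian's role as a linearized design matrix; the exponential model, the synthetic data, and the starting guess are illustrative assumptions.

```python
import numpy as np

# Gauss-Newton sketch: the Jacobian of the residuals acts as a linearized
# design matrix. Hypothetical model (for illustration): y = a * exp(b * t).
t = np.linspace(0, 1, 20)
y = 2.0 * np.exp(-1.3 * t)                     # synthetic, noise-free data

def residuals(p):
    a, b = p
    return a * np.exp(b * t) - y

def jacobian(p):                               # d r_i / d(a, b)
    a, b = p
    return np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])

p = np.array([1.5, -1.0])                      # initial guess near the solution
for _ in range(50):
    Jm, r = jacobian(p), residuals(p)
    step, *_ = np.linalg.lstsq(Jm, -r, rcond=None)   # linearized least-squares step
    p = p + step
    if np.linalg.norm(step) < 1e-12:
        break

print(p)   # ~ [2.0, -1.3]
```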


See also

* Center manifold
* Hessian matrix
* Pushforward (differential)


External links

* Mathworld
* A more technical explanation of Jacobians