
Jacobian matrix and determinant

In vector calculus, the Jacobian matrix (/dʒəˈkoʊbiən/, /dʒɪ-, jɪ-/) of a vector-valued function in several variables is the matrix of all its first-order partial derivatives. When this matrix is square, that is, when the function takes the same number of variables as input as the number of vector components of its output, both the matrix and its determinant are referred to as the Jacobian in the literature.

Suppose f : ℝn → ℝm is a function each of whose first-order partial derivatives exists on ℝn. This function takes a point x ∈ ℝn as input and produces the vector f(x) ∈ ℝm as output. Then the Jacobian matrix of f is defined to be the m×n matrix, denoted by J, whose (i, j)th entry is Jij = ∂fi/∂xj; explicitly, its i-th row is (∂fi/∂x1, ..., ∂fi/∂xn), the transpose of the gradient of the i-th component function. This matrix, whose entries are functions of x, is also denoted variously by Df, Jf, and ∂(f1,...,fm)/∂(x1,...,xn). (Note, however, that some literature defines the Jacobian as the transpose of the matrix given above.)

The Jacobian matrix represents the differential of f at every point where f is differentiable. In detail, with respect to a given point x ∈ ℝn, the linear transformation represented by J(x) takes as input a position vector in ℝn with x as reference point, and produces as output the position vector in ℝm, with f(x) as reference point, obtained by multiplying by J(x). If f is differentiable at a point x, then this is the linear transformation that best approximates f for points close to x, and it is known as the derivative or the differential of f at x.
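The entrywise definition Jij = ∂fi/∂xj translates directly into code. As a minimal sketch (the function f and the central-difference step eps below are illustrative choices, not part of the article), the Jacobian of a map f : ℝ² → ℝ² can be approximated numerically and checked against the analytic partial derivatives:

```python
import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    """Approximate the m x n Jacobian of f at x by central differences.

    Column j holds the partial derivatives of all components of f
    with respect to x_j, matching J_ij = df_i / dx_j.
    """
    x = np.asarray(x, dtype=float)
    fx = np.asarray(f(x), dtype=float)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        J[:, j] = (np.asarray(f(x + step)) - np.asarray(f(x - step))) / (2 * eps)
    return J

# Example map f(x, y) = (x^2 * y, 5x + sin y), so the analytic Jacobian is
# [[2xy, x^2], [5, cos y]]; at (1, 0) this is [[0, 1], [5, 1]].
f = lambda v: np.array([v[0] ** 2 * v[1], 5 * v[0] + np.sin(v[1])])
J = numerical_jacobian(f, [1.0, 0.0])
```

The finite-difference estimate agrees with the analytic matrix to within the truncation error of the central difference, which is O(eps²).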
When m = n, the Jacobian matrix is square, so its determinant is a well-defined function of x, known as the Jacobian determinant of f. It carries important information about the local behavior of f. In particular, the function f has, locally in a neighborhood of a point x, an inverse function that is differentiable if and only if the Jacobian determinant is nonzero at x (see the inverse function theorem; the Jacobian conjecture concerns a related problem of global invertibility). The Jacobian determinant also appears when changing the variables in multiple integrals (see the substitution rule for multiple variables).

When m = 1, that is, when f : ℝn → ℝ is a scalar-valued function, the Jacobian matrix reduces to a row vector. This row vector of all first-order partial derivatives of f is the transpose of the gradient of f, i.e. Jf = (∇f)ᵀ. Here we are adopting the convention that the gradient vector ∇f is a column vector. Specializing further, when m = n = 1, that is, when f : ℝ → ℝ is a scalar-valued function of a single variable, the Jacobian matrix has a single entry. This entry is the derivative of the function f.

These concepts are named after the mathematician Carl Gustav Jacob Jacobi (1804–1851). The Jacobian of a vector-valued function in several variables generalizes the gradient of a scalar-valued function in several variables, which in turn generalizes the derivative of a scalar-valued function of a single variable. In other words, the Jacobian matrix of a scalar-valued function in several variables is (the transpose of) its gradient, and the gradient of a scalar-valued function of a single variable is its derivative. At each point where a function is differentiable, its Jacobian matrix can also be thought of as describing the amount of 'stretching', 'rotating' or 'transforming' that the function imposes locally near that point.
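The role of the Jacobian determinant in the substitution rule can be illustrated symbolically. As a minimal sketch using SymPy (the polar-coordinate map below is a standard example, chosen here for illustration), the determinant of the polar-to-Cartesian map recovers the familiar factor r in dx dy = r dr dθ:

```python
import sympy as sp

r, theta = sp.symbols('r theta', positive=True)

# Polar-to-Cartesian map f(r, theta) = (r cos theta, r sin theta)
x = r * sp.cos(theta)
y = r * sp.sin(theta)

# Jacobian matrix of (x, y) with respect to (r, theta):
# [[cos theta, -r sin theta], [sin theta, r cos theta]]
J = sp.Matrix([x, y]).jacobian(sp.Matrix([r, theta]))

# Its determinant simplifies to r, the area-scaling factor that
# appears in the change of variables dx dy = r dr dtheta.
det = sp.simplify(J.det())
```

Since the determinant measures how the map scales area locally, a nonzero value at a point also confirms local invertibility there, consistent with the inverse function theorem.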
For example, if (x′, y′) = f(x, y) is used to smoothly transform an image, the Jacobian matrix Jf(x, y) describes how the image in the neighborhood of (x, y) is transformed.
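This local picture is exactly the first-order approximation f(x + d) ≈ f(x) + J(x)·d. As a minimal sketch (the smooth warp f below is a hypothetical example, not taken from the article), a small displacement near a point maps, to first order, through the linear map given by the Jacobian:

```python
import numpy as np

# Hypothetical smooth warp of the plane: f(x, y) = (x + 0.1*y^2, y + 0.1*x^2)
f = lambda v: np.array([v[0] + 0.1 * v[1] ** 2, v[1] + 0.1 * v[0] ** 2])

p = np.array([1.0, 2.0])

# Analytic Jacobian of the warp at p: [[1, 0.2*y], [0.2*x, 1]]
J = np.array([[1.0, 0.2 * p[1]],
              [0.2 * p[0], 1.0]])

# A small offset near p maps approximately through the linear map J:
d = np.array([0.01, -0.02])
approx = f(p) + J @ d   # first-order (linearized) image of p + d
exact = f(p + d)        # true image of p + d
# The approximation error shrinks quadratically as |d| -> 0.
```

This is the sense in which the Jacobian describes how a neighborhood of a pixel is stretched and rotated by the warp: the nonlinear map acts, to first order, like multiplication by J.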
