Liouville's theorem (differential algebra)

In mathematics, Liouville's theorem, originally formulated by Joseph Liouville between 1833 and 1841, places an important restriction on antiderivatives that can be expressed as elementary functions. The antiderivatives of certain elementary functions cannot themselves be expressed as elementary functions. A standard example of such a function is $e^{-x^{2}}$, whose antiderivative is (up to a multiplicative constant) the error function, familiar from statistics. Other examples include the functions $\frac{\sin(x)}{x}$ and $x^{x}$.

Liouville's theorem states that elementary antiderivatives, if they exist, must lie in the same differential field as the function, plus possibly a finite number of logarithms.

For any differential field $F$ with derivation $D$, the set $\operatorname{Con}(F) = \{f \in F : Df = 0\}$ forms a subfield called the constants of $F$. Given two differential fields $F$ and $G$, $G$ is called a logarithmic extension of $F$ if $G$ is a simple transcendental extension of $F$ (i.e. $G = F(t)$ for some transcendental $t$) such that
$$Dt = \frac{Ds}{s} \quad \text{for some } s \in F.$$
This has the form of a logarithmic derivative. Intuitively, one may think of $t$ as the logarithm of the element $s$ of $F$, in which case this condition is analogous to the ordinary chain rule. However, $F$ is not necessarily equipped with a unique logarithm; one might adjoin many "logarithm-like" extensions to $F$. Similarly, an exponential extension is a simple transcendental extension satisfying
$$Dt = t \, Ds \quad \text{for some } s \in F.$$
With the above caveat in mind, this element may be thought of as an exponential of the element $s$ of $F$. Finally, $G$ is called an elementary differential extension of $F$ if there is a finite chain of subfields from $F$ to $G$ where each extension in the chain is either algebraic, logarithmic, or exponential.

Suppose $F$ and $G$ are differential fields with $\operatorname{Con}(F) = \operatorname{Con}(G)$, and that $G$ is an elementary differential extension of $F$. Let $a \in F$, $y \in G$, and suppose $Dy = a$ (in words, suppose that $G$ contains an antiderivative of $a$). Then there exist $c_{1}, \ldots, c_{n} \in \operatorname{Con}(F)$ and $u_{1}, \ldots, u_{n}, v \in F$ such that
$$a = c_{1}\frac{Du_{1}}{u_{1}} + \cdots + c_{n}\frac{Du_{n}}{u_{n}} + Dv.$$
In other words, the only functions that have "elementary antiderivatives" (i.e. antiderivatives living in, at worst, an elementary differential extension of $F$) are those of this form. Thus, on an intuitive level, the theorem states that the only elementary antiderivatives are the "simple" functions plus a finite number of logarithms of "simple" functions.
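A concrete illustration of this form (a standard example, added here for clarity): take $F = \mathbb{C}(x)$ with $D = \frac{d}{dx}$ and $a = \frac{1}{x}$. The antiderivative $\log x$ does not lie in $F$ itself, but it lives in the logarithmic extension $F(\log x)$, and $a$ has exactly the shape required by the theorem, with $n = 1$, $c_{1} = 1$, $u_{1} = x$, and $v = 0$:
$$\frac{1}{x} = 1 \cdot \frac{Dx}{x} + D(0).$$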
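The theorem can also be used in the other direction, to prove that an antiderivative is not elementary. A standard consequence (sketched here, not proved in full) is that $\int f(x)e^{g(x)}\,dx$, with $f$ and $g$ rational and $g$ nonconstant, is elementary if and only if there is a rational function $R$ satisfying $f = DR + R\,Dg$, since then $f e^{g} = D(R e^{g})$. For $e^{-x^{2}}$ (so $f = 1$, $g = -x^{2}$) this requires a rational $R$ with
$$DR - 2xR = 1.$$
A pole of $R$ would produce a pole of strictly higher order in $DR$ that $-2xR$ cannot cancel, so $R$ would have to be a polynomial; but for a nonzero polynomial $R$ the left-hand side has degree $\deg R + 1 \geq 1$ and cannot equal the constant $1$. Hence no such $R$ exists, and the error function is not elementary.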

[ "Relationship between string theory and quantum field theory", "Liouville field theory" ]
Parent Topic
Child Topic
    No Parent Topic