
Basis (linear algebra)

In mathematics, a set B of elements (vectors) in a vector space V is called a basis if every element of V can be written in a unique way as a (finite) linear combination of elements of B. The coefficients of this linear combination are referred to as the components or coordinates of the vector with respect to B. The elements of a basis are called basis vectors.

Equivalently, B is a basis if its elements are linearly independent and every element of V is a linear combination of elements of B. In more general terms, a basis is a linearly independent spanning set. A vector space can have several bases; however, all bases have the same number of elements, called the dimension of the vector space.

A basis B of a vector space V over a field F (such as the real numbers R or the complex numbers C) is a linearly independent subset of V that spans V. This means that a subset B of V is a basis if it satisfies the two following conditions:

- Linear independence: for every finite subset {b1, ..., bn} of B, if c1b1 + ... + cnbn = 0 for some c1, ..., cn in F, then c1 = ... = cn = 0.
- Spanning property: for every vector v in V, one can choose v1, ..., vn in F and b1, ..., bn in B such that v = v1b1 + ... + vnbn.

The scalars vi are called the coordinates of the vector v with respect to the basis B, and by the first property they are uniquely determined.

A vector space that has a finite basis is called finite-dimensional. In this case, the finite subset {b1, ..., bn} that is considered (twice) in the above definition may be chosen as B itself.

It is often convenient or even necessary to have an ordering on the basis vectors, for example when discussing orientation, or when one considers the scalar coefficients of a vector with respect to a basis without referring explicitly to the basis elements.
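Because the coordinates of a vector with respect to a finite basis are the unique solution of a linear system, they can be computed by Gaussian elimination. The following sketch (plain Python with exact `Fraction` arithmetic; the function name `coordinates` is my own, not standard) illustrates this for a basis of F^n given as a list of n vectors:

```python
from fractions import Fraction

def coordinates(basis, v):
    """Solve x1*basis[0] + ... + xn*basis[n-1] = v by Gaussian elimination.

    basis: list of n linearly independent vectors in F^n; v: vector in F^n.
    Returns the unique coordinate list [x1, ..., xn].
    """
    n = len(basis)
    # Augmented matrix whose columns are the basis vectors, last column is v.
    A = [[Fraction(basis[j][i]) for j in range(n)] + [Fraction(v[i])]
         for i in range(n)]
    for col in range(n):
        # Find a pivot row and swap it into place.
        pivot = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[pivot] = A[pivot], A[col]
        # Normalise the pivot row, then eliminate the column elsewhere.
        p = A[col][col]
        A[col] = [a / p for a in A[col]]
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    return [A[i][n] for i in range(n)]

# The vector (5, 7) in the basis {(1, 1), (1, -1)} of Q^2:
c = coordinates([(1, 1), (1, -1)], (5, 7))
# x1*(1, 1) + x2*(1, -1) = (5, 7)  gives  c == [6, -1]
```

Uniqueness of the result is exactly the linear-independence condition above: if two coordinate lists represented the same vector, their difference would be a nontrivial combination equal to zero.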
In this case, the ordering is necessary for associating each coefficient with the corresponding basis element. This ordering can be done by numbering the basis elements. For example, when dealing with (m, n)-matrices, the (i, j)th element (in the ith row and jth column) can be identified with the (m(j − 1) + i)th element of a basis consisting of the (m, n) unit matrices (the row index varying before the column index). To emphasize that an order has been chosen, one speaks of an ordered basis, which is therefore not simply an unstructured set, but rather a sequence, an indexed family, or similar; see Ordered bases and coordinates below.

Many properties of finite bases result from the Steinitz exchange lemma, which states that, given a finite spanning set S and a linearly independent subset L of n elements of the span of S, one may replace n well-chosen elements of S by the elements of L to obtain a spanning set that contains L, has its other elements in S, and has the same number of elements as S. Most properties resulting from the Steinitz exchange lemma remain true when there is no finite spanning set, but their proofs in the infinite case generally require the axiom of choice or a weaker form of it, such as the ultrafilter lemma.
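The numbering m(j − 1) + i described above, and its inverse, can be sketched as a pair of index functions (the function names here are illustrative, not standard):

```python
def matrix_unit_index(i, j, m):
    """1-based position of the (i, j) unit matrix in the ordered basis of
    (m, n)-matrices, with the row index varying fastest (column-major)."""
    return m * (j - 1) + i

def index_to_position(k, m):
    """Inverse map: basis position k back to the (i, j) entry, 1-based."""
    i = (k - 1) % m + 1
    j = (k - 1) // m + 1
    return i, j

# For (3, 2)-matrices the ordered basis is E11, E21, E31, E12, E22, E32:
assert matrix_unit_index(3, 1, 3) == 3   # E31 is third
assert matrix_unit_index(1, 2, 3) == 4   # E12 is fourth
assert index_to_position(4, 3) == (1, 2)
```

This is the same column-major numbering used, for instance, when an m-by-n matrix is flattened into a vector of length mn.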
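The exchange process behind the Steinitz lemma can be carried out directly: write each element of L in terms of the current spanning set, then swap it for an element of S that carries a nonzero coefficient. A minimal sketch over Q (pure Python with exact arithmetic; `represent` and `steinitz_exchange` are hypothetical helper names, not a standard API):

```python
from fractions import Fraction

def represent(vectors, target):
    """Return coefficients c with sum(c[k] * vectors[k]) == target, found by
    Gaussian elimination; free variables are set to 0."""
    d, n = len(target), len(vectors)
    A = [[Fraction(vectors[k][i]) for k in range(n)] + [Fraction(target[i])]
         for i in range(d)]
    pivots, row = [], 0
    for col in range(n):
        r = next((r for r in range(row, d) if A[r][col] != 0), None)
        if r is None:
            continue
        A[row], A[r] = A[r], A[row]
        p = A[row][col]
        A[row] = [a / p for a in A[row]]
        for rr in range(d):
            if rr != row and A[rr][col] != 0:
                f = A[rr][col]
                A[rr] = [a - f * b for a, b in zip(A[rr], A[row])]
        pivots.append((row, col))
        row += 1
    coeffs = [Fraction(0)] * n
    for r, c in pivots:
        coeffs[c] = A[r][n]
    return coeffs

def steinitz_exchange(S, L):
    """Replace len(L) elements of the spanning set S by the linearly
    independent vectors of L, keeping a spanning set of the same size."""
    T, placed = list(S), []
    for l in L:
        c = represent(T, l)
        # Since L is linearly independent, some vector of T that is not an
        # already-placed element of L must carry a nonzero coefficient.
        k = next(k for k in range(len(T)) if c[k] != 0 and T[k] not in placed)
        T[k] = l
        placed.append(l)
    return T

S = [(1, 0), (0, 1), (1, 1)]   # spans Q^2, redundantly
L = [(2, 3)]
T = steinitz_exchange(S, L)
# T contains (2, 3), keeps two elements of S, and still spans Q^2.
```

The returned set has the same number of elements as S, contains L, and still spans, which is exactly the conclusion of the lemma in this finite setting.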
