
Square matrix

In mathematics, a square matrix is a matrix with the same number of rows and columns. An n-by-n matrix is known as a square matrix of order $n$. Any two square matrices of the same order can be added and multiplied.

Square matrices are often used to represent simple linear transformations, such as shearing or rotation. For example, if $R$ is a square matrix representing a rotation (a rotation matrix) and $v$ is a column vector describing the position of a point in space, the product $Rv$ yields another column vector describing the position of that point after the rotation. If $v$ is a row vector, the same transformation can be obtained using $vR^{\mathsf{T}}$, where $R^{\mathsf{T}}$ is the transpose of $R$.

The entries $a_{ii}$ ($i = 1, \ldots, n$) form the main diagonal of a square matrix. They lie on the imaginary line that runs from the top-left corner to the bottom-right corner of the matrix. For instance, in a 4-by-4 matrix with diagonal entries $a_{11} = 9$, $a_{22} = 11$, $a_{33} = 4$, $a_{44} = 10$, these four elements make up the main diagonal. The diagonal of a square matrix from the top-right to the bottom-left corner is called the antidiagonal or counterdiagonal.

If all entries outside the main diagonal are zero, $A$ is called a diagonal matrix. If all entries above (respectively, below) the main diagonal are zero, $A$ is called a lower (respectively, upper) triangular matrix.

The identity matrix $I_n$ of size $n$ is the $n \times n$ matrix in which all the elements on the main diagonal are equal to 1 and all other elements are equal to 0, e.g.

$$I_1 = \begin{pmatrix} 1 \end{pmatrix},\quad I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},\quad \ldots,\quad I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}.$$

It is a square matrix of order $n$, and also a special kind of diagonal matrix. It is called the identity matrix because multiplication with it leaves a matrix unchanged: $AI_n = I_nA = A$ for any $n \times n$ matrix $A$.

A square matrix $A$ that is equal to its transpose, i.e., $A = A^{\mathsf{T}}$, is a symmetric matrix. If instead $A$ is equal to the negative of its transpose, i.e., $A = -A^{\mathsf{T}}$, then $A$ is a skew-symmetric matrix. For complex matrices, symmetry is often replaced by the concept of Hermitian matrices, which satisfy $A^{\mathrm{H}} = A$, where $A^{\mathrm{H}}$ denotes the conjugate transpose of the matrix, i.e., the transpose of the complex conjugate of $A$.

By the spectral theorem, real symmetric (or complex Hermitian) matrices have an orthogonal (or unitary) eigenbasis; i.e., every vector is expressible as a linear combination of eigenvectors. In both cases, all eigenvalues are real. This theorem can be generalized to infinite-dimensional settings involving matrices with infinitely many rows and columns.
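As a quick illustration of the rotation example above, here is a minimal NumPy sketch; the angle and the point are arbitrary values chosen for demonstration:

```python
import numpy as np

# Assumed example: rotate a point in the plane by 90 degrees.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # 2x2 rotation matrix

v = np.array([[1.0],
              [0.0]])       # column vector: the point (1, 0)

print(R @ v)                # column vector ~(0, 1): the rotated point

# The same transformation applied to a row vector uses the transpose of R.
v_row = np.array([[1.0, 0.0]])
print(v_row @ R.T)          # row vector ~(0, 1): the same result
```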
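The diagonal, triangular, and identity notions can be demonstrated the same way. In this sketch the 4-by-4 matrix reuses the diagonal entries from the text, but its off-diagonal values are made up for illustration:

```python
import numpy as np

A = np.array([[9, 13, 5, 2],
              [1, 11, 7, 6],
              [3,  7, 4, 1],
              [6,  0, 3, 10]])   # diagonal 9, 11, 4, 10; other entries arbitrary

print(np.diag(A))                # main diagonal: [ 9 11  4 10]
print(np.fliplr(A).diagonal())   # antidiagonal, read top right to bottom left

print(np.tril(A))                # lower triangular part (entries above diagonal zeroed)
print(np.triu(A))                # upper triangular part (entries below diagonal zeroed)

I = np.eye(4)                    # 4x4 identity matrix
print(np.allclose(A @ I, A) and np.allclose(I @ A, A))  # True: A I = I A = A
```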
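Finally, a sketch of the symmetry conditions and the finite-dimensional spectral theorem, again with made-up example matrices; `np.linalg.eigh` is NumPy's eigensolver for symmetric/Hermitian matrices:

```python
import numpy as np

S = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # symmetric: S equals its transpose
print(np.allclose(S, S.T))          # True

K = np.array([[ 0.0, 4.0],
              [-4.0, 0.0]])         # skew-symmetric: K equals -K^T
print(np.allclose(K, -K.T))         # True

H = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])       # Hermitian: equal to its conjugate transpose
print(np.allclose(H, H.conj().T))   # True

# Spectral theorem: eigh returns real eigenvalues and an orthonormal
# (unitary, in the complex case) basis of eigenvectors.
eigvals, Q = np.linalg.eigh(H)
print(eigvals)                                             # real eigenvalues
print(np.allclose(Q.conj().T @ Q, np.eye(2)))              # True: orthonormal columns
print(np.allclose(Q @ np.diag(eigvals) @ Q.conj().T, H))   # True: H reconstructed
```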

[ "Symmetric matrix", "companion matrices", "Commutation matrix", "Skew-Hermitian matrix", "Square root of a matrix", "Determinant" ]