
Matrices



A matrix is a set of quantities arranged in rows and columns to form a rectangular array. A matrix does not have a numerical value of its own. Matrices are used to represent relations between quantities as well as to represent and solve simultaneous equations. A matrix of m rows and n columns is called an (m × n) matrix.
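As a minimal sketch of the "solving simultaneous equations" use mentioned above, the plain-Python snippet below stores a (2 × 2) matrix as a list of rows and solves a small system by Cramer's rule (the particular matrix and right-hand side are illustrative choices, not from the text):

```python
# A (2 x 2) matrix A: m = 2 rows, n = 2 columns, stored as a list of rows.
A = [[2.0, 1.0],
     [1.0, 3.0]]
b = [5.0, 10.0]  # right-hand side of the simultaneous equations A x = b

# Solve the system 2x + y = 5, x + 3y = 10 by Cramer's rule.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
x = (b[0] * A[1][1] - A[0][1] * b[1]) / det
y = (A[0][0] * b[1] - b[0] * A[1][0]) / det
print(x, y)  # → 1.0 3.0
```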

There are several types of matrices: the square matrix, the row matrix, the column matrix, the unit matrix, the transpose of a matrix and others. Here we will look more closely at one particular kind of square matrix: the orthogonal matrix.

In linear algebra an orthogonal matrix is a square matrix with real entries whose columns and rows are orthonormal vectors. Equivalently, a matrix Q is orthogonal if its transpose – the matrix that results from interchanging the rows and columns – is equal to its inverse: Qᵀ = Q⁻¹, from which it follows that QᵀQ = QQᵀ = I, where I is the identity matrix: the square matrix in which all the elements on the leading diagonal are ones and the other elements are equal to zero. An orthogonal matrix is the real specialization of a unitary matrix.
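The defining property QᵀQ = I can be checked numerically. The sketch below builds a 2 × 2 rotation matrix (the angle is an arbitrary illustrative choice) and verifies that multiplying its transpose by itself gives the identity:

```python
import math

# An arbitrary 2 x 2 rotation matrix; rotations are orthogonal.
t = math.pi / 6
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

# Q^T Q should be the identity matrix (up to floating-point rounding).
QtQ = matmul(transpose(Q), Q)
identity = all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
               for i in range(2) for j in range(2))
print(identity)  # → True
```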

The set of n × n orthogonal matrices forms a group O(n), known as the orthogonal group. The subgroup SO(n) consisting of orthogonal matrices with determinant +1 is called the special orthogonal group, and each of its elements is a special orthogonal matrix. Orthogonal matrices arise naturally from inner products; for matrices of complex numbers the analogous condition leads to unitary matrices. Orthogonal matrices preserve the inner product, so for vectors u, v in an n-dimensional real inner product space ⟨u, v⟩ = ⟨Qu, Qv⟩.
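The preservation property ⟨u, v⟩ = ⟨Qu, Qv⟩ can be illustrated directly. In the sketch below the rotation angle and the two vectors are arbitrary illustrative choices:

```python
import math

# An arbitrary rotation Q and two arbitrary vectors u, v.
t = 0.7
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]
u = [1.0, 2.0]
v = [-3.0, 0.5]

def apply(M, x):
    """Matrix-vector product M x."""
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def dot(a, b):
    """Standard inner product <a, b>."""
    return sum(ai * bi for ai, bi in zip(a, b))

# <u, v> should equal <Qu, Qv> because Q is orthogonal.
preserved = abs(dot(u, v) - dot(apply(Q, u), apply(Q, v))) < 1e-12
print(preserved)  # → True
```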

To see the inner product connection, consider a vector v in an n-dimensional real inner product space. Written with respect to an orthonormal basis, the squared length of v is vᵀv. If the linear transformation Q preserves vector lengths, then vᵀv = (Qv)ᵀ(Qv) = vᵀQᵀQv, which holds for all v exactly when QᵀQ = I, that is, when Q is orthogonal.

Finite-dimensional linear isometries – rotations, reflections, and their combinations – produce orthogonal matrices. The converse is also true: every orthogonal matrix describes an orthogonal transformation. However, linear algebra also includes orthogonal transformations between spaces which may be neither finite-dimensional nor of the same dimension.

The inverse of every orthogonal matrix is again orthogonal. In fact, the set of all n × n orthogonal matrices satisfies all the axioms of a group. It is a compact Lie group of dimension n(n − 1)/2, called the orthogonal group and denoted by O(n).
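Two of the group axioms mentioned above – closure under multiplication and closure under taking inverses – can be spot-checked numerically. In this sketch the two rotation angles are arbitrary, and the inverse of a rotation is taken via its transpose:

```python
import math

def rot(t):
    """A 2 x 2 rotation matrix, which is orthogonal."""
    return [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]

def transpose(M):
    return [list(r) for r in zip(*M)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def is_orthogonal(M):
    """Check M^T M = I up to floating-point rounding."""
    P = matmul(transpose(M), M)
    return all(abs(P[i][j] - (i == j)) < 1e-12 for i in range(2) for j in range(2))

A, B = rot(0.3), rot(1.1)
closed = is_orthogonal(matmul(A, B))   # closure: a product of orthogonal matrices
inv_ok = is_orthogonal(transpose(A))   # inverse: A^-1 = A^T is again orthogonal
print(closed, inv_ok)  # → True True
```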

The orthogonal matrices whose determinant is +1 form the special orthogonal group SO(n) of rotations. Now let's consider (n + 1) × (n + 1) orthogonal matrices with bottom right entry equal to 1. The remainder of the last column (and last row) must be zeros, and the product of any two such matrices has the same form. The rest of the matrix is an n × n orthogonal matrix; thus O(n) is a subgroup of O(n + 1) (and of all higher groups).
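The embedding just described can be sketched concretely: pad a 2 × 2 orthogonal matrix with a 1 in the bottom right corner and zeros in the rest of the last row and column, and the 3 × 3 result is again orthogonal (the rotation angle is an arbitrary choice):

```python
import math

# A 2 x 2 orthogonal matrix (a rotation).
t = 0.4
Q2 = [[math.cos(t), -math.sin(t)],
      [math.sin(t),  math.cos(t)]]

# Embed Q2 into a 3 x 3 matrix: bottom right entry 1, zeros elsewhere
# in the last row and column.
Q3 = [[Q2[0][0], Q2[0][1], 0.0],
      [Q2[1][0], Q2[1][1], 0.0],
      [0.0,      0.0,      1.0]]

def transpose(M):
    return [list(r) for r in zip(*M)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Q3^T Q3 should be the 3 x 3 identity.
P = matmul(transpose(Q3), Q3)
embedded_orthogonal = all(abs(P[i][j] - (i == j)) < 1e-12
                          for i in range(3) for j in range(3))
print(embedded_orthogonal)  # → True
```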

Since an elementary reflection can reduce any orthogonal matrix to this constrained form, a series of such reflections can bring any orthogonal matrix to the identity; thus the orthogonal group is a reflection group.
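An elementary (Householder) reflection has the form H = I − 2vvᵀ/(vᵀv); such reflections are themselves orthogonal, with determinant −1, and they generate the orthogonal group. The sketch below checks both properties for an arbitrary illustrative vector v:

```python
# An arbitrary nonzero vector defining the reflection hyperplane normal.
v = [3.0, 4.0]
vv = sum(x * x for x in v)  # v^T v = 25

# Householder reflection H = I - 2 v v^T / (v^T v).
H = [[(1.0 if i == j else 0.0) - 2.0 * v[i] * v[j] / vv for j in range(2)]
     for i in range(2)]

# H is symmetric, so H^T H = H H; it should equal the identity, and det H = -1.
HH = [[sum(H[i][k] * H[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
orthogonal = all(abs(HH[i][j] - (i == j)) < 1e-12 for i in range(2) for j in range(2))
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
print(orthogonal, round(det))  # → True -1
```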

Orthogonal matrices are important for a number of reasons, both theoretical and practical.





Published: 2015-02-28.


