Lecture
Let a scalar product $\langle X,Y\rangle$ be defined in a Euclidean space $\mathbb E$ in the usual way. The Gram matrix of a system of vectors $\{X_1,\dots,X_m\}$ is the square matrix composed of all the pairwise scalar products of these vectors:
$$ G(X_1,\dots,X_m)=\left[\langle X_j,X_k\rangle\right]_{j,k=1}^{m}=\begin{pmatrix} \langle X_1,X_1\rangle & \langle X_1,X_2\rangle & \dots & \langle X_1,X_m\rangle\\ \langle X_2,X_1\rangle & \langle X_2,X_2\rangle & \dots & \langle X_2,X_m\rangle\\ \vdots & & & \vdots\\ \langle X_m,X_1\rangle & \langle X_m,X_2\rangle & \dots & \langle X_m,X_m\rangle \end{pmatrix}. $$
The Gram matrix is symmetric. Its determinant is called the Gram determinant (or the Gramian) of the vector system:
$$ \mathfrak G(X_1,\dots,X_m)=\det G(X_1,\dots,X_m). $$
P
Example. If in the space $\mathbb R^n$ of rows of real numbers the scalar product is defined by the standard rule 1)
$$ \langle X,Y\rangle=x_1y_1+x_2y_2+\dots+x_ny_n, $$
then the Gram matrix of the rows $X_1,\dots,X_m$ is computed by matrix multiplication:
$$ G(X_1,\dots,X_m)=AA^{\top} \quad \text{for} \quad A=\begin{pmatrix} X_1\\ \vdots\\ X_m \end{pmatrix}, $$
where $\top$ means transposition. From the Binet--Cauchy theorem it follows immediately that for $m>n$ (the number of rows exceeds the dimension of the space) the Gram determinant equals zero. This result is generalized ☞ BELOW to arbitrary Euclidean spaces.
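As a quick illustration (a sketch in plain Python; the helper names `gram` and `det` are mine, not from the lecture):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram(rows):
    # Gram matrix under the standard scalar product: G = A A^T
    return [[dot(r1, r2) for r2 in rows] for r1 in rows]

def det(m):
    # determinant by cofactor expansion along the first row (small matrices only)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

# m = 4 rows in R^3: more rows than the dimension, so the Gramian must vanish
A = [[1, 2, 0], [0, 1, 1], [1, 0, 2], [2, 1, 1]]
print(det(gram(A)))  # 0
```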
P
Example. If in the space of polynomials with real coefficients the scalar product is given by the formula
$$ \langle p(x),q(x)\rangle=\int_0^1 p(x)\,q(x)\,dx, $$
then for the system of monomials $\{1,x,\dots,x^{n-1}\}$
$$ G(1,x,\dots,x^{n-1})=\left[\frac{1}{j+k-1}\right]_{j,k=1}^{n}=\begin{pmatrix} 1 & 1/2 & \dots & 1/n\\ 1/2 & 1/3 & \dots & 1/(n+1)\\ \vdots & & & \vdots\\ 1/n & 1/(n+1) & \dots & 1/(2n-1) \end{pmatrix}. $$
The resulting matrix is known as the Hilbert matrix.
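These entries can be generated exactly (a sketch in rational arithmetic; `hilbert_gram` and `det` are my names): since $\langle x^j,x^k\rangle=\int_0^1 x^{j+k}\,dx=1/(j+k+1)$, no numerical integration is needed.

```python
from fractions import Fraction

def det(m):
    # determinant by cofactor expansion along the first row (small matrices only)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def hilbert_gram(n):
    # <x^j, x^k> = integral_0^1 x^(j+k) dx = 1/(j+k+1) for j, k = 0..n-1
    return [[Fraction(1, j + k + 1) for k in range(n)] for j in range(n)]

print(det(hilbert_gram(3)))  # 1/2160: Hilbert determinants shrink very fast
```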
If the system of vectors $\{X_1,\dots,X_n\}$ forms a basis of the space (i.e. the space is $n$-dimensional), then knowledge of the Gram matrix allows one to reduce the computation of the scalar product of arbitrary vectors $X=x_1X_1+\dots+x_nX_n$ and $Y=y_1X_1+\dots+y_nX_n$ to operations on their coordinates:
$$ \langle X,Y\rangle=\sum_{j=1}^{n}\sum_{k=1}^{n} x_jy_k\langle X_j,X_k\rangle. $$
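For instance (a sketch; the basis and the coordinates are my own choice), in $\mathbb R^2$ with the basis $X_1=(1,1)$, $X_2=(1,-1)$ the scalar product computed through the Gram matrix agrees with the direct computation:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

basis = [[1, 1], [1, -1]]
G = [[dot(b1, b2) for b2 in basis] for b1 in basis]   # Gram matrix of the basis

x, y = [2, 3], [1, -1]   # coordinates of X and Y in this basis
X = [sum(c * b[i] for c, b in zip(x, basis)) for i in range(2)]   # X = 2*X1 + 3*X2
Y = [sum(c * b[i] for c, b in zip(y, basis)) for i in range(2)]   # Y = X1 - X2

via_gram = sum(x[j] * G[j][k] * y[k] for j in range(2) for k in range(2))
print(via_gram, dot(X, Y))  # -2 -2
```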
T
Theorem. $\mathfrak G(X_1,\dots,X_m)=0$ if and only if the system of vectors $\{X_1,\dots,X_m\}$ is linearly dependent.
Proof ☞ HERE.
=>
The rank of the Gram matrix coincides with the rank of the system of vectors that generates it:
$$ \operatorname{rank} G(X_1,\dots,X_m)=\operatorname{rank}\{X_1,\dots,X_m\}. $$
=>
If some principal minor of a Gram matrix vanishes, then all its principal minors of higher orders vanish as well.
T
Theorem. $\mathfrak G(X_1,\dots,X_m)\ge 0$ for any system of vectors $\{X_1,\dots,X_m\}$.
Proof ☞ HERE
=>
For $m=2$ we obtain the Cauchy--Bunyakovsky inequality:
$$ \langle X,Y\rangle^2\le \langle X,X\rangle\cdot\langle Y,Y\rangle. $$
=>
The Gram matrix of a linearly independent system of vectors is positive definite.
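For two vectors the theorem reads $\mathfrak G(X,Y)=\langle X,X\rangle\langle Y,Y\rangle-\langle X,Y\rangle^2\ge 0$, which is exactly the Cauchy--Bunyakovsky inequality; a quick numeric check (a sketch, standard scalar product assumed):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

X, Y = [1, 2, 3], [4, -1, 2]
gramian = dot(X, X) * dot(Y, Y) - dot(X, Y) ** 2   # the 2x2 Gram determinant
print(gramian)                                     # 230, nonnegative
print(dot(X, Y) ** 2 <= dot(X, X) * dot(Y, Y))     # True: Cauchy-Bunyakovsky
```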
T
Theorem. Let $X^{\perp}$ denote the orthogonal component of the vector $X$ relative to the linear span of the system $\{X_1,\dots,X_m\}$. Then
$$ \mathfrak G(X_1,\dots,X_m,X)=\mathfrak G(X_1,\dots,X_m)\cdot\langle X^{\perp},X^{\perp}\rangle. $$
Proof ☞ HERE
=>
The value of the Gram determinant does not exceed its leading term, i.e. the product of its main-diagonal entries:
$$ \mathfrak G(X_1,\dots,X_m)\le \langle X_1,X_1\rangle\cdot\langle X_2,X_2\rangle\cdots\langle X_m,X_m\rangle. $$
=>
For an arbitrary square real matrix
$$ A=\left[a_{jk}\right]_{j,k=1}^{n} $$
the Hadamard inequality 2) holds:
$$ (\det A)^2\le \prod_{j=1}^{n}\left(a_{j1}^2+a_{j2}^2+\dots+a_{jn}^2\right). $$
In other words: the absolute value of the determinant of a matrix does not exceed the product of the lengths of its rows. A similar statement holds for the columns of the matrix.
Proof. Denote the rows of the matrix $A$ by $X_1,\dots,X_n$. Since $\det(AA^{\top})=(\det A)^2$ (see property 1 ☞ HERE), we have
$$ (\det A)^2=\mathfrak G(X_1,\dots,X_n) $$
when the scalar product in $\mathbb R^n$ is defined in the standard way. By the preceding corollary,
$$ \mathfrak G(X_1,\dots,X_n)\le \langle X_1,X_1\rangle\cdots\langle X_n,X_n\rangle, $$
which is the required bound. Equality is possible if and only if either all the rows are pairwise orthogonal or at least one row is zero. ♦
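A numeric check of the inequality (a sketch; the matrix is my own example):

```python
import math

def det(m):
    # determinant by cofactor expansion along the first row (small matrices only)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

A = [[1, 2, 2], [3, 0, -1], [1, 1, 4]]
bound = math.prod(math.sqrt(sum(x * x for x in row)) for row in A)
print(abs(det(A)), "<=", round(bound, 3))  # 19 <= 40.249
```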
P
Example.
with the exact value of the determinant.
T
Theorem. The value of the Gram determinant does not change if the Gram--Schmidt orthogonalization algorithm is applied to the system of vectors. In the notation of that algorithm this amounts to the equality
$$ \mathfrak G(X_1,\dots,X_m)=\mathfrak G(Y_1,\dots,Y_m)=\langle Y_1,Y_1\rangle\cdots\langle Y_m,Y_m\rangle, $$
where $\{Y_1,\dots,Y_m\}$ is the orthogonalized system.
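A sketch of this invariance in exact rational arithmetic (the function names and the test vectors are mine): the Gramian of a system equals the product of the squared lengths of its Gram--Schmidt orthogonalization.

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det(m):
    # determinant by cofactor expansion along the first row (small matrices only)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def gram_schmidt(vectors):
    # classical Gram-Schmidt orthogonalization, without normalization
    ortho = []
    for v in vectors:
        w = [Fraction(c) for c in v]
        for u in ortho:
            coeff = dot(w, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

X = [[1, 2, 2], [3, 0, -1], [1, 1, 4]]
Y = gram_schmidt(X)
gramian = det([[dot(a, b) for b in X] for a in X])
product = Fraction(1)
for w in Y:
    product *= dot(w, w)        # |Y_1|^2 * ... * |Y_m|^2
print(gramian == product)       # True
```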
T
Theorem. The distance from a point $Y$ to the linear manifold
$$ \mathbb M=\{X_0+\lambda_1X_1+\dots+\lambda_kX_k \mid \lambda_1,\dots,\lambda_k\in\mathbb R\} $$
in $\mathbb R^n$, for fixed linearly independent vectors $X_1,\dots,X_k$, is calculated by the formula
$$ d^2=\frac{\mathfrak G(X_1,\dots,X_k,Y-X_0)}{\mathfrak G(X_1,\dots,X_k)}. $$
Proof for the case $X_0=\mathbb O$ ☞ HERE. The general case $X_0\ne\mathbb O$ is reduced to the previous one by shifting the space by the vector $-X_0$: see the comments to the theorem ☞ HERE. ♦
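A sketch of the formula for the simplest case $X_0=\mathbb O$, $k=1$ (distance from a point to a line through the origin; the names and the data are my own):

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det(m):
    # determinant by cofactor expansion along the first row (small matrices only)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def gramian(*vecs):
    return det([[dot(a, b) for b in vecs] for a in vecs])

X1 = [1, 0, 0]            # the manifold is the x-axis in R^3
Y = [3, 4, 0]
d_squared = Fraction(gramian(X1, Y), gramian(X1))
print(d_squared)          # 16, i.e. the distance equals 4
```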
§
Other applications of the Gram determinant to problems of computing distances between surfaces in $\mathbb R^n$ ☞ HERE.
The area of a parallelogram equals the product of its base and its height. If the parallelogram is built on the vectors $X_1$ and $X_2$, then for the base one can take the length of the vector $X_1$, and for the height the length of the perpendicular dropped from the end of the vector $X_2$ onto the axis of the vector $X_1$.
Similarly, the volume of a parallelepiped built on vectors $X_1,X_2,X_3$ equals the product of the area of its base and its height: the area of the base is the area of the parallelogram built on the vectors $X_1,X_2$, and the height is the length of the perpendicular dropped from the end of the vector $X_3$ onto the plane of the vectors $X_1,X_2$.
The volume of an $m$-dimensional parallelepiped in a Euclidean space is defined by induction. If the parallelepiped is built on the vectors $X_1,\dots,X_m$, then for its volume we take the product of the volume of the $(m-1)$-dimensional parallelepiped built on the vectors $X_1,\dots,X_{m-1}$ and the length of the perpendicular dropped from the point $X_m$ onto the linear span of the vectors $X_1,\dots,X_{m-1}$ (i.e. the length of the orthogonal component $X_m^{\perp}$ of $X_m$ relative to this span):
$$ V(X_1,\dots,X_m)=V(X_1,\dots,X_{m-1})\cdot |X_m^{\perp}|. $$
T
Theorem. The square of the volume of the parallelepiped built on the vectors $X_1,\dots,X_m$ coincides with the value of the Gram determinant of this same system of vectors:
$$ V^2(X_1,\dots,X_m)=\mathfrak G(X_1,\dots,X_m). $$
The proof follows from the representation of the length of the orthogonal component via Gram determinants (see the theorem and the corollary to it ☞ HERE).
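A check in $\mathbb R^3$ (a sketch; the edge vectors are my example), where the volume is also available directly as the absolute value of the $3\times 3$ determinant (the scalar triple product):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det(m):
    # determinant by cofactor expansion along the first row (small matrices only)
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

X = [[2, 0, 0], [1, 3, 0], [0, 1, 4]]          # edge vectors of a parallelepiped
volume = abs(det(X))                            # scalar triple product: 24
gramian = det([[dot(a, b) for b in X] for a in X])
print(volume ** 2 == gramian)                   # True: V^2 equals the Gramian
```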
=>
The absolute value of the determinant of a real matrix
$$ A=\left[a_{jk}\right]_{j,k=1}^{n} $$
equals the volume of the parallelepiped in the space $\mathbb R^n$ built on the vectors with the coordinates
$$ (a_{11},a_{12},\dots,a_{1n}),\ \dots,\ (a_{n1},a_{n2},\dots,a_{nn}) $$
(i.e. “built on the rows of the matrix”), and also equals the volume of the parallelepiped built on the vectors with the coordinates
$$ (a_{11},a_{21},\dots,a_{n1}),\ \dots,\ (a_{1n},a_{2n},\dots,a_{nn}) $$
(i.e. “built on the columns of the matrix”).
The proof essentially coincides with the proof of Hadamard's inequality:
$$ (\det A)^2=\mathfrak G(X_1,\dots,X_n)=V^2(X_1,\dots,X_n). $$
Linear Algebra and Analytical Geometry