
Convex Geometry

Csaba Vincze (2013)

University of Debrecen

Chapter 1. Elements


1.1 Linear Algebra

In what follows

E^n = \{\, (v^1, \ldots, v^n) \mid v^1, \ldots, v^n \in \mathbb{R} \,\}

(1.1)

is the standard real coordinate space of dimension n equipped with the canonical inner product

\langle v, w \rangle := v^1 w^1 + \cdots + v^n w^n,

(1.2)

where

v = (v^1, \ldots, v^n) \quad \text{and} \quad w = (w^1, \ldots, w^n).

The elements of the coordinate space are called both vectors and points; in general they are denoted by symbols of the Latin alphabet, and the context determines which terminology and notation apply. We speak about the norm

\| v \| := \sqrt{\langle v, v \rangle} = \sqrt{(v^1)^2 + \cdots + (v^n)^2}

(1.3)

of vectors but the distance

d(p, q) := \| p - q \| = \sqrt{(p^1 - q^1)^2 + \cdots + (p^n - q^n)^2}

(1.4)

between points.
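The norm (1.3) and the distance (1.4) are straightforward to evaluate numerically. The following minimal sketch is an added illustration (not part of the original text), assuming NumPy is available; the concrete vectors are made-up examples.

```python
# Illustrative sketch of (1.3) and (1.4); the vectors are made-up examples.
import numpy as np

v = np.array([3.0, 4.0])                 # a vector
p = np.array([1.0, 2.0])                 # two points
q = np.array([4.0, 6.0])

norm_v = np.sqrt(np.dot(v, v))           # ||v|| = sqrt(<v, v>) = 5.0
dist_pq = np.linalg.norm(p - q)          # d(p, q) = ||p - q|| = 5.0

print(norm_v, dist_pq)
```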

Mathematical objects labelled by indices will appear as v(1), ..., v(k) in text mode. The notation refers to the one-to-one correspondence between the set of indices and the set of objects labelled by them. Otherwise (in displayed mathematical formulas) we use

v_1, \ldots, v_k

(1.5)

as usual. Let v and w be non-zero vectors in the coordinate space of dimension n and consider the auxiliary function

f(t) := \langle v + t w, v + t w \rangle

(1.6)

as t runs through the set of real numbers. Using the basic properties of the inner product, it is easy to see that the function (1.6) is the quadratic polynomial

f(t) = \| w \|^2 t^2 + 2 \langle v, w \rangle t + \| v \|^2.

Since the inner product is positive definite, f(t) \geq 0 for every t, so the discriminant 4 \langle v, w \rangle^2 - 4 \| v \|^2 \| w \|^2 must be less than or equal to zero. This leads to the so-called Cauchy-Bunyakovsky-Schwarz inequality

| \langle v, w \rangle | \leq \| v \| \, \| w \|.

(1.7)
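A quick numerical check of (1.7) can be added as follows; this sketch is an illustration (not from the text), with randomly generated vectors as input.

```python
# Numerical check of the Cauchy-Bunyakovsky-Schwarz inequality (1.7)
# on randomly generated vectors; purely illustrative.
import numpy as np

rng = np.random.default_rng(42)
for _ in range(1000):
    v = rng.standard_normal(5)
    w = rng.standard_normal(5)
    # the small tolerance only guards against floating-point round-off
    assert abs(np.dot(v, w)) <= np.linalg.norm(v) * np.linalg.norm(w) + 1e-12
```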

The angle between non-zero vectors v and w can be defined as

\angle (v, w) := \arccos \frac{\langle v, w \rangle}{\| v \| \, \| w \|}

(1.8)

in the usual way. According to inequality (1.7), the absolute value of the ratio between the inner product and the product of the norms is at most one, so the arccosine is defined.
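Formula (1.8) can be implemented directly; in the following added sketch the helper function angle is hypothetical (not from the text), and the clipping step only guards against round-off.

```python
# Sketch of formula (1.8); "angle" is a hypothetical helper name.
import numpy as np

def angle(v, w):
    ratio = np.dot(v, w) / (np.linalg.norm(v) * np.linalg.norm(w))
    # By (1.7) the ratio lies in [-1, 1]; clipping guards against round-off.
    return np.arccos(np.clip(ratio, -1.0, 1.0))

print(np.degrees(angle(np.array([1.0, 0.0]), np.array([1.0, 1.0]))))  # ~45.0
```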

The system (1.5)

(i) generates the vector space if each vector w can be written as a linear combination

\mu^1 v_1 + \cdots + \mu^k v_k = w.

(ii) is linearly independent if

\lambda^1 v_1 + \cdots + \lambda^k v_k = 0

implies that all of the coefficients are zero: λ(1)= ... =λ(k)=0.

Otherwise it is linearly dependent. Geometrically, linear dependence means that there is a non-trivial closed polygonal chain whose sides are parallel to the vectors of the given system. Minimal generating systems (equivalently: maximal linearly independent systems) are called bases of the vector space. The common number of members of the minimal generating systems (maximal linearly independent systems) is the dimension of the space. With respect to a given basis, each vector has exactly one expression as a linear combination of its members; the coefficients are called coordinates (with respect to the given basis). The canonical basis consists of the vectors

e_i = (0, \ldots, 0, 1, 0, \ldots, 0),

(1.9)

where the number 1 stands at the i-th position and i = 1, 2, ..., n. Recall that a non-empty subset of the space is a linear subspace if it is closed under vector addition and scalar multiplication. In particular, the dimension of the subspace

L(v_1, \ldots, v_k)

(1.10)

consisting of all linear combinations of the vectors in the argument is called the rank of the system. It is clear that the rank is at most k. Suppose that the vectors

w_1 = w_1^1 v_1 + \cdots + w_1^n v_n,

w_2 = w_2^1 v_1 + \cdots + w_2^n v_n,

\vdots

w_k = w_k^1 v_1 + \cdots + w_k^n v_n

are given in terms of their coordinates with respect to a basis

v_1, \ldots, v_n.

(1.11)

To decide the linear dependence (or independence) we have the following standard methods (a numerical sketch follows the list):

(i) The vanishing of a linear combination of the vectors

w_1, \ldots, w_k

(1.12)

can be written as a system of linear equations

\begin{pmatrix} w_1^1 & w_2^1 & \cdots & w_k^1 \\ w_1^2 & w_2^2 & \cdots & w_k^2 \\ \vdots & \vdots & & \vdots \\ w_1^n & w_2^n & \cdots & w_k^n \end{pmatrix}_{n \times k} \begin{pmatrix} \lambda^1 \\ \lambda^2 \\ \vdots \\ \lambda^k \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}

for the unknown coefficients

\lambda^1, \ldots, \lambda^k.

(1.13)

(ii) If the coordinates of the given vectors form the rows (or columns) of a matrix, we can determine its rank, which is just the dimension of the generated subspace. If it is less than k, then the system is linearly dependent; otherwise (in case of rank k) the system is linearly independent.

(iii) In case of a square matrix we can calculate the determinant of the matrix to check linear dependence (the determinant vanishes) or independence (the determinant is different from zero). In particular, any linearly independent (or generating) system containing exactly n vectors forms a basis in the coordinate space of dimension n.
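The three methods can be tried out numerically. The sketch below is an added illustration; the coordinate columns of the matrix A are made-up examples, and the null space for method (i) is read off the singular value decomposition.

```python
# Illustration of methods (i)-(iii); the columns of A are the (made-up)
# coordinates of w(1), w(2), w(3) with respect to some basis.
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])          # third column = sum of the first two

# (i) Non-trivial solutions of A @ lam = 0: rows of vt belonging to
#     (numerically) zero singular values span the null space.
_, s, vt = np.linalg.svd(A)
null_vectors = vt[s < 1e-10]
print(null_vectors)                      # one non-trivial coefficient vector

# (ii) The rank equals the dimension of the generated subspace.
print(np.linalg.matrix_rank(A))          # 2 < 3, so the system is dependent

# (iii) For a square matrix the determinant decides: zero means dependence.
print(np.linalg.det(A))                  # approximately 0.0
```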