
Linear dependence of two vectors. Linearly dependent and linearly independent vectors

To check whether a system of vectors is linearly dependent, compose a linear combination of these vectors and check whether it can equal the zero vector while at least one coefficient is non-zero.

Case 1. The system of vectors is given by vectors

We make a linear combination

We have obtained a homogeneous system of equations. It has a non-zero solution if and only if its determinant equals zero. Let us form the determinant and compute its value.

The determinant is zero, therefore, the vectors are linearly dependent.
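The concrete vectors of Case 1 are not reproduced above; as a hedged sketch with hypothetical vectors, the determinant test looks like this in Python (NumPy assumed available):

```python
import numpy as np

# Hypothetical example: three vectors whose determinant vanishes, so the
# homogeneous system c1*v1 + c2*v2 + c3*v3 = 0 has a non-trivial solution
# and the vectors are linearly dependent.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([2.0, 4.0, 6.0])   # v2 = 2*v1, an obvious dependence
v3 = np.array([0.0, 1.0, 1.0])

M = np.column_stack([v1, v2, v3])  # columns are the vectors
det = np.linalg.det(M)
print(det)                          # zero (up to rounding) => dependent
```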

Case 2. The system of vectors is given by analytic functions:

a)
; if the identity holds, then the system is linearly dependent.

Let's make a linear combination.

It is necessary to check whether there exist a, b, c (at least one of which is non-zero) for which the given expression equals zero.

We write the hyperbolic functions

,
, then

then the linear combination of vectors will take the form:

where
; take, for example, ; then the linear combination equals zero and, therefore, the system is linearly dependent.

Answer: The system is linearly dependent.
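The source formulas are lost here, but a classical instance of this kind of dependence uses the identity e^x = cosh x + sinh x; as a hypothetical numeric sketch:

```python
import math

# e^x = cosh(x) + sinh(x), so the combination 1*cosh + 1*sinh + (-1)*exp
# vanishes identically: a non-trivial combination, hence dependence.
a, b, c = 1.0, 1.0, -1.0
for x in [-2.0, -0.5, 0.0, 1.3, 4.0]:
    value = a * math.cosh(x) + b * math.sinh(x) + c * math.exp(x)
    assert abs(value) < 1e-9
print("non-trivial combination vanishes for all sampled x => dependent")
```

Sampling points is only evidence, not a proof; the identity itself supplies the proof.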

b)
, we compose a linear combination

The linear combination of the vectors must be zero for all values of x.

Let's check for special cases.

A linear combination of vectors is zero only if all coefficients are zero.

Therefore, the system is linearly independent.

Answer: The system is linearly independent.

5.3. Find some basis and determine the dimension of the linear space of solutions.

Let us form the augmented matrix and reduce it to trapezoidal (echelon) form using the Gauss method.

To obtain a basis, we assign arbitrary values to the free variables:

and compute the remaining coordinates

Answer:
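The original system is not reproduced above; a minimal sketch with a hypothetical single equation shows how the dimension and a basis of the solution space can be computed (here via the null space of the coefficient matrix):

```python
import numpy as np

# Hypothetical stand-in system: the single equation x1 + x2 + x3 = 0.
A = np.array([[1.0, 1.0, 1.0]])

_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:]          # rows spanning the solution (null) space
print(null_basis.shape[0])      # dimension of the solution space: n - rank = 2
```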

5.4. Find the coordinates of the vector X in the basis, if it is given in the basis.

Finding the coordinates of the vector in the new basis is reduced to solving the system of equations

Method 1. Using the transition matrix

Compose the transition matrix

Let's find the vector in the new basis by the formula

Find the inverse matrix and do the multiplication

,

Method 2. Setting up a system of equations.

Compose the basis vectors from the coefficients of the basis

,
,

The equation for the vector in the new basis has the form

, where d is the given vector x.

The resulting system can be solved by any method; the answer will be the same.

Answer: vector in new basis
.
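With the concrete bases lost, Method 1 can be sketched with a hypothetical new basis: the transition matrix T has the new basis vectors as its columns, and the new coordinates are T⁻¹x:

```python
import numpy as np

# Hypothetical new basis vectors, written in old-basis coordinates.
e1 = np.array([1.0, 1.0, 0.0])
e2 = np.array([0.0, 1.0, 1.0])
e3 = np.array([1.0, 0.0, 1.0])
T = np.column_stack([e1, e2, e3])    # transition matrix

x = np.array([4.0, 3.0, 5.0])        # the vector in the old basis
x_new = np.linalg.inv(T) @ x         # coordinates in the new basis
print(x_new)                         # [1. 2. 3.] (up to rounding)
```

The check T @ x_new == x confirms the answer, mirroring Method 2.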

5.5. Let x = (x1, x2, x3). Are the following transformations linear?

Let us compose matrices of linear operators from the coefficients of given vectors.



Let us check the property of linear operations for each matrix of a linear operator.

The left-hand side is found by multiplying the matrix A by the vector

We find the right side by multiplying the given vector by a scalar
.

We see that
so the transformation is not linear.

Let's check other vectors.

, the transformation is not linear.

, the transformation is linear.

Answer: Ax is not a linear transformation, Bx is not linear, Cx is linear.

Note. You can complete this task much more easily by looking carefully at the given vectors. In Ax we see terms that do not contain elements of x, which could not be obtained as a result of a linear operation. In Bx there is an element of x raised to the third power, which also could not be obtained by multiplying a matrix by the vector x.
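The remark above suggests a quick numerical sanity check: test additivity and homogeneity on random vectors. The transformations A, B, C below are hypothetical stand-ins in the spirit of problem 5.5 (a constant term, a cube, and a genuinely linear map):

```python
import numpy as np

rng = np.random.default_rng(0)

def is_linear(T, n, trials=20, tol=1e-9):
    """Numerically test T(u+v) = T(u)+T(v) and T(lam*u) = lam*T(u).
    Passing is evidence of linearity, not a proof."""
    for _ in range(trials):
        u, v = rng.standard_normal(n), rng.standard_normal(n)
        lam = rng.standard_normal()
        if not np.allclose(T(u + v), T(u) + T(v), atol=tol):
            return False
        if not np.allclose(T(lam * u), lam * T(u), atol=tol):
            return False
    return True

A = lambda x: np.array([x[0] + 1.0, x[1], x[2]])       # constant term -> not linear
B = lambda x: np.array([x[0] ** 3, x[1], x[2]])        # a cube        -> not linear
C = lambda x: np.array([x[1], x[0] + x[2], 2 * x[2]])  # linear

print(is_linear(A, 3), is_linear(B, 3), is_linear(C, 3))  # False False True
```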

5.6. Given x = {x1, x2, x3}, Ax = {x2 x3, x1, x1 + x3}, Bx = {x2, 2x3, x1}. Perform the given operation: (A(BA))x.

Let us write out the matrices of linear operators.


Let's perform an operation on matrices

When multiplying the resulting matrix by X, we get

Answer:
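As a sketch of the computation, assuming the first component of Ax is x2 − x3 (the operator in the source is partly illegible), the matrices and the product A(BA) can be formed as follows:

```python
import numpy as np

# Assumption: Ax = (x2 - x3, x1, x1 + x3); Bx = (x2, 2*x3, x1) as given.
A = np.array([[0, 1, -1],
              [1, 0,  0],
              [1, 0,  1]])
B = np.array([[0, 1, 0],
              [0, 0, 2],
              [1, 0, 0]])

M = A @ (B @ A)            # the matrix of the operator A(BA)
x = np.array([1, 2, 3])
print(M @ x)               # [9 1 0]: A(BA) applied to a sample vector
```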

Task 1. Find out whether the system of vectors is linearly independent. The system of vectors is defined by the matrix of the system, whose columns consist of the coordinates of the vectors.

.

Solution. Suppose the linear combination equals zero. Writing this equality in coordinates, we obtain the following system of equations:

.

Such a system of equations is called triangular. It has only the trivial solution . Hence the vectors are linearly independent.

Task 2. Find out whether the system of vectors is linearly independent.

.

Solution. The vectors are linearly independent (see Task 1). Let us prove that the vector is a linear combination of the vectors . The expansion coefficients of the vector are determined from the system of equations

.

This system, being triangular, has a unique solution.

Therefore, the system of vectors is linearly dependent.

Comment. Matrices such as the one in Task 1 are called triangular, and the one in Task 2 stepped-triangular. The question of the linear dependence of a system of vectors is easily settled if the matrix composed of the coordinates of these vectors is stepped-triangular. If the matrix has no special form, then elementary row transformations, which preserve the linear relations between the columns, can be used to reduce it to stepped-triangular form.

Elementary row transformations of a matrix (EPS) are the following operations on the matrix:

1) swapping rows;

2) multiplying a row by a non-zero number;

3) adding to a row another row multiplied by an arbitrary number.
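The three operations can be sketched on a small matrix; note that the rank (and with it the linear relations between columns) is preserved:

```python
import numpy as np

A = np.array([[2.0, 1.0, 3.0],
              [4.0, 2.0, 6.0],
              [1.0, 0.0, 1.0]])

B = A.copy()
B[[0, 2]] = B[[2, 0]]        # 1) swap two rows
B[1] = 0.5 * B[1]            # 2) multiply a row by a non-zero number
B[2] = B[2] - 2.0 * B[0]     # 3) add a multiple of another row

# EPS preserve the linear relations between columns, hence the rank.
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(B))  # 2 2
```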

Task 3. Find the maximum linearly independent subsystem and calculate the rank of the system of vectors

.

Solution. Let us reduce the matrix of the system to stepped-triangular form using EPS. To explain the procedure, the row number of the matrix being transformed will be denoted by the symbol . The column after the arrow shows the actions performed on the rows of the matrix to obtain the rows of the new matrix.


.

Obviously, the first two columns of the resulting matrix are linearly independent, the third column is their linear combination, and the fourth does not depend on the first two. The vectors are called basis vectors. They form a maximal linearly independent subsystem of the system , and the rank of the system is three.



Basis, coordinates

Task 4. Find a basis and the coordinates of the vectors in this basis on the set of geometric vectors whose coordinates satisfy the condition .

Solution. The set is a plane passing through the origin. An arbitrary basis on the plane consists of two non-collinear vectors. The coordinates of the vectors in the chosen basis are determined by solving the corresponding system of linear equations.

There is another way to solve this problem, in which the basis is found from the coordinates.

The coordinates of the space are not coordinates on the plane , since they are related by the relation , that is, they are not independent. The independent variables and (they are called free) uniquely determine a vector on the plane and, therefore, can be chosen as coordinates in . Then the basis consists of the vectors lying in and corresponding to the sets of free variables and , i.e. .

Task 5. Find a basis and the coordinates of the vectors in this basis on the set of all vectors of the space whose odd-numbered coordinates are equal to each other.

Solution. As in the previous task, we choose coordinates in the space .

Since , the free variables uniquely determine a vector from and, therefore, are coordinates. The corresponding basis consists of the vectors .

Task 6. Find a basis and the coordinates of the vectors in this basis on the set of all matrices of the form , where are arbitrary numbers.

Solution. Each matrix from can be uniquely represented in the form:

This relation is the expansion of the vector from in terms of the basis
with coordinates .

Task 7. Find the dimension and basis of the linear span of a system of vectors

.

Solution. Using EPS, we transform the matrix of coordinates of the system vectors to stepped-triangular form.




.

The columns of the last matrix are linearly independent, and the remaining columns are expressed linearly through them. Hence the vectors form a basis of , and .

Comment. The basis of is not chosen uniquely. For example, the vectors also form a basis of .

Example 1

Find out whether this system of vectors is linearly dependent or linearly independent:

a 1 = { 3, 5, 1, 4 }, a 2 = { –2, 1, –5, –7 }, a 3 = { –1, –2, 0, –1 }.

Solution. We look for the general solution of the system of equations

a 1 x 1 + a 2 x 2 + a 3 x 3 = Θ

by the Gaussian method. To do this, we write this homogeneous system in coordinates:

The matrix of the system:

The reduced system has the form (r A = 2, n = 3). The system is consistent and indeterminate. Its general solution (x 2 is a free variable): x 3 = 13x 2 ; 3x 1 – 2x 2 – 13x 2 = 0 => x 1 = 5x 2 => X o = . The existence of a non-zero particular solution, for example , shows that the vectors a 1 , a 2 , a 3 are linearly dependent.
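The particular solution can be verified directly: with x1 = 5, x2 = 1, x3 = 13 the combination of a1, a2, a3 is the zero vector:

```python
import numpy as np

a1 = np.array([3, 5, 1, 4])
a2 = np.array([-2, 1, -5, -7])
a3 = np.array([-1, -2, 0, -1])

# Particular solution taken from the general solution with x2 = 1.
x1, x2, x3 = 5, 1, 13
combo = x1 * a1 + x2 * a2 + x3 * a3
print(combo)   # [0 0 0 0]: a non-trivial vanishing combination => dependent
```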

Example 2

Find out whether this system of vectors is linearly dependent or linearly independent:

1. a 1 = { -20, -15, - 4 }, a 2 = { –7, -2, -4 }, a 3 = { 3, –1, –2 }.

Solution. Consider the homogeneous system of equations a 1 x 1 + a 2 x 2 + a 3 x 3 = Θ

or expanded (by coordinates)

The system is homogeneous. If it is non-degenerate, it has a unique solution. For a homogeneous system this is the zero (trivial) solution, and hence the system of vectors is independent. If the system is degenerate, then it has non-zero solutions and the vectors are therefore dependent.

Checking the system for degeneracy:

= –80 + 180 – 28 – 24 + 80 + 210 = 338 ≠ 0.

The system is non-degenerate and, therefore, the vectors a 1 , a 2 , a 3 are linearly independent.
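The degeneracy check can be reproduced numerically; here the vectors are placed as columns, and only whether the determinant vanishes matters:

```python
import numpy as np

a1 = np.array([-20, -15, -4])
a2 = np.array([-7, -2, -4])
a3 = np.array([3, -1, -2])

det = np.linalg.det(np.column_stack([a1, a2, a3]))
print(abs(det) > 1e-9)   # True: non-degenerate, so the vectors are independent
```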

Tasks. Find out whether the given system of vectors is linearly dependent or linearly independent:

1. a 1 = { -4, 2, 8 }, a 2 = { 14, -7, -28 }.

2. a 1 = { 2, -1, 3, 5 }, a 2 = { 6, -3, 3, 15 }.

3. a 1 = { -7, 5, 19 }, a 2 = { -5, 7 , -7 }, a 3 = { -8, 7, 14 }.

4. a 1 = { 1, 2, -2 }, a 2 = { 0, -1, 4 }, a 3 = { 2, -3, 3 }.

5. a 1 = { 1, 8 , -1 }, a 2 = { -2, 3, 3 }, a 3 = { 4, -11, 9 }.

6. a 1 = { 1, 2 , 3 }, a 2 = { 2, -1 , 1 }, a 3 = { 1, 3, 4 }.

7. a 1 = {0, 1, 1 , 0}, a 2 = {1, 1 , 3, 1}, a 3 = {1, 3, 5, 1}, a 4 = {0, 1, 1, -2}.

8. a 1 = {-1, 7, 1 , -2}, a 2 = {2, 3 , 2, 1}, a 3 = {4, 4, 4, -3}, a 4 = {1, 6, -11, 1}.

9. Prove that a system of vectors will be linearly dependent if it contains:

a) two equal vectors;

b) two proportional vectors.

Vectors, their properties and actions with them

Vectors, actions with vectors, linear vector space.

A vector is an ordered collection of a finite number of real numbers.

Actions: 1. Multiplication of a vector by a number: λ · x = (λx 1, λx 2, …, λx n). For example, 3 · (3, 4, 0.07) = (9, 12, 0.21).

2. Addition of vectors (belonging to the same vector space): x + y = (x 1 + y 1, x 2 + y 2, …, x n + y n).

3. The zero vector 0 = (0, 0, …, 0) of the n-dimensional linear space E n satisfies x + 0 = x.
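The componentwise operations above can be sketched directly (the helper names scale and add are ours, not from the source):

```python
# Componentwise operations on tuples of numbers.
def scale(lam, x):
    return tuple(lam * xi for xi in x)

def add(x, y):
    return tuple(xi + yi for xi, yi in zip(x, y))

print(scale(3, (3, 4, 7)))        # (9, 12, 21)
print(add((1, 2, 3), (4, 5, 6)))  # (5, 7, 9)
```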

Theorem. In order for a system of n vectors in an n-dimensional linear space to be linearly dependent, it is necessary and sufficient that one of the vectors be a linear combination of the others.

Theorem. Any set of n + 1 vectors in an n-dimensional linear space is linearly dependent.

Addition of vectors, multiplication of vectors by numbers. Subtraction of vectors.

The sum of two vectors a and b is the vector directed from the beginning of a to the end of b, provided that the beginning of b coincides with the end of a. If the vectors are given by their expansions in basis vectors, then adding the vectors adds their corresponding coordinates.

Let us consider this using the example of a Cartesian coordinate system. Let

Let us show that

Figure 3 shows that

The sum of any finite number of vectors can be found using the polygon rule (Fig. 4): to construct the sum of a finite number of vectors, it is enough to place the beginning of each subsequent vector at the end of the previous one and construct a vector connecting the beginning of the first vector with the end of the last.

Properties of the vector addition operation:

In these expressions m, n are numbers.

The difference of two vectors is the vector . The second term is a vector opposite to the subtracted vector in direction but equal to it in length.

Thus, the vector subtraction operation is replaced by the addition operation

The vector whose beginning is at the origin of coordinates and whose end is at the point A (x1, y1, z1) is called the radius vector of the point A and is denoted or simply . Since its coordinates coincide with the coordinates of the point A, its expansion in terms of the vectors has the form

A vector starting at point A(x1, y1, z1) and ending at point B(x2, y2, z2) can be written as

where r 2 is the radius vector of the point B; r 1 - radius vector of point A.

Therefore, the expansion of the vector in terms of the unit vectors (orts) has the form

Its length is equal to the distance between points A and B
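A short sketch with hypothetical points A and B: the vector AB is r2 − r1, and its length is the distance between the points:

```python
import math

# Hypothetical points; AB = r2 - r1, |AB| = distance between A and B.
A = (1.0, 1.0, 1.0)
B = (4.0, 5.0, 1.0)

AB = tuple(b - a for a, b in zip(A, B))
length = math.sqrt(sum(c * c for c in AB))
print(AB, length)    # (3.0, 4.0, 0.0) 5.0
```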

MULTIPLICATION

So, in the case of a plane problem, the product of the vector a = (ax; ay) and a number b is found by the formula

a b = (ax b; ay b)

Example 1. Find the product of the vector a = (1; 2) by 3.

3 a = (3 1; 3 2) = (3; 6)

In the case of a spatial problem, the product of the vector a = (ax; ay; az) and the number b is found by the formula

a b = (ax b; ay b; az b)

Example 2. Find the product of the vector a = (1; 2; -5) by 2.

2 a = (2 1; 2 2; 2 (-5)) = (2; 4; -10)

The dot product of the vectors and , where is the angle between the vectors and ; if either , then

From the definition of the scalar product, it follows that

where, for example, is the value of the projection of the vector onto the direction of the vector .

Scalar square of a vector:

Dot product properties:

Dot product in coordinates

If a then

Angle between vectors

Angle between vectors - the angle between the directions of these vectors (smallest angle).
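The angle can be computed from the dot product formula cos φ = (a·b)/(|a||b|); a sketch with hypothetical vectors:

```python
import math

a = (1.0, 0.0, 0.0)
b = (1.0, 1.0, 0.0)

dot = sum(x * y for x, y in zip(a, b))
norm_a = math.sqrt(sum(x * x for x in a))
norm_b = math.sqrt(sum(x * x for x in b))
angle = math.acos(dot / (norm_a * norm_b))   # smallest angle between directions
print(math.degrees(angle))                   # approximately 45
```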

Vector product (the vector product of two vectors) is a pseudovector perpendicular to the plane spanned by the two factors; it is the result of the binary operation "vector multiplication" on vectors in three-dimensional Euclidean space. The product is neither commutative nor associative (it is anticommutative) and differs from the dot product of vectors. In many engineering and physics problems one needs to construct a vector perpendicular to two existing ones; the vector product provides this possibility. The cross product is also useful for "measuring" the perpendicularity of vectors: the length of the cross product of two vectors equals the product of their lengths if they are perpendicular, and decreases to zero if the vectors are parallel or antiparallel.

The vector product is defined only in three-dimensional and seven-dimensional spaces. The result of the vector product, like that of the scalar product, depends on the metric of the Euclidean space.

Unlike the formula for calculating the dot product from the coordinates of the vectors in a three-dimensional rectangular coordinate system, the formula for the cross product depends on the orientation of the rectangular coordinate system or, in other words, on its "chirality".
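A minimal NumPy sketch of the cross product and its perpendicularity property:

```python
import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])

c = np.cross(a, b)                   # perpendicular to both a and b
print(c)                             # [0. 0. 1.]
print(np.dot(c, a), np.dot(c, b))    # 0.0 0.0
```

Swapping the factors flips the sign: np.cross(b, a) gives the opposite vector (anticommutativity).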

Collinearity of vectors.

Two non-zero vectors are called collinear if they lie on parallel lines or on the same line. A permitted but not recommended synonym is "parallel" vectors. Collinear vectors can point in the same direction ("co-directed") or in opposite directions (in the latter case they are sometimes called "anticollinear" or "antiparallel").

The mixed product of vectors (a, b, c) is the scalar product of the vector a with the vector product of the vectors b and c:

(a,b,c)=a ⋅(b×c)

It is sometimes called the triple scalar product of vectors, apparently because the result is a scalar (more precisely, a pseudoscalar).

Geometric sense: the modulus of the mixed product is numerically equal to the volume of the parallelepiped formed by the vectors (a, b, c).

Properties

The mixed product is skew-symmetric with respect to all its arguments: i.e., swapping any two factors changes the sign of the product. It follows that the mixed product in a right-handed Cartesian coordinate system (in an orthonormal basis) equals the determinant of the matrix composed of the vectors and :

The mixed product in a left-handed Cartesian coordinate system (in an orthonormal basis) equals the determinant of the matrix composed of the vectors and , taken with a minus sign:

In particular,

If any two vectors are parallel, then with any third vector they form a mixed product equal to zero.

If three vectors are linearly dependent (i.e., coplanar, lie in the same plane), then their mixed product is zero.

Geometric sense: the absolute value of the mixed product equals the volume of the parallelepiped (see figure) formed by the vectors and ; the sign depends on whether this triple of vectors is right-handed or left-handed.
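Both forms of the mixed product, a·(b×c) and the determinant, can be compared on a hypothetical axis-aligned parallelepiped with volume 2·3·4 = 24:

```python
import numpy as np

a = np.array([2.0, 0.0, 0.0])
b = np.array([0.0, 3.0, 0.0])
c = np.array([0.0, 0.0, 4.0])

mixed = np.dot(a, np.cross(b, c))              # (a, b, c) = a . (b x c)
vol_det = np.linalg.det(np.array([a, b, c]))   # determinant form
print(mixed)                                   # 24.0, the volume of the box
```

For a coplanar triple (e.g. replace c with a + b) both quantities become zero.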

Coplanarity of vectors.

Three or more vectors are called coplanar if, when reduced to a common origin, they lie in the same plane.

Coplanarity properties

If at least one of the three vectors is zero, then the three vectors are also considered coplanar.

A triple of vectors containing a pair of collinear vectors is coplanar.

The mixed product of coplanar vectors is zero. This is a criterion for the coplanarity of three vectors.

Coplanar vectors are linearly dependent. This is also a criterion for coplanarity.

In 3-dimensional space, 3 non-coplanar vectors form a basis

Linearly dependent and linearly independent vectors.

Linearly dependent and independent systems of vectors. Definition. A system of vectors is called linearly dependent if there exists at least one non-trivial linear combination of these vectors equal to the zero vector. Otherwise, i.e. if only the trivial linear combination of the given vectors equals the zero vector, the vectors are called linearly independent.

Theorem (linear dependence criterion). For a system of vectors in a linear space to be linearly dependent, it is necessary and sufficient that at least one of these vectors be a linear combination of the others.

1) If there is at least one zero vector among the vectors, then the entire system of vectors is linearly dependent.

Indeed, if, for example, , then, assuming , we have a nontrivial linear combination .▲

2) If some of the vectors form a linearly dependent system, then the entire system is linearly dependent.

Indeed, let the vectors , , be linearly dependent. Hence, there exists a non-trivial linear combination equal to the zero vector. But then, assuming , we also obtain a non-trivial linear combination equal to the zero vector.

2. Basis and dimension. Definition. A system of linearly independent vectors of a vector space is called a basis of this space if every vector of can be represented as a linear combination of the vectors of this system, i.e. for every vector there exist real numbers such that the equality holds. This equality is called the decomposition of the vector in the basis , and the numbers are called the coordinates of the vector relative to the basis (or in the basis) .

Theorem (on the uniqueness of the expansion in the basis). Every vector of the space can be expanded in the basis in a unique way, i.e. the coordinates of every vector in the basis are determined unambiguously.

A system of vectors is called linearly dependent if there exist numbers , among which at least one is non-zero, such that the equality holds.

If this equality holds only if all , then the system of vectors is called linearly independent.

Theorem. A system of vectors is linearly dependent if and only if at least one of its vectors is a linear combination of the others.

Example 1. The polynomial is a linear combination of the polynomials . The polynomials constitute a linearly independent system, since the polynomial .

Example 2. The system of matrices , , is linearly independent, since a linear combination equals the zero matrix only when , , . Example 3. Determine whether the vectors , , are linearly dependent.

Solution.

We compose a linear combination of these vectors and set it equal to zero: .

Equating the corresponding coordinates of the equal vectors, we obtain

Finally we get

and

The system has only the trivial solution, so the linear combination of these vectors equals zero only when all the coefficients are zero. Therefore, this system of vectors is linearly independent.

Example 4. The vectors are linearly independent. Will the following systems of vectors be linearly independent:

a).;

b).?

Solution.

a). Compose a linear combination and equate it to zero

Using the properties of operations with vectors in a linear space, we rewrite the last equality in the form

Since the vectors are linearly independent, the coefficients of must equal zero, i.e.

The resulting system of equations has a unique trivial solution .

Since equality (*) holds only for , the system is linearly independent;

b). We compose the equality (**)

Applying similar reasoning, we get

Solving the system of equations by the Gauss method, we obtain

or

The last system has an infinite set of solutions . Thus, there is a non-zero set of coefficients for which equality (**) holds. Therefore, the system of vectors is linearly dependent.

Example 5. The system of vectors is linearly independent, and the system of vectors is linearly dependent. Consider the equality (***)

In equality (***) . Indeed, for , the system would be linearly dependent.

From relation (***) we obtain or . Denote .

We obtain

Tasks for independent solution (in class)

1. A system containing a zero vector is linearly dependent.

2. A system consisting of a single vector a is linearly dependent if and only if a = 0.

3. A system consisting of two vectors is linearly dependent if and only if the vectors are proportional (that is, one of them is obtained from the other by multiplying by a number).

4. If a vector is added to a linearly dependent system, the result is a linearly dependent system.

5. If a vector is deleted from a linearly independent system, then the resulting system of vectors is linearly independent.

6. If a system S is linearly independent but becomes linearly dependent when a vector b is added, then the vector b is linearly expressed in terms of the vectors of the system S.

c). The system of matrices , , in the space of second-order matrices.

10. Let the system of vectors a, b, c of a vector space be linearly independent. Prove the linear independence of the following systems of vectors:

a).a+b, b, c.

b). a + λb, b, c, where λ is an arbitrary number.

c).a+b, a+c, b+c.

11. Let a, b, c be three vectors in the plane from which a triangle can be formed. Will these vectors be linearly dependent?

12. Given two vectors a1 = (1, 2, 3, 4), a2 = (0, 0, 0, 1). Choose two more 4-dimensional vectors a3 and a4 so that the system a1, a2, a3, a4 is linearly independent.
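One possible (not unique) choice for task 12 can be verified by checking that the rank of the stacked vectors is 4:

```python
import numpy as np

a1 = np.array([1, 2, 3, 4])
a2 = np.array([0, 0, 0, 1])

# A hypothetical choice of the two missing vectors.
a3 = np.array([0, 1, 0, 0])
a4 = np.array([0, 0, 1, 0])

rank = np.linalg.matrix_rank(np.array([a1, a2, a3, a4]))
print(rank)   # 4 => the system a1, a2, a3, a4 is linearly independent
```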