Eigenvectors are an essential concept in linear algebra, especially in the study of matrices. They are critical in various applications, including data analysis, image processing, and quantum physics. The eigenvectors of a matrix are also known by other names, such as latent vectors, characteristic vectors, or proper vectors.
Eigenvectors are special vectors, connected to matrices, with intriguing properties. An eigenvector of a square matrix is a nonzero vector that becomes a scalar multiple of itself when multiplied by that matrix.
Eigenvalues are the characteristic values that represent the factor by which eigenvectors are stretched along their direction. Multiplication by the matrix does not change the direction of an eigenvector unless the eigenvalue is negative, in which case the direction is simply reversed.
Given a square matrix A, an eigenvector v is a nonzero vector that satisfies the equation Av = λv, where v is the eigenvector of matrix A and λ is a scalar known as the eigenvalue.
To put it another way, multiplying a matrix by one of its eigenvectors yields a scaled copy of that vector.
Eigenvalues and eigenvectors are inseparably connected. Eigenvalues specify the scaling factor, whereas eigenvectors specify the direction that is preserved under matrix multiplication. The eigenvectors associated with a given eigenvalue, together with the zero vector, form a subspace known as the eigenspace.
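As a quick illustration of the Av = λv relationship (using an arbitrary example matrix, not one from a specific derivation), NumPy's `np.linalg.eig` returns the eigenvalues together with unit-length eigenvectors:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 2.0]])

# Columns of `eigenvectors` are the vectors v; `eigenvalues` holds the matching lambdas.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify A v = lambda * v for every eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
```

Note that `eig` normalises each eigenvector to unit length; any nonzero scalar multiple would work equally well.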
To find the eigenvectors of a matrix, follow these steps:
1. Solve det(A - λI) = 0 to determine the eigenvalues, where A is the matrix, λ is the eigenvalue, and I is the identity matrix.
2. For each eigenvalue λ, solve (A - λI)v = 0, where v is the eigenvector for that eigenvalue. This equation can also be written as Av = λv.
The equation of the eigenvectors of a matrix can be denoted as Av = λv and can be rearranged as (A - λI)v = 0. This equation implies that the matrix A - λI is singular, meaning it has a nontrivial null space. The eigenvectors are exactly the null-space vectors, excluding the zero vector, and the eigenvalue λ is the factor by which the linked eigenvector v is scaled.
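The steps above can be sketched in code for the 2 × 2 case. This is an illustrative implementation only (a production solver would use `np.linalg.eig` directly); it relies on the fact that the characteristic polynomial of a 2 × 2 matrix is λ² - tr(A)λ + det(A):

```python
import numpy as np

def eig_2x2(A):
    """Eigenpairs of a 2x2 matrix, following det(A - lambda*I) = 0 step by step."""
    # Step 1: characteristic polynomial lambda^2 - trace(A)*lambda + det(A) = 0.
    coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
    eigenvalues = np.roots(coeffs)

    # Step 2: for each eigenvalue, (A - lambda*I) is singular, so its null
    # space is nontrivial; the right singular vector for the smallest singular
    # value spans that null space and is therefore an eigenvector.
    eigenvectors = []
    for lam in eigenvalues:
        M = A - lam * np.eye(2)
        _, _, Vt = np.linalg.svd(M)
        eigenvectors.append(Vt[-1])
    return eigenvalues, eigenvectors

A = np.array([[3.0, 1.0], [2.0, 2.0]])
lams, vecs = eig_2x2(A)
for lam, v in zip(lams, vecs):
    assert np.allclose(A @ v, lam * v)
```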
In some circumstances, eigenvectors corresponding to distinct eigenvalues are orthogonal to each other. This is always the case for Hermitian or symmetric matrices. Orthogonal eigenvectors have many applications, including orthogonal diagonalisation, which simplifies matrix operations.
Orthogonal eigenvectors are special vectors connected with a matrix that have two important properties: they are eigenvectors of the matrix, and they are mutually perpendicular. This orthogonality allows complicated systems to be simplified and studied, with applications in domains such as linear algebra and data analysis.
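A quick numerical illustration of this orthogonality, using NumPy's `eigh` routine (designed specifically for symmetric/Hermitian matrices) on an arbitrary symmetric example:

```python
import numpy as np

S = np.array([[5.0, 2.0],
              [2.0, 3.0]])          # a symmetric matrix

eigenvalues, V = np.linalg.eigh(S)  # columns of V are orthonormal eigenvectors

# Eigenvectors of a symmetric matrix are mutually perpendicular,
# so V is an orthogonal matrix: V^T V = I.
assert np.allclose(V.T @ V, np.eye(2))

# And each column really is an eigenvector of S.
for lam, v in zip(eigenvalues, V.T):
    assert np.allclose(S @ v, lam * v)
```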
Example 1: Consider the matrix A = [[3, 1], [2, 2]].
Now find the eigenvectors and eigenvalues for A.
Solution: To find the eigenvectors and eigenvalues,
We need to solve the equation Av = λv,
Here, A is the matrix, v is the eigenvector, and λ is the eigenvalue.
Start by finding the eigenvalues.
Solve the equation det(A - λI) = 0,
Here I is the identity matrix.
A - λI = [[3, 1], [2, 2]] - [[λ, 0], [0, λ]] = [[3 - λ, 1], [2, 2 - λ]]
The determinant of A - λI is
(3 - λ)(2 - λ) - 2 = λ² - 5λ + 4 = (λ - 4)(λ - 1).
Setting it equal to zero, we have (λ - 4)(λ - 1) = 0.
Therefore, the eigenvalues will be λ = 4 and λ = 1.
Now, we substitute each eigenvalue into the equation (A - λI)v = 0.
For λ = 4:
A - 4I = [[3 - 4, 1], [2, 2 - 4]] = [[-1, 1], [2, -2]]
Let’s solve (A - 4I)v = 0 to find the eigenvector.
[[-1, 1], [2, -2]] [x, y] = [0, 0]
Now, from the equation, we get -x + y = 0 and 2x - 2y = 0.
Hence, by solving these equations, we get that the eigenvector corresponding to λ = 4 is
v₁ = [1, 1].
For λ = 1:
A - I = [[3 - 1, 1], [2, 2 - 1]] = [[2, 1], [2, 1]]
Solve (A - I)v = 0 to find the eigenvector.
[[2, 1], [2, 1]] [x, y] = [0, 0]
From the equation, we have 2x + y = 0.
After solving this equation, we get that the eigenvector corresponding to λ = 1 is
v₂ = [-1, 2].
Hence, the eigenvectors of the matrix A are:
v₁ = [1, 1] (corresponding to λ = 4) and
v₂ = [-1, 2] (corresponding to λ = 1).
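The hand computation above can be checked numerically; this is just a verification sketch, not part of the derivation:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [2.0, 2.0]])
v1 = np.array([1.0, 1.0])    # eigenvector found for lambda = 4
v2 = np.array([-1.0, 2.0])   # eigenvector found for lambda = 1

# A v1 = 4 v1 and A v2 = 1 * v2, exactly as derived above.
assert np.allclose(A @ v1, 4.0 * v1)
assert np.allclose(A @ v2, 1.0 * v2)
```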
Example 2: Let us consider the matrix B = [[5, 2], [2, 3]].
Find the eigenvectors and eigenvalues for B.
Solution: To determine the eigenvalues and eigenvectors,
We use the equation Bv = λv,
We start by finding the eigenvalues.
Now, we solve the equation det(B - λI) = 0, where I is the identity matrix.
B - λI = [[5, 2], [2, 3]] - [[λ, 0], [0, λ]] = [[5 - λ, 2], [2, 3 - λ]]
The determinant of B - λI is (5 - λ)(3 - λ) - 4 = λ² - 8λ + 11.
By setting this equal to zero and applying the quadratic formula, we get λ = 4 ± √5.
Both eigenvalues are real, as expected, since B is a symmetric matrix.
For λ = 4 + √5, solving (B - λI)v = 0 gives the eigenvector v₁ = [2, √5 - 1]; for λ = 4 - √5, it gives v₂ = [2, -1 - √5].
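Since B is symmetric, its eigenvalues are guaranteed to be real; a numerical check confirms that they are 4 ± √5:

```python
import numpy as np

B = np.array([[5.0, 2.0],
              [2.0, 3.0]])

# eigvalsh is specialised for symmetric matrices and returns the
# (necessarily real) eigenvalues in ascending order.
eigenvalues = np.linalg.eigvalsh(B)

expected = np.array([4.0 - np.sqrt(5.0), 4.0 + np.sqrt(5.0)])
assert np.allclose(eigenvalues, expected)
```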
Q1: Assume a 2 × 2 matrix A having eigenvalues λ₁ = 3 and λ₂ = -1.
Which one of the given vectors can be an eigenvector of A corresponding to the eigenvalue λ₁?
A) [1, 0]
B) [0, 1]
C) [3, 1]
D) [-1, 1]
Answer: A) [1, 0]
Explanation: To get the eigenvector corresponding to an eigenvalue,
we solve the equation Av = λv.
After substituting the given eigenvalue λ₁ = 3, we get Av = 3v, which can be rewritten as (A - 3I)v = 0.
Taking the simplest matrix with the given eigenvalues, the diagonal matrix A = [[3, 0], [0, -1]], we have:
A - 3I = [[3 - 3, 0], [0, -1 - 3]] = [[0, 0], [0, -4]].
The system (A - 3I)[x, y] = [0, 0] therefore reduces to the single equation -4y = 0.
Hence y = 0, while x is free; additionally, an eigenvector must be nonzero, indicating that the eigenvector's first element should not be zero.
The vector [1, 0] satisfies these conditions and is, therefore, an eigenvector corresponding to the eigenvalue λ₁.
Q2: Assume that a matrix A = [[6, -2], [2, 1]].
What would be the eigenvectors for A?
a) [1, 1] and [-1, 1]
b) [1, -1] and [1, 1]
c) [2, 1] and [1, 2]
d) [1, 0] and [0, 1]
Answer: c) [2, 1] and [1, 2]
Explanation: In order to find the eigenvectors, we solve the equation (A - λI)v = 0,
Here A is the matrix, λ is the eigenvalue, I is the identity matrix, and v is the eigenvector.
The eigenvalues of matrix A are 5 and 2.
After setting these eigenvalues into the equation,
we have two sets of eigenvectors:
[2, 1] for eigenvalue 5 and
[1, 2] for eigenvalue 2.
Q3: Let’s consider the matrix A = [[3, 1], [2, 2]].
Which one of the following vectors would be an eigenvector of A?
a) [1, 2]
b) [1, 1]
c) [3, 1]
d) [0, 1]
Answer: b) [1, 1]
Explanation: In order to find the eigenvectors of matrix A,
We solve (A - λI)v = 0,
Here A is the matrix, λ is the eigenvalue, I is the identity matrix, and v is the eigenvector.
The eigenvalues for matrix A are 4 and 1.
After setting these eigenvalues into the equation,
We get that [1, 1] is an eigenvector corresponding to eigenvalue 4.
Q1. Is it true that eigenvectors are always unique for a given matrix?
Ans. Eigenvectors are not unique. In a matrix, many eigenvectors may correspond to the same eigenvalue: any nonzero scalar multiple of an eigenvector is again an eigenvector, so the direction remains constant while the magnitude can vary.
Q2. Is it possible for a matrix to have complex eigenvalues and eigenvectors?
Ans. Yes, matrices can have complex eigenvalues and eigenvectors. This is common for non-symmetric real matrices, such as rotation matrices.
Q3. What role do Eigenvectors play in data analysis?
Ans. Eigenvectors are commonly utilised in data analysis techniques like Principal Component Analysis (PCA). By projecting high-dimensional data onto a lower-dimensional space defined by Eigenvectors, PCA decreases the dimensionality of the data.
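The PCA projection described above can be sketched in a few lines. This is a minimal illustrative sketch on randomly generated toy data; practical code would typically use a library implementation such as scikit-learn's `PCA`:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # toy data: 100 samples, 3 features

Xc = X - X.mean(axis=0)                # 1. centre the data
cov = (Xc.T @ Xc) / (len(Xc) - 1)      # 2. sample covariance matrix (symmetric)
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # 3. eigendecomposition

# 4. project onto the two eigenvectors with the largest eigenvalues
#    (eigh returns eigenvalues in ascending order, so take the last columns).
components = eigenvectors[:, -2:]
X_reduced = Xc @ components            # data expressed in a 2-D subspace
assert X_reduced.shape == (100, 2)
```

The eigenvectors of the covariance matrix point along the directions of greatest variance, and the corresponding eigenvalues measure how much variance each direction carries.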