Questions tagged [eigenvalues-eigenvectors]
Eigenvalues are numbers associated to a linear operator from a vector space $V$ to itself: $\lambda$ is an eigenvalue of $T\colon V\to V$ if the map $x\mapsto \lambda x-Tx$ is not injective. An eigenvector corresponding to $\lambda$ is a non-trivial solution to $\lambda x - Tx = 0$.
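A small worked illustration of the definition (added for concreteness): take $A = \pmatrix{2 & 1 \\ 1 & 2}$; then
$$A\pmatrix{1 \\ 1} = \pmatrix{3 \\ 3} = 3\pmatrix{1 \\ 1}, \qquad A\pmatrix{1 \\ -1} = \pmatrix{1 \\ -1},$$
so $3$ and $1$ are eigenvalues of $A$, with eigenvectors $\pmatrix{1 \\ 1}$ and $\pmatrix{1 \\ -1}$.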
14,653 questions
371 votes, 11 answers, 249k views
What is the importance of eigenvalues/eigenvectors?
272 votes, 6 answers, 376k views
Eigenvectors of real symmetric matrices are orthogonal [closed]
Can someone point me to a paper, or show here, why symmetric matrices have orthogonal eigenvectors? In particular, I'd like to see a proof that for a symmetric matrix $A$ there exists a decomposition $A = ...
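A sketch of the standard argument for distinct eigenvalues: if $A = A^T$, $Ax = \lambda x$, $Ay = \mu y$ and $\lambda \neq \mu$, then
$$\lambda\, x^T y = (Ax)^T y = x^T A^T y = x^T A y = \mu\, x^T y,$$
so $(\lambda - \mu)\, x^T y = 0$ and hence $x^T y = 0$.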
255 votes, 8 answers, 276k views
Proof that the trace of a matrix is the sum of its eigenvalues [closed]
I have looked extensively for a proof on the internet but all of them were too obscure. I would appreciate if someone could lay out a simple proof for this important result. Thank you.
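One common route (over $\mathbb{C}$, eigenvalues counted with multiplicity): compare the coefficient of $\lambda^{n-1}$ in the two expressions for the characteristic polynomial,
$$\det(\lambda I - A) = \lambda^n - \operatorname{tr}(A)\,\lambda^{n-1} + \cdots = \prod_{i=1}^n (\lambda - \lambda_i) = \lambda^n - \Big(\sum_{i=1}^n \lambda_i\Big)\lambda^{n-1} + \cdots,$$
which forces $\operatorname{tr}(A) = \sum_i \lambda_i$.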
220 votes, 8 answers, 131k views
Intuitively, what is the difference between Eigendecomposition and Singular Value Decomposition?
I'm trying to intuitively understand the difference between SVD and eigendecomposition.
From my understanding, eigendecomposition seeks to describe a linear transformation as a sequence of three basic ...
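A one-line connection between the two (for real $A$): if $A = U\Sigma V^T$ is an SVD, then
$$A^T A = V\,\Sigma^T\Sigma\, V^T,$$
so the right singular vectors are eigenvectors of $A^T A$ and $\sigma_i^2 = \lambda_i(A^T A)$. By contrast, an eigendecomposition $A = P D P^{-1}$ exists only for diagonalizable (in particular square) matrices, while every matrix has an SVD.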
206 votes, 7 answers, 164k views
How to intuitively understand eigenvalue and eigenvector?
I’m learning multivariate analysis, and I studied linear algebra for two semesters when I was a freshman.
Eigenvalues and eigenvectors are easy to calculate and the concept is not difficult to ...
195 votes, 7 answers, 239k views
What is the difference between "singular value" and "eigenvalue"?
I am trying to prove some statements about singular value decomposition, but I am not sure what the difference between singular value and eigenvalue is.
Is "singular value" just another name for ...
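A small example showing the two notions genuinely differ: the shear $A = \pmatrix{1 & 1 \\ 0 & 1}$ has both eigenvalues equal to $1$, but its singular values are the square roots of the eigenvalues of
$$A^T A = \pmatrix{1 & 1 \\ 1 & 2},$$
namely $\sigma_{1,2} = \sqrt{(3 \pm \sqrt{5})/2} \approx 1.618,\ 0.618$.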
167 votes, 3 answers, 147k views
Why is the eigenvector of a covariance matrix equal to a principal component?
If I have a covariance matrix for a data set and I multiply it by one of its eigenvectors, say the eigenvector with the highest eigenvalue, the result is the eigenvector or a scaled ...
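A sketch of why this happens: the first principal component is the unit vector $w$ maximizing the projected variance $w^T \Sigma w$; introducing a Lagrange multiplier for the constraint $w^T w = 1$ and setting the gradient to zero gives
$$\Sigma w = \lambda w,$$
so the maximizer is an eigenvector of the covariance matrix $\Sigma$, and the variance it attains is the corresponding eigenvalue.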
162 votes, 7 answers, 271k views
Show that the determinant of $A$ is equal to the product of its eigenvalues
Show that the determinant of a matrix $A$ is equal to the product of its eigenvalues $\lambda_i$.
So I'm having a tough time figuring this one out. I know that I have to work with the characteristic ...
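A compact argument over $\mathbb{C}$ (eigenvalues counted with multiplicity): the characteristic polynomial factors as
$$\det(A - \lambda I) = \prod_{i=1}^n (\lambda_i - \lambda),$$
and evaluating both sides at $\lambda = 0$ gives $\det(A) = \prod_i \lambda_i$.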
145 votes, 0 answers, 5k views
Probability for an $n\times n$ matrix to have only real eigenvalues
Let $A$ be an $n\times n$ random matrix where every entry is i.i.d. and uniformly distributed on $[0,1]$. What is the probability that $A$ has only real eigenvalues?
The answer cannot be $0$ or $1$, ...
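Although the exact probability is hard, it is easy to estimate numerically. A minimal Monte Carlo sketch (the function name, trial count, and tolerance below are arbitrary choices, not from the question):

    import numpy as np

    def estimate_real_eig_prob(n, trials=50_000, seed=0):
        """Monte Carlo estimate of P(all eigenvalues real) for an n x n
        matrix with i.i.d. Uniform[0,1] entries."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(trials):
            A = rng.uniform(0.0, 1.0, size=(n, n))
            eig = np.linalg.eigvals(A)
            # Count the sample as "all real" if every imaginary part is negligible.
            if np.all(np.abs(eig.imag) < 1e-9):
                hits += 1
        return hits / trials

    print(estimate_real_eig_prob(2))  # 2 x 2 case
    print(estimate_real_eig_prob(3))  # 3 x 3 case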
136 votes, 8 answers, 163k views
How to prove that eigenvectors from different eigenvalues are linearly independent [duplicate]
How can I prove that if I have $n$ eigenvectors from different eigenvalues, they are all linearly independent?
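The key step in the usual induction: suppose $c_1 v_1 + \cdots + c_k v_k = 0$ with $Av_i = \lambda_i v_i$ and the $\lambda_i$ distinct. Applying $A - \lambda_k I$ kills the last term and gives
$$\sum_{i=1}^{k-1} c_i(\lambda_i - \lambda_k)\, v_i = 0,$$
so if $v_1, \dots, v_{k-1}$ are already known to be independent, then each $c_i(\lambda_i - \lambda_k) = 0$, hence $c_i = 0$ for $i < k$, and finally $c_k v_k = 0$ forces $c_k = 0$.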
101 votes, 4 answers, 103k views
Are the eigenvalues of $AB$ equal to the eigenvalues of $BA$?
First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an ...
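A sketch of the standard argument for $\lambda \neq 0$: if $ABx = \lambda x$ with $x \neq 0$, put $y = Bx$; then $y \neq 0$ (otherwise $\lambda x = 0$) and
$$BAy = B(ABx) = \lambda Bx = \lambda y,$$
so $\lambda$ is also an eigenvalue of $BA$. For $\lambda = 0$, note that $\det(AB) = \det(A)\det(B) = \det(BA)$, so $0$ is an eigenvalue of one exactly when it is an eigenvalue of the other.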
98 votes, 2 answers, 135k views
What is the relation between rank of a matrix, its eigenvalues and eigenvectors
I am quite confused about this. I know that a zero eigenvalue means the null space has non-zero dimension, and that the matrix then does not have full rank. But is the number of distinct eigenvalues (...
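A useful special case and a cautionary example: if $A$ is diagonalizable, its rank equals the number of non-zero eigenvalues counted with multiplicity, since $\operatorname{rank}(PDP^{-1}) = \operatorname{rank}(D)$. In general this fails: the nilpotent matrix
$$N = \pmatrix{0 & 1 \\ 0 & 0}$$
has rank $1$, but its only eigenvalue is $0$.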
92 votes, 6 answers, 56k views
Geometric interpretation for complex eigenvectors of a 2×2 rotation matrix
The rotation matrix
$$\pmatrix{ \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta}$$
has complex eigenvalues $\{e^{\pm i\theta}\}$ corresponding to eigenvectors $\pmatrix{1 \\i}$ and $\...
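The stated eigenpair is easy to verify directly:
$$\pmatrix{ \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta}\pmatrix{1 \\ i} = \pmatrix{\cos\theta + i\sin\theta \\ -\sin\theta + i\cos\theta} = e^{i\theta}\pmatrix{1 \\ i},$$
since $-\sin\theta + i\cos\theta = i(\cos\theta + i\sin\theta) = i\,e^{i\theta}$.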
89 votes, 5 answers, 132k views
Are all eigenvectors, of any matrix, always orthogonal?
I have a very simple question that can be stated without any proof. Are all eigenvectors, of any matrix, always orthogonal? I am trying to understand principal components and it is crucial for me to ...
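A quick counterexample: $A = \pmatrix{1 & 1 \\ 0 & 2}$ has eigenvalues $1$ and $2$ with eigenvectors $\pmatrix{1 \\ 0}$ and $\pmatrix{1 \\ 1}$, which are not orthogonal. Orthogonality of eigenvectors for distinct eigenvalues is guaranteed for symmetric (more generally, normal) matrices, not for arbitrary ones.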
83 votes, 7 answers, 10k views
What exactly are eigen-things?
Wikipedia defines an eigenvector like this:
An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector that differs from the original vector at most ...