
Questions tagged [eigenvalues-eigenvectors]

Eigenvalues are numbers associated with a linear operator from a vector space $V$ to itself: $\lambda$ is an eigenvalue of $T\colon V\to V$ if the map $x\mapsto \lambda x-Tx$ is not injective. An eigenvector corresponding to $\lambda$ is a non-trivial solution to $\lambda x - Tx = 0$.
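The definition above can be checked numerically. A minimal NumPy sketch (the matrix and eigenvalue here are my own illustrative choices, not from the page): $\lambda$ is an eigenvalue exactly when $\lambda I - A$ is singular, i.e. the map $x \mapsto \lambda x - Ax$ is not injective.

```python
import numpy as np

# Illustrative 2x2 example: verify that lambda = 3 is an eigenvalue of A
# by checking that (lambda*I - A) is singular, and that the eigenvector
# is a non-trivial solution of lambda*x - A*x = 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam = 3.0
M = lam * np.eye(2) - A          # the map x -> lambda*x - A*x as a matrix
print(np.linalg.det(M))          # 0 up to rounding: M is not injective
x = np.array([1.0, 1.0])         # a non-trivial solution of M @ x = 0
print(M @ x)                     # [0. 0.]
```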

371 votes
11 answers
249k views

What is the importance of eigenvalues/eigenvectors?
asked by Ryan
272 votes
6 answers
376k views

Can someone point me to a paper, or show here, why symmetric matrices have orthogonal eigenvectors? In particular, I'd like to see a proof that for a symmetric matrix $A$ there exists a decomposition $A = ...
asked by Phonon
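The orthogonality this question asks about is easy to observe numerically. A hedged sketch (the random symmetric matrix is illustrative): `np.linalg.eigh` returns the eigenvectors of a real symmetric matrix as columns of an orthogonal matrix $Q$, giving the decomposition $A = Q\,\mathrm{diag}(w)\,Q^T$.

```python
import numpy as np

# For real symmetric A, eigh returns orthonormal eigenvectors as the
# columns of Q, so Q^T Q = I and A = Q diag(w) Q^T.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                             # symmetrize a random matrix
w, Q = np.linalg.eigh(A)
print(np.allclose(Q.T @ Q, np.eye(4)))        # True: columns orthonormal
print(np.allclose(Q @ np.diag(w) @ Q.T, A))   # True: the decomposition holds
```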
255 votes
8 answers
276k views

I have looked extensively for a proof on the internet but all of them were too obscure. I would appreciate if someone could lay out a simple proof for this important result. Thank you.
asked by JohnK
220 votes
8 answers
131k views

I'm trying to intuitively understand the difference between SVD and eigendecomposition. From my understanding, eigendecomposition seeks to describe a linear transformation as a sequence of three basic ...
asked by user541686
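A numerical sketch of the distinction this question raises (the example matrix is my own): an eigendecomposition $A = V\,\mathrm{diag}(w)\,V^{-1}$ needs a square, diagonalizable matrix and uses one basis, while the SVD $A = U\,\mathrm{diag}(s)\,V^T$ uses two orthonormal bases and exists for any matrix, even a rectangular one.

```python
import numpy as np

# A rectangular matrix has no eigendecomposition, but it always has an SVD.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(np.allclose(U @ np.diag(s) @ Vt, A))     # True: A = U diag(s) V^T
# The two notions are linked: the singular values of A are the square
# roots of the eigenvalues of the symmetric matrix A^T A.
w = np.linalg.eigvalsh(A.T @ A)
print(np.allclose(np.sort(s**2), np.sort(w)))  # True
```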
206 votes
7 answers
164k views

I’m learning multivariate analysis, and I took two semesters of linear algebra as a freshman. Eigenvalues and eigenvectors are easy to calculate and the concept is not difficult to ...
asked by Jill Clover
195 votes
7 answers
239k views

I am trying to prove some statements about singular value decomposition, but I am not sure what the difference between singular value and eigenvalue is. Is "singular value" just another name for ...
asked by Ramon
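The short answer to this question is no, they are not the same thing, and a small NumPy sketch makes the difference concrete (both matrices below are illustrative): singular values and eigenvalues can differ completely, though for a symmetric matrix the singular values are the absolute values of the eigenvalues.

```python
import numpy as np

# A nilpotent matrix: every eigenvalue is 0, yet a singular value is 2.
A = np.array([[0.0, 2.0],
              [0.0, 0.0]])
print(np.linalg.eigvals(A))                # [0. 0.]
print(np.linalg.svd(A, compute_uv=False))  # [2. 0.]

# A symmetric matrix with eigenvalues 3 and -1: the singular values
# are |3| and |-1|.
S = np.array([[1.0, 2.0],
              [2.0, 1.0]])
print(np.sort(np.linalg.svd(S, compute_uv=False)))  # [1. 3.]
```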
167 votes
3 answers
147k views

If I have a covariance matrix for a data set and I multiply it by one of its eigenvectors. Let's say the eigenvector with the highest eigenvalue. Is the result the eigenvector or a scaled ...
asked by Ryan
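The question answers itself by the definition $Cv = \lambda v$: the result is the same eigenvector, scaled by its eigenvalue. A quick NumPy sketch with made-up data:

```python
import numpy as np

# Multiplying a covariance matrix by one of its eigenvectors returns
# that eigenvector scaled by its eigenvalue: C v = lambda v.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 3))      # synthetic data set
C = np.cov(X, rowvar=False)            # 3x3 symmetric covariance matrix
w, V = np.linalg.eigh(C)               # eigenvalues in ascending order
v = V[:, -1]                           # eigenvector of the largest eigenvalue
print(np.allclose(C @ v, w[-1] * v))   # True: a scaled copy of v
```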
162 votes
7 answers
271k views

Show that the determinant of a matrix $A$ is equal to the product of its eigenvalues $\lambda_i$. So I'm having a tough time figuring this one out. I know that I have to work with the characteristic ...
asked by onimoni
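The identity being asked about, $\det(A) = \prod_i \lambda_i$, follows from evaluating the characteristic polynomial at zero; it is also easy to sanity-check numerically (the random matrix is illustrative). Note that for a real matrix the eigenvalues may be complex, but they come in conjugate pairs, so their product is real.

```python
import numpy as np

# Check det(A) == product of eigenvalues for a random real matrix.
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
w = np.linalg.eigvals(A)               # complex in general
prod = np.prod(w)
print(np.allclose(prod.real, np.linalg.det(A)))  # True
print(abs(prod.imag) < 1e-9)           # True: conjugate pairs cancel
```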
145 votes
0 answers
5k views

Let $A$ be an $n\times n$ random matrix where every entry is i.i.d. and uniformly distributed on $[0,1]$. What is the probability that $A$ has only real eigenvalues? The answer cannot be $0$ or $1$, ...
asked by Exodd
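A Monte Carlo experiment (my own sketch, a numerical estimate only and of course not a proof) supports the question's premise that the probability is strictly between 0 and 1:

```python
import numpy as np

# Estimate P(all eigenvalues real) for an n x n matrix with i.i.d.
# Uniform[0,1] entries by sampling and inspecting imaginary parts.
rng = np.random.default_rng(4)
n, trials = 3, 2000
hits = 0
for _ in range(trials):
    A = rng.uniform(0.0, 1.0, size=(n, n))
    w = np.linalg.eigvals(A)
    if np.all(np.abs(w.imag) < 1e-12):
        hits += 1
p_hat = hits / trials
print(0.0 < p_hat < 1.0)               # both outcomes occur in the sample
```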
136 votes
8 answers
163k views

How can I prove that if I have $n$ eigenvectors from different eigenvalues, they are all linearly independent?
asked by Corey L.
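The claim in this question can be observed numerically before proving it (the triangular test matrix is my own): stacking eigenvectors of distinct eigenvalues as columns yields a full-rank matrix.

```python
import numpy as np

# Upper-triangular matrix with diagonal 1, 2, 3: eigenvalues are distinct.
A = np.diag([1.0, 2.0, 3.0]) + np.triu(np.ones((3, 3)), k=1)
w, V = np.linalg.eig(A)
print(np.allclose(np.sort(w.real), [1.0, 2.0, 3.0]))  # True
print(np.linalg.matrix_rank(V))        # 3: eigenvectors are independent
```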
101 votes
4 answers
103k views

First of all, am I being crazy in thinking that if $\lambda$ is an eigenvalue of $AB$, where $A$ and $B$ are both $N \times N$ matrices (not necessarily invertible), then $\lambda$ is also an ...
asked by dantswain
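The questioner is not crazy: $AB$ and $BA$ do share eigenvalues for square $A$, $B$, even without invertibility. A quick numerical sketch (matrices are illustrative, with $A$ deliberately made singular):

```python
import numpy as np

# Compare the sorted spectra of AB and BA for square, non-invertible A.
rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
A[:, 0] = 0.0                          # force A to be singular
B = rng.standard_normal((4, 4))
wAB = np.sort_complex(np.linalg.eigvals(A @ B))
wBA = np.sort_complex(np.linalg.eigvals(B @ A))
print(np.allclose(wAB, wBA))           # True up to rounding
```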
98 votes
2 answers
135k views

I am quite confused about this. I know that a zero eigenvalue means the null space has nonzero dimension, and that the rank of the matrix is then less than full. But is the number of distinct eigenvalues (...
asked by Shifu
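A small sketch separating the quantities this question conflates (the diagonal matrix is my own example): for a diagonalizable matrix, the count of zero eigenvalues equals the nullity and rank $= n -$ nullity, while the number of *distinct* eigenvalues is something else entirely.

```python
import numpy as np

# Diagonal (hence diagonalizable) 4x4 matrix with eigenvalues 0, 0, 2, 2.
A = np.diag([0.0, 0.0, 2.0, 2.0])
w = np.linalg.eigvals(A)
n_zero = int(np.sum(np.abs(w) < 1e-12))
print(n_zero)                          # 2 zero eigenvalues = nullity
print(np.linalg.matrix_rank(A))        # 2 = 4 - nullity
print(np.unique(w).size)               # 2 distinct eigenvalues (0 and 2)
```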
92 votes
6 answers
56k views

The rotation matrix $$\pmatrix{ \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta}$$ has complex eigenvalues $\{e^{\pm i\theta}\}$ corresponding to eigenvectors $\pmatrix{1 \\i}$ and $\...
asked by Alf
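The eigenvalues and eigenvector stated in the question check out numerically; a sketch for $\theta = \pi/3$:

```python
import numpy as np

# The rotation-type matrix from the question, checked at theta = pi/3.
theta = np.pi / 3
R = np.array([[np.cos(theta),  np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])
w = np.sort_complex(np.linalg.eigvals(R))
expected = np.sort_complex(np.array([np.exp(1j * theta), np.exp(-1j * theta)]))
print(np.allclose(w, expected))        # True: eigenvalues are e^{+-i theta}
v = np.array([1.0, 1j])                # the eigenvector (1, i) from the question
print(np.allclose(R @ v, np.exp(1j * theta) * v))  # True
```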
89 votes
5 answers
132k views

I have a very simple question that can be stated without any proof. Are all eigenvectors, of any matrix, always orthogonal? I am trying to understand principal components and it is crucial for me to ...
asked by Bober02
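The answer to this question is no, and a two-line counterexample shows it (the shear-like matrix is my own illustration): eigenvectors are guaranteed orthogonal for symmetric (more generally, normal) matrices, but not in general.

```python
import numpy as np

# Non-symmetric matrix with eigenvalues 1 and 2 whose eigenvectors,
# (1, 0) and (1, 1)/sqrt(2), are not orthogonal.
A = np.array([[1.0, 1.0],
              [0.0, 2.0]])
w, V = np.linalg.eig(A)
v1, v2 = V[:, 0], V[:, 1]
print(abs(v1 @ v2) > 1e-6)             # True: NOT orthogonal
```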
83 votes
7 answers
10k views

Wikipedia defines an eigenvector like this: An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a vector that differs from the original vector at most ...
asked by Eigeneverything
