
I have a problem with the definition of orthogonal space. In Serge Lang's *Linear Algebra* it is defined as follows: let $V$ be a vector space with a scalar product. If $S$ is a subset of $V$, we denote by $S^\perp$ the set of all elements $w \in V$ which are perpendicular to all elements of $S$, i.e., $\langle w, v \rangle = 0$ for all $v \in S$. My problem is how to interpret this for the space of solutions of a linear system, since the order of the terms seems to matter. Let's recall:

Let $(a_{ij})$ be an $m \times n$ matrix over a field $K$. Let $A_1, \dots, A_m$ be its row vectors and $X = (x_1, \dots, x_n)^T$ as usual. The system of homogeneous linear equations

$$a_{11}x_1 + \cdots + a_{1n}x_n = 0$$ $$\vdots$$ $$a_{m1}x_1 + \cdots + a_{mn}x_n = 0$$

can also be written in an abbreviated form, namely $A_1X = 0, \dots, A_mX = 0$.

In this case, I cannot define

$$\text{space of solutions} = \{X \in K^n \mid \langle X, A_i \rangle = 0,\ i = 1, \dots, m\}$$

because if you multiply those elements in that order you get an $n \times n$ matrix and not zero. I'm following the definition, but I can't get the order right. It has to be $\langle A_i, X \rangle = A_iX = 0$, but with the definition the order is confusing.
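To make the shape issue concrete, here is a small numpy sketch contrasting the two orders of multiplication (an illustration with made-up numbers, not something from Lang's text):

```python
import numpy as np

A_i = np.array([[1.0, 2.0, 3.0]])     # one row vector A_i, shape (1, 3)
X = np.array([[3.0], [0.0], [-1.0]])  # a solution column vector, shape (3, 1)

print(A_i @ X)  # [[0.]] -- a 1x1 matrix, i.e. the scalar A_i X = 0
print(X @ A_i)  # shape (3, 3) -- the outer product, not a scalar at all
```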

  • Your question is confusing. Which elements do you claim get you an $n \times n$ matrix? Are you claiming that $A_1X$ should be $n \times n$? Are you claiming that $\langle X, A_i \rangle$ should be $n \times n$? (Commented Nov 26 at 22:17)
  • Regardless, the "answer" is that both $A_iX$ and $\langle A_i, X \rangle$ are the same computation, namely the dot product of $A_i$ with $X$, which is a scalar rather than an $n \times n$ matrix. (Commented Nov 26 at 22:19)
  • It's not confusing; that's why I say that $A$ is an $m \times n$ matrix and $X$ is an $n \times 1$ matrix with that notation. (Commented Nov 26 at 22:20)
  • @BenGrossmann The confusion is that William is interpreting $\langle v, w \rangle$ to mean that we should identify $v$ and $w$ with matrices and multiply them. Then, if $A$ is a row vector and $X$ is a column vector, William is expecting $\langle X, A \rangle$ to be an $n \times n$ matrix -- physicists would write $|X\rangle\langle A|$ for this operation, as opposed to the physicist's $\langle A|X \rangle$, which gives a number (read: a $1 \times 1$ matrix). Of course, we shouldn't be treating these as row/column matrices at all! (Commented Nov 26 at 22:28)
  • @Chris Aha, familiarity with the notation has caused me to completely ignore the order of arguments in the inner product when that was the entire point. Thanks for spelling it out. (Commented Nov 26 at 22:30)

1 Answer


The point is that in $V$ every vector is a column vector! So you shouldn't be thinking of your $A_i$ as row vectors at all.

One of the axioms for an inner product $\langle -, - \rangle$ is symmetry, which says that $\langle v, w \rangle = \langle w, v \rangle$! So whether you write $\langle A, X \rangle$ or $\langle X, A \rangle$ you're supposed to get the same answer, and for the standard inner product this is

$$ \Big \langle (x_1, \ldots, x_n)^T, (a_1, \ldots, a_n)^T \Big \rangle = \sum x_i a_i $$

which you can check is symmetric, as needed. Indeed, the dot product (read: the usual inner product) is often defined to be $\langle v, w \rangle = v^T w$! Note that we have to transpose $v$ to get a row vector here, since by default both $v$ and $w$ are assumed to be column vectors.
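As a quick sanity check, here is a small numpy sketch of both claims under the standard inner product on $\mathbb{R}^3$ (the specific vectors are just for illustration):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([4.0, 5.0, 6.0])

# Symmetry of the standard inner product: <v, w> == <w, v>.
assert np.dot(v, w) == np.dot(w, v)  # both equal 32.0

# With explicit column vectors, <v, w> = v^T w comes out as a 1x1 matrix.
vc, wc = v.reshape(-1, 1), w.reshape(-1, 1)
print(vc.T @ wc)  # [[32.]]
```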

Of course, this suggests that row and column vectors must enter the picture somewhere, so what's happening?

When you work with row vectors, you're secretly using an inner product! Given a vector space $V$ you can form its "linear dual" $V^*$, which is the space of linear maps $V \to k$. In the finite-dimensional case, you can think of elements of $V$ as "column vectors" and elements of $V^*$ as "row vectors", where $\varphi = (a_1, \ldots, a_n) \in V^*$ acts on $v = (v_1, \ldots, v_n)^T$ exactly as you describe -- $\sum a_i v_i$. But note that in order to write a vector $v \in V$ as a column vector $(v_1, \ldots, v_n)^T$ we have to fix a basis! And fixing a basis gives us access to the standard inner product, which makes that basis orthonormal.
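Here is a minimal sketch of that pairing, with a basis already fixed so that a functional becomes a row matrix (the specific numbers are hypothetical):

```python
import numpy as np

# In a fixed basis, phi in V* is a row matrix and v in V is a column matrix.
phi = np.array([[2.0, -1.0, 3.0]])   # shape (1, 3)
v = np.array([[1.0], [4.0], [0.0]])  # shape (3, 1)

# phi acts on v by matrix multiplication, giving sum(a_i * v_i) as a 1x1 result.
print(phi @ v)  # [[-2.]] == 2*1 + (-1)*4 + 3*0
```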

In general, an inner product $\langle -, - \rangle$ is exactly the structure that allows us to identify $V$ and $V^*$. That is, it allows us to identify "column vectors" and "row vectors" without picking a basis. Indeed, the Riesz Representation Theorem tells us that (once we fix an inner product!) for every $\varphi \in V^*$ there's an $a \in V$ so that $\varphi(-) = \langle a, - \rangle$. That is, so that for every $x \in V$ we have $\varphi(x) = \langle a, x \rangle$. If you like, this lets you identify the "row vector" $\varphi \in V^*$ (which naturally acts on $V$ by multiplication) with the "column vector" $a \in V$.
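In finite dimensions with the standard inner product, the representing vector is easy to find: its coordinates are the values of $\varphi$ on the (orthonormal) standard basis. A sketch, with a hypothetical functional:

```python
import numpy as np

# A hypothetical linear functional phi: R^3 -> R.
phi = lambda x: 2*x[0] - x[1] + 3*x[2]

# Riesz in coordinates: a_i = phi(e_i) for the standard basis e_1, ..., e_n.
a = np.array([phi(e) for e in np.eye(3)])  # a = [ 2., -1.,  3.]

x = np.array([1.0, 4.0, 0.0])
assert np.isclose(phi(x), np.dot(a, x))  # phi(x) == <a, x>
```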


I hope this helps ^_^

  • $\langle v, w \rangle = v^T w$ -- how is it possible to define it like this if the definition requires $v$ and $w$ to come from the same space? I mean, $v^T$ doesn't belong to the space of $w$. What am I missing? :( (Commented Nov 26 at 22:34)
  • Instead of thinking of any kind of matrix multiplication, it might be better to just think of the inner product by the formula $$\langle v, w \rangle = \sum v_i w_i.$$ Sure, this looks like a row-vector/column-vector multiplication, but it might be better to ignore that for now. As you grow mathematically you'll see how this can be related to dual spaces, the distinction between "row" and "column" vectors, etc. Indeed, a lot of professional mathematicians struggle with this exact concept when they learn about metrics. (Commented Nov 26 at 22:39)
  • Thanks a lot, this is a better way to see it. It helped a lot. Now I have another problem: if $v \in K^n$, with $K^n$ a vector space, then $v$ can be interpreted as a column vector. Does the set of elements $v^T$ belong to $K^n$? (Commented Nov 26 at 22:45)
  • No. In order to even say what "$v^T$" is, you need to choose a basis or an inner product, but once you've chosen such structure you should think of $v^T$ as an element of the dual vector space $V^*$. (Commented Nov 26 at 22:51)
