[Linear Algebra] 8. Dot products and duality


Standard way

Numerically, if $\mathbf{\vec{v}}$ and $\mathbf{\vec{w}}$ have the same dimension, their dot product is computed by 1) pairing up their coordinates, 2) multiplying each pair together, and 3) adding up the results:

$$
\mathbf{\vec{v}} = \begin{bmatrix} a \\ b \\ c \end{bmatrix}
\  
\mathbf{\vec{w}} = \begin{bmatrix} d \\ e \\ f \end{bmatrix}
$$
$$
\begin{bmatrix}
a \\ b \\ c
\end{bmatrix}
\cdot
\begin{bmatrix}
d \\ e \\ f
\end{bmatrix}
=
a \cdot d + b \cdot e + c \cdot f
$$
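As a quick sketch in Python (the numbers here are arbitrary stand-ins for $a, b, c$ and $d, e, f$), the component-wise recipe looks like this:

```python
# Component-wise dot product: pair up coordinates, multiply each pair, add the results.
v = [1.0, 2.0, 3.0]   # stands in for (a, b, c); values chosen only for illustration
w = [4.0, 5.0, 6.0]   # stands in for (d, e, f)

dot = sum(v_i * w_i for v_i, w_i in zip(v, w))
print(dot)  # 1*4 + 2*5 + 3*6 = 32.0
```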

 

Geometrically, the dot product of two vectors $\mathbf{\vec{v}}$ and $\mathbf{\vec{w}}$ is computed by 1) projecting $\mathbf{\vec{w}}$ onto the line spanned by $\mathbf{\vec{v}}$ and 2) multiplying the length of this projection by the length of $\mathbf{\vec{v}}$. Moreover, if we instead project $\mathbf{\vec{v}}$ onto $\mathbf{\vec{w}}$, we get the same result.

 

So 1) when two vectors are generally pointing in the same direction, their dot product is positive; 2) when they're perpendicular, the projection of one onto the other is the zero vector, so the dot product is zero; and 3) when they're pointing in generally opposite directions, their dot product is negative.
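Here is a small NumPy sketch that checks both the projection description and the sign behavior; the vectors and the helper name `dot_via_projection` are my own illustrative choices, not anything fixed by the text above.

```python
import numpy as np

def dot_via_projection(v, w):
    # (Hypothetical helper) signed length of w projected onto v, times the length of v.
    v_hat = v / np.linalg.norm(v)        # unit vector along v
    signed_proj_len = w @ v_hat          # |w| * cos(theta), with sign
    return signed_proj_len * np.linalg.norm(v)

v = np.array([2.0, 1.0])
for w in (np.array([3.0, 2.0]),      # roughly the same direction  -> positive
          np.array([-1.0, 2.0]),     # perpendicular to v          -> zero
          np.array([-2.0, -1.5])):   # roughly opposite direction  -> negative
    print(np.dot(v, w), dot_via_projection(v, w))  # the two values agree in every case
```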


Relation between dot product and projection

Let's think about the "duality" between the dot product and linear transformations from N-D to 1-D.

Duality refers to situations where you have a natural but surprising correspondence between two types of mathematical objects.

 

Numerically, a linear transformation from N-D to 1-D and the dot product involve the same computation. This means there is a nice association between $1 \times N$ matrices and N-D vectors.

$$
\begin{bmatrix}
a & b & c & \cdots & z
\end{bmatrix}
\begin{bmatrix}
\alpha \\ \beta \\ \gamma \\ \vdots \\ \omega
\end{bmatrix}
=
\begin{bmatrix}
a \\ b \\ c \\ \vdots \\ z
\end{bmatrix}
\cdot
\begin{bmatrix}
\alpha \\ \beta \\ \gamma \\ \vdots \\ \omega
\end{bmatrix}
$$
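A short NumPy check of this numerical correspondence (the entries are arbitrary placeholders for $a, b, c, \dots$ and $\alpha, \beta, \gamma, \dots$):

```python
import numpy as np

row = np.array([[1.0, 2.0, 3.0]])     # a 1 x 3 matrix: a linear transformation from 3-D to 1-D
x   = np.array([4.0, 5.0, 6.0])       # a 3-D vector

as_transformation = row @ x           # matrix-vector product           -> array([32.])
as_dot_product    = np.dot(row[0], x) # dot product of two 3-D vectors  -> 32.0

print(as_transformation, as_dot_product)  # the same number, seen two ways
```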

 

This suggests an awesome geometric view: some kind of connection between linear transformations that take vectors to numbers and vectors themselves.

Example

This example illustrates the connection between transformations and vectors. 1) Draw a diagonal "number line" passing through the origin. 2) Define a 2-D unit vector $\mathbf{\hat{u}}$ that points in the same direction as the diagonal line. 3) Define the transformation that projects 2-D vectors straight onto this diagonal number line. This is a linear transformation that takes 2-D vectors to numbers (i.e., 1-D vectors).

 

This projection transformation can be described by a $1 \times 2$ matrix, by finding where $\mathbf{\hat{i}}$ and $\mathbf{\hat{j}}$ land. By symmetry, projecting $\mathbf{\hat{i}}$ onto the line of $\mathbf{\hat{u}}$ gives the same number as projecting $\mathbf{\hat{u}}$ onto the $x$-axis, so $\mathbf{\hat{i}}$ lands on the $x$-coordinate of $\mathbf{\hat{u}}$; likewise, $\mathbf{\hat{j}}$ lands on the $y$-coordinate of $\mathbf{\hat{u}}$. So the entries of the $1 \times 2$ matrix describing the projection transformation are the coordinates of $\mathbf{\hat{u}}$.
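Writing the coordinates of $\mathbf{\hat{u}}$ as $u_x$ and $u_y$ (notation introduced here just for this sketch), the projection transformation and its action on an arbitrary vector are:

$$
\begin{bmatrix}
u_x & u_y
\end{bmatrix}
\begin{bmatrix}
x \\ y
\end{bmatrix}
=
u_x \cdot x + u_y \cdot y
=
\mathbf{\hat{u}}
\cdot
\begin{bmatrix}
x \\ y
\end{bmatrix}
$$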

 

 

Unit vector

Computing this projection transformation for arbitrary vectors in space, which requires multiplying that matrix by those vectors, is computationally identical to taking a dot product with $\mathbf{\hat{u}}$. This is why taking the dot product with a unit vector $\mathbf{\hat{u}}$ can be interpreted as projecting a vector onto the span (i.e., the line) of $\mathbf{\hat{u}}$ and taking the length.
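A NumPy sketch of this claim; the angle $30^\circ$ and the test vector are arbitrary choices of mine, and the projection length is computed independently from the angle between the two directions:

```python
import numpy as np

theta = np.deg2rad(30)                              # direction of the diagonal line (arbitrary)
u_hat = np.array([np.cos(theta), np.sin(theta)])    # 2-D unit vector along that line

proj_matrix = u_hat.reshape(1, 2)   # 1 x 2 matrix whose entries are the coordinates of u_hat
w = np.array([3.0, 1.0])            # an arbitrary 2-D vector

via_matrix = (proj_matrix @ w)[0]   # applying the 1 x 2 projection transformation
via_dot    = np.dot(u_hat, w)       # taking the dot product with u_hat

# Independent geometric computation: |w| * cos(angle between w and the diagonal line)
angle_w = np.arctan2(w[1], w[0])
proj_length = np.linalg.norm(w) * np.cos(angle_w - theta)

print(via_matrix, via_dot, proj_length)   # all three agree (~3.098)
```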

 

Non-unit vector

How about non-unit vectors? Let's think about a non-unit vector $\mathbf{\vec{u}}$, obtained by scaling $\mathbf{\hat{u}}$ by a factor of $3$. Numerically, each component of $\mathbf{\vec{u}}$ is $3$ times bigger than the corresponding component of $\mathbf{\hat{u}}$, so the matrix associated with $\mathbf{\vec{u}}$ sends $\mathbf{\hat{i}}$ and $\mathbf{\hat{j}}$ to the coordinates of $3\mathbf{\hat{u}}$. This new matrix can therefore be interpreted as projecting a vector onto the diagonal line and then multiplying the length of the projection by $3$. This is why the dot product with a non-unit vector first projects the other vector onto its line and then scales the length of that projection by the length of the non-unit vector.
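And a quick check of the scaling claim, reusing the same (arbitrary) $\mathbf{\hat{u}}$ and $\mathbf{\vec{w}}$ as in the previous sketch:

```python
import numpy as np

theta = np.deg2rad(30)
u_hat = np.array([np.cos(theta), np.sin(theta)])   # unit vector along the diagonal line
u_vec = 3 * u_hat                                  # non-unit vector: u_hat scaled by 3
w = np.array([3.0, 1.0])

# Dot product with the scaled vector = 3 * (signed projection of w onto the line of u_hat)
print(np.dot(u_vec, w), 3 * np.dot(u_hat, w))      # both values match (~9.294)
```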


Summary

We can define a linear transformation that projects 2-D space onto a diagonal line. Because it is linear, it can be described by a $1 \times 2$ matrix. And since multiplying a $1 \times 2$ matrix by a 2-D vector is the same as taking the dot product of two 2-D vectors, this transformation is naturally related to a 2-D vector.

Insight

To sum up: 1) Whenever a linear transformation takes vectors to numbers, there is a unique vector $\mathbf{\vec{v}}$ corresponding to that transformation, in the sense that applying the transformation is the same thing as taking a dot product with that vector. 2) The dot product is a very useful geometric tool for understanding projections and for testing whether two vectors point in roughly the same direction. 3) From now on, we can treat a vector not only as an arrow in space but also as a certain transformation.