All posts (95)
[Linear Algebra] 11. Cramer's rule, explained geometrically
In this section, let's look at "Cramer's rule" geometrically. Cramer's rule is not the best way to compute solutions of systems of linear equations; Gaussian elimination will always be faster. But understanding Cramer's rule geometrically helps consolidate the relationship between determinants and systems of linear equations. Example In this setup, we define a system of linear equations with 1)..
2022.03.12
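The truncated excerpt above introduces Cramer's rule as a ratio of determinants. As a minimal sketch of the idea for the 2x2 case (function names `det2` and `cramer_2x2` are hypothetical, not from the post):

```python
# Cramer's rule for the 2x2 system  a*x + b*y = p,  c*x + d*y = q.
# Each unknown is a ratio of signed areas (determinants): replace one
# column of the coefficient matrix with the constants, then divide.

def det2(a, b, c, d):
    """Determinant of [[a, b], [c, d]]: the signed area spanned by its columns."""
    return a * d - b * c

def cramer_2x2(a, b, c, d, p, q):
    """Solve the system via Cramer's rule; fails when the determinant is zero."""
    D = det2(a, b, c, d)
    if D == 0:
        raise ValueError("singular system: determinant is zero")
    x = det2(p, b, q, d) / D   # constants replace the x-column
    y = det2(a, p, c, q) / D   # constants replace the y-column
    return x, y

# 2x + 1y = 5 and 1x + 3y = 10  ->  x = 1, y = 3
print(cramer_2x2(2, 1, 1, 3, 5, 10))  # (1.0, 3.0)
```

As the excerpt notes, Gaussian elimination is faster in practice; the value of this form is the geometric reading of each quotient.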
[Linear Algebra] 10. Cross products and duality
3-D cross product formula: $$ \mathbf{\vec{v}} = \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} ; \mathbf{\vec{w}} = \begin{bmatrix} w_1 \\ w_2 \\ w_3 \end{bmatrix} $$ $$ \mathbf{\vec{v}} \times \mathbf{\vec{w}} = \det \begin{pmatrix} \begin{bmatrix} \mathbf{\hat{i}} & v_1 & w_1 \\ \mathbf{\hat{j}} & v_2 & w_2 \\ \mathbf{\hat{k}} & v_3 & w_3 \\ \end{bmatrix} \end{pmatrix} = \mathbf{\hat{i}}(v_2..
2022.03.12
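The excerpt above gives the 3-D cross product as a symbolic determinant with $\mathbf{\hat{i}}, \mathbf{\hat{j}}, \mathbf{\hat{k}}$ in the first column. A small sketch of that cofactor expansion (the function name `cross` is hypothetical):

```python
def cross(v, w):
    """3-D cross product: expand the symbolic determinant
    det([[i, v1, w1], [j, v2, w2], [k, v3, w3]]) along its first column."""
    v1, v2, v3 = v
    w1, w2, w3 = w
    return (v2 * w3 - v3 * w2,   # i-hat cofactor
            v3 * w1 - v1 * w3,   # j-hat cofactor
            v1 * w2 - v2 * w1)   # k-hat cofactor

# i-hat x j-hat = k-hat
print(cross((1, 0, 0), (0, 1, 0)))  # (0, 0, 1)
```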
[Linear Algebra] 9. Cross products via transformations
2 dimensions The cross product of $\mathbf{\vec{v}}$ and $\mathbf{\vec{w}}$, written $\mathbf{\vec{v}} \times \mathbf{\vec{w}}$, is the area of the parallelogram defined by $\mathbf{\vec{v}}$ and $\mathbf{\vec{w}}$. The cross product also takes orientation into account. 1). If $\mathbf{\vec{v}}$ is on the right of $\mathbf{\vec{w}}$, then $\mathbf{\vec{v}} \times \mathbf{\vec{w}}$ is positive and equal to the..
2022.03.12
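The excerpt above describes the 2-D cross product as a signed parallelogram area. A minimal sketch of that signed-area determinant (the function name `cross_2d` is hypothetical):

```python
def cross_2d(v, w):
    """Signed area of the parallelogram spanned by v and w; positive when
    v is on the right of w, matching the i-hat / j-hat orientation."""
    return v[0] * w[1] - v[1] * w[0]

print(cross_2d((1, 0), (0, 1)))   # i-hat x j-hat = 1 (i-hat is right of j-hat)
print(cross_2d((0, 1), (1, 0)))   # swapping the order flips the sign: -1
```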
[Linear Algebra] 8. Dot products and duality
Standard way Numerically, if $\mathbf{\vec{v}}$ and $\mathbf{\vec{w}}$ have the same dimension, the dot product is computed by 1). pairing up their coordinates, 2). multiplying each pair together, and 3). adding the results: $$ \mathbf{\vec{v}} = \begin{bmatrix} a \\ b \\ c \end{bmatrix} \ \mathbf{\vec{w}} = \begin{bmatrix} d \\ e \\ f \end{bmatrix} $$ $$ \begin{bmatrix} a \\ b \\ c \end{bmatrix} \cdot \b..
2022.03.12
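The excerpt above lists the three numerical steps of the dot product. Those steps can be sketched directly (the function name `dot` is hypothetical):

```python
def dot(v, w):
    """Dot product: 1) pair coordinates, 2) multiply each pair, 3) sum."""
    if len(v) != len(w):
        raise ValueError("vectors must have the same dimension")
    return sum(a * b for a, b in zip(v, w))

print(dot((1, 2, 3), (4, 5, 6)))  # 1*4 + 2*5 + 3*6 = 32
```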
[Linear Algebra] 7. Nonsquare matrices as transformations between dimensions
In this section, let's think about non-square matrices geometrically. Non-square matrices are transformations that map vectors of one dimension to vectors of another dimension. 2-D (plane) to 3-D (space) If there is a transformation that takes $\mathbf{\hat{i}}$ to the coordinate $\begin{bmatrix} a \\ b \\ c \end{bmatrix}$ and $\mathbf{\hat{j}}$ to the coordinate $\begin{bmatrix} d \\ ..
2022.03.12
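The excerpt above describes a 3x2 matrix as a map from the plane into space, with columns recording where $\mathbf{\hat{i}}$ and $\mathbf{\hat{j}}$ land. A minimal sketch under that reading (the function name and the example columns are hypothetical):

```python
# A 3x2 matrix sends a 2-D vector (x, y) to x * col_i + y * col_j,
# where col_i and col_j are the 3-D images of i-hat and j-hat.

def apply_3x2(col_i, col_j, v):
    """Apply the 3x2 matrix whose columns are col_i and col_j to 2-D vector v."""
    x, y = v
    return tuple(x * ci + y * cj for ci, cj in zip(col_i, col_j))

# i-hat -> (1, 0, 1), j-hat -> (0, 1, 1): the plane lands on a plane in 3-D.
print(apply_3x2((1, 0, 1), (0, 1, 1), (2, 3)))  # 2*(1,0,1) + 3*(0,1,1) = (2, 3, 5)
```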
[Linear Algebra] 6. Inverse matrices, column space and null space
Let's think about the usefulness of linear algebra. One of the main reasons linear algebra is so broadly applicable is that it can solve systems of linear equations. "System of linear equations":$$\begin{matrix}ax + by + cz = l \\dx + ey + fz = m \\gx + hy + iz = n \\\end{matrix}$$ Let's package this system of linear equations into a single vector equation, which consists of 1). a matrix ($A$) which ..
2022.03.12
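The excerpt above packages a system of linear equations into a single vector equation $A\mathbf{\vec{x}} = \mathbf{\vec{v}}$: a coefficient matrix, an unknown vector, and a constant vector. A minimal sketch of that packaging (the function name `matvec` and the example numbers are hypothetical):

```python
# The system ax+by+cz=l, dx+ey+fz=m, gx+hy+iz=n becomes A x = v:
# A holds the coefficients, x the unknowns, v the constants.

def matvec(A, x):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, 0, 0],
     [0, 3, 0],
     [0, 0, 4]]
x = [1, 2, 3]
print(matvec(A, x))  # [2, 6, 12] -- the v that this x solves the system for
```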