Dot Products vs Angles

In some generality:

Say $x = (x_1, x_2)$. We can associate the angle $\theta$ to $x$ where

  • $\cos(\theta) = x_1 / \sqrt{x_1^2 + x_2^2}$
  • $\sin(\theta) = x_2 / \sqrt{x_1^2 + x_2^2}$
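As a quick numerical check (a minimal sketch; the example vector is made up), we can recover $\theta$ with `atan2` and confirm both identities:

```python
import math

# An arbitrary example vector x = (x1, x2).
x1, x2 = 3.0, 4.0
r = math.hypot(x1, x2)  # sqrt(x1^2 + x2^2)

# The associated angle theta.
theta = math.atan2(x2, x1)

# cos(theta) = x1 / r and sin(theta) = x2 / r
assert math.isclose(math.cos(theta), x1 / r)
assert math.isclose(math.sin(theta), x2 / r)
```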

So any unit vector can be written as $a = (\cos(\alpha), \sin(\alpha))$ for some angle $\alpha$.

  • Let $u$ be a reference direction with $\Vert u \Vert = 1$.
  • Let $\beta$ be an orthonormal basis with $\beta_1 = u$.
  • For any $x$ we can write $x = \sum_k \alpha_k \beta_k$. Define $\pi_k(x) = \alpha_k$, where $x$ has the above expression in the basis $\beta$.
  • Then $\pi_1(x) = \alpha_1 = \left< x, u \right>$.
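Concretely in the plane (a sketch with arbitrary choices of $\alpha$ and $x$): take $\beta = \{u, u^\perp\}$ as the orthonormal basis, expand $x$ in it, and check that the coefficient of $u$ is exactly $\left< x, u \right>$:

```python
import math

alpha = 0.7                               # arbitrary angle for the reference direction
u  = (math.cos(alpha), math.sin(alpha))   # unit reference direction, beta_1 = u
up = (-math.sin(alpha), math.cos(alpha))  # u rotated by 90 degrees, beta_2

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1]

x = (2.0, -1.0)  # arbitrary test vector

# Coefficients of x in the orthonormal basis {u, up} are inner products:
a1, a2 = dot(x, u), dot(x, up)

# Reconstruction x = a1*u + a2*up holds, so pi_1(x) = a1 = <x, u>.
assert math.isclose(a1 * u[0] + a2 * up[0], x[0])
assert math.isclose(a1 * u[1] + a2 * up[1], x[1])
```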

In 2 dimensions:

Suppose two vectors $a, b$ are in $\mathbb{R}^2$. We want to compute the cosine of the angle between them.

It should be independent of lengths, so let $u = a / \Vert a \Vert$, $v = b / \Vert b \Vert$. Suppose $u = (\cos(\alpha), \sin(\alpha))$ and $v = (\cos(\beta), \sin(\beta))$.

By applying $R_\alpha^{-1}$ (rotation by $-\alpha$) to $v$ we get $v_1 \cos(\alpha) + v_2 \sin(\alpha) = u_1 v_1 + u_2 v_2$ as the first component. So $u \cdot v$ gives the component of $v$ in the direction of $u$. The expression is symmetric in $u$ and $v$, as desired. The dot product is also bilinear, so for the original vectors $a \cdot b = \Vert a \Vert \, \Vert b \Vert \, (u \cdot v)$.
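The rotation step can be verified numerically (a sketch; the angles are arbitrary). The first component of $R_\alpha^{-1} v$ equals $u \cdot v$, and both equal $\cos(\beta - \alpha)$:

```python
import math

alpha, beta = 0.4, 1.1  # arbitrary angles
u = (math.cos(alpha), math.sin(alpha))
v = (math.cos(beta), math.sin(beta))

# First component of R_alpha^{-1} v, i.e. v rotated by -alpha:
first = v[0] * math.cos(alpha) + v[1] * math.sin(alpha)

dot_uv = u[0] * v[0] + u[1] * v[1]

assert math.isclose(first, dot_uv)
# Both equal the cosine of the angle between u and v:
assert math.isclose(dot_uv, math.cos(beta - alpha))
```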

In the end all we did was extend $u$ to an orthonormal basis and find the component of $v$ in the direction of $u$. That component is independent of the particular orthonormal basis chosen, so it is well-defined. In particular, a Gram–Schmidt construction applied to $\{u, v\}$ gives the usual formulae

$$\begin{aligned} u_1 &= w_1 = u \\ w_2 &= v - \frac{\left< u, v \right>}{\left< u, u \right>} u = v - \left< u, v \right> u \\ u_2 &= w_2 / \Vert w_2 \Vert \end{aligned}$$
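The two-vector Gram–Schmidt step above can be sketched directly (assuming, as in the derivation, that $u$ is already a unit vector so the $\left< u, u \right>$ denominator drops out; the vectors are made up):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

u = (0.6, 0.8)  # must be a unit vector
v = (1.0, 2.0)  # any vector not parallel to u

# w2 = v - <u, v> u   (the <u, u> denominator is 1 since ||u|| = 1)
c = dot(u, v)
w2 = tuple(vi - c * ui for vi, ui in zip(v, u))
u2 = tuple(wi / norm(w2) for wi in w2)

# {u, u2} is orthonormal:
assert math.isclose(dot(u, u2), 0.0, abs_tol=1e-12)
assert math.isclose(norm(u2), 1.0)
```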

In general, if we have a subspace $W$ of an inner product space $V$, and the columns of the matrix $A$ span $W$, then we can construct the component $x_W$ of a vector $x \in V$ which lies in $W$ as:

$$\begin{aligned} x &= x_W + x_{W^\perp} \\ x_W &= A A^{+} x \\ x_{W^\perp} &= (I - A A^{+}) x \end{aligned}$$

where $A^+$ is the Moore–Penrose inverse of $A$ and the square matrix $A A^+$ is the orthogonal projector to the range space of $A$.
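When the columns of $A$ are independent the pseudoinverse reduces to $A^+ = (A^\top A)^{-1} A^\top$, so the projector can be sketched in plain Python for a two-column $A$ (the matrix and vector below are arbitrary; a real implementation would use a library pseudoinverse such as `numpy.linalg.pinv`, which also handles dependent columns):

```python
import math

# Columns a1, a2 of A span a 2-dimensional subspace W of R^3.
a1 = [1.0, 0.0, 1.0]
a2 = [0.0, 1.0, 1.0]
x  = [3.0, 1.0, 2.0]

def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

# Normal equations: solve (A^T A) c = A^T x for the coefficients c,
# then x_W = A c = A (A^T A)^{-1} A^T x = A A^+ x.
g11, g12, g22 = dot(a1, a1), dot(a1, a2), dot(a2, a2)  # Gram matrix A^T A
b1, b2 = dot(a1, x), dot(a2, x)                        # A^T x
det = g11 * g22 - g12 * g12
c1 = (g22 * b1 - g12 * b2) / det
c2 = (g11 * b2 - g12 * b1) / det

x_W = [c1 * a1[i] + c2 * a2[i] for i in range(3)]
x_perp = [x[i] - x_W[i] for i in range(3)]      # (I - A A^+) x

# x_perp lies in W^perp: it is orthogonal to both columns of A.
assert math.isclose(dot(x_perp, a1), 0.0, abs_tol=1e-12)
assert math.isclose(dot(x_perp, a2), 0.0, abs_tol=1e-12)
```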