Multiple regression
We're going to solve the example from the Wikipedia article using Clifford, a geometric algebra module. Optimization for large vector spaces does not quite work yet, so it will take (a lot of) time and a fair amount of memory, but it should work.
Let's create four vectors containing our input data:
$$\begin{aligned}
\mathbf{w}   &= w^k \mathbf{e}_k \\
\mathbf{h_0} &= (h^k)^0 \mathbf{e}_k \\
\mathbf{h_1} &= (h^k)^1 \mathbf{e}_k \\
\mathbf{h_2} &= (h^k)^2 \mathbf{e}_k
\end{aligned}$$
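With the Clifford module, those vectors can be built along the following lines. This is a sketch, not the definitive program: the exported basis array @e is taken from the module's documentation and should be treated as an assumption, and the data are the height/weight pairs from the Wikipedia example.

use Clifford;

# Example data from the Wikipedia article: heights in metres, weights in kilograms
my @height = <1.47 1.50 1.52 1.55 1.57 1.60 1.63 1.65 1.68 1.70 1.73 1.75 1.78 1.80 1.83>;
my @weight = <52.21 53.12 54.48 55.84 57.20 58.57 59.93 61.29 63.11 64.47 66.28 68.10 69.92 72.19 74.46>;

# Encode each data series as a vector: one coefficient per basis vector e_k
my $w  = [+] @weight Z* @e;
my $h0 = [+] @height »**» 0 Z* @e;
my $h1 = [+] @height »**» 1 Z* @e;
my $h2 = [+] @height »**» 2 Z* @e;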
Then what we're looking for are three scalars $\alpha$, $\beta$ and $\gamma$ such that:
$$\alpha\,\mathbf{h_0} + \beta\,\mathbf{h_1} + \gamma\,\mathbf{h_2} = \mathbf{w}$$
To get, for instance, $\alpha$, we can first make the $\beta$ and $\gamma$ terms disappear by wedging both sides with $\mathbf{h_1}$ and $\mathbf{h_2}$, since the wedge product of any vector with itself is zero:
$$\alpha\,\mathbf{h_0} \wedge \mathbf{h_1} \wedge \mathbf{h_2} = \mathbf{w} \wedge \mathbf{h_1} \wedge \mathbf{h_2}$$
Writing $I = \mathbf{h_0} \wedge \mathbf{h_1} \wedge \mathbf{h_2}$, we then get:
$$\alpha = \frac{(\mathbf{w} \wedge \mathbf{h_1} \wedge \mathbf{h_2}) \cdot {\tilde I}}{I \cdot {\tilde I}}$$
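In code, this projection might look like the sketch below. The infix ∧ and · operators, the .reversion method and the .Real coercion are assumptions about the Clifford module's API; $\beta$ and $\gamma$ follow by symmetry, substituting $\mathbf{w}$ for $\mathbf{h_1}$ and $\mathbf{h_2}$ in turn.

# Pseudoscalar of the subspace spanned by the h vectors, and its squared magnitude
my $I  = $h0 ∧ $h1 ∧ $h2;
my $I2 = ($I · $I.reversion).Real;

# Substituting w for each h_i in turn yields the corresponding coefficient
say "α = ", (($w ∧ $h1 ∧ $h2) · $I.reversion).Real / $I2;
say "β = ", (($h0 ∧ $w ∧ $h2) · $I.reversion).Real / $I2;
say "γ = ", (($h0 ∧ $h1 ∧ $w) · $I.reversion).Real / $I2;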
April 2016: This computation took over an hour with MoarVM, running in a VirtualBox Linux guest hosted on a Windows laptop with an Intel i7 processor.
March 2020: With various improvements to the language, 6.d releases of Raku now run the code two to three times as fast, depending on the hardware used, but even so it is generous to describe it as 'slow'.