Examples: Hypercubes, Triangles, Linear Sticking, Sticking

July 31, 2021 | 7 minutes, 23 seconds

Introduction

This will be a short post, consisting of a few examples of the techniques here that I've put to use and been playing around with.

Hypercubes

For the context of this little tidbit, an n-dimensional (unit, centred, hollow) hypercube is the object whose vertices are all points of the form \(\left(x_1,\dots,x_n\right),\ x_k\in\left\{-\frac{1}{2},\frac{1}{2}\right\}\). We also think of a hypercube as being built up of (n-1)-dimensional objects (e.g. lines, planes, cubes, etc.), each having a single invariant axis \(x_m\) and bounded maximally and minimally by our vertices along the remaining axes.

For example, a 2-dimensional square is built up of four 1-dimensional lines, two of which span \(y\) with invariant \(x\), and two of which span \(x\) with invariant \(y\). A 3-dimensional cube is built up of six 2-dimensional planes, two of which span \(y,z\) with invariant \(x\), two of which span \(x,z\) with invariant \(y\), and two of which span \(x,y\) with invariant \(z\).

Notice then, for some \(m\in\left\{1,\dots,n\right\}\) and \(k\in\left\{1,\dots,n\right\}\setminus\left\{m\right\}\), that the equations of our (n-1)-dimensional objects take the following form:

\[ x_m=\pm\frac{1}{2}, -\frac{1}{2}\leq x_k\leq\frac{1}{2}\]

For positive \(x_m\), we can rewrite this as \(x_m=\frac{1}{2},\ |x_k|\leq x_m\).

For negative \(x_m\), we can rewrite this as \(x_m=-\frac{1}{2},\ |x_k|\leq -x_m\).

We can combine the two cases (formally, this is a quick simplification) to get:

\[ |x_m|=\frac{1}{2},\ |x_k|\leq |x_m|\]

If we then define \(V=\left\{|x_1|,\dots,|x_n|\right\}\), we can formally write the equation of the cube piecewise as:

\[ \begin{cases} |x_1| & |x_1|\geq\max\left\{V\setminus\left\{|x_1|\right\}\right\}\\ |x_2| & |x_2|\geq\max\left\{V\setminus\left\{|x_2|\right\}\right\}\\ \vdots & \vdots\\ |x_n| & |x_n|\geq\max\left\{V\setminus\left\{|x_n|\right\}\right\} \end{cases}=\frac{1}{2}\]

Notice that, in the case for a given \(|x_m|\), the condition \(|x_m|\geq|x_k|\) for all \(|x_k|\in V\setminus\left\{|x_m|\right\}\) holds if and only if \(|x_m|\geq\max\left(V\setminus\left\{|x_m|\right\}\right)\), which is to say if and only if \(|x_m|=\max V\). This is a familiar construction, leaving us with the equation:

\[ \max\left\{|x_1|,\dots,|x_n|\right\}=\frac{1}{2}\]
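
As a quick numerical sanity check, here is a minimal sketch (the function name and the random sampling are my own choices, not part of the derivation): every vertex of the hypercube satisfies the equation above, while strictly interior points do not.

```python
import itertools
import random

def on_hypercube(point, tol=1e-12):
    # Surface of the unit, centred hypercube: max_k |x_k| = 1/2.
    return abs(max(abs(c) for c in point) - 0.5) < tol

n = 4
# Every vertex (+-1/2, ..., +-1/2) lies on the surface...
assert all(on_hypercube(v) for v in itertools.product((-0.5, 0.5), repeat=n))

# ...while strictly interior points do not.
interior = (tuple(random.uniform(-0.4, 0.4) for _ in range(n)) for _ in range(1000))
assert not any(on_hypercube(p) for p in interior)
```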

Triangles

Isosceles?

Suppose our triangle has vertices at \(\left(0,1\right), \left(-1,-1\right), \left(1,-1\right)\).

The equations of its edges are therefore given by: \(y=-1\) when \(-1\leq x\leq 1\), \(y=1+2x\) when \(-1\leq x\leq 0\) and \(y=1-2x\) when \(0\leq x\leq 1\).

By our multi-equation sticking technique found here, these can be combined into:

\[ \max\left\{\alpha(y+1),\beta(y-2x-1),\gamma(y+2x-1)\right\}=0\]

The inequality \(y\geq -1\implies y+1\geq 0\) defines an infinite region that includes our triangle. Suppose that, at a general point \((x,y)\) of the triangle, \(\alpha(y+1)\) is not the maximum of the above equation. Then we require \(\alpha(y+1)\leq 0\), which forces \(\alpha=-1\). Similarly, the inequality \(y\leq 1+2x\implies y-2x-1\leq 0\) defines an infinite region that includes our triangle, and requiring \(\beta(y-2x-1)\leq 0\) wherever it is not the maximum forces \(\beta=1\). Lastly, the inequality \(y\leq 1-2x\implies y+2x-1\leq 0\) defines an infinite region that includes our triangle, and requiring \(\gamma(y+2x-1)\leq 0\) wherever it is not the maximum forces \(\gamma=1\).

We're therefore left with:

\[ \max\left\{-y-1,y-2x-1,y+2x-1\right\}=0\]

In other words, since \(\max\left\{y-2x-1,y+2x-1\right\}=y+2|x|-1\), and using \(\max\left\{a,b\right\}=\frac{a+b+|a-b|}{2}\),

\[ \max\left\{-y,y+2|x|\right\}=1\implies |x|+|y+|x||=1\]

(You can rescale the edge lengths to obtain an equilateral triangle, or translate it to change the centre of the triangle.)
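
As a small sketch of a check (the parametrisation of the edges is my own), we can sample points along the three edges listed earlier and confirm each satisfies \(|x|+|y+|x||=1\):

```python
import numpy as np

def F(x, y):
    # Implicit form of the triangle with vertices (0, 1), (-1, -1), (1, -1).
    return np.abs(x) + np.abs(y + np.abs(x)) - 1

t = np.linspace(0, 1, 201)
edges = [
    (-1 + 2 * t, -np.ones_like(t)),  # y = -1      for -1 <= x <= 1
    (-1 + t, -1 + 2 * t),            # y = 1 + 2x  for -1 <= x <= 0
    (t, 1 - 2 * t),                  # y = 1 - 2x  for  0 <= x <= 1
]
for x, y in edges:
    assert np.allclose(F(x, y), 0)
```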

Right-angled

Consider the right-angled triangle with vertices at \(\left(0,0\right)\), \(\left(1,0\right)\) and \(\left(0,1\right)\). The equations describing its edges are, by simple algebra:

  1. \(x=0,\ y\in\left[0,1\right]\)
  2. \(y=0,\ x\in\left[0,1\right]\)
  3. \(y=1-x,\ (x,y)\in\left[0,1\right]\times\left[0,1\right]\)

By one of our techniques established previously, we can construct the following relation:

\[ \max\{\alpha x,\beta y,\gamma(y+x-1)\}=0\]

For which \(\alpha,\beta,\gamma\in\left\{-1,1\right\}\). To determine these values, observe the following inequalities, obtained by using the restrictions given by the above equations:

\[ \begin{alignat*}{2} &\alpha x\leq\underbrace{\beta y}_{0}\\ \implies& \alpha x\leq 0 \\ \implies& \alpha = -1 \end{alignat*}\]

\[ \begin{alignat*}{2} &\beta y\leq\underbrace{\alpha x}_{0}\\ \implies& \beta y\leq 0 \\ \implies& \beta =-1 \end{alignat*}\]

\[ \begin{alignat*}{2} &\gamma(y+x-1)\leq\underbrace{\alpha x}_{0}\\ \implies&\gamma(y+x-1)\leq0\\ \implies&\gamma=1 \end{alignat*}\]

For our last inequality, which is slightly trickier, we use that on the edge \(x=0\) the restriction gives \(y\leq 1\), so \(y+x-1\leq 0\) and we need \(\gamma=1\).

We hence have the following equation: \(\max\left\{-x,-y,y+x-1\right\}=0\).
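
Again, a quick sketch to check this (the names below are mine): points on the three listed edges should satisfy \(\max\left\{-x,-y,y+x-1\right\}=0\), interior points should give a negative value and exterior points a positive one.

```python
import numpy as np

def T(x, y):
    # Implicit form of the right-angled triangle with vertices (0,0), (1,0), (0,1).
    return np.maximum.reduce([-x, -y, y + x - 1])

t = np.linspace(0, 1, 201)
assert np.allclose(T(np.zeros_like(t), t), 0)  # edge x = 0
assert np.allclose(T(t, np.zeros_like(t)), 0)  # edge y = 0
assert np.allclose(T(t, 1 - t), 0)             # edge y = 1 - x
assert T(0.25, 0.25) < 0 and T(2.0, 2.0) > 0   # interior < 0, exterior > 0
```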

Linear sticking

A linear function, in this context, is a function which, for two given points \((x_0,y_0),\ (x_1,y_1)\) with \(x_1>x_0\), satisfies the piecewise representation:

\[ f(x)=\begin{cases} y_0 & x=x_0 \\ y_1 & x=x_1 \end{cases}\]

Where \(f(x)\) is a polynomial of degree \(1\). It hence stands to reason that:

\[ \begin{align} f(x) &= y_0+\begin{cases} 0 & x=x_0 \\ y_1-y_0 & x=x_1 \end{cases} \\ &= y_0+(y_1-y_0)\begin{cases} 0 & x=x_0 \\ 1 & x=x_1 \end{cases} \\ &= y_0+\frac{y_1-y_0}{x_1-x_0}\begin{cases} 0 & x=x_0 \\ x_1-x_0 & x=x_1 \end{cases} \\ &= y_0+\frac{y_1-y_0}{x_1-x_0}\left(\begin{cases} x_0 & x=x_0 \\ x_1 & x=x_1 \end{cases}-x_0\right) \\ &= y_0+\underbrace{\frac{y_1-y_0}{x_1-x_0}}_{c_0}\left(x-x_0\right) \\ &= y_0+c_0(x-x_0) \end{align}\]

Which is given on the domain \(x\in\mathbb{R}\).
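
As a trivial sketch in code (names mine), this is just the usual construction of the line through two given points:

```python
def line_through(x0, y0, x1, y1):
    # The unique degree-1 polynomial through (x0, y0) and (x1, y1).
    c0 = (y1 - y0) / (x1 - x0)
    return lambda x: y0 + c0 * (x - x0)

f = line_through(0.0, 1.0, 2.0, 5.0)
assert f(0.0) == 1.0 and f(2.0) == 5.0  # passes through both given points
```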

Recall that our sticking function requires that we have \(s=\left\{a,x_1,x_2,\dots,x_n,b\right\}\):

\[ \begin{align} G(x)=\sum_{k=1}^{n+1}{f_k(\max(\min(x,s_{k+1}),s_k))}-\sum_{k=2}^{n+1}{f_k(s_k)} \end{align}\]

Really, this is just an inconveniently compacted form of:

\[ G(x)=f_1(\min(x,x_1))+f_{n+1}(\max(x,x_{n}))+\\ \sum_{k=2}^{n}f_k(\max(\min(x,x_{k}),x_{k-1}))-\\ \sum_{k=2}^{n+1}{\underbrace{f_k(x_{k-1})}_{y_{k-1}}}\]

Our linear equations can be given by:

\[ f_k(x)=y_{k-1}+\frac{y_k-y_{k-1}}{x_k-x_{k-1}}(x-x_{k-1})\]

Giving us:

\[ G(x)=f_1(\min(x,x_1))+f_{n+1}(\max(x,x_n))+\\ \sum_{k=2}^{n}{\frac{y_k-y_{k-1}}{x_k-x_{k-1}}(\max(\min(x,x_{k}),x_{k-1})-x_{k-1})}-y_n\]

Letting \(f_1(x)=y_1\) and \(f_{n+1}(x)=y_n\), while also re-indexing \(k\to k+1\), we simplify to:

\[ G(x)=y_1+\sum_{k=1}^{n-1}{\frac{y_{k+1}-y_{k}}{x_{k+1}-x_{k}}(\max(\min(x,x_{k+1}),x_{k})-x_{k})}\]

After a bit of work, using the identity \(\max(\min(x,x_{k+1}),x_k)=\frac{x_k+x_{k+1}}{2}+\frac{|x-x_k|-|x-x_{k+1}|}{2}\), we get the final equation:

\[ G(x)=\frac{1}{2}\left(y_n+y_1\right)+\frac{1}{2}\sum_{k=1}^{n-1}{\frac{y_{k+1}-y_{k}}{x_{k+1}-x_k}\left(\left|x-x_k\right|-\left|x-x_{k+1}\right|\right)}\]

Alternatively, to create a function which extends the first and last line segments at each end, rather than holding constant values there, one could set \(f_1=f_2\) and \(f_{n+1}=f_n\), and go from there.

An interactive Desmos can be found here.
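
Here is a minimal sketch of the final formula (the implementation and names are mine), checked against numpy.interp, which likewise interpolates linearly between the given points and holds the values \(y_1\) and \(y_n\) outside \([x_1,x_n]\):

```python
import numpy as np

def G(x, xs, ys):
    # Piecewise-linear interpolant through (xs[k], ys[k]),
    # constant (y_1 and y_n) outside [xs[0], xs[-1]].
    x = np.asarray(x, dtype=float)
    total = 0.5 * (ys[0] + ys[-1])
    for k in range(len(xs) - 1):
        slope = (ys[k + 1] - ys[k]) / (xs[k + 1] - xs[k])
        total = total + 0.5 * slope * (np.abs(x - xs[k]) - np.abs(x - xs[k + 1]))
    return total

xs = [0.0, 1.0, 2.5, 4.0]
ys = [1.0, -1.0, 0.5, 2.0]
x = np.linspace(-1.0, 5.0, 601)
assert np.allclose(G(x, xs, ys), np.interp(x, xs, ys))
```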

More gluing/sticking functions

Previously, I derived the equations with which we could stick functions together. More recently, a friend (thanks, Tim!) assisted me in working on a slightly more general base problem:

Given that \(f:\mathbb{R}\to\mathbb{R}\) is a continuous function, can we find a form for:

\[ f(x) = \begin{cases} p(x) & A(x)\geq a \\ q(x) & A(x)\leq a \end{cases}\]

The case we worked on was for \(A(x)=\sin(x)\) and \(a=0\). His solution was essentially to let \(p(x)=k(x)+\sin(x)g(x)\) and \(q(x)=k(x)+\sin(x)h(x)\). It's a brilliant solution. Tim reasoned that the function didn't necessarily have to be periodic (as my solution enforced), but only needed its two pieces to meet at the points where \(\sin(x)=0\), where both equal \(k(x)\). My solution, however, relied on the invertibility of \(\sin(x)\) and therefore enforced periodicity.

In general, suppose \(p(x)=B(x)+p'(x)(A(x)-a)\) and \(q(x)=B(x)+q'(x)(A(x)-a)\), where \(p'\) and \(q'\) denote arbitrary functions rather than derivatives. Then:

\[ \begin{align} f(x) &= \begin{cases} B(x)+p'(x)(A(x)-a) & A(x)\geq a\\ B(x)+q'(x)(A(x)-a) & A(x)\leq a \end{cases}\\ &= B(x)+q'(x)(A(x)-a)+\begin{cases} (p'(x)-q'(x))(A(x)-a) & A(x)\geq a\\ 0 &A(x)\leq a \end{cases}\\ &= B(x)+q'(x)(A(x)-a)+(p'(x)-q'(x))\begin{cases} A(x)-a & A(x)\geq a\\ 0 &A(x)\leq a \end{cases}\\ &= B(x)+q'(x)(A(x)-a)+(p'(x)-q'(x))\max(A(x)-a,0)\\ &= B(x)+\min(A(x)-a,0)q'(x)+\max(A(x)-a,0)p'(x)\\ \end{align}\]
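
Here is a minimal sketch of this construction for Tim's case \(A(x)=\sin(x)\), \(a=0\) (the particular choices of \(B\), \(p'\) and \(q'\) below are mine, purely for illustration):

```python
import numpy as np

# Pieces in the required form: p = B + p'(A - a), q = B + q'(A - a).
A = np.sin
a = 0.0
B = lambda x: np.cos(x)  # shared part; p and q agree wherever sin(x) = 0
pp = lambda x: x         # p'(x), used where sin(x) >= 0
qp = lambda x: x ** 2    # q'(x), used where sin(x) <= 0

def f(x):
    s = A(x) - a
    return B(x) + np.minimum(s, 0) * qp(x) + np.maximum(s, 0) * pp(x)

x = np.linspace(-10, 10, 2001)
p = B(x) + pp(x) * (A(x) - a)
q = B(x) + qp(x) * (A(x) - a)
assert np.allclose(f(x), np.where(A(x) >= a, p, q))
```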

My solution is an alternative to this and also works. However, it assumes the invertibility of \(A(x)\) on some domain, and also that \(a\in\operatorname{dom}{A^{-1}}\):

\[ \begin{align} f(x)&=\begin{cases} p(x) & A(x)\geq a \\ 0 & A(x)<a \end{cases} + \begin{cases} 0 & A(x)> a \\ q(x) & A(x)\leq a \end{cases}\\ &=\begin{cases} p(x) & A(x)\geq a \\ p(A^{-1}(a)) & A(x)<a \end{cases} + \begin{cases} q(A^{-1}(a)) & A(x)> a \\ q(x) & A(x)\leq a \end{cases}-\begin{cases} q(A^{-1}(a)) & A(x)\geq a \\ p(A^{-1}(a)) & A(x)\leq a \end{cases}\\ &=p\left(\begin{cases} x & A(x)\geq a \\ A^{-1}(a) & A(x)<a \end{cases}\right) + q\left(\begin{cases} A^{-1}(a) & A(x)> a \\ x & A(x)\leq a \end{cases}\right)-q(A^{-1}(a))\\ &=p\left(A^{-1}\left(\begin{cases} A(x) & A(x)\geq a \\ a & A(x)\leq a \end{cases}\right)\right) + q\left(A^{-1}\left(\begin{cases} a & A(x)\geq a \\ A(x) & A(x)\leq a \end{cases}\right)\right)-q(A^{-1}(a))\\ &=p\left(A^{-1}\left(\max(A(x),a)\right)\right) + q\left(A^{-1}\left(\min(A(x),a)\right)\right)-q(A^{-1}(a))\\ \end{align}\]

This is the method used for the previous gluing/sticking function, for which \(A(x)=x\). Given that \(A(x)=x\) is invertible on all of \(\mathbb{R}\), we did not encounter a single issue in finding an equation. It also depends on function composition, which is always (never) a fun thing to work with.
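
For concreteness, here is a minimal sketch of this formula in the \(A(x)=x\) case mentioned above (the choices of \(p\), \(q\) and the meeting point \(a\) are mine; the two pieces only need to agree at \(a\)):

```python
import numpy as np

# Gluing with A(x) = x, so A and its inverse are both the identity.
a = 0.0
p = lambda x: x ** 2  # piece used where x >= a
q = np.sin            # piece used where x <= a; note p(a) == q(a) == 0

def f(x):
    return p(np.maximum(x, a)) + q(np.minimum(x, a)) - q(a)

x = np.linspace(-5, 5, 1001)
assert np.allclose(f(x), np.where(x >= a, p(x), q(x)))
```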