# Piecewise objects

Conventional piecewise notation, as defined in places such as Wolfram MathWorld, describes a function in terms of several sub-functions, each applying over its own interval (with no requirement of continuity or differentiability). Also conventionally, the intervals are unique; where they do overlap, the sub-functions must take equal values on the overlap. For example:

$\left|x\right|=\begin{cases} x & x\geq 0 \\ -x & x\leq 0 \end{cases}$

This post will make use of a more general idea, the piecewise object: piecewise notation applied in a context which does not impose uniqueness of intervals (although many operations on them do require unique intervals), and which need not define a function at all, only a set of cases and their conditions. See the following, for example:

$f\pm g=\begin{cases}f+g & \top\\f-g & \top\end{cases}$

where $$\top$$ denotes a tautology (realistically, any condition equivalent to a tautology would also work here). Also, if $$h=f\pm g$$ then $$h-f=\pm g\implies\left|h-f\right|=\left|g\right|$$, which could be taken as an alternative definition.

It is furthermore important to recognise that a piecewise object may not describe a single function, relation or other object; it may, in fact, describe several at once. In this sense, piecewise objects are representations of objects rather than objects themselves.

# Anonymous piecewise objects

An anonymous piecewise object (APO for short, because I'm lazy) is a piecewise object (not necessarily a function) which is left unspecified/undetermined on one or more of its cases. It therefore does not satisfy the requirement that a regular piecewise object specify every case; instead, it describes the infinite family of objects that satisfy its specified cases at a minimum.

$$\star$$ is the symbol used to denote an unspecified case, and can also be used for the condition (provided that condition is distinct from the other cases' conditions; alternatively, 'else' can be used).

For example:

$f:\mathbb{R}\to\mathbb{R},\ f(x)=\begin{cases}x&x\geq 0\\\star & x\leq 0\end{cases}$

is an anonymous piecewise function describing the set of functions which, at a minimum, satisfy $$f(0)=0$$ and $$f(x)=x$$ when $$x\geq 0$$. Such functions include $$f(x)=x+h(\min(x,0))$$ for any $$h$$ with $$h(0)=0$$, or $$f(x)=x\cdot g(\min(x,0))$$ for any $$g$$ with $$g(0)=1$$.
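As a quick sanity check, here is one member of that family; the choice of $$h$$ below is arbitrary (any function with $$h(0)=0$$ would do), not anything canonical:

```python
import math

# one member of the family f(x) = x + h(min(x, 0)); any h with h(0) = 0 works,
# sin is just a convenient arbitrary choice
def h(t):
    return math.sin(t)

def f(x):
    return x + h(min(x, 0.0))

# for x >= 0, min(x, 0) collapses to 0 and h(0) = 0, so f(x) = x exactly
for x in [0.0, 0.5, 2.0, 10.0]:
    assert f(x) == x

# for x < 0 the function is free to do whatever h allows
print(f(-1.0))  # -1 + sin(-1)
```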

Another example is the following vector function for $$\vec{u},\vec{v}\in\mathbb{R}^n$$:

$\vec{r}:\mathbb{R}\to\mathbb{R}^n,\ \vec{r}(t)=\begin{cases}\vec{u}&t=0\\ \vec{v}&t=1\\\star&\star\end{cases}$

which describes, minimally, a vector function such that $$\vec{r}(0)=\vec{u}$$ and $$\vec{r}(1)=\vec{v}$$. One of the simplest vector functions satisfying this is the line $$\vec{r}:\mathbb{R}\to\mathbb{R}^n$$, $$\vec{r}(t)=\vec{u}+t(\vec{v}-\vec{u})$$.
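A quick check that the line satisfies both specified cases; the particular vectors here are arbitrary sample values:

```python
# verify that the line r(t) = u + t(v - u) satisfies the two specified cases,
# using arbitrary sample vectors in R^3
u = [1, 2, 3]
v = [4, 0, -1]

def r(t):
    return [ui + t * (vi - ui) for ui, vi in zip(u, v)]

assert r(0) == u  # first case: r(0) = u
assert r(1) == v  # second case: r(1) = v
```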

To derive objects that satisfy an anonymous piecewise object, we can simply ignore the $$\star$$ cases, solve the remaining cases by traditional methods, and then substitute the resulting expression back into each specified case to confirm that it satisfies them.

# Uses of APOs

## Lagrange polynomials

Lagrange polynomials are cool, to say the least. Given a set of points with distinct $$x_k$$, they construct the interpolating polynomial of the lowest required degree. Interestingly, they can be derived from an APO (creating a polynomial from a piecewise equation has been a long-time goal of mine).

How, you ask? Let's use an example. Consider the points $$(3,16)$$, $$(0,1)$$ and $$(1,4)$$. We construct our APO:

$P:\mathbb{R}\to\mathbb{R},\ P(x)=\begin{cases} 16 & x=3 \\ 1 & x=0 \\ 4 & x=1 \\ \star & \star \end{cases}$

From there, we use a fairly regular and common decomposition to deal with the cases more nicely; this is where the conditions must be unique. We have:

$P(x)=\begin{cases} 16 & x=3 \\ 0 & x=0 \\ 0 & x=1 \\ \star & \star \end{cases}+ \begin{cases} 0 & x=3 \\ 1 & x=0 \\ 0 & x=1 \\ \star & \star \end{cases}+ \begin{cases} 0 & x=3 \\ 0 & x=0 \\ 4 & x=1 \\ \star & \star \end{cases}$

From here, each term must be specified on all of $$\mathbb{R}$$. Looking at the zero cases of each piecewise term: whenever a case takes the value $$0$$ at $$x=a$$ (for some $$a$$), we can factor $$x-a$$ out of that term and then drop the resulting zero cases (they become part of $$\star$$). Since we are factoring out, we must divide by $$x-a$$ inside the remaining cases. This is still a standard technique:

$P(x)=x(x-1)\begin{cases} \frac{16}{x(x-1)} & x=3 \\ \star & \star \end{cases}+ (x-1)(x-3)\begin{cases} \frac{1}{(x-1)(x-3)} & x=0 \\ \star & \star \end{cases}+ x(x-3)\begin{cases} \frac{4}{x(x-3)} & x=1 \\ \star & \star \end{cases}$

We can simplify with respect to our known cases:

$P(x)=x(x-1)\begin{cases} \frac{8}{3} & x=3 \\ \star & \star \end{cases}+(x-1)(x-3)\begin{cases} \frac{1}{3} & x=0 \\ \star & \star \end{cases}+x(x-3)\begin{cases} -2 & x=1 \\ \star & \star \end{cases}$

In our remaining piecewise terms, since we have constant functions for our known cases (and we want a polynomial function) we can determine each respective $$\star$$ to be those constants. Hence:

$P(x)=\frac{8}{3}x(x-1)+\frac{1}{3}(x-1)(x-3)-2x(x-3)$

And we're essentially done. Simplifying gives $$P(x)=(x+1)^2$$, and you can confirm that this function passes through our original points.
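To double-check the derivation, here is a short script (using exact rational arithmetic) confirming that the assembled polynomial passes through the original points and agrees with $$(x+1)^2$$:

```python
from fractions import Fraction as F

# the polynomial assembled from the three APO terms above
def P(x):
    x = F(x)
    return F(8, 3) * x * (x - 1) + F(1, 3) * (x - 1) * (x - 3) - 2 * x * (x - 3)

# it interpolates the original points (3, 16), (0, 1), (1, 4)...
assert P(3) == 16 and P(0) == 1 and P(1) == 4

# ...and equals (x + 1)^2 everywhere we test
for x in range(-10, 11):
    assert P(x) == (x + 1) ** 2
```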

## Sum of consecutive numbers (raised to a power)

Trivially, $$\underbrace{1+\dots+1}_{n\ \textrm{times}}=n$$. Likewise, the sum $$1+2+\dots+n$$ is widely known to have the formula $$\frac{1}{2}n(n+1)$$, and $$1^2+2^2+\dots+n^2$$ the formula $$\frac{1}{6}n(n+1)(2n+1)$$.

So, the hypothesis: the sum of the first $$n$$ natural numbers, each raised to the power $$p$$, is a polynomial in $$n$$ of degree $$p+1$$, and is hence uniquely determined by $$p+2$$ points. (Haven't found a proof yet! Can you?)
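The hypothesis can at least be sanity-checked numerically. This is a sketch, not a proof: the helper names `power_sum` and `lagrange` are my own, and exact rational arithmetic avoids any floating-point doubt:

```python
from fractions import Fraction as F

def power_sum(n, p):
    # 1^p + 2^p + ... + n^p (0 for n = 0)
    return sum(k ** p for k in range(1, n + 1))

def lagrange(points, x):
    # exact Lagrange interpolation through the given (x_k, y_k) points
    x = F(x)
    total = F(0)
    for xi, yi in points:
        term = F(yi)
        for xj, _ in points:
            if xj != xi:
                term *= (x - xj) / F(xi - xj)
        total += term
    return total

# for each power p, fit a degree-(p + 1) polynomial through p + 2 points,
# then check that it predicts the power sums well beyond those points
for p in range(5):
    pts = [(n, power_sum(n, p)) for n in range(p + 2)]
    for n in range(p + 2, 30):
        assert lagrange(pts, n) == power_sum(n, p)
```

If the hypothesis failed for some $$p\leq 4$$, one of these assertions would trip.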

If the above hypothesis is true, then one could obtain the respective formulas via Lagrange polynomials, and hence via APOs.

For example, let's derive the formula for $$1+\dots+n$$. By the above, we require $$3$$ points, so we choose $$(0,0)$$, $$(1,1)$$ and $$(2,3)$$:

$P(x)=\begin{cases} 0 & x=0 \\ 1 & x=1 \\ 3 & x=2 \\ \star & \star \end{cases}$

Then:

$P(x)=\begin{cases} 0 & x=0 \\ 0 & x=1 \\ 0 & x=2 \\ \star & \star \end{cases}+\begin{cases} 0 & x=0 \\ 1 & x=1 \\ 0 & x=2 \\ \star & \star \end{cases}+\begin{cases} 0 & x=0 \\ 0 & x=1 \\ 3 & x=2 \\ \star & \star \end{cases}$

From there we can apply the same process as before:

$P(x)=0+\frac{1}{(1)(1-2)}x(x-2)+\frac{3}{(2)(2-1)}x(x-1)$

And we have, after annoyingly tedious polynomial factoring and simplification:

$P(x)=\frac{1}{2}x(x+1)$
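A quick check that the derived closed form matches the direct sum (the helper name `triangular` is mine):

```python
# compare the interpolated closed form n(n+1)/2 against the literal sum
def triangular(n):
    return n * (n + 1) // 2

for n in range(50):
    assert triangular(n) == sum(range(1, n + 1))
```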

## More powerful than Lagrange polynomials

The APOs that create Lagrange polynomials are interesting in and of themselves, but there is another technique worth pointing out. Let's take the points $$(3,3)$$ and $$(2,1)$$, which conventionally describe the linear function $$f(x)=2x-3$$, and write an APO:

$f(x)=\begin{cases} 3 & x=3 \\ 1 & x=2 \\ \star & \star \end{cases}$

For our first case, instead of writing $$x=3$$, we can change the condition to, say, $$\sin(\frac{\pi x}{3})=0$$. Notice that $$x=3$$ satisfies this, and $$x=2$$ does not (cases must be unique if we are to work with them conventionally!). Then:

$f(x)=\begin{cases} 3 & \sin(\frac{\pi x}{3})=0 \\ 1 & x=2 \\ \star & \star \end{cases}$

We can work with this in the same way that we did before:

\begin{align} f(x)&=\begin{cases} 3 & \sin(\frac{\pi x}{3})=0 \\ 0 & x=2 \\ \star & \star \end{cases}+\begin{cases} 0 & \sin(\frac{\pi x}{3})=0 \\ 1 & x=2 \\ \star & \star \end{cases}\\ &=(x-2)\begin{cases} \frac{3}{x-2} & \sin(\frac{\pi x}{3})=0 \\ \star & \star \end{cases}+\sin(\frac{\pi x}{3})\begin{cases} \frac{1}{\sin(\frac{\pi x}{3})} & x=2 \\ \star & \star \end{cases}\\ &=(x-2)\begin{cases} 3 & \sin(\frac{\pi x}{3})=0 \\ \star & \star \end{cases}+\sin(\frac{\pi x}{3})\begin{cases} \frac{2}{\sqrt{3}} & x=2 \\ \star & \star \end{cases}\\ &=3(x-2)+\frac{2}{\sqrt{3}}\sin(\frac{\pi x}{3})\\ \end{align}

The difference is that when we perform the local case-based calculations, we can use the original conditions to substitute by implication: $$x=2\implies \sin(\frac{\pi x}{3})=\frac{\sqrt{3}}{2}$$, and vice versa (hence the third line's simplifications). Again, if you check the derived function, you'll see that it satisfies the original points, despite being vastly different from the 'conventional' linear function.
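Numerically, the derived function does hit both points (allowing for floating-point error in `sin`):

```python
import math

# the non-polynomial interpolant derived above
def f(x):
    return 3 * (x - 2) + (2 / math.sqrt(3)) * math.sin(math.pi * x / 3)

assert abs(f(3) - 3) < 1e-9   # f(3) = 3
assert abs(f(2) - 1) < 1e-9   # f(2) = 1

# away from those points it differs wildly from the line 2x - 3
print(f(0), 2 * 0 - 3)  # -6.0 vs -3
```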

TL;DR: by performing substitution by implication on the conditions of the APO (e.g. $$x=a\implies f(x)=f(a)$$), we can create far more complex and interesting equations with vastly different properties from standard polynomials.