
DSP, Plugin and Host development discussion.

Post

Lorenzo90 wrote: Fri Oct 15, 2021 4:19 pm
Urs wrote: Mon Oct 04, 2021 2:31 pm
As for the clamp trick, I was checking if Maple might have known a trick to make the antiderivative branchless as well. Apparently it doesn't. Pity.
Hi, I don't know if this is what you're looking for, but the antiderivative of (|x+1| - |x-1|)/2 can be written as (|x+1|(x+1) - |x-1|(x-1))/4, which avoids any if statement (I don't know whether it is actually faster).
Absolute value is just a bitmask of the signbit, so this is 3 adds (assuming common subexpression elimination), 3 multiplies (assuming strength-reduction of the division) and 2 logic ops. So it's probably about the same (at least assuming bulk processing so we can ignore latency) when the branches are predicted perfectly and way faster when they aren't (which I'd imagine is the common case here).
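As a quick numeric sanity check (a sketch, not from the thread; the names `hardclip` and `hardclip_ad` are mine), here are the branchless clip and Lorenzo90's branchless antiderivative side by side:

```python
def hardclip(x):
    # branchless clamp to [-1, 1], written as (|x+1| - |x-1|) / 2
    return (abs(x + 1) - abs(x - 1)) / 2

def hardclip_ad(x):
    # Lorenzo90's branchless antiderivative:
    # (|x+1|*(x+1) - |x-1|*(x-1)) / 4
    a, b = x + 1, x - 1
    return (abs(a) * a - abs(b) * b) / 4
```

A central difference of `hardclip_ad` reproduces `hardclip` away from the corners; note this expression differs from the piecewise antiderivative by a constant (0.5), which is irrelevant for ADAA.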

Post

flisk wrote: Fri Oct 08, 2021 7:14 pm |x| and x*sgn(x) are the same, of course. This may be nitpicking, but the sgn(x) function doesn't have an antiderivative, I'm quite sure. You could prove this using limits, but the short answer is Darboux's theorem. Maybe WolframAlpha defines "indefinite integral" differently from "antiderivative"; if not, it's a mistake. sgn(x) is Riemann integrable, however.

The piecewise hardclip from Urs' post does have antiderivatives. The 0.5 term is not the constant of integration; it's there to make the piecewise definition of the antiderivative seamless. When you add a constant of integration C, the antiderivative of the hardclip function becomes:

Code: Select all

x * sgn(x) - 0.5 + C	for x <= -1
0.5*x^2 + C		for -1 < x < 1
x * sgn(x) - 0.5 + C	for x >= 1
Looking up the fundamental theorem of calculus on Wikipedia, I noticed that there is a "second part" of this theorem which explicitly doesn't require continuity, but somehow still speaks of an antiderivative. I guess there may be different definitions of antiderivative, the more advanced of which do not require differentiability everywhere. This is pretty much what we're using in the context of ADAA.

As for "making the antiderivative seamless", I really dislike this way of thinking here. It implies that we are fixing something which was broken. But I'd rather see breaking it as already a kind of mistake: we were required to find an antiderivative of a piecewise function, and the way to do that is to compute a definite integral with a varying upper limit. Of course, if instead we integrate each piece separately, we'll have to patch them together.
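That viewpoint is easy to check numerically (a sketch; the names `F_numeric`, `F_piecewise` and the midpoint integrator are mine): defining the antiderivative directly as a definite integral with a varying upper limit reproduces the piecewise formula, seams and all, with no patching step.

```python
def hardclip(t):
    return max(-1.0, min(1.0, t))

def F_numeric(x, n=100000):
    # antiderivative as a definite integral with varying upper limit:
    # midpoint-rule approximation of the integral of hardclip from 0 to x
    h = x / n
    return sum(hardclip((k + 0.5) * h) for k in range(n)) * h

def F_piecewise(x):
    # the patched-together piecewise antiderivative (with C = 0)
    return 0.5 * x * x if abs(x) <= 1 else abs(x) - 0.5
```

Both agree at every x, including across the corners at x = -1 and x = 1, because the definite integral is automatically continuous in its upper limit.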

Post

flisk wrote: Fri Oct 08, 2021 7:14 pmMaybe WolframAlpha defines the "indefinite integral" not the same as "antiderivative", if not it's a mistake.
Let's stop this nonsense. The Wikipedia page for "indefinite integral" redirects to the page for "antiderivative", which says the following (emphasis mine):
In calculus, an antiderivative, inverse derivative, primitive function, primitive integral or indefinite integral of a function f is a differentiable function F whose derivative is equal to the original function f.
So clearly in common language these two things are very much used to mean the same thing.

Now, you could argue that an antiderivative is a function F such that f = dF/dx, which you could argue doesn't exist if f is not C0 (whereas you could define "indefinite integral" as some expression that satisfies definite integrals for any bounds). But here in the applied field of signal processing we're typically perfectly happy to call the Dirac delta a function, with an antiderivative known as the Heaviside step function (which for our purposes is "differentiable"), and we can define sign(x) = 2*H(x) - 1, which then has a perfectly fine antiderivative (2*H(x) - 1)*x.
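A minimal sketch of that convention (the names `H`, `sign_h` and `sign_ad` are mine), taking H(0) = 1/2 as the left/right average:

```python
def H(x):
    # Heaviside step with H(0) = 0.5 (left/right average convention)
    return 0.5 if x == 0 else (1.0 if x > 0 else 0.0)

def sign_h(x):
    # sign(x) = 2*H(x) - 1, so sign_h(0) = 0
    return 2.0 * H(x) - 1.0

def sign_ad(x):
    # its antiderivative (2*H(x) - 1)*x, which is just |x|
    return sign_h(x) * x
```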

So arguably, if there is "abuse of language" here, it's not about antiderivatives and indefinite integrals; it's about whether or not the Dirac delta is a function (or, equivalently, whether the Heaviside step is differentiable), and for the purposes of signal processing the answer is very much YES, because you'll quickly go nuts otherwise.

Post

mystran wrote: Sun Oct 17, 2021 5:07 pm So arguably, if there is "abuse of language" here, it's not about antiderivatives and indefinite integrals; it's about whether or not the Dirac delta is a function (or, equivalently, whether the Heaviside step is differentiable), and for the purposes of signal processing the answer is very much YES, because you'll quickly go nuts otherwise.
Actually, here I think the argument was not about the differentiation of the Heaviside function, but "one level higher": the differentiation of the |x| function. Formally it's not differentiable at 0, but, guess what, it doesn't matter; we only care that definite integrals of sgn x are differences of |x|, as stated by the fundamental theorem of calculus.

If we wish to apply Fourier theory to a wider extent here, we could require that the function's value at a discontinuity point is always the arithmetic average of the left- and right-side limits etc. The point is, classical definitions are overconservative for our purposes, but then, relaxing them a bit still keeps most of the results - the ones we care about - applicable. I wouldn't even call this "an abuse of terminology", simply a slightly less restrictive version thereof.

Post

Z1202 wrote: Sun Oct 17, 2021 6:54 pm The point is, classical definitions are overconservative for our purposes, but then, relaxing them a bit still keeps most of the results - the ones we care about - applicable.
Yes, this is very much my stance.

Post

In fact, I'd argue that in the DSP context it makes sense to redefine the derivative as the arithmetic average of the left- and right-side derivatives (this helps stay compatible with the Fourier transform). In this case we strictly have |x|' = sgn x.
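A sketch of that redefinition (the helper name `avg_deriv` is mine): the central difference quotient is exactly the arithmetic mean of the left-side and right-side difference quotients, and applied to |x| at 0 it gives exactly 0 = sgn 0 for every step size.

```python
def avg_deriv(f, x, h=1e-6):
    # central difference = arithmetic mean of the left-side and
    # right-side one-sided difference quotients
    return (f(x + h) - f(x - h)) / (2.0 * h)
```

For |x| at 0 this is (h - h)/(2h) = 0 regardless of h, matching sgn 0 = 0 under the averaging convention.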

Similarly, generalized functions like delta(x) will need to be understood in the limiting sense (IIUC), such as the limit of a sinc function (or of a somewhat wider family), which then allows integrals of the form
int delta(x) f(x) dx = int f(x) dH(x)
to converge even if f(x) is discontinuous at 0.
Disclaimer: I'm not 100% sure, but it seems that otherwise the Riemann-Stieltjes integral on the RHS does not converge.
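A numeric sketch supporting this (I'm using a narrow Gaussian as the nascent delta rather than a sinc, purely because it converges much faster under naive quadrature; all names are mine): integrated against the discontinuous Heaviside step, the symmetric limit picks out the left/right average 1/2.

```python
import math

def nascent_delta(x, eps):
    # symmetric Gaussian approximation of delta(x), unit area
    return math.exp(-0.5 * (x / eps) ** 2) / (eps * math.sqrt(2.0 * math.pi))

def H(x):
    # Heaviside with the averaging convention H(0) = 0.5
    return 0.5 if x == 0 else (1.0 if x > 0 else 0.0)

def delta_against(f, eps, a=-10.0, b=10.0, n=20000):
    # midpoint rule for the integral of delta_eps(x) * f(x)
    h = (b - a) / n
    return sum(nascent_delta(a + (k + 0.5) * h, eps)
               * f(a + (k + 0.5) * h) for k in range(n)) * h
```

As eps shrinks, `delta_against(H, eps)` stays pinned at 0.5, the average of the one-sided limits of H at the jump.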

Post

Z1202 wrote: Sun Oct 17, 2021 9:27 pm In fact, I'd argue that in the DSP context it makes sense to redefine the derivative as the arithmetic average of the left- and right-side derivatives (this will help staying compatible to Fourier transform). In this case we strictly have |x|' = sgn x.
I think this is perfectly OK: one could define |x| as the limit of some perfectly differentiable function, let's say x*tanh(a*x) as a approaches infinity, and be done with it.
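A sketch of that limiting construction (the name `smooth_abs` is mine): x*tanh(a*x) is smooth for every finite a, converges to |x| as a grows, and, being an even function, has derivative exactly 0 at x = 0 for every a.

```python
import math

def smooth_abs(x, a):
    # smooth approximation of |x|; approaches |x| as a -> infinity
    return x * math.tanh(a * x)

# its derivative is tanh(a*x) + a*x / cosh(a*x)**2,
# which is exactly 0 at x = 0 for every finite a
```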

Post

urosh wrote: Mon Oct 18, 2021 10:58 am
Z1202 wrote: Sun Oct 17, 2021 9:27 pm In fact, I'd argue that in the DSP context it makes sense to redefine the derivative as the arithmetic average of the left- and right-side derivatives (this will help staying compatible to Fourier transform). In this case we strictly have |x|' = sgn x.
I think this is perfectly ok, one could define |x| as a limit of some perfectly differentiable function, let's say x*tanh(a*x) as a approaches infinity and be done with it.
Not sure it's a good general idea. E.g. the functions sin(a*x), while each differentiable, hardly converge to anything "differentiable". OTOH, left/right averaging is pretty common in Fourier convergence behavior.

Post

Z1202 wrote: Mon Oct 18, 2021 11:19 am Not sure it's a good general idea.
Why? I can't think of any sigmoid function (differentiable, with limits -1/+1 at -/+infinity) that would cause problems if we use it to define |x| as the limit of x*sigmoid(a*x) over a. And in every case you get |x|' = 0 at x = 0.

Post

urosh wrote: Mon Oct 18, 2021 4:48 pm
Z1202 wrote: Mon Oct 18, 2021 11:19 am Not sure it's a good general idea.
Why? I can't think of any (differentiable, with limit of -1/+1 for +-infinity) sigmoid function which would cause problems if we use it to define |x| as limit of x*sigmoid(a*x) on a. And in every case you get |x|' = 0 for x = 0.
I was referring to the general case of defining differentiability in terms of limits of function families, not to the special case of |x|. OTOH sin(a*x) doesn't converge to a meaningful function either, so you may have a point. You'd need to specify which notion of convergence you mean, though. I fear it might get tricky: pointwise or uniform convergence of a function family does not guarantee (if I'm not mistaken) reasonable convergence of the derivatives, and specifying convergence of a function family in terms of convergence of its derivatives would be a circular definition.
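The caveat about derivatives not following the function's convergence can be made concrete (a sketch; names mine): sin(a²x)/a converges to 0 uniformly, yet its derivative a·cos(a²x) grows without bound.

```python
import math

def f(x, a):
    # |f| <= 1/a everywhere, so f -> 0 uniformly as a grows
    return math.sin(a * a * x) / a

def df(x, a):
    # ...but the derivative a*cos(a^2 * x) is unbounded in a
    return a * math.cos(a * a * x)
```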

Still, my main point was that we still want the left-right average to stay compatible to Fourier series or integral convergence anyway, so why not just use that?

Post

urosh wrote: Mon Oct 18, 2021 10:58 am
Z1202 wrote: Sun Oct 17, 2021 9:27 pm In fact, I'd argue that in the DSP context it makes sense to redefine the derivative as the arithmetic average of the left- and right-side derivatives (this will help staying compatible to Fourier transform). In this case we strictly have |x|' = sgn x.
I think this is perfectly ok, one could define |x| as a limit of some perfectly differentiable function, let's say x*tanh(a*x) as a approaches infinity and be done with it.
Maybe you're trying to reinvent the concept of mollifiers?

Post

mystran wrote: Mon Oct 18, 2021 5:36 pm Maybe you're trying to reinvent the concept of mollifiers?
Nah. In school we were taught "if you feel uneasy about the mathematical formalism of Dirac and Heaviside, think of them as limits of nice continuous functions", so I haven't reinvented much, if anything.
Z1202 wrote: Mon Oct 18, 2021 5:12 pm Still, my main point was that we still want the left-right average to stay compatible to Fourier series or integral convergence anyway, so why not just use that?
I think that using |x| = lim over a of x*sigmoid(a*x) is a simple-to-trivial argument for the claim that the derivative of |x| is, for all practical purposes, well defined and equal to 0 at 0.

Post

urosh wrote: Mon Oct 18, 2021 8:23 pm I think that using |x| = lim over a of x*sigmoid(a*x) is a simple-to-trivial argument for the claim that the derivative of |x| is, for all practical purposes, well defined and equal to 0 at 0.
In order to apply this convincingly, you'd need to somehow verify that all (or most) of our knowledge about derivatives commutes with taking the limit of a parameterized function family. Something tells me averaging is much more obvious in this regard, since differentiation is a linear operation. And, as already mentioned, we get the extra benefit of staying compatible with Fourier theory for free.

Post

urosh wrote: Mon Oct 18, 2021 8:23 pm
mystran wrote: Mon Oct 18, 2021 5:36 pm Maybe you're trying to reinvent the concept of mollifiers?
Nah. In school we were taught "if you feel uneasy about the mathematical formalism of Dirac and Heaviside, think of them as limits of nice continuous functions", so I haven't reinvented much, if anything.
Mollifiers basically let you compute with them as if they were nice continuous functions. Say we need C0 continuity: we start with some annoying discontinuous function, convolve it with a triangle function (i.e. our chosen mollifier; a triangle is enough for C0), and now we've got a nice continuous function! Need more continuity? Use a smoother mollifier.

Especially when you only need a few orders of continuity, things like B-spline basis functions or even two smoothsteps back-to-back make for great mollifiers that mostly don't make any of the computations significantly more involved (since they are just polynomials, and polynomials are usually not a huge deal). The Wikipedia page suggests you should use a completely smooth mollifier, but that's often not necessary; usually C1 or C2 is good enough and a simple piecewise polynomial will do the trick.
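A sketch of mollification by a triangle (all names are mine; a single-point midpoint-rule convolution, nothing optimized): smoothing the sign function's jump into a continuous ramp.

```python
def sign(x):
    return -1.0 if x < 0.0 else 1.0

def mollify(f, x, w, n=2000):
    # value at x of f convolved with a triangle kernel of half-width w
    # (a C0 mollifier), approximated with the midpoint rule
    h = 2.0 * w / n
    total = 0.0
    for k in range(n):
        t = -w + (k + 0.5) * h
        kernel = (1.0 - abs(t) / w) / w  # triangle, unit area
        total += kernel * f(x - t) * h
    return total
```

`mollify(sign, x, w)` is continuous in x: it equals +/-1 outside [-w, w] and ramps through 0 in between; shrinking w recovers the original step outside an ever smaller neighbourhood.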

Post

urosh wrote: Mon Oct 18, 2021 8:23 pm I think that using |x| = lim over a of x*sigmoid(a*x) is a simple-to-trivial argument for the claim that the derivative of |x| is, for all practical purposes, well defined and equal to 0 at 0.
Maybe I've realized the main reason why I don't like this argument. It's possible to construct function families converging to |x| whose derivatives at 0 converge to a nonzero value. Obviously, such functions are asymmetric, but I hope you see how the whole argument breaks down. It's more like an explanation of an "already established fact" than any kind of "proof".
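To make that counterexample concrete (a sketch; the construction and names are my own): the off-center family sqrt((x-c)² + c²) is smooth, converges to |x| as c → 0, yet its derivative at 0 equals -1/sqrt(2) for every c, so the derivatives do not converge to 0.

```python
import math

def f(x, c):
    # smooth for every c > 0, and f -> |x| pointwise as c -> 0
    return math.sqrt((x - c) ** 2 + c ** 2)

def df(x, c):
    # derivative of f; at x = 0 it equals -c / (c*sqrt(2)) = -1/sqrt(2)
    # for every c > 0, independent of c
    return (x - c) / math.sqrt((x - c) ** 2 + c ** 2)
```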
