Decaying approximation

I'm trying to approximate a function $f:\mathbb{R}^+\to\mathbb{R}$. I would like my approximation to be well behaved, with bounded derivatives of all orders. Is there a good reason why I should not use a family of functions of the form $P(x)e^{-x}$ as a basis?

I know that it can form a basis: for any analytic $f(x)$, the product $f(x)e^{x}$ is also analytic, so writing $\sum c_n x^n = f(x)e^x$ gives $\sum c_n x^ne^{-x} = f(x)$. It can approximate anything, but then what happens around 0? This also produces an issue where we weight points closer to 0 more heavily, though perhaps in some cases that is not a problem. We can also expand the series definition of $e^{-x}$ to get the actual coefficient values, so no real problems there in the analysis.
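As a quick sanity check of the mechanics, here is a minimal least-squares sketch of fitting in this basis. The target, interval, and degree are arbitrary illustrative choices; the target is deliberately chosen inside the span, so the fit should recover it almost exactly.

```python
import numpy as np

# Fit in the basis {x^n e^{-x}}, n = 0..5, on [0, 10] by least squares.
# Target f(x) = (1 + x) e^{-x} is an arbitrary choice that lies in the
# span of the basis, so the recovery should be near machine precision.
f = lambda x: (1.0 + x) * np.exp(-x)

degree = 5
x = np.linspace(0.0, 10.0, 400)
# Design matrix: column n holds x^n e^{-x} sampled at the grid points.
A = np.stack([x**n * np.exp(-x) for n in range(degree + 1)], axis=1)
coef, *_ = np.linalg.lstsq(A, f(x), rcond=None)

max_err = np.max(np.abs(A @ coef - f(x)))  # tiny, since f is in the span
```

Note the weighting issue mentioned above: every column carries the $e^{-x}$ factor, so the rows near $x=0$ dominate the normal equations unless the grid or a weight compensates.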

Still, I don't like it, except maybe as a means of constructing probability distributions, given the relationship to the gamma distribution and the fact that integrating to $1$ is a linear condition on the coefficients. I feel like this could be used to generate them by building Chebyshev-style polynomials that cancel out in a clean way. But then again, why do I care? Nothing is analytic these days.
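To spell out the linear condition: since $\int_0^\infty x^n e^{-x}\,dx = n!$, "integrates to $1$" becomes $\sum_n c_n\, n! = 1$. A minimal sketch, with an arbitrary nonnegative coefficient vector chosen for illustration (nonnegativity guarantees the result is a valid density):

```python
import math
import numpy as np

# Arbitrary illustrative coefficients; nonnegative so P(x)e^{-x} >= 0.
c = np.array([0.5, 0.25, 0.125, 0.0625])
# Enforce the linear normalization condition: sum_n c_n * n! = 1,
# using the gamma-function identity int_0^inf x^n e^{-x} dx = n!.
c = c / sum(cn * math.factorial(n) for n, cn in enumerate(c))

# Numerically confirm the density integrates to (approximately) 1.
x = np.linspace(0.0, 60.0, 20001)
density = sum(cn * x**n for n, cn in enumerate(c)) * np.exp(-x)
mass = np.sum((density[1:] + density[:-1]) / 2 * np.diff(x))
```

The rescaling step is the whole point: normalization never leaves the family, because it is a single linear equation in the coefficients.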

