
In modeling systems with impulsive inputs, the Dirac delta function often appears on the right-hand side of an ordinary differential equation (ODE), such as:

$$ a_n\frac{d^ny}{d x^n}+a_{n-1}\frac{d^{n-1}y}{d x^{n-1}}+\cdots+a_0y=\delta(x) $$

Since the delta function is a distribution with singular support at $x=0$, it must be "balanced" by a term on the left-hand side that can match its singularity. In practice, this role always seems to fall to the highest-order derivative term.

  • Why is it specifically the highest derivative term that must absorb the delta function?

  • Is there a rigorous mathematical justification for this (e.g., in distribution theory or functional analysis)?

  • Are there intuitive or physical interpretations—perhaps from mechanics or signal processing—that help explain this behavior?

Any references to textbooks, papers, or lecture notes that explore this would be greatly appreciated!

  • Comment: If $\delta$ were balanced by the $y^{(n-1)}$ term, for example, then the $y^{(n)}$ term would introduce a $\delta'$.
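The comment's observation can be checked symbolically. Here is a minimal sketch using SymPy (my own illustration, not from the comment): a jump discontinuity in a function produces a $\delta$ after one differentiation and a $\delta'$ after two, so if the $\delta$ on the right-hand side were matched by a jump in $y^{(n-1)}$, the term $a_n y^{(n)}$ would contribute a $\delta'$ that nothing else in the equation could cancel.

```python
import sympy as sp

x = sp.symbols('x')
H = sp.Heaviside(x)  # unit step: the simplest function with a jump at 0

# One derivative of a jump produces a delta; two produce delta'.
print(sp.diff(H, x))     # DiracDelta(x)
print(sp.diff(H, x, 2))  # DiracDelta(x, 1), i.e. delta'
```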

2 Answers


I interpret your question as follows:

The distributional solutions of $$a_n\frac{d^ny}{d x^n}+a_{n-1}\frac{d^{n-1}y}{d x^{n-1}}+\cdots+a_0y=\delta(x)\quad(+)$$ are exactly the distributions $y$ of the following form: they are given by actual functions (i.e. not any "wilder" distributions) such that both $y_- := y|_{(-\infty, 0)}$ and $y_+ := y|_{(0,\infty)}$ are solutions of $$a_n\frac{d^ny}{d x^n}+a_{n-1}\frac{d^{n-1}y}{d x^{n-1}}+\cdots+a_0y=0\quad(*)$$ in the ordinary sense (the value $y(0)$ is irrelevant, since it does not influence $y$ as a distribution), and which satisfy the "jump" conditions $$y_+^{(k)}(0) = y_-^{(k)}(0)\quad\text{for $k\leq n-2$,}\qquad a_n\,y_+^{(n-1)}(0) = a_n\,y_-^{(n-1)}(0) + 1.$$ Why is this true?

There are two parts to the answer. The first is that any distributional solution of the homogeneous equation $(*)$ is an ordinary $C^\infty$ solution. (I will not prove it here unless you really want; the proof boils down to the fact that any distributional solution of $f'=0$ is a constant function, combined with a reduction of the order-$n$ ODE to a first-order system.)

The second part is: if a distribution $f$ is a function given by $f|_{(-\infty, 0)} = f_-|_{(-\infty, 0)}$ and $f|_{(0,\infty)} = f_+|_{(0,\infty)}$, where $f_+$ and $f_-$ are $C^\infty$ functions, then $$f' = (f')_{ord} + (f_+(0)-f_-(0))\,\delta,$$ where $(f')_{ord}$ is the function given by $(f')_{ord}(x) = f_+'(x)$ for $x>0$ and $(f')_{ord}(x) = f_-'(x)$ for $x<0$. The proof follows directly from the definition of the derivative of a distribution. Differentiating repeatedly, we get $$f^{(n)}= (f^{(n)})_{ord} + \sum_{k=1}^{n}\left(f_+^{(n-k)}(0)-f_-^{(n-k)}(0)\right)\delta^{(k-1)}.$$

Combining the two parts: using part 2 alone, if we suppose that $y$ is a function given by two $C^\infty$ functions $y_-$ and $y_+$, and if we compute the LHS of $(+)$ and compare it with the RHS, we see that both $y_+$ and $y_-$ are solutions of $(*)$ and that $$y_+^{(k)}(0) = y_-^{(k)}(0)\quad\text{for $k\leq n-2$,}\qquad a_n\,y_+^{(n-1)}(0) = a_n\,y_-^{(n-1)}(0) + 1.$$ To see that we have found all the solutions, we then use part 1: any other solution differs from a solution we found by a solution of $(*)$, which is therefore a $C^\infty$ function.
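As a concrete sanity check of the jump condition (my own example, not part of the answer), consider $y'' + 3y' + 2y = \delta$, whose fundamental solution is $y = H(x)\,(e^{-x} - e^{-2x})$: it is continuous at $0$ with $y'(0^+) - y'(0^-) = 1$, matching the condition with $a_n = 1$. Pairing $y$ against the formal adjoint applied to a test function $\varphi$ should then reproduce $\langle\delta,\varphi\rangle = \varphi(0)$:

```python
import numpy as np
from scipy.integrate import quad

# Fundamental solution of y'' + 3y' + 2y = delta:
# y = H(x) * (exp(-x) - exp(-2x)); y is continuous, y' jumps by 1 at 0.
y = lambda x: np.exp(-x) - np.exp(-2 * x)  # valid for x > 0; y = 0 for x < 0

# Test function phi and the formal adjoint L*phi = phi'' - 3 phi' + 2 phi
phi = lambda x: np.exp(-x**2)
dphi = lambda x: -2 * x * np.exp(-x**2)
d2phi = lambda x: (4 * x**2 - 2) * np.exp(-x**2)
adjoint = lambda x: d2phi(x) - 3 * dphi(x) + 2 * phi(x)

# <L y, phi> = <y, L* phi>; for a distributional solution this equals phi(0).
val, _ = quad(lambda x: y(x) * adjoint(x), 0, np.inf)
print(val)  # should be close to phi(0) = 1
```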


[user8268's answer already addresses the issue raised in the OP, but some additional observations may be useful.]

It is quite common (at least in PDE theory) to look for a solution only in a restricted function space (usually a Sobolev space or something similar), not in the space of all distributions. For example, in your equation $$ a_n\frac{d^ny}{d x^n}+a_{n-1}\frac{d^{n-1}y}{d x^{n-1}}+\cdots+a_0y=\delta(x) $$ it seems natural to consider only distributions $y$ for which $y,y',\ldots,y^{(n)}$ are all measures (note: I don't know whether, in your case, it is clear that the solution you consider lies in this space). If you do so, the answer is clear thanks to the following observation:

If $f'$ is (representable by) a measure, then $f$ is (representable by) a locally bounded function, even a function of bounded variation.

Now, applying this to $f = y, y', \ldots, y^{(n-1)}$, you learn that all the terms $$a_0 y,\ a_1 y',\ \ldots,\ a_{n-1} y^{(n-1)}$$ appearing on the left-hand side of your ODE are in fact representable by functions. The only possible exception is the highest-order term $a_n y^{(n)}$, so that must be the one to "absorb" $\delta$.

Summary: any regularity of $f'$ implies higher regularity of $f$, so if anything funky happens to one of the derivatives $y,y',\ldots,y^{(n)}$, it must be the last one. Note that this argument is not special to the Dirac delta on the right-hand side: it could be any other measure with a singular part (i.e. a measure not absolutely continuous with respect to the Lebesgue measure). And this moral remains useful in the more general context of several variables.
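This concentration of the singularity in the top derivative can also be seen numerically. The sketch below (my own construction, under the assumption of the same example equation $y'' + 3y' + 2y = \delta$) mollifies the delta into a unit-mass Gaussian of width `eps` and solves the ODE classically: as `eps` shrinks, $y$ and $y'$ stay bounded, the jump of $y'$ across the pulse tends to $1$, and all the singular behavior piles into $y''$.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Mollify delta into a unit-mass Gaussian of width eps and solve
# y'' + 3y' + 2y = g_eps classically on [-a, a] with y(-a) = y'(-a) = 0.
def jump_in_yprime(eps, a=0.01):
    g = lambda x: np.exp(-x**2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))
    rhs = lambda x, u: [u[1], g(x) - 3 * u[1] - 2 * u[0]]
    sol = solve_ivp(rhs, [-a, a], [0.0, 0.0],
                    rtol=1e-9, atol=1e-12, max_step=eps / 5)
    # y'(-a) = 0, so y'(a) is the jump of y' across the pulse
    return sol.y[1, -1]

print(jump_in_yprime(1e-3))  # approaches 1 as eps -> 0
```

The small deviation from $1$ comes from the lower-order terms $3y'$ and $2y$ integrated over $[-a, a]$, which vanish as the window shrinks: exactly the point that only the highest derivative can carry the singular part.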

