This may seem like a silly question, but I just wanted to check. I know there are proofs that if $f(x)=f'(x)$ then $f(x)=Ae^x$. But can we 'invent' another, non-trivial function that obeys $f(x)=f'(x)$?
-
How should such a function be invented? If you invent one, then $f=f'$, and by the proof you mentioned $f(x) = Ae^x$ must hold. – Lukas Betz, May 20, 2015 at 17:34
-
If you can prove that $f(x)=f'(x)$ implies $f(x)=Ae^x$, then logically any function that obeys $f(x)=f'(x)$ will have to be of the form $f(x)=Ae^x$. That might have sounded redundant, but that's what the implication means... unless I'm misunderstanding the question. – Blake, May 20, 2015 at 17:35
-
@LeBtz I don't know... It was inspired by another post which asked if there are any 'non-trivial' examples where it was true. I assumed that meant there were non-trivial examples. – bnosnehpets, May 20, 2015 at 17:37
-
If you allow for piecewise functions which are not everywhere differentiable, then there are many nontrivial functions of that form. – Ryan, May 20, 2015 at 17:38
-
Almost everywhere in the sense of Lebesgue measure: en.wikipedia.org/wiki/Almost_everywhere – Robert Israel, May 20, 2015 at 23:47
6 Answers
Observe that if $f(x)=f'(x)$ then $$ \left(\frac{f(x)}{e^x}\right)'=\frac{f'(x)-f(x)}{e^x}=0. $$ Hence $\dfrac{f(x)}{e^x}$ is constant, so $f(x)=Ae^x$ for some constant $A$.
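As a quick cross-check (purely illustrative, not part of the argument), sympy reproduces this computation: the derivative of the quotient $f(x)/e^x$ collapses to $0$ once $f'=f$ is imposed.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# derivative of the quotient f(x)/e^x, before imposing the ODE
quotient = f(x) / sp.exp(x)
dq = sp.simplify(sp.diff(quotient, x))
print(dq)                                                  # (f'(x) - f(x)) * exp(-x)

# impose f'(x) = f(x): the derivative of the quotient vanishes
print(sp.simplify(dq.subs(sp.Derivative(f(x), x), f(x))))  # 0
```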
-
@MichaelHardy: Thanks for the edit... – Keivan, May 20, 2015 at 17:58
-
+1 and accepted. This is my favourite just because of its beautiful simplicity. – bnosnehpets, May 20, 2015 at 22:51
-
The method of the "integrating factor". Every student of calculus should learn this. – GEdgar, May 21, 2015 at 12:07
I think the other answers given here assume the existence of a nice function $e^{x}$, and this makes the proof considerably simpler. However, I believe it is better to approach the problem of solving $f'(x) = f(x)$ without knowing anything about $e^{x}$.
When we go down this path our final result is the following:
Theorem: There exists a unique function $f:\mathbb{R}\to \mathbb{R}$ which is differentiable for all $x \in \mathbb{R}$ and satisfies $f'(x) = f(x)$ and $f(0) = 1$. Further, any function $g(x)$ which is differentiable for all $x$ and satisfies $g'(x) = g(x)$ is explicitly given by $g(x) = g(0)f(x)$, where $f(x)$ is the unique function mentioned previously.
We give a simple proof of the above theorem without using any properties/knowledge of $e^{x}$. Let's show that if such a function $f$ exists then it must be unique. Suppose there is another function $h(x)$ such that $h'(x) = h(x)$ and $h(0) = 1$. Then the difference $F(x) = f(x) - h(x)$ satisfies $F'(x) = F(x)$ and $F(0) = 0$. We will show that $F(x) = 0$ for all $x$. Suppose that it is not the case and that there is a point $a$ such that $F(a) \neq 0$ and consider $G(x) = F(a + x)F(a - x)$. Clearly we have \begin{align} G'(x) &= F(a - x)F'(a + x) - F(a + x)F'(a - x)\notag\\ &= F(a - x)F(a + x) - F(a + x)F(a - x)\notag\\ &= 0 \end{align} so that $G(x)$ is constant for all $x$. Therefore $G(x) = G(0) = F(a) \cdot F(a) > 0$. We thus have $F(a + x)F(a - x) > 0$ and hence putting $x = a$ we get $F(2a)F(0) > 0$. This contradicts $F(0) = 0$.
It follows that $F(x) = 0$ for all $x$ and hence the function $f$ must be unique. Now we need to show existence. To that end we first establish that $f(x) > 0$ for all $x$. If there were a number $b$ such that $f(b) = 0$, then the function $\phi(x) = f(x + b)$ would satisfy $\phi'(x) = \phi(x)$ and $\phi(0) = 0$. By the argument in the preceding paragraph, $\phi(x)$ is identically $0$, and hence $f(x) = \phi(x - b)$ is also identically $0$, contradicting $f(0) = 1$. Hence $f(x)$ is non-zero for all $x$. Since $f(x)$ is continuous and $f(0) = 1 > 0$, it follows that $f(x) > 0$ for all $x$.
Since $f'(x) = f(x) > 0$ for all $x$, it follows that $f(x)$ is strictly increasing and differentiable with a non-vanishing derivative. By the inverse function theorem the inverse function $f^{-1}$ exists (if $f$ exists) and is also increasing with non-vanishing derivative. Also, using the usual rules of differentiation, $f'(x) = f(x)$ implies that $\{f^{-1}(x)\}' = 1/x$ for all $x > 0$ and $f^{-1}(1) = 0$. Since $1/x$ is continuous, the definite integral $$\psi(x) = \int_{1}^{x}\frac{dt}{t}$$ exists for all $x > 0$ and has the same properties as $f^{-1}$, and it is easy to show that $f^{-1}(x) = \psi(x)$: the function $f^{-1}(x) - \psi(x)$ is constant, as its derivative is $0$, and hence $$f^{-1}(x) - \psi(x) = f^{-1}(1) - \psi(1) = 0,$$ so that $$f^{-1}(x) = \psi(x) = \int_{1}^{x}\frac{dt}{t}.$$ Next, by the inverse function theorem, $f(x)$ exists as the inverse of $\psi$. Thus the question of the existence of $f(x)$ is settled.
Now consider $g(x)$ with $g'(x) = g(x)$. If $g(0) = 0$ then we know from the argument given earlier that $g(x) = 0$ for all $x$. If $g(0) \neq 0$ then we study the function $u(x) = g(x)/g(0)$. Clearly $u'(x) = u(x)$ and $u(0) = 1$, and hence $u$ is the same as the unique function $f(x)$. Thus $g(x)/g(0) = u(x) = f(x)$ for all $x$, and hence $g(x) = g(0)f(x)$.
The unique function $f(x)$ in the theorem proved above is denoted by $\exp(x)$ or $e^{x}$.
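The construction above can also be imitated numerically, which may help make it feel concrete. The sketch below (the choice of quadrature routine, root-finder and bracketing interval is arbitrary and only illustrative) computes $\psi(x)=\int_1^x \frac{dt}{t}$ by quadrature and inverts it by root-finding; the result agrees with the familiar exponential.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

def psi(x):
    """psi(x) = integral from 1 to x of dt/t, the inverse of the sought f (for x > 0)."""
    val, _ = quad(lambda t: 1.0 / t, 1.0, x)
    return val

def f(y):
    """Invert psi numerically: the unique x > 0 with psi(x) = y."""
    return brentq(lambda x: psi(x) - y, 1e-6, 1e6)  # bracket covers |y| up to about 13.8

for y in [-1.0, 0.0, 1.0, 2.0]:
    print(y, f(y), np.exp(y))  # the constructed f reproduces e^y up to numerical error
```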
No. Suppose that $f \not\equiv 0$. We solve the ODE: $$f(x) = f'(x) \implies \frac{f'(x)}{f(x)} = 1 \implies \int \frac{f'(x)}{f(x)}\,{\rm d}x = \int \,{\rm d}x \implies \ln |f(x)| = x+c, \quad c \in \Bbb R.$$ With this, $|f(x)| = e^{x+c} = e^ce^x$. Call $e^c = A > 0$. Eliminating the absolute value, we have $f(x) = Ae^x$, with $A \in \Bbb R \setminus \{0\}$. Since $f \equiv 0$ also satisfies the equation, we can write all at once: $$f(x) = Ae^x, \quad A \in \Bbb R.$$
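For what it is worth, a computer algebra system carries out the same separation of variables and lands on the same one-parameter family (a sympy sketch, offered only as a cross-check):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

# solve f'(x) = f(x) symbolically
sol = sp.dsolve(sp.Eq(f(x).diff(x), f(x)), f(x))
print(sol)  # Eq(f(x), C1*exp(x)), i.e. f(x) = A*e^x with A = C1
```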
-
This does not tell us that the only solution for $\int \frac{f'(x)}{f(x)}\, dx$ is $\ln|f(x)|$. – Isura Manchanayake, Jan 5, 2016 at 3:51
Consider $g(x) = f(x)\exp(-x)$. Then we have $g'(x) = f'(x)\exp(-x)-f(x)\exp(-x) = 0$. Thus, $g\equiv c$ for some constant $c$. Hence $f(x) = c\exp(x)$.
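The same fact shows up numerically: along any solution of $f'=f$ produced by an ODE solver, the product $f(x)\exp(-x)$ does not move. A small sketch (the initial value $2.5$ and the tolerances are arbitrary, illustrative choices):

```python
import numpy as np
from scipy.integrate import solve_ivp

# integrate f' = f from an arbitrary initial value f(0) = 2.5
sol = solve_ivp(lambda x, y: y, (0.0, 5.0), [2.5],
                dense_output=True, rtol=1e-10, atol=1e-12)

xs = np.linspace(0.0, 5.0, 6)
fs = sol.sol(xs)[0]
print(fs * np.exp(-xs))  # stays at the constant c = 2.5, up to integration error
```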
You have $\dfrac{dy}{dx}=y$. Often one writes $\dfrac{dy} y = dx$ and then evaluates both sides of $\displaystyle\int\frac{dy} y = \int dx$, etc.
However, for a question like this perhaps one should be more careful.
If $f(x)\ne 0$ for all $x$, then one has $\dfrac{f'(x)}{f(x)}=1$ for all $x$. This implies $$ \frac{d}{dx} \log |f(x)| = 1 $$ for all $x$. The mean value theorem entails that the only functions with derivative $1$ everywhere are $x + \text{constant}$, so $$ \log|f(x)| = x + \text{constant} $$ for all $x$. This implies $|f(x)| = e^x\cdot\text{a positive constant}$; since $f$ is continuous and never $0$, its sign cannot change, so $f(x)=e^x\cdot\text{a nonzero constant}$.
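A symbolic spot-check of the key step (working with $\log f$ rather than $\log|f|$, i.e. on an interval where $f>0$; illustrative only):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

d = sp.diff(sp.log(f(x)), x)                 # f'(x)/f(x)
print(d)
print(d.subs(sp.Derivative(f(x), x), f(x)))  # 1 once f'(x) = f(x) is imposed
```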
Dividing by $f(x)$ assumes $f(x)$ is not $0$, so one has to check separately the case where $f(x)$ is everywhere $0$, and it checks.
Now what about cases where $f(x)=0$ for some $x$ but not all? Can we rule those out? We would have $f(x_0)=0$ and $f(x_1)\ne 0$. Then one solution would be $$ g(x) = f(x_1)e^{x-x_1} = f(x_1)e^{-x_1} e^x = A e^x. $$ But $g(x_0)\ne0=f(x_0)$, while $g(x_1)=f(x_1)$ and $(f-g)'=f-g$ everywhere.
[to be continued, maybe${}\,\ldots\,{}$]
-
If $f(x)=0$ then $f'(x)=0$, so there is no problem at all. – AnalysisStudent0414, May 20, 2015 at 17:41
-
@AnalysisStudent0414: If $f(x_0)=0$ then certainly $f'(x_0)=0$, so they're both $0$ at one point. Certainly the function that is everywhere $0$ is a solution, but the question is how to prove that if a function satisfying this differential equation is somewhere $0$ then it is everywhere $0$. – Michael Hardy, May 20, 2015 at 17:50
OK, let's try a plodding pedestrian approach: $$ f'(x)-f(x) = 0 $$ The idea of an exponential multiplier $m(x)$ is that we write $$ 0 = m(x)f'(x) + (-m(x))f(x) = m(x)f'(x)+m'(x)f(x) = \Big( m(x)f(x)\Big)'. $$ For this to work we would need $-m(x)=m'(x)$. We do not need all functions $m$ satisfying this, but only one. So there's no need to wonder whether any non-trivial solutions to $m'=-m$ exist; we only need a trivial one. $m(x)=e^{-x}$ does it. So $$ 0 = \Big( m(x)f(x)\Big)'. $$ Thus, by the mean value theorem, $m(x)f(x)$ is constant, so $f(x)=\text{constant}\cdot e^x$.
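A small symbolic check of the multiplier identity used above, offered only as an illustration:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')

m = sp.exp(-x)            # the one "trivial" solution of m' = -m that we need
print(sp.diff(m, x) + m)  # 0, so m is an admissible multiplier

# product rule: (m f)' = m f' + m' f = m (f' - f), which vanishes when f' = f
lhs = sp.diff(m * f(x), x)
rhs = m * (f(x).diff(x) - f(x))
print(sp.simplify(lhs - rhs))  # 0
```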