Suppose I have a function $u \in BV(I)$, where $I \subset \mathbb{R}$ is an open interval, so that its distributional derivative $Du$ is a Radon measure. Suppose further that $x \in I$ is a Lebesgue point of $u$, i.e.,
$$\lim_{r \to 0} \frac{1}{2r} \int_{x - r}^{x + r} \left| u(y) - u(x) \right| \, dy = 0.$$
My question is: how does one prove that $|Du|(\{x\}) = 0$?

My idea was to use the definition of the distributional derivative,
$$\int_I u(y) \phi_k'(y) \, dy = -\int_I \phi_k \, dDu,$$
where the sequence $\{\phi_k\}_{k=1}^\infty \subset C_c^\infty(I)$ is an approximation of the Dirac delta at $x$ by smooth functions with compact support. If the $|\phi_k'|$ are bounded, it is not hard to show, using the definition of a Lebesgue point, that the integral on the left-hand side converges to $0$. I do not know whether this idea is correct, or whether it leads anywhere; one of the problems is that this method involves $Du$ rather than $|Du|$. A solution or a hint would be greatly appreciated. Thank you.
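For concreteness, one possible choice of such a sequence (this particular normalization is my own, not forced by the problem) is to rescale a fixed bump function around $x$:

```latex
% Fix a bump \phi \in C_c^\infty((-1,1)) with 0 \le \phi \le 1 and \phi(0) = 1,
% and set, for k large enough that (x - 1/k,\, x + 1/k) \subset I,
\phi_k(y) = \phi\bigl(k(y - x)\bigr),
% so that
\operatorname{supp} \phi_k \subset \left(x - \tfrac{1}{k},\, x + \tfrac{1}{k}\right),
\qquad \phi_k(x) = 1,
\qquad |\phi_k'(y)| \le k\,\|\phi'\|_\infty .
```

With this choice $\phi_k \to \chi_{\{x\}}$ pointwise and $0 \le \phi_k \le 1$, which seems relevant for handling the right-hand side $\int_I \phi_k \, dDu$.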
Measure of a Lebesgue point
user1757903