
I have a sequence of random variables defined by the following recursion:

$$X_{n+1} = X_n+\begin{cases} \alpha(S_n - X_n), & \text{if } S_n \geq X_n, \\ \beta(S_n - X_n), & \text{if } S_n < X_n, \end{cases}$$ where $0<\beta < \alpha <1$ are constants and the $(S_n)$ are i.i.d. with a known distribution. Also, $S_n$ is independent of $\sigma(X_1, X_2,\dots, X_n)$, and $X_0 = 0$.

Initially, I asked about the convergence/limiting distribution of $X_n$, but after doing some research, I realized that obtaining an explicit distribution or asymptotics is generally considered a very difficult problem.

Therefore, I want to ask the following questions, in increasing order of difficulty (according to my very limited knowledge of probability theory).

1) Can we at least prove that it has a limiting distribution? It looks like one can formulate this as a general state space Markov chain, but there does not seem to be an abundance of sources on this topic. Probability by Durrett has a brief chapter on it, and he mentions that the discrete Ornstein-Uhlenbeck process $$V_{n+1} = \theta V_n+\xi_n$$ is an example of a discrete-time, general state space Markov chain. However, most of the resources I could find on the internet refer to the continuous-time process, and as such my hope of adapting proofs for the OU process did not pan out.

2) If there is a limiting distribution, what kind of qualitative results can I hope to obtain? For example, one has the following for the expected value: $$\mathbb{E}[X_{n+1}] = \mathbb{E}[X_n](1 - \beta ) + \beta\mu + ( \alpha - \beta)\mathbb{E}[\delta_n\mathbb{1}_{\delta_n >0}],$$ where $\delta_n = S_n - X_n$ and $\mu = \mathbb{E}[S_n]$. But then the issue I am having is manipulating $$P(S_n - X_n > t \mid S_n > X_n),$$ which will come from the last term.
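(As a sanity check, this one-step identity is easy to verify by Monte Carlo. Below is a minimal Python sketch; the laws of $X_n$ and $S_n$ are my own illustrative choices, with $S_n \sim N(\mu,1)$ independent of $X_n$.)

```python
# One-step Monte Carlo check of
#   E[X_{n+1}] = E[X_n](1-beta) + beta*mu + (alpha-beta)*E[delta 1_{delta>0}].
import numpy as np

rng = np.random.default_rng(5)
alpha, beta, mu = 0.5, 0.1, 1.0
X = rng.uniform(-1, 1, size=1_000_000)   # any law for X_n works; illustrative
S = rng.normal(loc=mu, size=X.shape)     # S_n ~ N(mu, 1), independent of X_n
delta = S - X

X_next = X + np.where(delta > 0, alpha, beta) * delta
lhs = X_next.mean()
rhs = X.mean() * (1 - beta) + beta * mu + (alpha - beta) * (delta * (delta > 0)).mean()
print(lhs, rhs)  # agree up to Monte Carlo error
```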

I would greatly appreciate it if anyone has some ideas or can point me to a helpful source.

Simulation: I attach some simulations that seem to suggest that there is a bounded limiting distribution.

[Simulation plot: R ~ skew normal]
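In case it helps others reproduce the experiment, here is a minimal Python sketch of this kind of simulation (the parameters and the skew-normal shape `a=4` are my own illustrative choices, not necessarily those behind the plot):

```python
# Run many independent copies of the recursion and look at the long-run law.
import numpy as np
from scipy.stats import skewnorm

rng = np.random.default_rng(0)
alpha, beta = 0.5, 0.1          # illustrative values with 0 < beta < alpha < 1
n_paths, n_steps = 10_000, 500  # many independent copies of the chain

X = np.zeros(n_paths)           # X_0 = 0 on every path
for _ in range(n_steps):
    S = skewnorm.rvs(a=4, size=n_paths, random_state=rng)
    X += np.where(S > X, alpha, beta) * (S - X)

# If a limiting distribution exists, these summaries stabilize as n_steps grows.
print(np.quantile(X, [0.05, 0.5, 0.95]), X.mean())
```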

  • I think it's a Markov Chain. Why wouldn't it be? It is certainly true that $X_{n+1}$ is dependent on $X_{n-1}$ ("implicitly" as you said), but the Markov property is that such a dependence disappears when we also condition on $X_n$. I.e., if you know $X_n$, then also knowing $X_{n-1}$ gives you no additional info w.r.t. the distribution of $X_{n+1}$, which is the case here. Commented Nov 7, 2019 at 3:53
  • @antkam I see, that makes sense. Commented Nov 7, 2019 at 4:47
  • @antkam OP said "$S_n$ is independent of $X_n$", so it may very well be that $S_n$ is not independent of $X_{n-1}$. Commented Nov 16, 2019 at 23:26
  • @mathworker21 - oh, you are right! I had assumed $S_n$ is independent of all $X_j$'s. But if $S_n$ depends on $X_{n-1}$ then this is indeed not a Markov Chain. Maybe the OP can clarify? Commented Nov 17, 2019 at 1:03
  • @NapD.Lover, you are right, the upper bound is for positive means. There is a similar lower bound for negative means, etc. Commented Nov 24, 2019 at 3:07

2 Answers


1. Here is a mild condition on the distribution of the $S_n$'s that guarantees the existence of a limiting distribution.

Proposition. Assume that $\sum_{n=0}^{\infty} (1-\beta)^n |S_n| < \infty$ holds almost surely. Then $(X_n)_{n\geq 0}$ has a limiting distribution.

Proof. For each $s \in \mathbb{R}$, define $f_s : \mathbb{R} \to \mathbb{R}$ by

$$ f_s(x) = \begin{cases} \alpha s + (1-\alpha) x, & \text{if $s \geq x$}, \\ \beta s + (1-\beta) x, & \text{if $s \leq x$}. \end{cases} $$

This function allows us to rewrite the recursive formula as $ X_{n+1} = f_{S_n}(X_n) $, and so,

$$X_n = (f_{S_{n-1}} \circ \cdots \circ f_{S_0} )(0).$$

But since $S_n$'s are i.i.d., this implies

$$X_n \stackrel{\text{law}}{=} X'_n := (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(0).$$

In light of this, it suffices to prove that $(X'_n)_{n\geq 0}$ converges in distribution. Given the assumption, we actually prove that $(X'_n)_{n\geq 0}$ converges almost surely. Indeed, note that

$$|f_s(y) - f_s(x)| \leq (1-\beta)|y - x|$$

uniformly in $s, x, y \in \mathbb{R}$. Applying this to

$$ X'_{n+1} - X'_n = (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(f_{S_n}(0)) - (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(0), $$

we get

$$ \sum_{n=0}^{\infty} \left| X'_{n+1} - X'_n \right| \leq \sum_{n=0}^{\infty} (1-\beta)^n \left|f_{S_n}(0) - 0\right| \leq \sum_{n=0}^{\infty} (1-\beta)^n \alpha \left|S_n\right|. $$

Together with the assumption, the desired conclusion follows. $\square$
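To make the forward/backward contrast concrete, here is a minimal Python sketch (standard normal $S_n$ and the values of $\alpha, \beta$ are my own illustrative choices): along one fixed sample path, the backward iterates $X'_n$ settle down while the forward iterates $X_n$ keep fluctuating.

```python
# Forward vs. backward iteration along one fixed sample path (S_0, ..., S_{N-1}).
import numpy as np

rng = np.random.default_rng(1)
alpha, beta, N = 0.5, 0.1, 400
S = rng.normal(size=N)  # one realization of the i.i.d. sequence

def f(s, x):
    # f_s(x) as defined above: a contraction with slope 1-alpha or 1-beta
    return alpha * s + (1 - alpha) * x if s >= x else beta * s + (1 - beta) * x

def forward(n):   # X_n  = f_{S_{n-1}} o ... o f_{S_0}(0): keeps fluctuating
    x = 0.0
    for k in range(n):
        x = f(S[k], x)
    return x

def backward(n):  # X'_n = f_{S_0} o ... o f_{S_{n-1}}(0): converges a.s.
    y = 0.0
    for k in reversed(range(n)):  # innermost factor f_{S_{n-1}} applied first
        y = f(S[k], y)
    return y

for n in (100, 200, 300, 400):
    print(n, forward(n), backward(n))  # the backward column stabilizes
```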

Remarks.

  1. The assumption of the proposition is rather arbitrary but still moderately general. For instance, it is satisfied whenever $S_0$ is integrable.

  2. Although the original problem is formulated with the initial condition $X_0 = 0$, this is not important. Indeed, the recurrence relation shows that $(X_n)$ forgets its initial condition at least exponentially fast, hence the question of the existence of a limiting distribution does not depend on $X_0$.

    To see this, let $X_n(\xi) = (f_{S_{n-1}} \circ \cdots \circ f_{S_0})(\xi)$ denote the sequence given by OP's recurrence relation with the (possibly random) initial condition $X_0 = \xi$. Then

    $$ |X_n(\xi_2) - X_n(\xi_1)| \leq (1-\beta)^n|\xi_2 - \xi_1|. $$


2. Assume that $(X_n)$ converges in distribution to a random variable $X$. Let $S$ have the same distribution as $S_0$ and be independent of $X$. Then the following are easy consequences.

  • If $S$ is supported on an interval $I$, then so is $X$.
  • If $S$ is integrable, then so is $X$. More precisely, we have $\mathbb{E}[|X|] \leq \frac{\alpha}{\beta}\mathbb{E}[|S|]$. Also,

    $$ \mathbb{E}[X] = \mathbb{E}[S] + \frac{\alpha-\beta}{\alpha+\beta}\mathbb{E}[|X-S|] $$

    In particular, if $S$ is non-degenerate, then we have $\mathbb{E}[X] > \mathbb{E}[S]$.
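These identities are also easy to probe numerically. A rough Monte Carlo sketch (with $S \sim N(1,1)$ and long-run samples of $X_n$ standing in for the limit, both my own illustrative choices):

```python
# Monte Carlo check of E[X] = E[S] + (alpha-beta)/(alpha+beta) * E|X - S|,
# approximating the limit X by X_n for large n.
import numpy as np

rng = np.random.default_rng(2)
alpha, beta = 0.5, 0.1
n_paths, n_steps = 50_000, 500

X = np.zeros(n_paths)
for _ in range(n_steps):
    S = rng.normal(loc=1.0, size=n_paths)
    X += np.where(S > X, alpha, beta) * (S - X)

S = rng.normal(loc=1.0, size=n_paths)  # fresh copy of S, independent of X
lhs = X.mean()
rhs = S.mean() + (alpha - beta) / (alpha + beta) * np.abs(X - S).mean()
print(lhs, rhs)  # agree up to Monte Carlo error; note lhs > E[S] = 1
```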

  • Were you applying the inequality for $\vert f_s(y) - f_s(x) \vert$ to $\left\vert f_{S_{n+1}}(X_{n+1}) - f_{S_n}(X_n) \right\vert$? Commented Nov 24, 2019 at 17:32
  • @XiaohaiZhang, I updated my answer to clarify that step. Commented Nov 24, 2019 at 20:15
  • I am not sure about your reversing of the order of the $X_n$'s. But I don't think $\sum \vert X_{n+1} - X_{n} \vert$ would be finite. Take, for example, a simple random walk for $S_n$. Then the sequence of values of $X_n$ will be pulled to walk between +1 and -1. If the $X_n$'s stay near -1 for an extended period of time/steps, and then comes $S_n=+1$, you will have a sizable up move in the value of $X_n$, much larger than the previous steps. The increments never diminish, and that is why I don't think it converges a.s. or in probability. Commented Nov 24, 2019 at 21:11
  • @XiaohaiZhang, Reversing the order only gives the 'distributional identity', i.e., $$ X_n=(f_{S_{n-1}} \circ \cdots \circ f_{S_{0}} )(0) \stackrel{\text{law}}= (f_{S_{0}} \circ \cdots \circ f_{S_{n-1}} )(0)=X'_n. $$ Although this reversal changes the joint distribution, i.e., $(X_n)$ does not have the same joint distribution as $(X'_n)$, this does not affect the convergence in distribution, which only depends on the marginal distributions of the $X_n$'s. Commented Nov 24, 2019 at 21:47
  • @XiaohaiZhang, Also, this is not the same as a simple random walk. Working with the case $\alpha=\beta$, we have $$X_n = \alpha \sum_{k=0}^{n-1} (1-\alpha)^{k} S_{n-1-k}. $$ Note the geometric weights. And although $(X_n)_{n\geq 0}$ converges neither a.s. nor in probability, the reversals $(X'_n)_{n\geq 0}$ given by $$X'_n = \alpha \sum_{k=0}^{n-1} (1-\alpha)^{k} S_{k} $$ (which has the same distribution as $X_n$ for each given $n$) converge a.s. under quite mild conditions on the $S_n$'s. Commented Nov 24, 2019 at 21:58

I will take my shot here. However, this post won't fully answer the question, and it lacks rigor in the later parts.

Let's first examine a simpler claim that will be a basis for later discussion.

Claim: Let the $S_n$ be i.i.d. with a symmetric distribution whose characteristic function takes non-negative values, and let $Z_{n+1}=Z_n + \gamma (S_n - Z_n)$ be a sequence of random variables with $Z_0 = 0$ and $0 < \gamma < 1$. Then $Z_n$ converges in distribution.

Proof: Let the characteristic function of $S_n$ be $\psi(t)$. Since $S_n$ is symmetric, $\psi(t)$ takes real values and $\vert \psi(t) \vert \le 1$. Denote the characteristic function of $Z_n$ by $\phi_n(t)$. Then $\phi_0(t) = 1 $, and $$ \phi_{n+1}(t)=\phi_n((1-\gamma)t)\cdot\psi(\gamma t). $$ Recursively applying the last equation leads to $$ \phi_n(t) = \prod_{k=0}^{n-1}\psi\left( \gamma{(1-\gamma)}^kt \right). \quad (1) $$ For each fixed $t \in \mathbb{R}$, the sequence $\phi_n(t)$ is non-increasing (each new factor lies in $[0,1]$) and bounded below by zero, so it converges pointwise to some limit $\phi(t)$. Once one checks that $\phi$ is continuous at $t=0$, Lévy's continuity theorem shows that $\phi$ is a characteristic function and that $Z_n$ converges in distribution.
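As a sanity check on equation (1), one can compare the product formula with an empirical characteristic function. A small Python sketch, taking $S_n \sim N(0,1)$ (whose characteristic function $e^{-t^2/2}$ is real and non-negative); the values of $\gamma$, $n$, $t$ are arbitrary:

```python
# Compare phi_n(t) = prod_{k=0}^{n-1} psi(gamma (1-gamma)^k t) with the
# empirical characteristic function of Z_n, for S_n ~ N(0,1), psi(t)=exp(-t^2/2).
import numpy as np

rng = np.random.default_rng(3)
gamma, n, t = 0.3, 60, 1.7

k = np.arange(n)
phi_product = np.exp(-0.5 * (gamma * (1 - gamma) ** k * t) ** 2).prod()

Z = np.zeros(200_000)                 # Z_0 = 0 on every path
for _ in range(n):
    Z = (1 - gamma) * Z + gamma * rng.normal(size=Z.shape)
phi_empirical = np.cos(t * Z).mean()  # symmetric law, so imaginary part vanishes

print(phi_product, phi_empirical)     # close up to Monte Carlo error
```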

Note: I think it is possible to weaken the requirements on the characteristic function values of $S_n$. But it will require more work.

Coming back to the original problem, let's first reformulate it into an equivalent format.

Reformulation: The original recursion is equivalent to $$ X_{n+1} = \max\{(1-\alpha)X_n + \alpha S_n, (1-\beta)X_n + \beta S_n\}. $$

Proof: Recall $0 < \beta < \alpha < 1$. The above is true because if $S_n > X_n$, then $\alpha(S_n - X_n) > \beta(S_n - X_n)$, and if $S_n \le X_n$, then $\alpha(S_n - X_n) \le \beta(S_n - X_n)$. The original recursion always takes the maximum of the two branches.
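A quick vectorized check of this equivalence on random inputs (purely illustrative):

```python
# The branch form and the max form of the recursion agree pointwise.
import numpy as np

rng = np.random.default_rng(4)
alpha, beta = 0.5, 0.1
x, s = rng.normal(size=1000), rng.normal(size=1000)

branch = x + np.where(s > x, alpha, beta) * (s - x)
maxform = np.maximum((1 - alpha) * x + alpha * s, (1 - beta) * x + beta * s)
assert np.allclose(branch, maxform)
```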

Unfortunately, things from here on become a bit fuzzy.

Conjecture 1: $X_n$ does NOT converge a.s. or in probability.

Justification: Both $(1-\alpha)X_n + \alpha S_n$ and $(1-\beta)X_n + \beta S_n$ can be viewed as moving averages between $X_n$ and $S_n$ with two different coefficients. Assuming $S_n$ is non-degenerate, the sequence $X_n$ can never settle at a limit point, since the next averaging step makes a non-diminishing move proportional to $\alpha$ or $\beta$. This is most likely true, since most Markov processes do not converge in probability.

Conjecture 2: $X_n$ does NOT converge in distribution.

Justification: Equation (1) shows that the limiting characteristic function, and hence the distribution as well, is affected by the parameter $\gamma$. So the recursion patterns $(1-\alpha)X_n + \alpha S_n$ and $(1-\beta)X_n + \beta S_n$ converge to two different distributions, since $\alpha \ne \beta$. Given that $S_n$ is non-degenerate, there is always a non-zero probability of following one limiting path and then suddenly switching recursion patterns, so the limiting distribution would oscillate between different distributions. I am less confident in this conclusion.

Finding the stationary distribution if there is one:

I actually think this might be tractable computationally and analytically for simple cases.

  • Determine the domain of the distribution. The domain is limited by the minimum and maximum values of $S_n$ (because the $X_n$ are moving averages of the $S_n$, starting from zero, with switching). This helps in computation. Even if the range of $S_n$ is not bounded, one can truncate it to obtain a reasonable approximation.
  • Establish a functional equation based on the simple fact that the distributions before and after the Markov transition are the same. Once the equation is established, analytical solutions can be sought for $S_n$ with simple distributions, and an iterative algorithm can be used to compute the stationary distribution for $S_n$ with more complicated distributions; a sketch of such an iteration follows below.
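Here is a minimal Python sketch of such a fixed-point iteration, with $S$ truncated to $[-L, L]$ and everything discretized on a uniform grid; $S \sim N(0,1)$ and all parameters are my own illustrative assumptions:

```python
# Push a discretized law of X through the transition X' = f_S(X) until it
# stops moving; S is truncated to [-L, L] (an approximation, as noted above).
import numpy as np
from scipy.stats import norm

alpha, beta, L, m = 0.5, 0.1, 6.0, 601
grid = np.linspace(-L, L, m)
s_weights = norm.pdf(grid)
s_weights /= s_weights.sum()       # discretized, truncated law of S ~ N(0,1)

p = np.zeros(m)
p[m // 2] = 1.0                    # start from X_0 = 0

for _ in range(500):
    new_p = np.zeros(m)
    for j, s in enumerate(grid):   # mix over the law of S
        coef = np.where(s >= grid, alpha, beta)
        image = coef * s + (1 - coef) * grid              # f_s at each grid x
        idx = np.round((image + L) / (2 * L) * (m - 1)).astype(int)
        np.add.at(new_p, np.clip(idx, 0, m - 1), s_weights[j] * p)
    if np.abs(new_p - p).sum() < 1e-12:
        break                      # the pushed-forward law has stopped moving
    p = new_p

print((grid * p).sum(), grid[p.argmax()])  # mean and mode of the fixed point
```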
  • I can't get through the second sentence. Are you saying the proof won't lack rigor in later parts? Commented Nov 23, 2019 at 22:10
  • @mathworker21 Clarified. Thanks. Commented Nov 23, 2019 at 22:28
