
I’m studying basic queueing theory, in particular a single-server queue fed by one arrival stream (a G/G/1-type setup).

Let

  • $A$ be the interarrival time, with mean $\mathbb{E}(A)$,
  • $B$ be the service time, with mean $\mathbb{E}(B)$.

The usual “traffic intensity” is $\rho = \lambda\,\mathbb{E}(B) = \mathbb{E}(B)/\mathbb{E}(A)$, where $\lambda = 1/\mathbb{E}(A)$ is the arrival rate. Textbooks say that the system is stable if $\rho < 1$.
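For a concrete instance of the definition (the numbers here are mine, purely for illustration): if customers arrive on average every $\mathbb{E}(A) = 10$ seconds and service takes $\mathbb{E}(B) = 8$ seconds on average, then $\lambda = 1/10$ per second and $\rho = \lambda\,\mathbb{E}(B) = 8/10 = 0.8 < 1$.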

By “stable” I mean that the queue-length (or workload) process has a proper steady-state (stationary) distribution and does not drift to infinity.

I understand intuitively why this inequality should hold: on average, the amount of work arriving per unit time, $\lambda\,\mathbb{E}(B)$, should be strictly smaller than the server’s processing capacity of one unit of work per unit time; otherwise the queue should grow.
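To make this quantitative, I suspect the right object is the waiting-time (Lindley) recursion: writing $B_n$ for the service time of customer $n$ and $A_{n+1}$ for the interarrival time between customers $n$ and $n+1$,

$$W_{n+1} = \max\big(W_n + B_n - A_{n+1},\, 0\big),$$

and by the strong law of large numbers the increments $B_n - A_{n+1}$ have mean $\mathbb{E}(B) - \mathbb{E}(A)$, which is negative exactly when $\rho < 1$. I do not see how to pass from this negative drift to an actual stability statement.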

What I’m looking for is a more quantitative / rigorous argument that makes this precise. For example:

  • How can one prove rigorously (under reasonable assumptions, such as i.i.d. interarrival and service times) that if $\rho < 1$ the queue-length process is stable (e.g. positive recurrent, with a stationary distribution)?
  • Conversely, how can one show that if $\rho \ge 1$ the queue length grows without bound (e.g. diverges to infinity with probability 1, or in some appropriate sense)? (A small numerical illustration of both regimes is sketched below.)
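
As a numerical sanity check (not a proof), here is a minimal simulation of the Lindley recursion above; the exponential distributions and the specific rates are my own arbitrary choices, used only to illustrate the dichotomy:

```python
import random

def avg_late_wait(rho, n=200_000, seed=0):
    """Simulate the G/G/1 waiting time via Lindley's recursion
    W_{k+1} = max(W_k + B_k - A_{k+1}, 0).
    Illustration only: exponential interarrivals with rate 1 (so lambda = 1)
    and exponential services with mean rho (so rho = lambda * E(B))."""
    rng = random.Random(seed)
    w = 0.0
    total = 0.0
    for k in range(n):
        b = rng.expovariate(1.0 / rho)  # service time B_k, mean rho
        a = rng.expovariate(1.0)        # interarrival time A_{k+1}, mean 1
        w = max(w + b - a, 0.0)
        if k >= n // 2:                 # average over the second half only
            total += w
    return total / (n - n // 2)

for rho in (0.5, 0.9, 1.0, 1.1):
    print(f"rho = {rho}: late-sample average wait = {avg_late_wait(rho):.2f}")
```

For $\rho < 1$ the late-sample average appears to settle down to a finite value, while for $\rho \ge 1$ it keeps growing as $n$ increases; it is exactly this dichotomy that I would like to see established rigorously.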
  • Do you understand why this is true for M/M/1 queues? (Commented Nov 23 at 16:19)
  • I intuitively understand that the underlying Markov chain would “explode to infinity” otherwise, but cannot find a rigorous argument that shows the divergence. (Commented Nov 25 at 18:18)
  • I propose that you first try the M/M/1 case before taking on the general case. (Commented Nov 25 at 21:29)
