
I am using tc to test the behaviour of a networked app under various network conditions. The setup is like this:

    # Create an ingress qdisc on MAIN_LINK if one does not already exist
    if [ -z "$(tc qdisc show dev ${MAIN_LINK} ingress)" ]
    then
        sudo tc qdisc add dev ${MAIN_LINK} handle ffff: ingress
    fi

    # Redirect incoming UDP packets for port 20780 to BRIDGE
    sudo tc filter del dev ${MAIN_LINK} ingress
    sudo tc filter add dev ${MAIN_LINK} parent ffff: protocol ip u32 match ip dport 20780 0xffff match ip protocol 17 0xff action mirred egress redirect dev ${BRIDGE}

    # Classify outgoing UDP packets for port 20780 into prio band 1:1
    sudo tc qdisc add dev ${MAIN_LINK} root handle 1: prio
    sudo tc filter add dev ${MAIN_LINK} parent 1: protocol ip prio 1 u32 match ip dport 20780 0xffff match ip protocol 17 0xff flowid 1:1
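
As a sanity check (not part of the setup itself), the resulting qdiscs and filters can be read back:

    # Confirm the ingress qdisc, the redirect filter, and the prio band filter
    tc qdisc show dev ${MAIN_LINK}
    tc filter show dev ${MAIN_LINK} ingress
    tc filter show dev ${MAIN_LINK} parent 1: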

If I add packet loss, using a loop to ramp it up bit by bit like this, then it works:

    tc qdisc add dev "${MAIN_LINK}" parent 1:1 netem loss random ${LEVEL}%
    tc qdisc add dev "${BRIDGE}" root handle 1: netem loss random ${LEVEL}%

If I add packet delay, again ramping it up bit by bit like this, then it has no effect that I can see at all:

    tc qdisc add dev "${MAIN_LINK}" parent 1:1 netem delay ${DELAY} ${JITTER} distribution normal
    tc qdisc add dev "${BRIDGE}" root netem delay ${DELAY} ${JITTER} distribution normal

Values of DELAY went as high as 1870 and JITTER as high as 1530 (i.e. 340 to 3400 ms of delay), and there was no apparent effect at all.
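
For reference, netem's time parameters accept explicit unit suffixes (s, ms, us), and a bare number is not necessarily interpreted as milliseconds; a sketch of the same two commands with the intended units spelled out (assuming milliseconds were meant):

    # Same delay commands, with explicit millisecond suffixes on delay and jitter
    tc qdisc add dev "${MAIN_LINK}" parent 1:1 netem delay ${DELAY}ms ${JITTER}ms distribution normal
    tc qdisc add dev "${BRIDGE}" root netem delay ${DELAY}ms ${JITTER}ms distribution normal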

How do I get packet delay to work? Why does packet loss work but packet delay does not?

  • Please describe the methods you're using to test the results of the packet jitter and packet delay. The TCP protocol was designed (and additional algorithms were added to the original design) to adapt to impairments like latency (delay) and jitter, allowing packet streams to, e.g., expand and contract the limit on the number of not-yet-acknowledged packets in transit between the sender and receiver. If you're using very basic throughput measurements of large file transfers, then the protocol's adaptive features can make the latency and jitter invisible to your measurements. Commented Jul 10 at 17:34
  • I'm using UDP, not TCP. Commented Jul 14 at 6:23
  • I am visually inspecting a display that shows data changing in real time. I set the system up so that values increase or decrease linearly and are graphed. I expect to see flat spots followed by a steeper slope where a single packet is delayed by a certain amount, and spikes where a previous packet is delayed by more than a subsequent packet. With a maximum delay of over 3 seconds, these effects should be very clear (see the capture sketch after these comments for an objective cross-check). Commented Jul 14 at 6:26
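
As an objective complement to the visual inspection described in the comments, timestamps from packet captures on the two interfaces can be compared (a minimal sketch using the same port and interfaces as above; with a netem delay active, the BRIDGE timestamps should lag behind MAIN_LINK's):

    # Print per-packet epoch timestamps on each interface for the test traffic
    sudo tcpdump -tt -n -i "${MAIN_LINK}" udp port 20780 &
    sudo tcpdump -tt -n -i "${BRIDGE}" udp port 20780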
