Date:      Tue, 06 May 2008 15:03:43 -0400
From:      Matthew Pope <>
Subject:   dummynet queue size relative to bw setting?
Message-ID:  <>

I've been reading about dummynet for 2 weeks, including the seminal ACM 
paper & I'm very impressed.  I've configured and run some preliminary 
simulations that have my colleagues quite interested too.

However, I'm finding that my delay settings yield delays about two orders of 
magnitude larger than requested.  I suspect I don't understand the 
relationship between the queue size setting and the bandwidth setting (and 
plr?).  Can someone explain this or point me to a source?

I recall reading that at lower bandwidths one should use smaller queue 
sizes to avoid long queuing delays, so I presume that is why my delays 
are so long.  I've run some tests with various queue sizes.
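To make sure I'm reasoning about this correctly, here is the back-of-the-envelope model I've been using for the worst-case queuing delay (the time to drain a full queue at the pipe's rate); the 1500-byte MTU is my assumption, and small ping packets would queue proportionally less:

```python
# Worst-case queuing delay for a dummynet pipe: bits sitting in a full
# queue divided by the link rate.  Assumes the queue fills with
# full-size 1500-byte packets (an assumption on my part).

def max_queue_delay_ms(queue_slots, bw_bits_per_s, packet_bytes=1500):
    """Milliseconds needed to drain a completely full queue."""
    queued_bits = queue_slots * packet_bytes * 8
    return queued_bits * 1000.0 / bw_bits_per_s

# A 50-slot queue on a 48 Kbit/s pipe can hold 50 * 12000 bits,
# i.e. up to 12.5 seconds of backlog:
print(max_queue_delay_ms(50, 48_000))  # 12500.0 ms
```

If that model is right, it would explain the advice about keeping queues small on slow pipes, but please correct me if I've got it wrong.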

With queue sizes of 100, 80, 60, 40, and 10 slots on a pipe with a bw of 
48Kbit/s, a delay of 5ms, and plr 0.025 defined in each direction, I 
consistently get round-trip delays of 500-600ms with a ping test; packet 
loss does come out around 5%.  The target I'm pinging normally has about 
40 ms latency.  At queue size 120 I get 100% packet loss (but I can ignore
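For reference, here is the rough RTT arithmetic I've been comparing against; the 84-byte packet size (56 data bytes plus ICMP and IP headers, ping's default) and the additive model are my assumptions:

```python
# Rough expected RTT for a ping through the pipe, assuming empty queues:
# baseline path RTT plus, in each direction, the pipe's configured delay
# and the time to serialize the packet at the pipe's bandwidth.

def expected_rtt_ms(pkt_bytes, bw_bits_per_s, pipe_delay_ms, path_rtt_ms):
    serialization_ms = pkt_bytes * 8 * 1000.0 / bw_bits_per_s  # per direction
    return path_rtt_ms + 2 * (serialization_ms + pipe_delay_ms)

# 84-byte ping, 48 Kbit/s pipe, 5 ms delay each way, 40 ms baseline path:
print(expected_rtt_ms(84, 48_000, 5, 40))  # 78.0 ms
```

By that estimate I'd expect roughly 80 ms with empty queues, so the 500-600 ms I'm observing suggests packets are sitting in the queue (or that my model is missing something).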

I am not a networking specialist, so I realize my question is ignorant 
:-) I'm running this in VMware Server on a dual-core 2 GHz machine with 
2 GB RAM, using a modification of the dummynet test network design described at
