Date:      Thu, 06 Aug 2009 19:22:09 -0700
From:      Chris Stankevitz <cstankevitz@toyon.com>
To:        freebsd-questions@freebsd.org
Subject:   Strange timing when reading from the serial port
Message-ID:  <4A7B8FD1.9040003@toyon.com>

Hello,

I have a device that sends one byte over the serial line every 10ms.

Using C, I wrote an application that opens the serial port and reads 
bytes in an infinite loop.  I disabled all blocking (O_NONBLOCK, VMIN=0, 
VTIME=0) and set the line to B115200.  My CPU spends ~100% of its time 
calling read() [which almost always returns 0].
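
In case it helps, the port setup looks roughly like this (a simplified 
sketch; the device path and error handling are illustrative, not my 
exact code):

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int open_port(const char *path)      /* e.g. "/dev/cuad0" */
{
    int fd = open(path, O_RDWR | O_NOCTTY | O_NONBLOCK);
    if (fd < 0)
        return -1;

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                  /* raw mode, no line discipline */
    cfsetspeed(&tio, B115200);        /* 115200 baud                  */
    tio.c_cc[VMIN]  = 0;              /* read() returns at once ...   */
    tio.c_cc[VTIME] = 0;              /* ... even with no data queued */
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}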

I record the time each byte shows up using gettimeofday().  By 
differencing the timestamps of successive bytes, I compute the time it 
took each byte to arrive.  Since the bytes are transmitted at 100Hz, I 
expect to find that delta_time is 10ms.
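
The timing loop is essentially this (again a sketch, using the fd from 
the setup above):

#include <stdio.h>
#include <sys/time.h>
#include <unistd.h>

void poll_loop(int fd)
{
    struct timeval prev = { 0, 0 };
    unsigned char byte;

    for (;;) {
        if (read(fd, &byte, 1) == 1) {       /* usually returns 0 */
            struct timeval now, delta;
            gettimeofday(&now, NULL);
            if (prev.tv_sec != 0) {
                timersub(&now, &prev, &delta);   /* delta_time */
                printf("%ld.%06ld\n", (long)delta.tv_sec,
                       (long)delta.tv_usec);
            }
            prev = now;
        }
    }
}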

For several seconds I get good results: delta_time = 10ms with a noise 
of ~50us.

Then performance deteriorates: I still get 10ms with a noise of ~50us, 
but with an added bias that cycles through 0ms, +5ms, 0ms, -5ms.

Then results go back to good.

See graphs of this here (y axis is delta_timeval, x axis is time in sec):

http://img218.imageshack.us/img218/4944/plot1t.gif
http://img12.imageshack.us/img12/9693/plot2.gif
http://img10.imageshack.us/img10/5995/plot3.gif

Q: What is the source of the alternating +/- 5ms bias that comes and 
goes every few seconds?

Possible answers:

1. My external device is sending the bytes strangely (I don't believe 
this, but I can use an oscilloscope to confirm).

2. read() doesn't return within 1ms of the data arriving at the serial 
port.

3. gettimeofday() does not return a time good to 1ms (I can test this 
with the granularity check sketched below).

4. none of the above
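
To rule out #3, I plan to measure the smallest tick gettimeofday() 
reports with a spin loop like this (a quick sketch):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval a, b;
    long d, min_us = 1000000;
    int i;

    for (i = 0; i < 100000; i++) {
        gettimeofday(&a, NULL);
        do {                          /* spin until the clock advances */
            gettimeofday(&b, NULL);
        } while (a.tv_sec == b.tv_sec && a.tv_usec == b.tv_usec);
        d = (b.tv_sec - a.tv_sec) * 1000000 + (b.tv_usec - a.tv_usec);
        if (d < min_us)
            min_us = d;
    }
    printf("smallest observed tick: %ld us\n", min_us);
    return 0;
}

If the smallest tick is well under 1ms, the timestamps themselves are 
probably not the problem.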

Thank you for your help!

Chris

PS: I am using FreeBSD 7.2-RELEASE


