Date:      Thu, 10 Jan 2013 23:32:32 -0800
From:      Kevin Oberman <kob6558@gmail.com>
To:        freebsd-questions@freebsd.org
Subject:   Process hangs on large pipe writes
Message-ID:  <CAN6yY1tOLtQ8zjD56JE+-qqcg2SmrbngocEdqGK+sQaNDKEiKw@mail.gmail.com>

I have run into a problem after upgrading a system from 7.2 to 8.3
with a script that has worked correctly for about a decade. The
script, written in Perl, starts about 25 processes to gather
information from a number of other systems. I create an array of
filehandles, one per remote system, and then open each with an ssh
command:
$pid = open $fh, "ssh -xa $_ <command> 2>&1 |";
I then set up a 30-second watchdog via SIGALRM that kills any
process that has not completed and exited, and then wait for each
process with waitpid. (The processes normally all complete in about
2 seconds.)
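The fan-out described above — one merged stdout/stderr pipe per remote
host, an alarm-driven watchdog, then a wait on each child — can be
sketched as follows. The sketch is in Python rather than Perl, and the
`gather` function and its argument lists are illustrative, not the
original script:

```python
import signal
import subprocess

def gather(commands, timeout=30):
    """Run each command with stdout and stderr merged into one pipe
    (the Python equivalent of the 2>&1 in the original open() call),
    kill any straggler after `timeout` seconds, then reap each child."""
    procs = [subprocess.Popen(argv,
                              stdout=subprocess.PIPE,
                              stderr=subprocess.STDOUT)
             for argv in commands]

    # Watchdog: on SIGALRM, kill every child that is still running.
    def watchdog(signum, frame):
        for p in procs:
            if p.poll() is None:
                p.kill()
    signal.signal(signal.SIGALRM, watchdog)
    signal.alarm(timeout)

    results = []
    for p in procs:
        # Danger discussed below: waiting *before* reading means the
        # child can block forever on a full pipe if it writes a lot.
        p.wait()
        results.append(p.stdout.read())
    signal.alarm(0)
    return results

# In the original script each argv would be something like
# ["ssh", "-xa", host, "<command>"].
```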

Each ssh command returns between about 14 and 32 KB of ASCII data,
written a line at a time with lines of no more than 80 characters. On
each run after the upgrade, between 8 and 15 of the processes hang,
and only a portion of the data is available when I read the
filehandle.

Anyone have any idea what could have changed between FreeBSD 7.2 and
8.3? It looks like a bug or a new limitation on pipe operations. I
don't know much about Perl internals, but I assume that no read is
done on the pipe until I attempt to read the filehandle, so I believe
it is some limit on writes to the pipe with no reads done.
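That suspicion matches how pipes work: a pipe buffers only a fixed
amount of data in the kernel (the exact capacity varies by OS and
version), and once the buffer is full a writer blocks until someone
reads. The limit can be observed directly with a non-blocking write
loop — a small sketch, not tied to the script above:

```python
import fcntl
import os

# Create a pipe and make the write end non-blocking, so a full
# buffer raises BlockingIOError instead of hanging the process.
r, w = os.pipe()
flags = fcntl.fcntl(w, fcntl.F_GETFL)
fcntl.fcntl(w, fcntl.F_SETFL, flags | os.O_NONBLOCK)

capacity = 0
chunk = b"x" * 1024
try:
    while True:
        # os.write returns the number of bytes actually accepted.
        capacity += os.write(w, chunk)
except BlockingIOError:
    # The buffer is full; a *blocking* writer would hang right here,
    # which is exactly what the unread ssh children are doing.
    pass

os.close(r)
os.close(w)
print(capacity)
```

Since each ssh child produces 14-32 KB, whether it hangs depends on
whether its output fits in the kernel's pipe buffer — so a change in
that buffer's size between releases would produce exactly the observed
mix of hung and completed children.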

I have modified the script to write to files instead of a pipe and it
is working again, but I would like to understand what broke my script.
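Writing to files sidesteps the limit because a regular file has no
fixed capacity. The pipe-based design can also be kept by draining
every pipe while the children run (in Perl, IO::Select over the
filehandles), so no writer ever fills its buffer. A Python sketch of
that idea, with a hypothetical `gather` helper:

```python
import selectors
import subprocess

def gather(commands):
    """Read from every child's pipe as data arrives, so no writer
    blocks on a full buffer, then reap the children."""
    sel = selectors.DefaultSelector()
    procs = []
    bufs = {}
    for argv in commands:
        p = subprocess.Popen(argv,
                             stdout=subprocess.PIPE,
                             stderr=subprocess.STDOUT)
        sel.register(p.stdout, selectors.EVENT_READ, p)
        procs.append(p)
        bufs[p] = b""

    # Loop until every pipe has hit EOF and been unregistered.
    while sel.get_map():
        for key, _ in sel.select():
            data = key.fileobj.read1(65536)
            if data:
                bufs[key.data] += data
            else:
                # EOF: the child closed its end of the pipe.
                sel.unregister(key.fileobj)

    for p in procs:
        p.wait()
    return [bufs[p] for p in procs]
```

Because the pipes are drained continuously, a child producing far more
than one pipe buffer's worth of output completes instead of hanging.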
-- 
R. Kevin Oberman, Network Engineer
E-mail: kob6558@gmail.com
