Date:      Thu, 9 Aug 2007 18:51:13 +0200
From:      Bram Schoenmakers <bramschoenmakers@xs4all.nl>
To:        freebsd-questions@freebsd.org
Cc:        Alex Zbyslaw <xfb52@dial.pipex.com>, Nikos Vassiliadis <nvass@teledomenet.gr>
Subject:   Re: Problem with dump over SSH: Operation timed out
Message-ID:  <200708091851.14649.bramschoenmakers@xs4all.nl>
In-Reply-To: <46BB2730.8090702@dial.pipex.com>
References:  <200708091025.43912.bramschoenmakers@xs4all.nl> <200708091704.31952.nvass@teledomenet.gr> <46BB2730.8090702@dial.pipex.com>

On Thursday 09 August 2007, Alex Zbyslaw wrote:

Hello,

> Bram Schoenmakers wrote:
> ># /sbin/dump -0uan -L -h 0 -f - / | /usr/bin/bzip2 | \
> >    /usr/bin/ssh backup@office.example.com \
> >        dd of=/backup/webserver/root.0.bz2
>
> bzip2 is darned slow and not always much better than gzip -9.  It might
> be that ssh is just timing out in some way (I've seen that but not with
> ethernet dumps specifically).  Can you try the test using gzip -9
> instead of bzip?  If that works, then look for ssh options that affect
> timeouts, keepalives etc.  In particular, ServerAliveInterval 60 in a
> .ssh/config stopped xterm windows dying on me to certain hosts.  YMMV :-(
>
> If you have the disk space then you could try without any compression at
> all; or try doing the compression remotely:
>
>            /sbin/dump -0 -a -C 64 -L -h 0 -f - / | \
>                 /usr/local/bin/ssh backup@office.example.com \
>                 "gzip -9 > /backup/webserver/root.0.gz"
>
> Otherwise:
>
> Nikos Vassiliadis wrote:
> >1) Can you dump the file locally?
> >
> >2) Is scp working?
>
> If you can write (and compress if short of disk space) the dump locally and
> try an scp to your remote host as Nikos is suggesting, that will narrow
> down the problem a bit.  Any other large file will do: doesn't have to be a
> dump.
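
(For reference, the keepalive settings Alex mentions can be put in ~/.ssh/config roughly like this; "office.example.com" is just the placeholder host name used elsewhere in this thread:

	Host office.example.com
	    # Send an application-level keepalive every 60 seconds
	    ServerAliveInterval 60
	    # Give up after 3 unanswered keepalives (about 3 minutes)
	    ServerAliveCountMax 3

ServerAliveCountMax is optional; the OpenSSH default is 3.)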

As I wrote in my initial mail:

======
* Downloading the very same big file over SCP causes problems too; some SCP 
debug output is below. The connection drops shortly after it has reached a 
reasonable download speed.

        Read from remote host office.example.com: Connection reset by peer
        debug1: Transferred: stdin 0, stdout 0, stderr 77 bytes in 103.3 seconds
        debug1: Bytes per second: stdin 0.0, stdout 0.0, stderr 0.7
        debug1: Exit status -1
        lost connection
======

That was just a file generated with 'dd if=/dev/zero of=zeroes bs=1024k 
count=200'. So no, SCP doesn't work.
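
(In case anyone wants to reproduce the test: the same keepalive options can 
also be passed on the scp command line without touching any config file. This 
is only a diagnostic sketch, not a fix; the host and path are the ones from 
my test above:

	# Retry the 200 MB test file with keepalives and verbose output;
	# -o takes the same options as ssh_config.
	scp -v -o ServerAliveInterval=60 -o ServerAliveCountMax=3 \
	    backup@office.example.com:zeroes .

The -v output should at least show where in the transfer the reset happens.)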

I haven't tried gzip -9 yet, although that looks more like a workaround than 
a solution to the real problem.

Kind regards,

-- 
Bram Schoenmakers

You can contact me directly on Jabber with bram@kde.org
