Date:      Wed, 10 Feb 2010 09:35:08 -0800
From:      Kurt Buff <kurt.buff@gmail.com>
To:        freebsd-questions@freebsd.org
Subject:   Re: curl question - not exactly on-topic
Message-ID:  <a9f4a3861002100935w32b21832v6cf26e517a64a885@mail.gmail.com>
In-Reply-To: <20100210050518.GA64193@dan.emsphone.com>
References:  <a9f4a3861002091721h6b38e3beu5e55f0bbf4bff9e5@mail.gmail.com> <20100210050518.GA64193@dan.emsphone.com>

On Tue, Feb 9, 2010 at 21:05, Dan Nelson <dnelson@allantgroup.com> wrote:
> In the last episode (Feb 09), Kurt Buff said:
>> Actually, it's not merely a curl question, it's a "curl and squid"
>> question.
>>
>> I'm trying to determine the cause of a major slowdown in web browsing on
>> our network, so I've put curl on the squid box, and am using the following
>> incantations to see if I can determine the cause of the slowdown:
>>
>>   curl -s -w "%{time_total}\n" "%{time_namelookup}\n" -o /dev/null http://www.example.com
>>
>> and
>>
>>   curl -s -w "%{time_total}\n" "%{time_namelookup}\n" -o /dev/null -x 192.168.1.72 http://www.example.com
>>
>> The problem arises with the second version, which uses the proxy. The
>> first incantation just returns the times, which is exactly what I want.
>>
>> However, when I use the -x parameter, to use the proxy, I get html
>> returned as well as the times, which is a pain to separate out.
>
> Your problem is what's after -w.  You want one argument:
> "%{time_total}\n%{time_namelookup}\n", not two.  With your original command,
> "%{time_namelookup}\n" is treated as another URL to fetch.  With no proxy
> option, curl realizes it's not a URL immediately and skips to the next
> argument on the commandline - http://www.example.com.  With a proxy, curl
> has to send each URL to the proxy for processing.  The proxy probably
> returns a "400 Bad Request" error on the first (invalid) URL, which is
> redirected to /dev/null.  The next URL doesn't have another -o, so it falls
> back to printing to stdout.
>
> Adding -v to the curl commandline will help you diagnose problems like this.

Thanks for that, though it's unfortunate.

I would really like a better understanding of the times, to help
further diagnose the problem, and 'man curl' says that multiple
invocations of '-w' will result in the last one winning, which I've
verified.

Do you have any suggestions for a way to get the timing of these
operations without resorting to tcpdump?

Kurt
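[Archive note on the timing question: curl's -w format documents more phase timers than just time_namelookup and time_total - time_connect and time_starttransfer are also listed in the curl man page. Since each timer is cumulative from the start of the transfer, the deltas between them can localize a slowdown to DNS, TCP connect, or the proxy/server itself without resorting to tcpdump. A sketch, again echoing rather than fetching, with the proxy address taken from the thread:]

```shell
#!/bin/sh
# Label each cumulative phase timer; differences between successive
# lines show where the time is actually being spent.
fmt='lookup:    %{time_namelookup}\nconnect:   %{time_connect}\nfirstbyte: %{time_starttransfer}\ntotal:     %{time_total}\n'

# echo prints the command; remove "echo" to run it for real.
echo curl -s -w "$fmt" -o /dev/null -x 192.168.1.72 http://www.example.com
```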


