Date: Tue, 02 May 2006 13:50:42 -0500
From: Chris Dillon
To: Kris Kennaway
Cc: stable@freebsd.org
Subject: Re: quota deadlock on 6.1-RC1
Message-ID: <20060502135042.d4owhc54eg4g8koc@www.wolves.k12.mo.us>
In-Reply-To: <20060502175859.GB91405@xor.obsecurity.org>
References: <20060502171853.GG753@dimma.mow.oilspace.com> <20060502172225.GA90840@xor.obsecurity.org> <20060502174429.GH753@dimma.mow.oilspace.com> <20060502175859.GB91405@xor.obsecurity.org>
User-Agent: Internet Messaging Program (IMP) H3 (4.1) / FreeBSD-6.1

Quoting Kris Kennaway:

> On February 21 -- that is over 2 months ago -- I sent email to this
> list containing a fix for the quota deadlocks that were known at the
> time. I got minimal response from users, but it was uniformly
> positive. The fix was committed, and the status of the "quota
> deadlocks" item on the 6.1-RELEASE todo list was changed from "must
> fix" to "believed fixed, please test".
>
> The next I heard about the problem was about a week ago when someone
> reported another deadlock and several others chimed in with "oh yeah,
> it still deadlocks for me too".
>
> Well, sorry folks, you should have told me in February. Or if you
> only found out about the problem a week ago, you need to recognize
> that problems raised at the last minute cannot always be fixed
> instantly.

I was one of those others who said "me too". :-)

Although I subscribe to every FreeBSD mailing list, I usually just
glance over all of the subject lines until something catches my eye.
So, unfortunately, I apparently missed that whole bit, and it wasn't
until a particular subject caught my eye recently that I thought it
might be addressing the same problem I had.
I hadn't mentioned the problem to the lists before because I had zero
diagnostic information about it, and it was a production box that I
couldn't fool around with too much, so I had found a workaround (daily
reboot) a long time ago and didn't think much more about it. Although
I recently compiled the kernel with various debug options (WITNESS,
DDB, etc.), it takes days for the hang to recur (without the daily
reboots), and when it hung again a couple of nights ago, I completely
forgot about trying to break into the debugger and rebooted the box
anyway. *slaps forehead* And of course it hasn't happened again, yet.
Maybe next time.

I'll be happy when we figure out what the problem is and find a fix
for it; it doesn't matter to me whether or not it makes it into the
6.1 release.
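For anyone else trying to catch one of these, the debug options I added
to my kernel config are roughly the set below. I'm writing this from
memory rather than pasting my actual config, so treat it as a sketch of
the usual FreeBSD 6.x deadlock-debugging options rather than an exact
copy of what I'm running:

    # Kernel debugger framework plus the DDB backend, so I can break in
    # from the console when the box wedges
    options         KDB
    options         DDB
    # Consistency checks and lock-order verification (slower, but they
    # catch lock-order reversals and other lock misuse)
    options         INVARIANTS
    options         INVARIANT_SUPPORT
    options         WITNESS
    # Extra lock and vnode-lock debugging information
    options         DEBUG_LOCKS
    options         DEBUG_VFS_LOCKS

Next time it wedges I'll try to remember to actually break into DDB and
grab the output of things like "ps", "trace", "show alllocks" and
"show lockedvnods" before rebooting, since I gather that's the sort of
information needed to track a deadlock down.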