Date: Wed, 19 Apr 2000 17:16:49 -0400 (EDT)
From: "Brett W. McCoy" <bmccoy@chapelperilous.net>
To: Janina Sajka
cc: ma-linux@tux.org, speakup@braille.uwo.ca
Subject: Re: Grabbing An Entire Website

On Wed, 19 Apr 2000, Janina Sajka wrote:

> Anyone know how to auto-retrieve an entire www page hierarchy?
>
> I know software like ncftp and wuftpd can tar up an entire directory
> tree, but the pages I need aren't available over ftp, only http. I'd
> hate to have to fetch them by hand one at a time, though.

Take a look at http://www.enfin.com/getweb/

Brett W. McCoy
http://www.chapelperilous.net
---------------------------------------------------------------------------
If only God would give me some clear sign!  Like making a large deposit
in my name at a Swiss Bank.
		-- Woody Allen
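
[Editor's note: the core of what any such site-mirroring tool does -- fetch a
page, pull out its links, and follow only the ones that stay on the same host
-- can be sketched as below. This is an illustrative Python sketch, not
getweb's actual code; the helper names (extract_links, same_host) are
hypothetical.]

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags, resolved against a base URL."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved against the page's URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    """Return all link targets found in an HTML page, as absolute URLs."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def same_host(url, base_url):
    """A mirror should stay on the original site, so skip off-site links."""
    return urlparse(url).netloc == urlparse(base_url).netloc
```

A crawler built on this would fetch each page, extract its links, and enqueue
the same-host URLs it has not yet seen, repeating until the queue is empty.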