From mboxrd@z Thu Jan 1 00:00:00 1970
Return-Path:
Received: from smtp-out1.bellatlantic.net ([199.45.39.156]) (1457 bytes) by braille.uwo.ca via smail with P:esmtp/D:aliases/T:pipe (sender: ) id for ; Wed, 19 Apr 2000 16:21:01 -0400 (EDT) (Smail-3.2.0.102 1998-Aug-2 #2 built 1999-Sep-5)
Received: from adsl-151-200-20-29.bellatlantic.net (adsl-151-200-20-29.bellatlantic.net [151.200.20.29]) by smtp-out1.bellatlantic.net (8.9.1/8.9.1) with ESMTP id QAA17050; Wed, 19 Apr 2000 16:20:44 -0400 (EDT)
Received: from localhost (janina@localhost) by adsl-151-200-20-29.bellatlantic.net (8.9.3/8.8.7) with ESMTP id QAA09100; Wed, 19 Apr 2000 16:20:33 -0400
X-Authentication-Warning: adsl-151-200-20-29.bellatlantic.net: janina owned process doing -bs
Date: Wed, 19 Apr 2000 16:20:33 -0400 (EDT)
From: Janina Sajka
X-Sender: janina@adsl-151-200-20-29.bellatlantic.net
To: ma-linux@tux.org, speakup@braille.uwo.ca
Subject: Grabbing An Entire Website
In-Reply-To:
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
List-Id:

Hi:

Anyone know how to auto-retrieve an entire www page hierarchy? I know software like ncftp and wuftp can tar up an entire directory tree, but the pages I need aren't available over ftp, only http. I'd hate to have to get them by hand one at a time, though.

--
Janina Sajka, Director
Information Systems Research & Development
American Foundation for the Blind (AFB)
janina@afb.net
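
[Editor's note: the usual answer to this question, then as now, is GNU wget's recursive mode. The sketch below is illustrative only; the URL is a placeholder, and the flags shown (`-r`, `-l 0`, `-np`, `-k`) are standard wget options for recursive HTTP retrieval, though the exact behavior may differ slightly in wget releases of that era.]

```shell
#!/bin/sh
# Hedged sketch: mirror an HTTP-only page hierarchy with GNU wget.
# URL is a placeholder -- substitute the real site's starting page.
URL="http://www.example.com/"

# -r    recurse into the links found on each retrieved page
# -l 0  no recursion depth limit (wget's default depth is 5)
# -np   "no parent": never ascend above the starting directory
# -k    rewrite links afterward so the copy browses locally
set -- wget -r -l 0 -np -k "$URL"

if [ "${RUN:-0}" = 1 ]; then
    "$@"                    # actually fetch the site
else
    echo "would run: $*"    # dry run: just show the command
fi
```

Run with `RUN=1` to fetch for real. `wget -m URL` (`--mirror`) is a common shorthand that turns on recursion, infinite depth, and timestamping in one flag.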