* Playing sound.
From: Tommy Moore
To: speakup
Hey guys. You guys know of a way to have the system play you a sound when
new mail comes in?
Thanks.
^ permalink raw reply [flat|nested] 11+ messages in thread
* Re: Playing sound.
From: Mike Gorse
To: speakup
On Tue, 18 Apr 2000, Tommy Moore wrote:
> Hey guys. You guys know of a way to have the system play you a sound when
> new mail comes in?
>
Here is the relevant part of my .procmailrc:
# The 'c' flag carbons the message on to normal delivery while this
# recipe also fires.  The conditions match mail addressed directly to
# me (a To: header containing "mgorse" and no comma, i.e. no other
# recipients), then pipe a sound file to the audio device.
:0 c
* ^To:.*mgorse
* !^To:.*,
|cat youvegotmail.au >/dev/audio
youvegotmail.au is a sound file that I downloaded a while ago from
sunsite's archive. This way it only plays the sound when I get mail
addressed to me and not to anyone else (i.e., not for messages from
mailing lists).
Or you could use biff, which will beep at you and print a blurb on any
console where you have it enabled.
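To try the playback command outside of procmail, a small shell sketch (the file name and the OSS-style /dev/audio device are assumptions carried over from the recipe above):

```shell
#!/bin/sh
# Play a .au notification sound if the sound file exists and the audio
# device is writable; otherwise fall back to ringing the terminal bell.
notify_mail() {
    sound="$1"
    if [ -f "$sound" ] && [ -w /dev/audio ]; then
        cat "$sound" > /dev/audio
    else
        printf '\a'
    fi
}
```

Invoked as `notify_mail youvegotmail.au`, this does roughly what the procmail pipe does, with a bell as a fallback on machines without /dev/audio.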
* Re: Playing sound.
From: Janina Sajka
To: speakup
I use Pine, and my system beeps every time I get new mail. So, it's mapped
in somewhere.
I'm with you, though. I'd love to map it to a more distinctive sound, with
significantly enhanced aesthetic values.
On Tue, 18 Apr 2000, Tommy Moore wrote:
> Hey guys. You guys know of a way to have the system play you a sound when
> new mail comes in?
>
> Thanks.
>
>
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
--
Janina Sajka, Director
Information Systems Research & Development
American Foundation for the Blind (AFB)
janina@afb.net
* Grabbing An Entire Website
From: Janina Sajka
To: ma-linux, speakup
Hi:
Anyone know how to auto-retrieve an entire www page hierarchy?
I know software like ncftp and wu-ftpd can tar up an entire directory
tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
to have to fetch them by hand one at a time, though.
--
Janina Sajka, Director
Information Systems Research & Development
American Foundation for the Blind (AFB)
janina@afb.net
* Re: Grabbing An Entire Website
From: Xandy Johnson
To: Janina Sajka; +Cc: ma-linux, speakup
You probably want to look into wget. It can follow links to recursively
retrieve all documents referenced by an HTTP URL (it also handles FTP, but
you specifically said your needs were HTTP). There are a lot of options
(maximum depth, spanning hosts, converting absolute links to relative ones
locally, and so on), so I suggest reading the man page and then asking
more specific questions if you have them.
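The options Xandy lists correspond to wget flags along these lines (the URL is a placeholder; flag meanings are from wget's man page, and the sketch only prints the command rather than fetching anything):

```shell
#!/bin/sh
# Sketch of a recursive HTTP retrieval with wget:
#   -r    recurse into links
#   -l 3  cap the recursion depth (maximum depth)
#   -k    convert absolute links to relative ones in the local copy
#   -np   stay below the starting directory (don't ascend to the parent)
# (-H would additionally allow spanning to other hosts; off by default.)
url="http://www.example.com/docs/"   # placeholder URL
cmd="wget -r -l 3 -k -np $url"
echo "$cmd"
# To actually fetch: eval "$cmd"
```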
Yours,
Xandy
On Wed, 19 Apr 2000, Janina Sajka wrote:
> Hi:
>
> Anyone know how to auto-retrieve an entire www page hierarchy?
>
> I know software like ncftp and wu-ftpd can tar up an entire directory
> tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> to have to fetch them by hand one at a time, though.
>
>
* Re: Grabbing An Entire Website
From: ADAM Sulmicki
To: Janina Sajka; +Cc: ma-linux, speakup
> I know software like ncftp and wu-ftpd can tar up an entire directory
> tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> to have to fetch them by hand one at a time, though.
wget
ftp.gnu.org/pub/gnu/wget
* Re: Grabbing An Entire Website
From: George Lewis
To: Janina Sajka; +Cc: ma-linux, speakup
Use wget; it has mirroring options specifically for such a task.
There are also other applications and Perl modules for this task, but
generally wget is excellent.
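The mirroring options George mentions presumably include wget's -m (--mirror) shorthand; a sketch that only prints the command (the URL is a placeholder):

```shell
#!/bin/sh
# Sketch: mirror a site with wget's -m shorthand, which is roughly
# equivalent to "-r -N -l inf --no-remove-listing": recursion with
# timestamping, unlimited depth, and retained FTP listings.
url="http://www.example.com/"   # placeholder URL
cmd="wget -m $url"
echo "$cmd"
# To actually mirror: eval "$cmd"
```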
George
Janina Sajka (janina@afb.net) wrote:
> Hi:
>
> Anyone know how to auto-retrieve an entire www page hierarchy?
>
> I know software like ncftp and wu-ftpd can tar up an entire directory
> tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> to have to fetch them by hand one at a time, though.
>
> --
>
> Janina Sajka, Director
> Information Systems Research & Development
> American Foundation for the Blind (AFB)
>
> janina@afb.net
>
--
George Lewis
http://schvin.net/
* Re: Grabbing An Entire Website
From: Brett W. McCoy
To: Janina Sajka; +Cc: ma-linux, speakup
On Wed, 19 Apr 2000, Janina Sajka wrote:
> Anyone know how to auto-retrieve an entire www page hierarchy?
>
> I know software like ncftp and wu-ftpd can tar up an entire directory
> tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> to have to fetch them by hand one at a time, though.
Take a look at http://www.enfin.com/getweb/
Brett W. McCoy
http://www.chapelperilous.net
---------------------------------------------------------------------------
If only God would give me some clear sign! Like making a large deposit
in my name at a Swiss Bank.
- Woody Allen
* Re: Grabbing An Entire Website
From: Janina Sajka
To: ADAM Sulmicki; +Cc: ma-linux, speakup
Got wget. Very cool.
No, very very cool! <grin>
Thanks.
Janina
On Wed, 19 Apr 2000, ADAM Sulmicki wrote:
>
> > I know software like ncftp and wu-ftpd can tar up an entire directory
> > tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> > to have to fetch them by hand one at a time, though.
>
> wget
>
> ftp.gnu.org/pub/gnu/wget
>
>
>
--
Janina Sajka, Director
Information Systems Research & Development
American Foundation for the Blind (AFB)
janina@afb.net
* Re: Grabbing An Entire Website
From: Garrett Nievin
To: Janina Sajka; +Cc: ma-linux, speakup
I think that you can use wget for that. I haven't done it myself, though.
Cheers,
Garrett
On Wed, 19 Apr 2000, Janina Sajka wrote:
> Hi:
>
> Anyone know how to auto-retrieve an entire www page hierarchy?
>
> I know software like ncftp and wu-ftpd can tar up an entire directory
> tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> to have to fetch them by hand one at a time, though.
>
>
--
Garrett P. Nievin <gnievin@gmu.edu>
Non est ad astra mollis e terris via. -- Seneca
* Re: Grabbing An Entire Website
From: Aaron
To: Garrett Nievin; +Cc: Janina Sajka, ma-linux, speakup
Yup: wget -r www.foobar.com. Of course, that gets what a browser would
"see", not the source code behind dynamic pages (unless of course it's
Cold Fusion ;). If you want the source code for dynamic pages or
something else, that would depend on the situation.
Aaron
On Wed, 19 Apr 2000, Garrett Nievin wrote:
> I think that you can use wget for that. Have not done it myself.
>
>
> Cheers,
> Garrett
>
> On Wed, 19 Apr 2000, Janina Sajka wrote:
>
> > Hi:
> >
> > Anyone know how to auto-retrieve an entire www page hierarchy?
> >
> > I know software like ncftp and wu-ftpd can tar up an entire directory
> > tree, but the pages I need aren't available over FTP, only HTTP. I'd hate
> > to have to fetch them by hand one at a time, though.
> >
> >
>
> --
> Garrett P. Nievin <gnievin@gmu.edu>
>
> Non est ad astra mollis e terris via. -- Seneca
>