* Re: An idea,
` An idea, Lorenzo Taylor
@ ` Kenny Hitt
` Kelly Prescott
` (2 subsequent siblings)
3 siblings, 0 replies; 31+ messages in thread
From: Kenny Hitt @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Hi.
At the Xserver level, you don't have any information beyond "draw this
pixel using this combination of RGB data; this pixel should be drawn on
this screen of this display."
The text info and control type are at the library level and not at the
Xserver level.
The best comparison I've come up with to MS windows is to think of the
Xserver as the drivers for video, keyboard, and mouse. Even the
implementation and management of windows on the screen is handled by an
app that just talks to the Xserver.
If you want to see this, create an .xinitrc file that only starts the
Xserver and no apps. You will have a screen with a mouse pointer. You
can move the pointer, but that will be it. Since no window manager or
app will be running, you can only kill off the Xserver, change the
screen resolution, or move a useless mouse pointer around a screen.
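For reference, a minimal ~/.xinitrc along those lines might look like
this (a sketch only; xinit itself launches the Xserver, and this file
normally names the client programs to run):

```shell
# ~/.xinitrc -- start no window manager and no clients.
# xinit launches the X server itself; this file normally lists the
# client programs to run.  With nothing but a long sleep here you get
# a bare server: a screen and a movable mouse pointer, and nothing
# else, until you kill the server (e.g. Ctrl+Alt+Backspace).
sleep 2147483647
```

Run `startx` from a text console to try it.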
BTW, the words "screen" and "display" have different and specific meanings
in X. My explanation above mixes terms to try to get across my point.
I've read some X HOWTOs that do a better job of explaining it, but I
think you get the idea. Since current distros set up X for you
automatically, you don't need to follow the instructions in an X HOWTO,
but they are worth reading for educational purposes.
To get the best results with Gnome, you need an app that uses gtk2, a
screen reader that understands the info sent by the gtk2 accessibility
structure (atspi), and a window manager that also sends window events to
atspi.
For KDE accessibility, substitute KDE for Gnome and qt4 for gtk2 in the
previous paragraph.
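As a concrete illustration of the Gnome side of that stack (a sketch for
GNOME 2-era systems; the key name and module names are from that era,
and your distro may differ), the bridge that feeds widget info to atspi
is typically switched on like this:

```shell
# Tell GNOME to load accessibility support at login (GConf key used
# by GNOME 2).  gtk2 apps then load the atk-bridge module, which
# forwards widget text and events to atspi, where the screen reader
# picks them up.
gconftool-2 --set /desktop/gnome/interface/accessibility \
  --type bool true

# For a gtk2 app started outside a full GNOME session, the same
# bridge can be loaded explicitly ("some-gtk2-app" is a placeholder):
GTK_MODULES=gail:atk-bridge some-gtk2-app
```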
Hope this helps.
Kenny
On Tue, Jul 26, 2005 at 11:41:21PM -0400, Lorenzo Taylor wrote:
> Here's another idea, maybe no one has thought of it yet, or maybe it is
> impossible to implement, but here it goes.
>
> It seems that the existing approaches for X screen readers should be taking a
> look at Speakup as a model. Gnopernicus, for example, is using libraries that
> rely on certain information ent by the underlying application libraries.
> Unfortunately, this implementation causes only some apps to speak while others
> which use the same widgets but whose libraries don't send messages to the
> accessibility system will not speak. But it occurs to me that X is simply a
> protocol by which client applications send messages to a server which renders
> the proper text, windows, buttons and other widgets on the screen. I believe
> that a screen reader that is an extension to the X server itself, (like Speakup
> is a set of patches to the kernel) would be a far better solution, as it could
> capture everything sent to the server and correctly translate it into humanly
> understandable speech output without relying on "accessibility messages" being
> sent from the client apps.
>
> Any thoughts on this would be welcome.
>
> Lorenzo
> --
> -----BEGIN GEEK CODE BLOCK-----
> Version: 3.12
> GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> G e* h---- r+++ y+++
> ------END GEEK CODE BLOCK------
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` An idea, Lorenzo Taylor
` Kenny Hitt
@ ` Kelly Prescott
` John covici
` Janina Sajka
` Sean McMahon
3 siblings, 1 reply; 31+ messages in thread
From: Kelly Prescott @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Hmm, an interesting concept...
The problem is that by the time the X server sees most of the stuff, it
is just screen position renderings. The server has no concept of
letters, characters, etc.
The server knows where you click on a screen, for example, but it just
sends the information to the underlying application, which is responsible
for deciding whether you have clicked on a button, etc.
This is an oversimplified explanation, but for our purposes it will
do...
Bottom line: whatever toolkit, library, widget set, rendering app, or
whatever it may be, it must feed the textual information to some
interface the screen reader can get at, so it can be read.
Hope this helps.
kp
On Tue, 26 Jul 2005, Lorenzo Taylor wrote:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> Here's another idea, maybe no one has thought of it yet, or maybe it is
> impossible to implement, but here it goes.
>
> It seems that the existing approaches for X screen readers should be taking a
> look at Speakup as a model. Gnopernicus, for example, is using libraries that
> rely on certain information ent by the underlying application libraries.
> Unfortunately, this implementation causes only some apps to speak while others
> which use the same widgets but whose libraries don't send messages to the
> accessibility system will not speak. But it occurs to me that X is simply a
> protocol by which client applications send messages to a server which renders
> the proper text, windows, buttons and other widgets on the screen. I believe
> that a screen reader that is an extension to the X server itself, (like Speakup
> is a set of patches to the kernel) would be a far better solution, as it could
> capture everything sent to the server and correctly translate it into humanly
> understandable speech output without relying on "accessibility messages" being
> sent from the client apps.
>
> Any thoughts on this would be welcome.
>
> Lorenzo
> - --
> - -----BEGIN GEEK CODE BLOCK-----
> Version: 3.12
> GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> G e* h---- r+++ y+++
> - ------END GEEK CODE BLOCK------
> -----BEGIN PGP SIGNATURE-----
> Version: GnuPG v1.4.1 (GNU/Linux)
>
> iD8DBQFC5wJhG9IpekrhBfIRAuhgAKDNMp7ThoUKPYqiWC+u8WB3RS0oKQCgulck
> 2KEeJCAheJfd5oqbbUgiM5k=
> =lUXl
> -----END PGP SIGNATURE-----
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` Kelly Prescott
@ ` John covici
` Sergei V. Fleytin
` (2 more replies)
0 siblings, 3 replies; 31+ messages in thread
From: John covici @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Well, I am not sure how it worked, but I once tried an X server for
Windows which was able to figure out the controls under Linux, and my
Windows screen reader was able to read them after a fashion, so I
wonder if there is some window information passed to the Xserver after
all.
on Wednesday 07/27/2005 Kelly Prescott(prescott@deltav.org) wrote
> hmm, a interesting concept...
> The problem is that by the time the x server sees most of the stuff, it is
> just screen position renderings. The server does not have a concept of
> letters, characters, etc.
> The server knows where you click on a screen, for example, but it just
> sends the information to the under lying application which is responsible
> for deciding if you have clicked on a button etc.
> This is a over simplified explaination, but for our purposes, it will
> do...
> Bottom line is that what ever toolbox, library, wigit set, rendering app,
> or what ever, it must feed the textual information to some interface for
> the screen reader to get at it so it can be read.
> Hope this helps.
> kp
>
>
>
> On Tue, 26 Jul 2005, Lorenzo Taylor wrote:
>
> > -----BEGIN PGP SIGNED MESSAGE-----
> > Hash: SHA1
> >
> > Here's another idea, maybe no one has thought of it yet, or maybe it is
> > impossible to implement, but here it goes.
> >
> > It seems that the existing approaches for X screen readers should be taking a
> > look at Speakup as a model. Gnopernicus, for example, is using libraries that
> > rely on certain information ent by the underlying application libraries.
> > Unfortunately, this implementation causes only some apps to speak while others
> > which use the same widgets but whose libraries don't send messages to the
> > accessibility system will not speak. But it occurs to me that X is simply a
> > protocol by which client applications send messages to a server which renders
> > the proper text, windows, buttons and other widgets on the screen. I believe
> > that a screen reader that is an extension to the X server itself, (like Speakup
> > is a set of patches to the kernel) would be a far better solution, as it could
> > capture everything sent to the server and correctly translate it into humanly
> > understandable speech output without relying on "accessibility messages" being
> > sent from the client apps.
> >
> > Any thoughts on this would be welcome.
> >
> > Lorenzo
> > - --
> > - -----BEGIN GEEK CODE BLOCK-----
> > Version: 3.12
> > GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> > O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> > G e* h---- r+++ y+++
> > - ------END GEEK CODE BLOCK------
> > -----BEGIN PGP SIGNATURE-----
> > Version: GnuPG v1.4.1 (GNU/Linux)
> >
> > iD8DBQFC5wJhG9IpekrhBfIRAuhgAKDNMp7ThoUKPYqiWC+u8WB3RS0oKQCgulck
> > 2KEeJCAheJfd5oqbbUgiM5k=
> > =lUXl
> > -----END PGP SIGNATURE-----
> >
> > _______________________________________________
> > Speakup mailing list
> > Speakup@braille.uwo.ca
> > http://speech.braille.uwo.ca/mailman/listinfo/speakup
> >
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
--
Your life is like a penny. You're going to lose it. The question is:
How do
you spend it?
John Covici
covici@ccs.covici.com
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` John covici
@ ` Sergei V. Fleytin
` Janina Sajka
` Kelly Prescott
` hank smith
2 siblings, 1 reply; 31+ messages in thread
From: Sergei V. Fleytin @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Hello, all.
As far as GUI accessibility is concerned, I wonder if anyone on this
list knows what happened to the project called Ultrasonix. If I remember
correctly, the aim of that project was to provide a library-independent
screen reader for the X environment.
--
With best regards, Sergei.
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` John covici
` Sergei V. Fleytin
@ ` Kelly Prescott
` hank smith
2 siblings, 0 replies; 31+ messages in thread
From: Kelly Prescott @ UTC (permalink / raw)
To: covici, Speakup is a screen review system for Linux.
An X server for Windows and one for Linux are two different beasts...
Windows X servers use Windows toolbox calls to do their work, and that
is where your screen reader is hooked.
It is like comparing apples to kiwi fruit:
they can both be eaten, but they are nothing alike.
kp
On Wed, 27 Jul 2005, John covici wrote:
> Well, I am not sure how it worked, but I once tried an X server for
> windows which was able to figure out the controls under Linux and my
> windows screen reader was able to read them after a fashion, so I
> wonder if there is some window information passed to the Xserver after
> all.
>
> on Wednesday 07/27/2005 Kelly Prescott(prescott@deltav.org) wrote
> > hmm, a interesting concept...
> > The problem is that by the time the x server sees most of the stuff, it is
> > just screen position renderings. The server does not have a concept of
> > letters, characters, etc.
> > The server knows where you click on a screen, for example, but it just
> > sends the information to the under lying application which is responsible
> > for deciding if you have clicked on a button etc.
> > This is a over simplified explaination, but for our purposes, it will
> > do...
> > Bottom line is that what ever toolbox, library, wigit set, rendering app,
> > or what ever, it must feed the textual information to some interface for
> > the screen reader to get at it so it can be read.
> > Hope this helps.
> > kp
> >
> >
> >
> > On Tue, 26 Jul 2005, Lorenzo Taylor wrote:
> >
> > > -----BEGIN PGP SIGNED MESSAGE-----
> > > Hash: SHA1
> > >
> > > Here's another idea, maybe no one has thought of it yet, or maybe it is
> > > impossible to implement, but here it goes.
> > >
> > > It seems that the existing approaches for X screen readers should be taking a
> > > look at Speakup as a model. Gnopernicus, for example, is using libraries that
> > > rely on certain information ent by the underlying application libraries.
> > > Unfortunately, this implementation causes only some apps to speak while others
> > > which use the same widgets but whose libraries don't send messages to the
> > > accessibility system will not speak. But it occurs to me that X is simply a
> > > protocol by which client applications send messages to a server which renders
> > > the proper text, windows, buttons and other widgets on the screen. I believe
> > > that a screen reader that is an extension to the X server itself, (like Speakup
> > > is a set of patches to the kernel) would be a far better solution, as it could
> > > capture everything sent to the server and correctly translate it into humanly
> > > understandable speech output without relying on "accessibility messages" being
> > > sent from the client apps.
> > >
> > > Any thoughts on this would be welcome.
> > >
> > > Lorenzo
> > > - --
> > > - -----BEGIN GEEK CODE BLOCK-----
> > > Version: 3.12
> > > GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> > > O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> > > G e* h---- r+++ y+++
> > > - ------END GEEK CODE BLOCK------
> > > -----BEGIN PGP SIGNATURE-----
> > > Version: GnuPG v1.4.1 (GNU/Linux)
> > >
> > > iD8DBQFC5wJhG9IpekrhBfIRAuhgAKDNMp7ThoUKPYqiWC+u8WB3RS0oKQCgulck
> > > 2KEeJCAheJfd5oqbbUgiM5k=
> > > =lUXl
> > > -----END PGP SIGNATURE-----
> > >
> > > _______________________________________________
> > > Speakup mailing list
> > > Speakup@braille.uwo.ca
> > > http://speech.braille.uwo.ca/mailman/listinfo/speakup
> > >
> >
> > _______________________________________________
> > Speakup mailing list
> > Speakup@braille.uwo.ca
> > http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
> --
> Your life is like a penny. You're going to lose it. The question is:
> How do
> you spend it?
>
> John Covici
> covici@ccs.covici.com
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` John covici
` Sergei V. Fleytin
` Kelly Prescott
@ ` hank smith
` John covici
2 siblings, 1 reply; 31+ messages in thread
From: hank smith @ UTC (permalink / raw)
To: covici, Speakup is a screen review system for Linux.
What program? Just curiosity.
----- Original Message -----
From: "John covici" <covici@ccs.covici.com>
To: "Speakup is a screen review system for Linux." <speakup@braille.uwo.ca>
Sent: Tuesday, July 26, 2005 10:52 PM
Subject: Re: An idea,
> Well, I am not sure how it worked, but I once tried an X server for
> windows which was able to figure out the controls under Linux and my
> windows screen reader was able to read them after a fashion, so I
> wonder if there is some window information passed to the Xserver after
> all.
>
> on Wednesday 07/27/2005 Kelly Prescott(prescott@deltav.org) wrote
> > hmm, a interesting concept...
> > The problem is that by the time the x server sees most of the stuff, it
is
> > just screen position renderings. The server does not have a concept of
> > letters, characters, etc.
> > The server knows where you click on a screen, for example, but it just
> > sends the information to the under lying application which is
responsible
> > for deciding if you have clicked on a button etc.
> > This is a over simplified explaination, but for our purposes, it will
> > do...
> > Bottom line is that what ever toolbox, library, wigit set, rendering
app,
> > or what ever, it must feed the textual information to some interface
for
> > the screen reader to get at it so it can be read.
> > Hope this helps.
> > kp
> >
> >
> >
> > On Tue, 26 Jul 2005, Lorenzo Taylor wrote:
> >
> > > -----BEGIN PGP SIGNED MESSAGE-----
> > > Hash: SHA1
> > >
> > > Here's another idea, maybe no one has thought of it yet, or maybe it
is
> > > impossible to implement, but here it goes.
> > >
> > > It seems that the existing approaches for X screen readers should be
taking a
> > > look at Speakup as a model. Gnopernicus, for example, is using
libraries that
> > > rely on certain information ent by the underlying application
libraries.
> > > Unfortunately, this implementation causes only some apps to speak
while others
> > > which use the same widgets but whose libraries don't send messages to
the
> > > accessibility system will not speak. But it occurs to me that X is
simply a
> > > protocol by which client applications send messages to a server which
renders
> > > the proper text, windows, buttons and other widgets on the screen. I
believe
> > > that a screen reader that is an extension to the X server itself,
(like Speakup
> > > is a set of patches to the kernel) would be a far better solution, as
it could
> > > capture everything sent to the server and correctly translate it into
humanly
> > > understandable speech output without relying on "accessibility
messages" being
> > > sent from the client apps.
> > >
> > > Any thoughts on this would be welcome.
> > >
> > > Lorenzo
> > > - --
> > > - -----BEGIN GEEK CODE BLOCK-----
> > > Version: 3.12
> > > GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> > > O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> > > G e* h---- r+++ y+++
> > > - ------END GEEK CODE BLOCK------
> > > -----BEGIN PGP SIGNATURE-----
> > > Version: GnuPG v1.4.1 (GNU/Linux)
> > >
> > > iD8DBQFC5wJhG9IpekrhBfIRAuhgAKDNMp7ThoUKPYqiWC+u8WB3RS0oKQCgulck
> > > 2KEeJCAheJfd5oqbbUgiM5k=
> > > =lUXl
> > > -----END PGP SIGNATURE-----
> > >
> > > _______________________________________________
> > > Speakup mailing list
> > > Speakup@braille.uwo.ca
> > > http://speech.braille.uwo.ca/mailman/listinfo/speakup
> > >
> >
> > _______________________________________________
> > Speakup mailing list
> > Speakup@braille.uwo.ca
> > http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
> --
> Your life is like a penny. You're going to lose it. The question is:
> How do
> you spend it?
>
> John Covici
> covici@ccs.covici.com
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
>
> --
> No virus found in this incoming message.
> Checked by AVG Anti-Virus.
> Version: 7.0.338 / Virus Database: 267.9.5/58 - Release Date: 7/25/2005
>
>
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` hank smith
@ ` John covici
0 siblings, 0 replies; 31+ messages in thread
From: John covici @ UTC (permalink / raw)
To: hank smith; +Cc: Speakup is a screen review system for Linux., covici
Can't remember -- I think it had 34 in the name.
on Wednesday 07/27/2005 hank smith(hanksmith4@earthlink.net) wrote
> what program? just curiousity
> ----- Original Message -----
> From: "John covici" <covici@ccs.covici.com>
> To: "Speakup is a screen review system for Linux." <speakup@braille.uwo.ca>
> Sent: Tuesday, July 26, 2005 10:52 PM
> Subject: Re: An idea,
>
>
> > Well, I am not sure how it worked, but I once tried an X server for
> > windows which was able to figure out the controls under Linux and my
> > windows screen reader was able to read them after a fashion, so I
> > wonder if there is some window information passed to the Xserver after
> > all.
> >
> > on Wednesday 07/27/2005 Kelly Prescott(prescott@deltav.org) wrote
> > > hmm, a interesting concept...
> > > The problem is that by the time the x server sees most of the stuff, it
> is
> > > just screen position renderings. The server does not have a concept of
> > > letters, characters, etc.
> > > The server knows where you click on a screen, for example, but it just
> > > sends the information to the under lying application which is
> responsible
> > > for deciding if you have clicked on a button etc.
> > > This is a over simplified explaination, but for our purposes, it will
> > > do...
> > > Bottom line is that what ever toolbox, library, wigit set, rendering
> app,
> > > or what ever, it must feed the textual information to some interface
> for
> > > the screen reader to get at it so it can be read.
> > > Hope this helps.
> > > kp
> > >
> > >
> > >
> > > On Tue, 26 Jul 2005, Lorenzo Taylor wrote:
> > >
> > > > -----BEGIN PGP SIGNED MESSAGE-----
> > > > Hash: SHA1
> > > >
> > > > Here's another idea, maybe no one has thought of it yet, or maybe it
> is
> > > > impossible to implement, but here it goes.
> > > >
> > > > It seems that the existing approaches for X screen readers should be
> taking a
> > > > look at Speakup as a model. Gnopernicus, for example, is using
> libraries that
> > > > rely on certain information ent by the underlying application
> libraries.
> > > > Unfortunately, this implementation causes only some apps to speak
> while others
> > > > which use the same widgets but whose libraries don't send messages to
> the
> > > > accessibility system will not speak. But it occurs to me that X is
> simply a
> > > > protocol by which client applications send messages to a server which
> renders
> > > > the proper text, windows, buttons and other widgets on the screen. I
> believe
> > > > that a screen reader that is an extension to the X server itself,
> (like Speakup
> > > > is a set of patches to the kernel) would be a far better solution, as
> it could
> > > > capture everything sent to the server and correctly translate it into
> humanly
> > > > understandable speech output without relying on "accessibility
> messages" being
> > > > sent from the client apps.
> > > >
> > > > Any thoughts on this would be welcome.
> > > >
> > > > Lorenzo
> > > > - --
> > > > - -----BEGIN GEEK CODE BLOCK-----
> > > > Version: 3.12
> > > > GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> > > > O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> > > > G e* h---- r+++ y+++
> > > > - ------END GEEK CODE BLOCK------
> > > > -----BEGIN PGP SIGNATURE-----
> > > > Version: GnuPG v1.4.1 (GNU/Linux)
> > > >
> > > > iD8DBQFC5wJhG9IpekrhBfIRAuhgAKDNMp7ThoUKPYqiWC+u8WB3RS0oKQCgulck
> > > > 2KEeJCAheJfd5oqbbUgiM5k=
> > > > =lUXl
> > > > -----END PGP SIGNATURE-----
> > > >
> > > > _______________________________________________
> > > > Speakup mailing list
> > > > Speakup@braille.uwo.ca
> > > > http://speech.braille.uwo.ca/mailman/listinfo/speakup
> > > >
> > >
> > > _______________________________________________
> > > Speakup mailing list
> > > Speakup@braille.uwo.ca
> > > http://speech.braille.uwo.ca/mailman/listinfo/speakup
> >
> > --
> > Your life is like a penny. You're going to lose it. The question is:
> > How do
> > you spend it?
> >
> > John Covici
> > covici@ccs.covici.com
> >
> > _______________________________________________
> > Speakup mailing list
> > Speakup@braille.uwo.ca
> > http://speech.braille.uwo.ca/mailman/listinfo/speakup
> >
> >
> > --
> > No virus found in this incoming message.
> > Checked by AVG Anti-Virus.
> > Version: 7.0.338 / Virus Database: 267.9.5/58 - Release Date: 7/25/2005
> >
> >
--
Your life is like a penny. You're going to lose it. The question is:
How do
you spend it?
John Covici
covici@ccs.covici.com
^ permalink raw reply [flat|nested] 31+ messages in thread
* Re: An idea,
` An idea, Lorenzo Taylor
` Kenny Hitt
` Kelly Prescott
@ ` Janina Sajka
` Scott Berry
` Sean McMahon
` Sean McMahon
3 siblings, 2 replies; 31+ messages in thread
From: Janina Sajka @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Hi, Lorenzo:
Others have responded with reference to what an X server does and
doesn't do. I want to respond to two other particular points from your
message.
Lorenzo Taylor writes:
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> ... Gnopernicus, for example, is using libraries that
> rely on certain information ent by the underlying application libraries.
> Unfortunately, this implementation causes only some apps to speak while others
> which use the same widgets but whose libraries don't send messages to the
> accessibility system will not speak.
This is only partially correct. Any applications using those "same
widgets," as you put it, will speak. There are no exceptions.
What causes them to not speak is that the properties required to make
them speak have not been supplied. So, Gnopernicus is getting an empty
string to render, which I suppose it dutifully renders as silence.
Fortunately, these are open source applications and we don't need an
advocacy campaign to resolve these kinds of problems. A solid example of
this at work is the Gnome Volume Control. It was written with gtk2, but
the developers did not supply all the relevant property data. So, a
blind programmer came along one weekend, fixed it, and submitted the
patch which has shipped with the rest of Gnome Volume Control ever
since.
Now the next point ...
> But it occurs to me that X is simply a
> protocol by which client applications send messages to a server which renders
> the proper text, windows, buttons and other widgets on the screen. I believe
> that a screen reader that is an extension to the X server itself, (like Speakup
> is a set of patches to the kernel) would be a far better solution, as it could
> capture everything sent to the server and correctly translate it into humanly
> understandable speech output without relying on "accessibility messages" being
> sent from the client apps.
As others have pointed out, there's nothing to be gained by speaking RGB
values at some particular X-Y mouse coordinate location. But I'm sure
that's not what you really intend. If I interpret you correctly, you're
suggesting some kind of mechanism whereby a widget of some kind can be
reliably identified and assigned values that the screen reader can
henceforth utter. This is the approach the Windows OSM has taken over
the past decade, and it's what allows screen readers, like JFW, to
develop interfaces based on scripts. For instance: take widget number
38,492, call it "volume slider," speak it before anything else when it
shows up on screen, and facilitate the method that allows the user to
change its value with the up and down arrows, etc., etc.
It is arguable, and has been cogently argued over the past 18 months,
that the failure of the original Desktop Accessibility Architecture
promoted by Sun and Gnome was to not provide such mechanisms. A great
part of the intent of the Orca screen reader proof of concept was to
provide exactly this kind of functionality. I believe this is now being
addressed, though I'm not aware that any code for newer Gnopernicus (or
post-Gnopernicus) readers has yet been released. However, I do fully expect that
Gnopernicus is not the last word in desktop screen readers.
Janina
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` Janina Sajka
@ ` Scott Berry
` Luke Yelavich
` (2 more replies)
` Sean McMahon
1 sibling, 3 replies; 31+ messages in thread
From: Scott Berry @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
This really doesn't pertain to how Gnopernicus works, but one
question I really would like to throw out here is: why are
the keystrokes so difficult? I know you can change key bindings in
Gnopernicus, but it looked like it would take a guy with 20 years of
experience to do it. Maybe this has changed in recent versions, but I
remember when I started using Gnopernicus: wow! Difficult to learn.
At 08:05 AM 7/27/2005, you wrote:
>Hi, Lorenzo:
>
>Others have responded with reference to what an X server does, and
>doesn't do. I want to respond to two other particular points from your
>message.
>
>Lorenzo Taylor writes:
> > -----BEGIN PGP SIGNED MESSAGE-----
> > Hash: SHA1
> >
> > ... Gnopernicus, for example, is using libraries that
> > rely on certain information ent by the underlying application libraries.
> > Unfortunately, this implementation causes only some apps to speak
> while others
> > which use the same widgets but whose libraries don't send messages to the
> > accessibility system will not speak.
>
>This is only partially correct. Any applications using those "same
>widgets," as you put it, will speak. There are no exceptions.
>
>What causes them to not speak is that the properties required to make
>them speak have not been supplied. So, Gnopernicus is getting an empty
>string to renderd, which I suppose it dutifully renders as silence.
>
>Fortunately, these are open source applications and we don't need an
>advocacy campaign to resolve these kinds of problems. A solid example of
>this at work is the Gnome Volume Control. It was written with gtk2, but
>the developers did not supply all the relevant property data. So, a
>blind programmer came along one weekend, fixed it, and submitted the
>patch which has shipped with the rest of Gnome Volume Control ever
>since.
>
>Now the next point ...
>
> > But it occurs to me that X is simply a
> > protocol by which client applications send messages to a server
> which renders
> > the proper text, windows, buttons and other widgets on the
> screen. I believe
> > that a screen reader that is an extension to the X server itself,
> (like Speakup
> > is a set of patches to the kernel) would be a far better
> solution, as it could
> > capture everything sent to the server and correctly translate it
> into humanly
> > understandable speech output without relying on "accessibility
> messages" being
> > sent from the client apps.
>
>
>As other have pointed out, there's nothing to be gained by speaking RGB
>values at some particular X-Y mouse coordinate location. But, I'm sure
>that's not what you really intend. If I interpret you correctly you're
>suggesting some kind of mechanism whereby a widget of some kind can be
>reliably identified and assigned values that the screen reader can
>henceforth utter. This is the approach with Windows OSM that has been
>used over the past decade, and it's what allows screen readers, like
>JFW, to develop interfaces based on scripts. For instance, Take widget
>number 38,492 and call it "volume slider," and speak it before anything
>else on screen when it shows up on screen, and facilitate the method
>that will allow user to use up and down arrow to change it's value,
>etc., etc.
>
>It is arguable, and has been cogently argued over the past 18 months,
>that the failure of the original Desktop Accessibility Architecture
>promoted by Sun and Gnome was to not provide such mechanisms. A great
>part of the intent of the Orca screen reader proof of concept was to
>provide exactly this kind of functionality. I believe this is now being
>addressed, though I'm not aware any code for newer Gnopernicus (or post
>Gnopernicus) readers is yet released. However, I do fully expect that
>Gnopernicus is not the last word in desktop screen readers.
>
> Janina
>
>_______________________________________________
>Speakup mailing list
>Speakup@braille.uwo.ca
>http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread* Re: An idea,
` Scott Berry
@ ` Luke Yelavich
` remapping keys in Gnopernicus was " Kenny Hitt
` Janina Sajka
` Sean McMahon
2 siblings, 1 reply; 31+ messages in thread
From: Luke Yelavich @ UTC (permalink / raw)
To: speakup
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
On Thu, Jul 28, 2005 at 12:25:06AM EST, Scott Berry wrote:
> This really doesn't pertain to how Gnopernicus works. But one
> question I really would like to through around out here is why are
> the keystrokes so difficult. I know you can change key bindings in
> Gnopernicus but it looked like a guy with 20 years of experience
> would have to do it. Maybe this has changed in recent versions but I
> remember when I started using Gnopernicus wow! Difficult to learn.
I haven't used Gnopernicus regularly for a while now, but I think it has
something to do with the various layers of functionality that
Gnopernicus has: a mouse control layer, a magnification control layer, a
speech configuration layer, etc. I think the idea was to keep as many
keystrokes as possible on the numpad.
As you and others may have found out, this is hopeless when attempting
to use Gnopernicus on a laptop. One has to know how to access their
laptop numpad keys to use Gnopernicus, and then one still has to switch
back and forth between modes while working.
It would be worthwhile bringing this up on the gnome-accessibility
list, as others may have some ideas about all of this.
- --
Luke Yelavich
GPG key: 0xD06320CE
(http://www.themuso.com/themuso-gpg-key.txt)
Email & MSN: themuso@themuso.com
ICQ: 18444344
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.1 (GNU/Linux)
iD8DBQFC553ujVefwtBjIM4RAjXBAKDFfrM6MKxIjQpILu43OH63cKSFmACg09HN
GPAAnfZM1eHvPHRDuYNmwss=
=zf/5
-----END PGP SIGNATURE-----
^ permalink raw reply [flat|nested] 31+ messages in thread

* remapping keys in Gnopernicus was Re: An idea,
` Luke Yelavich
@ ` Kenny Hitt
` Farhan
0 siblings, 1 reply; 31+ messages in thread
From: Kenny Hitt @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Hi.
On Thu, Jul 28, 2005 at 12:45:02AM +1000, Luke Yelavich wrote:
> On Thu, Jul 28, 2005 at 12:25:06AM EST, Scott Berry wrote:
> > This really doesn't pertain to how Gnopernicus works. But one
> > question I really would like to through around out here is why are
> > the keystrokes so difficult. I know you can change key bindings in
> > Gnopernicus but it looked like a guy with 20 years of experience
> > would have to do it. Maybe this has changed in recent versions but I
> > remember when I started using Gnopernicus wow! Difficult to learn.
>
> I haven't used Gnopernicus regularly for a while now, but I think it is
> something to do with the various layers of functionality that
> Gnopernicus has. For example, a mouse control layer, a magnification
> control layer, a speech configuration layer, etc. I think the idea was
> to keep as many keystrokes as possible on the numpad.
>
> As you and others may have found out, this is hopeless when attempting
> to use Gnopernicus on a laptop. One has to know how to access their
> laptop numpad keys to use Gnopernicus, and then one still has to switch
> back and forth between modes while working.
>
> It would be worth while bringing this up on the gnome-accessibility
> list, as others may have some ideas about all of this.
> --
Maybe I've used Gnopernicus for too long now, but it isn't all that
difficult. Most commands needed by a totally blind person are on
layer 0.
If you want to remap keys, go to the "command mapping" dialog under the
preferences menu. Use right arrow to move to the user defined keys tab,
tab to the add button, select the key in the dialog, select the command
in the list, and you're done.
To delete a key from the keypad, go to the command mapping dialog under
the preferences menu, select the layers tab, select the key you want to
delete, tab to the delete button, and you're done.
Hope this helps.
Kenny
^ permalink raw reply [flat|nested] 31+ messages in thread

* Re: remapping keys in Gnopernicus was Re: An idea,
` remapping keys in Gnopernicus was " Kenny Hitt
@ ` Farhan
` Kenny Hitt
0 siblings, 1 reply; 31+ messages in thread
From: Farhan @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
The funny thing is, when I used Gnopernicus with my FC3 install,
Festival was too slow, so I really couldn't get much use out of it. I
wonder if someone could help me find a better software synth that's
free? I looked for Flite but couldn't find anything.
On 7/27/2005, Kenny Hitt said:
Hi.
On Thu, Jul 28, 2005 at 12:45:02AM +1000, Luke Yelavich wrote:
> On Thu, Jul 28, 2005 at 12:25:06AM EST, Scott Berry wrote:
> > This really doesn't pertain to how Gnopernicus works. But one
> > question I really would like to through around out here is why are
> > the keystrokes so difficult. I know you can change key bindings in
> > Gnopernicus but it looked like a guy with 20 years of experience
> > would have to do it. Maybe this has changed in recent versions but I
> > remember when I started using Gnopernicus wow! Difficult to learn.
>
> I haven't used Gnopernicus regularly for a while now, but I think it is
> something to do with the various layers of functionality that
> Gnopernicus has. For example, a mouse control layer, a magnification
> control layer, a speech configuration layer, etc. I think the idea was
> to keep as many keystrokes as possible on the numpad.
>
> As you and others may have found out, this is hopeless when attempting
> to use Gnopernicus on a laptop. One has to know how to access their
> laptop numpad keys to use Gnopernicus, and then one still has to switch
> back and forth between modes while working.
>
> It would be worth while bringing this up on the gnome-accessibility
> list, as others may have some ideas about all of this.
> --
Maybe I've used Gnopernicus for too long now, but it isn't all that
difficult. Most commands needed by a totally blind person are on layer
0.
If you want to remap keys, go to the "command mapping"dialog under the
preferences menu.
Use right arrow to move to the user defined keys tab, tab to the
add button, select the key in the dialog, select the command in the list and you're done.
To delete a key from the keypad, goto the command mapping dialog under
the preferences menu, select the layers tab, select the key you want to
delete, tab to the delete button, and you're done.
Hope this helps.
Kenny
_______________________________________________
Speakup mailing list
Speakup@braille.uwo.ca
http://speech.braille.uwo.ca/mailman/listinfo/speakup
Farhan
contact info.
Aim: and stoof
msn: i.am.Farhan@gmail.com
Jabber: Farhan@jabber.org
^ permalink raw reply [flat|nested] 31+ messages in thread

* Re: remapping keys in Gnopernicus was Re: An idea,
` Farhan
@ ` Kenny Hitt
0 siblings, 0 replies; 31+ messages in thread
From: Kenny Hitt @ UTC (permalink / raw)
To: Farhan, Speakup is a screen review system for Linux.
Hi.
Did you increase the rate for Festival to the max? The only other free
synth I know about is FreeTTS. It's Java-based; I've never tried it
myself. Festival is fine for my limited use of Gnopernicus and Gnome.
If you want to try FreeTTS, you will need to ask a Fedora user how to
install it and get it running.
Actually, since the last update of gnome-speech broke my Festival, I
just use braille in Gnome.
Kenny
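[Editor's note: one way to raise Festival's rate globally is the Duration_Stretch parameter in the user init file. A sketch assuming a standard Festival install; the 0.5 value is illustrative (smaller means faster).]

```scheme
;; ~/.festivalrc (read by Festival at startup)
;; Duration_Stretch below 1.0 shortens phone durations, i.e. faster speech.
(Parameter.set 'Duration_Stretch 0.5)
```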
On Wed, Jul 27, 2005 at 11:37:47AM -0500, Farhan wrote:
> the funny thing is when, i used gnopernicus with my fc3 install.
> festival was the slow. so I really couldn't get much use outa it. I
> wonder if someone could help me find a better software synth that's
> free? i looked for flite but couldn't find anything.
>
> on7/27/2005Kenny Hitt said
> Hi.
>
> On Thu, Jul 28, 2005 at 12:45:02AM +1000, Luke Yelavich wrote:
> > On Thu, Jul 28, 2005 at 12:25:06AM EST, Scott Berry wrote:
> > > This really doesn't pertain to how Gnopernicus works. But one
> > > question I really would like to through around out here is why are
> > > the keystrokes so difficult. I know you can change key bindings in
> > > Gnopernicus but it looked like a guy with 20 years of experience
> > > would have to do it. Maybe this has changed in recent versions but I
> > > remember when I started using Gnopernicus wow! Difficult to learn.
> >
> > I haven't used Gnopernicus regularly for a while now, but I think it is
> > something to do with the various layers of functionality that
> > Gnopernicus has. For example, a mouse control layer, a magnification
> > control layer, a speech configuration layer, etc. I think the idea was
> > to keep as many keystrokes as possible on the numpad.
> >
> > As you and others may have found out, this is hopeless when attempting
> > to use Gnopernicus on a laptop. One has to know how to access their
> > laptop numpad keys to use Gnopernicus, and then one still has to switch
> > back and forth between modes while working.
> >
> > It would be worth while bringing this up on the gnome-accessibility
> > list, as others may have some ideas about all of this.
> > --
>
> Maybe I've used Gnopernicus for too long now, but it isn't all that
> difficult. Most commands needed by a totally blind person are on layer
> 0.
>
> If you want to remap keys, go to the "command mapping"dialog under the
> preferences menu.
> Use right arrow to move to the user defined keys tab, tab to the
> add button, select the key in the dialog, select the command in the list and you're done.
> To delete a key from the keypad, goto the command mapping dialog under
> the preferences menu, select the layers tab, select the key you want to
> delete, tab to the delete button, and you're done.
>
> Hope this helps.
> Kenny
>
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
> Farhan
> contact info.
> Aim: and stoof
> msn: i.am.Farhan@gmail.com
> Jabber: Farhan@jabber.org
>
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread
* Re: An idea,
` Scott Berry
` Luke Yelavich
@ ` Janina Sajka
` Sean McMahon
2 siblings, 0 replies; 31+ messages in thread
From: Janina Sajka @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
Good point, Scott. Far too difficult, imho.
All I can answer by way of "Why?" is to point out that somebody had to
decide. And, in this instance, that somebody got insufficient input from
the greater "us." Clearly, they blew it. Output like:
Layer 0
Layer 1
Layer 2
is so-o-o unhelpful, when it could easily be very helpful.
Scott Berry writes:
> This really doesn't pertain to how Gnopernicus works. But one
> question I really would like to through around out here is why are
> the keystrokes so difficult. I know you can change key bindings in
> Gnopernicus but it looked like a guy with 20 years of experience
> would have to do it. Maybe this has changed in recent versions but I
> remember when I started using Gnopernicus wow! Difficult to learn.
>
>
>
> At 08:05 AM 7/27/2005, you wrote:
>
> >Hi, Lorenzo:
> >
> >Others have responded with reference to what an X server does, and
> >doesn't do. I want to respond to two other particular points from your
> >message.
> >
> >Lorenzo Taylor writes:
> >> -----BEGIN PGP SIGNED MESSAGE-----
> >> Hash: SHA1
> >>
> >> ... Gnopernicus, for example, is using libraries that
> >> rely on certain information ent by the underlying application libraries.
> >> Unfortunately, this implementation causes only some apps to speak
> >while others
> >> which use the same widgets but whose libraries don't send messages to the
> >> accessibility system will not speak.
> >
> >This is only partially correct. Any applications using those "same
> >widgets," as you put it, will speak. There are no exceptions.
> >
> >What causes them to not speak is that the properties required to make
> >them speak have not been supplied. So, Gnopernicus is getting an empty
> >string to renderd, which I suppose it dutifully renders as silence.
> >
> >Fortunately, these are open source applications and we don't need an
> >advocacy campaign to resolve these kinds of problems. A solid example of
> >this at work is the Gnome Volume Control. It was written with gtk2, but
> >the developers did not supply all the relevant property data. So, a
> >blind programmer came along one weekend, fixed it, and submitted the
> >patch which has shipped with the rest of Gnome Volume Control ever
> >since.
> >
> >Now the next point ...
> >
> >> But it occurs to me that X is simply a
> >> protocol by which client applications send messages to a server
> >which renders
> >> the proper text, windows, buttons and other widgets on the
> >screen. I believe
> >> that a screen reader that is an extension to the X server itself,
> >(like Speakup
> >> is a set of patches to the kernel) would be a far better
> >solution, as it could
> >> capture everything sent to the server and correctly translate it
> >into humanly
> >> understandable speech output without relying on "accessibility
> >messages" being
> >> sent from the client apps.
> >
> >
> >As other have pointed out, there's nothing to be gained by speaking RGB
> >values at some particular X-Y mouse coordinate location. But, I'm sure
> >that's not what you really intend. If I interpret you correctly you're
> >suggesting some kind of mechanism whereby a widget of some kind can be
> >reliably identified and assigned values that the screen reader can
> >henceforth utter. This is the approach with Windows OSM that has been
> >used over the past decade, and it's what allows screen readers, like
> >JFW, to develop interfaces based on scripts. For instance, Take widget
> >number 38,492 and call it "volume slider," and speak it before anything
> >else on screen when it shows up on screen, and facilitate the method
> >that will allow user to use up and down arrow to change it's value,
> >etc., etc.
> >
> >It is arguable, and has been cogently argued over the past 18 months,
> >that the failure of the original Desktop Accessibility Architecture
> >promoted by Sun and Gnome was to not provide such mechanisms. A great
> >part of the intent of the Orca screen reader proof of concept was to
> >provide exactly this kind of functionality. I believe this is now being
> >addressed, though I'm not aware any code for newer Gnopernicus (or post
> >Gnopernicus) readers is yet released. However, I do fully expect that
> >Gnopernicus is not the last word in desktop screen readers.
> >
> > Janina
> >
> >_______________________________________________
> >Speakup mailing list
> >Speakup@braille.uwo.ca
> >http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
--
Janina Sajka Phone: +1.202.494.7040
Partner, Capital Accessibility LLC http://www.CapitalAccessibility.Com
Bringing the Owasys 22C screenless cell phone to the U.S. and Canada. Go to http://www.ScreenlessPhone.Com to learn more.
Chair, Accessibility Workgroup Free Standards Group (FSG)
janina@freestandards.org http://a11y.org
^ permalink raw reply [flat|nested] 31+ messages in thread

* Re: An idea,
` Scott Berry
` Luke Yelavich
` Janina Sajka
@ ` Sean McMahon
2 siblings, 0 replies; 31+ messages in thread
From: Sean McMahon @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
What's difficult about the keystrokes in Gnopernicus? Are they something
weird like Shift+Alt+NumpadInsert+F3 to open a file?
----- Original Message -----
From: "Scott Berry" <scott@citlink.net>
To: "Speakup is a screen review system for Linux." <speakup@braille.uwo.ca>
Sent: Wednesday, July 27, 2005 7:25 AM
Subject: Re: An idea,
> This really doesn't pertain to how Gnopernicus works. But one
> question I really would like to through around out here is why are
> the keystrokes so difficult. I know you can change key bindings in
> Gnopernicus but it looked like a guy with 20 years of experience
> would have to do it. Maybe this has changed in recent versions but I
> remember when I started using Gnopernicus wow! Difficult to learn.
>
>
>
> At 08:05 AM 7/27/2005, you wrote:
>
> >Hi, Lorenzo:
> >
> >Others have responded with reference to what an X server does, and
> >doesn't do. I want to respond to two other particular points from your
> >message.
> >
> >Lorenzo Taylor writes:
> > > -----BEGIN PGP SIGNED MESSAGE-----
> > > Hash: SHA1
> > >
> > > ... Gnopernicus, for example, is using libraries that
> > > rely on certain information ent by the underlying application libraries.
> > > Unfortunately, this implementation causes only some apps to speak
> > while others
> > > which use the same widgets but whose libraries don't send messages to the
> > > accessibility system will not speak.
> >
> >This is only partially correct. Any applications using those "same
> >widgets," as you put it, will speak. There are no exceptions.
> >
> >What causes them to not speak is that the properties required to make
> >them speak have not been supplied. So, Gnopernicus is getting an empty
> >string to renderd, which I suppose it dutifully renders as silence.
> >
> >Fortunately, these are open source applications and we don't need an
> >advocacy campaign to resolve these kinds of problems. A solid example of
> >this at work is the Gnome Volume Control. It was written with gtk2, but
> >the developers did not supply all the relevant property data. So, a
> >blind programmer came along one weekend, fixed it, and submitted the
> >patch which has shipped with the rest of Gnome Volume Control ever
> >since.
> >
> >Now the next point ...
> >
> > > But it occurs to me that X is simply a
> > > protocol by which client applications send messages to a server
> > which renders
> > > the proper text, windows, buttons and other widgets on the
> > screen. I believe
> > > that a screen reader that is an extension to the X server itself,
> > (like Speakup
> > > is a set of patches to the kernel) would be a far better
> > solution, as it could
> > > capture everything sent to the server and correctly translate it
> > into humanly
> > > understandable speech output without relying on "accessibility
> > messages" being
> > > sent from the client apps.
> >
> >
> >As other have pointed out, there's nothing to be gained by speaking RGB
> >values at some particular X-Y mouse coordinate location. But, I'm sure
> >that's not what you really intend. If I interpret you correctly you're
> >suggesting some kind of mechanism whereby a widget of some kind can be
> >reliably identified and assigned values that the screen reader can
> >henceforth utter. This is the approach with Windows OSM that has been
> >used over the past decade, and it's what allows screen readers, like
> >JFW, to develop interfaces based on scripts. For instance, Take widget
> >number 38,492 and call it "volume slider," and speak it before anything
> >else on screen when it shows up on screen, and facilitate the method
> >that will allow user to use up and down arrow to change it's value,
> >etc., etc.
> >
> >It is arguable, and has been cogently argued over the past 18 months,
> >that the failure of the original Desktop Accessibility Architecture
> >promoted by Sun and Gnome was to not provide such mechanisms. A great
> >part of the intent of the Orca screen reader proof of concept was to
> >provide exactly this kind of functionality. I believe this is now being
> >addressed, though I'm not aware any code for newer Gnopernicus (or post
> >Gnopernicus) readers is yet released. However, I do fully expect that
> >Gnopernicus is not the last word in desktop screen readers.
> >
> > Janina
> >
> >_______________________________________________
> >Speakup mailing list
> >Speakup@braille.uwo.ca
> >http://speech.braille.uwo.ca/mailman/listinfo/speakup
>
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread
* Re: An idea,
` Janina Sajka
` Scott Berry
@ ` Sean McMahon
` Laura Eaves
1 sibling, 1 reply; 31+ messages in thread
From: Sean McMahon @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
They really should have a device that can be trained to recognize
certain shapes and just say what they are. One you could point at any
visual surface.
----- Original Message -----
From: "Janina Sajka" <janina@rednote.net>
To: "Speakup is a screen review system for Linux." <speakup@braille.uwo.ca>
Sent: Wednesday, July 27, 2005 6:05 AM
Subject: Re: An idea,
> Hi, Lorenzo:
>
> Others have responded with reference to what an X server does, and
> doesn't do. I want to respond to two other particular points from your
> message.
>
> Lorenzo Taylor writes:
> > -----BEGIN PGP SIGNED MESSAGE-----
> > Hash: SHA1
> >
> > ... Gnopernicus, for example, is using libraries that
> > rely on certain information ent by the underlying application libraries.
> > Unfortunately, this implementation causes only some apps to speak while
others
> > which use the same widgets but whose libraries don't send messages to the
> > accessibility system will not speak.
>
> This is only partially correct. Any applications using those "same
> widgets," as you put it, will speak. There are no exceptions.
>
> What causes them to not speak is that the properties required to make
> them speak have not been supplied. So, Gnopernicus is getting an empty
> string to renderd, which I suppose it dutifully renders as silence.
>
> Fortunately, these are open source applications and we don't need an
> advocacy campaign to resolve these kinds of problems. A solid example of
> this at work is the Gnome Volume Control. It was written with gtk2, but
> the developers did not supply all the relevant property data. So, a
> blind programmer came along one weekend, fixed it, and submitted the
> patch which has shipped with the rest of Gnome Volume Control ever
> since.
>
> Now the next point ...
>
> > But it occurs to me that X is simply a
> > protocol by which client applications send messages to a server which
renders
> > the proper text, windows, buttons and other widgets on the screen. I
believe
> > that a screen reader that is an extension to the X server itself, (like
Speakup
> > is a set of patches to the kernel) would be a far better solution, as it
could
> > capture everything sent to the server and correctly translate it into
humanly
> > understandable speech output without relying on "accessibility messages"
being
> > sent from the client apps.
>
>
> As other have pointed out, there's nothing to be gained by speaking RGB
> values at some particular X-Y mouse coordinate location. But, I'm sure
> that's not what you really intend. If I interpret you correctly you're
> suggesting some kind of mechanism whereby a widget of some kind can be
> reliably identified and assigned values that the screen reader can
> henceforth utter. This is the approach with Windows OSM that has been
> used over the past decade, and it's what allows screen readers, like
> JFW, to develop interfaces based on scripts. For instance, Take widget
> number 38,492 and call it "volume slider," and speak it before anything
> else on screen when it shows up on screen, and facilitate the method
> that will allow user to use up and down arrow to change it's value,
> etc., etc.
>
> It is arguable, and has been cogently argued over the past 18 months,
> that the failure of the original Desktop Accessibility Architecture
> promoted by Sun and Gnome was to not provide such mechanisms. A great
> part of the intent of the Orca screen reader proof of concept was to
> provide exactly this kind of functionality. I believe this is now being
> addressed, though I'm not aware any code for newer Gnopernicus (or post
> Gnopernicus) readers is yet released. However, I do fully expect that
> Gnopernicus is not the last word in desktop screen readers.
>
> Janina
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread

* Re: An idea,
` Sean McMahon
@ ` Laura Eaves
0 siblings, 0 replies; 31+ messages in thread
From: Laura Eaves @ UTC (permalink / raw)
To: Sean McMahon, Speakup is a screen review system for Linux.
Hi Sean -- Were you at the NFB convention when they demoed the pocket
Kurzweil reader? It was quite interesting -- kind of slow, but they have
projections of just such a technology, one that could even be pointed at
objects like street signs for a blind person to read... But obviously it
still has a long way to go for that to happen.
Anyway, it is supposed to be available for several thousand dollars
sometime next year, so it will be interesting to see where it goes.
As for operating a GUI, however, I think such a thing would be
essentially useless, as you would still need to get inside the software
to track focus and other such things.
At best it would be marginally usable, but far from practical.
Just some comments.
--le
----- Original Message -----
From: "Sean McMahon" <smcmahon@usgs.gov>
To: "Speakup is a screen review system for Linux." <speakup@braille.uwo.ca>
Sent: Wednesday, July 27, 2005 2:12 PM
Subject: Re: An idea,
They really should have a device that can be trained to understand certain
shapes and just say what they are. Some which you could point at any visual
serface.
----- Original Message -----
From: "Janina Sajka" <janina@rednote.net>
To: "Speakup is a screen review system for Linux." <speakup@braille.uwo.ca>
Sent: Wednesday, July 27, 2005 6:05 AM
Subject: Re: An idea,
> Hi, Lorenzo:
>
> Others have responded with reference to what an X server does, and
> doesn't do. I want to respond to two other particular points from your
> message.
>
> Lorenzo Taylor writes:
> > -----BEGIN PGP SIGNED MESSAGE-----
> > Hash: SHA1
> >
> > ... Gnopernicus, for example, is using libraries that
> > rely on certain information ent by the underlying application libraries.
> > Unfortunately, this implementation causes only some apps to speak while
others
> > which use the same widgets but whose libraries don't send messages to
> > the
> > accessibility system will not speak.
>
> This is only partially correct. Any applications using those "same
> widgets," as you put it, will speak. There are no exceptions.
>
> What causes them to not speak is that the properties required to make
> them speak have not been supplied. So, Gnopernicus is getting an empty
> string to renderd, which I suppose it dutifully renders as silence.
>
> Fortunately, these are open source applications and we don't need an
> advocacy campaign to resolve these kinds of problems. A solid example of
> this at work is the Gnome Volume Control. It was written with gtk2, but
> the developers did not supply all the relevant property data. So, a
> blind programmer came along one weekend, fixed it, and submitted the
> patch which has shipped with the rest of Gnome Volume Control ever
> since.
>
> Now the next point ...
>
> > But it occurs to me that X is simply a
> > protocol by which client applications send messages to a server which
renders
> > the proper text, windows, buttons and other widgets on the screen. I
believe
> > that a screen reader that is an extension to the X server itself, (like
Speakup
> > is a set of patches to the kernel) would be a far better solution, as it
could
> > capture everything sent to the server and correctly translate it into
humanly
> > understandable speech output without relying on "accessibility messages"
being
> > sent from the client apps.
>
>
> As other have pointed out, there's nothing to be gained by speaking RGB
> values at some particular X-Y mouse coordinate location. But, I'm sure
> that's not what you really intend. If I interpret you correctly you're
> suggesting some kind of mechanism whereby a widget of some kind can be
> reliably identified and assigned values that the screen reader can
> henceforth utter. This is the approach with Windows OSM that has been
> used over the past decade, and it's what allows screen readers, like
> JFW, to develop interfaces based on scripts. For instance, Take widget
> number 38,492 and call it "volume slider," and speak it before anything
> else on screen when it shows up on screen, and facilitate the method
> that will allow user to use up and down arrow to change it's value,
> etc., etc.
>
> It is arguable, and has been cogently argued over the past 18 months,
> that the failure of the original Desktop Accessibility Architecture
> promoted by Sun and Gnome was to not provide such mechanisms. A great
> part of the intent of the Orca screen reader proof of concept was to
> provide exactly this kind of functionality. I believe this is now being
> addressed, though I'm not aware any code for newer Gnopernicus (or post
> Gnopernicus) readers is yet released. However, I do fully expect that
> Gnopernicus is not the last word in desktop screen readers.
>
> Janina
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
_______________________________________________
Speakup mailing list
Speakup@braille.uwo.ca
http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread
* Re: An idea,
` An idea, Lorenzo Taylor
` (2 preceding siblings ...)
` Janina Sajka
@ ` Sean McMahon
3 siblings, 0 replies; 31+ messages in thread
From: Sean McMahon @ UTC (permalink / raw)
To: Speakup is a screen review system for Linux.
I don't know how you would make that work, because I don't really
understand X, but your point about libs is well noted. It's the same
problem in other GUI environments: if you don't use certain libs and
certain classes, you don't have accessibility.
----- Original Message -----
From: "Lorenzo Taylor" <lorenzo@taylor.homelinux.net>
To: <speakup@braille.uwo.ca>
Sent: Tuesday, July 26, 2005 8:41 PM
Subject: Re: An idea,
> -----BEGIN PGP SIGNED MESSAGE-----
> Hash: SHA1
>
> Here's another idea, maybe no one has thought of it yet, or maybe it is
> impossible to implement, but here it goes.
>
> It seems that the existing approaches for X screen readers should be taking a
> look at Speakup as a model. Gnopernicus, for example, is using libraries that
> rely on certain information ent by the underlying application libraries.
> Unfortunately, this implementation causes only some apps to speak while others
> which use the same widgets but whose libraries don't send messages to the
> accessibility system will not speak. But it occurs to me that X is simply a
> protocol by which client applications send messages to a server which renders
> the proper text, windows, buttons and other widgets on the screen. I believe
> that a screen reader that is an extension to the X server itself, (like
Speakup
> is a set of patches to the kernel) would be a far better solution, as it could
> capture everything sent to the server and correctly translate it into humanly
> understandable speech output without relying on "accessibility messages" being
> sent from the client apps.
>
> Any thoughts on this would be welcome.
>
> Lorenzo
> - --
> - -----BEGIN GEEK CODE BLOCK-----
> Version: 3.12
> GCS d- s:+ a- C+++ UL++++ P+ L+++ E- W++ N o K- w---
> O M V- PS+++ PE Y+ PGP++ t++ 5+ X+ R tv-- b++ DI-- D+
> G e* h---- r+++ y+++
> - ------END GEEK CODE BLOCK------
> -----BEGIN PGP SIGNATURE-----
> Version: GnuPG v1.4.1 (GNU/Linux)
>
> iD8DBQFC5wJhG9IpekrhBfIRAuhgAKDNMp7ThoUKPYqiWC+u8WB3RS0oKQCgulck
> 2KEeJCAheJfd5oqbbUgiM5k=
> =lUXl
> -----END PGP SIGNATURE-----
>
> _______________________________________________
> Speakup mailing list
> Speakup@braille.uwo.ca
> http://speech.braille.uwo.ca/mailman/listinfo/speakup
^ permalink raw reply [flat|nested] 31+ messages in thread