From mboxrd@z Thu Jan 1 00:00:00 1970
Received: (qmail 6934 invoked from network); 7 Dec 1998 09:28:45 -0000
Received: from mail.redhat.com (199.183.24.239) by lists.redhat.com with SMTP; 7 Dec 1998 09:28:45 -0000
Received: from ariel.ucs.unimelb.EDU.AU (jasonw@ariel.ucs.unimelb.EDU.AU [128.250.20.3]) by mail.redhat.com (8.8.7/8.8.7) with ESMTP id EAA14860 for ; Mon, 7 Dec 1998 04:23:30 -0500
Received: from localhost (jasonw@localhost) by ariel.ucs.unimelb.EDU.AU (8.8.5/8.8.5) with SMTP id UAA22078 for ; Mon, 7 Dec 1998 20:23:27 +1100 (AEDT)
Date: Mon, 7 Dec 1998 20:23:27 +1100 (AEDT)
From: Jason White
X-Sender: jasonw@ariel.ucs.unimelb.EDU.AU
To: blinux-list@redhat.com
Subject: Re: Concerning BLinux project (fwd)
In-Reply-To:
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
List-Id:

One worthwhile project would be to design a user interface that provides
generic components which capture the semantic distinctions required for
spoken interaction and can also be presented graphically. This could be
based on an analysis of the kinds of input and output required by various
types of interaction with application software. The resulting system would
need to be easy and natural for programmers to include in their
applications, while also providing control over the presentation in
different media, so that the visual, auditory and tactile renderings could
be independently customised.

Important and relevant analysis is provided in T. V. Raman's excellent
book, Auditory User Interfaces: Toward the Speaking Computer (1997),
details of which can be found at his web site.

The essential idea would be to extend the distinction between content and
presentation, which has been developed in structured markup languages, to
the entire user interface, by creating a general means of specifying
interaction semantics and then specific presentational controls. I am sure
that experts in the user interface field have developed proposals of this
kind.
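The separation described above can be sketched in a few lines of code. This is only an illustrative mock-up of the idea, not any existing system: a component object captures the interaction's semantics (here, a hypothetical "single choice" prompt), and independent renderer classes produce visual and spoken presentations of the same semantics. All class and method names are invented for the example.

```python
# Illustrative sketch only: one semantic component, two media-specific
# renderings. Names (SingleChoice, VisualRenderer, SpeechRenderer) are
# hypothetical, not taken from any real toolkit.

from dataclasses import dataclass
from typing import List


@dataclass
class SingleChoice:
    """Captures interaction semantics only: a prompt and its options.

    No presentational detail (layout, voice, braille cell) appears here.
    """
    prompt: str
    options: List[str]


class VisualRenderer:
    """Presents the semantics as a numbered text menu (GUI stand-in)."""

    def render(self, c: SingleChoice) -> str:
        lines = [c.prompt]
        lines += [f"  [{i + 1}] {opt}" for i, opt in enumerate(c.options)]
        return "\n".join(lines)


class SpeechRenderer:
    """Presents the same semantics as a spoken-style utterance."""

    def render(self, c: SingleChoice) -> str:
        return f"{c.prompt} Your choices are: {', '.join(c.options)}."


# The application specifies semantics once; each medium is customised
# independently by swapping or configuring the renderer.
choice = SingleChoice("Save changes?", ["Yes", "No", "Cancel"])
visual = VisualRenderer().render(choice)
speech = SpeechRenderer().render(choice)
```

A tactile (braille) renderer would slot in the same way, which is what would let each medium's presentation be tuned without touching application code.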