[Accessibility-handlers] about Minutes20080421
surkov.alexander at gmail.com
Wed Apr 23 22:44:04 PDT 2008
I have some questions after reading this passage from the minutes:
*"PB:* big picture: two EH plug-ins, one for the browser, one for the AT.
The browser EH provides UI to navigate through the special content; The AT
EH knows how to transform specialized markup into speech or braille;
browser EH provides keyboard interface for objects/elements - the
granularity can be controlled by the user - the typical navigation requests
would be previous, current, and next item at the currently specified level
of granularity - when objects in the browser EH get focus they fire focus
events; the AT asks the object for its acc name, which would be expressed in a
standardized markup language for a particular discipline (math, music, chem)
and then the AT would call the AT EH for a transformation of a specific kind
(Braille, TTS) and then the AT would output the TTS/Braille to the TTS or [...]"*
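If I follow the minutes, the flow is: focus event, then the AT reads the acc name as specialized markup, then the AT EH transforms it for output. A minimal sketch of that chain, where all names (focusedObject, atEquationHandlerToSpeech, onFocus) and the toy MathML matching are my own illustration, not any real API:

```javascript
// Browser EH side: a focused object exposes its acc name as
// specialized markup (here, a MathML-like string for a fraction).
const focusedObject = {
  accName: "<math><mfrac><mi>a</mi><mi>b</mi></mfrac></math>",
};

// AT EH side: transforms the specialized markup into speech text.
// A real handler would properly parse the markup; this one only
// pattern-matches a simple fraction for illustration.
function atEquationHandlerToSpeech(markup) {
  const m = markup.match(/<mfrac><mi>(\w+)<\/mi><mi>(\w+)<\/mi><\/mfrac>/);
  return m ? `${m[1]} over ${m[2]}` : "unknown expression";
}

// AT side: on a focus event, ask the object for its acc name and
// hand it to the AT EH for a speech transformation.
function onFocus(obj) {
  return atEquationHandlerToSpeech(obj.accName);
}

console.log(onFocus(focusedObject)); // "a over b"
```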
There are two cases: either the browser supports the XML dialect natively, or it
doesn't. When it does, the browser EH plug-in is the browser itself (in other
words, the browser EH plug-in is hardcoded into the browser), as we have in
Firefox for HTML, where we expose HTML content to AT through an AT API such as
ATK or IAccessible2. If the browser doesn't support some XML dialect, then the
browser EH plug-in may be a JS script that allows the browser to expose that
XML content to AT, and here the @implements attribute is used to define when
that script should be executed. In this chain I don't understand what role the
AT EH plug-in plays. Is it used when the specialized markup should be
transformed into information consumed by something other than screen readers?
And where do ontologies fit in?
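To make my reading of the second case concrete, here is how I imagine the browser selecting a JS browser EH plug-in via @implements; the registry, handler URI, and dispatch function are purely hypothetical, made up to illustrate the question:

```javascript
// Hypothetical registry mapping @implements URIs to JS browser EH
// plug-ins. Nothing here is a real API.
const handlerRegistry = {
  "http://example.org/mathml-handler": (el) => `speakable: ${el.text}`,
};

// When an element carries @implements, run the matching script so its
// content can be exposed to AT; when there is no match, the browser
// is assumed to handle the dialect natively (case one above).
function dispatch(el) {
  const handler = handlerRegistry[el.implements];
  if (!handler) return null; // native support: browser EH is the browser itself
  return handler(el);
}

const element = {
  implements: "http://example.org/mathml-handler",
  text: "a/b",
};
console.log(dispatch(element)); // "speakable: a/b"
```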
Thank you very much.