Does Linux have a 'focus' concept? #503
-
In another discussion, it was mentioned that:
I have been trying to get a better picture of 'focus' generally. I am familiar with VoiceOver on apple platforms. Over there, there is a concept of a voiceover 'cursor' that can be moved 'forward' and 'back' (and 'up') to 'focus' on a particular widget, and the current focused widget can be queried via an API like I have not been able to find an analogous concept on Linux through orca or ATSPI. For example over here, the property
I'm developing the impression that Linux has no "accessibility focus" concept and relies on keyboard focus exclusively. Is that correct? If so, that presents real challenges taking code I wrote and just expecting it to work on Linux:
Does accesskit (or any other middleware) do things to 'smooth over' these platform differences? Where can I go to find out about how other people are porting accessible software? |
Replies: 6 comments 11 replies
-
On the Linux desktop (AT-SPI), focus is a state. It would technically be possible for multiple widgets to be focused, although AccessKit makes sure this does not happen. Focus is not necessarily tied to keyboard events: AccessKit basically considers that any node which supports the focus action can be focused. The bounding box is not linked to focus, although it is frequently used in conjunction with it: think of screen magnifiers which move the magnified area to track the focus, or various other assistive technologies that might display a rectangle around the focused widget. There are a few platform-specific behaviors around focus, though. One that comes to mind right now is whether focus follows the selection; this is something AccessKit doesn't address. My only knowledge on this comes from some personal experience and reading through Chromium's source code.
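To illustrate the invariant described above, here is a minimal sketch of how middleware can keep the "focused" state flag set on at most one node at a time. This is a hypothetical model, not AccessKit's actual code; the class and field names are invented.

```python
# Hypothetical sketch (not AccessKit's real implementation): on AT-SPI,
# "focused" is a per-node state flag, so the protocol itself does not
# prevent two nodes from carrying it. Middleware can enforce the
# single-focus invariant by clearing the old flag before setting the new one.

class Node:
    def __init__(self, name):
        self.name = name
        self.focused = False  # stands in for the AT-SPI "focused" state

class Tree:
    def __init__(self, nodes):
        self.nodes = nodes
        self.focus = None  # at most one focused node at a time

    def set_focus(self, node):
        # Clear the previous node's flag so only one node is ever focused.
        if self.focus is not None:
            self.focus.focused = False
        node.focused = True
        self.focus = node
        # A real adapter would also emit an AT-SPI
        # "object:state-changed:focused" event here.

a, b = Node("a"), Node("b")
tree = Tree([a, b])
tree.set_focus(a)
tree.set_focus(b)
assert [n.name for n in tree.nodes if n.focused] == ["b"]
```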
-
Even on macOS, Tab is implemented by the UI toolkit, not by VoiceOver. The reason the behavior of Tab changes when VoiceOver is enabled is that VoiceOver enables a setting called Full Keyboard Access. So, to match native behavior, you'd need to check that setting through the appropriate macOS API. There are also UI elements, like labels, that a screen reader user can move their screen reader cursor to, but which don't receive keyboard focus. Android has the concept of "accessibility focus" as distinct from keyboard focus, so Android applications can know and control the position of the screen reader cursor, and I believe iOS does too. I'm not sure if macOS does; my work on AccessKit so far has assumed that it doesn't. Windows definitely doesn't, and I'm not sure about Linux either. But in all cases, you definitely shouldn't put labels (a.k.a. static text) in the tab order so they can receive keyboard focus. Screen reader users will access these elements exclusively with screen reader keyboard commands. So, if you want to test with Orca, you'll need to figure out why those commands aren't working for you. Are you using Wayland or X11?
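The distinction above can be sketched with a toy widget list: labels stay out of the tab order but remain in the sequence the screen reader cursor walks. The widget list and helper names below are invented for illustration; they are not any toolkit's actual API.

```python
# Hypothetical form: two labels, two text fields, and a button.
WIDGETS = [
    {"role": "label", "name": "User name:"},
    {"role": "entry", "name": "user-name-field"},
    {"role": "label", "name": "Password:"},
    {"role": "entry", "name": "password-field"},
    {"role": "button", "name": "Log in"},
]

def tab_order(widgets):
    # Only interactive widgets receive keyboard focus via Tab;
    # static text (labels) is deliberately excluded.
    return [w["name"] for w in widgets if w["role"] != "label"]

def reading_order(widgets):
    # The screen reader cursor visits everything, labels included,
    # via screen-reader-specific navigation commands.
    return [w["name"] for w in widgets]
```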
-
I wouldn't be opposed to exposing a focus action through the AT-SPI action interface. Maybe we've over-optimized for what Orca needs.
-
Yes, you're correct about the problem of Orca not being able to receive keyboard input on Wayland.
Yes, there's an old way for toolkits to pass keyboard input to ATs. GTK supported this through version 3.x, and Qt still does. We chose to follow GTK 4's lead and not support this old solution in AccessKit, because it would have required all toolkit developers to make major changes to their event loops that weren't necessary on any other platform. And frankly, when we made that decision at the end of 2022, we thought the problem would be fixed quickly because of GTK 4. I suppose we could at least provide the option to do what GTK 3 and current Qt are doing. Are you using winit?
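In rough terms, that old mechanism amounts to the toolkit offering each key event to registered AT listeners before dispatching it, with a listener able to consume the event so the application never sees it. The sketch below uses invented names; it is not GTK's or Qt's actual API, just the shape of the idea.

```python
# Hypothetical sketch of toolkit-side key forwarding to ATs (the GTK 3 /
# Qt style mechanism described above). Names are invented for illustration.

class EventLoop:
    def __init__(self):
        self.at_listeners = []  # ATs (e.g. a screen reader) register here
        self.delivered = []     # key events the application actually received

    def register_at_listener(self, listener):
        self.at_listeners.append(listener)

    def dispatch_key(self, key):
        # Offer the event to ATs first; a listener returning True
        # consumes it, so the application never sees it.
        for listener in self.at_listeners:
            if listener(key):
                return
        self.delivered.append(key)

loop = EventLoop()
# A screen-reader-like listener that swallows its command modifier:
loop.register_at_listener(lambda key: key == "CapsLock")
loop.dispatch_key("CapsLock")  # consumed by the AT
loop.dispatch_key("a")         # reaches the application
```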
-
Blind Linux users, at least those in the know, still use X11. |
-
I took a look at trying to implement this myself. I think the key events are sent via DeviceEventController's NotifyListenersSync. After trying that, Orca seems to partly understand my keystrokes, although not entirely well; I think there are issues with how I represent the events. Is there a way to snoop traffic from an app that works, so I can see the keycodes it is sending, etc.?
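One way to snoop that traffic (a suggestion, not something the thread confirms) is to attach dbus-monitor to the accessibility bus, whose address can be obtained by calling GetAddress on org.a11y.Bus at /org/a11y/bus on the session bus, e.g. `busctl --user call org.a11y.Bus /org/a11y/bus org.a11y.Bus GetAddress`. The helper below is a hypothetical convenience that just builds the command line to run.

```python
def a11y_monitor_command(bus_address):
    # Hypothetical helper: build a dbus-monitor invocation filtered to
    # DeviceEventController traffic on the accessibility bus.
    # `bus_address` is the address returned by org.a11y.Bus.GetAddress,
    # e.g. "unix:path=/run/user/1000/at-spi/bus".
    return [
        "dbus-monitor",
        "--address", bus_address,
        "interface='org.a11y.atspi.DeviceEventController'",
    ]

# Run the resulting command in a terminal while typing in the working app,
# then compare the DeviceEvent fields against what your app sends.
```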