
Wednesday, July 25, 2012

1 Jelly Bean: Accessibility Gestures Explained

This article details accessibility gestures in Jelly Bean and is a follow-up to Jelly Bean Accessibility Explained. It gives a conceptual overview of the 16 possible gestures and describes how they are used. The interaction behavior described here holds for all aspects of the Android user interface, including interactive Web pages within Chrome and the Android Web Browser.

1.1 Conceptual Overview Of The Gestures

Placing a finger on the screen places Accessibility Focus on the item under the finger and speaks it. Moving the finger triggers touch exploration, which moves Accessibility Focus along with the finger.

To generate any of the Accessibility Gestures discussed below, one moves the finger much faster — how much faster is something we will tune over time, and if needed, make user customizable.

To remember the gestures, think of the four directions: Up, Down, Left and Right. In addition to these four basic navigational gestures, we defined 12 more by picking ordered pairwise combinations of these directional flicks, e.g., Left then Down — this gives a total of 16 possible gestures. In what follows, Left then Down means flick left, then continue with a downward flick. Note that when performing these combined gestures, speed matters most. For example, you need not trace a perfect capital L when performing the Down then Right gesture; keeping up speed throughout the gesture, and ensuring that the finger moves some distance in each direction, is what keeps it from being misinterpreted as a basic navigational gesture.
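
To make the arithmetic concrete, here is a small, purely illustrative Java snippet (it is not part of any Android API) that enumerates this gesture space: the four directional flicks plus every ordered pair of two different directions.

    // Illustrative only: enumerate the 16-gesture space described above.
    // Four basic flicks, plus every ordered pair of two *different*
    // directions: 4 + (4 * 3) = 16 gestures in total.
    public class GestureSpace {
      enum Direction { UP, DOWN, LEFT, RIGHT }

      public static void main(String[] args) {
        int count = 0;
        for (Direction d : Direction.values()) {        // 4 basic gestures
          System.out.println(d);
          count++;
        }
        for (Direction first : Direction.values()) {    // 12 combined gestures
          for (Direction second : Direction.values()) {
            if (first != second) {
              System.out.println(first + " then " + second);
              count++;
            }
          }
        }
        System.out.println("Total gestures: " + count); // prints 16
      }
    }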

1.2 Accessibility Focus And Accessibility Gestures

Accessibility Focus is moved using the four basic directional gestures. For now, we have aliased Left with Up, and Down with Right; i.e., both Left and Up move to the previous item, whereas both Down and Right move to the next item. Note that this is not the same as moving with a physical D-Pad or keyboard on Android; the Android platform moves System Focus in response to the D-Pad. Thus, moving with a D-Pad or trackball moves you through the various interactive controls on the screen, whereas moving Accessibility Focus via the Accessibility Gestures moves you through everything on the screen.
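
As a minimal sketch of that aliasing (illustrative only, not TalkBack's actual implementation), written against the gesture constants that the Jelly Bean AccessibilityService API exposes:

    import static android.accessibilityservice.AccessibilityService.*;

    // Illustrative aliasing: Left and Up move Accessibility Focus to the
    // previous item; Right and Down move it to the next item.
    final class DirectionalAliases {
      static boolean movesToPreviousItem(int gestureId) {
        return gestureId == GESTURE_SWIPE_LEFT || gestureId == GESTURE_SWIPE_UP;
      }

      static boolean movesToNextItem(int gestureId) {
        return gestureId == GESTURE_SWIPE_RIGHT || gestureId == GESTURE_SWIPE_DOWN;
      }
    }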

1.3 Accessibility Gestures For Common Actions

In addition to the basic navigation described above, we define the following gestures for common actions (a sketch of matching helpers follows the list):

Navigation Granularity
You can increase or decrease navigation granularity by rapidly stroking Up then Down or Down then Up.
Scrolling Lists
You can scroll a list forward by a screenful by rapidly stroking Right then Left; the reverse, Left then Right, scrolls the list backward by a screenful.
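
As a rough sketch of what sits behind these two pairs of gestures (illustrative only, not TalkBack's actual code), the helpers below cycle through the text-movement granularities and issue the standard scroll actions that API 16 defines on AccessibilityNodeInfo:

    import android.view.accessibility.AccessibilityNodeInfo;

    // Illustrative helpers only: the granularity gestures would step through
    // a cycle like this one, and the scroll gestures would issue the standard
    // scroll actions on a scrollable node.
    final class CommonActionHelpers {
      private static final int[] GRANULARITIES = {
          AccessibilityNodeInfo.MOVEMENT_GRANULARITY_CHARACTER,
          AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD,
          AccessibilityNodeInfo.MOVEMENT_GRANULARITY_LINE,
          AccessibilityNodeInfo.MOVEMENT_GRANULARITY_PARAGRAPH,
          AccessibilityNodeInfo.MOVEMENT_GRANULARITY_PAGE,
      };
      private int granularityIndex = 0;

      // Up then Down and Down then Up step through the cycle in opposite directions.
      int nextGranularity() {
        granularityIndex = (granularityIndex + 1) % GRANULARITIES.length;
        return GRANULARITIES[granularityIndex];
      }

      int previousGranularity() {
        granularityIndex =
            (granularityIndex + GRANULARITIES.length - 1) % GRANULARITIES.length;
        return GRANULARITIES[granularityIndex];
      }

      // Right then Left scrolls forward by a screenful; Left then Right, backward.
      boolean scroll(AccessibilityNodeInfo scrollable, boolean forward) {
        return scrollable.performAction(forward
            ? AccessibilityNodeInfo.ACTION_SCROLL_FORWARD
            : AccessibilityNodeInfo.ACTION_SCROLL_BACKWARD);
      }
    }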

1.4 User Configurable Gestures

Gestures Down then Left, Up then Left, Down then Right and Up then Right are user configurable; their default assignments are shown below, followed by a sketch of how an accessibility service might bind them.

Back
Gesture Down then Left is the same as pressing the Back button.
Home
Up then Left has the same effect as pressing the Home button.
Status Bar
Gesture Up then Right opens Status Notifications.
Recent
Down then Right has the same effect as pressing the Recent Applications button.
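
Below is a minimal sketch of an accessibility service binding these defaults. It is illustrative rather than TalkBack's actual code, but the gesture constants, performGlobalAction(), and the global-action IDs are all part of the API 16 AccessibilityService class; note that a service receives these gesture callbacks only when it has requested touch-exploration mode.

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityEvent;

    // Minimal sketch: bind the default gesture assignments described above to
    // the corresponding global actions. Illustrative only.
    public class DefaultGestureBindings extends AccessibilityService {
      @Override
      protected boolean onGesture(int gestureId) {
        switch (gestureId) {
          case GESTURE_SWIPE_DOWN_AND_LEFT:   // Back
            return performGlobalAction(GLOBAL_ACTION_BACK);
          case GESTURE_SWIPE_UP_AND_LEFT:     // Home
            return performGlobalAction(GLOBAL_ACTION_HOME);
          case GESTURE_SWIPE_UP_AND_RIGHT:    // Status Bar
            return performGlobalAction(GLOBAL_ACTION_NOTIFICATIONS);
          case GESTURE_SWIPE_DOWN_AND_RIGHT:  // Recent applications
            return performGlobalAction(GLOBAL_ACTION_RECENTS);
          default:
            return false;  // unhandled gestures fall through to other processing
        }
      }

      @Override public void onAccessibilityEvent(AccessibilityEvent event) { }

      @Override public void onInterrupt() { }
    }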

1.5 Summary

Gestures for manipulating and working with Accessibility Focus are an evolving part of Android Accessibility; we will continue to refine them based on user experience. At this point, you are probably saying:


"But wait, you said 16 gestures, but only told us the meanings of 12 of them!"

You are correct — we have left ourselves some gestures to use for future features.

Tuesday, July 24, 2012

1 Jelly Bean Accessibility Explained: Touch Exploration Augmented By Gestures

We announced a number of accessibility enhancements in Android Jelly Bean — see our Google I/O 2012 announcements and our Android Accessibility talk from I/O 2012. This article gives a user-centric overview of the Jelly Bean interaction model as enabled by touch exploration and navigational gestures. Note that, as with every release, Android Accessibility continues to evolve, so as before, what we have today is by no means the final word.

1.1 High-Level Concepts

First, here's some shared vocabulary to ensure that we're all talking about the same thing when explaining Jelly Bean access:

Random Access
Enable the user to reach any part of the on-screen UI with equal ease. We enabled this as of ICS with touch exploration.
Deterministic Access
Enable the user to reliably land on a desired item on the screen. We enable this in Jelly Bean with linear navigation.
Accessibility Focus
The item that the user most recently interacted with, either via touch exploration or linear navigation, receives accessibility focus.
Activation
The user can activate the item that has accessibility focus by double-tapping anywhere on the screen (see the sketch after this list).
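
As a minimal sketch of activation (illustrative only; how TalkBack actually performs it may differ), an accessibility service could activate whatever currently holds accessibility focus using calls available in API 16:

    import android.accessibilityservice.AccessibilityService;
    import android.view.accessibility.AccessibilityNodeInfo;

    // Illustrative only: once the double-tap is recognized, activate the node
    // that currently holds accessibility focus.
    final class Activation {
      static boolean activateFocusedItem(AccessibilityService service) {
        AccessibilityNodeInfo root = service.getRootInActiveWindow();
        if (root == null) {
          return false;                       // no active window content
        }
        AccessibilityNodeInfo focused =
            root.findFocus(AccessibilityNodeInfo.FOCUS_ACCESSIBILITY);
        if (focused == null) {
          root.recycle();
          return false;                       // nothing holds accessibility focus
        }
        boolean clicked = focused.performAction(AccessibilityNodeInfo.ACTION_CLICK);
        focused.recycle();
        root.recycle();
        return clicked;
      }
    }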

1.2 Sample Interaction Scenarios We Enable

The combination of random access via touch exploration, backed up by linear navigation starting from the point the user just explored, enables users to:

  • Touch explore an application to understand its screen layout,
  • Use muscle memory to quickly touch parts of the display to access familiar application screens,
  • Use linear navigation to reach the desired item when muscle memory is off by a small amount.

As an example, when using the Google Play Store, I can use muscle memory with touch exploration to find the Search button in the top action bar. Having found an application to install, I can once again use muscle memory to touch roughly in the vicinity of the Install button; if what I touch is not the Install button, I can typically find it with one or two linear navigation steps. Having found the Install button, I can double-tap anywhere on the screen to activate it.

The same use case in ICS, where we lacked Accessibility Focus and linear navigation, would have forced me to use touch exploration exclusively. In instances where muscle memory worked perfectly, this form of interaction was highly effective; but in our experience, it also tended to lead to breakdowns and consequent user frustration when users almost found the control they were looking for but missed it by a small amount.

Having introduced accessibility focus and linear navigation in Jelly Bean, we decided to eliminate the ICS requirement that the user tap on or near a control to activate it; we now enable users to activate the item with accessibility focus by tapping anywhere on the screen. To eliminate spurious taps, especially on tablets, we made this a double-tap rather than a single tap. Note: based on user experience, we may bring back single tap at some point in the future as an end-user customization.

1.3 Summary

Android Accessibility continues to move forward with Jelly Bean, and it will continue to evolve rapidly along with the platform. Please use the Eyes-Free Google Group to provide constructive feedback on what works or doesn't work for you; what is most effective is to objectively describe a given use case and your particular experience.