Abstract:
In an example method, an electronic device receives data regarding a graphical user interface to be presented on a display of the electronic device. The electronic device identifies one or more key regions of the graphical user interface based on the received data and one or more rules. The one or more rules pertain to at least one of a geometric shape, a geometric size, a location, or a hierarchical property. The graphical user interface is presented on the display of the electronic device, and at least one of the key regions of the graphical user interface is indicated using the electronic device.
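A minimal Swift sketch of this kind of rule-based identification, assuming rectangular regions and illustrative thresholds; the UIRegion type, the individual rules, and the all-rules-must-pass policy are assumptions for illustration, not details from the abstract:

    // Hypothetical model of a UI region; names are illustrative.
    struct UIRegion {
        let frame: (x: Double, y: Double, width: Double, height: Double)
        let depth: Int            // position in the view hierarchy
        let childCount: Int
    }

    // One assumed rule per property class named in the abstract:
    // shape, size, location, hierarchy.
    typealias RegionRule = (UIRegion) -> Bool

    let rules: [RegionRule] = [
        { $0.frame.width / max($0.frame.height, 1) < 4 },   // geometric shape: not a thin strip
        { $0.frame.width * $0.frame.height > 10_000 },      // geometric size: large enough to matter
        { $0.frame.y < 200 },                               // location: near the top of the screen
        { $0.depth <= 2 && $0.childCount > 0 }              // hierarchy: shallow container
    ]

    // A region is treated as "key" when it satisfies every rule.
    func keyRegions(in regions: [UIRegion]) -> [UIRegion] {
        regions.filter { region in rules.allSatisfy { $0(region) } }
    }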
Abstract:
Methods, systems, and user interfaces enable users to identify a user-selected location of a user interface with reduced movement and motor effort. A first selection indicator is overlaid on the user interface and moved in a first direction. Responsive to receiving a first user input to stop movement of the first selection indicator, movement of the first selection indicator is ceased over a first location of the user interface. A second selection indicator is overlaid on the user interface and moved in a second direction. Responsive to receiving a second user input to stop movement of the second selection indicator, movement of the second selection indicator is ceased over a second location of the user interface. The user-selected location is determined based at least in part on the first and second locations of the user interface.
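One way to read the two-indicator mechanism is as two-pass point scanning: a first indicator sweeps one axis, a second sweeps the other, and the two stop positions together fix a point. The sketch below assumes linear, wrapping sweeps at constant speed; PointScanner and its parameters are hypothetical:

    // Two-pass point selection: a vertical line sweeps horizontally, then a
    // horizontal line sweeps vertically; the two stop positions fix a point.
    struct PointScanner {
        let screenWidth: Double
        let screenHeight: Double
        let speed: Double   // points per second (assumed constant)

        // Position of the first (vertical) indicator t seconds after it starts.
        func firstIndicatorX(at t: Double) -> Double {
            (speed * t).truncatingRemainder(dividingBy: screenWidth)
        }

        // Position of the second (horizontal) indicator t seconds after it starts.
        func secondIndicatorY(at t: Double) -> Double {
            (speed * t).truncatingRemainder(dividingBy: screenHeight)
        }

        // The user-selected location combines the two stop positions.
        func selectedPoint(stop1: Double, stop2: Double) -> (x: Double, y: Double) {
            (firstIndicatorX(at: stop1), secondIndicatorY(at: stop2))
        }
    }

    let scanner = PointScanner(screenWidth: 390, screenHeight: 844, speed: 120)
    print(scanner.selectedPoint(stop1: 1.5, stop2: 2.0))   // (x: 180.0, y: 240.0)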
Abstract:
In accordance with some embodiments, a method is performed at a device with one or more processors, non-transitory memory, a display, and an input device. The method includes receiving, from an application, application output data and accessibility metadata associated with user interface elements of a user interface of the application, wherein the accessibility metadata can be used by accessibility modules to identify, describe, or enable interaction with the user interface elements. The method includes comparing the application output data to the accessibility metadata. The method includes determining, based on comparing the application output data to the accessibility metadata, that a particular user interface element of the user interface of the application lacks accurate corresponding accessibility metadata. The method includes displaying, on the display, a report including an indication that the particular user interface element lacks accurate corresponding accessibility metadata.
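The comparison step can be sketched as an audit that joins drawn elements to their metadata and flags mismatches. The types below, and the rule that metadata is "accurate" when its label matches the visible text, are assumptions for illustration:

    // Illustrative types; the abstract does not specify the metadata schema.
    struct UIElement {
        let identifier: String
        let visibleText: String
    }

    struct AccessibilityEntry {
        let elementIdentifier: String
        let label: String
    }

    // Flag elements whose metadata is absent or disagrees with what is drawn.
    // Treating "label matches visible text" as the accuracy test is an assumption.
    func auditAccessibility(output: [UIElement],
                            metadata: [AccessibilityEntry]) -> [String] {
        let labels = Dictionary(uniqueKeysWithValues:
            metadata.map { ($0.elementIdentifier, $0.label) })
        return output.compactMap { element -> String? in
            if labels[element.identifier] == element.visibleText { return nil }
            return "\(element.identifier): missing or inaccurate accessibility metadata"
        }
    }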
Abstract:
In general, in one aspect, a method performed by one or more processes executing on a computer system includes receiving an audio signal comprising a range of audio frequencies including high frequencies and low frequencies, converting a first portion of the range of audio frequencies into haptic data, shifting a second portion of the range of audio frequencies to a different range of audio frequencies, and presenting at least one of the converted first portion and the shifted second portion to a human user. Other implementations of this aspect include corresponding systems, apparatus, and computer program products.
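On a toy spectral representation, the split might look like the following; the 500 Hz cutoff, the 2 kHz shift target, and the linear rescaling of the high band are all illustrative assumptions (a real system would operate on sample buffers):

    // A toy spectral representation: (frequency in Hz, amplitude) pairs.
    typealias Band = (frequency: Double, amplitude: Double)

    let cutoff = 500.0         // assumed boundary between "low" and "high" (Hz)
    let shiftTarget = 2_000.0  // assumed top of the listener's comfortable range

    // Low frequencies become haptic intensities; amplitude maps to vibration strength.
    func hapticData(from spectrum: [Band]) -> [Double] {
        spectrum.filter { $0.frequency < cutoff }.map { min($0.amplitude, 1.0) }
    }

    // High frequencies are compressed down into a narrower audible range.
    func shiftedAudio(from spectrum: [Band]) -> [Band] {
        let high = spectrum.filter { $0.frequency >= cutoff }
        guard let maxF = high.map({ $0.frequency }).max(), maxF > shiftTarget else {
            return high
        }
        let scale = (shiftTarget - cutoff) / (maxF - cutoff)
        return high.map {
            (frequency: cutoff + ($0.frequency - cutoff) * scale,
             amplitude: $0.amplitude)
        }
    }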
Abstract:
An electronic device with a display showing a user interface (UI) automatically adjusts the zoom level of a magnification region. The electronic device receives a request to magnify at least a portion of the display showing the UI. The electronic device determines the context in which it was operating at the time of the magnification request. The context comprises display parameters, environmental parameters, or both. The electronic device displays the UI at a zoom level determined based on user preferences. Upon detecting a text input condition, the device resizes and optionally moves the magnification region so that the resized magnification region does not overlap with the newly displayed composition interface window. The device uses an auto-snap feature when scrolling the content within the magnification region from a first boundary causes a second boundary, opposite the first, to be displayed.
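The keyboard-avoidance step reduces to rectangle geometry: if the magnification region intersects the composition window, shrink or move it until it does not. A sketch, assuming a shrink-upward policy that the abstract does not prescribe:

    // Simple rect model; on Apple platforms CGRect would play this role.
    struct Rect {
        var x, y, width, height: Double
        func intersects(_ other: Rect) -> Bool {
            x < other.x + other.width && other.x < x + width &&
                y < other.y + other.height && other.y < y + height
        }
    }

    // On a text-input event, shrink the magnification region so it no longer
    // overlaps the composition window. Clipping the bottom edge at the
    // keyboard's top is an assumed policy, not the patented behavior.
    func resizeForKeyboard(magnifier: Rect, keyboard: Rect) -> Rect {
        guard magnifier.intersects(keyboard) else { return magnifier }
        var resized = magnifier
        resized.height = max(keyboard.y - magnifier.y, 0)
        return resized
    }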
Abstract:
Disclosed herein are systems, methods, and non-transitory computer-readable storage media for gifting a personalized album. A described system generates a personalized album from a request and subsequently gifts the personalized album to another party. The personalized album can include playback attributes stored in a file associated with the album or, alternatively, in metadata of the media items to be gifted. The playback attributes can limit playback of the album to a particular playback sequence or configure playback to reveal the names of media items as they are presented to the user. The personalized album can be accepted, rejected, or re-gifted by the recipient.
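A plausible (hypothetical) schema for such playback attributes, stored alongside the album; the field names and the sidecar-struct layout are assumptions, since the abstract leaves the storage format open:

    // Illustrative schema for a gifted album's playback attributes.
    struct PlaybackAttributes: Codable {
        let fixedSequence: [String]?     // track identifiers, if playback order is locked
        let revealTitlesOnPlay: Bool     // hide names until each item is presented
    }

    struct PersonalizedAlbum: Codable {
        let title: String
        let trackIDs: [String]
        let attributes: PlaybackAttributes
    }

    // The recipient's player honors the locked sequence when one is present.
    func playbackOrder(for album: PersonalizedAlbum) -> [String] {
        album.attributes.fixedSequence ?? album.trackIDs
    }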
Abstract:
While an electronic device with a display and a touch-sensitive surface is in a screen reader accessibility mode, the device displays an application launcher screen including a plurality of application icons. A respective application icon corresponds to a respective application stored in the device. The device detects a sequence of one or more gestures on the touch-sensitive surface that correspond to one or more characters. A respective gesture that corresponds to a respective character is a single finger gesture that moves across the touch-sensitive surface along a respective path that corresponds to the respective character. The device determines whether the detected sequence of one or more gestures corresponds to a respective application icon of the plurality of application icons, and, in response to determining that the detected sequence of one or more gestures corresponds to the respective application icon, performs a predefined operation associated with the respective application icon.
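Assuming an upstream recognizer has already converted each single-finger path into a character, matching the accumulated characters to an application icon reduces to a prefix test over icon names, as in this illustrative sketch:

    // Returns the unique icon whose name begins with the recognized characters,
    // or nil while the sequence is still ambiguous. The prefix-match policy is
    // an assumption; the abstract only requires some correspondence test.
    func matchingIcon(for recognizedCharacters: [Character],
                      iconNames: [String]) -> String? {
        let typed = String(recognizedCharacters).lowercased()
        let candidates = iconNames.filter { $0.lowercased().hasPrefix(typed) }
        return candidates.count == 1 ? candidates.first : nil
    }

    let icons = ["Mail", "Maps", "Music"]
    print(matchingIcon(for: ["M", "a"], iconNames: icons) ?? "ambiguous")      // ambiguous
    print(matchingIcon(for: ["M", "a", "i"], iconNames: icons) ?? "ambiguous") // Mail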
Abstract:
Systems and processes for scanning a user interface are disclosed. One process can include scanning multiple elements within a user interface by highlighting the elements. The process can further include receiving a selection while one of the elements is highlighted and performing an action on the element that was highlighted when the selection was received. The action can include scanning the contents of the selected element or performing an action associated with the selected element. The process can be used to navigate an array of application icons, a menu of options, a standard desktop or laptop operating system interface, or the like. The process can also be used to perform gestures on a touch-sensitive device or mouse and track pad gestures (e.g., flick, tap, or freehand gestures).
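A minimal model of the scanning loop: elements are highlighted in turn, and a selection either descends into a group's contents or performs the leaf element's action. The tree structure and names below are illustrative:

    // A scannable element; groups have children, leaves carry an action.
    class ScanNode {
        let name: String
        let children: [ScanNode]
        init(_ name: String, children: [ScanNode] = []) {
            self.name = name
            self.children = children
        }
    }

    // `index` identifies the element highlighted when the selection arrived.
    func scan(_ elements: [ScanNode], selectionAt index: Int) {
        let chosen = elements[index]
        if chosen.children.isEmpty {
            print("activate \(chosen.name)")       // leaf: perform its action
        } else {
            scan(chosen.children, selectionAt: 0)  // group: scan its contents next
        }
    }

    let grid = ScanNode("home screen", children: [
        ScanNode("row 1", children: [ScanNode("Mail"), ScanNode("Safari")]),
        ScanNode("row 2", children: [ScanNode("Notes")])
    ])
    scan(grid.children, selectionAt: 0)   // prints "activate Mail"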