Abstract:
Methods and devices for presenting representations of open windows on a device are provided herein. More particularly, the method includes displaying a desktop application manager. The desktop application manager can include cards or thumbnails presented in a tray that can be navigated. A card may be selected and placed onto a display to make the corresponding application active. Thus, the desktop application manager can allow a user to navigate and manage open windows without scrolling through a display of the windows.
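For illustration only, the following is a minimal Kotlin sketch of one way such a card tray might be modeled, with cards that can be navigated and selected to activate an application; the names Card, CardTray, and select are assumptions, not terms from the abstract.

```kotlin
// Hypothetical card tray sketch; names are illustrative assumptions.
data class Card(val windowId: Int, val title: String, var active: Boolean = false)

class CardTray {
    private val cards = mutableListOf<Card>()
    private var focusIndex = 0

    fun add(card: Card) { cards.add(card) }

    // Navigate the tray without scrolling the open windows themselves.
    fun next(): Card? {
        if (cards.isEmpty()) return null
        focusIndex = (focusIndex + 1) % cards.size
        return cards[focusIndex]
    }

    fun previous(): Card? {
        if (cards.isEmpty()) return null
        focusIndex = (focusIndex - 1 + cards.size) % cards.size
        return cards[focusIndex]
    }

    // Selecting a card places its window on a display and makes the application active.
    fun select(windowId: Int): Card? {
        val chosen = cards.find { it.windowId == windowId } ?: return null
        cards.forEach { it.active = false }
        chosen.active = true
        return chosen
    }
}

fun main() {
    val tray = CardTray()
    tray.add(Card(1, "Mail"))
    tray.add(Card(2, "Browser"))
    println(tray.next())     // Card(windowId=2, title=Browser, active=false)
    println(tray.select(2))  // Card(windowId=2, title=Browser, active=true)
}
```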
Abstract:
A multi-display device is adapted to be dockable or otherwise associable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions of the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad.
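For illustration only, a minimal sketch of how docking could mirror displayable information to the smartpad and route smartpad input back to the device; MultiDisplayDevice, DisplaySink, and Smartpad are hypothetical names, not from the abstract.

```kotlin
// Hypothetical docking/mirroring sketch; names are illustrative assumptions.
interface DisplaySink { fun show(frame: String) }

class MultiDisplayDevice {
    private val sinks = mutableListOf<DisplaySink>()
    fun attach(sink: DisplaySink) { sinks.add(sink) }   // dock
    fun detach(sink: DisplaySink) { sinks.remove(sink) } // undock
    // Whatever the device renders is forwarded to every attached display,
    // so a docked smartpad shows the same information as the device screens.
    fun render(frame: String) = sinks.forEach { it.show(frame) }
}

class Smartpad(private val name: String) : DisplaySink {
    override fun show(frame: String) = println("[$name] $frame")
    // Acting as an input/output extension: smartpad input is routed to the device.
    fun sendGesture(device: MultiDisplayDevice, gesture: String) =
        device.render("handled gesture: $gesture")
}

fun main() {
    val device = MultiDisplayDevice()
    val pad = Smartpad("smartpad")
    device.attach(pad)            // dock the device with the smartpad
    device.render("home screen")  // mirrored to the smartpad
    pad.sendGesture(device, "tap")
}
```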
Abstract:
Methods and systems for presenting a user interface that includes a virtual keyboard are provided. More particularly, a virtual keyboard can be presented using one or more touch screens included in a multiple display device. The content of the virtual keyboard can be controlled in response to user input. Configurable portions of the virtual keyboard include selectable rows of virtual keys. In addition, whether selectable rows of virtual keys and/or a suggestion bar are displayed together with the standard character and control keys of the virtual keyboard can be determined in response to context or user input.
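For illustration only, a minimal sketch of choosing which optional rows accompany the standard keys based on context or user input; KeyboardConfig, configFor, and buildLayout are hypothetical names, not from the abstract.

```kotlin
// Hypothetical keyboard configuration sketch; names are illustrative assumptions.
data class KeyboardConfig(
    val showNumberRow: Boolean = false,
    val showSuggestionBar: Boolean = false
)

// Context or explicit user input decides which optional rows are shown
// together with the standard character and control keys.
fun configFor(context: String, userPrefersNumbers: Boolean): KeyboardConfig = when (context) {
    "password" -> KeyboardConfig(showNumberRow = true, showSuggestionBar = false)
    "text"     -> KeyboardConfig(showNumberRow = userPrefersNumbers, showSuggestionBar = true)
    else       -> KeyboardConfig()
}

fun buildLayout(config: KeyboardConfig): List<String> {
    val rows = mutableListOf<String>()
    if (config.showSuggestionBar) rows.add("suggestion bar")
    if (config.showNumberRow) rows.add("1 2 3 4 5 6 7 8 9 0")
    rows.add("q w e r t y u i o p")   // standard character rows
    rows.add("a s d f g h j k l")
    rows.add("z x c v b n m")
    rows.add("shift space enter")     // control keys
    return rows
}

fun main() {
    buildLayout(configFor("text", userPrefersNumbers = true)).forEach(::println)
}
```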
Abstract:
A dual display communication device includes a gesture capture region to receive a gesture, a first touch sensitive display to receive a gesture and display images (such as a desktop or a window of an application), and a second touch sensitive display to receive a gesture and display images. Middleware receives a gesture indicating that a displayed image is to be moved from the first touch sensitive display to the second touch sensitive display, for example to maximize a window so that it covers portions of both displays simultaneously. In response, and prior to movement of the displayed image to the second touch sensitive display, the middleware moves a transition indicator from the first touch sensitive display to the second touch sensitive display, to a selected position to be occupied by the displayed image. Thereafter, the middleware moves the displayed image from the first touch sensitive display to the selected position on the second touch sensitive display.
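For illustration only, a minimal sketch of the two-step sequence described above, in which the transition indicator is shown at the target position before the displayed image itself is moved; Display and Middleware are hypothetical names, not the patent's actual middleware.

```kotlin
// Hypothetical two-step move sketch; names are illustrative assumptions.
class Display(val name: String) {
    val contents = mutableListOf<String>()
    fun show(item: String) { contents.add(item); println("$name shows $item") }
    fun hide(item: String) { contents.remove(item); println("$name hides $item") }
}

class Middleware(private val first: Display, private val second: Display) {
    fun onMoveGesture(image: String) {
        // Step 1: before the image itself moves, place a transition indicator
        // at the position it will occupy on the second display.
        second.show("transition indicator for $image")
        // Step 2: thereafter, move the displayed image to that position.
        first.hide(image)
        second.hide("transition indicator for $image")
        second.show(image)
    }
}

fun main() {
    val first = Display("display 1")
    val second = Display("display 2")
    first.show("browser window")
    Middleware(first, second).onMoveGesture("browser window")
}
```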
Abstract:
Methods and devices for selectively presenting a user interface or “desktop” across two devices are provided. More particularly, a unified desktop is presented across a device and a computer system that comprise a unified system. The unified desktop acts as a single user interface that presents data and receives user interaction in a seamless environment that emulates a personal computing environment. To function within the personal computing environment, the unified desktop includes a process for docking and undocking the device with the computer system. The unified desktop presents a new user interface to allow access to functions of the unified desktop.
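For illustration only, a minimal sketch of a dock/undock state machine for such a unified system; UnifiedSystem and DockState are hypothetical names, not from the abstract.

```kotlin
// Hypothetical unified-desktop docking sketch; names are illustrative assumptions.
enum class DockState { UNDOCKED, DOCKED }

class UnifiedSystem {
    var state = DockState.UNDOCKED
        private set

    // Docking joins the device and the computer system behind one desktop.
    fun dock() {
        if (state == DockState.UNDOCKED) {
            state = DockState.DOCKED
            println("unified desktop presented across device and computer system")
        }
    }

    // Undocking returns each machine to its own user interface.
    fun undock() {
        if (state == DockState.DOCKED) {
            state = DockState.UNDOCKED
            println("device and computer system return to separate desktops")
        }
    }
}

fun main() {
    val system = UnifiedSystem()
    system.dock()
    system.undock()
}
```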
Abstract:
Methods and devices for selectively presenting a user interface in a dual screen device are provided. More particularly, the method includes providing a gallery for the dual screen device. The gallery can present one or more images in a user interface. The gallery user interface can adapt to changes in the device configuration. Further, the gallery can display images or videos in the various device configurations.
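For illustration only, a minimal sketch of a gallery layout choice that adapts to the device configuration and to whether the item is an image or a video; DeviceConfiguration and galleryLayout are hypothetical names, not from the abstract.

```kotlin
// Hypothetical gallery layout sketch; names are illustrative assumptions.
enum class DeviceConfiguration { SINGLE_SCREEN_PORTRAIT, SINGLE_SCREEN_LANDSCAPE, DUAL_SCREEN }

fun galleryLayout(config: DeviceConfiguration, isVideo: Boolean): String = when (config) {
    DeviceConfiguration.SINGLE_SCREEN_PORTRAIT ->
        if (isVideo) "full-screen video on one display" else "single-column image grid"
    DeviceConfiguration.SINGLE_SCREEN_LANDSCAPE ->
        if (isVideo) "letterboxed video" else "two-column image grid"
    DeviceConfiguration.DUAL_SCREEN ->
        if (isVideo) "video on one screen, controls on the other"
        else "image on one screen, thumbnail strip on the other"
}

fun main() {
    DeviceConfiguration.values().forEach { config ->
        println("$config -> ${galleryLayout(config, isVideo = false)}")
    }
}
```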
Abstract:
Systems and methods are provided for displaying a desktop on a multi-screen device in response to opening the device. The window stack can change based on a change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. When the device is opened, a composite display can expand over the area of the two or more displays of the device. The desktop expands to fill the composite display area and is also displayed on the second of the displays after the device is opened.
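For illustration only, a minimal sketch of a window stack whose desktop expands across the composite display when the device opens; Window, WindowStack, and onDeviceOpened are hypothetical names, not from the abstract.

```kotlin
// Hypothetical window stack sketch; names are illustrative assumptions.
data class Window(val name: String, var spansDisplays: Int = 1)

class WindowStack {
    private val stack = ArrayDeque<Window>()
    fun push(window: Window) = stack.addLast(window)
    fun top(): Window? = stack.lastOrNull()

    // When the device transitions from closed (one display) to open
    // (composite display), the desktop expands across the available displays.
    fun onDeviceOpened(displayCount: Int) {
        stack.forEach { window ->
            if (window.name == "desktop") window.spansDisplays = displayCount
        }
    }
}

fun main() {
    val stack = WindowStack()
    stack.push(Window("desktop"))
    stack.onDeviceOpened(displayCount = 2)
    println(stack.top())  // Window(name=desktop, spansDisplays=2)
}
```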
Abstract:
A dual-screen user device and methods for revealing a combination of selected desktops and applications on single and dual screens are disclosed. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and thereby hidden. Hidden desktops and applications can be re-displayed by other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications, providing a user with an intuitive ability to manage multiple applications and desktops running simultaneously. One embodiment provides an annunciator window extending across both screens in a dual screen configuration. The annunciator window provides alerts, notifications, and statuses of the device in an increased area, thereby enhancing the viewability of the information in the window. The annunciator window can be expanded over a selected screen to view the full contents of the window without having to minimize or close running applications.
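For illustration only, a minimal sketch of an annunciator window that can be expanded over a selected screen to reveal its full contents and collapsed back to a strip spanning both screens; AnnunciatorWindow and its methods are hypothetical names, not from the abstract.

```kotlin
// Hypothetical annunciator window sketch; names are illustrative assumptions.
class AnnunciatorWindow {
    private val notifications = mutableListOf<String>()
    private var expandedOver: Int? = null   // screen index, or null when collapsed

    fun notify(message: String) { notifications.add(message) }

    // Expanding over a screen reveals the full contents without requiring the
    // user to minimize or close running applications.
    fun expandOver(screen: Int): List<String> {
        expandedOver = screen
        return notifications.toList()
    }

    // Collapsing returns the compact strip that extends across both screens.
    fun collapse(): String {
        expandedOver = null
        return "alerts: ${notifications.size}"
    }
}

fun main() {
    val annunciator = AnnunciatorWindow()
    annunciator.notify("New message")
    annunciator.notify("Battery low")
    println(annunciator.expandOver(screen = 1))
    println(annunciator.collapse())
}
```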