-
Publication No.: US10936185B2
Publication Date: 2021-03-02
Application No.: US16884943
Application Date: 2020-05-27
Applicant: Google LLC
Inventor: Leonardo Giusti , Ivan Poupyrev , Patrick M. Amihood
IPC: G06F3/0488 , G06K9/00 , G06T19/00 , H04M1/725
Abstract: This document describes techniques and systems that enable a smartphone-based radar system facilitating ease and accuracy of user interactions with a user interface. The techniques and systems can be implemented in an electronic device, such as a smartphone, and use a radar field to accurately determine three-dimensional (3D) gestures that can be used in combination with other inputs, such as touch or voice inputs, to interact with the user interface. These techniques allow the user to make 3D gestures from a distance—and enable seamless integration of touch and voice commands with 3D gestures to improve functionality and user enjoyment.
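The abstract describes combining radar-sensed 3D gestures with other inputs such as touch or voice. A minimal sketch of that kind of input fusion, with all names (`handle_input`, the gesture strings) invented for illustration and not taken from the patent:

```python
def handle_input(radar_gesture, touch_point):
    """Resolve a combined radar + touch interaction into a UI action.

    radar_gesture: a label for a recognized 3D gesture, or None.
    touch_point: an (x, y) screen coordinate, or None.
    """
    if touch_point is not None and radar_gesture == "reach":
        return "highlight-target"   # touch anchors the target, radar refines
    if radar_gesture == "swipe-left":
        return "previous-item"      # pure 3D gesture, made from a distance
    if touch_point is not None:
        return "select"             # plain touch fallback
    return "no-op"
```

The point of the sketch is only that the two input channels are resolved jointly rather than independently, which is what lets a distant gesture and a touch command complement each other.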
-
Publication No.: US10930251B2
Publication Date: 2021-02-23
Application No.: US16802024
Application Date: 2020-02-26
Applicant: Google LLC
Inventor: Leonardo Giusti , Ivan Poupyrev , Brandon Barbello , Patrick M. Amihood
Abstract: This document describes techniques and systems that enable a smartphone-based radar system for facilitating awareness of user presence and orientation. The techniques and systems use a radar field to accurately determine a user's location and physical orientation with respect to an electronic device, such as a smartphone. The radar field also enables the device to receive 3D gestures from the user to interact with the device. The techniques allow the device to provide functionality based on the user's presence and orientation, and to appropriately adjust the timing, content, and format of the device's interactions with the user.
-
Publication No.: US20200264765A1
Publication Date: 2020-08-20
Application No.: US16080293
Application Date: 2016-12-07
Applicant: Google LLC
Inventor: Ivan Poupyrev , Carsten C. Schwesig , Jack Schulze , Timo Arnall
IPC: G06F3/0484
Abstract: Systems and methods of providing control-article-based control of a user interface are provided. For instance, a presence of a control article can be detected in a first proximity zone proximate a user device. Responsive to detecting the presence of the control article, presentation data corresponding to a presentation mode of a user interface associated with the user computing device can be provided for display. A presence of the control article can be detected in a second proximity zone proximate the user computing device. The second proximity zone can define a separate physical area than the first proximity zone. Responsive to detecting the presence of the control article in the second proximity zone, interactive data corresponding to an interactive mode of the user interface can be provided for display.
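The two proximity zones in this abstract amount to a simple mode switch: a farther zone triggers a presentation mode, and a nearer, physically separate zone triggers an interactive mode. A hedged sketch of that logic, with zone encoding and mode names chosen for illustration:

```python
def ui_mode(control_article_zone):
    """Map the detected proximity zone of a control article to a UI mode.

    control_article_zone: 1 for the first (farther) zone, 2 for the second
    (nearer) zone, or None when no control article is detected.
    """
    if control_article_zone == 1:
        return "presentation"   # presentation data is provided for display
    if control_article_zone == 2:
        return "interactive"    # interactive data is provided for display
    return "idle"               # no control article in either zone
```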
-
Publication No.: US20200258366A1
Publication Date: 2020-08-13
Application No.: US16827024
Application Date: 2020-03-23
Applicant: Google LLC
Inventor: Ivan Poupyrev , Antonio Xavier Cerruto , Mustafa Emre Karagozler , David Scott Allmon , Munehiko Sato , Susan Jane Wilhite , Shiho Fukuhara
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
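The ergonomic assessment described here reduces, at its core, to classifying a body-segment bend angle into one of several ergonomic zones. A minimal sketch; the thresholds below are invented for illustration, as the abstract does not publish specific values:

```python
def ergonomic_zone(bend_angle_deg):
    """Classify a body-segment bend angle (degrees) into an ergonomic zone.

    Zone boundaries are illustrative assumptions, not values from the patent.
    """
    if bend_angle_deg < 20:
        return "safe"      # near-neutral posture
    if bend_angle_deg < 45:
        return "caution"   # moderate flexion
    return "hazard"        # pronounced flexion
```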
-
Publication No.: US10660379B2
Publication Date: 2020-05-26
Application No.: US16356748
Application Date: 2019-03-18
Applicant: Google LLC
Inventor: Ivan Poupyrev , Carsten C. Schwesig , Mustafa Emre Karagozler , Hakim K. Raja , David Scott Allmon , Gerard George Pallipuram , Shiho Fukuhara , Nan-Wei Gong
IPC: A41D1/00 , G06F1/16 , D02G3/44 , D02G3/12 , D03D1/00 , G06F3/041 , G06F3/044 , G06F3/0488 , D02G3/36
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
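The gesture manager described in the abstract is essentially a dispatcher: it recognizes an interaction with the fabric and triggers a corresponding device function. A hypothetical sketch of that mapping; the gesture labels and action names are invented, not from the patent:

```python
# Illustrative gesture-to-action table for a fabric gesture manager.
ACTIONS = {
    "double-tap": "answer_call",
    "swipe-up": "send_text",
    "hold": "create_journal_entry",
}

def dispatch(gesture):
    """Trigger the action mapped to a recognized fabric gesture."""
    return ACTIONS.get(gesture, "ignore")
```

In practice the manager could run on the garment itself or on a wirelessly paired device, as the abstract notes; the dispatch logic is the same either way.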
-
Publication No.: US20200150771A1
Publication Date: 2020-05-14
Application No.: US16189346
Application Date: 2018-11-13
Applicant: Google LLC
Inventor: Leonardo Giusti , Ivan Poupyrev , Eiji Hayashi , Patrick M. Amihood , Bryan Allen
IPC: G06F3/01 , G06F3/0354 , G06F3/03 , G01S13/89 , G01S13/06
Abstract: This document describes techniques and systems that enable a radar-image shaper for radar-based applications. A radar field enables an electronic device to accurately determine a characteristic disposition (e.g., a location, orientation, velocity, or direction) of an object in the radar field. The characteristic disposition is determined by detecting a radar cross-section (or radar signature) of a radar-image shaper that is included in the object. The shape of the radar-image shaper produces a known signature when illuminated by the radar field. Using these techniques, the electronic device can determine a characteristic disposition of the object, which allows the object to be used to interact with the electronic device using gestures and other position-based techniques. Because the radar-image shaper enables a passive object to control applications on the electronic device, users have an interaction method with a rich library of gestures and controls that does not require additional components or a battery.
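Identifying an object by the known radar signature of its radar-image shaper is, in the simplest case, a nearest-match lookup against a library of known signatures. A hedged sketch under that simplifying assumption (a scalar signature value and a tolerance threshold, both invented for illustration):

```python
def identify_object(measured_signature, known_signatures, tolerance=0.1):
    """Return the id of the known radar-image shaper whose signature best
    matches the measurement, or None if nothing is within tolerance.

    known_signatures: dict mapping object id -> reference signature value.
    """
    best_id, best_diff = None, tolerance
    for obj_id, signature in known_signatures.items():
        diff = abs(measured_signature - signature)
        if diff < best_diff:
            best_id, best_diff = obj_id, diff
    return best_id
```

A real radar cross-section is a function of aspect angle and frequency rather than a single number, so an actual matcher would compare signature profiles; the scalar version only shows the lookup structure.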
-
Publication No.: US20200064996A1
Publication Date: 2020-02-27
Application No.: US16112130
Application Date: 2018-08-24
Applicant: Google LLC
Inventor: Leonardo Giusti , Ivan Poupyrev , Patrick M. Amihood
IPC: G06F3/0488 , G06K9/00 , G06T19/00 , H04M1/725
Abstract: This document describes techniques and systems that enable a smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface. The techniques and systems use a radar field to accurately determine three-dimensional (3D) gestures that can be used to interact with augmented-reality (AR) objects that are presented on a display of an electronic device, such as a smartphone. These techniques allow the user to make 3D gestures from a distance—the user does not have to hold the electronic device steady while touching the screen and the gestures do not obstruct the user's view of the AR objects presented on the display.
-
Publication No.: US10409385B2
Publication Date: 2019-09-10
Application No.: US15703511
Application Date: 2017-09-13
Applicant: Google LLC
Inventor: Ivan Poupyrev
Abstract: This document describes techniques and devices for occluded gesture recognition. Through use of the techniques and devices described herein, users may control their devices even when a user's gesture is occluded by some material between the user's hands and the device itself. Thus, the techniques enable users to control their mobile devices in many situations in which control is desired but conventional techniques do not permit effective control, such as when a user's mobile computing device is occluded by being in a purse, bag, pocket, or even in another room.
-
Publication No.: US20190208837A1
Publication Date: 2019-07-11
Application No.: US16356748
Application Date: 2019-03-18
Applicant: Google LLC
Inventor: Ivan Poupyrev , Carsten C. Schwesig , Mustafa Emre Karagozler , Hakim K. Raja , David Scott Allmon , Gerard George Pallipuram , Shiho Fukuhara , Nan-Wei Gong
Abstract: This document describes techniques using, and objects embodying, an interactive fabric which is configured to sense user interactions in the form of single or multi-touch-input (e.g., gestures). The interactive fabric may be integrated into a wearable interactive garment (e.g., a jacket, shirt, or pants) that is coupled (e.g., via a wired or wireless connection) to a gesture manager. The gesture manager may be implemented at the interactive garment, or remote from the interactive garment, such as at a computing device that is wirelessly paired with the interactive garment and/or at a remote cloud based service. Generally, the gesture manager recognizes user interactions to the interactive fabric, and in response, triggers various different types of functionality, such as answering a phone call, sending a text message, creating a journal entry, and so forth.
-
Publication No.: US20190138109A1
Publication Date: 2019-05-09
Application No.: US16238464
Application Date: 2019-01-02
Applicant: Google LLC
Inventor: Ivan Poupyrev , Carsten Schwesig , Jack Schulze , Timo Arnall , Durrell Grant Bevington Bishop
IPC: G06F3/01 , G06K9/00 , G06F3/041 , G06F3/0488
Abstract: Gesture detection and interaction techniques are described. Object detection used to support the gestures may be accomplished in a variety of ways, such as by using radio waves as part of a radar technique. In a first example, the techniques are implemented such that one hand of a user sets a context for a gesture that is defined by another hand of the user. In another example, a gesture recognition mode is utilized. In yet another example, detection of distance is used such that the same motions may be used to differentiate between operations performed. In a further example, split gestures are supported. In another instance, entry into a gesture recognition mode may be implemented through touch and then recognized through three-dimensional orientation and motion of that hand or another.
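The first example in the abstract, where one hand sets a context and the other hand defines the gesture, can be sketched as a two-input lookup. The pose and motion labels below are illustrative placeholders, not terms from the patent:

```python
def interpret(context_hand_pose, gesture_hand_motion):
    """Combine a context set by one hand with a gesture made by the other.

    Returns the resulting operation, or None if the pair is unrecognized.
    """
    rules = {
        ("holding-knob", "turn"): "adjust_volume",   # context: volume control
        ("flat-palm", "swipe"): "scroll",            # context: navigation
    }
    return rules.get((context_hand_pose, gesture_hand_motion))
```

The same table structure extends naturally to the abstract's distance-based example: adding a distance bucket to the key lets identical motions map to different operations.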
-