-
Publication No.: US20190257939A1
Publication Date: 2019-08-22
Application No.: US16401611
Application Date: 2019-05-02
Applicant: Google LLC
Inventor: Carsten C. Schwesig , Ivan Poupyrev
IPC: G01S13/90 , H04Q9/00 , G06F21/62 , G06K9/00 , G06K9/62 , G01S7/41 , G06F3/01 , G01S13/56 , G01S13/86 , G06F16/245
Abstract: A gesture component with a gesture library is described. The gesture component is configured to expose operations for execution by an application of a computing device based on detected gestures. In one example, an input is detected using a three-dimensional object detection system of a gesture component of the computing device. A gesture is recognized by the gesture component based on the detected input through comparison with a library of gestures maintained by the gesture component. An operation that corresponds to the gesture is then identified by the gesture component using the library of gestures. The operation is exposed by the gesture component via an application programming interface to at least one application executed by the computing device to control performance of the operation by the at least one application.
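The abstract describes an architectural flow (detect a 3D input, match it against a gesture library, expose the matching operation to subscribing applications through an API) rather than an implementation. The Python sketch below only illustrates that flow under stated assumptions; the names (GestureComponent, Gesture, on_input) and the nearest-template matcher are illustrative, not the patented method.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Optional

@dataclass
class Gesture:
    name: str
    template: List[float]  # simplified feature template for one gesture

class GestureComponent:
    """Illustrative gesture component backed by a gesture library."""

    def __init__(self, library: Dict[str, Gesture], operations: Dict[str, str]):
        self.library = library        # gesture name -> template
        self.operations = operations  # gesture name -> operation identifier
        self.subscribers: List[Callable[[str], None]] = []

    def subscribe(self, callback: Callable[[str], None]) -> None:
        """Applications register to be told which operation to perform."""
        self.subscribers.append(callback)

    def on_input(self, features: List[float]) -> Optional[str]:
        """Match detected input against the library and expose the operation."""
        gesture = self._recognize(features)
        if gesture is None:
            return None
        operation = self.operations[gesture.name]
        for callback in self.subscribers:  # the "API" exposure to applications
            callback(operation)
        return operation

    def _recognize(self, features: List[float]) -> Optional[Gesture]:
        # Nearest-template match; a real system would use a radar ML pipeline.
        if not self.library:
            return None
        def distance(template: List[float]) -> float:
            return sum((a - b) ** 2 for a, b in zip(features, template))
        best = min(self.library.values(), key=lambda g: distance(g.template))
        return best if distance(best.template) < 1.0 else None
```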
-
Publication No.: US20190232156A1
Publication Date: 2019-08-01
Application No.: US16380245
Application Date: 2019-04-10
Applicant: Google LLC
Inventor: Patrick M. Amihood , Ivan Poupyrev
CPC classification number: A63F13/21 , A63F13/24 , A63F2300/8082 , G01S7/41 , G01S7/415 , G01S13/56 , G01S13/66 , G01S13/86 , G01S13/865 , G01S13/867 , G01S13/90 , G01S13/931 , G01S19/42 , G06F1/163 , G06F3/011 , G06F3/017 , G06F3/0346 , G06F3/0484 , G06F3/165 , G06F16/245 , G06F21/6245 , G06F2203/0384 , G06K9/00201 , G06K9/6288 , G06K9/629 , G06T7/75 , G08C17/02 , G08C2201/93 , H04Q9/00 , H04Q2209/883
Abstract: Techniques are described herein that enable advanced gaming and virtual reality control using radar. These techniques enable small motions and displacements to be tracked, even at the millimeter or submillimeter scale, so that user control actions can be recognized even when those actions are optically occluded or obscured.
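The abstract claims millimeter and submillimeter tracking but does not publish the signal processing. One common way to reach that resolution with radar is to track the phase of the reflected signal; the sketch below assumes that approach and a 60 GHz carrier, both assumptions rather than details from the patent.

```python
import math

def displacement_from_phase(phase_prev: float, phase_curr: float,
                            wavelength_m: float = 0.005) -> float:
    """Estimate radial displacement (m) from the change in reflected-signal phase.

    Assumes a 60 GHz radar (wavelength ~5 mm); because the path is two-way,
    a full 2*pi phase wrap corresponds to half a wavelength of motion.
    """
    delta = phase_curr - phase_prev
    # unwrap into (-pi, pi] so small motions are not aliased
    delta = (delta + math.pi) % (2 * math.pi) - math.pi
    return delta * wavelength_m / (4 * math.pi)

# Example: a 0.1 rad phase change at 5 mm wavelength is ~0.04 mm of motion.
print(displacement_from_phase(0.0, 0.1))  # ~3.98e-05 m
```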
-
Publication No.: US10366593B2
Publication Date: 2019-07-30
Application No.: US15570461
Application Date: 2017-02-08
Applicant: Google LLC
Inventor: Ivan Poupyrev , Antonio Xavier Cerruto , Mustafa Emre Karagozler , David Scott Allmon , Munehiko Sato , Susan Jane Wilhite , Shiho Fukuhara
Abstract: Systems and methods of determining an ergonomic assessment for a user are provided. For instance, sensor data can be received from one or more sensors implemented with an ergonomic assessment garment worn by a user. Corporeal data associated with at least one body segment of the user can be determined based at least in part on the sensor data. The corporeal data is associated with a bend angle associated with the at least one body segment. An ergonomic assessment associated with the user can be determined based at least in part on the corporeal data. The ergonomic assessment can include an indication of one or more ergonomic zones associated with the user, the one or more ergonomic zones being determined based at least in part on the bend angle associated with the at least one body segment.
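The abstract describes mapping a bend angle derived from garment sensors to ergonomic zones. As a rough illustration only, the sketch below uses hypothetical zone names and angle thresholds that are not taken from the patent.

```python
def ergonomic_zone(bend_angle_deg: float) -> str:
    """Map a body-segment bend angle to a coarse ergonomic zone.

    Thresholds are illustrative only; the patent does not publish them.
    """
    if bend_angle_deg < 20:
        return "safe"
    if bend_angle_deg < 60:
        return "caution"
    return "hazardous"

def assess(bend_angles_deg: list) -> dict:
    """Aggregate per-sample zones into a simple ergonomic assessment."""
    counts = {"safe": 0, "caution": 0, "hazardous": 0}
    for angle in bend_angles_deg:
        counts[ergonomic_zone(angle)] += 1
    return counts

print(assess([5.0, 25.0, 70.0, 15.0]))  # {'safe': 2, 'caution': 1, 'hazardous': 1}
```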
-
Publication No.: US20190187265A1
Publication Date: 2019-06-20
Application No.: US15844229
Application Date: 2017-12-15
Applicant: Google LLC
Inventor: Brandon Barbello , Leonardo Giusti , Ivan Poupyrev , Eiji Hayashi
Abstract: Techniques and devices for seamless authentication using radar are described. In some implementations, a radar field is provided through a radar-based authentication system. The radar-based authentication system can sense reflections from an object in the radar field and analyze the reflections to determine whether the object is a person. In response to determining that the object is a person, the radar-based authentication system can sense an identifying characteristic associated with the person. Based on the identifying characteristic, the radar-based authentication system can determine that the person is an authorized user.
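The abstract lays out a three-stage pipeline: decide whether a reflecting object is a person, sense an identifying characteristic of that person, then check authorization. The sketch below only mirrors that control flow; the placeholder heuristics (height check, signature string) are assumptions, not the patented analysis.

```python
from typing import Optional, Set

def authenticate(reflections: list,
                 authorized_signatures: Set[str]) -> Optional[str]:
    """Follow the abstract's three stages; stage internals are placeholders."""
    if not is_person(reflections):
        return None
    signature = identifying_characteristic(reflections)
    return signature if signature in authorized_signatures else None

def is_person(reflections: list) -> bool:
    # Placeholder: e.g. size plus micro-motion (breathing) heuristics.
    return any(r.get("height_m", 0) > 1.0 for r in reflections)

def identifying_characteristic(reflections: list) -> str:
    # Placeholder: e.g. a gait or body-shape signature string.
    return reflections[0].get("signature", "unknown")

print(authenticate([{"height_m": 1.7, "signature": "user_a"}], {"user_a"}))  # user_a
```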
-
Publication No.: US20180310644A1
Publication Date: 2018-11-01
Application No.: US15963552
Application Date: 2018-04-26
Applicant: Google LLC
Inventor: Ivan Poupyrev , Leonardo Giusti , Mauricio Gutierrez Bravo
IPC: A41D1/00 , G06F3/044 , G08C17/02 , D02G3/44 , A41D1/04 , A43B3/00 , D04B1/24 , A41B1/08 , A63B43/00 , A63B69/00 , A63B71/06 , D04D9/00 , D04B21/20 , A45C3/06 , A61B5/0408 , A61B5/0478 , A61B5/00
CPC classification number: G06F3/1423 , G06F3/0416 , G06F3/1454 , G06F3/147 , G09G3/2092 , G09G5/006 , G09G2340/14 , G09G2354/00 , G09G2370/022 , G09G2370/16 , G09G2380/02
Abstract: This document describes an interactive object with at least one electronics module and a touch sensor. The interactive object may be a garment, garment accessory, or garment container. The interactive object may be configured to provide at least a haptic, audio, or visual output. The interactive object may also include conductive threads, with the touch sensor formed from those conductive threads.
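The abstract does not specify the sensing electronics. A common way to read touch from a grid of conductive threads is to look for the row and column threads whose capacitance changes most; the sketch below assumes that approach, with normalized readings and a threshold that are purely illustrative.

```python
from typing import Optional, Tuple

def touch_position(row_caps: list, col_caps: list,
                   threshold: float = 0.5) -> Optional[Tuple[int, int]]:
    """Return the (row, column) of the strongest touch on a conductive-thread grid.

    row_caps / col_caps are normalized capacitance deltas per thread; the
    values and threshold are illustrative assumptions, not from the patent.
    """
    if not row_caps or not col_caps:
        return None
    r = max(range(len(row_caps)), key=lambda i: row_caps[i])
    c = max(range(len(col_caps)), key=lambda i: col_caps[i])
    if row_caps[r] < threshold or col_caps[c] < threshold:
        return None
    return r, c

print(touch_position([0.1, 0.9, 0.2], [0.0, 0.1, 0.8]))  # (1, 2)
```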
-
Publication No.: US20250156739A1
Publication Date: 2025-05-15
Application No.: US18838561
Application Date: 2022-02-14
Applicant: Google LLC
Inventor: Jingying Hu , Tong Wu , Sriram Krishna Parameswar , Gerard Pallipuram , Ivan Poupyrev
IPC: G06N5/04
Abstract: The present disclosure is directed to inferring user intent based on physical signals. The method includes receiving physical information associated with a first party user and a physical location associated with a merchant. The physical information is indicative of a location of the first party user relative to one or more of a plurality of first party items associated with the merchant. The method includes determining an item interest level for at least one first party item of the plurality of first party items based, at least in part, on the physical information. The method includes initiating a presentation of a content item to the first party user via a user device associated with the first party user based, at least in part, on the item interest level. The content item includes information for the at least one first party item.
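The abstract describes scoring an item interest level from a user's physical location relative to a merchant's items and then triggering a content presentation. The patent gives no formula, so the proximity-plus-dwell weighting below is an illustrative assumption, as are the names Item and interest_level.

```python
import math
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    x: float
    y: float

def interest_level(user_xy, item: Item, dwell_s: float,
                   max_range_m: float = 3.0) -> float:
    """Score 0..1 combining proximity to an item and dwell time near it."""
    d = math.hypot(user_xy[0] - item.x, user_xy[1] - item.y)
    proximity = max(0.0, 1.0 - d / max_range_m)
    dwell = min(dwell_s / 30.0, 1.0)  # saturate after 30 s near the item
    return 0.6 * proximity + 0.4 * dwell

items = [Item("espresso_machine", 2.0, 1.0), Item("grinder", 8.0, 4.0)]
scores = {i.item_id: interest_level((1.5, 1.0), i, dwell_s=12.0) for i in items}
best = max(scores, key=scores.get)
if scores[best] > 0.5:
    print(f"present content for {best}")  # would trigger the content item
```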
-
Publication No.: US12153571B2
Publication Date: 2024-11-26
Application No.: US18495648
Application Date: 2023-10-26
Applicant: Google LLC
Inventor: Ivan Poupyrev , Gaetano Roberto Aiello
IPC: G06F16/242 , G06F3/01 , G06F16/2457 , G06F16/248 , G06F16/29 , G06F16/9537
Abstract: This document describes techniques and devices for a radar recognition-aided search. Through use of a radar-based recognition system, gestures made by, and physiological information about, persons can be determined. In the case of physiological information, the techniques can use this information to refine a search. For example, if a person requests a search for a coffee shop, the techniques may refine the search to coffee shops in the direction that the person is walking. In the case of a gesture, the techniques may refine or base a search solely on the gesture. Thus, a search for information about a store, car, or tree can be made responsive to a gesture pointing at the store, car, or tree with or without explicit entry of a search query.
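The abstract's two refinement cases (physiological information such as walking direction, and a pointing gesture) can be illustrated as a small query-rewriting step. The sketch below is only that illustration; the parameter names and compass quantization are assumptions, not the patented search logic.

```python
def heading_to_compass(heading_deg: float) -> str:
    """Quantize a heading (degrees clockwise from north) to a compass direction."""
    names = ["north", "northeast", "east", "southeast",
             "south", "southwest", "west", "northwest"]
    return names[round(heading_deg / 45.0) % 8]

def refine_query(query: str, heading_deg=None, pointed_at=None) -> str:
    """Refine a search with radar-derived context, per the abstract's two cases.

    heading_deg: walking direction inferred from radar (physiological case).
    pointed_at: entity identified from a pointing gesture (gesture case).
    """
    if pointed_at is not None:
        return f"information about {pointed_at}"
    if heading_deg is not None:
        return f"{query} to the {heading_to_compass(heading_deg)}"
    return query

print(refine_query("coffee shop", heading_deg=92.0))  # "coffee shop to the east"
print(refine_query("", pointed_at="that store"))      # "information about that store"
```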
-
Publication No.: US12111713B2
Publication Date: 2024-10-08
Application No.: US17650488
Application Date: 2022-02-09
Applicant: Google LLC
Inventor: Leonardo Giusti , Ivan Poupyrev , Eiji Hayashi , Patrick M. Amihood
IPC: G06F1/3231 , G01S7/35 , G01S13/04 , G06F1/3287 , G06F3/01
CPC classification number: G06F1/3231 , G01S7/35 , G01S13/04 , G06F1/3287 , G06F3/017
Abstract: This document describes techniques and systems that enable a smartphone-based radar system for determining user intention in a lower-power mode. The techniques and systems use a radar field to enable the smartphone to accurately determine the presence or absence of a user and further determine the intention of the user to interact with the smartphone. Using these techniques, the smartphone can account for the user's nonverbal communication cues to determine and maintain an awareness of users in its environment, and only respond to direct interactions once a user has demonstrated an intention to interact, which preserves battery power. The smartphone may determine the user's intention by recognizing various cues from the user, such as a change in position relative to the smartphone, a change in posture, or by an explicit action, such as a gesture.
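The abstract describes keeping the phone in a lower-power, watch-only mode until radar-derived cues show intent to interact. The sketch below paraphrases those cues (a position change, a posture change, an explicit gesture) into a simple state choice; the state names and decision logic are assumptions, not the patented policy.

```python
from enum import Enum, auto

class PowerState(Enum):
    DORMANT = auto()  # no user detected
    AWARE = auto()    # user present, no intent to interact yet
    ENGAGED = auto()  # intent shown; full interaction enabled

def next_state(present: bool, approaching: bool, facing: bool,
               gestured: bool) -> PowerState:
    """Choose a power state from radar-derived nonverbal cues."""
    if not present:
        return PowerState.DORMANT
    if gestured or (approaching and facing):
        return PowerState.ENGAGED
    return PowerState.AWARE

print(next_state(present=True, approaching=True, facing=False, gestured=False))
# PowerState.AWARE -> the phone stays in a lower-power, watch-only mode
```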
-
Publication No.: US20240094827A1
Publication Date: 2024-03-21
Application No.: US18521539
Application Date: 2023-11-28
Applicant: Google LLC
Inventor: Jung Ook Hong , Patrick M. Amihood , John David Jacobs , Abel Seleshi Mengistu , Leonardo Giusti , Vignesh Sachidanandam , Devon James O'Reilley Stern , Ivan Poupyrev , Brandon Barbello , Tyler Reed Kugler , Johan Prag , Artur Tsurkan , Alok Chandel , Lucas Dupin Moreira Costa , Selim Flavio Cinek
Abstract: Systems and techniques are described for robust radar-based gesture-recognition. A radar system detects radar-based gestures on behalf of application subscribers. A state machine transitions between multiple states based on inertial sensor data. A no-gating state enables the radar system to output radar-based gestures to application subscribers. The state machine also includes a soft-gating state that prevents the radar system from outputting the radar-based gestures to the application subscribers. A hard-gating state prevents the radar system from detecting radar-based gestures altogether. The techniques and systems enable the radar system to determine when not to perform gesture-recognition, enabling user equipment to automatically reconfigure the radar system to meet user demand. By so doing, the techniques conserve power, improve accuracy, or reduce latency relative to many common techniques and systems for radar-based gesture-recognition.
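The abstract names three states (no-gating, soft-gating, hard-gating) driven by inertial sensor data. The state machine sketch below implements those three states; the motion thresholds and the hysteresis on leaving hard-gating are illustrative assumptions, since the patent defines the states but not specific numbers.

```python
from enum import Enum, auto

class GatingState(Enum):
    NO_GATING = auto()    # radar gestures are delivered to subscribers
    SOFT_GATING = auto()  # gestures recognized but not delivered
    HARD_GATING = auto()  # gesture recognition disabled entirely

def next_state(current: GatingState, motion_g: float) -> GatingState:
    """Transition between gating states based on inertial (IMU) motion magnitude."""
    if motion_g > 2.0:  # strong motion, e.g. device being carried or shaken
        return GatingState.HARD_GATING
    if current is GatingState.HARD_GATING and motion_g > 1.0:
        return GatingState.HARD_GATING  # hysteresis: stay gated until motion settles
    if motion_g > 0.5:  # moderate motion: keep sensing, suppress output
        return GatingState.SOFT_GATING
    return GatingState.NO_GATING

def deliver(state: GatingState, gesture: str, subscribers: list) -> None:
    """Only the no-gating state lets recognized gestures reach applications."""
    if state is GatingState.NO_GATING:
        for callback in subscribers:
            callback(gesture)

state = GatingState.NO_GATING
for reading in [0.1, 0.8, 3.0, 1.2, 0.2]:
    state = next_state(state, reading)
    print(reading, state.name)
# 0.1 NO_GATING, 0.8 SOFT_GATING, 3.0 HARD_GATING, 1.2 HARD_GATING, 0.2 NO_GATING
```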
-
Publication No.: US20230350049A1
Publication Date: 2023-11-02
Application No.: US17661401
Application Date: 2022-04-29
Applicant: Google LLC
Inventor: Anandghan Waghmare , Dongeek Shin , Ivan Poupyrev , Shwetak N. Patel , Shahram Izadi , Adarsh Prakash Murthy Kowdle
IPC: G01S13/72 , G01S7/35 , G06F3/0338 , G06F3/01
CPC classification number: G01S13/723 , G01S7/35 , G06F3/0338 , G06F3/014 , G06F2203/0331
Abstract: A method includes transmitting, by a peripheral device communicatively coupled to a wearable device, a frequency-modulated continuous wave (FMCW); receiving, by the peripheral device, a reflected signal based on the FMCW; tracking, by the peripheral device, a movement associated with the peripheral device based on the reflected signal; and communicating, from the peripheral device to the wearable device, information corresponding to the movement associated with the peripheral device.
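The abstract does not publish the tracking pipeline, but FMCW ranging conventionally recovers distance from the beat frequency between the transmitted and reflected chirps via R = c * f_beat * T_chirp / (2 * B). The sketch below applies that standard relation plus a finite-difference velocity estimate; the parameter values are illustrative assumptions.

```python
C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_hz: float, bandwidth_hz: float, chirp_s: float) -> float:
    """Estimate target range from an FMCW beat frequency: R = c*f_b*T / (2*B)."""
    return C * beat_hz * chirp_s / (2.0 * bandwidth_hz)

def track(ranges_m: list, dt_s: float) -> list:
    """Radial velocity (m/s) from successive range estimates (finite difference)."""
    return [(b - a) / dt_s for a, b in zip(ranges_m, ranges_m[1:])]

# Example: a 4 GHz sweep over 100 us; a 20 kHz beat corresponds to ~0.075 m.
r = fmcw_range(20e3, 4e9, 100e-6)
print(round(r, 4))                          # 0.075
print(track([0.075, 0.076, 0.078], 0.01))   # ~[0.1, 0.2] m/s away from the radar
```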