-
Publication No.: US11860267B2
Publication Date: 2024-01-02
Application No.: US17556221
Filing Date: 2021-12-20
Applicant: Oculii Corp.
Inventors: Lang Hong, Steven Hong
IPC Classes: G01S13/72, G01S13/90, G01S13/00, G01S7/42, G01S7/52, G01S13/02, G01S15/66, G01S13/931, G01S13/86, G01S13/44, G01S13/66
CPC Classes: G01S13/723, G01S7/42, G01S13/003, G01S13/90, G01S7/52019, G01S7/52023, G01S13/4463, G01S13/66, G01S13/86, G01S13/931, G01S15/66, G01S2013/0263, G01S2013/932, G01S2013/9316, G01S2013/9322, G01S2013/9323, G01S2013/9324
Abstract: A method for interpolated virtual aperture array radar tracking includes: transmitting first and second probe signals; receiving a first reflected probe signal at a radar array; receiving a second reflected probe signal at the radar array; calculating a target range from at least one of the first and second reflected probe signals; corresponding signal instances of the first reflected probe signal to physical receiver elements of the radar array; corresponding signal instances of the second reflected probe signal to virtual elements of the radar array; interpolating signal instances; calculating a first target angle; and calculating a position of the tracking target relative to the radar array from the target range and first target angle.
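The interpolation-then-angle step of this abstract can be sketched numerically. The array geometry, wavelength, and midpoint complex-average interpolation below are illustrative assumptions, not the patent's exact procedure:

```python
import numpy as np

wavelength = 0.0039            # ~77 GHz automotive band, illustrative
spacing = wavelength / 2       # element spacing of the combined array
true_angle = np.deg2rad(12.0)  # target angle the sketch tries to recover

# Signal instances across the array: the first reflected probe signal maps to
# physical receiver elements, the second to virtual elements (offset by the
# transmitter baseline), giving one combined uniform line of 8 instances here.
n = 8
pos = np.arange(n) * spacing
sig = np.exp(1j * 2 * np.pi * pos * np.sin(true_angle) / wavelength)

# Interpolate additional signal instances midway between neighbours
# (a complex average approximates the midpoint phase for unit phasors).
mid = 0.5 * (sig[:-1] + sig[1:])
dense = np.empty(2 * n - 1, dtype=complex)
dense[0::2], dense[1::2] = sig, mid
dense_spacing = spacing / 2

# First target angle from the mean adjacent phase difference
dphi = np.angle(np.sum(dense[1:] * np.conj(dense[:-1])))
angle_deg = np.rad2deg(np.arcsin(dphi * wavelength / (2 * np.pi * dense_spacing)))
print(round(angle_deg, 2))   # → 12.0
```

Combined with the target range from the reflected probe signals, this angle fixes the target position in polar coordinates relative to the array.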
-
Publication No.: US20230400565A1
Publication Date: 2023-12-14
Application No.: US17838139
Filing Date: 2022-06-10
Inventors: Ruben CABALLERO, Jouya JADIDIAN
Abstract: Techniques disclosed herein may be utilized to detect, measure, and/or track the location of objects via radar sensor devices that are affixed to a wearable device. Each of the radar sensors (e.g., MIMIC radar sensor) generates, captures, and evaluates radar signals associated with the wearable device (e.g., HMD) and the surrounding environment. Objects located within the field of view with sufficient reflectivity will result in radar return signals, each with a characteristic time of arrival (TOA), angle of arrival (AOA), and frequency shift (Doppler shift). The sensed return signals can be processed to determine distance and direction, as well as identification of the objects based on radar characteristics of the object (e.g., radar back-scatter or cross-section pattern). Object information, including position and identification, may be further resolved based on correlation with measurements from one or more of the digital cameras or inertial measurement units.
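The TOA and Doppler processing described above reduces to simple arithmetic per return; the carrier frequency and measured values below are illustrative, not from the patent:

```python
# Resolve one radar return: distance from time of arrival (TOA) and radial
# speed from the Doppler frequency shift.
c = 3e8              # speed of light, m/s
toa = 20e-9          # round-trip time of arrival, seconds (illustrative)
f_carrier = 77e9     # carrier frequency, Hz (illustrative)
doppler = 513.0      # measured frequency shift of the return, Hz

distance = c * toa / 2                         # one-way distance: 3.0 m
radial_speed = doppler * c / (2 * f_carrier)   # ≈ 1.0 m/s toward the sensor
print(distance, round(radial_speed, 3))
```

Angle of arrival would come from phase differences across the sensor's receive elements, and the resulting position can then be correlated with camera or IMU measurements as the abstract notes.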
-
Publication No.: US20230400563A1
Publication Date: 2023-12-14
Application No.: US18231202
Filing Date: 2023-08-07
Applicant: Transrobotics, Inc.
Inventor: Sayf Alalusi
IPC Classes: G01S13/32, G01S13/10, G01S17/36, G01S7/03, G01S7/35, G01S13/88, G01S17/32, G01S13/86, G01S13/87, G01S13/08, G01S13/89, G01S17/87, G01S17/89, G01S7/41, G01S13/36
CPC Classes: G01S13/325, G01S13/103, G01S17/36, G01S7/03, G01S7/352, G01S13/88, G01S17/32, G01S13/867, G01S13/878, G01S13/08, G01S13/89, G01S17/87, G01S17/89, G01S7/41, G01S13/36, G01S7/497
Abstract: A method (e.g., a method for measuring a separation distance to a target object) includes transmitting an electromagnetic first transmitted signal from a transmitting antenna toward a target object that is separated from the transmitting antenna by a separation distance. The first transmitted signal includes a first transmit pattern representative of a first sequence of digital bits. The method also includes receiving a first echo of the first transmitted signal that is reflected off the target object, converting the first echo into a first digitized echo signal, and comparing a first receive pattern representative of a second sequence of digital bits to the first digitized echo signal to determine a time of flight of the first transmitted signal and the echo.
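The pattern-comparison step amounts to sliding the receive bit pattern over the digitized echo and taking the best-matching lag. This sketch uses a Barker-13 code and a 1 GS/s sample rate as assumptions; the patent does not specify either:

```python
import numpy as np

# Receive pattern representing a sequence of digital bits (Barker-13, assumed)
pattern = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)
delay = 40                                   # true round-trip delay, in samples

# Digitized echo: the transmit pattern reflected off the target, plus noise
echo = np.zeros(128)
echo[delay:delay + pattern.size] = pattern
echo += 0.2 * np.random.default_rng(0).standard_normal(echo.size)

# Compare the receive pattern with the digitized echo at every lag;
# the correlation peak marks the time of flight in samples.
corr = np.correlate(echo, pattern, mode="valid")
tof_samples = int(np.argmax(corr))

sample_rate = 1e9                            # 1 GS/s, illustrative
distance_m = 3e8 * (tof_samples / sample_rate) / 2
print(tof_samples, distance_m)   # → 40 6.000...
```

The coded pattern is what makes the peak sharp: Barker codes keep correlation sidelobes at or below 1, so the true delay stands out even in noise.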
-
Publication No.: US11841420B2
Publication Date: 2023-12-12
Application No.: US17527596
Filing Date: 2021-11-16
Applicant: Oculii Corp.
Inventors: Lang Hong, Steven Hong
CPC Classes: G01S13/89, G01S7/003, G01S7/412, G01S13/42, G01S13/584, G01S13/86, G01S13/865, G01S13/867, G01S13/931
Abstract: A method for radar-based localization and/or mapping, preferably including receiving sensor data, determining egospeed, and/or determining egorotation. The method can optionally include performing simultaneous localization and mapping. A system for radar-based localization and/or mapping, preferably including one or more radar sensors, and optionally including one or more vehicles and/or auxiliary sensors (e.g., coupled to the radar sensors).
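One common way to determine egospeed from a single radar frame, sketched here as an assumption rather than the patent's actual method: static returns satisfy doppler_speed = -v_ego * cos(azimuth), so a least-squares fit over the frame's detections recovers the vehicle speed:

```python
import numpy as np

v_true = 8.0                                    # ground-truth ego-speed, m/s
# Azimuths of several static detections in one radar frame (illustrative)
az = np.deg2rad(np.array([-30.0, -10.0, 0.0, 15.0, 40.0]))

# Radial (Doppler) speeds measured for those static returns
doppler_speed = -v_true * np.cos(az)

# Least-squares fit of doppler_speed = -v_ego * cos(az) over all detections
coeff, *_ = np.linalg.lstsq(np.cos(az)[:, None], doppler_speed, rcond=None)
v_ego = -coeff[0]
print(round(float(v_ego), 3))   # → 8.0
```

In practice a robust fit (e.g., RANSAC) would first reject moving targets, and the recovered egospeed and egorotation feed the localization/mapping step.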
-
Publication No.: US20230394692A1
Publication Date: 2023-12-07
Application No.: US17955884
Filing Date: 2022-09-29
Inventors: YU-HSUAN CHIEN, CHIN-PIN KUO
CPC Classes: G06T7/521, G01S13/89, G01S13/08, G01S13/867, G06T2207/10028, G06T2207/30261, G06T2207/20084, G06T2207/20081
Abstract: A method for estimating depth implemented in an electronic device includes obtaining a first image; inputting the first image into a depth estimation model, and obtaining a first depth image; obtaining a depth ratio factor, the depth ratio factor indicating a relationship between a relative depth and a depth of each pixel in the first depth image; and obtaining depth information of the first depth image according to the first depth image and the depth ratio factor.
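The depth-ratio step can be sketched as a single scalar scaling of the model's relative depths into metric depths. How the factor is obtained is not stated in the abstract; deriving it from one known reference depth is an assumption here, and all values are illustrative:

```python
import numpy as np

# First depth image: relative (scale-free) depths from the estimation model
relative_depth = np.array([[0.2, 0.4],
                           [0.5, 1.0]])

known_metric_depth = 4.0   # metric depth of one reference pixel (assumed source)
reference_relative = 0.5   # the model's relative depth at that same pixel

# Depth ratio factor: relationship between relative and metric depth
depth_ratio_factor = known_metric_depth / reference_relative   # 8.0

# Depth information of the first depth image
metric_depth = relative_depth * depth_ratio_factor
print(metric_depth.tolist())   # → [[1.6, 3.2], [4.0, 8.0]]
```

One factor suffices only if the model's relative depths are consistent up to a global scale, which is the usual assumption for monocular depth networks.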
-
Publication No.: US20230394691A1
Publication Date: 2023-12-07
Application No.: US17834850
Filing Date: 2022-06-07
CPC Classes: G06T7/521, G06V10/803, G01S13/867, G01S13/89, G06T2207/10028, G06T2207/20068, G06T2207/20081, G06T2207/30248
Abstract: Systems and methods are provided for depth estimation from monocular images using a depth model with sparse range sensor data and uncertainty in the range sensor as inputs thereto. According to some embodiments, the methods and systems comprise receiving an image captured by an image sensor, where the image represents a scene of an environment. The methods and systems also comprise deriving a point cloud representative of the scene of the environment from range sensor data, and deriving range sensor uncertainty from the range sensor data. Then a depth map can be derived for the image based on the point cloud and the range sensor uncertainty as one or more inputs into a depth model.
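The patent feeds the uncertainty into a learned depth model; the classical analogue of that idea, shown here only as an illustrative sketch, is inverse-variance fusion of the model's estimate with a sparse range return at a pixel where one exists (all numbers are made up):

```python
# Monocular model estimate for one pixel, with an assumed variance
model_depth, model_var = 5.0, 1.0
# Sparse range-sensor return at the same pixel, with its derived uncertainty
range_depth, range_var = 4.2, 0.04

# Inverse-variance weighting: the more certain measurement dominates
w_model, w_range = 1.0 / model_var, 1.0 / range_var
fused = (w_model * model_depth + w_range * range_depth) / (w_model + w_range)
print(round(fused, 4))   # → 4.2308
```

The effect mirrors the abstract's intent: where the range sensor is confident its depth wins, and where it is noisy or absent the monocular model fills in.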
-
Publication No.: US11828837B2
Publication Date: 2023-11-28
Application No.: US17416153
Filing Date: 2018-12-21
Inventors: Magnus Åström, Bengt Lindoff
IPC Classes: G01S13/524, G01S13/10, G01S13/86
CPC Classes: G01S13/5242, G01S13/103, G01S13/867
Abstract: A device, and a method therein, for improving a measurement result made by a radar unit are disclosed. The device comprises the radar unit and at least one motion sensor unit. The radar unit transmits at least one radar pulse in a frequency range and receives at least one radar pulse response associated with reflections of the at least one transmitted radar pulse. The radar unit determines at least one measurement based on the transmitted and received radar pulses. The radar unit further receives information on movement of the device from the at least one motion sensor unit during radar pulse transmission and reception, and adjusts the at least one measurement based on the received information on movement of the device from the at least one motion sensor unit.
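One concrete form the adjustment can take, sketched under the assumption that the device moves straight toward the reflector (the carrier frequency and all values are illustrative): subtract the Doppler shift induced by the device's own motion, as reported by the motion sensor, from the measured shift:

```python
c = 3e8                  # speed of light, m/s
f_tx = 60e9              # carrier of the transmitted radar pulse, Hz (assumed)
device_speed = 1.5       # m/s toward the target, from the motion sensor unit

measured_doppler = 700.0                    # Hz, from the radar pulse response
ego_doppler = 2 * device_speed * f_tx / c   # shift caused by the device itself
adjusted = measured_doppler - ego_doppler   # measurement after adjustment
target_speed = adjusted * c / (2 * f_tx)    # target's own radial speed
print(ego_doppler, adjusted, target_speed)  # → 600.0 100.0 0.25
```

Without the motion-sensor correction the radar would attribute the full 700 Hz shift to the target, overestimating its speed several-fold in this example.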
-
Publication No.: US11822731B2
Publication Date: 2023-11-21
Application No.: US17969823
Filing Date: 2022-10-20
Applicant: Google LLC
CPC Classes: G06F3/017, G01S7/415, G01S13/867, G06F3/011, G06F3/0346, G06F3/167, G06T7/20, G06V40/28, G06T2207/30196
Abstract: The technology provides for a system for determining a gesture provided by a user. In this regard, one or more processors of the system may receive image data from one or more visual sensors of the system capturing a motion of the user, and may receive motion data from one or more wearable computing devices worn by the user. The one or more processors may recognize, based on the image data, a portion of the user's body that corresponds to a gesture to perform a command. The one or more processors may also determine one or more correlations between the image data and the received motion data. Based on the recognized portion of the user's body and the one or more correlations between the image data and the received motion data, the one or more processors may detect the gesture.
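The correlation step can be sketched as follows: the limb whose camera-tracked motion best matches the wearable's motion data is taken as the gesturing body part. The signals, the wrist/watch pairing, and the use of Pearson correlation are all illustrative assumptions:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 50)
# Image data: camera-tracked motion of two candidate body parts (synthetic)
wrist_from_camera = np.sin(2 * np.pi * 3 * t)
other_limb = np.cos(2 * np.pi * 1 * t)
# Motion data from the wearable device on the wrist (same motion plus noise)
watch_imu = np.sin(2 * np.pi * 3 * t) + 0.1 * np.cos(7 * t)

def corr(a, b):
    """Pearson correlation between two motion traces."""
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pick the body part whose image-derived motion correlates best with the
# wearable's motion data
best = max([("wrist", corr(wrist_from_camera, watch_imu)),
            ("other", corr(other_limb, watch_imu))],
           key=lambda kv: kv[1])
print(best[0])   # → wrist
```

Once the gesturing body part is confirmed this way, its camera-observed trajectory can be matched against gesture templates to issue the command.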
-
Publication No.: US20230366981A1
Publication Date: 2023-11-16
Application No.: US18246473
Filing Date: 2021-09-27
Applicant: Robert Bosch GmbH
Inventors: Stephan Lenor, Andreas Scharf, Zsolt Vizi, Johann Klaehn, Christian Wolf
IPC Classes: G01S7/40, G01S13/86, G01S13/931, G01S7/497, G01S15/86, G01S15/931, G01S17/931, G01S7/52
CPC Classes: G01S7/4004, G01S13/86, G01S13/931, G01S7/497, G01S15/86, G01S15/931, G01S17/931, G01S7/52004, G01S2007/4975, G01S2007/52009
Abstract: A method for determining a sensor degradation status of a first sensor system includes: providing data of the first sensor system to represent the environment; providing data of a second sensor system to represent the environment; determining an individual blindness indicator for the first sensor system on the basis of sensor data exclusively of the first sensor system; determining at least one first environment-related determination variable based on the provided data of the first sensor system; determining at least one second environment-related determination variable based on the provided data of the second sensor system; determining a fusion blindness indicator based on a comparison of the at least one first environment-related determination variable with the at least one second environment-related determination variable; and determining the sensor degradation status of the first sensor system based on the individual blindness indicator and the fusion blindness indicator.
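The two-indicator structure can be sketched as below. The choice of detection counts as the environment-related determination variable, the noise-floor test, and all thresholds are illustrative assumptions, not the patent's actual criteria:

```python
def blindness_status(own_detections, other_detections, noise_floor_db):
    """Degradation status of the first sensor system from two indicators."""
    # Individual blindness indicator: uses data of the first sensor system only
    # (assumed heuristic: a very low noise floor plus zero detections suggests
    # a blocked or blind sensor)
    individual = 1.0 if noise_floor_db < -90 and own_detections == 0 else 0.0

    # Fusion blindness indicator: compare environment-related determination
    # variables (here, detection counts) between the two sensor systems
    disagreement = abs(own_detections - other_detections) / max(other_detections, 1)
    fusion = min(disagreement, 1.0)

    # Sensor degradation status from both indicators
    return "degraded" if max(individual, fusion) > 0.5 else "ok"

print(blindness_status(own_detections=0, other_detections=8, noise_floor_db=-95))  # → degraded
print(blindness_status(own_detections=7, other_detections=8, noise_floor_db=-80))  # → ok
```

Keeping the individual indicator separate from the fusion indicator lets the method flag blindness even when the second sensor system is itself unavailable or degraded.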
-
Publication No.: US11815623B2
Publication Date: 2023-11-14
Application No.: US17462936
Filing Date: 2021-08-31
Inventors: Huazeng Deng, Ajaya H S Rao, Ashwath Aithal, Xu Chen, Ruoyu Tan, Veera Ganesh Yalla
IPC Classes: G06T7/00, G01S7/40, G01S13/86, G01S7/41, G01S13/42, G06T7/80, G06T7/70, G06T7/20, G06T7/62, G06V20/56, G06F18/25, G06F18/24, G06V10/764, G05D1/02, G05D1/00
CPC Classes: G01S7/40, G01S7/41, G01S7/417, G01S13/42, G01S13/865, G01S13/867, G06F18/24, G06F18/253, G06T7/20, G06T7/62, G06T7/70, G06T7/80, G06V10/764, G06V20/56, G05D1/0088, G05D1/0231, G05D1/0251, G05D1/0257, G05D2201/0213, G06T2207/10028, G06T2207/10044, G06T2207/20084, G06T2207/30252
Abstract: Embodiments of the present disclosure are directed to a method for object detection. The method includes receiving sensor data indicative of one or more objects for each of a camera subsystem, a LiDAR subsystem, and an imaging RADAR subsystem. The sensor data is received simultaneously and within one frame for each of the subsystems. The method also includes extracting one or more feature representations of the objects from camera image data, LiDAR point cloud data, and imaging RADAR point cloud data, and generating image feature maps, LiDAR feature maps, and imaging RADAR feature maps. The method further includes combining the image feature maps, the LiDAR feature maps, and the imaging RADAR feature maps to generate merged feature maps, and generating object classification, object position, object dimensions, object heading, and object velocity from the merged feature maps.
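The combining step, sketched under the assumption that all three feature maps share one spatial grid and are merged by channel concatenation (the abstract does not specify the fusion operator, and the shapes are illustrative):

```python
import numpy as np

H, W = 32, 32  # shared spatial grid of one frame (channels-first layout)
rng = np.random.default_rng(0)
image_features = rng.random((16, H, W))   # from camera image data
lidar_features = rng.random((8, H, W))    # from LiDAR point cloud data
radar_features = rng.random((4, H, W))    # from imaging RADAR point cloud data

# Merged feature maps: concatenate along the channel axis
merged = np.concatenate([image_features, lidar_features, radar_features], axis=0)
print(merged.shape)   # → (28, 32, 32)

# Detection heads (classification, position, dimensions, heading, velocity)
# would then consume `merged`; they are out of scope for this sketch.
```

Concatenation keeps each modality's features intact and leaves it to the downstream heads to learn how to weight camera, LiDAR, and RADAR evidence per cell.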
-