Online Authoring of Robot Autonomy Applications

    Publication Number: US20210318687A1

    Publication Date: 2021-10-14

    Application Number: US16884954

    Application Date: 2020-05-27

    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
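
    A minimal sketch of interleaving navigation and recorded actions in a behavior tree is shown below. The TargetLocation record, the SequenceNode, and the navigate_to callback are illustrative assumptions, not structures named by the patent:

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class TargetLocation:
    """A location recorded while the robot traversed the environment (illustrative)."""
    name: str
    map_pose: tuple                 # (x, y, heading) in the environmental map frame
    action: Callable[[], bool]      # recorded action to perform at this location

@dataclass
class SequenceNode:
    """Behavior-tree sequence: ticks children in order, succeeds only if all succeed."""
    children: List[Callable[[], bool]] = field(default_factory=list)

    def tick(self) -> bool:
        return all(child() for child in self.children)

def build_mission_tree(targets: List[TargetLocation],
                       navigate_to: Callable[[tuple], bool]) -> SequenceNode:
    """Interleave 'navigate to target' and 'perform recorded action' leaves."""
    tree = SequenceNode()
    for target in targets:
        tree.children.append(lambda pose=target.map_pose: navigate_to(pose))
        tree.children.append(target.action)
    return tree
```

    Ticking the resulting tree during a future mission drives the robot to each recorded target location in order and performs the recorded action once the robot's localized position reaches that location.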

    Navigating a Mobile Robot
    Invention Application

    Publication Number: US20210041878A1

    Publication Date: 2021-02-11

    Application Number: US16661062

    Application Date: 2019-10-23

    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command, when received by the robot, causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
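
    A rough sketch of the pixel-to-waypoint computation follows, assuming a pinhole camera model with known intrinsics and a flat-plane terrain estimate; both simplifications, and all names, are assumptions for illustration rather than the patent's method:

```python
import numpy as np

def pixel_to_pointing_vector(pixel, intrinsics, camera_to_world_rotation):
    """Back-project a selected pixel into a unit pointing vector in the world frame."""
    u, v = pixel
    ray_camera = np.linalg.inv(intrinsics) @ np.array([u, v, 1.0])
    ray_world = camera_to_world_rotation @ ray_camera
    return ray_world / np.linalg.norm(ray_world)

def intersect_with_terrain(camera_origin, pointing_vector, terrain_height=0.0):
    """Intersect the pointing ray with a flat terrain estimate z = terrain_height."""
    if abs(pointing_vector[2]) < 1e-9:
        return None                       # ray parallel to the terrain plane
    t = (terrain_height - camera_origin[2]) / pointing_vector[2]
    if t < 0:
        return None                       # intersection lies behind the camera
    return camera_origin + t * pointing_vector   # candidate waypoint (x, y, z)
```

    A real terrain estimate would typically be a height map or mesh rather than a single plane; the ray-plane intersection here only illustrates how the pointing vector and terrain estimate together yield the target location for the waypoint command.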

    ONLINE AUTHORING OF ROBOT AUTONOMY APPLICATIONS

    Publication Number: US20230418302A1

    Publication Date: 2023-12-28

    Application Number: US18466535

    Application Date: 2023-09-13

    CPC classification number: G05D1/0221

    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.

    Semantic Models for Robot Autonomy on Dynamic Sites

    Publication Number: US20220244741A1

    Publication Date: 2022-08-04

    Application Number: US17648942

    Application Date: 2022-01-26

    Abstract: A method includes receiving, while a robot traverses a building environment, sensor data captured by one or more sensors of the robot. The method includes receiving a building information model (BIM) for the environment that includes semantic information identifying one or more permanent objects within the environment. The method includes generating a plurality of localization candidates for a localization map of the environment. Each localization candidate corresponds to a feature of the environment identified by the sensor data and represents a potential localization reference point. The localization map is configured to localize the robot within the environment when the robot moves throughout the environment. For each localization candidate, the method includes determining whether the respective feature corresponding to the respective localization candidate is a permanent object in the environment and generating the respective localization candidate as a localization reference point in the localization map for the robot.
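
    The candidate-filtering step can be pictured with the short sketch below, which assumes the BIM's semantic information reduces to a set of permanent-object labels and that each candidate carries a semantic label inferred from the sensor data; the data structures are illustrative, not the patent's:

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class LocalizationCandidate:
    """A feature detected from sensor data that could anchor localization (illustrative)."""
    feature_id: str
    position: tuple        # (x, y, z) of the feature in the map frame
    semantic_label: str    # label inferred from sensor data, e.g. "column" or "pallet"

def select_reference_points(candidates: List[LocalizationCandidate],
                            permanent_labels: Set[str]) -> List[LocalizationCandidate]:
    """Keep only candidates matching a permanent object class from the BIM,
    discarding movable clutter that would corrupt the localization map."""
    return [c for c in candidates if c.semantic_label in permanent_labels]

# Example: permanent classes drawn from the BIM's semantic information.
permanent_labels = {"column", "wall", "stairwell"}
```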

    Online authoring of robot autonomy applications

    Publication Number: US11797016B2

    Publication Date: 2023-10-24

    Application Number: US16884954

    Application Date: 2020-05-27

    CPC classification number: G05D1/0221

    Abstract: A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.
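
    The trigger condition "when the current position of the robot within the environmental map reaches the corresponding target location" can be pictured as a simple pose-tolerance check; the tolerances below are illustrative assumptions:

```python
import math

def reached(current_pose, target_pose, position_tol=0.25, heading_tol=0.35):
    """Return True when the localized pose is within tolerance of a recorded target
    location, i.e. the point at which that location's recorded action is performed.
    Poses are (x, y, heading); tolerances are in meters and radians."""
    dx = target_pose[0] - current_pose[0]
    dy = target_pose[1] - current_pose[1]
    dheading = abs(math.atan2(math.sin(target_pose[2] - current_pose[2]),
                              math.cos(target_pose[2] - current_pose[2])))
    return math.hypot(dx, dy) <= position_tol and dheading <= heading_tol
```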

    Navigating a Mobile Robot
    Invention Application

    Publication Number: US20220260998A1

    Publication Date: 2022-08-18

    Application Number: US17661685

    Application Date: 2022-05-02

    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command, when received by the robot, causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.

    Navigating a mobile robot
    Invention Grant

    Publication Number: US11340620B2

    Publication Date: 2022-05-24

    Application Number: US16661062

    Application Date: 2019-10-23

    Abstract: A method for controlling a robot includes receiving image data from at least one image sensor. The image data corresponds to an environment about the robot. The method also includes executing a graphical user interface configured to display a scene of the environment based on the image data and receive an input indication indicating selection of a pixel location within the scene. The method also includes determining a pointing vector based on the selection of the pixel location. The pointing vector represents a direction of travel for navigating the robot in the environment. The method also includes transmitting a waypoint command to the robot. The waypoint command, when received by the robot, causes the robot to navigate to a target location. The target location is based on an intersection between the pointing vector and a terrain estimate of the robot.
