Abstract:
A surgical trocar, and an image acquisition method using the same, are provided. The trocar includes a body having a passage configured to receive at least one surgical instrument, and at least one camera movably coupled to an outer wall of the body.
Abstract:
Embodiments of the present disclosure relate to a robot cleaner and a control method of the robot cleaner, and more particularly, to a robot cleaner configured to correct its position information by acquiring the position of a docking station while the robot cleaner drives, and to correct a map by using the corrected position information, and to a control method of the robot cleaner.
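As an illustration of the kind of correction this abstract describes, and not the claimed method itself, the minimal Python sketch below assumes the docking station's true map position is known and uses the difference between that position and the station's currently observed position as a drift estimate applied to both the robot pose and the map; all function and variable names are hypothetical.

```python
# A minimal sketch (not the patented method) of correcting a drifted pose and
# map using a docking station whose true map position is assumed known.
import numpy as np

DOCK_TRUE = np.array([0.0, 0.0])          # docking station's known map position

def correct_pose_and_map(est_pose, dock_observed, map_points):
    """est_pose: (x, y) from odometry; dock_observed: docking-station position
    as currently measured by the robot; map_points: Nx2 array of map points."""
    drift = dock_observed - DOCK_TRUE      # accumulated translational error
    corrected_pose = est_pose - drift      # shift the robot pose back
    corrected_map = map_points - drift     # apply the same correction to the map
    return corrected_pose, corrected_map

pose, grid = correct_pose_and_map(np.array([5.2, 3.1]),
                                  np.array([0.4, -0.2]),
                                  np.random.rand(100, 2) * 10)
```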
Abstract:
Provided are a moving robot that moves to a position indicated by a remote device, and a method for controlling the moving robot. The moving robot according to an embodiment includes a traveling unit that moves a main body, a light reception unit that receives light, and a control unit that determines a traveling direction of the moving robot by filtering the light received from the light reception unit in accordance with a probability-based filtering method, and controls the traveling unit so that the main body travels in the traveling direction.
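The abstract only states that the received light is filtered "in accordance with a probability-based filtering method". The sketch below assumes one concrete possibility, a discrete Bayes (histogram) filter over candidate bearings; the Gaussian likelihood and all names are illustrative choices rather than the disclosed filter.

```python
# A minimal sketch, assuming a discrete Bayes (histogram) filter over candidate
# travel directions updated from noisy light-reception bearings.
import numpy as np

N_BINS = 36                                # candidate directions, 10 deg apart
belief = np.full(N_BINS, 1.0 / N_BINS)     # uniform prior over directions

def update_belief(belief, measured_bearing_deg, sigma_deg=15.0):
    """Weight each candidate direction by a Gaussian likelihood of the bearing
    at which remote-device light was received, then normalise."""
    centers = np.arange(N_BINS) * (360.0 / N_BINS)
    diff = (centers - measured_bearing_deg + 180.0) % 360.0 - 180.0
    likelihood = np.exp(-0.5 * (diff / sigma_deg) ** 2)
    posterior = belief * likelihood
    return posterior / posterior.sum()

for bearing in [42.0, 47.0, 38.0]:         # noisy light-reception bearings
    belief = update_belief(belief, bearing)

travel_direction = np.argmax(belief) * (360.0 / N_BINS)   # ~40 deg here
```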
Abstract:
A cleaning robot includes a data acquisition unit that acquires actual sensor data by measuring a distance from a current position to an object to be measured; a local map acquisition unit that acquires a local map by scanning the vicinity of the current position based on an environmental map stored in advance; and a processor that determines coordinates of the current position for the local map by performing matching between the local map and the actual sensor data, and determines a traveling direction based on the current position by calculating a main segment angle of a line segment existing in the local map.
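As a hedged illustration of the "main segment angle" idea, the sketch below computes a length-weighted dominant orientation of line segments already extracted from the local map; doubling the angles makes directions 180 degrees apart reinforce each other. The segment-extraction and scan-matching steps of the abstract are not reproduced, and the function name is hypothetical.

```python
# A minimal sketch: length-weighted dominant orientation of local-map segments.
import math

def main_segment_angle(segments):
    """segments: list of ((x1, y1), (x2, y2)) line segments from the local map.
    Returns the dominant orientation in degrees, folded into [0, 180)."""
    sin_sum = cos_sum = 0.0
    for (x1, y1), (x2, y2) in segments:
        length = math.hypot(x2 - x1, y2 - y1)
        angle = math.atan2(y2 - y1, x2 - x1)
        # Double the angle so opposite directions reinforce each other.
        sin_sum += length * math.sin(2 * angle)
        cos_sum += length * math.cos(2 * angle)
    return math.degrees(0.5 * math.atan2(sin_sum, cos_sum)) % 180.0

walls = [((0, 0), (4, 0)), ((0, 1), (5, 1)), ((0, 0), (0, 3))]
print(main_segment_angle(walls))   # dominant wall direction, ~0 deg here
```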
Abstract:
An optical scanning probe and an apparatus to generate three-dimensional (3D) data using the same are provided. The apparatus to generate 3D data includes an optical scanning probe that scans light generated from a light emitter over an object to be measured; a distance calculation processor that calculates a distance between the optical scanning probe and the object to be measured, based on the light scanned over the object to be measured and light reflected from the object to be measured; and a depth image generation processor that generates 3D data based on a scanning direction of the optical scanning probe and the distance between the optical scanning probe and the object to be measured.
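A minimal sketch of how a scanning direction and a measured distance could be combined into 3D data follows, assuming a spherical-to-Cartesian conversion and a time-of-flight distance d = c·t/2. The abstract does not specify the actual distance-calculation scheme, so both helpers here are assumed stand-ins.

```python
# A minimal sketch: one scanned sample -> one 3D point, with an assumed
# time-of-flight distance model for the reflected light.
import math

def direction_and_range_to_point(azimuth_rad, elevation_rad, distance_m):
    """Spherical-to-Cartesian conversion for one scanned sample."""
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

def tof_distance(round_trip_time_s, c=299_792_458.0):
    """Assumed time-of-flight model: d = c * t / 2 (round trip)."""
    return c * round_trip_time_s / 2.0

point = direction_and_range_to_point(math.radians(10), math.radians(5),
                                     tof_distance(8e-9))   # ~1.2 m away
```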
Abstract:
In some example embodiments, a position recognition method of an autonomous mobile robot may include: dividing a grid map into a plurality of spaces; extracting, by a processor, learning data of each of the plurality of spaces; generating, by the processor, space models of each of the plurality of spaces using the extracted learning data; and/or recognizing, by the processor, a current position of the autonomous mobile robot based on the extracted learning data, the space models, and actual range scan data input through the autonomous mobile robot.
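The sketch below is only an assumed reading of this pipeline: each "space model" is reduced to a mean feature vector computed from that space's learning scans, and recognition picks the model closest to the features of the actual range scan. Feature choice, distance metric, and all names are assumptions, not the disclosed method.

```python
# A minimal sketch: per-space models as mean scan features, nearest-model
# recognition of the current space from an actual range scan.
import numpy as np

def scan_features(scan):
    """Crude features of a range scan: mean, std, min, max range."""
    s = np.asarray(scan, dtype=float)
    return np.array([s.mean(), s.std(), s.min(), s.max()])

def build_space_models(learning_data):
    """learning_data: {space_id: [scan, ...]} -> {space_id: mean feature vector}."""
    return {sid: np.mean([scan_features(s) for s in scans], axis=0)
            for sid, scans in learning_data.items()}

def recognize_space(models, actual_scan):
    feats = scan_features(actual_scan)
    return min(models, key=lambda sid: np.linalg.norm(models[sid] - feats))

models = build_space_models({"room_a": [np.random.rand(360) * 3],
                             "corridor": [np.random.rand(360) * 8]})
print(recognize_space(models, np.random.rand(360) * 3))   # likely "room_a"
```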
Abstract:
Data matching includes receiving a piece of first relational data and a piece of second relational data. The piece of first relational data is associated with a plurality of pieces of first data, and the piece of second relational data is associated with a plurality of pieces of second data. An approximate value of the piece of second relational data is calculated. A similarity is calculated based on the piece of first relational data and the approximate value of the piece of second relational data. A correspondence between a piece of the first data and a piece of the second data is determined based on the calculated similarity. An alignment parameter is calculated based on the determined correspondence, and a first data group including the piece of the first data is matched with a second data group including the piece of the second data based on the alignment parameter.
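To make the final step concrete, the sketch below shows one standard way an "alignment parameter" could be computed from already-determined 2-D point correspondences: a rigid rotation and translation estimated with the Kabsch/SVD method. The similarity and correspondence computations described in the abstract are not reproduced, and nothing here should be read as the disclosed algorithm.

```python
# A minimal sketch: rigid alignment (rotation + translation) from point
# correspondences via the Kabsch/SVD method.
import numpy as np

def rigid_alignment(src, dst):
    """src, dst: Nx2 arrays of corresponding points. Returns (R, t) such that
    dst ≈ src @ R.T + t in a least-squares sense."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(0) - src.mean(0) @ R.T
    return R, t

src = np.random.rand(20, 2)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([1.0, 2.0])
R_est, t_est = rigid_alignment(src, dst)   # recovers the 30 deg rotation
```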