Abstract:
Disclosed is an image processing apparatus including: a storage configured to store model information regarding a boundary of a person area in an image; and a controller configured to determine a boundary of a target area in an area to be processed, and to control the target area to undergo image processing for the person area if it is determined, based on the model information stored in the storage, that the determined boundary of the target area matches the boundary of the person area.
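As an illustrative sketch only (not part of the patent), the boundary-matching step described above could be approximated with contour comparison in OpenCV: the stored model is represented as a contour template, the target area's boundary is extracted from a mask, and a shape-dissimilarity score decides whether person-area processing is applied. The threshold, the use of cv2.matchShapes, and the bilateral-filter "person processing" are all assumptions for illustration.

```python
import cv2
import numpy as np

def process_if_person(image, target_mask, model_contour, match_threshold=0.15):
    """Apply person-area processing to the target area if its boundary
    resembles the stored person-boundary model (a contour template).
    Threshold and processing choice are illustrative assumptions."""
    # Extract the boundary (contour) of the target area from its binary mask.
    contours, _ = cv2.findContours(target_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return image
    target_contour = max(contours, key=cv2.contourArea)

    # Compare the target boundary with the stored model boundary.
    # cv2.matchShapes returns a dissimilarity score (lower = more similar).
    score = cv2.matchShapes(target_contour, model_contour,
                            cv2.CONTOURS_MATCH_I1, 0.0)

    if score < match_threshold:
        # Placeholder person-area processing: mild smoothing applied
        # only inside the target mask.
        smoothed = cv2.bilateralFilter(image, d=9, sigmaColor=75, sigmaSpace=75)
        mask3 = cv2.merge([target_mask] * 3) > 0
        image = np.where(mask3, smoothed, image)
    return image
```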
Abstract:
An imaging apparatus and a control method thereof are provided. The method for controlling the imaging apparatus includes acquiring images having the same focal length by performing continuous imaging within a predetermined time when a user's imaging command is input, calculating motion vectors using the images and separating a foreground and a background of a first image among the images based on the calculated motion vectors and color information of the first image, and performing out-focusing based on the separated foreground and the separated background.
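A minimal sketch of the idea (not the patent's implementation): dense optical flow between two frames of the burst stands in for the motion-vector calculation, a GrabCut segmentation seeded by motion magnitude combines the motion and color cues, and the background of the first frame is blurred to produce the out-focus effect. The motion threshold, blur kernel, and choice of Farneback flow and GrabCut are assumptions.

```python
import cv2
import numpy as np

def out_focus(frames, motion_thresh=1.0):
    """Rough foreground/background separation on the first frame of a burst,
    followed by background blurring (out-focusing).
    frames: list of same-size BGR images captured in quick succession."""
    first = frames[0]
    gray0 = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
    gray1 = cv2.cvtColor(frames[1], cv2.COLOR_BGR2GRAY)

    # Dense motion vectors between the first two frames (a stand-in for the
    # burst-wide motion estimation described in the abstract).
    flow = cv2.calcOpticalFlowFarneback(gray0, gray1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)

    # Seed a GrabCut segmentation: strong motion -> probable foreground,
    # little motion -> probable background. GrabCut refines the split
    # using the color statistics of the first image.
    mask = np.where(magnitude > motion_thresh,
                    cv2.GC_PR_FGD, cv2.GC_PR_BGD).astype(np.uint8)
    bgd = np.zeros((1, 65), np.float64)
    fgd = np.zeros((1, 65), np.float64)
    cv2.grabCut(first, mask, None, bgd, fgd, 3, cv2.GC_INIT_WITH_MASK)
    fg_mask = np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD)).astype(np.uint8)

    # Out-focusing: keep the foreground sharp, blur the background.
    blurred = cv2.GaussianBlur(first, (31, 31), 0)
    fg3 = cv2.merge([fg_mask] * 3).astype(bool)
    return np.where(fg3, first, blurred)
```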
Abstract:
A method and apparatus for generating a dewarped document using a document image captured using a camera are provided. The method includes obtaining the document image captured using the camera, extracting text lines from the document image captured using the camera, determining a projection formula to convert positions of respective points constituting the extracted text lines to coordinates projected on a plane of the dewarped document, determining a target function used to calculate a difference between text lines projected on the plane of the dewarped document using the projection formula and real text lines, calculating parameters that minimize the target function, and converting the document image to the dewarped document by substituting the calculated parameters into the projection formula.
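The following is a simplified sketch of the optimization step, under assumptions not stated in the abstract: the projection formula is modeled as an 8-parameter homography, and the target function is replaced by a proxy that penalizes the vertical spread of each projected text line (a correctly projected horizontal line should have nearly constant y). The use of scipy.optimize.minimize with Nelder-Mead is likewise an assumption.

```python
import numpy as np
from scipy.optimize import minimize

def dewarp_parameters(text_lines):
    """Fit projection parameters so that projected text lines become straight.
    text_lines: list of (N_i, 2) arrays of (x, y) points sampled along each
    detected text line."""

    def project(points, p):
        h = np.append(p, 1.0).reshape(3, 3)          # homography with h33 = 1
        pts = np.hstack([points, np.ones((len(points), 1))])
        proj = pts @ h.T
        return proj[:, :2] / proj[:, 2:3]

    def target(p):
        # Proxy target function: variance of the projected y-coordinates
        # measures how far each projected line is from a straight line.
        cost = 0.0
        for line in text_lines:
            y = project(line, p)[:, 1]
            cost += np.var(y)
        return cost

    p0 = np.array([1, 0, 0, 0, 1, 0, 0, 0], dtype=float)  # identity homography
    res = minimize(target, p0, method="Nelder-Mead")
    return res.x
```

The returned parameters would then be substituted back into the projection formula to warp every pixel of the document image onto the dewarped plane.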
Abstract:
A method of recognizing and generating a structure of a table included in an image is provided. The method includes extracting lines forming the table from among connected components forming the image, determining line intersections by using crossing functions matched with the lines, determining, for each of the line intersections, one of a plurality of crossing models identified based on a plurality of crossing shapes, and generating data about the table, which includes at least one cell determined using the determined crossing models.
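A minimal sketch of the crossing-model matching, with a hypothetical set of models: each intersection of a binary line mask is classified by checking which of its four arms (up, right, down, left) contain line pixels, and the resulting arm pattern is mapped to a crossing shape (cross, tee, or corner). The arm length, the model table, and the function names are assumptions for illustration.

```python
import numpy as np

# Hypothetical crossing models: which arms (up, right, down, left) are present.
CROSSING_MODELS = {
    (1, 1, 1, 1): "cross",        # all four arms
    (0, 1, 1, 1): "tee_down",     # top edge of the table
    (1, 1, 0, 1): "tee_up",       # bottom edge
    (1, 1, 1, 0): "tee_right",    # left edge
    (1, 0, 1, 1): "tee_left",     # right edge
    (0, 1, 1, 0): "corner_tl",
    (0, 0, 1, 1): "corner_tr",
    (1, 1, 0, 0): "corner_bl",
    (1, 0, 0, 1): "corner_br",
}

def classify_intersection(line_mask, y, x, arm=5):
    """Match the intersection at (y, x) of a binary line mask against the
    crossing models by testing which of its four arms contain line pixels."""
    h, w = line_mask.shape
    up    = line_mask[max(0, y - arm):y, x].any()
    down  = line_mask[y + 1:min(h, y + arm + 1), x].any()
    left  = line_mask[y, max(0, x - arm):x].any()
    right = line_mask[y, x + 1:min(w, x + arm + 1)].any()
    key = (int(up), int(right), int(down), int(left))
    return CROSSING_MODELS.get(key, "unknown")
```

Cells could then be formed by pairing intersections whose classified shapes bound a rectangular region, e.g. a top-left corner or cross with the nearest compatible intersections to its right and below.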