Abstract:
Disclosed herein are an apparatus and method for analyzing a building. The apparatus for analyzing a building includes a configuration unit for configuring analysis settings for analysis of a target building; a calculation unit for acquiring information about the target building and information about a standard building according to the analysis settings, calculating a similarity-based distance between them, and generating a result of analysis of the target building using the distance; and an output unit for outputting the result of analysis of the target building.
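The flow described above can be illustrated with a minimal sketch: configure the analysis, compute a similarity-based distance between the target and standard buildings, and report the result. The Euclidean metric, the feature names (floor_area, energy_use, occupancy), and the threshold are illustrative assumptions; the abstract does not specify the metric or the attributes.

```python
# Minimal sketch of the analysis flow: configuration unit, calculation unit
# (similarity-based distance), and output of the analysis result.
# Metric, feature names, and threshold are assumptions for illustration only.
import math

def configure_analysis(settings):
    # Configuration unit: decide which attributes to compare and which standard building to use.
    return {"features": settings.get("features", ["floor_area", "energy_use", "occupancy"]),
            "standard_id": settings.get("standard_id", "standard-01")}

def similarity_distance(target, standard, features):
    # Calculation unit: smaller distance means the target is closer to the standard building.
    return math.sqrt(sum((target[f] - standard[f]) ** 2 for f in features))

def analyze_building(target, standard, settings):
    cfg = configure_analysis(settings)
    d = similarity_distance(target, standard, cfg["features"])
    # Output unit would render this result; here it is simply returned.
    return {"standard": cfg["standard_id"], "distance": d,
            "similar": d < settings.get("threshold", 10.0)}

target = {"floor_area": 1200.0, "energy_use": 85.0, "occupancy": 40}
standard = {"floor_area": 1000.0, "energy_use": 80.0, "occupancy": 35}
print(analyze_building(target, standard, {"threshold": 250.0}))
```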
Abstract:
A method and apparatus for controlling Zigbee wireless lighting are disclosed. The apparatus for controlling Zigbee wireless lighting includes an address management unit, a packet conversion unit, and a client unit. The address management unit detects, in an address management table, the Zigbee wireless address corresponding to a light ID included in a command received from a user terminal. The packet conversion unit converts the command into a Zigbee Light Link (ZLL)-based packet. The client unit transfers the ZLL-based packet to the Zigbee wireless illumination light corresponding to the Zigbee wireless address in order to control that light.
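A minimal sketch of this control path follows: look up the Zigbee address for a light ID, convert the user command into a ZLL-style packet, and hand it to a client for transmission. The table contents, packet fields, and the send_zll() transport are illustrative assumptions, not an actual ZLL stack.

```python
# Sketch of the three units in the abstract: address management unit (table lookup),
# packet conversion unit (command -> ZLL-style packet), client unit (transmission).
ADDRESS_TABLE = {
    "light-01": 0x1A2B,   # light ID -> Zigbee short address (assumed format)
    "light-02": 0x3C4D,
}

def to_zll_packet(command):
    # Assumed mapping of a high-level command to a ZLL-like cluster/command pair.
    if command["action"] == "on":
        return {"cluster": "OnOff", "cmd": 0x01}
    if command["action"] == "off":
        return {"cluster": "OnOff", "cmd": 0x00}
    return {"cluster": "LevelControl", "cmd": 0x04, "level": command["level"]}

def send_zll(address, packet):
    # Placeholder for the radio/client layer that delivers the packet.
    print(f"-> 0x{address:04X}: {packet}")

def handle_command(command):
    address = ADDRESS_TABLE[command["light_id"]]   # address management unit
    packet = to_zll_packet(command)                # packet conversion unit
    send_zll(address, packet)                      # client unit

handle_command({"light_id": "light-01", "action": "on"})
handle_command({"light_id": "light-02", "action": "dim", "level": 128})
```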
Abstract:
An apparatus and method for providing indoor location information are disclosed herein. The apparatus includes a signal reception unit, a global positioning system (GPS) data reception unit, a frame generation unit, a scheduling unit, and a signal transmission unit. The signal reception unit receives a lighting control frame from a lighting control device. The GPS data reception unit receives location data from GPS satellites. The frame generation unit generates a location data frame by extracting the location of an illumination light stored in a lighting arrangement storage unit, modifying the received location data accordingly, and adding a header to the modified location data. The scheduling unit schedules the lighting control frame or the location data frame. The signal transmission unit transfers the lighting control frame or the location data frame to the corresponding illumination light based on the scheduling results of the scheduling unit.
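A minimal sketch of this frame flow is shown below: build a location data frame from a stored light position and received GPS data, then schedule it alongside lighting control frames before transmission. The frame layout, the stored coordinates, and the priority-based scheduling policy are illustrative assumptions.

```python
# Sketch of frame generation, scheduling, and transmission from the abstract.
# Frame fields and the control-before-location scheduling rule are assumptions.
from collections import deque

LIGHT_POSITIONS = {"light-07": (3.5, 12.0)}   # indoor (x, y) per light, assumed units

def make_location_frame(light_id, gps_data):
    x, y = LIGHT_POSITIONS[light_id]            # location from the lighting arrangement storage
    payload = {"lat": gps_data["lat"], "lon": gps_data["lon"],
               "indoor_x": x, "indoor_y": y}    # GPS data modified with the indoor position
    return {"header": {"type": "LOCATION", "dst": light_id}, "payload": payload}

class Scheduler:
    def __init__(self):
        self.queue = deque()

    def enqueue(self, frame):
        # Assumption: lighting control frames take priority over location data frames.
        if frame["header"]["type"] == "CONTROL":
            self.queue.appendleft(frame)
        else:
            self.queue.append(frame)

    def transmit_all(self):
        # Signal transmission unit: send frames in the scheduled order.
        while self.queue:
            frame = self.queue.popleft()
            print("send to", frame["header"]["dst"], frame["header"]["type"])

sched = Scheduler()
sched.enqueue(make_location_frame("light-07", {"lat": 37.40, "lon": 127.10}))
sched.enqueue({"header": {"type": "CONTROL", "dst": "light-07"}, "payload": {"on": True}})
sched.transmit_all()
```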
Abstract:
A lighting control device and method are disclosed herein. The lighting control device includes a control unit and a control request unit. The control unit sets up a virtual multiple lighting control unit corresponding to a lighting control command received from a lighting management device. The control request unit selects a wireless light fixture corresponding to a light fixture ID included in the lighting control command, and makes a lighting control request to the selected wireless light fixture. The control unit transfers a response to the lighting control request to the lighting management device, and thus allows a corresponding lighting switch to control the wireless light fixture.
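The request/response path in this abstract can be sketched as follows: a control unit records a virtual control entry for an incoming command, a control request unit forwards the request to the fixture with the matching ID, and the response is returned toward the lighting management device. The fixture behavior and the command fields are illustrative assumptions.

```python
# Sketch of the control unit / control request unit interaction from the abstract.
# Class names, command fields, and the fixture's response format are assumptions.
class WirelessFixture:
    def __init__(self, fixture_id):
        self.fixture_id = fixture_id
        self.on = False

    def apply(self, request):
        # The selected wireless light fixture executes the lighting control request.
        self.on = request["on"]
        return {"fixture_id": self.fixture_id, "status": "ok", "on": self.on}

class LightingControlDevice:
    def __init__(self, fixtures):
        self.fixtures = {f.fixture_id: f for f in fixtures}
        self.virtual_units = {}

    def handle(self, command):
        # Control unit: set up a virtual control unit entry for the received command.
        self.virtual_units[command["command_id"]] = command
        # Control request unit: select the fixture by ID and make the request.
        fixture = self.fixtures[command["fixture_id"]]
        response = fixture.apply({"on": command["on"]})
        # Control unit: this response would be transferred to the lighting management device.
        return response

device = LightingControlDevice([WirelessFixture("fix-01")])
print(device.handle({"command_id": 1, "fixture_id": "fix-01", "on": True}))
```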
Abstract:
Disclosed herein are an automotive augmented reality HUD apparatus and method, which can adaptively change the display location and direction based on the driver's visual field, thus reducing errors in location matching between virtual object information and real-world information. The automotive augmented reality HUD apparatus includes a viewing angle calculation unit for estimating the driver's line-of-sight direction, using the face direction detected from face images of the driver and the center positions of the pupils, and for calculating a viewing angle. A matching unit matches the location of real-world information in front of the driver's seat with the location of the corresponding virtual object information in front of the driver's seat, based on the line-of-sight direction and the viewing angle. A display unit displays the results of the matching by the matching unit, and the display unit can be rotated and its location changed.
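The matching step can be illustrated with a minimal sketch: estimate the driver's line of sight from the face direction and pupil centers, then project a real-world point onto the HUD plane along that gaze so the virtual object can be drawn at the matching location. The simple gaze-averaging and the pinhole-style projection below are illustrative assumptions, not the patented method.

```python
# Sketch of line-of-sight estimation and real-to-virtual location matching.
# The gaze gain (15 degrees), normalised pupil coordinates, and the planar
# projection model are assumptions made for illustration only.
import math

def line_of_sight(face_dir_deg, left_pupil, right_pupil):
    # Combine head yaw with the horizontal pupil offset (assumed weighting).
    pupil_offset = (left_pupil[0] + right_pupil[0]) / 2.0     # normalised [-1, 1]
    return face_dir_deg + 15.0 * pupil_offset                 # gaze direction in degrees

def project_to_hud(world_xy, gaze_deg, hud_distance=1.0):
    # Rotate the real-world point into the gaze frame, then project it onto a
    # plane hud_distance metres in front of the eyes (pinhole-style assumption).
    x, y = world_xy
    g = math.radians(gaze_deg)
    xr = x * math.cos(g) + y * math.sin(g)
    yr = -x * math.sin(g) + y * math.cos(g)
    return (hud_distance * xr / yr, hud_distance)  # HUD-plane coordinate

gaze = line_of_sight(face_dir_deg=5.0, left_pupil=(0.1, 0.0), right_pupil=(0.2, 0.0))
print("gaze (deg):", gaze)
print("HUD position:", project_to_hud((2.0, 20.0), gaze))
```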