Abstract:
A device can include one or more sensors configured to output sensor data, and a trigger detection module configured to receive the sensor data from the one or more sensors and to determine whether a trigger event has occurred. The trigger detection module can be configured to output a trigger detection signal when the trigger event is detected. The trigger detection signal can be configured to be used by an augmented reality or virtual reality system to cause an augmented reality or virtual reality event.
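The abstract leaves the implementation open; below is a minimal Python sketch, assuming a single scalar sensor reading and a simple threshold test as the trigger event, of how a trigger detection module could turn sensor data into a trigger detection signal that an augmented reality or virtual reality system uses to cause an AR/VR event. The names (SensorSample, TriggerDetectionModule, on_trigger) and the threshold logic are illustrative assumptions, not details taken from the patent.

from dataclasses import dataclass
from typing import Callable, Iterable

@dataclass
class SensorSample:
    # One reading from a sensor; the patent does not fix any data format.
    timestamp: float
    magnitude: float  # e.g. acceleration magnitude in g

class TriggerDetectionModule:
    # Receives sensor data and determines whether a trigger event has occurred.
    # Assumption: a threshold comparison stands in for the device's actual logic.
    def __init__(self, threshold: float, on_trigger: Callable[[float], None]):
        self.threshold = threshold
        self.on_trigger = on_trigger  # callback into the AR/VR system

    def process(self, samples: Iterable[SensorSample]) -> None:
        for sample in samples:
            if sample.magnitude >= self.threshold:
                # The "trigger detection signal": here a callback the AR/VR
                # system uses to cause an AR/VR event.
                self.on_trigger(sample.timestamp)

if __name__ == "__main__":
    module = TriggerDetectionModule(
        threshold=2.5,
        on_trigger=lambda t: print(f"AR/VR event triggered at t={t:.3f}s"),
    )
    module.process([SensorSample(0.01, 1.0), SensorSample(0.02, 3.1)])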
Abstract:
Systems, devices, and methods including a bullet; a retroreflector array adhered to a base of the bullet, the retroreflector array having prism facets with a periodicity between 0.2 mm and 2.0 mm; and a cover disposed over the retroreflector array and hermetically sealed at the base of the bullet; where the cover is disposed over the retroreflector array in a first position prior to firing, and where the cover is released from the base of the bullet in a second position after firing.
Abstract:
The present invention relates to a system comprising: threat evaluation and sensor/weapon assignment (TESWA) algorithm operating units adapted to operate any threat evaluation and sensor/weapon assignment algorithm; a simulation and analysis unit adapted to form, as a virtual scenario, the area in which the TESWA algorithms will be operated, by forming an air picture in accordance with the data it receives; an external communication unit in communication with the simulation and analysis unit, which can communicate correspondingly with a TESWA algorithm operating unit and is adapted to transfer the current scenario information to the TESWA algorithm when necessary and to take back the engagement results and transfer them to the simulation and analysis unit; and a communication unit adapted to transfer the scenario, formed by communicating with a client, to the client and to receive data through the client. The invention further relates to a method comprising the steps of sending data from at least one client to the simulation and analysis unit and arranging it, transferring the virtual scenario to the TESWA algorithm operating units and receiving the engagement data, combining the engagement data with each other and with the data received from the clients and updating the scenario status, approving or disapproving the engagement, and analyzing the engagement data and transferring the results, partially or completely, to a client.
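As a rough illustration of the data flow described above, the following Python sketch models a single simulation step: data received from clients is merged into the air picture, the virtual scenario is transferred to a stand-in TESWA algorithm operating unit, and the engagement results are taken back, approved, and folded into the scenario status. The data types and the naive one-to-one assignment rule are assumptions for illustration only.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Track:
    # One entry in the air picture; fields are illustrative.
    track_id: int
    position: Tuple[float, float, float]
    velocity: Tuple[float, float, float]

@dataclass
class Engagement:
    track_id: int
    weapon_id: int
    approved: bool = False

@dataclass
class Scenario:
    # The virtual scenario / air picture held by the simulation and analysis unit.
    tracks: List[Track] = field(default_factory=list)
    engagements: List[Engagement] = field(default_factory=list)

def run_teswa_algorithm(scenario: Scenario) -> List[Engagement]:
    # Stand-in for an arbitrary TESWA algorithm operating unit:
    # naively assign weapon i to track i.
    return [Engagement(track_id=t.track_id, weapon_id=i)
            for i, t in enumerate(scenario.tracks)]

def simulation_step(scenario: Scenario, client_updates: List[Track]) -> Scenario:
    # 1. Merge the data received from clients into the air picture.
    scenario.tracks.extend(client_updates)
    # 2. Transfer the scenario to the TESWA operating unit and take back the engagements.
    engagements = run_teswa_algorithm(scenario)
    # 3. Approve or disapprove each engagement (here: approve everything).
    for e in engagements:
        e.approved = True
    # 4. Update the scenario status for analysis and reporting to clients.
    scenario.engagements = engagements
    return scenario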
Abstract:
Disclosed is a repair unit (10) for a pod (11), especially a laser designation pod, including: a transportable casing (14); a bubble (18) attached to the casing (14), having a retracted state and a deployed state, the bubble (18) including an opening (46) for inserting part of a pod (11) and at least two projections forming tubes (50) for the hands of an operator, each tube (50) opening into the bubble (18); and a gas supply device (22) for switching the bubble (18) from the retracted state to the deployed state. The casing (14) and the bubble (18) define an inner space sealed from outside pollutants in both the retracted state and the deployed state.
Abstract:
The present invention relates to a sighting device, and specifically to an electronic sighting device with real-time information interaction. The sighting device comprises a field-of-view obtaining unit, a display unit, a sighting circuit unit, a sensor module, a positioning unit, and an interaction unit. The shooting vibration of a gun is detected by a vibration sensor in the sensor module; the interaction unit then connects to the internet or to a remote display terminal and sends the real-time information acquired by the sensor module and/or the image information acquired by the field-of-view obtaining unit during the shooting process. The electronic sighting device thereby collects and records information for each shot and exchanges the collected information in real time with the internet or the remote display terminal.
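A minimal sketch of the described flow, assuming a simple threshold on the vibration reading stands in for judging the shooting vibration, and a plain HTTP POST stands in for the interaction unit's connection to the internet or a remote display terminal; the threshold value, endpoint URL, and record fields are placeholders, not details from the patent.

import json
import time
from urllib import request

SHOT_THRESHOLD_G = 8.0  # assumed vibration level that counts as a shot
REMOTE_ENDPOINT = "http://example.invalid/shots"  # placeholder URL

def read_vibration_g() -> float:
    # Stand-in for the vibration sensor in the sensor module;
    # a real device would read this value from hardware.
    return 0.0

def collect_shot_record() -> dict:
    # Gather the real-time information for one shot: sensor module data,
    # positioning unit data, and optionally image information.
    return {"timestamp": time.time(), "position": None, "image": None}

def send_to_remote(record: dict) -> None:
    # Interaction unit: push the record to the internet / remote display terminal.
    data = json.dumps(record).encode("utf-8")
    req = request.Request(REMOTE_ENDPOINT, data=data,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)

def monitor_loop() -> None:
    while True:
        if read_vibration_g() >= SHOT_THRESHOLD_G:  # a shot has been fired
            send_to_remote(collect_shot_record())
        time.sleep(0.001)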
Abstract:
The invention relates to an actuator element for setting the target mark of a sighting telescope, having a rear housing, a spindle mounted in the rear housing so as to be rotatable about an axis of rotation, a setting knob for moving the spindle in the rear housing, and a retainer for fixing the setting knob relative to the rear housing. The retainer comprises detent toothing in the setting knob extending around a circumference, at least one retaining element engaging with the detent toothing, and at least one lock bar for positively fixing the retaining element in a recess of the detent toothing. The retaining element is mounted in an intermediate housing connected to the rear housing in a non-rotating arrangement.
Abstract:
An apparatus includes a first end and a second end opposed to the first end. A body connects the first end to the second end. The second end includes a mark at the center of the second end, a first plurality of marks positioned on a first axis, a second plurality of marks positioned on the first axis on the opposite side of the center mark from the first plurality of marks, a third plurality of marks positioned on a second axis, a fourth plurality of marks positioned on the second axis on the opposite side of the center mark from the third plurality of marks, a fifth plurality of marks positioned on a third axis, and a sixth plurality of marks positioned on the third axis on the opposite side of the center mark from the fifth plurality of marks.
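The abstract describes the mark layout only qualitatively; the short sketch below, which assumes evenly spaced marks and three axes 60 degrees apart through the center, merely enumerates candidate coordinates for the center mark and the six pluralities of marks. The spacing, mark count, and axis angles are illustrative assumptions.

import math

def mark_positions(marks_per_side: int = 3, spacing: float = 1.0) -> dict:
    # Center mark plus, for each of three axes, one plurality of marks on each
    # side of the center; axis angles of 0, 60, and 120 degrees are assumed.
    positions = {"center": [(0.0, 0.0)]}
    for axis, angle_deg in enumerate((0.0, 60.0, 120.0)):
        a = math.radians(angle_deg)
        ux, uy = math.cos(a), math.sin(a)
        positions[f"axis{axis}_side1"] = [
            (k * spacing * ux, k * spacing * uy) for k in range(1, marks_per_side + 1)]
        positions[f"axis{axis}_side2"] = [
            (-k * spacing * ux, -k * spacing * uy) for k in range(1, marks_per_side + 1)]
    return positions

if __name__ == "__main__":
    for name, pts in mark_positions().items():
        print(name, [(round(x, 2), round(y, 2)) for x, y in pts])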
Abstract:
A calibration method comprises providing a mounting fixture including a tray coupled to a frame, and an alignment measurement sensor removably coupled to the tray. An angular orientation of the tray is determined using the alignment measurement sensor removably coupled to the tray in a first position. The alignment measurement sensor is then moved to a second position on the tray that is rotated from the first position, and the angular orientation of the tray is determined using the alignment measurement sensor at the second position. An axis misalignment for at least two of a pitch axis, a roll axis, or a yaw axis of the alignment measurement sensor is then calculated to determine one or more misalignment factors. The one or more misalignment factors are then applied to correct for misalignment of the alignment measurement sensor.
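The abstract does not state how the misalignment factors are computed; the sketch below assumes the common two-position scheme in which the second position is the first rotated 180 degrees about the yaw axis, so the tray's true tilt cancels when the two readings are averaged and only the sensor's own pitch and roll misalignment remains. Field names and example values are illustrative.

from dataclasses import dataclass

@dataclass
class Orientation:
    # Angular orientation in degrees.
    pitch: float
    roll: float

def misalignment_factors(first: Orientation, second: Orientation) -> Orientation:
    # With the sensor rotated 180 degrees about yaw between the two positions,
    # the tray's tilt changes sign while the sensor's own misalignment does not,
    # so averaging the two readings isolates the misalignment factors.
    return Orientation(pitch=(first.pitch + second.pitch) / 2.0,
                       roll=(first.roll + second.roll) / 2.0)

def apply_correction(measured: Orientation, factors: Orientation) -> Orientation:
    # Apply the misalignment factors to correct subsequent measurements.
    return Orientation(measured.pitch - factors.pitch,
                       measured.roll - factors.roll)

if __name__ == "__main__":
    # Example: tray tilted +0.30 deg in pitch and -0.03 deg in roll;
    # sensor misaligned by +0.05 deg in both pitch and roll.
    pos1 = Orientation(pitch=0.35, roll=0.02)   # +tilt + misalignment
    pos2 = Orientation(pitch=-0.25, roll=0.08)  # -tilt + misalignment
    f = misalignment_factors(pos1, pos2)        # -> pitch 0.05, roll 0.05
    print(f, apply_correction(pos1, f))         # corrected reading recovers the tray tilt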