Abstract:
An access control method is performed on an equipment item linked to elements for rendering at least one haptic signal on a haptic rendering surface, wherein the haptic signal is a touch-discernible texture. The method includes, after an access request has been obtained: ordering rendering of a haptic signal; obtaining information on detection of a gesture of response to the signal; validating the gesture of response when the detected gesture corresponds to a gesture previously associated with the rendered haptic signal; and authorizing access following the validation of the gesture.
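A minimal sketch of the render / detect / validate / authorize flow described above, assuming a mapping from touch-discernible textures to expected response gestures has been established at enrolment; all class and method names (HapticAccessController, render, read_gesture) are illustrative and do not come from the patent text.

```python
# Hypothetical sketch of the render -> detect -> validate -> authorize flow.
# None of these identifiers come from the source abstract.
import random

class HapticAccessController:
    def __init__(self, texture_to_gesture, surface):
        # Mapping established at enrolment: each touch-discernible texture
        # is associated with one expected response gesture.
        self.texture_to_gesture = texture_to_gesture
        self.surface = surface  # object exposing render() and read_gesture()

    def handle_access_request(self):
        # 1. Order rendering of a haptic signal (a texture chosen at random).
        texture = random.choice(list(self.texture_to_gesture))
        self.surface.render(texture)

        # 2. Obtain information on the detected response gesture.
        detected_gesture = self.surface.read_gesture()

        # 3. Validate: the detected gesture must match the gesture
        #    previously associated with the rendered texture.
        if detected_gesture == self.texture_to_gesture[texture]:
            # 4. Authorize access following validation.
            return True
        return False
```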
Abstract:
The invention relates to a method for recognizing handwriting on a physical surface on the basis of three-dimensional signals originating from sensors of a terminal, the signals being obtained from at least three different types of sensors. The method comprises sampling, along three axes and over a sliding time window, the inertial signals originating from the sensors; fusing the sampled signals into a nine-dimensional vector for each sampling period; converting the fused signals into a sequence of characteristic nine-dimensional vectors; and, when a signal characteristic of an input start has been detected, storing the sequence of characteristic vectors in a list of sequences of characteristic vectors, the preceding steps being repeated until a signal characteristic of an input end is detected. On detection of this input-end signal, the method further comprises recognizing a word on the basis of the list of sequences of characteristic vectors created over the time window.
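A minimal sketch of the sampling, fusion and sequence-collection stage, assuming the three sensor types are an accelerometer, a gyroscope and a magnetometer (the abstract only specifies at least three different types of sensors); all identifiers, including the start/end detectors, are illustrative and not taken from the patent.

```python
# Illustrative sampling/fusion stage; the word-recognition step that consumes
# the returned list of sequences is out of scope here.
import numpy as np

def fuse_samples(accel_xyz, gyro_xyz, mag_xyz):
    """Fuse one sampling period's three 3-axis readings into a 9-D vector."""
    return np.concatenate([accel_xyz, gyro_xyz, mag_xyz])  # shape (9,)

def collect_sequences(sample_stream, detect_start, detect_end):
    """Accumulate characteristic 9-D vectors between input-start and
    input-end events and return the list of recorded sequences."""
    sequences, current, writing = [], [], False
    for accel, gyro, mag in sample_stream:
        vec = fuse_samples(accel, gyro, mag)
        if not writing and detect_start(vec):
            writing, current = True, [vec]
        elif writing:
            current.append(vec)
            if detect_end(vec):
                sequences.append(np.stack(current))
                writing = False
    return sequences  # handed to a word-recognition model afterwards
```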
Abstract:
Temporally segmenting an instrumented gesture executed by a user with a terminal having an inertial navigation module, which measures a vector of inertial characteristics representative of movement of the terminal. Segmenting includes, at each current instant: calculating an instantaneous power value of the vector; estimating a gesture indicator based on the variation between the instantaneous power value and a mean power value estimated over a preceding time window; determining a start of gesture at a first instant when the estimated gesture indicator has been greater than or equal to a first threshold for a time interval greater than or equal to a first time interval; and determining an end of gesture at a second instant when the estimated gesture indicator has been less than or equal to a second threshold for a time interval greater than or equal to a second time interval.
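A minimal sketch of this segmentation logic, assuming the gesture indicator is taken as the ratio of instantaneous power to the mean power over the preceding window; the thresholds, window length and minimum durations are illustrative assumptions, not values from the patent.

```python
# Illustrative gesture segmentation; not the patented implementation.
from collections import deque
import numpy as np

def segment_gestures(samples, fs, win_s=0.5, thr_hi=2.0, thr_lo=0.5,
                     min_start_s=0.1, min_end_s=0.2):
    """Yield (start_index, end_index) pairs for detected gestures.

    samples : iterable of inertial characteristic vectors (e.g. shape (6,))
    fs      : sampling frequency in Hz
    """
    window = deque(maxlen=int(win_s * fs))   # preceding time window of powers
    min_start = int(min_start_s * fs)        # first time interval, in samples
    min_end = int(min_end_s * fs)            # second time interval, in samples
    above = below = 0
    in_gesture = False
    start_idx = None

    for i, x in enumerate(samples):
        p_inst = float(np.dot(x, x))                    # instantaneous power
        p_mean = np.mean(window) if window else p_inst  # mean power over window
        window.append(p_inst)

        # Gesture indicator: variation of instantaneous power vs. mean power.
        indicator = p_inst / (p_mean + 1e-9)

        if not in_gesture:
            above = above + 1 if indicator >= thr_hi else 0
            if above >= min_start:           # sustained for >= first interval
                in_gesture, start_idx = True, i - above + 1
                above, below = 0, 0
        else:
            below = below + 1 if indicator <= thr_lo else 0
            if below >= min_end:             # sustained for >= second interval
                in_gesture = False
                yield start_idx, i - below + 1
                below = 0
```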
Abstract:
A method is provided for determining a mobility context for a user carrying a device fitted with inertial sensors. The method includes measuring inertial data from a set of inertial sensors of the device, taken at the time of execution of a reference gesture with the device, comparing the measured inertial data with prerecorded inertial data for various mobility contexts and for this reference gesture, and determining the mobility context according to the result of the comparison. A device and a server are also provided, which are capable of implementing the method described.
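A minimal sketch of the comparison step, assuming the prerecorded inertial data are stored as one reference feature vector per mobility context for the reference gesture; the Euclidean distance and the example context labels are illustrative assumptions, not part of the patent text.

```python
# Illustrative nearest-reference comparison for mobility-context determination.
import numpy as np

def determine_mobility_context(measured, prerecorded):
    """Return the mobility context whose prerecorded inertial data is
    closest to the measurement taken during the reference gesture.

    measured    : 1-D array of inertial features for the reference gesture
    prerecorded : dict mapping context name -> 1-D array of the same length
    """
    best_context, best_dist = None, float("inf")
    for context, ref in prerecorded.items():
        dist = np.linalg.norm(measured - ref)   # compare measured vs. recorded
        if dist < best_dist:
            best_context, best_dist = context, dist
    return best_context

# Example (hypothetical labels): prerecorded could hold reference vectors for
# "walking", "in_vehicle" and "stationary".
```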
Abstract:
A method of recognizing an instrumented gesture performed by a user with a mobile terminal having a plurality of physical measurement sensors. The method includes searching for at least one stored entry corresponding to a pair formed by one of at least one recognized useful gesture and one of at least one recognized usage context; obtaining a corresponding appearance probability per pair, calculated as a function of the number of occurrences on which the pair has been recognized; deciding to confirm a pair as a function of the appearance probability; in the event of not recognizing at least one current usage context, requesting the user to confirm the recognized gesture; and, if confirmed, storing analyzed physical measurement values as a new usage context and storing a new pair formed by the recognized gesture and the new usage context, with an associated appearance probability.
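A minimal sketch of the pair lookup and confirmation logic, assuming pairs are stored with occurrence counts and the appearance probability is a pair's share of all confirmed observations; the decision threshold, placeholder context label and user-confirmation callback are illustrative assumptions.

```python
# Illustrative store of (gesture, usage context) pairs with occurrence counts.
from collections import defaultdict

class GestureContextStore:
    def __init__(self):
        # (gesture, context) -> number of occurrences the pair was recognized
        self.counts = defaultdict(int)
        self.total = 0

    def appearance_probability(self, gesture, context):
        if self.total == 0:
            return 0.0
        return self.counts[(gesture, context)] / self.total

    def observe(self, gesture, context, confirm_with_user, threshold=0.1):
        """Handle one recognition event and decide whether to confirm it."""
        if context is not None:
            # Known usage context: confirm the pair from its appearance probability.
            confirmed = self.appearance_probability(gesture, context) >= threshold
        else:
            # Current usage context not recognized: ask the user to confirm the
            # gesture, then register the analysed measurements as a new context.
            confirmed = confirm_with_user(gesture)
            context = f"context_{len(self.counts)}"   # placeholder label
        if confirmed:
            self.counts[(gesture, context)] += 1      # new or existing pair
            self.total += 1
        return confirmed, context
```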