Abstract:
Systems (100) and methods (600) for operating an exoskeleton disposed at least partially on a joint of a wearer's limb (118). The methods involve respectively aligning first apertures (310 or 312) of a first planar flexible element (304 or 306) of the exoskeleton with second apertures (310 or 312) of a second planar flexible element (304 or 306) of the exoskeleton. The first and second planar flexible elements abut each other. A toothed flexible element (302) is then caused to ratchetedly engage the first and second planar flexible elements by bending the joint.
Abstract:
System and method for operating a robotic exoskeleton involves using a control system (107) to monitor an output of one or more electrical activity sensors (202) disposed on a human operator. The control system determines if an output of the electrical activity sensors corresponds to a predetermined neural or neuromuscular condition of the operator. Based on the determining step, the control system automatically chooses an operating mode from among a plurality of different operating modes. The operating mode selected determines the response the control system will have to control inputs from the human operator.
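The mode-selection logic described above could be sketched as follows. The sensor quantity (a smoothed EMG amplitude), the thresholds, and the mode names are all illustrative assumptions, not details taken from the abstract:

```python
def classify_condition(emg_rms):
    """Map a smoothed EMG amplitude (assumed, normalized 0-1) to a
    coarse neuromuscular condition of the operator."""
    if emg_rms < 0.1:
        return "relaxed"
    if emg_rms < 0.6:
        return "normal_effort"
    return "spasm_or_overload"

def choose_operating_mode(condition):
    """Select how the controller responds to operator control inputs;
    mode names are hypothetical."""
    modes = {
        "relaxed": "standby",             # ignore small, unintended inputs
        "normal_effort": "assistive",     # amplify the operator's motion
        "spasm_or_overload": "safe_hold", # lock out inputs, hold posture
    }
    return modes[condition]
```

The key point the abstract makes is that the same physical control input is interpreted differently depending on the automatically selected mode.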
Abstract:
A system for preventing discomfort to a user of a robotic exoskeleton (200) determines the existence of an exoskeleton operating condition which has the potential to cause at least one of a discomfort or an injury to a user (204) when the exoskeleton is being worn by the user. Responsive to the determining, an exoskeleton control system (224) selectively controls at least one viscous coupling (208, 210) disposed at an interface location (201, 203) of the exoskeleton where a physical interaction occurs between a portion of the user and a portion of the exoskeleton when the exoskeleton is in use. The control system selectively varies a viscosity of a fluid (216) comprising the viscous coupling to control the stiffness of the interface.
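One way to read the viscosity control described above is as a mapping from a measured interface load to a target fluid viscosity, softening the coupling once a comfort threshold is exceeded. The pressure units, limit value, and linear softening law below are assumptions for illustration only:

```python
def target_viscosity(pressure_kpa, comfort_limit_kpa=40.0,
                     min_visc=0.2, max_visc=4.0):
    """Return a target viscosity (arbitrary units) for the coupling fluid.

    Below the assumed comfort limit the coupling stays stiff (high
    viscosity) for efficient force transfer; above it, the interface is
    softened linearly, clamped at a minimum viscosity.
    """
    if pressure_kpa <= comfort_limit_kpa:
        return max_visc
    excess = pressure_kpa - comfort_limit_kpa
    return max(min_visc, max_visc - 0.1 * excess)
```

The specific control law would depend on the fluid used (e.g., a magnetorheological fluid) and on how the operating condition is detected.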
Abstract:
Method for controlling an exoskeleton (100) involves detecting an occurrence of an uncontrolled acceleration of at least a portion of the exoskeleton, as might occur during a fall. In response, the exoskeleton is caused to automatically transition at least one motion actuator (104a, 104b) from a first operational state to a second operational state. In the first operational state, the one or more motion actuators are configured to provide a motive force for controlled movement of the exoskeleton. In the second operational state, the one or more motion actuators are configured to function as energy dampers which dissipate a shock load exerted upon the exoskeleton.
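The state transition described above can be sketched as a simple threshold check on measured acceleration. The freefall threshold and state names are assumed values, not taken from the abstract:

```python
FREEFALL_THRESHOLD_G = 0.3  # assumed: near-zero net g suggests a fall

def actuator_state(accel_magnitude_g, current_state="drive"):
    """Transition a motion actuator from its motive ('drive') state to an
    energy-dissipating ('damper') state when an uncontrolled acceleration,
    such as freefall, is detected."""
    if accel_magnitude_g < FREEFALL_THRESHOLD_G:
        return "damper"  # dissipate the shock load instead of driving motion
    return current_state
```

In practice the detection would likely combine accelerometer data with other signals to avoid false triggers during normal gait.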
Abstract:
Systems (100) and methods (700) for increasing a predictability of Telematic Operations (“TOs”) of a Teleoperation System (“TS”). The methods involve: measuring an inherent latency of a Communications Link (“CL”) of the TS, which varies unpredictably over at least a first window of time; analyzing the inherent latency, which was previously measured, to determine a first reference value useful for increasing the predictability of the TOs; using the first reference value to select an amount of controlled latency to be added to CL (120) at each of a plurality of time points (502-518); and adding the amount of controlled latency to CL at each of the plurality of time points so as to increase the predictability of the TOs. In some scenarios, the amount of controlled latency added at a first time point is different from the amount of controlled latency added at a second time point.
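The idea of padding a variable link latency up to a reference value can be sketched as below. The choice of a high percentile as the reference value is an assumption; the abstract only says a reference value is derived from previously measured latency:

```python
def reference_latency(samples_ms, percentile=0.95):
    """Derive a reference value from measured link latencies over a window.
    Here a high percentile is used (an assumed choice) so the total
    latency can later be held near-constant."""
    ordered = sorted(samples_ms)
    idx = min(len(ordered) - 1, int(percentile * len(ordered)))
    return ordered[idx]

def controlled_latency(measured_ms, reference_ms):
    """Artificial delay to add at a given time point: pad the measured
    latency up to the reference, never below zero."""
    return max(0.0, reference_ms - measured_ms)
```

Because the measured latency differs from packet to packet, the added delay also differs from time point to time point, which matches the final sentence of the abstract.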
Abstract:
Method and system for telematic control of a slave device. Displacement of a user interface control is sensed with respect to a control direction. A first directional translation is performed to convert data specifying the control direction to data specifying a slave direction. The slave direction will generally be different from the control direction and defines a direction that the slave device should move in response to the physical displacement of the user interface. A second directional translation is performed to convert haptic sensor data to a haptic feedback direction. The haptic feedback direction will generally be different from the direction indicated by the haptic sensor data and can define a direction of force to be generated by at least one component of the user interface. The first and second directional translations are determined based on a point-of-view of an imaging sensor.
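In the planar case, both directional translations can be sketched as rotations by the imaging sensor's yaw angle, with the haptic translation using the inverse rotation. Restricting to 2-D and using yaw alone are simplifying assumptions for illustration:

```python
import math

def rotate2d(vec, angle_rad):
    """Rotate a 2-D direction vector counterclockwise by angle_rad."""
    x, y = vec
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

def control_to_slave(control_dir, camera_yaw_rad):
    """First translation: map the operator's control direction into the
    slave frame, based on the camera point-of-view."""
    return rotate2d(control_dir, camera_yaw_rad)

def haptic_to_feedback(slave_force, camera_yaw_rad):
    """Second translation: map a sensed force back into the operator's
    frame using the inverse rotation."""
    return rotate2d(slave_force, -camera_yaw_rad)
```

This captures the abstract's point that both mappings are derived from the same camera point-of-view, so "push away from the camera" always moves the slave away on screen regardless of how the camera is oriented.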
Abstract:
Control units (10) for use with unmanned vehicles (12) include an input device (50) that moves in response to a user input, sensors (70) coupled to the input device (50), and a controller (16). The sensors (70) generate outputs related to the movement of the input device (50). The controller (16) determines a target displacement of the unmanned vehicle (12) based on the outputs of the sensors (70), and generates a control input related to the target displacement. The control input, when received by the unmanned vehicle (12), causes the unmanned vehicle (12) to substantially attain the target displacement. The position of the vehicle (12) is thus controlled by directly controlling the displacement of the vehicle (12).
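The displacement-based control scheme above can be sketched as a mapping from sensed input-device motion to a commanded vehicle displacement. The gain and units are illustrative assumptions:

```python
def target_displacement(sensor_deflection_mm, gain=0.05):
    """Map sensed input-device deflection (mm) to a target vehicle
    displacement in metres; the gain is an assumed scale factor."""
    return gain * sensor_deflection_mm

def position_command(current_pos_m, sensor_deflection_mm):
    """Generate the position the vehicle should attain: its current
    position plus the target displacement."""
    return current_pos_m + target_displacement(sensor_deflection_mm)
```

The distinguishing claim is that the control input encodes a displacement target rather than, say, a velocity setpoint, so the vehicle's position is controlled directly.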
Abstract:
System (100) and methods (500) for remotely controlling a slave device (102). The methods involve: using a Hybrid Hand Controller (“HHC”) as a full haptic controller to control the slave device when the HHC (406) is coupled to a docking station (460); detecting when the HHC is or is being physically de-coupled from the docking station; automatically and seamlessly transitioning an operational mode of at least the HHC from a full haptic control mode to a gestural control mode, in response to a detection that the HHC is or is being de-coupled from the docking station; and using at least the HHC as a portable gestural controller to control the slave device when the HHC is de-coupled from the docking station.
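The docking-dependent mode switch could be sketched as a small state machine driven by a dock-detection signal. The class, attribute, and mode names below are assumptions for illustration:

```python
class HybridHandController:
    """Sketch of the HHC's dock-dependent operational modes."""

    def __init__(self):
        self.docked = True
        self.mode = "full_haptic"  # grounded force feedback while docked

    def on_dock_sensor(self, docked):
        """Transition modes automatically when the docking state changes,
        so the operator never issues an explicit mode command."""
        if docked and not self.docked:
            self.mode = "full_haptic"  # regained grounded force feedback
        elif not docked and self.docked:
            self.mode = "gestural"     # portable, gesture-based control
        self.docked = docked
```

The transition being triggered by the dock sensor itself, rather than by an operator command, is what makes the handover "automatic and seamless" in the abstract's terms.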
Abstract:
Robotic manipulator arm has an end portion to which one or more end effector appliances can be operably mounted for performing one or more manipulator arm operations. A control system has access to a plurality of different end effector appliance parameter sets which are respectively associated with the plurality of different end effector appliances. A user interface facilitates identification to the control system of one or more of the different end effector appliances which are installed on the manipulator arm. The control system is responsive to the identification to modify a control algorithm.
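The appliance-specific parameter lookup above could be sketched as a table keyed by the identified end effector. The parameter fields and appliance names are assumed; the abstract only states that an appliance-specific parameter set modifies the control algorithm:

```python
# Hypothetical per-appliance parameter sets (fields are illustrative).
APPLIANCE_PARAMS = {
    "gripper": {"mass_kg": 0.8, "max_force_n": 60.0},
    "cutter":  {"mass_kg": 1.2, "max_force_n": 200.0},
}

def configure_controller(appliance_id):
    """Return the control parameters for the appliance identified via
    the user interface; the control algorithm is then adjusted (e.g.,
    payload compensation, force limits) using these values."""
    try:
        return APPLIANCE_PARAMS[appliance_id]
    except KeyError:
        raise ValueError(f"unknown end effector appliance: {appliance_id}")
```

This reflects the abstract's flow: the user interface identifies the installed appliance, and the control system responds by loading that appliance's parameter set.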