Abstract:
Aspects of the disclosure relate to determining whether a vehicle should continue through an intersection. For example, one or more of the vehicle's computers may identify a time when the traffic signal light will turn from yellow to red. The one or more computers may also estimate the location of the vehicle at the time when the traffic signal light will turn from yellow to red. A starting point of the intersection may be identified. Based on whether the estimated location of the vehicle is at least a threshold distance past the starting point at the time when the traffic signal light will turn from yellow to red, the computers can determine whether the vehicle should continue through the intersection.
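A minimal sketch of this decision rule, assuming a constant-speed estimate of the vehicle's future position; the parameter names and the threshold value are illustrative and do not come from the disclosure:

```python
def should_continue(current_position_m: float,
                    speed_mps: float,
                    time_until_red_s: float,
                    intersection_start_m: float,
                    threshold_m: float = 2.0) -> bool:
    """Decide whether to continue through the intersection.

    Estimates where the vehicle will be when the light turns red and checks
    whether that location is at least `threshold_m` past the intersection's
    starting point. All names and values are illustrative assumptions.
    """
    # Estimate the vehicle's location at the moment the light turns red,
    # assuming (for simplicity) a constant speed until then.
    estimated_position_m = current_position_m + speed_mps * time_until_red_s

    # Continue only if the vehicle will be sufficiently far past the
    # starting point of the intersection when the light turns red.
    return estimated_position_m >= intersection_start_m + threshold_m
```

For instance, a vehicle 30 m before the intersection's starting point traveling at 15 m/s with 2.5 s until the light turns red would be estimated 7.5 m past the starting point and would continue.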
Abstract:
A light detection and ranging device associated with an autonomous vehicle scans through a scanning zone while emitting light pulses and receives reflected signals corresponding to the light pulses. The reflected signals indicate a three-dimensional point map of the distribution of reflective points in the scanning zone. A hyperspectral sensor images a region of the scanning zone corresponding to a reflective feature indicated by the three-dimensional point map. The output from the hyperspectral sensor includes spectral information characterizing a spectral distribution of radiation received from the reflective feature. The spectral characteristics of the reflective feature allow for distinguishing solid objects from non-solid reflective features, and a map of solid objects is provided to inform real time navigation decisions.
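A brief sketch of how spectral information might be used to keep only solid reflective features from the point map; the cosine-similarity test, the reference spectrum, and all names are assumptions for illustration, not the patented method:

```python
import numpy as np

def filter_solid_objects(point_map, spectra, solid_reference, threshold=0.9):
    """Keep only reflective features whose spectra resemble solid matter.

    point_map       : list of (x, y, z) feature locations from the 3-D point map.
    spectra         : list of per-feature spectral distributions (1-D arrays)
                      from the hyperspectral sensor, one per feature.
    solid_reference : reference spectral distribution for solid materials.
    """
    solid_points = []
    ref = np.asarray(solid_reference, dtype=float)
    for point, spectrum in zip(point_map, spectra):
        spectrum = np.asarray(spectrum, dtype=float)
        # Cosine similarity between the observed and reference spectra.
        similarity = spectrum @ ref / (np.linalg.norm(spectrum) * np.linalg.norm(ref))
        if similarity >= threshold:
            solid_points.append(point)  # treat as a solid obstacle
        # Otherwise the feature (e.g., exhaust plume or fog) is excluded
        # from the map of solid objects used for navigation decisions.
    return solid_points
```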
Abstract:
An autonomous driving computer system determines whether a driving environment has changed. One or more objects and/or object types in the driving environment may be identified as primary objects. The autonomous driving computer system may be configured to detect the primary objects and/or object types, and compare the detected objects and/or object types with the previously known locations of those objects and/or object types. The autonomous driving computer system may obtain several different metrics to facilitate the comparison. A confidence probability obtained from the comparison may indicate the degree of confidence that the autonomous driving computer system has in determining that the driving environment has actually changed.
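A simplified sketch of such a comparison, assuming a per-object displacement check and a plain mismatch fraction as the confidence metric; these choices and all names are illustrative assumptions:

```python
import math

def environment_change_confidence(detected, expected, distance_tolerance_m=1.0):
    """Estimate confidence that the driving environment has changed.

    detected : dict mapping object id -> (x, y) currently detected location.
    expected : dict mapping object id -> (x, y) previously known location.
    Returns a value in [0, 1]; higher means more confidence that the
    environment differs from the prior map.
    """
    if not expected:
        return 0.0

    mismatches = 0
    for obj_id, prior_xy in expected.items():
        if obj_id not in detected:
            mismatches += 1          # expected primary object is missing
            continue
        dx = detected[obj_id][0] - prior_xy[0]
        dy = detected[obj_id][1] - prior_xy[1]
        if math.hypot(dx, dy) > distance_tolerance_m:
            mismatches += 1          # object found far from its known location

    # Fraction of primary objects that disagree with the previously known map.
    return mismatches / len(expected)
```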
Abstract:
Aspects of the present disclosure relate generally to modeling a vehicle's view of its environment. This view need not include the objects or features the vehicle is actually seeing, but rather the areas the vehicle would be able to observe using its sensors if the sensors were completely unoccluded. For example, for each of a plurality of sensors of the object detection component, a computer may generate an individual 3D model of that sensor's field of view. Weather information is received and used to adjust one or more of the models. After this adjusting, the models may be aggregated into a comprehensive 3D model. The comprehensive model may be combined with detailed map information indicating the probability of detecting objects at different locations. A model of the vehicle's environment may be computed based on the combined comprehensive 3D model and detailed map information and may be used to maneuver the vehicle.
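A rough sketch of this aggregation over a voxel grid, assuming a uniform weather scaling factor and element-wise combination with the map's detection probabilities; the array layout and all names are assumptions rather than the disclosed method:

```python
import numpy as np

def build_environment_model(sensor_fovs, weather_factor, detection_prob_map):
    """Combine per-sensor visibility models into one environment model.

    sensor_fovs        : list of 3-D boolean arrays, each marking the voxels a
                         sensor could observe if completely unoccluded.
    weather_factor     : scalar in (0, 1] reducing effective visibility in
                         rain, fog, etc. (applied uniformly here for simplicity).
    detection_prob_map : 3-D array from the detailed map giving the probability
                         of detecting an object at each location.
    All arrays must share the same shape.
    """
    # Adjust each sensor's field-of-view model for current weather, then
    # aggregate the per-sensor models into one comprehensive 3-D model.
    comprehensive = np.zeros_like(detection_prob_map, dtype=float)
    for fov in sensor_fovs:
        comprehensive = np.maximum(comprehensive, fov.astype(float) * weather_factor)

    # Combine the comprehensive model with the map's detection probabilities
    # to produce the environment model used for maneuvering decisions.
    return comprehensive * detection_prob_map
```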
Abstract:
A vehicle configured to operate in an autonomous mode could determine a current state of the vehicle and the current state of the environment of the vehicle. The environment of the vehicle includes at least one other vehicle. A predicted behavior of the at least one other vehicle could be determined based on the current state of the vehicle and the current state of the environment of the vehicle. A confidence level could also be determined based on the predicted behavior, the current state of the vehicle, and the current state of the environment of the vehicle. In some embodiments, the confidence level may be related to the likelihood that the at least one other vehicle will perform the predicted behavior. The vehicle in the autonomous mode could be controlled based on the predicted behavior, the confidence level, and the current state of the vehicle and its environment.
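A small sketch of how a predicted behavior and its confidence level might map to a control action; the behavior labels, confidence thresholds, and action names are hypothetical tuning choices, not values from the disclosure:

```python
def choose_control_action(predicted_behavior, confidence, own_speed_mps):
    """Select a control action from a predicted behavior and its confidence.

    predicted_behavior : e.g. "cut_in", "brake", "maintain_lane" (illustrative labels).
    confidence         : probability in [0, 1] that the other vehicle will
                         actually perform the predicted behavior.
    """
    if predicted_behavior in ("cut_in", "brake") and confidence >= 0.7:
        # High confidence in a behavior that shrinks the available gap:
        # slow down proactively to preserve a safe following distance.
        return {"action": "decelerate", "target_speed_mps": max(own_speed_mps - 3.0, 0.0)}
    if predicted_behavior in ("cut_in", "brake") and confidence >= 0.4:
        # Moderate confidence: hold speed but be ready to brake.
        return {"action": "cover_brake", "target_speed_mps": own_speed_mps}
    # Low confidence or benign behavior: continue on the current plan.
    return {"action": "maintain", "target_speed_mps": own_speed_mps}
```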
Abstract:
An autonomous vehicle detects a tailgating vehicle and uses various response mechanisms. A vehicle is identified as a tailgater based on whether its characteristics meet a variable threshold. When the autonomous vehicle is traveling at slower speeds, the threshold is defined in distance. When the autonomous vehicle is traveling at faster speeds, the threshold is defined in time. The autonomous vehicle responds to the tailgater by modifying its driving behavior. In one example, the autonomous vehicle adjusts a headway buffer (defined in time) from another vehicle in front of the autonomous vehicle. In this regard, if the tailgater is T seconds too close to the autonomous vehicle, the autonomous vehicle increases the headway buffer to the vehicle in front of it by some amount relative to T.
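A compact sketch of the speed-dependent tailgater test and the headway adjustment; the speed cutoff, thresholds, and gain are assumed tuning values, and all names are illustrative:

```python
def is_tailgater(own_speed_mps, gap_behind_m, speed_cutoff_mps=13.4,
                 distance_threshold_m=5.0, time_threshold_s=1.0):
    """Decide whether the vehicle behind is tailgating.

    At slower speeds the threshold is a distance; at faster speeds it is a
    time gap, matching the variable threshold described above.
    """
    if own_speed_mps < speed_cutoff_mps:
        return gap_behind_m < distance_threshold_m
    time_gap_s = gap_behind_m / own_speed_mps
    return time_gap_s < time_threshold_s


def adjusted_headway(base_headway_s, own_speed_mps, gap_behind_m,
                     time_threshold_s=1.0, gain=0.5):
    """Increase the headway buffer to the vehicle ahead when tailgated.

    If the tailgater is T seconds too close, grow the forward headway by an
    amount proportional to T (the gain of 0.5 is an assumed value).
    """
    time_gap_s = gap_behind_m / max(own_speed_mps, 0.1)
    shortfall_t = max(time_threshold_s - time_gap_s, 0.0)
    return base_headway_s + gain * shortfall_t
```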