Cars on the Road!
Cooperative Vehicle Safety Applications Using an Integrated Positioning Solution
In order to achieve more reliable and practical autonomous vehicle navigation, user equipment must overcome the difficulty of urban canyons, where blocked and reflected GNSS signals are a familiar challenge to continuous, precise positioning. The authors present a multi-sensor fusion positioning solution that supports the performance goals of cooperative vehicle safety applications in such environments. This article examines two architectures that use configurations of integrated GPS/inertial navigation, clocks, odometers, and lidar, evaluating their performance with data collected while navigating through downtown Detroit.
Cooperative vehicle safety applications should preferably have two-meter horizontal accuracy and six-meter vertical accuracy, each with 95-percent availability. The solution must be designed to incorporate lower-cost sensor options, specifically, lower-cost inertial measurement units that can be generally characterized by a gyro drift of 100 degrees per hour and an accelerometer bias on the order of two milli-g (about 0.02 meters per second squared).
Our implementation of a cooperative vehicle safety system uses a low-latency 5.9 GHz communication link among vehicles and roadside infrastructure. This enables each vehicle to continually assess the chance of a collision. If the collision probability is high, the system may generate an in-vehicle warning for the driver, or even automatically initiate actions to help prevent the collision. A vehicle equipped with this system knows its own location and path, while it wirelessly monitors the locations and paths of surrounding vehicles.
These applications rely on two main technologies: (1) information exchange using dedicated short-range communications (DSRC), and (2) positioning using GNSS, although various other technologies are involved.
Although GNSS satisfies the desired accuracy level in open areas where unobstructed signals are available, it fails to support the desired performance in dense urban environments. In order to achieve the set performance goals, GNSS must be augmented with other sensors.
In this article we describe a multi-sensor architecture developed to enable precise positioning capabilities in difficult GNSS environments (such as urban canyons) for cooperative vehicle safety applications. Our overall goal is to enable meter-accurate absolute positioning in dense urban environments at a low cost.
System Design: The Technologies
The integrated solution is based on a generic multi-sensor fusion architecture described in the article by A. Soloviev and M. Miller (please refer to Additional Resources). The architecture utilizes a self-contained inertial navigation system (INS) as a core sensor. Other navigation aids generally extract their navigation-related measurements from external information, which may or may not be available. When available, these externally dependent measurements are applied to estimate inertial error states and mitigate the INS output drift.
We initially derive the navigation solution from the inertial data, estimate inertial error states from the aiding measurements, and then adjust the INS outputs.
Data fusion is performed at a tightly coupled level, where observable measurements are fed to a complementary Kalman filter, and differences between aiding measurements and predictions are computed using the inertial data. This solution enables a generic formulation where adding/dropping of an aiding source is simply achieved by adding/dropping its corresponding measurement observables to/from the Kalman filter without the need to redesign the entire system architecture.
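As a minimal sketch (not the authors' implementation) of this pluggable design, each aiding source can be reduced to a (z, H, R) triple, where z is the aiding measurement minus the INS-based prediction. Adding or dropping a source then just adds or drops its triple; the filter itself never changes. State dimensions and noise values below are invented for illustration:

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Generic complementary-filter measurement update.

    x, P : predicted INS error state and covariance
    z    : observable = aiding measurement minus INS-based prediction
    H, R : that source's observation matrix and noise covariance
    """
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - H @ x)             # updated error-state estimate
    P = (np.eye(len(x)) - K @ H) @ P    # updated covariance
    return x, P

# Adding/dropping an aiding source is just adding/dropping its
# (z, H, R) triple from this list -- the filter is untouched.
x = np.zeros(3)                         # e.g., 3 position error states
P = np.eye(3) * 10.0
sources = [
    (np.array([1.2]), np.array([[1.0, 0.0, 0.0]]), np.array([[0.5]])),  # GPS-like
    (np.array([0.9]), np.array([[1.0, 0.0, 0.0]]), np.array([[1.0]])),  # ODO-like
]
for z, H, R in sources:
    x, P = kalman_update(x, P, z, H, R)
```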
The aiding sources evaluated as part of this effort include a GPS receiver, an enhanced GPS receiver that is frequency-aided by an oven-controlled crystal oscillator (OCXO) for clock stability, an odometer (ODO), a video camera, and a scanning lidar.
To support a low-cost implementation, integration algorithms apply GPS carrier phase measurements for periodic estimation of inertial error states. These algorithms utilize temporal carrier phase differences, which enable efficient inertial measurement unit (IMU) drift estimation without needing to resolve integer ambiguities.
As a result, relatively low-grade micro-electro-mechanical system (MEMS) inertial sensors can be used. This reduces the cost of the inertial sensor component by at least an order of magnitude, when compared with tactical and navigation grade sensors employed in existing commercial off-the-shelf solutions.
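The key property of temporal differencing can be shown in a few lines: the integer ambiguity is constant between epochs (absent cycle slips), so it cancels in the difference and never has to be resolved. The range values and ambiguity below are invented for illustration:

```python
# Time-differencing carrier phase: the integer ambiguity N is the same
# at both epochs (absent cycle slips), so it drops out of the difference.
L1_WAVELENGTH = 0.1902936728  # meters (GPS L1)

def delta_range(phase_prev_cycles, phase_curr_cycles):
    """Range change between epochs from carrier phase, in meters.

    No ambiguity resolution is needed: the unknown integer N cancels.
    """
    return (phase_curr_cycles - phase_prev_cycles) * L1_WAVELENGTH

# Illustrative: true range changed by 5 m; ambiguity N = 123456 cycles.
N = 123456.0
phi_prev = 20_000_000.0 / L1_WAVELENGTH + N
phi_curr = 20_000_005.0 / L1_WAVELENGTH + N
dr = delta_range(phi_prev, phi_curr)   # ~5.0 m, independent of N
```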
The multi-sensor fusion solution was developed and evaluated over the following configurations: GPS/INS, GPS/INS/clock, GPS/INS/clock/ODO, GPS/INS/clock/ODO/Video, and GPS/INS/clock/ODO/Lidar. All five configurations were implemented and initially evaluated in a high-fidelity simulation environment.
Based on the simulation analysis, the top two solution options were identified as GPS/INS/clock/ODO and GPS/INS/clock/ODO/Lidar. We then tested these two further by collecting experimental data in downtown Detroit, Michigan.
. . .
Inertial Navigation. As shown in Figure 1, inertial navigation implements a standard strapdown INS mechanization routine that includes attitude determination, coordinate transformation, gravity compensation, and velocity/position integration steps.
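A heavily simplified single step of such a mechanization (flat, non-rotating Earth, small-angle attitude update, no Coriolis terms — all simplifying assumptions for illustration, not the authors' implementation) looks like:

```python
import numpy as np

def strapdown_step(pos, vel, C_bn, gyro, accel, dt,
                   g=np.array([0.0, 0.0, -9.81])):
    """One simplified strapdown mechanization step.

    1. attitude update from body angular rates (small-angle approx.)
    2. coordinate transformation of specific force to the nav frame
    3. gravity compensation
    4. velocity and position integration
    """
    # 1. attitude: integrate body rates into the body-to-nav DCM
    wx, wy, wz = gyro * dt
    Omega = np.array([[0.0, -wz, wy], [wz, 0.0, -wx], [-wy, wx, 0.0]])
    C_bn = C_bn @ (np.eye(3) + Omega)
    # 2-3. transform specific force to nav frame, then remove gravity
    a_nav = C_bn @ accel + g
    # 4. integrate acceleration to velocity, velocity to position
    vel = vel + a_nav * dt
    pos = pos + vel * dt
    return pos, vel, C_bn
```

For a stationary, level IMU the accelerometers sense +1 g upward, which the gravity-compensation step cancels, so velocity and position stay put.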
. . .
GPS. The GPS position solution is provided directly by the GPS receiver. Additional processing of GPS measurements includes velocity estimation to initialize the INS’s alignment, and an internal integrity check. GPS velocity is derived from carrier phase changes over time, using the carrier phase from the current and previous measurement epochs.
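One way to sketch this (an illustrative simplification, not the receiver's actual algorithm): convert each satellite's carrier-phase change over the epoch into a range rate, then solve least-squares for the receiver velocity along the line-of-sight geometry. Receiver clock drift is assumed already removed here, and the satellite geometry is invented:

```python
import numpy as np

def velocity_from_delta_phase(los_units, delta_phase_m, dt):
    """Least-squares receiver velocity from carrier-phase changes.

    delta_phase_m : phase change over the epoch, in meters, per
                    satellite (clock drift assumed removed)
    los_units     : unit line-of-sight vectors, receiver -> satellite
    Range rate ~ -los . v, so we solve (-los) v = delta_phase / dt.
    """
    A = -np.asarray(los_units, float)
    rates = np.asarray(delta_phase_m, float) / dt
    v, *_ = np.linalg.lstsq(A, rates, rcond=None)
    return v

# Illustrative geometry: four satellites, true velocity 10 m/s east-ish
v_true = np.array([10.0, 2.0, 0.0])
los = np.array([[1, 0, 1], [0, 1, 1], [-1, 0, 1], [0, -1, 1]], float)
los /= np.linalg.norm(los, axis=1, keepdims=True)
dphi = (-los @ v_true) * 0.1            # phase change over a 0.1 s epoch
v = velocity_from_delta_phase(los, dphi, 0.1)
```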
. . .
Odometer. An internal wheel speed sensor provides the odometer measurements. The wheel-speed counter value is periodically read and used to compute an ODO-based navigation solution. Because the counter is read directly from the sensor, the odometer measurements are precisely time-stamped, which is critical for their efficient use in the sensor fusion algorithm.
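The conversion from counter reads to speed is straightforward; what matters for fusion is that both reads carry precise timestamps. The tick count and wheel circumference below are illustrative values, not the test vehicle's calibration:

```python
def odo_speed(count_prev, count_curr, t_prev, t_curr,
              ticks_per_rev=100, wheel_circumference_m=1.94):
    """Vehicle speed from two time-stamped wheel-counter reads.

    ticks_per_rev and wheel_circumference_m are illustrative; a real
    implementation must also handle counter rollover and calibrate
    the effective wheel circumference (tire pressure, wear).
    """
    ticks = count_curr - count_prev
    distance = ticks / ticks_per_rev * wheel_circumference_m
    return distance / (t_curr - t_prev)   # m/s
```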
. . .
Laser Scanner. . . . Ranging pulses are sent across the scanner’s angular measurement range; hence, each scan image is represented by an array of angles and a corresponding ranging measurement for each angle. The current system implementation uses a laser scanner with an angular range from 0 to 180 degrees and an angular resolution of 1 degree. The laser is mounted on the front edge of the roof of the test vehicle with a forward orientation and scans an approximately horizontal plane.
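Each scan can therefore be unpacked into 2-D points in the scanner frame by pairing each range with its angle, a sketch of which (assuming the 0–180 degree, 1-degree-step indexing described above) is:

```python
import math

def scan_to_points(ranges_m, angle_start_deg=0.0, angle_step_deg=1.0):
    """Convert one scan (one range per angle) to 2-D points in the
    scanner frame: index i maps to angle_start_deg + i * angle_step_deg.
    """
    pts = []
    for i, r in enumerate(ranges_m):
        a = math.radians(angle_start_deg + i * angle_step_deg)
        pts.append((r * math.cos(a), r * math.sin(a)))
    return pts
```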
. . .
If the internal GPS integrity check detects a failure or an insufficient number of satellites (fewer than five), then an INS-based integrity check is also implemented. The latter integrity check predicts measurements using inertial data, compares predicted measurements with actual sensor measurements, and removes those measurements for which large discrepancies exist between predicted and measured values.
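The core of the INS-based check reduces to innovation screening: compare each sensor measurement against its INS-predicted value and discard outliers. A minimal sketch (threshold and values invented for illustration):

```python
def screen_measurements(predicted, measured, threshold):
    """INS-based integrity check: keep only measurements whose
    discrepancy from the INS-predicted value is within a threshold.

    Returns the indices of measurements that pass the check.
    """
    return [i for i, (p, m) in enumerate(zip(predicted, measured))
            if abs(m - p) <= threshold]
```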
. . .
Next, the estimation step is carried out. If at least one reference measurement is available after the integrity check, reference measurements are fused with predicted system states (that is, predicted INS error states) to compute updated estimates of the system states.
If no reference measurements are available, state predictions simply become state estimations. Generally speaking, the estimation step uses new information that is provided by the reference measurements to update predicted system states.
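This branching behavior can be shown with a scalar toy filter (a sketch, not the authors' filter; the noise variance r is an invented parameter): with no surviving measurements the prediction passes through unchanged, and each surviving measurement pulls the estimate toward itself in proportion to the gain.

```python
def estimation_step(x_pred, p_pred, measurements, r=1.0):
    """Scalar illustration of the estimation step.

    measurements : reference measurements that survived the integrity
    check. With an empty list, the prediction becomes the estimate.
    """
    x, p = x_pred, p_pred
    for z in measurements:
        k = p / (p + r)        # gain weighs prediction vs. measurement
        x = x + k * (z - x)
        p = (1 - k) * p
    return x, p
```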
. . .
Much of the testing was performed in two-day blocks, with attempts to maximize coverage of the changing satellite constellation by shifting 10-hour test sessions by four hours between consecutive days. The data was processed off-line by the position estimation algorithm running in commercial computational software on a PC.
The vehicle/equipment setup includes:
. . .
Manufacturers
The GPS receiver with WAAS corrections was an OEMV Propak-V3-L1 from NovAtel Inc., Calgary, Alberta, Canada. The integrated GNSS/inertial unit was NovAtel’s SPAN-SE with an HG1700 AG58 ring laser gyro from Honeywell Aerospace, Phoenix, Arizona, USA. The high-sensitivity receiver was an LEA-5T from u-blox AG, Thalwil, Switzerland. The lower-cost IMU was an MMQ-50 from Systron Donner Inertial, Concord, California, USA. The laser scanner was the LMS-200 lidar from SICK AG, Waldkirch, Germany. Matlab software from The MathWorks, Inc., Natick, Massachusetts, USA, was used to process the results.
Copyright © 2017 Gibbons Media & Research LLC, all rights reserved.