Inside GNSS: Engineering Solutions from the Global Navigation Satellite System Community
Thought Leadership Series

Cars on the Road!

Cooperative Vehicle Safety Applications Using an Integrated Positioning Solution

FIGURE 1: System architecture
In order to achieve more reliable and practical autonomous vehicle navigation, user equipment must overcome the difficulty of urban canyons, where blocked and reflected GNSS signals are a familiar challenge to ensuring continuous, precise positioning. The authors present a multi-sensor fusion positioning solution that supports the performance goals of cooperative vehicle safety applications in such places. This article examines two architectures that use configurations of integrated GPS/inertial navigation, clocks, odometers, and lidar, evaluating their performance using data collected while navigating through downtown Detroit.


Cooperative vehicle safety applications should preferably have two-meter horizontal accuracy and six-meter vertical accuracy, all with a 95-percent availability. The solution must be developed to incorporate lower-cost sensor options, specifically, lower-cost inertial measurement units that can be generally characterized by a gyro drift of 100 degrees per hour and an accelerometer bias of two milli-g (0.002 times the acceleration of gravity).
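For concreteness, these IMU specifications can be converted to SI units; the short sketch below (plain Python, standard constants) shows the arithmetic:

```python
import math

DEG = math.pi / 180.0   # degrees to radians
G = 9.80665             # standard gravity, m/s^2

gyro_drift_deg_hr = 100.0   # gyro drift specification, deg/hr
accel_bias_g = 2e-3         # accelerometer bias specification, in units of g

gyro_drift_rad_s = gyro_drift_deg_hr * DEG / 3600.0
accel_bias_ms2 = accel_bias_g * G

print(f"gyro drift: {gyro_drift_rad_s:.2e} rad/s")
print(f"accel bias: {accel_bias_ms2:.4f} m/s^2")
```

A 100-degree-per-hour drift is roughly 4.8 × 10⁻⁴ rad/s and a two-milli-g bias roughly 0.02 m/s², figures in the range of low-cost MEMS sensors.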

Our implementation of a cooperative vehicle safety system uses a low-latency 5.9 GHz communication link among vehicles and roadside infrastructure. This enables each vehicle to continually assess the chance of a collision. If the collision probability is high, the system may generate an in-vehicle warning for the driver, or even automatically initiate actions to help prevent the collision. A vehicle equipped with this system knows its own location and path, while it wirelessly monitors the locations and paths of surrounding vehicles.

These applications rely on two main technologies: (1) information exchange using dedicated short-range communications (DSRC), and (2) location using GNSS, although various other technologies are involved.

Although GNSS satisfies the desired accuracy level in open areas where unobstructed signals are available, it fails to support the desired performance in dense urban environments. In order to achieve the set performance goals, GNSS must be augmented with other sensors.

In this article we describe a multi-sensor architecture developed to enable precise positioning capabilities in difficult GNSS environments (such as urban canyons) for cooperative vehicle safety applications. Our overall goal is to enable meter-accurate absolute positioning in dense urban environments at a low cost.

System Design: The Technologies
Our solution develops a generic inertial-aided approach in which aiding sources can include GNSS, odometer, scanning lidar, and video cameras. We present the system architecture, describe specific system components, and evaluate its performance with experimental data collected in actual urban test environments.

The integrated solution is based on a generic multi-sensor fusion architecture described in the article by A. Soloviev and M. Miller (please refer to Additional Resources). The architecture utilizes a self-contained inertial navigation system (INS) as a core sensor. Other navigation aids generally extract their navigation-related measurements from external information, which may or may not be available. When available, these externally dependent measurements are applied to estimate inertial error states and mitigate the INS output drift.

We initially derive the navigation solution from the inertial data, estimate inertial error states from the aiding measurements, and then adjust the INS outputs.

Data fusion is performed at a tightly coupled level, where observable measurements are fed to a complementary Kalman filter, and differences between aiding measurements and predictions are computed using the inertial data. This solution enables a generic formulation where adding/dropping of an aiding source is simply achieved by adding/dropping its corresponding measurement observables to/from the Kalman filter without the need to redesign the entire system architecture.
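The plug-and-play property described above can be sketched as follows. This is not the authors' code; the per-source structure and the measurement matrices are hypothetical, but it illustrates how available aiding sources simply contribute their innovation rows (aiding measurement minus INS-based prediction) to a stacked measurement vector:

```python
import numpy as np

def stack_observables(sources):
    """Stack innovations (aiding measurement minus INS prediction) and
    measurement matrices from whichever aiding sources are available this
    epoch. Each source supplies (z_meas, z_ins_pred, H)."""
    innovations, rows = [], []
    for z_meas, z_pred, H in sources:
        innovations.append(z_meas - z_pred)
        rows.append(H)
    if not innovations:
        return None, None   # no Kalman update this epoch
    return np.concatenate(innovations), np.vstack(rows)

# Example: one GPS delta-range observable plus one odometer speed
# observable, against a 3-element error state (illustration only).
gps = (np.array([12.34]), np.array([12.30]), np.array([[1.0, 0.0, 0.0]]))
odo = (np.array([8.90]),  np.array([8.80]),  np.array([[0.0, 1.0, 0.0]]))
z, H = stack_observables([gps, odo])
print(z, H.shape)
```

Dropping a source just means leaving its tuple out of the list; no other part of the filter changes.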

The aiding measurements evaluated as part of this effort include a GPS receiver, an enhanced GPS receiver that is frequency-aided by an oven-controlled crystal oscillator (OCXO) for clock stability, an odometer (ODO), a video camera, and a scanning lidar.

To support a low-cost implementation, integration algorithms apply GPS carrier phase measurements for periodic estimation of inertial error states. These algorithms utilize temporal carrier phase differences, which enable efficient inertial measurement unit (IMU) drift estimation without needing to resolve integer ambiguities.
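The ambiguity cancellation can be illustrated with an idealized carrier-phase model (clock and atmospheric terms omitted; the numbers are synthetic). Because the integer ambiguity is constant while the receiver keeps lock, differencing the phase between consecutive epochs removes it:

```python
L1_WAVELENGTH = 0.1903  # GPS L1 carrier wavelength, m (c / 1575.42 MHz)

def phase_cycles(range_m, ambiguity_cycles):
    """Idealized carrier phase: geometric range in cycles plus a constant
    integer ambiguity (clock and atmosphere terms omitted)."""
    return range_m / L1_WAVELENGTH + ambiguity_cycles

N = 123456                                  # unknown integer ambiguity
phi_prev = phase_cycles(20_000_000.0, N)    # epoch k-1
phi_curr = phase_cycles(20_000_015.0, N)    # epoch k: range grew by 15 m

# Temporal difference: N cancels, leaving only the range change.
delta_range_m = (phi_curr - phi_prev) * L1_WAVELENGTH
print(delta_range_m)   # ≈ 15.0
```

The time-differenced observable is therefore usable without integer ambiguity resolution, which is what permits the low-cost implementation.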

As a result, relatively low-grade micro-electro-mechanical system (MEMS) inertial sensors can be used. This reduces the cost of the inertial sensor component by at least an order of magnitude compared with the tactical- and navigation-grade sensors employed in existing commercial off-the-shelf solutions.

The multi-sensor fusion solution was developed and evaluated over the following configurations: GPS/INS, GPS/INS/clock, GPS/INS/clock/ODO, GPS/INS/clock/ODO/Video, and GPS/INS/clock/ODO/Lidar. All five configurations were implemented and initially evaluated in a high-fidelity simulation environment.

Based on the simulation analysis, the top two solution options were identified as GPS/INS/clock/ODO and GPS/INS/clock/ODO/Lidar. We then tested these two further by collecting experimental data in downtown Detroit, Michigan.

System Architecture
Figure 1 (see inset photo, above right) illustrates the overall architecture of the multi-sensor fusion algorithm. Computations are implemented as a recursive procedure that is executed every time a new measurement is provided by the inertial measurement unit (IMU). When a new IMU measurement becomes available, the system updates INS navigation outputs by propagating this new measurement through the INS navigation mechanization. This includes attitude determination, coordinate transformation, gravity compensation (implemented only if INS alignment has been completed) and, finally, integration into velocity and position navigation outputs. INS position outputs serve as the overall output of the system.
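The mechanization steps above can be sketched in code. This is a deliberately simplified planar illustration, not the authors' implementation; gravity compensation is trivial here because the example stays in the horizontal plane:

```python
import numpy as np

def ins_step(state, gyro_z, accel_body, dt):
    """One simplified planar strapdown step: update heading from the
    z-gyro (attitude determination), rotate body accelerations into the
    navigation frame (coordinate transformation), and integrate into
    velocity and position. Gravity compensation is omitted because the
    motion is horizontal in this example."""
    heading, vel, pos = state
    heading = heading + gyro_z * dt                 # attitude update
    c, s = np.cos(heading), np.sin(heading)
    C_bn = np.array([[c, -s], [s, c]])              # body-to-nav rotation
    accel_nav = C_bn @ accel_body                   # coordinate transform
    vel = vel + accel_nav * dt                      # velocity integration
    pos = pos + vel * dt                            # position integration
    return heading, vel, pos

# Constant 1 m/s^2 forward acceleration, no rotation, 100 steps of 10 ms.
state = (0.0, np.zeros(2), np.zeros(2))
for _ in range(100):
    state = ins_step(state, gyro_z=0.0,
                     accel_body=np.array([1.0, 0.0]), dt=0.01)
print(state[1], state[2])   # velocity ≈ [1, 0] m/s
```

In the real system this loop runs at the IMU rate, and its position output is what the Kalman filter corrects.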

. . .

System Components
Let’s take a closer look at the key components used in our candidate systems.

Inertial Navigation. As shown in Figure 1, inertial navigation implements a standard strapdown INS mechanization routine that includes attitude determination, coordinate transformation, gravity compensation, and integration steps.

. . .

GPS. The GPS position solution is provided directly by the GPS receiver. Additional processing of GPS measurements includes velocity estimation to initiate the INS’s alignment, and an internal integrity check. GPS velocity is derived from carrier phase changes over time, using the carrier phase from the current and previous measurement epochs.
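One common way to realize such a velocity estimate (a sketch under idealized, noise-free assumptions, not necessarily the receiver's exact algorithm) is a least-squares fit of the time-differenced ranges over the visible satellites, solving jointly for velocity and receiver clock drift:

```python
import numpy as np

def velocity_from_delta_phase(unit_vectors, delta_ranges_m, dt):
    """Least-squares velocity from time-differenced carrier phase.
    Each delta-range is modeled as u_i . (v * dt) plus a common
    clock-drift term (satellite motion and noise are ignored here)."""
    H = np.hstack([unit_vectors, np.ones((len(unit_vectors), 1))])
    x, *_ = np.linalg.lstsq(H, np.asarray(delta_ranges_m), rcond=None)
    return x[:3] / dt, x[3] / dt   # velocity (m/s), clock drift (m/s)

# Synthetic check: 5 satellites, true velocity [10, 2, 0] m/s, 0.5 m/s drift.
rng = np.random.default_rng(0)
u = rng.normal(size=(5, 3))
u /= np.linalg.norm(u, axis=1, keepdims=True)   # line-of-sight unit vectors
v_true, drift, dt = np.array([10.0, 2.0, 0.0]), 0.5, 1.0
drho = u @ (v_true * dt) + drift * dt
v_est, drift_est = velocity_from_delta_phase(u, drho, dt)
print(v_est, drift_est)
```

With at least four satellites the system is solvable; extra satellites over-determine it and average down noise.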

. . .

Odometer. An internal wheel speed sensor provides the odometer measurements. The wheel-speed counter value is periodically read and used to compute ODO-based navigation solution. Because the counter is read directly from the sensor, the odometer measurements are precisely time-stamped, which is critical for their efficient use in the sensor fusion algorithm.
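A minimal sketch of the counter-to-speed conversion follows; the wheel circumference and the 16-bit counter width are assumptions for illustration, not values from the test vehicle:

```python
WHEEL_CIRCUMFERENCE_M = 1.94   # assumed wheel circumference, m
PULSES_PER_ROTATION = 5        # wheel-speed sensor resolution (from the test setup)

def odo_speed(count_prev, count_curr, t_prev, t_curr, counter_max=65536):
    """Speed from two time-stamped wheel-counter reads; the modulo
    handles counter wrap-around (16-bit counter assumed)."""
    dcount = (count_curr - count_prev) % counter_max
    dist_m = dcount / PULSES_PER_ROTATION * WHEEL_CIRCUMFERENCE_M
    return dist_m / (t_curr - t_prev)

# 50 pulses in one second -> 10 wheel rotations -> 19.4 m travelled.
print(odo_speed(100, 150, 0.0, 1.0))   # ≈ 19.4 m/s
```

The precise time stamps mentioned above enter through `t_prev` and `t_curr`; time-stamp error translates directly into speed error.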

. . .

Laser Scanner. . . . Ranging pulses are sent in the scanner measurement angular range. Hence, each scan image is represented by the angular array and a corresponding ranging measurement for each angle. The current system implementation uses a laser scanner with an angular range from 0 to 180 degrees and angular resolution of 1 degree. The laser is mounted on the front edge of the roof of the test vehicle with a forward orientation and scans an approximately horizontal plane.
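Each scan can be converted from its angular/range representation into Cartesian points in the scanner frame; a minimal sketch:

```python
import math

def scan_to_points(ranges_m, angle_start_deg=0.0, angle_step_deg=1.0):
    """Convert one lidar scan (one range per angular step) into (x, y)
    points in the scanner frame."""
    points = []
    for i, r in enumerate(ranges_m):
        a = math.radians(angle_start_deg + i * angle_step_deg)
        points.append((r * math.cos(a), r * math.sin(a)))
    return points

# Three sample beams at 0, 90, and 180 degrees, all returning 5 m.
pts = scan_to_points([5.0, 5.0, 5.0], angle_start_deg=0.0, angle_step_deg=90.0)
print(pts)   # ≈ [(5, 0), (0, 5), (-5, 0)]
```

For the scanner described here, a full scan would be 181 ranges at one-degree steps over 0 to 180 degrees.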

. . .

Integrity Check
As mentioned earlier, we apply an integrity check to remove measurement outliers. An internal GPS integrity check is applied to the GPS data, and an INS-based integrity check is applied to the odometer and laser scanner data.

If the internal GPS integrity check detects a failure or an insufficient number of satellites (fewer than five), then an INS-based integrity check is also implemented. The latter integrity check predicts measurements using inertial data, compares predicted measurements with actual sensor measurements, and removes those measurements for which large discrepancies exist between predicted and measured values.
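The INS-based check thus amounts to innovation gating; a minimal sketch with illustrative values and an illustrative threshold:

```python
def ins_integrity_check(predicted, measured, threshold):
    """Keep only measurements whose discrepancy from the INS-based
    prediction is within the threshold; the rest are rejected as
    outliers (values and threshold here are illustrative)."""
    return [(p, m) for p, m in zip(predicted, measured)
            if abs(m - p) <= threshold]

pred = [10.0, 20.0, 30.0]         # INS-predicted measurements
meas = [10.2, 27.5, 29.9]         # sensor measurements; second is an outlier
kept = ins_integrity_check(pred, meas, threshold=1.0)
print(kept)
```

In practice the threshold would be tied to the expected measurement and prediction uncertainties rather than fixed.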

. . .

Kalman Filter
The Kalman filter fuses reference data and inertial data to estimate inertial drift terms or, equivalently, inertial error states. At each IMU update (see Figure 1), the filter first implements a prediction step where the IMU error states are predicted based on their previous values and a time-propagation model, also referred to as the system dynamic-state model.

Next, the estimation step is carried out. If at least one reference measurement is available after the integrity check, reference measurements are fused with predicted system states (that is, predicted INS error states) to compute updated estimates of the system states.

If no reference measurements are available, state predictions simply become state estimations. Generally speaking, the estimation step uses new information that is provided by the reference measurements to update predicted system states.
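The predict/estimate cycle described above can be sketched with the textbook Kalman equations; this is a scalar illustration, not the authors' full error-state model:

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Prediction step: propagate the error-state estimate and its
    covariance with the system dynamic-state model F."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Estimation step: fuse a reference measurement z with the
    predicted state via the Kalman gain."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# One-dimensional illustration: a single bias state, directly observed.
x, P = np.array([0.0]), np.array([[1.0]])
F, Q = np.eye(1), np.eye(1) * 0.01
H, R = np.eye(1), np.eye(1) * 0.25

x, P = kf_predict(x, P, F, Q)                   # always runs each IMU epoch
x, P = kf_update(x, P, np.array([0.8]), H, R)   # runs only if a reference
print(x, P)                                     # measurement passed integrity
```

When no measurement survives the integrity check, the update call is simply skipped and the prediction stands, exactly as described above.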

. . .

Test Results
We evaluated the multi-sensor fusion solution with actual data collected in test drives through downtown Detroit, where streets were lined with multi-story buildings that obstructed GPS signals, often reducing visibility to fewer than four satellites.

Much of the testing was performed in two-day blocks; to maximize coverage of the changing satellite constellation, 10-hour test sessions were shifted by four hours between neighboring days. The data was processed off-line by the position estimation algorithm running in commercial computational software on a PC.

The vehicle/equipment setup includes:

  • A high-precision dual-frequency GPS receiver capable of receiving corrections from the Wide Area Augmentation System (WAAS)
  • High-sensitivity GPS receiver 
  • External oven controlled crystal oscillator (OCXO) clock connected to the high-precision receiver for clock aiding 
  • A lower-cost IMU characterized by a gyro drift of 100 degrees per hour and an accelerometer bias of two milli-g 
  • Internal wheel speed sensor, with resolution of five pulses per rotation 
  • Lidar 
  • An integrated GNSS/inertial system coupled with a ring laser gyro for position reference when conditions allowed.

. . .

Conclusion
We have presented a multi-sensor fusion solution for satisfying the accuracy goals of cooperative vehicle safety applications in dense urban environments, with the system designed to operate with a lower-cost IMU. Two configurations of the system architecture — an integrated GPS/INS/clock/ODO and a GPS/INS/clock/ODO/lidar — were tested in urban environments in downtown Detroit. Results demonstrate that the fused multi-sensor solution drastically improves positioning accuracy compared to GNSS-only position estimation and met accuracy goals for the majority of test trials. Future work should focus on decreasing the cost of the system while maintaining and/or enhancing the performance goals via a trade-off calculation of the optimal sensor set based on performance-per-cost.

For the complete story, including figures, graphs, and images, please download the PDF of the article, above.

Acknowledgment
This article is based on a paper presented at the Institute of Navigation ION GNSS 2010 conference in Portland, Oregon.

Additional Resources
[1] Farrell, J. L., “GPS/INS-Streamlined,” NAVIGATION, Journal of the Institute of Navigation, Vol. 49, No. 4, Summer 2002
[2] Soloviev, A., D. Bates, and F. van Graas, “Tight Coupling of Laser Scanner and Inertial Measurements for a Fully Autonomous Relative Navigation Solution,” NAVIGATION, Journal of the Institute of Navigation, Vol. 53, No. 3, 2007
[3] Soloviev, A., and M. Miller, “Navigation in Difficult Environments: Multi-Sensor Fusion Techniques,” NATO RTO Lecture Series, Spring 2010
[4] van Graas, F., and A. Soloviev, “Precise Velocity Estimation Using a Stand-Alone GPS Receiver,” NAVIGATION, Journal of the Institute of Navigation, Vol. 51, No. 4, 2004

Manufacturers

The GPS receiver with WAAS corrections was an OEMV ProPak-V3-L1 from NovAtel Inc., Calgary, Alberta, Canada. The integrated GNSS/inertial unit was NovAtel’s SPAN-SE with an HG1700 AG58 ring laser gyro from Honeywell Aerospace, Phoenix, Arizona. The high-sensitivity receiver was an LEA-5T from u-blox AG, Thalwil, Switzerland. The lower-cost IMU was an MMQ-50, from Systron Donner Inertial, Concord, California, USA. The laser scanner was the LMS-200 Lidar, from SICK AG, Waldkirch, Germany. Matlab software from The MathWorks, Inc., Natick, Massachusetts, USA, was used to process the results.

Copyright © 2017 Gibbons Media & Research LLC, all rights reserved.
