Timing on the Fly

Sensor fusion is a predominant feature of modern navigation systems. To integrate navigation systems with other sensors, the spatial and temporal relationship of the sensor systems must be defined and calibrated.

Although system developers have employed a variety of schemes with great success in calibrating the spatial relationship between sensors, temporal integration is often less straightforward. Much of this difficulty arises from the problem of characterizing the timing of sensor measurements. Sensors sampled at differing, out-of-phase, or even irregular frequencies are commonly encountered in temporal integration and further increase the difficulty.

For the spatial relationship, a strapdown inertial platform often provides a simple approach to calibrating the various sensors using a single framework. However, an analogous approach of fixing the timing of sensor measurements to a single temporal framework is more difficult, if not impossible, to achieve in many scenarios. This is particularly true when attempting to integrate consumer off-the-shelf devices, which often offer limited synchronization capabilities.

For most fused navigation systems, the task of temporal integration is handled within the design of the devices themselves. For example, an integrated inertial navigation system (INS)/GPS might be designed to sample the inertial sensors at a synchronized frequency division of the GPS one-pulse-per-second (1PPS) signal.

However, the proliferation of ad hoc aided navigation systems suggests that such cohesive low-level device design may give way to more modular, higher-level synchronization schemes.

Such was the case when the University of Florida’s Unmanned Aerial Vehicle Research Group went about developing a small (one-kilogram) payload for the NOVA II small UAV platform, consisting of a consumer-grade digital single-lens reflex (DSLR) camera with an integrated INS/GPS system.

This UAV platform is the result of over a decade of progress by an interdisciplinary group at the University of Florida and was developed principally for state and federal agencies to conduct infrastructure and environmental monitoring. The principal purpose of the payload is to produce high-resolution, directly georeferenced imagery, which requires knowing the position and attitude of the camera at the moment of exposure.

This article describes the development and analysis of a modular synchronization scheme that should elucidate the principal techniques required for such a system. In this case, very little access to low-level sensor circuitry or control logic was available. Instead, we were left to integrate largely black-box sensor systems using a modular approach that relied on simple external timing signals.

Time versus Timing
Historically, time has simply been defined in terms of periodic events observable in nature. The question, “When did this measurement occur?”, might be casually answered by checking the hands of the grandfather clock in the hall, glancing at the readout on a digital wristwatch, or perhaps examining a recent GPS packet.

All of these reflect the cumulative phase angle of a periodic event, whether from pendular motion, a crystal oscillator, or the resonant frequency of cesium atoms. Such periodic signals establish a basis for measuring time.

From these we can establish the “absolute” timing of an event against a given temporal reference (e.g., it occurred at 12:00:00.00 UTC), the “relative” timing of two events (e.g., event A occurred three seconds after event B), or the duration of an event (e.g., it lasted for 3.00 seconds).

Outside the GNSS community, GPS is far better recognized for its navigational capabilities than for the unprecedented access it provides to a globally available timing and synchronization framework. The ability to reference events to GPS time, and in turn to Coordinated Universal Time (UTC) and other timing frameworks, is the best available means by which we can measure time against a standard reference.

To emphasize again, however, underlying all timing systems is the simple notion of periodicity and cumulative phase angle as the “measuring stick” of temporal relationships. The idea of synchronization is not limited to the concept of simultaneity, nor does the time need to be indicated with respect to some standard reference.

These two concepts are core to modular synchronization.

Synchronization and Georeferencing
Prior research has established the importance of synchronization in direct georeferencing and given rise to a number of system integration techniques. The previous version of the NOVA II, the NOVA I, carried a payload that employed a synchronization scheme that relied on timing both the arrival of navigation data packets at the host computer and the camera trigger commands.

This design was reported to have a synchronization error of 87 milliseconds. (See the paper by S. Bowman listed in the Additional Resources section near the end of this article.)

A similar implementation using the same architecture found a synchronization error of as much as 333 milliseconds. (See the paper by H. Chao et alia, Additional Resources.)

This method and the resulting wide discrepancy in sensor synchronization error highlight the need for direct measures of the time of sensor sampling. Such timing errors translate into substantial accuracy problems for meeting NOVA II mission georeferencing requirements, as will be discussed in a later section.

Relying on data packets can be problematic due to variable processing speeds and transmission time of the packet on the sensor end, as well as indeterminate timing behaviors from software-based synchronization on a non-real-time operating system.

Particularly for commercial-off-the-shelf (COTS) cameras, features such as white balancing and autofocus can produce stochastic and unquantified delays between trigger and exposure. Without a feedback mechanism, timing such a system is an intractable problem.

A more rigorous approach commonly implemented is the use of a data acquisition (DAQ) card in the host computer, which directly records timing signals. However, a DAQ card is impractical on a small UAV platform due to its size and the lack of an available interface with the host computer.

Nonetheless, the use of DAQ cards has been shown to achieve synchronization accuracies ranging from 0.4 milliseconds to 5 microseconds. (For details, see the articles by B. Li et alia and C. Toth et alia, respectively, listed in the Additional Resources section.)

The task of synchronizing the camera to the navigation system does not require that we know the UTC time at which the camera exposure occurs. Rather, the quantity of interest is the navigational state of the system at that time, that is, the position and orientation of the camera at the instant of exposure, although this does not preclude knowing the reference time.

Furthermore, we do not need to sample the navigation sensors at the precise moment of the camera exposure, but rather we can estimate the state at the time of exposure from adjacent navigational states, given an appropriate model or assumptions.
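
As a minimal sketch of that idea, the following C fragment estimates the navigation state at the exposure time from the two samples that bracket it, assuming a simple linear model between samples. The structure and field names are illustrative only, and linearly blending Euler angles is acceptable only when the attitude change between adjacent samples is small.

#include <stdio.h>

/* Hypothetical navigation sample: a time on the common clock plus the
 * position and attitude at that instant (names are illustrative). */
typedef struct {
    double t;        /* sample time in seconds           */
    double pos[3];   /* e.g., local-level x, y, z (m)    */
    double att[3];   /* e.g., roll, pitch, yaw (rad)     */
} NavSample;

/* Linearly interpolate the navigation state at time t_exp, which must
 * lie between samples a and b. */
static NavSample interpolate_state(NavSample a, NavSample b, double t_exp)
{
    NavSample out;
    double w = (t_exp - a.t) / (b.t - a.t);   /* 0 at a, 1 at b */
    out.t = t_exp;
    for (int i = 0; i < 3; ++i) {
        out.pos[i] = a.pos[i] + w * (b.pos[i] - a.pos[i]);
        out.att[i] = a.att[i] + w * (b.att[i] - a.att[i]);
    }
    return out;
}

int main(void)
{
    NavSample a   = { 0.000, { 0.0, 0.0, 100.0 }, { 0.00, 0.00, 1.00 } };
    NavSample b   = { 0.010, { 0.3, 0.1, 100.1 }, { 0.01, 0.00, 1.02 } };
    NavSample cam = interpolate_state(a, b, 0.004);  /* exposure time */
    printf("position at exposure: %.3f %.3f %.3f\n",
           cam.pos[0], cam.pos[1], cam.pos[2]);
    return 0;
}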

With these two considerations, we proceed to develop the idea of a purely relative timing system with which we can synchronize sensors.

Relative Timing and Synchronization
To begin, some periodic signal must serve as a basis for measurement. Given our rather stringent weight limitations on the UAV, both the grandfather and atomic clocks seemed impractical, so we settled on a crystal oscillator.

Because the UAV system makes all timing measurements with respect to this oscillator, its frequency will set the maximum obtainable resolution for our synchronization system, denoted fclock. The output of the oscillator is used to drive a digital counter, which increments one integer value for each oscillation of the crystal. A measurement is made by reading off the value of the digital counter at the moment of interest.
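
To make the mechanism concrete, the short C sketch below models the counter read-out: two counter values captured at events of interest are differenced and converted into seconds. The 16 MHz crystal frequency and the counter values are assumptions used only for illustration; the resolution of any such measurement is one clock period, 1/fclock.

#include <stdint.h>
#include <stdio.h>

#define F_CLOCK_HZ 16000000UL   /* assumed crystal frequency (16 MHz) */

/* Convert a difference of two counter readings into seconds. On real
 * hardware the readings come from a free-running hardware timer driven
 * directly by the crystal oscillator. */
static double ticks_to_seconds(uint32_t start, uint32_t end)
{
    return (double)(end - start) / (double)F_CLOCK_HZ;
}

int main(void)
{
    uint32_t t_event_a = 1200000;   /* counter value captured at event A */
    uint32_t t_event_b = 1216000;   /* counter value captured at event B */
    printf("event B occurred %.6f s after event A\n",
           ticks_to_seconds(t_event_a, t_event_b));
    return 0;
}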

. . .

Synchronization in Action
In practice, this simple synchronization scheme is adaptable to a number of scenarios, and depends largely on appropriately selecting the master and slave sensors. A typical application would be to reference the high-rate inertial sensors to the GPS epoch, allowing each inertial measurement to be given as a fraction of the GPS second. In this synchronization scenario, the INS is the slave and the GPS is the master.

However, we do not need to reference all of the sensors to the GPS epoch. For example, we can simplify implementation of the interpolation scheme if we reference the camera exposure to the INS samples, because these have the highest temporal resolution of the navigation parameters of interest. In this case, the camera is the slave sensor and it is synchronized to the INS.
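
The sketch below, again in C with made-up counter values, shows the essential arithmetic of such a master/slave pairing: the counter readings captured at two consecutive master events (INS samples here) and at the slave event (the camera exposure) are combined into a fraction of the master interval. Because the result is a ratio of tick counts, a constant error in the oscillator frequency cancels out.

#include <stdint.h>
#include <stdio.h>

/* Express a slave event (e.g., a camera exposure) as a fraction of the
 * interval between the two master events (e.g., consecutive INS samples
 * or GPS 1PPS edges) that bracket it. All three values are readings of
 * the same free-running counter. */
static double slave_fraction(uint32_t master_prev,
                             uint32_t master_next,
                             uint32_t slave_event)
{
    return (double)(slave_event - master_prev) /
           (double)(master_next - master_prev);
}

int main(void)
{
    /* Illustrative counter captures only. */
    uint32_t ins_k  = 5000000;   /* INS sample k       */
    uint32_t ins_k1 = 5160000;   /* INS sample k + 1   */
    uint32_t expose = 5100000;   /* camera exposure    */
    printf("exposure occurred %.4f of the way through the INS interval\n",
           slave_fraction(ins_k, ins_k1, expose));
    return 0;
}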

. . .

The Flying Burredo
We implemented the synchronization scheme outlined here on a custom circuit board, dubbed the “Burredo”. The Burredo is based on an eight-bit microcontroller with an external crystal oscillator to improve the stability of the clock frequency.

We used the microcontroller’s built-in timing facilities to implement the relative timing scheme described previously. Another advantage of using a microcontroller is its built-in serial communication facilities, which allow the synchronization data to be passed to the host computer.

The synchronization signals utilized were the external “Sync Out” signal specified in the INS/GPS device’s user manual and the X-sync flash circuitry common to most DSLR cameras. Synchronization signals from most devices can be accommodated by using simple signal conditioning that is included onboard the Burredo.
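
Although the firmware itself is not reproduced here, its basic structure can be sketched in C as follows. The two stub functions stand in for the microcontroller’s input-capture and serial drivers, and the message framing is purely illustrative; on the real device the counter value is latched in hardware at the sync edge, so interrupt latency does not corrupt the timestamp.

#include <stdint.h>
#include <stdio.h>

enum { SRC_INS_SYNC_OUT = 1, SRC_CAMERA_XSYNC = 2 };

/* Stubs standing in for platform-specific drivers. */
static uint16_t timer_capture_read(void) { return 0x1234; }   /* latched counter */
static void uart_send(const uint8_t *buf, int len)            /* serial output   */
{
    for (int i = 0; i < len; ++i) printf("%02X ", buf[i]);
    printf("\n");
}

/* Handle one conditioned sync edge (INS Sync Out or camera X-sync):
 * read the counter value latched at the edge, frame it with the source
 * identifier, and send it to the host computer, which pairs it with
 * the corresponding sensor data record. */
static void on_sync_edge(uint8_t source)
{
    uint16_t ticks = timer_capture_read();
    uint8_t msg[4] = { 0xA5, source,
                       (uint8_t)(ticks >> 8), (uint8_t)(ticks & 0xFF) };
    uart_send(msg, sizeof msg);
}

int main(void)
{
    on_sync_edge(SRC_INS_SYNC_OUT);   /* e.g., an INS Sync Out edge  */
    on_sync_edge(SRC_CAMERA_XSYNC);   /* e.g., a camera X-sync edge  */
    return 0;
}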

Total materials cost for the Burredo was less than $100. It weighs less than 20 grams, and measures just 5 × 3 × 1.5 centimeters.

. . .

In conclusion, this article has described the basic operation of a simple, flexible synchronization architecture that facilitates the temporal integration of sensors. It does not rely on the availability of the GNSS satellite constellation, although the GPS 1PPS signal does allow ready access to a standard time reference.

The Burredo device is highly modular, interfacing with the sensors using simple logic-level synchronization signals. It independently calculates the synchronization information and transmits it to the host computer over a separate data bus.

An accuracy of less than a microsecond was demonstrated in a benchmarking experiment. Using the NOVA II platform and payload as an example, our investigation has shown that the errors due to synchronization were orders of magnitude less than the errors of the relevant sensors, thus confirming its utility in directly georeferencing remotely sensed imagery.

Additional Resources
[1] Bowman, S., Design and validation of an autonomous rapid mapping system using a small UAV, Master’s Thesis, University of Florida, Gainesville, Florida, USA, 2008
[2] Chao, H., and M. Baumann, A. Jensen, Y.Q. Chen, Y. Cao, W. Ren, and M. McKee, “Band-reconfigurable Multi-UAV-based Cooperative Remote Sensing for Real-time Water Management and Distributed Irrigation Control,” Proceedings of the 17th World Congress of the International Federation of Automatic Control, July 6-11, 2008, Seoul, Korea
[3] Lewandowski, W., and J. Azoubib, W.J. Klepczynski, “GPS: Primary Tool for Time Transfer,” Proceedings of the IEEE, 87(1):163-172, 1999
[4] Li, B., and C. Rizos, H.K. Lee, and H.K. Lee, “A GPS-slaved Time Synchronization System for Hybrid Navigation,” GPS Solutions, 10:207-217, 2006
[5] Skaloud, J., “Problems in Direct-Georeferencing by INS/DGPS in the Airborne Environment,” ISPRS Workshop on ‘Direct versus indirect methods of sensor orientation’, Commission III, Working Group III/1, Nov. 25-26, 1999, Barcelona, Spain
[6] Toth, C., and S.W. Shin, D.A. Grejner-Brzezinska, and J.H. Kwon, “On Accurate Time Synchronization of Multi-Sensor Mapping Systems,” Journal of Applied Geodesy, 2(3):159–166, 2008
[7] Xsens, MTi-G User Manual, Xsens, Inc., Enschede, The Netherlands, 2009

