Survey and Mapping

May 1, 2007

Ruth Neilan: The Global Grid Master

Ruth Neilan outside the history library at the GeoForschungsZentrum in Potsdam, Germany.

When Ruth Neilan was named director of what is now known as the Central Bureau of the International GNSS Service (IGS), she had an immense undertaking before her.

A voluntary civilian federation, the IGS compiles and analyzes GPS (and more recently, GLONASS) satellite data. From these, the IGS creates highly accurate products — such as precise satellite orbit and clock files — and makes them freely available to engineers, scientists, and researchers all over the world.

Those users improve the accuracy of their own GNSS positioning and timing results — based on observations from the same set of satellites — by substituting IGS products for the broadcast data.

Originally known as the International GPS Service for Geodynamics, the standardized global tracking network was initiated by NASA and NOAA in the late 1980s. Today, the IGS Central Bureau is managed by NASA’s Jet Propulsion Laboratory (JPL) at Caltech in Pasadena, California, where Neilan has worked for nearly 25 years. IGS has 200 participating organizations — mostly public, government, and research agencies — with upwards of 400 permanent ground stations and data and analysis centers in more than 80 countries.

But in the early 1990s, all of that was far in the future. Neilan and the IGS had to create the building blocks themselves: setting standards, agreeing upon specific formats for data collection and processing, deciding how much to log to guarantee the precise results they needed. They succeeded in great part because of Neilan’s passion and optimism that GNSS technologies could — and do — bridge geopolitical boundaries.

“Through IGS, developing countries can join an international effort. People are very enthusiastic about contributing,” she said. “Off-the-shelf products have developed to such a point that they can leapfrog into the highest technology that’s available. The difficulty we have is getting enough resources to put their efforts on solid ground and ensure sustainability.”

Never Say “Never”

Neilan’s internationalist bent was established early in life.

She recalls crawling beneath the drafting tables in her father’s engineering firm in Somerset, a small Appalachian town in southwest Pennsylvania, and losing herself in books. She especially liked the one with fascinating photographs of Asia and, in fact, went on to study Mandarin Chinese for five years.

As a child, she was convinced that she would “never” be an engineer. But blessed with a surefire sense of direction that she calls “Zen navigation,” Neilan loved reading maps and making precise measurements.

At college, she gravitated to The Pennsylvania State University’s engineering technology program, earned an associate’s degree, and became a surveyor. But she still wasn’t convinced that engineering should be her life’s work, so she took a detour.

A two-year globe-spanning tour started her on the path that combined her passion — Asia and the world — with what turned out to be her calling: engineering and the development of GNSS.

On her sojourn Neilan crossed Turkey and Afghanistan, worked as an English editor for a Taipei magazine, climbed to the Mount Everest base camp, and attained an altitude of 19,200 feet — without oxygen — while crossing into the Rolwaling Valley along the Tibetan border. Along the way, Neilan realized she was good at engineering.

She returned to the United States to earn a bachelor’s degree in civil and environmental engineering plus a minor in Asian studies, graduating with distinction from the University of Wisconsin at Madison in 1983.

Although the buzz about GPS began percolating in the early 1980s, nothing was taught at the university level. After graduating, Neilan visited a friend at the Jet Propulsion Laboratory at California Institute of Technology in Pasadena, California. Hoping for nothing more than ideas for a thesis topic, Neilan arrived wearing flip-flops and shorts. She met with several people working on GPS and, by the end of the day, she had a job.

She put herself on what she laughingly refers to as the first in a series of “five-year plans,” working for JPL while pursuing her master’s degree. She finished her thesis, “An Experimental Investigation of the Effect of GPS Satellite Multipath,” in 1986.

Growing the Global Grid

Neilan’s first major project out of graduate school put her at the hub of the emerging GPS infrastructure. JPL assigned her to get the Deep Space Network’s first GPS receivers and meteorological instrumentation up and running. As that project unfolded, she also managed seminal projects measuring crustal deformation, tectonic motion, and earthquake fault monitoring using GPS techniques.

Starting in 1990, a planning group of five leaders — including Neilan and her mentor at JPL, Bill Melbourne — began meeting to plan the way forward for a global network. Neilan led implementation and operation of ground data systems for the GPS Ground Tracking Network. At about the same time, she also took on the separate task of coordinating the sub-network of six GPS tracking stations required for mission support of the GPS precise orbit determination experiment flown on the satellite TOPEX/Poseidon.

In 1992, she became GPS Operations Manager for the NASA/JPL global network and for scientific support of regional experiments, overseeing project management and technical direction of field engineering for NASA scientists and geodetic tasks at the University NAVSTAR Consortium (UNAVCO) in Boulder, Colorado.

One year later, she was named director of IGS.

The Global Grid and Beyond

Neilan serves on the advisory board of the U.S. Positioning, Navigation and Timing Executive Committee, which addresses such issues as policy, planning, management, services, capabilities, and funding.

“This board provides an additional assurance that there is a voice for users,” Neilan says. “The board includes international people, which emphasizes the global nature of GPS.” It also underscores the value that the executive committee places on the international community as part of the process, she adds.

Since 2005 she also has served as vice chair of the Global Geodetic Observing System, which provides continuous, precise observations of the three fundamental geodetic observables and their variations: the Earth’s shape, gravity field, and rotational motion. She says a stable, sustainable global reference frame is crucial for all Earth observation and for practical applications ranging from agriculture to the dynamics of atmosphere and the oceans.

The system is beginning to help scientists get their arms around the complexities of seismic activity and climate variation. “Natural hazard detection and mitigation is a really important use of GPS and continuous networking,” Neilan says. “Less than a day after the earthquake that triggered the tsunamis in the Indian Ocean in 2004, we could see that our IGS station in Singapore had moved almost an inch. Our IGS stations in India also had moved.”

Neilan’s zest for travel makes her an ideal fit for her job, which requires frequent trips to developing countries. She lights up when talking about bringing Africa’s 50-plus nations into the grid. This effort, known as AFREF (the African Reference Frame), includes vertical data as well as gravity observations. However, she emphasizes that planning on a continental scale does not have to be “top down or all at once.” Instead, she focuses on helping newcomers install GPS systems that meet standards and follow conventions used by the rest of the world.

If Neilan had just one magic GNSS wish, it would be that everyone understood the importance of tying into the global grid, known officially as the International Terrestrial Reference Frame. “It’s easy to remotely sense an area, but often contractors or consultants set up their own little local reference system,” she explains. “Then, when they try to link up or extend their project, it has no relationship to the country’s grid, much less the international grid. It’s been very hard to get this across to the mapping and GIS people.”

NASA’s return to space exploration opens up the possibility of extending GPS-like constellations to the Moon and to Mars. “We need to have a way of commonly and seamlessly referencing all the vehicles,” Neilan says. “An extended coordinate timing system would reduce errors.”

Neilan’s daily contacts with colleagues around the globe reinforce her optimism that GNSS technologies can bridge geopolitical boundaries.

“Open availability [of data] is seeding so much innovation and fostering better understanding of our world,” she says. “IGS is the global sandbox. Everybody can have fun and play.”

Neilan’s coordinates:
34° 12′ 5.7" N
118° 10′ 27.47" W
h = 372 meters

Ruth Neilan’s Many GNSS Hats

  • Vice Chair, Global Geodetic Observing System (GGOS) since 2005
  • Director, Central Bureau of the International GNSS Service (IGS), since 1993
  • Advisory board, US Positioning, Navigation and Timing (PNT) Executive Committee
  • Ad Hoc Strategic Committee on Information and Data (SCID), International Council for Science (ICSU)
  • International Committee on GNSS (ICG), representing the International Association of Geodesy (IAG)
  • Executive Committee, International Association of Geodesy
  • IGS website http://igscb.jpl.nasa.gov

COMPASS POINTS

Engineering Specialties
Land surveying, geodetic surveying, and civil and environmental engineering.

GNSS Mentor
Bill Melbourne at the Jet Propulsion Laboratory, a visionary whose many accomplishments included leading GPS technology developments at JPL. “His support in shaping the GPS global network and working with our international partners laid the foundation for the IGS.”

Favorite Equation
Geoid separation: H = h - N

Heights obtained from GPS (h) are relative to the WGS84 ellipsoid. “To get the vertical position of a point, the separation between the geoid and the ellipsoid (N) must be known — with care. You can think of the geoid as the surface of the earth approximated by mean sea level.”
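
Her favorite equation can be applied in a couple of lines. A minimal sketch, in which the geoid separation value and the function name are purely illustrative (not taken from any real geoid model):

```python
def orthometric_height(h_ellipsoidal_m: float, geoid_separation_m: float) -> float:
    """H = h - N: convert a GPS-derived ellipsoidal height h (relative to
    the WGS84 ellipsoid) to an orthometric height H (roughly, height above
    mean sea level) using the local geoid separation N."""
    return h_ellipsoidal_m - geoid_separation_m

# Illustrative only: h = 372 m (Neilan's coordinates above); assume N = -35.4 m,
# a plausible magnitude for Southern California, not an authoritative value.
print(f"H = {orthometric_height(372.0, -35.4):.1f} m")
```

Note the sign convention: where the geoid sits below the ellipsoid, N is negative, so the orthometric height comes out larger than the ellipsoidal one.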

Her Compass Points
Neilan credits her family first – her “brilliant” jazz pianist husband, “terrific” kids, and parents who “are always there, even for GPS observations in American Samoa, Easter Island, and remote areas of Mexico.” Her other compass points are “the wonderful IGS/GNSS community of colleagues and friends,” and “The great game — soccer!”

Fell in love with GPS when . . .
. . . she realized that this revolutionary technology could provide precise position and navigation aids for anyone, anywhere, anytime. “(This) levels the playing field to an extent — especially in the developing countries.”

Knew GNSS had arrived when. . .
. . . CASA UNO ’88 deployed the first civilian global tracking network for orbit improvements to monitor crustal deformation at many stations in Central and South America. “This was the first large-scale study of crustal deformation. Now it is done continuously for hundreds of stations around the globe.”

Influences of Engineering on her Private Life
“Time motion studies in the kitchen!”

Popular Notions about GNSS that Most Annoy
“That GPS will operate according to specifications when actually it is far, far better than that. We need to develop the notion of performance-based capabilities and delivery of services.”

What’s next?
For Neilan and IGS, this includes integrating the upcoming signals from GALILEO, COMPASS, and other new GNSSes into the mix. They are aiming for seamless incorporation to take advantage of the signals “so that we can use this technology over several decades in order to better understand our changing world.” And, as always, IGS will continue its ongoing effort to promote dialog and provide a forum for the international use of GNSS.

Human Engineering is a regular feature that highlights some of the personalities behind the technologies, products, and programs of the GNSS community. We welcome readers’ recommendations for future profiles. Contact Glen Gibbons, glen@insidegnss.com.

September 1, 2006

BOC or MBOC? Questions and Answers

Global navigation satellite systems are all about timing. In a narrow sense, GNSS is technically a matter of how long the satellite signals take to reach a receiver. In a larger sense, it’s about designing global infrastructure systems that may not produce practical benefits for 5, 10, even 15 years or more.

During that time, a lot can happen. Technology changes. Electronics get more powerful and cheaper.

But GNSS equipment manufacturers and receiver designers live in the here and now. They face today’s challenges with today’s technology: how to receive signal indoors, under tree canopy, in urban canyons. How to get the most robust tracking capability out of a receiver — the most accurate, the most available capabilities.

And to accomplish these things at a price that prospective customers in the marketplace will see as offering true value.

Will the common civil signal be the binary offset carrier, or BOC(1,1) waveform as stated in a 2004 agreement between the United States and the European Union? Or, will it be the multiplexed BOC (MBOC) signal recommended by a technical working group set up under that agreement to examine further refinements to the design?

The signal decision involves benefit trade-offs for different types of GNSS receiver designs and will have widespread consequences for the products developed over the next 10, 20, or even 30 years.

Although the math and science underlying the discussion may seem esoteric, there’s nothing abstract or theoretical about the consequences of the decision. The selection of a common GPS/Galileo civil signal will profoundly shape the user experience, the engineering challenges, the business prospects and strategies of GNSS manufacturers and service providers, and even the political relations among nations for decades to come.

Our series started in the May/June issue with a “Working Papers” column that introduced the MBOC spreading modulation. Earlier this year, the GPS-Galileo Working Group on Interoperability and Compatibility recommended MBOC’s adoption by Europe’s Galileo program for its L1 Open Service (OS) signal and also by the United States for its modernized GPS L1 Civil (L1C) signal. The Working Papers column discussed the history, motivation, and construction of MBOC signals. It then showed various performance characteristics that the authors believe demonstrate MBOC’s superior performance and summarized their status in Galileo and GPS.

The May/June column also noted, “The United States is willing to adopt for GPS L1C either the baseline BOC(1,1) or the recommended MBOC modulation, consistent with what is selected for Galileo L1 OS.” Given this impartial U.S. government position, Inside GNSS believed it would be appropriate and useful to ask a panel of GNSS industry representatives their thoughts on the subject of a common civil GPS/Galileo signal waveform.

In the July/August issue of the magazine, therefore, in an article introduced by Tom Stansell, nine technology specialists from leading GNSS manufacturers began the discussion of technical alternatives, implications for receiver design, and significance for the products that reach the marketplace.

This month four more GNSS receiver designers join the manufacturers’ dialog, bringing the total to 13 panelists representing the perspectives of eight manufacturers — CMC Electronics, Japan Radio Company, NavCom Technology, Nemerix, NovAtel, Qualcomm, Rockwell Collins, and SiRF Technology — and three independent consulting engineers. Their biographies follow, along with their verbatim answers to questions posed by Inside GNSS.

(In the sidebar, “Old Questions, New Voices,” at the end of this article, we present the responses of our four latest panelists to the five questions answered in Part 1 of the series. The complete article, as well as the May/June “Working Papers” column, can be found on the Inside GNSS website at https://www.insidegnss.com/mboc.php.)

We also invited the authors of the original MBOC design recommendation to respond to the entire manufacturers dialog, an invitation that we made to the GNSS community in general — and one that still remains open. Their response immediately follows this article (see below). Javad Ashjaee, president and CEO of Javad Navigation Systems, who has been designing GNSS receivers for 30 years, also submitted some comments on the panelists’ discussion, which appear in this section as well.

In Part 2 of the Manufacturers Dialog on BOC and MBOC presented here, the panelists discuss performance of narrowband and wideband receivers under weak signal and multipath conditions and offer their opinions on the best signal option.

The Questions and Answers

Q: Would you expect any performance difference for your products if MBOC code is transmitted instead of BOC(1,1)?

Fenton – Yes, depending on the exact MBOC option used, we would expect a 21 to 33 percent reduction in code tracking error due to the increased effective chipping rate and a significant improvement in the detection and correction of close-in multipath interference.

Garin – Compared to a theoretically achievable performance with BOC(1,1) only, we would lose performance. Compared to the competition who will have to deal with the same signals in space, we won’t be at a disadvantage.

Hatch/Knight – We expect a modest improvement in multipath mitigation under moderately weak signal conditions, such as under foliage.

Kawazoe – We do not expect any advantage from MBOC.

Kohli/Turetzky – The biggest difference we would see would be in the availability due to the lower signal strength. However, it’s the same for everyone and if the benefit of higher accuracy for some applications is deemed to be of higher importance, we can still build a very high performance receiver on the MBOC signal.

Sheynblat/Rowitch – It is difficult to quantify the impact on indoor and urban canyon positioning accuracy due to a loss of 1 dB of sensitivity. However, it is straightforward to conclude that for successive sensitivity losses in 1 dB steps, measurement yield will also decrease in corresponding steps, eventually falling below the threshold for a successful GPS fix. This has a noticeably negative impact on the user experience for consumer and business applications. As an example, in some indoor signal scenarios we have seen 1 dB of improved sensitivity deliver an additional 20 percent improvement in successful fix rate.

Stratton – As stated earlier, we expect that we would obtain lower levels of multipath under ideal conditions, but the broader impact in off-nominal conditions requires further study. We do not anticipate a difference in user operational benefit for either choice.

Studenny – We prefer high performance signals and simple receiver architectures. Please note that developing an aviation receiver that uses BOC or MBOC will require the same development funds. As far as MBOC goes, we would take advantage of it.

Weill – Let’s consider Galileo signals as an example. When multipath is present, an MMT-equipped wideband receiver using a Galileo BOC(1,1) pilot with a total signal (data + pilot) E/No of 45 dB-Hz-sec and a secondary path 6 dB below the direct path can theoretically produce a worst-case RMS range error of about 63 centimeters at a secondary path delay of about 1.5 meters (the RMS error is over random secondary path phases). This peak error is reduced to about 50 centimeters using a TMBOC-50 pilot, which is a 21 percent reduction. For both signal types the error falls off rapidly at increased secondary path delays. At a path delay of 10 meters these RMS errors decrease to 25 centimeters and 18 centimeters, respectively. At path delays above 20 meters the errors approach those of a multipath-free signal, about 14 centimeters and 9 centimeters, respectively (essentially reaching the Cramer-Rao bounds for error due to thermal noise). In this region the TMBOC-50 signal gives about 33 percent less RMS error than BOC(1,1).

Q: A narrower bandwidth receiver designed for BOC(1,1) will be able to use only about 87.9 percent of the total power in the GPS MBOC pilot carrier or 81.8 percent of the total power in the Galileo MBOC pilot carrier (TMBOC-50 version). Do you see this as a disadvantage in any applications, especially in products/services provided by your company? If so, which ones?

Fenton – In the case of the GPS or Galileo MBOC, the effect of a 12 percent (or 18 percent in the case of Galileo) loss of signal strength would result in a 7 percent and 11 percent increase in RMS tracking error respectively. For example, if the RMS code tracking error of a channel locked to a narrow-band BOC(1,1) signal was 30 centimeters, then the expected tracking errors of the same hardware locked to the respective MBOC signals would increase to 32.1 and 33.3 centimeters assuming all other variables remained the same. We do not see this as being a significant disadvantage. The lower signal level will also slightly extend satellite acquisition times and time to first fix.
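
Fenton's arithmetic follows from the common assumption that thermal-noise code tracking error scales inversely with the square root of the usable signal power. A quick sketch of that model (the function name and 30-centimeter baseline are illustrative, taken from his example):

```python
import math

def scaled_tracking_error(base_error_m: float, power_fraction: float) -> float:
    """Under a thermal-noise model, code tracking error scales as
    1/sqrt(usable power), so losing power inflates the error by a
    factor of sqrt(1 / power_fraction)."""
    return base_error_m / math.sqrt(power_fraction)

base = 0.30  # 30 cm RMS error with full BOC(1,1) power, per Fenton's example
for label, frac in [("GPS MBOC (87.9% usable)", 0.879),
                    ("Galileo TMBOC-50 (81.8% usable)", 0.818)]:
    err = scaled_tracking_error(base, frac)
    print(f"{label}: {err * 100:.1f} cm  (+{(err / base - 1) * 100:.0f}%)")
```

The loop reproduces Fenton's roughly 7 and 11 percent error increases, and lands within a millimeter of his 32.1- and 33.3-centimeter figures.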

Garin – The disadvantage will be minor, at this level, as the fading effects are much more important than the absolute signal power. On the other side, the advantage will be immaterial for our current market. Nevertheless, we support the introduction of MBOC, as the theoretical penalty is minor, and the practical one will be insignificant.

Hatch/Knight – It is not likely that our company will build a narrowband receiver.

Kawazoe – We expect that the 12.1 percent and 18.2 percent power losses will not cause any serious problems. However, we would like BOC(1,1) to be adopted rather than MBOC for a simple and compact GPS receiver design.

Keegan – Signal level is sensitivity, and sensitivity is a significant part of consumer GPS. So, I believe that this 0.6 dB (or 0.9 dB) is more of an issue with consumer sets than high precision sets. However, in current consumer applications there are many places where architectural improvements would increase the signal-to-noise ratio (SNR) by more than these amounts, such as better antenna technology, more optimum signal sampling (sample rate and quantization), closed loop processing, etc. However, every dB is important.
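
The 0.6 dB and 0.9 dB losses Keegan cites are just the 87.9 percent and 81.8 percent usable-power fractions in the question, expressed in decibels. A straight 10·log10 conversion confirms the correspondence (the function name is illustrative):

```python
import math

def fraction_to_db(power_fraction: float) -> float:
    """Express a usable-power fraction as a power loss in decibels."""
    return -10.0 * math.log10(power_fraction)

print(f"GPS MBOC (87.9% usable):     {fraction_to_db(0.879):.2f} dB loss")  # ~0.56 dB
print(f"Galileo TMBOC-50 (81.8% usable): {fraction_to_db(0.818):.2f} dB loss")  # ~0.87 dB
```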

Kohli/Turetzky – In general, we fight for every tenth of a dB in every aspect of our system design. Giving up 1 dB in transmitted signal power is a concession, but will be mitigated by other processing gains. One dB will translate into additional penetration in a building. This can make a measurable difference in availability at the consumer level.

Stratton – It is not directly a disadvantage. We will produce receivers that utilize every waveform that adds value to our markets. The key factor for us is whether our users would achieve operational benefits by using modernized signals, and we do not perceive a difference in user benefit between these two alternatives.

Studenny – We develop wideband receivers and maximize performance as required. We would use all available signals in the most effective manner possible.

Weill – With today’s technology, a narrowband design is required in applications where the receiver must have low cost and low power consumption. If it must also be capable of operating in poor signal environments, the provider of such a receiver is likely to believe that every decibel counts and therefore be in favor of a BOC(1,1) signal with its lower RMS bandwidth making all of the signal power useable. On the other hand, I would argue that it will probably take a decade to make MBOC signals available, and in that time improved technology is likely to make low-cost, high bandwidth receivers a reality. One must also take into consideration that if satellites without MBOC signals are launched, it will be a long time until the next opportunity to improve signal characteristics.

Q: If your receivers predominantly are narrowband now, do you believe your customers would benefit from wider bandwidth receivers with better multipath mitigation capabilities? Why or why not?

Fenton – The customers of our narrowband receivers would benefit from multipath mitigation capabilities. However, the priority of these customers is cost rather than accuracy. It is more important for them to have a lower unit cost than advanced multipath mitigation technologies. However, due to Moore’s law, by the time these signals are available, the cost of adding the increased signal processing to achieve better multipath mitigation may be tolerable.

Garin – Today’s typical user will benefit only marginally from a wider bandwidth, when it becomes technically and commercially feasible — mainly in line-of-sight conditions, which still represent a non-negligible share of use conditions.

Kawazoe – Our customers wouldn’t benefit from wider bandwidth because multipath error is reduced with dead reckoning sensors, and the largest position errors occur when only non-direct signals are received, such as in areas with tall buildings.

Keegan – The main drivers for Consumer (or narrowband) receivers are cost and power and not accuracy in all but the most demanding environments such as indoors or in urban canyons, in which case improved performance is a desire as long as it does not grossly impact cost or power. However, a multipath environment that could be mitigated by a wideband receiver using conventional multipath mitigation techniques is not the environment experienced indoors or in urban canyons since the signal being tracked is typically a non-line-of-sight multipath signal and not a direct path signal contaminated with multipath. I believe it is unlikely these consumer products will significantly benefit from conventional multipath mitigation techniques employing a wider bandwidth design.

Kohli/Turetzky – Most of our receivers are narrowband today, and we have far more requests for narrower bandwidth than wider. The multipath benefit is outweighed by the susceptibility to interference in most consumer markets.

Sheynblat/Rowitch – Given that the current performance capabilities of GPS technology meet the needs of consumers and business users worldwide, cost reduction is the remaining critical element needed to achieve wider utilization of GPS and Galileo in the future. This view is shared by most mass-market product manufacturers in the location industry.

Weill – I believe that customers will undoubtedly benefit from wider bandwidth receivers and that receiver manufacturers will provide more of these products in the not-so-distant future. For example, a major application of narrowband receivers is consumer-level high-sensitivity assisted GNSS handheld receivers, often embedded in a cell phone. Using current technology, these receivers are narrowband in order to reduce cost and power consumption, but this exacerbates multipath errors, which cannot be reduced by differential corrections available in many assisted systems. Compounding the problem is the severe multipath often encountered in indoor and urban environments. Going to a wider bandwidth can significantly reduce these errors, especially in conjunction with newer multipath mitigation technology.

Q: If your receivers predominantly are narrowband now, do you believe your designs will migrate toward wideband receivers in the next 10 to 15 years? Why or why not?

Fenton – What’s limiting the choice of processing bandwidth is unit cost and power consumption. Generally, wideband receivers have more complicated ASIC designs with higher gate counts as compared with narrowband designs. The use of these large and more expensive ASIC components along with larger CPUs required for the multipath processing results in higher unit receiver costs to our customers. Moore’s law may reduce the cost of signal processing to an insignificant amount before these signals are available or during the lifetime of these signals. Larger bandwidths require higher sampling rates and clock rates to the digital sections. These higher rates result in higher power consumption of the receivers. If the customer’s top priority is low power consumption, then this will limit the widening of the bandwidth. Traditionally, each generation of electronic components has become more power efficient, so processing wider bands in the future may not increase the power demands beyond tolerable limits.

Garin – Our designs will increase the IF effective bandwidth, first for more accurate measurements, and possibly to accommodate Carrier Phase for the mass market in the next 3-5 years.

Hatch/Knight – Future high performance GNSS receivers will trend toward wider bandwidths. Performance of advanced code and phase multipath mitigation techniques is limited by the composite bandwidth of the satellite and receiver filtering. Receiver bandwidth in most existing receivers truncates a portion of the satellite signal spectrum and thereby reduces the effectiveness of advanced code and carrier multipath mitigation techniques.

Kawazoe – There is a possibility of migrating toward a wideband receiver, but cost reduction and jamming robustness are the main requirements from our customers, so we suppose that low-cost narrowband receivers will continue to be dominant.

Keegan – One must believe that in 10-15 years the vast majority of consumer GPS receivers will be embedded in mobile handsets. In this environment I don’t believe wideband receivers (defined here as capable of tracking the BOC(6,1) component) will improve performance sufficiently to warrant migration to this market. Other technical drivers would have to change first, such as much better antenna technology that does not impact cost and/or force the user to orient the device, and much better low-cost interference rejection (filtering) technology. Unless these change, wideband receivers that offer less than 1 dB of improved sensitivity will not compete with the lower power and cost of narrowband receivers. I don’t see a benefit that will cause them to migrate to something that is inherently more costly and consumes more power.

Kohli/Turetzky – If it makes economic sense to develop a wideband receiver in the future, we would do so. However, in our current markets today, we do not see that migration.

Weill – I have little doubt that competitive forces for better positioning accuracy combined with enabling technology will result in a trend toward low-cost high bandwidth receivers for most applications, even those which currently use narrowband receivers.

Q: If your receivers now or in the future are wideband, do you now or would you in the future likely use a form of “double delta” multipath mitigation?

Fenton – Possibly. The advanced multipath processing technique used to take full advantage of the MBOC waveform requires increased software processing demands and is more burdensome to the host CPU. It is envisioned that we would offer a modified Double-Delta style tracking technique for those customers who do not wish to burden their CPU with increased processing requirements. However, due to Moore’s law, by the time these signals become available, the cost of processing the algorithms may not be an issue.

Garin – If the bandwidth was suitable and the patents had expired, we would use some form of double-delta correlator as an add-on, but not as the main mitigation technique. We believe that double-delta will be superseded by methods pertaining to estimation theory rather than reference or received signal shaping. There is a misperception that carrier tracking performance won’t be different between C/A code, BOC and MBOC. It is probably true for traditional carrier phase tracking techniques. I would like to emphasize that several Carrier Phase “offset tracking” techniques can capture part of the code multipath performance into carrier phase performance, and will benefit as well from better code multipath performance.

Hatch/Knight – Some future multipath mitigation techniques will combine edge differencing techniques like “double delta” with advanced mitigation techniques.

Kawazoe – We would like to use a new method for multipath mitigation, if we are able to invent it.

Keegan – Double Delta type correlators can help any receiver mitigate multipath contamination and would be a good improvement even for narrowband receivers that actually (closed loop) “track” the signal. Many of the current consumer receivers do not track very low level signals but make open loop measurements of range in these environments, in which case double delta type correlators really have minimal benefit since there is limited control of the actual “sampling point” of the received signal. Other than intellectual property (IP) issues, there is nothing right now to stop narrowband tracking receivers from benefiting from Double-Delta type correlators … though the benefit is not as great for a narrowband as compared to a wideband receiver.

Obviously, high precision survey type receivers will employ any and all available multipath mitigation techniques, with IP issues being the limit.

Kohli/Turetzky – SiRF has a number of patented multipath techniques that we would leverage to take advantage of any new signal structure.

Stratton – Our receivers utilize a variety of tracking architectures depending on the specific requirements. Current civil aviation regulations limit the manufacturer’s flexibility to implement multipath mitigation techniques, though “double delta” discriminators are permitted. These limitations are intended to ensure that augmentation systems meet integrity performance under off-nominal conditions (e.g., spacecraft or atmospheric anomalies). The regulations will need to be revisited prior to the certification of receivers using modernized signal waveforms.

Studenny – No, Double-Delta technologies have their own limitations and problems. Other technologies exist that are superior to Double-Delta. Vision is one example. We are working on in-house signal processing, but we are not ready for disclosure.

Weill – Double delta may be a reasonable choice for low-cost, narrow bandwidth applications using current technology.

Q: If your receivers now or in the future are wideband, do you now or would you in the future likely use a more modern form of multipath mitigation (e.g., Multipath Mitigation Technology (MMT) by Larry Weill, as used by NovAtel in their Vision Correlator)?

Fenton – Yes, NovAtel intends to use a modified MMT algorithm specifically designed to take full advantage of the MBOC signal structure and to provide our customers both code and carrier tracking performance at near theoretically maximum performance achievable. NovAtel has exclusive use and sublicensing rights to MMT for commercial GNSS applications and intends to look at sub-licensing opportunities for its Vision technology.

Garin – MMT and Vision have their respective merits in their own market segments, but definitely not in ours, and not in a hypothetical high-accuracy mass market. Other generations of multipath mitigation techniques are under study and will probably supersede the current methods. I feel it would be short-sighted to try to evaluate today what the impact of MBOC on multipath will be, looking only at the impact it will have on the methods published as of now. A narrower correlation peak is also of interest in carrier phase multipath mitigation.

Hatch/Knight – We will deploy a more modern form of both code and phase multipath mitigation and, of course, will attempt to patent our own techniques.

Kawazoe – We would like to use a new multipath mitigation method, if we are able to invent one that does not conflict with previously patented multipath mitigation methods.

Keegan – Obviously, the highest precision survey receivers will employ any and all available multipath mitigation techniques, again with IP issues being the limit. However, these types of techniques require substantially more system resources than do correlator type mitigators, so only those receivers looking for the highest accuracy will employ them. Again, this is a customer requirement issue. Users that demand the highest accuracy will use receivers that employ the best multipath mitigation techniques. Others that don’t require the highest accuracy will use receivers that are lower cost and lower power. This is not a technology issue, it is a customer requirements issue. Millimeter accuracy for someone looking for a power pole is not worth any additional cost over sub-meter accuracy.

Kohli/Turetzky – We would look at all of our options of both internally developed and externally available techniques that would be appropriate for our market. Our multipath mitigation needs however are focused on urban canyon type multipath rather than improving centimeter levels of accuracy in open sky.

Stratton – Rockwell Collins is actively developing and fielding multipath mitigation technology, and we hold a number of patents in this area. As mentioned earlier, regulations tend to limit the use of proprietary techniques for safety critical (civil) operations.

Studenny – We either develop or use whatever technology is appropriate for our business.

Weill – If I were a receiver manufacturer in an environment where there is competition for positioning accuracy, I would at least want to investigate some of the new multipath mitigation technologies currently being developed and to consider whether licensing arrangements would make sense if patents are in force.

Q: If your receivers now or in the future are wideband, what are the “real world” benefits you expect from having the MBOC waveform? Will accuracy be better? By how much and under what circumstances? Will performance be better under poor signal conditions? By how much?

Fenton – Although not fully analyzed, the expected benefit of the MBOC signal will come from the increased effective RF phase transition rate (the number of phase transitions per unit time). As pointed out above, the expected increase of effective signal-to-noise ratio of a tracking loop that takes full advantage of the MBOC signal structure is between 2 and 3.5 dB with respect to a BOC(1,1) signal. For example, if the RMS code error of a channel tracking the BOC(1,1) signal was 30 centimeters, then switching to MBOC would reduce the RMS error to between 23 and 21 centimeters, depending on the exact MBOC code chosen (an improvement of between 21 and 33 percent). Multipath mitigation technologies also benefit from an effective increase in code-tracking signal-to-noise ratio. These algorithms will be able to detect the presence of multipath sooner with this increased signal gain and provide more precise range and phase measurements in the presence of closer-in multipath interference as compared with BOC(1,1).
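As a back-of-the-envelope check on figures like Fenton's (a sketch of the usual scaling assumption, not his own calculation), RMS code error varies roughly as the inverse square root of SNR, so a gain of G dB shrinks the error by a factor of 10^(-G/20):

```python
def rms_after_gain(rms_before_cm: float, gain_db: float) -> float:
    """RMS code error after an SNR gain, assuming error ~ 1/sqrt(SNR)."""
    # A gain of G dB scales the error by 10**(-G/20).
    return rms_before_cm * 10 ** (-gain_db / 20.0)

# The example above: 30 cm RMS with BOC(1,1), 2 to 3.5 dB gain from MBOC.
low_gain = rms_after_gain(30.0, 2.0)    # about 23.8 cm (a ~21 percent improvement)
high_gain = rms_after_gain(30.0, 3.5)   # about 20 cm (a ~33 percent improvement)
```

The results round to roughly the 23- and 21-centimeter figures quoted in the answer.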

Garin – The wider bandwidth will benefit the emerging high-accuracy mass market.

Hatch/Knight – The MBOC codes will improve the “noise” of the multipath corrections estimated by advanced mitigation techniques. They may not significantly improve the mean accuracy, particularly for stronger signals. The weak signal code tracking threshold for the advanced techniques will be improved by the ratio of MBOC edges divided by BOC edges, as discussed in the Part 1 article.

Kawazoe – The “real world” benefit would be a reduced multipath effect, and we would expect better accuracy in urban canyons. Under poor signal conditions, however, we would not expect MBOC to provide higher sensitivity or better cross-correlation performance.

Keegan – Since the most modern multipath mitigation techniques (not double-delta or equivalent) work better with more observations of the multipath and multipath is observable only at code transitions, I believe these modern multipath mitigation techniques will improve with more code transitions. So, the MBOC signal structure should improve the performance of all wideband receivers tracking the MBOC signal that employ these modern multipath mitigation techniques. The more difficult the multipath is to observe (e.g. with very short delays) the more the additional code transitions will help.

Kohli/Turetzky – For our customers, we would expect some very limited benefit in accuracy under a very narrow set of conditions. When we talk about poor signal conditions, we are talking about -160 dBm and lower.

Stratton – Accuracy will be better under ideal conditions, but we have not seen validation of the theoretical benefit under realistic conditions. The impact of off-nominal conditions on accuracy, particularly differential GNSS (augmentation systems), requires further study, including:

  • Impact of atmospheric propagation effects that distort split-spectrum signals, which may impact MBOC differently than BOC(1,1) or C/A;
  • Impact of spacecraft anomalies that potentially impact MBOC differently than BOC(1,1) or C/A;
  • Impact of RF and antenna characteristics that vary across the bandwidth (e.g., VSWR, group delay differential) and thus may impact MBOC differently than BOC(1,1) or C/A.

It is worth noting that GPS already provides a higher accuracy signal than MBOC – the L1 carrier phase. At this point we favor the adoption of the simpler alternative of BOC(1,1), at least until a broader consensus regarding the above issues is achieved. While it would be interesting to know the benefit of MBOC on airport surface operations, we have not identified any other potential operational benefit to choosing this waveform over BOC(1,1).

Studenny – We desire an L1 capability that matches the L5 capability and which supports the deployment of CAT-III precision approach. It’s not just the power; it’s cross-correlation, false self-correlation, and the ability to resist multipath and RFI. A well-selected coding scheme minimizes all these things, and when we compare it with the L1 C/A and L5 signals, it’s these things that really stand out. Recall that we desire to minimize hazardously misleading information (HMI) by selecting an appropriate code/signal, because HMI is the key to precision approach. One more thing: a great many commercial applications will depend on minimizing HMI; they just don’t know it yet. Why? Because the position fix will need integrity. I can envision lawsuits, court battles, and so on, when GPS position fixes are questioned. This is coming; commercial low-cost GPS manufacturers may not want to deal with it but may have to, especially if large sums of money are involved.

Weill – In the absence of multipath, a wideband receiver using a TMBOC-50, TMBOC-75, or CBOC-50 pilot instead of a BOC(1,1) pilot should have RMS range errors due to thermal noise that are respectively 33 percent, 26 percent, and 21 percent smaller than with a BOC(1,1) pilot, assuming equal received signal power. This relative performance advantage is essentially insensitive to C/N0.

Q: The newest multipath mitigation technology is effective when receiving signals directly from satellites, and MBOC helps most in low S/N conditions. For your applications, how frequently will a low S/N with directly received signals occur? What practical and measurable benefit will MBOC give your users?

Fenton – As mentioned, the MBOC helps most in poor signal conditions such as low elevation tracking or high multipath conditions. The presence of these conditions is highly dependent on the location of the receiving equipment. A well situated antenna with multipath resistant electronics will not see a high proportion of poor signal. However, a surveyor operating in an urban construction site, or a forest engineer walking through the bush will experience a very high proportion of poor and corrupted signal. The large majority of our GPS users are operating in challenging RF signal conditions and would benefit by various amounts from the MBOC signal structure.

Hatch/Knight – In our applications low signal conditions occur at the start and end of satellite passes or when our receivers are near foliage or buildings. Many European farms are small and are surrounded by hedgerows that cause loss of satellite tracking or multipath mitigation when the satellites are masked by the foliage. MBOC improves the use of very weak satellites, but the effectiveness of advanced multipath mitigation algorithms for signals masked by foliage is not yet known. The several extra dB of code edge power provided by MBOC may be useful in such environments, but the benefits cannot be quantified without live tests of the signals and processing algorithms on foliage-attenuated signals. The extra multipath mitigation power provided by the MBOC signals will lower the noise and residual multipath for both code and carrier measurements, but the amount of improvement is small for typical satellites.

It is our opinion that the extra number of visible satellites provided by a GPS plus Galileo satellite constellation is far more beneficial than implementing MBOC. Extra satellites greatly reduce the importance of weak signals and increase the precision of navigation. Implementation of the MBOC signal structure will be very costly to our customer base. Our existing receivers can combine a BOC waveform with a PN code. MBOC requires time multiplexing two different PN codes in a very specific manner, which requires redesign of the signal processing ASIC and increases the complexity of the code generator by perhaps one-third to one-half. One could also use a 12*1.023MHz memory code to represent the 6*1.023MHz BOC code + PN code. That requires 12 times the storage of the 1.023 MHz memory code. The proposed codes are up to four milliseconds in length (~50,000 bits per channel). This is a sizeable fraction of the ASIC logic required to implement a channel and is more memory than is available.
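The ~50,000-bit storage figure cited above can be reproduced with simple arithmetic; the sketch below is a hedged reconstruction of the quoted numbers, not vendor data:

```python
# Memory-code sizing for a channel that stores the combined BOC(6,1) + PN
# waveform as a single memory code, per the figures quoted in the answer.
chip_rate_hz = 12 * 1.023e6    # memory-code rate representing BOC(6,1) + PN
code_period_s = 4e-3           # proposed codes up to ~4 ms in length
bits_per_channel = chip_rate_hz * code_period_s
print(int(bits_per_channel))   # 49104, i.e. roughly 50,000 bits per channel
```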

The extra edge power provided by the MBOC signal structure is meaningful for a very small fraction of the time and cannot be attained without a redesign of the code generators in our receivers. This will necessitate replacement of all the receivers in our customer base. We do not think the perceived benefits of MBOC are worth the cost.

Kawazoe – We think it is rare that a low S/N with directly received signals would occur when GPS receivers are used for car navigation, so MBOC will seldom provide any benefit.

Keegan – I don’t completely agree with the assertion that “MBOC helps most in low S/N conditions”. More code transitions helps in the observation of multipath, that is, the ability to distinguish the multipath from the direct path signal. As the multipath delay becomes smaller, the ability to distinguish and hence measure the multipath becomes problematic. More code transitions assist in this case even in high SNR conditions.

Kohli/Turetzky – The definition of “low S/N” is critical here. We live in the domain of –160 dBm signals, which are almost never direct.

Stratton – Civil aviation receivers must pass specific test criteria under standard interference conditions to provide a margin for the users against interference. The receiver’s ability to maintain carrier track is far more important to accuracy than raw code phase quality in these scenarios. The receiver’s ability to demodulate data in these scenarios is also more critical, since navigation data senescence is a requirement to use the augmentation system. The military user may benefit indirectly from a more jam resistant acquisition signal in cold-start cases; however, the power level devoted to the data channel is all that matters in these cases.

Studenny – In Commercial Aviation, the concern is the integrity in applications supporting all phases of flight including CAT-I/II/III precision approach. As we approach CAT-III precision approach, the bounding probability for a very small position-fix error in the vertical direction and horizontal plane has to be very large (in excess of 99.9999999 percent). Any benefit that the signal-in-space can provide to meet these kinds of requirements is welcome. To answer the question directly, please note that there are various task forces at RTCA, EUROCAE, ICAO, and elsewhere, that are attempting to precisely quantify the various error allocations due to the signal in space, the aviation receiver, the proposed augmentation system, and the aircraft and crew, for all phases of flight, and for precision approach in particular. Please refer to these task forces for more details.

Weill – The wider bandwidth of an MBOC signal will generally improve MMT multipath performance by the same amount relative to BOC(1,1) under all conditions. Even with a relatively weak direct path signal component, MMT can be effective if the application permits extending the observation time of the signal. This is because its performance in reducing multipath error improves proportionately with increases in the ratio of signal energy to noise power spectral density, or E/N0. (This is not the case for double-delta mitigation.) For example, if the direct path C/N0 is 15 dB-Hz (a very weak signal), 10 seconds of signal observation gives an E/N0 of 25 dB-Hz-sec, which is usable by MMT. In some applications 100 seconds of signal observation can bring E/N0 to 35 dB-Hz-sec to give even better performance. Consequently, MMT multipath mitigation can be effective in many cases when the direct path signal is attenuated by foliage or passes through walls. (Note that extended signal observation times with MMT are appropriate only for static applications.) Urban canyons present a more difficult problem if there is total blockage of the direct path component, but then it is unlikely that any method of receiver-based multipath mitigation will work. On the other hand, the future availability of many more satellites could provide enough unblocked direct path signals to obtain positioning enhanced by good multipath mitigation.
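Weill's energy figures follow from the standard relation E/N0 = C/N0 + 10·log10(T) for an observation of T seconds; a minimal sketch (the function name is illustrative, not from any panelist's software):

```python
import math

def e_over_n0_db(c_over_n0_dbhz: float, obs_time_s: float) -> float:
    """Signal energy to noise density after obs_time_s seconds of observation:
    E/N0 (dB-Hz-sec) = C/N0 (dB-Hz) + 10*log10(T)."""
    return c_over_n0_dbhz + 10.0 * math.log10(obs_time_s)

# Weill's examples: a 15 dB-Hz direct path observed for 10 s and 100 s.
print(e_over_n0_db(15.0, 10.0))   # 25.0
print(e_over_n0_db(15.0, 100.0))  # 35.0
```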

Q: As you know, the statistics of real-world multipath are difficult to assess. Based on your real-world experience, how important is effective multipath mitigation to the GNSS community, and specifically in what applications? How important is it to your company?

Fenton – Having good multipath mitigation technology benefits almost all applications. Very few applications have “ideal” antenna locations providing multipath-free signals. Most real-world applications suffer from some amount of multipath. The amount of benefit that the user sees from this technology is inversely proportional to the quality of the RF signal received.

Garin – Multipath is in my opinion the “last frontier” in the pursuit of better navigation and positioning performance for the GNSS community at large. Building monitoring and surveying will be the principal beneficiaries. For cell phones and personal navigation devices we care deeply about multipath, but the ultimate answer won’t come from a binary choice between MBOC and BOC, nor from any reference signal shaping technique. A new class of methods is about to emerge, some of them adapted from the wireless communications discipline.

Hatch/Knight – Multipath is one of the largest errors in short to medium baseline real-time kinematic (RTK) applications, which are a major portion of our business. Mitigation of multipath is very important to our business.

Kawazoe – We think effective multipath mitigation is very important for all applications in urban canyons, such as car navigation or pedestrian navigation. It is also important for our company, because we produce many GPS receivers for car navigation.

Keegan – If multipath mitigation is defined as the mitigation of a multipath-contaminated direct path signal, then it is extremely important in High Precision Survey applications. The most difficult multipath is the multipath that is from a nearby reflector that changes very slowly, is difficult to observe, and appears as a measurement bias during a typical observation interval. The ability to observe this type of multipath is enhanced by increasing the number of code transitions that occur during the observation interval. While this type of multipath is also present in consumer (handset) applications, its impact is less of a problem when the desired accuracy is measured in meters. However, when the dominant received signal is a multipath signal, as is the case in urban canyons and indoors, then the consumer receiver produces solutions with large errors. Mitigation of this type of multipath is more important for consumer chipsets than the mitigation of multipath-contaminated direct path signals, but I don’t expect MBOC to help with this problem.

Kohli/Turetzky – Multipath mitigation can be a clear differentiator in accuracy, and our focus is getting the best possible accuracy in obstructed environments, given the constraints of cost, size, and TTFF for consumer applications. Our customers care about “consumer affordable” meter-level accuracy to determine streets and house numbers, not centimeter-level accuracy.

Stratton – Having greater multipath-resistance is secondary in importance to having a robust and available signal with navigation data at sufficient power. During the development of the civil augmentation systems, multipath was seen as a significant issue, but methods were developed to mitigate multipath that were within the reach of current technology. For example, we use carrier smoothing (i.e., complementary filtering that takes advantage of the high accuracy of the L1 carrier phase) to mitigate multipath sufficiently to conduct CAT III landings if the augmentation system is located at or near the airport. In looking at precision approaches flown with this technology, we see no degradation in accuracy as the airplane approaches the runway environment. This is expected because of the frequency separation of the multipath resulting from the airplane’s motion.

Studenny – Multipath is an issue, especially for GBAS ground stations. It has to be minimized by whatever techniques are available. A signal with desirable code properties is a great starting point to minimizing multipath effects. The counter example is the L1 C/A code – it has poor multipath rejection properties and requires specialized signal processing to mitigate some of the multipath effects.

Weill – Effective multipath mitigation has always been regarded as important in high-precision applications, where in some cases careful measurements have shown that enough multipath exists to cause serious problems unless it is mitigated. It has also been demonstrated that receivers used indoors and in urban canyons often produce large errors due to multipath. Although in any given application it is difficult to reliably determine how often multipath is really a problem, a conservative approach uses effective multipath mitigation methods to instill confidence that the required level of positioning accuracy has been achieved.

Q: It is now known that signals with wider bandwidths improve theoretically achievable multipath performance. However, current popular mitigation methods (such as the double delta correlator) cannot take advantage of the higher frequency components of an MBOC signal. On the other hand, advanced techniques (such as NovAtel’s Vision Correlator) are emerging which approach theoretical bounds for multipath error using any GNSS signal regardless of bandwidth, and they are especially effective at reducing errors due to near multipath. In particular, multipath errors using the BOC(1,1) signal can be significantly reduced and MBOC does even better. In what applications, if any, would such improvements be useful to your company?

Fenton – Given that multipath is the biggest single source of error, improved multipath performance is critical for improved positioning in most high precision applications such as surveying and mapping, machine control, and precision guidance. In RTK applications, having precise pseudoranges reduces the convergence time to centimeter position estimates by providing smaller initial search volumes for the fixed integer ambiguity estimators. Not only does Multipath Mitigation Technology (MMT) provide cleaner measurements, it also provides signal quality estimations so that the position computation software can de-weight the poor quality measurements.

Garin – I have already stated earlier that the major improvement MBOC will bring is for surveying applications. It will be more of a minor hindrance for the cell phone mass market and a minor limitation on weak signal capabilities. I don’t think that any incremental power improvement in the signal in space will noticeably change the landscape of the indoor navigation market. It has been implied for a while that high customer demand for “always present” location availability will call for some kind of data fusion. In contrast, MBOC will be a boon for the high accuracy market, and it will engender new ideas, as I have witnessed every time a new concept was introduced in GNSS.

Hatch/Knight – Advanced multipath techniques that are equal or superior to the Vision Correlator will be a required feature of future high performance GNSS receivers.

Kawazoe – We think this would require a high-end and expensive GPS receiver.

Keegan – Since these new techniques require more processing and work better with higher sampling rates, they are only applicable to the highest precision sets. As processing becomes cheaper and higher sampling rates become the norm, this type of multipath mitigation will migrate to lower cost high precision GNSS sets, but I doubt that they will ever be part of consumer chipsets since they only provide mitigation of multipath that accounts for a few meters of code error and centimeters of phase error in relatively static situations.

Kohli/Turetzky – For our markets, near multipath is not the biggest source of error at the signal levels our customers are most interested in. Therefore, the multipath mitigation techniques we would use would potentially be different.

Stratton – Perhaps additional multipath resistance could become more significant in the future if GNSS is used in airport surface applications (i.e., when the airplane is moving slowly), but this requires further study and validation. On the other hand, a more complex signal structure may be more difficult to certify for safety-critical uses. It is not yet clear whether the certification risks associated with migrating to modernized signals will outweigh their potential benefits. This is analogous to the situation that exists today, with low-tech (but proven) instrument landing systems still being installed despite the availability of GNSS landing systems, which are dramatically more accurate from the pilot’s perspective.

Studenny – The preference is NOT to use unusual or complicated receiver technologies. It is also true that a well-designed signal will not require such unusual technologies to reach the required performance levels. A well-designed, wide-band signal allows for simple receiver architectures and designs that achieve very high levels of performance. We believe that having an inadequate signal as a starting point and then attempting to extract performance through complicated receiver designs is the wrong approach.

Weill – It is now generally accepted that the real problem in most applications is close-in multipath, characterized by strong secondary signals from nearby reflectors (notably the ground) delayed by less than 10-20 meters. In this region the popular double delta correlator is not effective in suppressing multipath, so new mitigation techniques that solve this problem are certainly of interest.

Q: Would the additional capabilities provided by the MBOC code be useful in your products?

Fenton – Yes, the MBOC will provide additional accuracy and reduction in multipath interference.

Garin – In the medium to long term, 5-10 years, the mass market will migrate toward use of carrier phase. Then we will benefit from MBOC, as the surveying equipment manufacturers would today, because there will be market segment overlap.

Hatch/Knight – We expect a modest improvement in multipath mitigation under moderately weak signal conditions, such as under foliage.

Kawazoe – No. MBOC code is not useful.

Kohli/Turetzky – The capabilities of improved accuracy would have very limited benefit in our application.

Stratton – Having a more multipath-resistant civil signal is secondary in importance to having a robust and available signal with navigation data at sufficient power.

Studenny – Yes.

Weill – Yes. MMT can take advantage of the higher RMS bandwidth of an MBOC signal.

Q: If you could influence the governing bodies regarding the selection either of BOC(1,1) or of MBOC code, what would you recommend?

Fenton – Two fundamental limitations of accuracy are radio transmission bandwidth and the BPSK chipping rate. Since there is very little option of increasing the bandwidth, then increasing the effective BPSK chipping rate is the only option to increase the signal gain and therefore accuracy. I would recommend increasing the effective chipping rate as much as possible.

Garin – BOC(6,1) is in the domain of surveying applications. Because a very large majority of them need to have dual frequency processing capabilities and more available power to accommodate large bandwidths, we would recommend dedicating one non-L1C frequency channel to the exclusive use and benefit of the surveying community, with a larger bandwidth and, why not, exclusively transmitting BOC(6,1) codes. Short of this technically sound solution, we support MBOC for the benefit of the surveying community.

Hatch/Knight – We believe that MBOC may be useful for our applications, but the amount of benefit is unclear and difficult to estimate theoretically. Support of MBOC will definitely increase receiver complexity. We do not think there is a strong and clear case for implementing MBOC.

Kawazoe – We would like to recommend BOC(1,1).

Kohli/Turetzky – We would recommend BOC(1,1), but it’s really more of a preference. We are perfectly comfortable with MBOC, but we do see more benefit for mass market consumers from the higher power of BOC(1,1).

Sheynblat/Rowitch – Given that high-cost, high-precision GPS devices can afford to monitor multiple GNSS frequencies, employ higher-complexity RF components, and employ higher-complexity processing algorithms, it would make sense to optimize the modernized signals for the low-cost mass market and let high-cost receivers pursue the many other options available for improving precision. In summary, Qualcomm is in favor of the original BOC(1,1) proposal with no imposition of BOC(6,1) modulation.

Stratton – Greater public involvement will be needed to finalize the L1C definition. Further validation of the L1C signal structure should be performed before it is adopted. The validation should include impacts to augmentation systems, integrity performance under off-nominal conditions and probable failures, and migration issues (user benefits).

Studenny – We would take advantage of the MBOC signal.

Weill – I would recommend that MBOC be selected. The reduction in power for narrowband applications is small. When MBOC signals finally become available, advances in receiver technology are likely to make low-cost wideband receivers a reality.

Summary and Conclusion

We received remarkable interest and cooperation from eight companies and two prominent consultants who are experts in multipath mitigation techniques. Undoubtedly, their willingness to commit such thoughtful and extensive replies to our questions underscores the importance of the issue.

Although the discussion reflects tendencies within the manufacturing community, our BOC/MBOC series was not intended to serve as a comprehensive poll of sentiments in the GNSS community at large. Instead, we wanted to link the efforts of GNSS signal experts with those of receiver manufacturers – to bring these two worlds closer together and explore how the movements of one affect the other.

Clear tendencies emerged from the panelists’ comments, reflecting separate perspectives of companies and engineers working with single-frequency/narrowband receiver designs and those building wideband, multi-frequency GNSS receivers.

Most of the panel members acknowledged the theoretical potential of the MBOC waveform to enable receiver designs that further reduce the effects of multipath beyond that available with BOC(1,1). Where they parted ways was over the question of the amount of practical benefit that would derive from this advantage. As one might expect, representatives of companies that serve the consumer electronics market generally preferred BOC(1,1) rather than MBOC — the opposite view of their wideband counterparts.

The discussion also highlighted differences of opinion over the likely trajectory of technology development, particularly on the question of whether that trajectory might — or might not — allow consumer-oriented GNSS products in the future to be able to affordably incorporate the benefits of MBOC.

MBOC supporters tended to believe that today’s narrowband receivers would migrate to wideband designs so that they could take advantage of the BOC(6,1) component. Most BOC(1,1) supporters were skeptical of that assessment and asserted that consumer receivers would probably remain narrowband.

There were two surprises, however. One of the consumer electronics companies acknowledged the disadvantage of MBOC for its current market but considered that to be minor compared with the potential benefit to the high-precision applications market and perhaps eventually to the consumer market itself.

The counter-surprise was that a company involved in very high precision applications recognized the potential benefit of MBOC to its applications and will use MBOC if provided. However, they judged the practical benefit to be minor and less important than the disadvantage of having a more complex receiver.

Useful conclusions can be drawn from this limited but focused survey.

1. An industry consensus does not exist regarding the relative merits or demerits of BOC and MBOC. The majority of consumer products companies, which expect to serve a billion users, want to avoid even a small loss of signal power and doubt that they ever will be able to use the high frequency component of MBOC. Most receiver designers targeting high-precision and safety-of-life applications are equally convinced that every increment of robustness and accuracy brings a critical benefit to their customers and, consequently, endorsed MBOC.

2. Quantifying the relative advantage of MBOC and BOC in practical user terms has been difficult, especially without signals in space to test user equipment under actual operating conditions. Consequently, the assessments of benefit have derived from lab tests and simulations.

Under a fairly severe multipath scenario, one panelist calculated that MBOC could reduce the worst-case RMS range error from about 63 centimeters with BOC(1,1) to about 50 centimeters. On the other hand, another panelist argued that every decibel makes a difference, especially in E-911-type applications where availability is critical. Absent extensive field experience, the significance of both positions remains arguable.

3. Whichever choice is made, no killer reasons have appeared that will condemn either choice. The differences are subtle and both choices could be justified.

4. We sympathize with those making the decision in Europe. Either choice will be both praised and criticized.

Civil GNSS Signals at a Crossroads: An Afterword

In an effort to close the loop between receiver designers and signal experts, we invited additional comments on the discussion presented in the two-part article, “BOC or MBOC?”

We received responses from several U.S. members of the US/EU technical work group that recommended the multiplexed binary offset carrier waveform for the new GPS and Galileo civil signals. (They also were coauthors on the original Working Papers column that introduced the signal proposal in Inside GNSS’s May/June issue.) Javad Ashjaee, president and CEO of Javad Navigation Systems and a long-time designer of GNSS receivers, also provided a commentary on the discussion, which we present following the remarks of the U.S. signal team members.

As discussed in the introduction to Part 1 in the July/August issue of Inside GNSS, if MBOC is implemented, the United States and Europe may implement slightly different versions of MBOC, with different allocations of power on the pilot carrier. The comments from the U.S. working group members address the relative merits of MBOC and BOC(1,1) in general as well as the specific U.S. version of MBOC — time-multiplexed BOC.

Additional Comments on MBOC and BOC(1,1)

John W. Betz, Christopher J. Hegarty, Joseph J. Rushanan
The MITRE Corporation

As members of the United States team who worked with our European colleagues to design the MBOC spreading modulation, we respectfully offer the following comments on the article entitled “BOC or MBOC? Part 1,” published in the July/August issue of Inside GNSS.

This response is meant to provide additional information that complements the views presented in the introduction to the article and to explain the background of the GPS-Galileo Working Group A (WG A) Recommendations on L1 OS/L1C Optimization, which can be viewed at the GPS and Galileo signal specification websites (GPS: http://gps.losangeles.af.mil/engineering/icwg/; Galileo: http://www.galileoju.com/page3.cfm). Our focus here is on the GPS L1C signal.

The MBOC modulation contains an additional high frequency component that produces a sharper correlation function peak — fundamentally improving its suitability for tracking. In particular, MBOC enables a receiver to better process against multipath errors, often the dominant source of error in navigation receivers.

Most modernized signals in GPS, Galileo, GLONASS, QZSS, and mobile telephony reflect this trend toward wider bandwidths and sharper correlation function peaks, because of the many benefits that accrue. Moreover, MBOC has the added advantage that it retains excellent interoperability with narrowband receivers.
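That interoperability claim can be checked numerically. MBOC(6,1,1/11) is defined as a power-spectral-density mixture of BOC(1,1) and BOC(6,1); the sketch below estimates how little power a narrowband receiver gives up under MBOC, assuming the textbook normalized PSD expression for sine-phased BOC and a hypothetical ±2 MHz front end (the bandwidth and integration range are illustrative choices, not values from the article):

```python
import numpy as np

F0 = 1.023e6  # GNSS fundamental frequency, Hz

def boc_psd(f, m, n):
    """Normalized PSD of sine-phased BOC(m,n) for an even number of
    subcarrier half-cycles per chip (true for BOC(1,1) and BOC(6,1))."""
    fs, fc = m * F0, n * F0
    return fc * (np.tan(np.pi * f / (2 * fs)) * np.sin(np.pi * f / fc)
                 / (np.pi * f)) ** 2

# Midpoint grid avoids f = 0 and the tan() poles at odd multiples of fs.
df = 1e3
f = (np.arange(int(30e6 / df)) + 0.5) * df

psd_boc11 = boc_psd(f, 1, 1)
# MBOC(6,1,1/11): 10/11 of power as BOC(1,1), 1/11 as BOC(6,1).
psd_mboc = (10 / 11) * psd_boc11 + (1 / 11) * boc_psd(f, 6, 1)

def frac_in_band(psd, bw):
    """Fraction of power inside |f| <= bw, relative to the ±30 MHz grid."""
    return psd[f <= bw].sum() / psd.sum()

loss_db = 10 * np.log10(frac_in_band(psd_mboc, 2e6)
                        / frac_in_band(psd_boc11, 2e6))
print(f"narrowband power change of MBOC vs BOC(1,1): {loss_db:.2f} dB")
```

The penalty comes out well under a decibel, because the BOC(6,1) component parks most of its power near ±6 MHz, outside a narrowband front end, which is consistent with the "small loss" characterizations quoted elsewhere in this discussion.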

Indeed, many of the favorable responses to MBOC in the July/August article were explicitly tied to statements that look ahead to when L1C will become operational late in the next decade and then be used for decades afterward in applications that we can scarcely fathom today. At least seven more cycles of Moore’s Law will have unfolded before initial operational capability of L1C, reflecting more than 100-fold improvement in digital processing capability.

As in the many other systems engineering tradeoffs involved in the design of L1C, pros and cons were carefully considered in making the recommendation on the spreading modulation. The full set of engineering data comparing TMBOC (the time-multiplexed BOC implementation for L1C) versus BOC(1,1) substantiates the net benefits in robustness and performance to all users, whether BOC(1,1) or TMBOC is used.

For example, when narrowband GPS receivers track both C/A code and L1C transmitted from the same satellites, compared to using C/A code alone they obtain 2.7 dB more signal power with TMBOC or 2.9 dB more power with BOC(1,1). With either modulation, there is a significant benefit to narrowband receivers, and the difference between modulations yields an imperceptible difference in available power.
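The 0.2 dB spread between those two figures follows directly from the MBOC power split. A back-of-envelope check, taking the article's 2.9 dB BOC(1,1) figure as given and assuming a narrowband receiver recovers only the 10/11 BOC(1,1) share of TMBOC power:

```python
import math

# Stated gain when a narrowband receiver adds L1C (as pure BOC(1,1))
# to C/A-code tracking, taken from the text above.
gain_boc11_db = 2.9

# Implied usable L1C power relative to C/A-code power.
r = 10 ** (gain_boc11_db / 10) - 1

# TMBOC places 1/11 of signal power in BOC(6,1), which a narrowband
# front end largely filters out; only the 10/11 BOC(1,1) share helps.
gain_tmboc_db = 10 * math.log10(1 + (10 / 11) * r)
print(f"TMBOC gain for a narrowband receiver: {gain_tmboc_db:.1f} dB")
```

This reproduces the quoted 2.7 dB figure, illustrating why the authors call the difference between the modulations imperceptible for narrowband users.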

Figure 1 lists tradeoff factors considered in the L1C spreading modulation design; these supplement the subset of factors discussed in the introduction to the BOC or MBOC article. TMBOC’s relative advantages are shown as dB values to the right and BOC(1,1)’s relative advantages are shown as dB values to the left. (To view Figure 1, download the PDF version of this article using the link above.)

TMBOC’s benefits, such as reduced correlation sidelobe levels, apply to all receivers, with most value to those that must use weak signals. Observe that receivers need only employ bandwidths of roughly ±6 MHz to obtain the other benefits of TMBOC in signal tracking and multipath mitigation.

As indicated in our earlier article on MBOC in the May/June issue of Inside GNSS, the Galileo program has the lead in choosing a common signal modulation that will be used for decades by not only Galileo, but also GPS, QZSS, possibly satellite-based augmentation systems, and other radio-navigation systems. We understand Galileo decision makers’ need to balance near-term programmatic issues against the longer-term investment in improved satellite-based navigation, and respect their decision process.

In conclusion, we sincerely welcome receiver manufacturers’ views on both BOC and MBOC. The challenge for all of us — signal designers, receiver designers and manufacturers, and decision makers — is to make this decision in the context of applications and receiver technologies that will be relevant later in the next decade and for decades to follow.

We believe the engineering tradeoffs reaffirm that TMBOC, like other aspects of L1C, will provide solid net benefits to future generations of satellite navigation users.

MBOC Is the Future of GNSS; Let’s Build It

Javad Ashjaee
Javad Navigation Systems

All I can say is, I’m glad these guys complaining about MBOC weren’t the ones designing the GPS system — or the new common GPS/Galileo civil signal. What is their basic complaint about MBOC? That it adds complexity and power consumption. But 25 years ago, GPS user equipment weighed 150 pounds and a receiver cost $250,000. If they had based the system design on the state-of-the-art receivers at the time and tried to simplify the system design to accommodate them, they would have said, “We don’t need carrier phase or a second frequency.” They would have been thinking about receivers as if they were carrying an FM radio from those days around in their pocket.

But technology changes. Product design improves. How old is Moore’s Law [that says the complexity of integrated circuits, with respect to minimum component cost, doubles every 18 months], and yet it’s still going on. The same thing is repeating itself today.

In the early 1980s when we were building the first GPS receivers, we only had 8-bit microprocessors. Multiplying two floating point numbers together was a huge task. I had to write software to simplify the computation of the signals as much as possible, but I never complained about the GPS system design itself.

Now the technology has matured to the point that you see today — single-chip GPS receivers. And yet modern user equipment is based on this GPS system design of 30 years ago.

We should design the system and make it as good as we can. By the time it’s up and running, technology will have advanced a long way in the products that we are building.

Even with the current technology, however, what do the people who don’t want MBOC lose? One decibel. But the new satellites have 3 dB more than we have today.

On the other hand, what do we gain with MBOC? Maybe a little, maybe a lot, depending on who looks at it. MBOC gives us more things to work with. It may help us to get faster RTK by removing multipath in the automatic landing of an aircraft. The people worried about getting GPS signals further indoors are talking about users who may be sitting around drinking wine, not sitting in an airplane that’s landing in the fog. Even if there is an emergency indoor application, it most probably can wait a few more seconds to get a position fix or have a few more meters of error.

The chips that will be designed to fully use this new GNSS system will come 10 years from now. It’s a crime to say that we can’t build the best system for the future because today someone needs an extra bit of processing power.

One final note: my hat’s off to a dear, long-time friend, Tom Stansell, for a job well done in having helped coordinate the BOC-MBOC discussion in Inside GNSS in such an unbiased, even-handed way.

Javad Ashjaee, Ph.D., is the president and CEO of Javad Navigation Systems, San Jose, California, USA, and Moscow, Russia.

Old Questions, New Voices

Q: What segment of the GNSS market do your answers address? Describe your market, including typical products and the size of the market.

Kawazoe – Typical products are GPS receivers for car navigation. The total Japanese Car Navigation market was over 4 million units in 2005, and JRC sells about 1.8 million units per year.

Keegan – I have worked with companies in all market areas, from consumer to high-precision survey as well as military.

Kohli/Turetzky – SiRF has a broad array of location and communication products at the silicon and software level that address mainstream consumer markets. Our main target markets are automotive, wireless/mobile phones, mobile computing, and consumer electronics. These markets have a potential size of more than a billion units per year. Although the consumer GPS market is growing very fast, the overall penetration of GPS in these markets is still quite low. Our technology is used in a range of market-leading products including GPS-enabled mobile phones, portable and in-car navigation systems, telematics systems, recreational GPS handhelds, PDAs and ultra-mobile computers, and a broad range of dedicated consumer devices. Our customers are global, and we currently ship millions of units per quarter all over the world. We focus on providing best-in-class performance for consumer platforms (availability, accuracy, power, size) at a cost-effective price.

Q: Which signal environments are important for your products: open sky, indoor, urban canyon, etc.?

Kawazoe – It is an urban canyon environment.

Kohli/Turetzky – There is not a single most important environment; our products are designed to operate across all environments. The biggest challenge for us and our “claim to fame” is our ability to make GPS work in obstructed environments. The consumer expectation is that location is always available, and meeting this expectation is the focus of our innovations. Our technology is targeted to meet the difficult challenges of the urban canyon, dense foliage, and indoor environments.

Q: Which design parameters are most critical for your products: power, cost, sensitivity, accuracy, time to fix, etc.?

Kawazoe – The most critical design parameter is cost. The next parameters are sensitivity and accuracy. Our main GPS receiver specifications are: power, 88 mA typical at 3.3 volts; sensitivity, less than -135 dBm; accuracy, 10 m 2DRMS typical; and time to fix, 8 seconds typical (hot start).

Kohli/Turetzky – We target different parameters for different target markets. In general, however, availability (a combination of sensitivity and time to first fix) with reasonable accuracy and power are more important than extreme accuracy.

Q: Do you really care whether GPS and Galileo implement plain BOC(1,1) or MBOC? Why?

Kawazoe – Yes. We prefer BOC(1,1) for easy implementation.

Kohli/Turetzky – We don’t have a strong opinion. We can see the benefits of both for different markets. Whatever is chosen, we will build the best receiver for our customers.

Q: Are the GNSS receivers of interest narrowband (under ±5 MHz) or wideband (over ±9 MHz)?

Kawazoe – Our receivers of main interest are narrowband because low cost and jamming robustness are most important for our major customers. Even so, some JRC receivers are wideband because accuracy is more important for these receivers.

Keegan – I have designed both narrowband (consumer) and wideband (survey) receivers.

Kohli/Turetzky – Our customers have a definite preference for narrowband receivers because it makes their system design more robust to interference. As our receivers operate in harsh RF environments and can navigate at extremely low signal levels, keeping interference out lets them utilize our technology to its fullest. Interference in integrated products arises from LCDs, disc drives, and other RF links, and the interfering spectrum can be wideband.

Sheynblat/Rowitch – The receivers of interest are narrowband. Low cost GPS consumer devices do not employ wideband receivers today and will most likely not employ wideband receivers in the near future. Any technology advances afforded by Moore’s law will likely be used to further reduce cost, not enable wideband receivers. In addition, further cost reductions are expected to expand the use of positioning technology in applications and markets which today do not take advantage of the technology because it is considered by the manufacturers and marketers to be too costly.

By Alan Cameron

Development Update

Development of satellite-based positioning and navigation technology has greatly reformed conventional spatial determination practices and enabled advancement of the digital infrastructure in China. This kind of progress is continuing with the improvement of related techniques.

This article will provide an update on China’s GNSS-related activities in recent years, including research on novel positioning approaches, collaborations between China and international sectors, and, finally, some brief comments on the prospect for China’s Beidou navigation and positioning system.

China’s CORS Network

Beginning in 1990, continuously operating reference stations (CORS) using GPS were first applied by NASA’s Jet Propulsion Laboratory (JPL) and MIT to research on plate tectonics in southern California, USA. This innovation helped geologists deepen their understanding of seismic faults because far more continuous spatial information could be obtained than ever before.

From 1997 to 2000, as a key state scientific project, the Crust Motion Observation Network of China (CMONOC) was implemented, composed of 25 CORS stations and 1,000 regional network stations. Very long baseline interferometry (VLBI) and satellite laser ranging (SLR) equipment was collocated at some of the CORS stations. Based on CMONOC, researchers obtained significant results on the motion of continental plates. CORS has subsequently been employed by numerous agencies and organizations in China and has become popular in many fields, including guidance of aircraft similar to the U.S. Wide Area Augmentation System (WAAS) approach procedures.

In contrast to those preliminary stages, the evolution of networking and communications has enabled CORS to become a leading support component of the national temporal and spatial information infrastructure. CORS is now implemented in many of China’s main cities, such as Shenzhen, Chengdu, Beijing, Shanghai, and Guangzhou.

Among these, the Shenzhen CORS system, started in 1999, serves as a paradigm of a comprehensive service network and spatial data infrastructure in China. The system was designed and implemented with a flexible network and wireless communications architecture to perform a variety of positioning and navigation services in both real time and postprocessing.

The project was jointly accomplished by the GNSS Engineering Research Center, Wuhan University, and Shenzhen Municipal Bureau of Land Resources and Housing Management. It is aimed at applications for surveying and mapping, urban planning, resource management, transportation monitoring, disaster prevention, and scientific research including meteorology and ionosphere scintillation.

In this way, the Shenzhen CORS network is acting to energize the booming economy of this young city. With rapid development of CORS construction in China, these stations are expected to operate within a standard national specification and to play vital roles in realization of the “digital city” in terms of real-time and precise positioning and navigation.

Some of the CORS facilities properly distributed throughout China are collocated with stations hosting other space-based observing technologies such as SLR, VLBI, and DORIS (Doppler Orbitography and Radio-positioning Integrated by Satellite, a system maintained by France). These sites serve satellite orbit determination and, by combining multiple space-geodetic technologies, have created a dynamic, multi-dimensional terrestrial reference frame for China.

(For the rest of this story, please download the complete article using the PDF link above.)

July 1, 2006

Orbital Precession, Optimal Dual-Frequency Techniques, and Galileo Receivers

Q: Is it true that the GPS satellite geometry repeats every day shifted by 4 minutes?

A: It is true that the GPS satellite orbits were selected to have a period of approximately one half a sidereal day to give them repeatable visibility. (One sidereal day is 23 hours, 56 minutes, and 4 seconds long or 236 seconds shorter than a solar day.) However, because of forces that perturb the orbits, the repeat period actually turns out to be 244 to 245 seconds (not 236 seconds) shorter than 24 hours, on average, and changes for each satellite.

The selection of a half sidereal day orbit causes the satellite ground track and the satellite visibility from any point on earth to be essentially the same from day to day, with the satellites appearing in their positions approximately 4 minutes (236 seconds) earlier each day due to the difference between sidereal and solar days. This was a particularly useful property in the early days of GPS when session planning was important to ensure adequate satellite coverage. With this easily predictable coverage, GPS users could schedule repeatable campaign sessions well in advance just by shifting their experiments forward each day by 4 minutes.
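The 4-minute figure follows directly from the day lengths given above. A quick sketch of the nominal arithmetic (ignoring the orbit perturbations just discussed, which stretch the actual shift to 244–245 seconds):

```python
# Nominal repeat geometry of a GPS orbit: two revolutions per sidereal day.
SOLAR_DAY = 24 * 3600                    # 86,400 s
SIDEREAL_DAY = 23 * 3600 + 56 * 60 + 4  # 86,164 s
ORBIT_PERIOD = SIDEREAL_DAY / 2         # roughly 11 h 58 m per revolution

# After two orbits, one sidereal day has elapsed, so the geometry
# recurs this many seconds earlier by the solar clock each day.
daily_shift = SOLAR_DAY - 2 * ORBIT_PERIOD
print(f"satellites appear {daily_shift:.0f} s earlier each solar day")
# -> 236 s, i.e., the familiar "about 4 minutes"
```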

(For the rest of Penina Axelrad and Kristine M. Larson’s answer to this question, please download the complete article using the PDF link above.)

Q: How can dual frequency code and carrier measurements be optimally combined to enhance position solution accuracy?

A: The smoothing of GPS code pseudorange measurements with carrier phase measurements to attenuate code noise and multipath is a well-established GPS signal processing technique. Unlike carrier phase real time kinematic (RTK) techniques, carrier-smoothed code (CSC) positioning solutions do not attempt to resolve carrier phase ambiguities. As a result, they offer a number of design and operational advantages for those applications that do not require RTK accuracies.

Ionospheric effects are a limiting factor in how much smoothing of pseudorange errors can be accomplished with single-frequency measurements. The use of dual-frequency code and carrier measurement combinations in CSC processing to attenuate pseudorange errors and as a precursor for carrier phase ambiguity resolution has gained increasing importance, particularly with the availability of all-in-view dual-frequency GPS receivers in the survey and military markets. Interest in these techniques will increase with the advent of additional GNSS signals as the result of GPS modernization and implementation of Galileo, along with the proliferation of differential services.
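As an illustration of the single-frequency case, the classic Hatch filter averages the code-minus-carrier difference so that low-noise carrier deltas, rather than code noise, drive epoch-to-epoch changes. A minimal sketch with synthetic data (the window length N, noise levels, and geometry are arbitrary illustrative choices, not values from the article):

```python
import numpy as np

def hatch_filter(code, carrier, N=100):
    """Carrier-smoothed code: propagate the previous smoothed pseudorange
    with the (low-noise) carrier delta, then blend in the new code value."""
    smoothed = np.empty_like(code)
    smoothed[0] = code[0]
    for k in range(1, len(code)):
        w = 1 / min(k + 1, N)  # ramp the weight until the window fills
        predicted = smoothed[k - 1] + (carrier[k] - carrier[k - 1])
        smoothed[k] = w * code[k] + (1 - w) * predicted
    return smoothed

rng = np.random.default_rng(0)
epochs = 600
true_range = 2.0e7 + 100.0 * np.arange(epochs)  # receding satellite, meters
ambiguity = 12345.678                           # never resolved by CSC
code = true_range + rng.normal(0, 1.0, epochs)  # ~1 m code noise
carrier = true_range + ambiguity + rng.normal(0, 0.003, epochs)  # ~3 mm

smoothed = hatch_filter(code, carrier)
raw_rms = np.sqrt(np.mean((code - true_range)[200:] ** 2))
sm_rms = np.sqrt(np.mean((smoothed - true_range)[200:] ** 2))
print(f"raw code RMS error {raw_rms:.2f} m -> smoothed {sm_rms:.2f} m")
```

Note that the filter uses only carrier differences, so the unresolved ambiguity drops out, which is exactly why CSC avoids the ambiguity-resolution machinery of RTK. In practice the window must be limited, or a dual-frequency divergence-free combination used, because of the code-carrier ionospheric divergence noted above.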

(For the rest of Dr. Gary McGraw’s answer to this question, please download the complete article using the PDF link above.)

Q: What is the availability of Galileo receivers?

A: With the launch of the GIOVE-A (Galileo In-Orbit Validation Element – A) Galileo test satellite in December last year, the European Galileo satellite navigation system is making progress. How will we be able to recognize the benefits of Galileo? We will require enough Galileo satellites to make a difference compared with using GPS alone, and we will require dual-mode Galileo/GPS receivers.

First, let us recap what Galileo will provide to users. And second, let us summarize what benefits we can expect to see, not only from Galileo alone but from a combined GPS/Galileo constellation of approximately 60 satellites.

Galileo will offer several worldwide service levels, including open access and restricted access for various segments of users.

(For the rest of Tony Murfin’s answer to this question, please download the complete article using the PDF link above.)


Uh-Oh, It’s UXO!: A Global Differential GPS Solution for Clearing Unexploded Ordnance

At any given time along a large swath of rural, northern Texas you might witness a loud, dirty ritual. A handful of men standing still in the middle of a field, their expectant eyes fixed on the same point. Just about the time your gaze sets on the same point, it happens: a deep sound like the forceful downbeat of a drum cracks through the air and the dirt-caked ground explodes in a dusty plume of metal and sand. The cloud dissipates and the men, satisfied that the World War II–era munition has been successfully destroyed, move on to their next pin-flagged target.

It’s an almost daily exercise for the survey and “dig” teams of Parsons Corporation and USA Environmental who together for the past four years have been steadily clearing the once-active infantry and artillery training facility Camp Howze and returning the 59,000 acres of land to its former cattle-grazing condition.

Officially designated as the Former Camp Howze Removal Project (FCHRP), the Texas effort is part of a long-standing U.S. Army Corps of Engineers (USACE) program to clean up unexploded ordnance (UXO) remnants at former military training bases around the globe. And it’s one in which Parsons, an engineering and construction firm based in Pasadena, California, has been heavily involved for the past 15 years. During that time, Parsons survey and explosives teams have located, unearthed, and recycled or destroyed more than a billion pounds of munitions, fragments, and other range-related items.

Unlike the typical short-term UXO removal projects of the past, the FCHRP has already required four years of dedicated labor and doesn’t as yet have a fixed end point. The USACE funding approach enables the teams to stay on site until either the money runs out or all the ordnance is found and cleared, says Terry Willis, Parsons field data manager for FCHRP.

Such unusual circumstances make budgeting for operational costs and developing highly efficient and productive work methods that much more critical, says Willis, because any substantial funding cuts by Congress could mean the end of the project.

An additional motivation for taking an open-ended approach to the FCHRP could be that all of the former Camp Howze land is privately owned and is home to families who have lived and worked there for more than 50 years. Each family must grant consent to the teams to clear individual parcels.

Despite the fact that their house or barn could be sitting on a land mine or other ordnance, many people are rather complacent and don’t see the urgency in having the munitions removed, Willis says. Even though field teams have found artillery rounds eight feet from people’s doorsteps, acquiring the necessary consent to access the property has been a time-consuming process.

The Basis for Going Baseless
The rather atypical FCHRP presented Parsons with the opportunity to arm the project teams with their own atypical survey “weapon” — a global satellite-based augmentation system (GSBAS) that provides corrected GPS positioning without the use of base stations.

Although Parsons had never before employed the GSBAS technology in its numerous UXO removal projects, given the way the GSBAS has performed so far on the Texas project, Willis predicts that similar systems will become as commonplace in the field as the shovels and the explosives used to remove munitions.

Previously, Parsons’ UXO-removal teams employed real-time kinematic GPS (RTK-GPS) systems to create search grids, find and stake out anomalies for investigation, and record the position of munitions found. RTK techniques require the broadcast of differential corrections to the GPS signals’ carrier phase measurements. These corrections are transmitted via a high-speed data modem from a base station to roving GPS receivers.

“Although RTK-GPS is extremely accurate,” says Willis, “its complexity, bulk, and expense make it less than ideal for Parsons’ purposes.” Since putting the GSBAS system to use in the field, the Camp Howze team has virtually eliminated its need for RTK-GPS in the majority of the fieldwork, obtaining decimeter accuracy for one-third the cost of an RTK-GPS unit.

In a departure from local real-time differential GPS systems, the GSBAS relies on a global network of base stations to calculate and compensate for clock and orbit errors in the satellite transmissions. Broadcast of the DGPS corrections, available globally in real time, eliminates the need for local base stations, which in turn eliminates the struggle to maintain communication links to a source of local corrections. In short, users are no longer tethered to a base station for precise positioning.

Recycling a Metallic Past
For four years, from 1942 to 1946, Camp Howze was the temporary home for thousands of soldiers as they prepared for battle overseas. Located along the Texas-Oklahoma border about 55 miles north of Dallas, the camp offered an immense area for training forces and artillery ranges, along with libraries, chapels, theaters, banking facilities, and even a camp newspaper.

For the last four years, however, the former Camp Howze has been the temporary home of the Parsons and USA Environmental teams as they continually search for the telltale metallic signs of the camp’s previous incarnation.

When their FCHRP activities began in 2002, the Parsons team started out with very scant historical and practical knowledge of the area, having only a few sheets of county property maps and background information on the camp itself provided by the U.S. Army. This information included engineering maps with the approximate locations of artillery ranges, aerial photos of the facility from 1943, and written records from units that trained there.

Parsons then obtained updated aerial photos of the site taken in 1995 and records of what project managers call “phase one properties,” occupied properties or buildings believed to be near or on former range or training areas. All of these data sets were incorporated as layers into a geographical information system (GIS) to begin to identify logical areas to investigate for UXOs. Of critical importance for prioritizing their efforts was the identification of current high-traffic areas where private citizens live, work, and play.

“Once we identify the areas to investigate, we determine which methods for removing ordnance will be the most effective based on many factors such as accessibility, terrain, vegetative cover, time of year and land-owner consent,” says Willis. In the case of Camp Howze, all of those factors led to the decision to couple standard search tools with new technology to improve efficiency.

The two most common investigative tools are what Willis calls “magnetometer (Mag) and Dig” — a thorough, yet costly and time-consuming process of manually clearing smaller areas with the aid of shovels and handheld metal detectors — and “digital geophysics,” a survey technique that uses large, highly sophisticated electromagnetic sensors to detect the presence of buried metal objects. As the latter method can rapidly cover a much larger territory, Parsons first applied the technique to perform a geophysical survey in combination with RTK-GPS to pinpoint suspected unexploded munitions.

To perform the geophysical survey, three computer-controlled electromagnetic sensors are connected together to create a three-meter wide sensing array. The sensors are then physically pulled by an all-terrain vehicle over the area of interest to detect the presence of metal items in the ground and record their positions.

The readings from the electromagnetic sensors coupled with the continuous GPS readings are postprocessed to generate coordinates of anomalies, that is, possible UXOs, which are then added to the GIS. All uploaded position readings are tied to Texas North Central State Plane coordinates.

Through considerable postprocessing of the geophysical survey data, Parsons’ geophysicists plot positions of the anomalies on digital maps and “flag” them with color-coded points that signify each one’s probability of being a UXO.
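The flagging step can be sketched as a simple classification. The thresholds, color names, and coordinate values below are hypothetical illustrations, not Parsons’ actual criteria or data:

```python
# Minimal sketch of color-coded anomaly flagging (hypothetical thresholds).
def flag_color(probability):
    """Map a 0-1 UXO probability to an example map color."""
    if probability >= 0.7:
        return "red"      # high probability -- investigate first
    if probability >= 0.3:
        return "yellow"   # moderate probability
    return "green"        # low probability

# Example anomaly records (invented state-plane-style coordinates).
anomalies = [
    {"easting": 2351407.2, "northing": 7184220.5, "prob": 0.85},
    {"easting": 2351512.9, "northing": 7184301.1, "prob": 0.15},
]
flags = [(a["easting"], a["northing"], flag_color(a["prob"])) for a in anomalies]
```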

Following their usual practice, at this stage the field teams would have used RTK-GPS to reacquire and flag the real-world points of potential targets detected by the geophysical survey. At the former Camp Howze, however, Willis chose to depart from tradition.

Given the terrain extremes in this region of rural Texas, the large study area, high-accuracy requirements, limited labor resources, and an indefinite work schedule, Parsons needed a cost-effective and user-friendly survey solution that would enable UXO technicians to efficiently locate anomalies and precisely position them.

“We opted for the satellite-based system approach for a number of reasons, one of which was the considerable cost savings over an RTK-GPS rental,” he says. “Because it doesn’t require a base station, we don’t have line-of-sight issues nor need to spend considerable time troubleshooting communications and power supply issues. So, we can be much more productive in the field. And the simplicity of the system makes it much easier for the teams — who are not trained surveyors — to set up and use.”

Unearthing UXO
On any given day, Willis and his teams use the GIS as a logistical planning tool to map out clearing strategies based on the digitally flagged hot spots indicating the highest probability of buried munitions. The team imports those coordinates into the controller software of the receiver, and the Mag/Dig teams head to the site.

Once on site, the two-person survey team uses a “quick-start” feature built into the GSBAS receiver software that enables the system to reach full position accuracy immediately by using a previously surveyed position to initialize the navigation function. This setup process takes “less than five minutes,” says Willis, after which the survey team uses the satellite-based system to navigate to the predetermined points on the ground, where they stake the targets with pin flags.

Following relatively closely behind the survey team, the dig team of three to seven UXO technicians armed with shovels and handheld magnetometers investigate each flagged point. They use the metal detector to verify the presence of metal, and, if the indications are affirmative, they gingerly dig up the object.

Should they unearth any munitions, they carefully inspect the UXO to determine if it needs to be destroyed. To neutralize the ordnance they set explosives and destroy it on the spot. All discovered ordnance is properly and precisely recorded — type, position, and confirmed destruction — and the collected field data uploaded into the GIS.

“Although the GIS was not originally a requirement, it has become the information backbone of the project,” says Willis. “It’s a planning tool for UXO searches and the main repository for what we find in the field. It maintains all of the data layers that we have accumulated and created over the past four years, including aerial photos, topographic maps, scans of annotated response maps, parcel boundaries, and pipeline data to provide us with a comprehensive graphical resource. And because it’s tied into the field database, data from daily operations can be displayed geographically in various ways.” 

To date, with the combination of the GSBAS, the GIS, and their conventional Mag/Dig tools, the survey and UXO-removal teams have cleared more than 1,800 acres of Camp Howze’s most hazardous areas. Along the way, they have destroyed more than 860 live ordnance items, including mortars, artillery shells, anti-tank rockets, hand grenades, and land mines.

Stars in Their Eyes
Although Camp Howze stretches across nearly 59,000 acres, the FCHRP’s mandate is not to sweep 100 percent of the land but rather to investigate and clear the zones constituting the greatest hazard to the public. Willis says that adding the satellite-based system to the fieldwork is helping Parsons fulfill that charter more efficiently than with their previous RTK-GPS solution, predominantly because they can achieve near-RTK-GPS accuracy without a base station.

“We work four 10-hour days per week,” says Willis. “If you spend an hour setting up a base station and another half hour to tear it down, you’ve lost at least an hour and a half of operational time, provided you don’t have any trouble with it during the day. In rural, rough terrain, radio line of sight is a problem, and it can be a long trip back to the base if we lose the radio signal.”

Also, powering the base for an entire day can be a challenge, he says. Parsons teams have used marine deep-cycle batteries to power the equipment and sometimes the power supply still wouldn’t last an entire day. Cellular RTK was considered for its convenience, but the existence of many cellular “dead zones” in the area precluded its use.

“Because our survey team directly supports our dig team, both teams will normally have to shut down operations if something happens to the base station,” Willis adds. “It is costly to keep a dig team in the field. If they’re forced to stop work because the survey equipment is down, it’s very expensive.”

FCHRP requirements dictate that the teams position any ordnance they discover to within one foot. Willis says the satellite-based unit performs well enough for them to meet this standard. “The decimeter accuracy provided by the system is actually more than we require for this project,” he says.

Because heavy thunderstorms and tornadoes are the only weather-related phenomena that will force the crews inside, the Mag/Dig teams need rugged equipment that’s also portable.

“The [GSBAS] system fits into a single carry case; so, much of the weight and bulk is reduced to a manageable size,” says Willis. “That simplicity and compactness makes it very reliable because it’s easy to transport into the field, set up, and use.”

Willis feels confident that the cost-effectiveness of the system will help Parsons to win similar projects in the future. “When bidding on these projects, it helps to be able to shave thousands of dollars off the cost by simply changing a piece of equipment,” he says.

In the meantime, people in this rural part of Texas can still count on witnessing a handful of men, staring at a fixed point in the distance, waiting for the inevitable explosion of dirt and metal.

For figures, graphs, and images, please download the PDF of the article, above.


BOC or MBOC?

Europe and the United States are on the verge of a very important decision about their plans to implement a common civil signal waveform at the L1 frequency: Should that waveform be pure binary offset carrier — BOC(1,1) — or a mixture of 90.9 percent BOC(1,1) and 9.09 percent BOC(6,1), a combination called multiplexed BOC (MBOC)? The desire for a common civil L1 signal is enshrined in a 2004 agreement on GNSS cooperation between the United States and the European Union (EU).

For the EU and the European Space Agency (ESA), that decision — and its consequences — will come sooner: with the Galileo L1 Open Service (OS) that will be transmitted from satellites to be launched beginning in the next few years. For the United States, the waveform decision will shape the design of the L1 civil signal (L1C) planned for the GPS III satellites scheduled to launch in 2013. For background on the process that led to the design of the GPS L1 civil signal and its relevance to the BOC/MBOC discussion, see the sidebar “L1C, BOC, and MBOC.”

The May/June issue of Inside GNSS contained a “Working Papers” column titled, “MBOC: The New Optimized Spreading Modulation Recommended for Galileo L1 OS and GPS L1C”. Authored by members of a technical working group set up under the U.S./EU agreement, the article discussed the anticipated MBOC benefits, primarily improved code tracking performance in multipath. The column also noted that, while lower-cost BOC(1,1) receivers would be able to use MBOC, it would come at the cost of a reduction in received signal power.

An article in the “360 Degrees” news section of the same issue of Inside GNSS noted that some GNSS receiver manufacturers believe MBOC is not best for their applications and perhaps should not have been recommended. (This point was noted on page 17 of the May/June issue under the subtitle “MBOC Doubters.”) See the sidebar “Other Observers” (below) for additional comments from companies with concerns about the MBOC recommendation.

This article, therefore, continues the discussion of a common signal waveform by asking several companies with different product perspectives whether they consider the proposed MBOC waveform to be more or less desirable for their applications than the BOC(1,1). Currently, BOC (1,1) is the baseline defined in the June 26, 2004, document signed by the U.S. Secretary of State and the vice-president of the European Commission (the EU’s executive branch): “Agreement on the Promotion, Provision and Use of Galileo and GPS Satellite-Based Navigation Systems and Related Applications.”

Maximum benefit from MBOC will be obtained by receivers using recently invented technology that employs computationally intensive algorithms. Although such receivers clearly will provide benefits to their users because of the BOC(6,1) component of MBOC, the practical value of the benefits has not been quantified, which is one purpose of the questions raised in this article. For the moment, let’s call all these prospective MBOC users “Paul”.

Meanwhile, patents on the most widely used multipath mitigation technologies today, such as the “narrow correlator” and the more effective “double-delta” techniques, will expire about the time the new signals are fully available, making these techniques more widely available. Unfortunately, the double-delta technology cannot use the BOC(6,1) component of MBOC. In addition, narrowband receivers, which today dominate consumer products, also cannot use the BOC(6,1). Let’s call all these users “Peter”.

Therefore, the fundamental question raised by this article is whether we should rob Peter to pay Paul. If the amount taken is quite small and the benefits are large, then the answer should be “yes.” If the amount taken creates a burden to Peter, now and for decades to come, with little benefit to Paul, then the answer should be “no.” The in-between cases are more difficult. The purpose of this article is to explore the tradeoffs.

To address this issue, we invited engineers from companies building a range of GNSS receivers to take part in the discussion. We’ll introduce these participants a little later. But first, let’s take a look at the technical issues underlying the discussion.

BOC/MBOC Background

The RF spectrum of a GPS signal is primarily defined by the pseudorandom code that modulates its carrier and associated data. A pseudorandom code appears to be a completely random sequence of binary values, although the sequence actually repeats identically, over and over.

For the C/A code on the L1 frequency (1575.42 MHz), the state of the code (either +1 or –1) may change at a clock rate of 1.023 MHz. We call this binary phase shift keying, or BPSK(1), meaning BPSK modulation with a pseudorandom code clocked at 1.023 MHz. Note that the bits of a pseudorandom code often are referred to as “chips,” and four BPSK chips are illustrated at the top of Figure 1. (To view any figures, tables or graphs for this story, please download the PDF version using the link at the top of this article.)

Among many other topics, the 2004 U.S./EU agreement settled on a common baseline modulation for the Galileo L1 OS and the GPS L1C signals: BOC(1,1). (The BOC(n,m) notation means a binary offset carrier with a square-wave subcarrier at n × 1.023 MHz and a pseudorandom code clocked at m × 1.023 MHz; for BOC(1,1), both are 1.023 MHz.) Like BPSK(1), the BOC(1,1) waveform also is a BPSK modulation, meaning there are only two states, either a +1 or a –1. The timing relationships of the code and the square wave are illustrated by Figure 1.
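The relationship in Figure 1 can be sketched in a few lines: a BOC(1,1) chip is just a code chip multiplied by one cycle of the square-wave subcarrier, which gives every chip an extra sign transition compared with BPSK(1). The sampling rate and chip values below are illustrative only:

```python
# Sketch of BPSK(1) vs. BOC(1,1) waveforms over a few example code chips.
SAMPLES_PER_CHIP = 12  # arbitrary oversampling for illustration

def bpsk_waveform(chips):
    """Hold each code chip value (+1/-1) constant over its chip interval."""
    return [c for c in chips for _ in range(SAMPLES_PER_CHIP)]

def boc11_waveform(chips):
    """Multiply each chip by one full cycle of the square-wave subcarrier:
    +1 for the first half-chip, -1 for the second."""
    out = []
    for c in chips:
        for i in range(SAMPLES_PER_CHIP):
            subcarrier = 1 if i < SAMPLES_PER_CHIP // 2 else -1
            out.append(c * subcarrier)
    return out

def count_transitions(samples):
    """Count sign changes in the sampled waveform (what a code loop tracks)."""
    return sum(1 for a, b in zip(samples, samples[1:]) if a != b)

code = [+1, -1, -1, +1]  # four example chips, as in Figure 1
# BOC(1,1) always has more transitions than plain BPSK(1) on the same code.
assert count_transitions(boc11_waveform(code)) > count_transitions(bpsk_waveform(code))
```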

Although the agreement defined BOC(1,1) as the baseline for both Galileo L1 OS and GPS L1C, it left the door open for a possible signal “optimization” within the overall framework of the agreement. As documented in the paper by G.W. Hein et al., “A candidate for the GALILEO L1 OS Optimized Signal” (cited in the “Additional Resources” section at the end of this article) and many other papers, the EC Signal Task Force (STF) after much study initially recommended a composite binary coded symbols (CBCS) waveform.

Because the agreement made it desirable for GPS L1C and Galileo L1 OS to have an identical signal spectrum and because GPS III implementation of CBCS would be difficult, a search was made by a joint EC/US working group to find an optimized signal that was acceptable for both GPS and Galileo. The result is MBOC (discussed in the May/June “Working Papers” column and the like-named IEEE/ION PLANS 2006 paper by G. W. Hein et al. cited in “Additional Resources.”).

Like all modernized GPS signals — including M-code, L2C, and L5 — L1C will have two components. One carries the message data and the other, with no message, serves as a pilot carrier. Whereas all prior modernized GPS signals have a 50/50 power split between the data component and the pilot carrier, L1C has 25 percent of its power in the data component and 75 percent in the pilot carrier.

The L1C MBOC implementation would modulate the entire data component and 29 of every 33 code chips of the pilot carrier with BOC(1,1). However, 4 of every 33 pilot carrier chips would be modulated with a BOC(6,1) waveform, as illustrated in Figure 2. The upper part of the figure shows 33 pilot carrier chips. Four of these are filled to show the ones with the BOC(6,1) modulation. Below the 33 chips is a magnified view of one BOC(1,1) chip and one BOC(6,1) chip.

The BOC(1,1) chip is exactly as illustrated in Figure 1 while the BOC(6,1) chip contains six cycles of a 6.138 MHz square wave. With this image in mind, we can easily calculate that the pilot carrier has 29/33 of its power in BOC(1,1) and 4/33 of its power in BOC(6,1). Because the pilot carrier contains 75 percent of the total L1C signal power, the percent of total BOC(6,1) power is 75 × (4/33) or 9.0909+ percent. Conversely, the data signal has 25 percent of the total L1C signal power; so, the calculation of BOC(1,1) power is 25 + 75 × (29/33) or 90.9090+ percent.

Because the Galileo OS signal has a 50/50 power split between data and pilot carrier, the implementation is somewhat different in order to achieve the same percentages of BOC(1,1) and BOC(6,1) power. For the most likely time division version of MBOC for Galileo, 2 of 11 chips in the pilot carrier would be BOC(6,1) with none in the data component. Thus, the percent of total BOC(6,1) power is 50 × (2/11) or 9.0909+ percent. Similarly, the percent of total BOC(1,1) power is 50 + 50 × (9/11) or 90.9090+ percent. This makes the spectrum of Galileo L1 OS the same as GPS L1C.
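The power-split arithmetic in the two preceding paragraphs can be checked with a short calculation; the helper function below is purely illustrative:

```python
# Verify that the GPS L1C and Galileo L1 OS MBOC implementations, though
# structured differently, yield the same 90.91 % BOC(1,1) / 9.09 % BOC(6,1)
# split of total signal power.
def mboc_split(pilot_share, boc61_chips, chips_per_period):
    """Return (BOC(1,1) %, BOC(6,1) %) of total signal power.

    pilot_share      -- fraction of total power in the pilot carrier
    boc61_chips      -- BOC(6,1) chips per pattern period in the pilot
    chips_per_period -- pilot chips per pattern period
    """
    boc61 = pilot_share * boc61_chips / chips_per_period
    return 100 * (1 - boc61), 100 * boc61

gps = mboc_split(0.75, 4, 33)      # L1C: 75/25 power split, 4 of 33 pilot chips
galileo = mboc_split(0.50, 2, 11)  # OS time-multiplexed: 50/50, 2 of 11

assert abs(gps[1] - 9.0909) < 0.001 and abs(galileo[1] - 9.0909) < 0.001
```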

Code Transitions. The fundamental purpose of MBOC is to provide more code transitions than BOC(1,1) alone, as is evident in Figure 2. (A code loop tracks only the code transitions.) However, these extra transitions come on top of the increased number in BOC(1,1) compared to the L1 C/A signal.

Taking into account that the pilot carrier has either 75 percent of the signal power with GPS or 50 percent with Galileo, GPS with BOC(1,1) has 2.25 times more “power weighted code transitions” than C/A-code (a 3.5-dB increase). Galileo with BOC(1,1) has 1.5 times more (a 1.8-dB increase). MBOC on GPS would further increase the net transitions by another factor of 1.8 (2.6-dB increase), and the most aggressive version of MBOC on Galileo would increase the net transitions by a factor of 2.2 (3.4-dB increase).
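The dB figures quoted here are simply 10·log10 of the transition-count ratios, as a quick check confirms:

```python
import math

def to_db(factor):
    """Convert a linear power-weighted transition ratio to decibels."""
    return 10 * math.log10(factor)

assert round(to_db(2.25), 1) == 3.5  # GPS BOC(1,1) vs. C/A code
assert round(to_db(1.5), 1) == 1.8   # Galileo BOC(1,1) vs. C/A code
assert round(to_db(1.8), 1) == 2.6   # MBOC on GPS, further increase
assert round(to_db(2.2), 1) == 3.4   # most aggressive MBOC on Galileo
```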

Therefore, given the improvement of BOC(1,1) over C/A code, the question raised by this article is whether a further improvement in number of transitions is worth subtracting a small amount of signal power during all signal acquisitions, for all narrowband receivers, and for all receivers using the double-delta form of multipath mitigation.

A portion of Table 1 from the May/June “Working Papers” column is reproduced here, also as Table 1. Of the eight possible waveforms in the original table, only three are included here. These are representative of all the options, and they include the two versions of MBOC considered most likely for implementation in Galileo and the only version GPS would use.

Two new columns have been added in our abbreviated version of the table. The first is an index to identify the particular option, and the last identifies whether GPS or Galileo would use that option.

Receiver Implementations

Most GNSS receivers will acquire the signal and track the carrier and code using only the pilot carrier. For GPS L1C this choice is driven by the fact that 75 percent of the signal power is in the pilot carrier. Little added benefit comes from using the data component during acquisition, and none for code or carrier tracking, especially with weak signals.

For Galileo, the decision is driven by the data rate of 125 bits per second (bps) and the resulting symbol rate of 250 symbols per second (sps). This allows only 4 milliseconds of coherent integration on the Galileo data component (compared with 10 milliseconds on the GPS data component). Because coherent integration of the pilot carrier is not limited by data rate, it predominantly will be the signal used for acquisition as well as for carrier and code tracking.

Reflecting the reasons just stated, Figure 3 compares the spectral power density in the pilot carrier for each of the three signal options listed in Table 1. In each case the relevant BOC(1,1) spectrum is shown along with one of the three MBOC options. These plots show power spectral density on a linear scale rather than a logarithmic dB scale, which renders small differences more prominent.

The center panel shows the GPS case with either BOC(1,1) or TMBOC-75. The BOC(1,1) peaks are arbitrarily scaled to reach 1.0 watt per hertz (W/Hz). The BOC(1,1) peaks of TMBOC-75 are lower by 12 percent (-0.6 dB) in order to put additional power into the BOC(6,1) component of TMBOC-75, primarily at ±6 MHz.

All three panels of Figure 3 have the same relative scaling. The reason the peaks of the BOC(1,1) components in panels 1 and 3 are at 0.67 W/Hz is that GPS L1C will transmit 75 percent of its total signal power in the pilot carrier whereas Galileo will transmit 50 percent. The difference is simply 0.5/0.75 = 0.67 (-1.8 dB).

The first panel of Figure 3 also shows the Galileo TMBOC-50 option in which the BOC(1,1) component peaks are lowered by 18 percent (-0.9 dB) in order to provide power for the BOC(6,1) component, primarily at ±6 MHz.

The third panel shows the same Galileo BOC(1,1) power density but with the CBOC-50 option. In this case the BOC(6,1) component exists in the data channel as well as the pilot carrier. That is why its amplitude at ±6 MHz is half that shown in panels 1 and 2. That also is why less power is taken from the BOC(1,1) component for the BOC(6,1) component; in this case the reduction is 9 percent (-0.4 dB). This is not considered an advantage by those who want to track the BOC(6,1) component, and it also reduces the data channel power for narrowband receivers by the same 9 percent, or 0.4 dB.
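The peak reductions cited for Figure 3 follow directly from the chip fractions: removing a fraction f of the pilot’s power from the BOC(1,1) component lowers its spectral peak by 10·log10(1 − f) dB. A quick check of the quoted values:

```python
import math

def reduction_db(f):
    """dB change in the BOC(1,1) spectral peak when a fraction f of the
    component's power is reallocated to BOC(6,1)."""
    return 10 * math.log10(1 - f)

assert round(reduction_db(4 / 33), 1) == -0.6   # TMBOC-75 (GPS pilot)
assert round(reduction_db(2 / 11), 1) == -0.9   # TMBOC-50 (Galileo pilot)
assert round(reduction_db(1 / 11), 1) == -0.4   # CBOC-50 (Galileo)
# Panel scaling: a 50 % pilot share vs. 75 % is 0.5/0.75 = 0.67, or -1.8 dB.
assert round(10 * math.log10(0.5 / 0.75), 1) == -1.8
```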

As stated before, the fundamental question raised by this article is whether we should rob Peter to pay Paul. As with all such top-level questions, the answers lie in the details and in the perceptions of those affected. Inside GNSS posed a series of questions to industry experts in order to explore their perspectives and preferences.

The Questions and Answers

Q: What segment of the GNSS market do your answers address? Describe your market, including typical products and the size of the market.

Fenton – High precision survey and mapping, agriculture/machine control, unmanned vehicles, scientific products, and SBAS ground infrastructure where centimeter accuracy is very important. NovAtel sells at the OEM level to software developers and system integrators and calculates its present total addressable market (TAM) at $300-$400 million USD, again at the OEM level.

Garin – We are focused on consumer electronics where very low cost and very low power are of critical importance, such as personal navigation devices (PNDs), cellular phones, and in general applications where the power consumption is at a premium. These objectives should be reached with little to no impact on the user experience. The loss of performance due to design tradeoffs is mitigated by assisted GPS (A-GPS).

Hatch /Knight – NavCom supplies high-precision, multi-frequency GNSS receivers that employ advanced multipath and signal processing techniques, augmented by differential corrections from our StarFire network. These receivers are widely used in the agriculture, forestry, construction, survey, and offshore oil exploration markets. Current market size is on the order of 100,000 units per year.

Sheynblat/Rowitch – Our answers address wireless products for the consumer, enterprise, and emergency services markets. There are over 150 million Qualcomm GPS enabled wireless handsets in the market today, and this large market penetration and heavy usage is primarily driven by low cost, low power, and high sensitivity. The vast majority of other GPS enabled consumer devices worldwide are also cost driven.

Stratton – Rockwell Collins is a leading provider of GPS receivers to the U.S. military and its allies, and we are also a major supplier of GNSS avionics to the civil aviation industry. The civil aviation applications demand high integrity and compatibility with augmentation systems, while the military requirements range from low-power, large-volume production to high-dynamic and highly jam-resistant architectures (as well as civil compatible receivers). Military receivers are impacted due to civil compatibility requirements. Our company has produced over a half million GPS receivers and has a majority market share in military and high-end civil aviation (air transport, business, and regional) markets.

Studenny – Our market is commercial aviation where continuity of operation and integrity are the most important performance parameters.

Weill – I and a colleague, Dr. Ben Fisher, of Comm Sciences Corporation, are the inventors of a new multipath mitigation approach which we call Multipath Mitigation Technology (MMT), so our primary product is technology for improved multipath mitigation. MMT is currently incorporated in several GPS receivers manufactured by NovAtel, Inc. Their implementation of MMT is called the Vision Correlator.

Q: Which signal environments are important for your products: open sky, indoor, urban canyon, etc.

Fenton – In general, most of our customers operate in open sky environments. However, a significant number are operating under or near tree canopy and in urban canyons.

Garin – Ninety percent of our applications are or will be indoors and in urban canyons.

Hatch /Knight – Our receivers are mostly used in open sky and under-foliage conditions.

Stratton – Our products use civil signals mainly in open sky conditions, although civil signals may be used to assist the acquisition of military signals in a broad variety of environments.

Studenny – Aircraft environments, with particular attention to safety-of-life. Also, ground-based augmentation system (GBAS) ground stations.

Weill – Any environment in which multipath is regarded as a problem, including precision survey, indoor (911) assisted GPS, and military and commercial aviation.

Q: Which design parameters are most critical for your products: power, cost, sensitivity, accuracy, time to fix, etc.

Fenton – In general, our products service the high end “commercial” markets. Our customers in general have priorities in the following order: a) accuracy, b) robust tracking, c) cost, d) power, e) time to first fix.

Garin – The most important criteria are, from the highest to the lowest: power, cost, sensitivity, time-to-first-fix, and finally, accuracy.

Hatch /Knight – Accuracy is most important.

Sheynblat/Rowitch – We have invested substantial engineering effort to achieve market-leading sensitivities (-160 dBm) while maintaining very low receiver cost. Engineering investment, focus on sensitivity, and close attention to cost models is probably also true for other vendors focused on mass market, AGPS enabled devices that have to work indoors. All of these GPS vendors go to great lengths to improve sensitivity for difficult indoor scenarios. Every dB counts and may make the difference between a successful or a failed fix, which is of particular concern for E-911 and other emergency situations.

Stratton – The tradeoff in relative importance of these parameters varies widely depending on the particular application, though life-cycle cost (including development and certification) arguably is most significant.

Studenny – Actually, all parameters are important. However, we focus on safety-of-life and the drivers are both continuity of operation and integrity (hazardously misleading information or HMI).

Specifically, we believe cross-correlation, false self-correlation, and the ability to resist RFI, as well as improving multipath performance, are signal properties of great interest to us. A well-selected coding scheme minimizes all of these and HMI in particular. Finally, HMI may become a legal issue for non-aviation commercial applications, especially if those applications involve chargeable services, implied safety-of-life, and other such services.

Weill – MMT is most effective in receivers that have high bandwidth and are receiving high-bandwidth signals. However, it can substantially improve multipath performance at lower bandwidths.

Q: Do you really care whether GPS and Galileo implement plain BOC(1,1) or MBOC? Why?

Fenton – Yes, we expect that the MBOC signals combined with the latest code tracking techniques will provide a majority of our customers a significant performance benefit for code and carrier tracking accuracy in applications where multipath interference is a problem.

Garin – I do not believe that MBOC will significantly benefit our short-term market. The expected MBOC multipath performance improvement will be meaningless in the urban context, where the dominant multipath is non-line-of-sight and where the majority of mass market usage is concentrated. However, we believe that a higher-accuracy, carrier-phase mass market will emerge within a five-year timeframe, with back-office processing capabilities and wirelessly connected field GPS sensors. This will be the counterpart of the A-GPS architecture in the cell phone business. MBOC would have an important role to play in this perspective. We envision this new market only in benign environments, and not geared toward surveyors or GIS professionals.

Hatch /Knight – The MBOC signal will significantly improve the minimum code tracking signal to noise ratio where future multipath mitigation techniques are effective. The expected threshold improvements will be approximately equal to the best case improvements indicated by this article. MBOC will be less beneficial to very strong signals where the noise level is already less than the remaining correlated errors, like troposphere and unmitigated multipath.

Designing a receiver to use the MBOC code will be a significant effort. The resulting coder will likely have about double the complexity of the code generator that does not support MBOC. There will be a small recurring cost in silicon area, and power consumption will increase significantly. Overall, MBOC is desirable for our high performance applications. For many applications the costs are greater than the benefits.

Sheynblat/Rowitch – Yes, we do care about the decision of BOC versus MBOC. The proposed change to the GPS L1C and Galileo L1 OS signal to include BOC(6,1) modulation will perhaps improve the performance of a very tiny segment of the GPS market (high cost, high precision) and penalize all other users with lower effective received signal power due to their limited bandwidth. We prefer that GPS and Galileo implement the BOC(1,1) signal in support of OS location services.

Stratton – This decision does not appear to have much influence on our markets when viewed in isolation, but we would like to see GPS make the best use of scarce resources (such as spacecraft power) to provide benefits that are attainable under realistic conditions.

Studenny – Yes, we do care. GPS L5 needs to be complemented by a signal with similar properties at L1, because a momentary outage on either L1 or L5 during precision approach should not affect CAT-I/II/III continuity or integrity. We understand that there are constraints in selecting a new L1 signal; however, the proposed MBOC waveform better supports this goal. This is in keeping with the FAA NAS plans and the transition to GNSS for all phases of flight, including precision approach.

Weill – Yes. Comm Sciences has established that the performance of current receiver-based multipath mitigation methods is still quite far from what is theoretically possible. It is also known that GNSS signals with a wider RMS bandwidth have a smaller theoretical bound on ranging error due to thermal noise and multipath. Since multipath remains as a major source of pseudorange error in GNSS receivers, I feel that the use of an MBOC signal for GPS and Galileo is an opportunity to provide the best possible multipath performance with evolving mitigation methods that take advantage of the larger RMS bandwidth of an MBOC signal as compared to plain BOC(1,1).

Q: Are the GNSS receivers of interest narrowband (under ±5 MHz) or wideband (over ±9 MHz)?

Fenton – Wideband. High precision GNSS receivers typically process all available bandwidth ~20 MHz (±10 MHz).

Garin – Our GNSS receivers are narrowband today, but we expect their IF bandwidths (or, equivalently, their effective bandwidths) to widen to ±9 MHz in the next 3-5 years, with the same or lower processing and power consumption.

Hatch /Knight – Our receivers are primarily wideband.

Sheynblat/Rowitch – The receivers of interest are narrowband. Low cost GPS consumer devices do not employ wideband receivers today and will most likely not employ wideband receivers in the near future. Any technology advances afforded by Moore’s law will likely be used to further reduce cost, not enable wideband receivers. In addition, further cost reductions are expected to expand the use of positioning technology in applications and markets which today do not take advantage of the technology because it is considered by the manufacturers and marketers to be too costly.

Stratton – All of our markets require wide-band receivers; however, the civil receiver/antenna RF characteristics are adapted to high-bandwidth C/A processing (where the bulk of RF energy is at band center). So the MBOC signal does raise some potential compatibility questions.

Studenny – Wideband.

Weill – I believe the trend will be toward wideband receivers for most applications. If one looks at the history of GPS receiver products, it is clear that there has always been competitive pressure to increase positioning accuracy, even at the consumer level. Not only is better accuracy a marketing advantage, but it has also opened up entirely new applications. The availability of wide bandwidth signals is a key factor in continuing to improve positioning accuracy. Although currently available receivers that can take advantage of wider bandwidth signals cost more and consume more power, the rapid rate of improving digital technology should make low-cost, low-power, wide bandwidth receivers available in the not-so-distant future. The availability of an MBOC signal would maximize the capability of such receivers.

Other Observers

Inside GNSS invited comments from a broad range of companies representative of most GNSS markets. In addition to those who fully responded to our questions, several offered abbreviated remarks.

Garmin International, Inc. did not identify a spokesperson, but it submitted the following official statement: “It is Garmin’s policy not to disclose any information about future designs. However, we would like to indicate that we support the BOC(1,1) implementation over the MBOC.”

Sanjai Kohli, Chief Technology Officer of SiRF Technology Inc., submitted the following official statement: “The existence of the BOC(6,1) chips in the MBOC signal won’t matter very much to SiRF. Still, to maximize the availability of weak signals, it would be preferable not to suffer any loss of signal power. Therefore, SiRF would prefer that all chips be BOC(1,1). Furthermore, it is doubtful that any advanced method of multipath reduction will be of much benefit for urban and indoor signal reception, since it is likely that the line-of-sight component of the weak signal is blocked.”

European Company – A large and well known European consumer products company could not obtain internal approval to answer the questions, but the following unofficial communication from a technical manager is of interest: “Our understanding about the pros and cons of MBOC as compared with BOC(1,1) is . . . that narrow-band receivers are not able to utilize the higher frequency components of the MBOC signal and they thus represent wasted power from their viewpoint. This is especially true for acquisition, because the acquisition bandwidth many times seems to be narrower than the tracking bandwidth, especially in those parallel acquisition receivers that are used in consumer products specified for weak signal operation. For such receivers the received signal power is critical in the acquisition phase, not so much in the tracking phase.”

L1C, BOC, and MBOC

Pertinent to the subject of this article is the remarkable way in which the L1C signal was designed. The original C/A- and P-code signals were designed by a small group of technologists under the direction of the GPS Joint Program Office (JPO). Although from the beginning GPS was understood to be a dual-use (civil and military) system, the signals were designed primarily from a military perspective.

Design of the L2C civil signal was led by a JPO deputy program manager representing the Department of Transportation (DoT) — but the process took place under extreme time pressure. The RTCA, Inc., with authorization from the Federal Aviation Administration (FAA), initially defined the L5 signal. The RTCA is a consensus-driven open forum, but its focus is almost exclusively on aviation.

In contrast, development of L1C was funded by the Interagency GPS Executive Board (IGEB), now superseded by the National Space-Based Positioning, Navigation, and Timing (PNT) Executive Committee. Representatives of the Department of Defense (DoD) and DoT co-chair the PNT Executive Committee: so, the central focus is on managing GPS as a dual-use utility. Reflecting this, the L1C project was co-chaired by a DoD representative and by a civil representative. (The civil co-chair was Dr. Ken Hudnut of the U.S. Geological Survey. A sequence of JPO officers represented the DoD: Captains Bryan Titus, Amanda Jones, and Sean Lenahan. Tom Stansell of Stansell Consulting served as project coordinator throughout.)

L1C development consisted of two key activities. The first was a study of the wide range of civil requirements and development of five signal structure options. A technical team conducted this part of the work, drawing on experts in all aspects of the signal, including spreading code, data modulation, forward error correction, and message format.

Several team members had deep experience developing civil user equipment, from consumer chipsets to high-precision survey receivers. Others were experts on aviation requirements. The second key activity is, to our knowledge, unique: a worldwide survey of GNSS experts to determine which of the five options to choose. The design process is complete, and a draft specification (IS-GPS-800) has been published.

The innovative MBOC proposal was developed quickly by a group of very competent U.S. and EU signal experts with both civil and military backgrounds. However, this team apparently had only one person with extensive experience in receiver manufacturing, and the timeline did not allow the opportunity for a broad survey to assess equipment manufacturers’ opinions about the design. Informal conversations with some industry representatives also revealed dissatisfaction with MBOC. Therefore, Inside GNSS decided to consult a number of experts from companies that build GNSS equipment to determine their thoughts about the MBOC concept.

Additional Resources

1. Agreement on the Promotion, Provision and Use of Galileo and GPS Satellite-Based Navigation Systems and Related Applications, June 26, 2004, http://pnt.gov/public/docs/2004-US-EC-agreement.pdf

2. Hein, G. W., J.-A. Avila-Rodriguez, L. Ries, L. Lestarquit, J.-L. Issler, J. Godet, and T. Pratt, “A Candidate for the GALILEO L1 OS Optimized Signal,” Proceedings of ION GNSS 2005, September 13-16, 2005, Long Beach, California

3. Hein, G. W., and J-A. Avila-Rodriguez, S. Wallner, J. W. Betz, C. J. Hegarty, J. J. Rushanan, A. L. Kraay, A. R. Pratt, S. Lenahan, J. Owen, JL. Issler, T.A. Stansell, “MBOC: The New Optimized Spreading Modulation Recommended for Galileo L1 OS and GPS L1C”, Inside GNSS, Volume 1, Number 4, pp 57–65, May/June 2006

4. Hein, G. W., and J-A. Avila-Rodriguez, S. Wallner, A. R. Pratt, J. Owen, J-L. Issler, J. W. Betz, C. J. Hegarty, S. Lenahan, J. J. Rushanan, A. L. Kraay, T.A. Stansell, “MBOC: The New Optimized Spreading Modulation Recommended for GALILEO L1 OS and GPS L1C”, IEEE/ION PLANS 2006, April 24-27, 2006, San Diego, California

5. IS-GPS-200: NAVSTAR GPS Space Segment / Navigation User Interfaces; IS-GPS-705: NAVSTAR GPS Space Segment / User Segment L5 Interfaces; Draft IS-GPS-800 for new L1C signal; http://gps.losangeles.af.mil/engineering/icwg/

May 1, 2006

Mobile RTK: Using Low-Cost GPS and Internet-Enabled Wireless Phones

Government regulation such as E911 and the promise of location-based services (LBS) are the biggest drivers for integrating positioning capability into mobile phones. The increasing sophistication of applications and refinement of map databases are continually tightening the accuracy requirements for GNSS positioning. In particular, location-based games and features such as “friend finder” sometimes require better accuracy than what is achievable with state-of-the-art network-assisted GPS (A-GPS) platforms.

Cellular standards for GPS assistance data exist for both control plane and user plane protocols. These protocols carry information that helps the integrated GPS receiver improve its sensitivity, speed up signal acquisition, and especially reduce the time to first fix. However, these approved standards do not contain sufficient information for the receiver to do carrier phase positioning.

Until now, no compelling reason existed for adding carrier phase positioning features to cellular standards so that handsets could employ real-time kinematic (RTK) techniques. Generally, the RTK-enabled devices on the market are expensive and intended primarily for geodetic and survey applications. Also, there has been no real need in the cellular world for the accuracy RTK provides. With evolving LBS applications, however, this situation is changing.

This article describes a solution called mobile RTK (mRTK), a system specifically designed and implemented for cellular terminal use. Its design incorporates low-cost single-frequency A-GPS receivers, Bluetooth (BT) communications, and inertial sensors.

Basically, the technique involves exchanging measurements in real-time between two units — one designated as the reference and the other as the user terminal — and producing the best possible estimate of the baseline between the terminals using RTK techniques. We are developing the solution so that in the future it will be possible to add any other Global Navigation Satellite System (GNSS) measurements in addition to GPS measurements — or even instead of GPS measurements.
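The differencing at the heart of this measurement exchange can be sketched in a few lines. The sketch below uses made-up carrier phase values (in cycles) rather than anything from the article, and it stops short of the hard part — a real mRTK implementation would also resolve the integer ambiguities and weight the measurements:

```python
# Toy sketch of double-differenced carrier phase, the core RTK observable.
# Values are illustrative only; units are carrier cycles.

def single_difference(user_phase, ref_phase):
    """Difference each satellite's phase between the two receivers.

    This cancels the satellite clock error and, on short baselines,
    most of the atmospheric delay.
    """
    return {sat: user_phase[sat] - ref_phase[sat] for sat in user_phase}

def double_difference(sd, pivot):
    """Difference the single differences against a pivot satellite.

    This cancels both receiver clock errors, leaving baseline geometry
    plus an integer ambiguity per satellite pair.
    """
    return {sat: sd[sat] - sd[pivot] for sat in sd if sat != pivot}

# Hypothetical phase measurements exchanged between the two terminals.
user = {"G01": 1200.25, "G07": 980.75, "G12": 1530.50}
ref  = {"G01": 1190.00, "G07": 975.25, "G12": 1521.00}

sd = single_difference(user, ref)
dd = double_difference(sd, pivot="G01")
print(dd)  # {'G07': -4.75, 'G12': -0.75}
```

The double differences are what the baseline estimator actually fits; swapping in measurements from another GNSS only adds entries to the dictionaries.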

Using a simulator, we shall provide data that show it is possible to enable high-precision, carrier phase-based positioning in handsets with minimal additional hardware costs. Further, we shall describe some of the protocol aspects and especially the aspects of adding support for mRTK messaging to already existing cellular standards — GSM and UMTS. We believe that the mRTK solution will bring high performance to the mass market.

Moreover, additional GPS signals, such as L2C and L5, and other GNSSes such as Galileo will become operational in the near future. Consequently, it would be very beneficial to begin incorporating mRTK into the pertinent wireless standards now so that the infrastructure and the service providers will be ready when business opportunities present themselves.

. . .

mRTK Solution Overview
A plethora of RTK surveying solutions is available on the market today. Generally, they are characterized by the use of both GPS frequencies, L1 and L2, enabling ambiguity resolution in seconds over baselines of up to 20 kilometers, or even 100 kilometers with more time and under good conditions. We must emphasize that this article does not claim performance and reliability comparable to those of high-performance dual-frequency receivers.

We are designing the mRTK solution to work with low-cost, off-the-shelf GPS receivers with certain requirements (for example, the ability to report carrier phase measurements and data polarity). Therefore, performance degradations are expected in terms of time to ambiguity resolution, accuracy, and achievable baseline length.

. . .

Testing the System
The mRTK performance testing was accomplished using two identical hardware platforms containing 12-channel off-the-shelf high-sensitivity OEM GPS receiver modules and a 3-axis accelerometer. We constructed this test system to determine the physical limitations and requirements for the protocol and messaging aspects.

. . .

Performance
We conducted several experiments using the testing system and a GPS simulator. The simulator was configured to output data from the same eight satellites to both receivers, using several baseline lengths varying from 0 meters to approximately 5 kilometers and scenarios from different GPS weeks.

. . .

Testing Protocol
The testing protocol used in the mRTK solution was designed specifically for use in research and development and as a reference design for proposed changes to the pertinent cellular standards. The protocol was designed to be as efficient as possible and especially to take advantage of the properties of TCP/IP. As TCP/IP already guarantees that transmitted data are error-free and also preserves the order of the data, our protocol did not need to include extensive error corrections and packet order counts.
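A minimal framing scheme in that spirit might look like the following sketch. The message type code and payload format here are hypothetical, not taken from the article; the point is that TCP's in-order, error-free delivery lets the frame header stay tiny:

```python
import struct

# Minimal length-prefixed framing for measurement messages over TCP.
# Because TCP already delivers bytes error-free and in order, a 4-byte
# payload length plus a 1-byte message type suffices: no checksums or
# sequence counters are needed in the application protocol.

MSG_CARRIER_PHASE = 0x01  # hypothetical message type code

def encode(msg_type: int, payload: bytes) -> bytes:
    """Frame one message as [length:4][type:1][payload]."""
    return struct.pack(">IB", len(payload), msg_type) + payload

def decode(frame: bytes):
    """Parse one frame back into (type, payload)."""
    length, msg_type = struct.unpack(">IB", frame[:5])
    return msg_type, frame[5:5 + length]

frame = encode(MSG_CARRIER_PHASE, b"sat=G07;phase=980.75")
msg_type, payload = decode(frame)
```

On a raw datagram transport the same protocol would need error detection and reordering logic, which is exactly the work the article's design avoids.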

. . .

Cellular Protocol Aspects
During the testing protocol design and implementation, several issues emerged concerning the addition of the mRTK feature to cellular protocols . . . User-to-user relative positioning is not recommended for control plane systems because it would require substantial protocol and implementation work to make the binding of two terminals, and the relaying of measurements between them, actually work.

. . .

Future Work
This article has introduced a new concept called mobile real-time kinematics and has shown that RTK-like features are possible using low-cost components and existing cellular communication carriers. Even though a lot of development work remains on the mRTK algorithm side, the biggest challenge still involves cellular carriers and their standardization. Of course, even after standardization, the development of the infrastructure would require a huge effort.

Future work with the existing testing protocol includes more testing, especially field testing, and testing with different signal conditions and satellite constellations. The testing protocol itself should be extended with new features such as a virtual reference station (VRS) service. Using VRS, the baseline can always be kept very short, and accurate absolute positioning becomes available everywhere using mRTK.

One idea that also needs further development is peer-to-peer protocols, in which the mRTK measurements would be transmitted directly from one terminal to another without a server in between.

As an example, this kind of protocol could be embedded into voice-over-IP (VoIP), in which the data channel for the voice encoding is already open and could easily accommodate other data transmissions without strict real-time requirements, such as mRTK. Other peer-to-peer transports exist as well, for instance in WLAN, where terminals connected to the same subnet can open direct connections to each other.

The solution we have presented holds a lot of potential. Especially with the forthcoming satellite systems (e.g., Galileo and modernized GPS), the solution will significantly improve the accuracy of positioning in the mobile terminal. Nonetheless, the standardization of the mRTK features will require a lot of joint effort among terminal and network manufacturers and cellular operators.


Acknowledgments
This article is based in part on two papers, “Bringing RTK to Cellular Terminals Using a Low-Cost Single-Frequency AGPS Receiver and Inertial Sensors,” by L. Wirola, K. Alanen, J. Käppi, and J. Syrjärinne, and “Inertial Sensor Enhanced Mobile RTK Solution Using Low-Cost Assisted GPS Receivers and Internet-Enabled Cellular Phones,” by K. Alanen, L. Wirola, J. Käppi, J. Syrjärinne, presented at the IEEE/ION PLANS 2006 conference, © 2006 IEEE.

April 1, 2006

Geodesy and Satellite Navigation

There has always been a love-hate relationship between geodesy and satellite navigation. Indeed, satellite positioning started life as an extension of terrestrial geodesy. When the first satellite, Sputnik 1, started orbiting the Earth in 1957, geodesists in several countries realised that satellites offered substantial potential as a geodetic positioning and navigation tool.

The basic technologies of terrestrial geodesy of the day, notably triangulation, traversing, and precise leveling, were slow and cumbersome, mainly because of the effect of the curvature of the surface of the Earth, which limited the range of measurements to theodolite observations between points situated on hilltops, observation towers, and triangulation masts.

The advent of EDM (electronic distance measurement) in the 1960s helped terrestrial geodesy, but it, too, was affected by the same limitation, namely the shortness of observable EDM ranges due to the Earth’s curvature.

Earth-orbiting satellites did not suffer from this drawback. They could be viewed simultaneously from several points on Earth, allowing direction and range measurements to be made, provided that the space vehicles were not obscured by high natural features or tall man-made structures. This led to several new satellite geodesy positioning methodologies.

The first of these was satellite triangulation, which was used initially to supplement and strengthen terrestrial triangulation networks. Satellite triangulation consisted of geodetic direction measurements derived from photographs of satellite orbits taken against a background of stars with known right ascension and declination.

A few years later, this was followed by range measurements to satellites, made from Earth-bound EDM equipment to corner cube reflectors placed on the early satellites. The methodology used thus far was an extension of geodetic astronomy, with little reference to physical geodesy.

This situation changed significantly when geodesists realized that they could use the Doppler shift on the signal broadcast from a satellite to obtain differential range measurements that, together with the known Keplerian orbit of the satellite, could lead to a relatively fast positioning, or navigation, method. The Keplerian orbital motion of satellites is primarily based on the Earth’s gravity field, a subject of expertise by practitioners of physical geodesy.
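The Doppler-to-range chain can be made concrete with a small sketch. The numbers below are illustrative, not historical Transit data: the measured frequency offset yields a range rate, and integrating the rate between epochs yields the differential range that, combined with the known orbit, fixes the observer's position.

```python
# Doppler navigation in miniature: frequency shift -> range rate ->
# integrated (differential) range between two epochs. Illustrative values.

C = 299_792_458.0  # speed of light, m/s

def range_rate(f_transmit_hz, f_received_hz):
    """Range rate in m/s from the measured Doppler shift (positive = receding)."""
    return C * (f_transmit_hz - f_received_hz) / f_transmit_hz

def differential_range(rates, dt):
    """Trapezoidal integration of range rates sampled dt seconds apart."""
    return sum((a + b) / 2.0 * dt for a, b in zip(rates, rates[1:]))

f_t = 400e6  # a broadcast frequency near Transit's 400 MHz channel
# Hypothetical Doppler offsets (Hz) as the satellite recedes more slowly:
rates = [range_rate(f_t, f_t - d) for d in (8000.0, 6000.0, 4000.0)]
dr = differential_range(rates, dt=10.0)  # meters of range change over 20 s
```

Accumulating enough of these differential ranges over a satellite pass is what took several minutes, which is exactly the limitation discussed below.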

This technical advance gave birth to Transit-Doppler, the first satellite navigation technology. Transit-Doppler was used in the late 1970s and early 1980s not only for the positioning of naval ships and of submarines surfacing in the polar regions, but also for the strengthening and scaling of national and continental terrestrial triangulation networks.

However, practitioners soon realized that positioning by Transit-Doppler to a reasonable degree of accuracy took several minutes, which precluded its use as a full navigation methodology, since navigation requires quasi-instantaneous positioning.

Enter GPS
These were the early days of a new global satellite positioning, navigation, and timing system, first called the NAVSTAR Global Positioning System, a name later shortened to just GPS. The rest is history. The early decision to base GPS on a constellation of 24 medium-Earth orbit satellites was taken on the advice, as you would expect, of geodesists at the U.S. Naval Surface Weapons Center in Dahlgren, Virginia.

The close relationship between the early GPS and geodesy was further demonstrated by the adoption of WGS84, the World Geodetic System 1984, as the basis of the 3-D coordinate system of GPS. As GPS was born during the Cold War, it was declared a U.S. military navigation system, with full access to NATO but only restricted access and downgraded positioning accuracies for civilian users.

This so-called Selective Availability (SA) gave the green light to the civilian geodetic community to come up with new methodologies that could counter the effects of SA. As always, human ingenuity did not disappoint, and two new differential techniques were developed. The first was the differential GPS (DGPS) technique, which improved relative positioning accuracies of GPS by about one order of magnitude, down to a few meters. As a result, DGPS soon became the standard methodology for the offshore positioning of oil platforms, pipelines, etc.

The next advance in improving the accuracy of satellite positioning was made on the advice of radio-astronomers, who proposed replacing the standard GPS pseudorange measurements, which are based on timing the modulated signal from satellite to receiver.

Instead, they suggested making measurements on the basic carrier frequencies of these signals, just as they did with extra-galactic signals arriving at, say, two widely spaced radio telescopes in so-called very long baseline interferometry (VLBI), leading as a by-product to the Cartesian coordinate differences between the two telescopes. This was the beginning of centimetric positioning by the carrier phase GPS method, which was later developed further by geodesists into kinematic GPS and centimetric navigation.

GPS had now become the universal high precision quasi-instantaneous positioning and navigation tool, creating the basis for hundreds of new applications. Again, geodesists led the way, concentrating on high precision scientific and engineering applications. These included surveying and mapping, positioning in offshore engineering, the monitoring of local crustal dynamics and plate tectonics, the relative vertical movements of tide gauges, and the continuous 3-D movements of critical engineering structures, such as tall buildings, dams, reservoirs, and long suspension bridges.

All of these applications required very high relative positioning accuracies, but not quasi-instantaneously as in the safety-critical navigation and landing of civilian aircraft. This came much later.

Geodesy and Navigation
Initially, GPS was considered as a standard navigation tool for military vehicles on land, sea, and air, but not for safety-critical civilian transportation. This was because, unlike military positioning and navigation, safety-critical civilian transportation not only requires quasi-instantaneous and accurate positioning, but also so-called “high integrity and good coverage.”

Geodesists will immediately realize that “integrity” stands for the geodetic concept of “reliability,” whereas “coverage” refers to the availability of a sufficient number of satellites that can be sighted by a receiver continuously and are not obscured by natural or man-made obstructions, such as high mountains, tall buildings, and the wings of an aircraft.

On its own, GPS cannot meet these requirements to the level required in safety-critical civilian transportation. Military transportation, on the other hand, has relatively modest requirements, which can be met by GPS. Indeed, you do not become a NATO Air Force pilot if you want a safe life. Flying as a passenger in a commercial airliner is something else altogether.

The penetration of satellite navigation, and primarily GPS, into civil aviation involved yet again, as you would expect, geodesists. They had to develop jointly with the civil aviation community the necessary theoretical and practical tools, which could be used to establish and quantify their requirements of accuracy, integrity, and coverage.

This involved the use of existing geodetic tools, such as the covariance matrix, the analysis of least squares residuals, and the well-established geodetic reliability measures. New tools were also introduced, such as the concept of RAIM or receiver autonomous integrity monitoring, based on the analysis of the least squares residuals.
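The idea behind residual-based RAIM can be illustrated with a deliberately simplified, one-dimensional sketch. A real receiver solves for 3-D position and clock bias and applies a formal statistical test to the residuals; here the problem is reduced to redundant estimates of a single quantity so the role of the residuals stands out:

```python
from statistics import mean

# Simplified RAIM-style consistency check (illustrative only): with more
# measurements than unknowns, a faulty measurement shows up as a large
# least-squares residual. For a single unknown, the least-squares
# solution is just the mean of the estimates.

def raim_check(estimates, threshold):
    """Return (solution, indices of estimates whose residual exceeds threshold)."""
    solution = mean(estimates)
    residuals = [e - solution for e in estimates]
    faulty = [i for i, r in enumerate(residuals) if abs(r) > threshold]
    return solution, faulty

# Five consistent range-derived estimates plus one faulty outlier (meters).
estimates = [100.2, 99.8, 100.1, 99.9, 100.0, 112.0]
solution, faulty = raim_check(estimates, threshold=5.0)
print(faulty)  # [5]
```

The key prerequisite is redundancy: with only as many measurements as unknowns, the residuals are zero by construction and no fault can be detected, which is why RAIM needs extra satellites in view.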

Persuading Non-Geodesists
These geodetic tools, which were highly beneficial to the civil aviation community, initiated a fruitful, long-term collaboration between the two communities. However, this has not always been a straightforward and smooth relationship, and it involved — especially at the beginning — a deep suspicion of these “academic” geo-scientists. Here are a few notable examples of this love-hate relationship.

As a general rule, the existing civil aviation horizontal coordinates were based on latitudes and longitudes, with no particular geodetic datum. Heights in civil aviation were, and still are, based on barometric altimetry, on the assumption that all that matters is “the relative heighting between airplanes,” which is not affected significantly by a change in barometric pressure.

This assumption disregards, of course, the fact that the heights of natural features on the ground, such as mountains, do not change with changing barometric pressure. The first challenge was to convince the international civil aviation community that their horizontal coordinates, that is, latitudes and longitudes, required a proper geodetic datum and, as GPS was being contemplated as a future navigation tool, it made sense to adopt the same reference datum, namely WGS84. It took a while to convince the community to accept that.

The adoption of WGS84 led to the resurveying of most airports, runways, and various en route and landing navigation aids in order to bring them into WGS84, in preparation for the introduction of GPS. This led to the discovery of some large discrepancies, at airports and among navaids in many countries, between the existing horizontal coordinate values and their new WGS84 equivalents. Geodesists will be familiar with such occurrences, whenever they start dealing with a new community, whether they are civil or offshore engineers, oceanographers or meteorologists.

The first GPS receivers did not lend themselves to mass market adoption. Geodesists of a certain age will also remember some of the earliest commercial GPS receivers, such as the TI 4100 receivers, made by Texas Instruments. These early receivers operated by measuring sequentially four pseudoranges to four different satellites. Consequently, the receivers were programmed to first check the geometry of the satellites in view and decide on the best four in terms of geometrical configuration.

However, later on, with the emergence of new receivers that could measure all the available pseudoranges quasi-simultaneously, there was no need to carry on with measurements only to the “best four” satellites. One could track all available satellite signals and process these measurements by least squares, rejecting those with relatively large residuals, if any. This standard processing of observations is bread-and-butter stuff to surveyors and geodesists.

However, this was not the case with a number of navigation experts, who persisted in recommending the use of only the “best four” satellites for quite some time before they finally abandoned the practice.

A New Era of GNSS
Satellite navigation and positioning has changed substantially and significantly over the last 5 to 10 years. With Galileo in its development and in-orbit validation phase, the future developments in GPS IIF and GPS III, renewed interest in GLONASS, and satellite navigation initiatives in Japan, China, India, Australia, and several other countries, GNSS or the Global Navigation Satellite System is moving from being a concept, largely based on GPS alone, to a full global reality. A comprehensive program of GPS modernization currently under way aims to deliver significant improvements to both military and civil users.

The earliest mass-market applications of GPS involved road vehicles and mobile phones. In both cases, the twin aims are navigation (where am I, and how do I go to my destination?) and tracking (where is he, she, or it?). In the case of road vehicle tracking, successful applications include fleet monitoring (taxis or road transport companies), theft recovery of private cars, “black box” incident recorders, and the transport of hazardous or valuable cargoes.

Typically, most of these applications share three common features, namely prior knowledge of the proposed route, the continuous tracking of position and velocity by GPS, and the trigger of an alarm by a significant deviation.

Similarly, a number of GPS tracking applications use mobile phone technology (GSM or GPRS), but these are not as developed and widespread as vehicle tracking. Typically, these involve vulnerable people, such as young children, the elderly, key workers in some risky environments (for instance, railways), individuals with a chronic or contagious disease, and even VIPs.

Person tracking with GPS+telematics could also involve judicial cases (ordered by a court of law), of suspected criminals or anti-social elements. Other proposed applications include environmental information, location-based security, and location-sensitive marketing.

On its own, a GPS-enabled phone offers location and communication. This may answer the questions “Where is she or he?” and “Where am I?” but nothing more. However, when position and communication are combined with an appropriate geographic information system (GIS) database and a direction sensor, the combined system could answer two other very important questions, namely “What’s around me?” and “What’s that building over there?”

This could be achieved by a GPS+compass device, providing positional and directional data, which the mobile phone or the PDA transmits to a remote server. The server calculates the user’s position and identifies the building along the measured azimuth, gets the relevant information from the database, and sends it back to the client.
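The server-side lookup might be sketched as follows. The building database, coordinates, and tolerance are hypothetical, and a flat-earth approximation is used, which is reasonable only over the short distances involved:

```python
import math

# Sketch of the "what's that building over there?" lookup: given the
# user's position and a compass azimuth, return the database entry whose
# bearing best matches the measured direction.

# Hypothetical database: building -> (east, north) offsets in meters
# from the user's GPS position, as the server would compute them.
BUILDINGS = {
    "Museum":  (200.0, 50.0),
    "Theatre": (-80.0, 120.0),
    "Station": (10.0, -300.0),
}

def bearing(dx, dy):
    """Azimuth in degrees, clockwise from north."""
    return math.degrees(math.atan2(dx, dy)) % 360.0

def identify(azimuth_deg, tolerance=10.0):
    """Return the building whose bearing is closest to the measured azimuth,
    or None if nothing lies within the angular tolerance."""
    best, best_err = None, tolerance
    for name, (dx, dy) in BUILDINGS.items():
        # Signed angular difference wrapped into (-180, 180].
        err = abs((bearing(dx, dy) - azimuth_deg + 180.0) % 360.0 - 180.0)
        if err < best_err:
            best, best_err = name, err
    return best

print(identify(76.0))  # the Museum lies roughly east-northeast of the user
```

In a deployed service the offsets would come from a GIS query around the user's reported position, and the tolerance would reflect the compass sensor's accuracy.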

This is clearly valuable for the public utilities (water, gas, electricity, TV), shopping and leisure (restaurant menus, theater tickets), house hunting (details of the property advertised for sale), and of course, for visitors and tourists (museums, notable buildings, archaeological sites).

Leaving mobile phones aside, satellite navigation can also be used for location-based security. For example, a briefcase or a portable PC can be programmed to unlock only in a specified location and nowhere else. This would minimize the risk of sensitive military or commercial material falling into the wrong hands.

Some working prototype systems already exist. Other location-and-context-based applications under consideration include the marketing and selling of goods, the reception of pay-TV, credit card security, spectator sports, road user charging, and many others.

Indeed, the qualification “critical application” is no longer restricted to safety-critical transportation; it now also applies to financial-critical, legal-critical, security-critical, and business-critical applications. This creates a problem for standard off-the-shelf autonomous GPS receivers, which cannot operate indoors because of signal attenuation and multipath.

Over the last few years, GPS chip and receiver manufacturers have tried, with some success, to develop high-sensitivity GPS (HS-GPS). The latest HS-GPS receivers, which incorporate up to 200,000 correlators operating in parallel, make it relatively easy to identify true pseudoranges from among the many signal and multipath reflections. Several manufacturers in the United States, Japan, Korea, and Europe already advertise HS-GPS chips, and many other companies use such chipsets in their receivers.

GNSS Evolution
Like nearly all the technologies that preceded it, satellite navigation and positioning is going through the standard stages of development from birth to maturity. Older surveyors and geodesists may well remember the advent of electromagnetic distance measurement (EDM), using microwaves or lightwaves, in the late 1960s and the 1970s. When the first EDM instruments were introduced, distances measured electronically were also measured with tapes, just in case.

Then came the second phase, when surveyors became fully confident about EDM and used it routinely for fast and precise range measurements. It took a few years, and several critical mistakes in local mapping and national triangulation, to realize that EDM instruments could go wrong and that they had to be calibrated regularly in order to determine their accuracy and systematic biases.

The development of satellite navigation and positioning is following practically the same stages as EDM did 40 years ago. Only now can we formalize these successive stages of a technology's development and give them names, using Gartner's famous “Hype Cycle” curve, which was invented about 10 years ago in conjunction with new information technology products.

Using a simplified version, these successive stages of technology development are now formally called the “Technology Trigger,” followed by the “Peak of Inflated Expectations,” leading to the “Trough of Disillusionment,” happily followed by the “Slope of Enlightenment,” and hopefully leading to the “Plateau of Productivity.”

As I write this, the first Galileo satellite, GIOVE-A, has been launched and tested successfully, opening a new era in satellite navigation. Hopefully, this will lead to the development of a large number of new critical applications — and involve close collaboration with geodesy and several other related disciplines — for the benefit of business, government and society.

Here is one last example of the strange relationship between geodesy and GPS. The U.S. delegation to the International Telecommunication Union (ITU) recently proposed to abolish leap seconds, and thus cut the link between Solar Time and Coordinated Universal Time (UTC), and ipso facto GPS Time.

At present, whenever the difference between UTC and Solar Time approaches 0.7 second, a leap second correction is made in order to keep the difference between them under 0.9 second. This is done every few years on the recommendation of the International Earth Rotation and Reference Systems Service, which continuously monitors the difference between Solar Time and UTC.

This leap second correction, which has to be applied every few years to GPS Time, apparently causes software problems because it has to be programmed in manually. However, considering the difficulties that this change would cause to other scientific communities, such as astronomers, and even to users of GPS Time itself in some critical applications, the U.S. proposal has been postponed for the time being.
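The bookkeeping behind this is simple but easy to get wrong in software. GPS Time runs continuously from its epoch (January 6, 1980), while UTC inserts leap seconds, so converting GPS time to UTC means subtracting the accumulated leap-second count; it is exactly this manually maintained count that the article says causes trouble. A minimal sketch, with the offset passed in explicitly rather than looked up from a table:

```python
from datetime import datetime, timedelta

# GPS Time is a continuous scale starting at the GPS epoch; UTC inserts
# leap seconds, so GPS Time runs ahead of UTC by the number of leap
# seconds inserted since 1980 (an offset that must be updated by hand).
GPS_EPOCH = datetime(1980, 1, 6)

def gps_to_utc(gps_seconds, leap_seconds):
    """Convert seconds elapsed since the GPS epoch to a UTC datetime,
    given the current GPS-UTC leap-second offset."""
    return GPS_EPOCH + timedelta(seconds=gps_seconds - leap_seconds)
```

Because `leap_seconds` changes unpredictably (whenever the IERS announces an insertion), any receiver or server using such a conversion needs an update path for that single constant, which is the software burden the text alludes to.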

In conclusion, I must declare a conflict of interest. Although all the work I do at present involves GNSS, my academic background is clearly in geodesy. However, a change is in the air now, as safety-critical transportation is no longer the only critical application that has to be catered to. It has now been joined by several other emerging critical applications, notably financial-critical, legal-critical, security-critical and business-critical applications, which will also require nearly the same level of accuracy, integrity and coverage as safety-critical transportation.

This is where geodesy could step in again and create new statistical tools that differentiate between the navigation and positioning systems on offer and assess their suitability for a specific critical application.

For figures, graphs, and images, please download the PDF of the article, above.

By
March 1, 2006

Building Monitors

Severe loading conditions such as strong winds and earthquakes acting on modern tall buildings and structures can cause significant loads and vibrations. Recent trends toward slender, flexible, and light-weight buildings have left a large number of buildings susceptible to wind-induced motion. Furthermore, human perception of building motion has become a critical consideration in modern building design.

More complex building shapes and structural systems further accentuate eccentricities between the mass center, the elastic center, and the instantaneous point of application of aerodynamic loads, and consequently will generate significant torsional effects.

Verifying dynamic structural analysis requires the development of direct dynamic measurement tools and techniques in order to determine the natural frequencies, damping characteristics, and mode shapes. Among these tools, accelerometers have played the most important part in analyzing structural response to severe loading conditions. However, they provide only relative acceleration measurements, and displacement cannot be obtained directly from acceleration by double integration alone.

In contrast to accelerometers, GPS can directly measure position coordinates, thereby providing an opportunity to monitor, in real time and at full scale, the dynamic characteristics of a structure. GPS used in the real-time kinematic mode (GPS-RTK) offers direct displacement measurements for dynamic monitoring. Earlier studies by the authors and other researchers, referenced in the Additional Resources section at the end of this article, have shown the efficiency and feasibility of structural deformation monitoring by combining accelerometers and GPS-RTK.

However, GPS-RTK has its own limitations. For example, the measurement accuracy can be affected by multipath and depends strongly on satellite geometry. Moreover, the typical GPS-RTK sampling rate of 20 Hz limits its capability to detect certain higher-mode signals of some structures. The new 100 Hz GPS-RTK systems need to be further tested in order to ensure the independence of the measurements.

In order to exploit the advantages of both GPS-RTK and accelerometers, two data processing strategies have typically been used, namely to convert GPS measured displacement to acceleration through double differentiation and compare it with the accelerometer measurements (what we refer to as forward transformation), or to convert the accelerometer measurements into displacement through double integration and compare it with GPS measured displacement (the reverse transformation).
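The forward transformation is the easier of the two and can be sketched directly. This is an illustrative implementation, not the authors' code: it double-differentiates a GPS-RTK displacement series with a second-order central difference, producing an acceleration series that can be compared sample-for-sample with the accelerometer record.

```python
import numpy as np

def displacement_to_acceleration(disp, rate_hz):
    """Forward transformation: double-differentiate a GPS-RTK displacement
    series (metres, sampled at rate_hz) into acceleration (m/s^2).
    The two endpoint samples are lost to the central-difference stencil."""
    dt = 1.0 / rate_hz
    return (disp[2:] - 2.0 * disp[1:-1] + disp[:-2]) / dt**2
```

Note why the reverse direction is harder: differentiation needs no extra information, whereas integrating acceleration twice introduces two unknown constants (initial velocity and initial displacement), which is exactly the difficulty the next paragraph describes.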

The latter approach is much more challenging because we have to determine two integration constants in order to recover all the components of displacement (static, quasi-static and dynamic). If the structure to be monitored is subject to a quasi-static force, as in the case of a typhoon, this further complicates the analysis.

Although earlier research has proposed a lab-based threshold setting for accelerometers to deal with the quasi-static issue, we believe that avoiding this procedure and developing new ways to recover the false and missing measurements from GPS by acceleration transformation would provide a preferred approach.

This article discusses recent efforts to design such a system based on a new integration approach that employs the correlation signals directly detected from a GPS-RTK system and an accelerometer to transform one form of measurement to the other. The methodology consists of a Fast Fourier Transform (FFT) for correlated signal identification, a filtering technique, delay compensation, and velocity linear trend estimation from both GPS and accelerometer measurements. We also present results derived from its installation on structures in Japan that subsequently experienced the effects of an earthquake and typhoon.
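The filtering step in the methodology above can be illustrated with a crude frequency-domain band-pass filter. This sketch is not the authors' algorithm; it only shows the basic mechanism of isolating, via an FFT, the frequency band in which the GPS-RTK and accelerometer signals are expected to be correlated. The band limits are assumed inputs that the full method would identify from the correlated spectra.

```python
import numpy as np

def fft_bandpass(signal, rate_hz, f_low, f_high):
    """Zero out all spectral components outside [f_low, f_high] Hz and
    transform back, keeping only the band of interest."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate_hz)
    spectrum[(freqs < f_low) | (freqs > f_high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```

A practical implementation would add windowing and a tapered (rather than brick-wall) band edge to limit spectral leakage, plus the delay compensation and velocity-trend estimation the article mentions.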

(For the rest of this story, please download the complete article using the PDF link above.)

By
January 1, 2006

Will Success Spoil GPS?

Like some behemoth rocket ship launched in the 1970s, the Global Positioning System sails on through an expanding universe of users and applications, seemingly imperturbable, successful beyond the expectations of its creators, an enormous momentum carrying it into the third millennium.

To all appearances, GPS is prospering more than ever: a second full signal (L2C) is becoming available to civil and commercial users, a denser ground monitoring system is being built out, improved accuracies are being squeezed out of the algorithms and operational practices at the Master Control Station at Schriever Air Force Base, prices are dropping on products with more features and functions than ever, and hundreds of millions of receivers are in use around the world. A follow-on generation (Block IIF) of satellites with a third civil signal (at the so-called L5 frequency) is being built by Boeing for launch beginning in 2007.

Since its first satellite launch 28 years ago, GPS has blazed a trail for satellite-based positioning, navigation, and timing. Thanks to GPS, global navigation satellite systems have gone from being a technological unknown to becoming a widely recognized utility. GPS, a model and inspiration to its imitators across the oceans.

Or is it?

In fact, for some years now GPS has been a victim of its own success. Performing better than advertised, the system has suffered from budgetary pilfering for other defense programs and risks getting lost in the shifting maze of diffuse dual-use management responsibilities.

“History has shown that the Air Force has had chronic difficulty in adequately funding GPS, even in the absence of the more expensive GPS III satellites,” observes a high-level Defense Science Board (DSB) task force report on GPS issued late last year. “If the Air Force continues to use its GPS investments as a funding source to offset other space/aircraft programs, then GPS service continuity will remain in jeopardy even without the more costly GPS III.” (See article “Bold Advice” in this issue.)

Meanwhile, an Air Force Space Command projection puts the worst-case probability of the GPS constellation falling below its fully operational capability (FOC) of 24 space vehicles sometime between 2007 and 2012 at 20–40 percent. Indeed, the task force argues for a 30-satellite constellation to ensure robust coverage in “challenged environments.”

The timelines for the last three GPS satellite development and launch programs — Block IIR, IIR-M, and III — have all slid to the right, as schedule delays are described these days.

Intermittently starved for fuel, with sporadic guidance from the helm, will new resources reach the system before its speed inevitably begins to slow, threatening its being overtaken by other GNSS vehicles?

Okay, that’s the bad news.

The good news is that no one connected to the program wants to let one of the world’s leading U.S.-branded utilities slip into the shadow of the other GNSSes under development. And steps are under way to ensure that doesn’t happen.

New Game Plan

A long-awaited next-generation program, GPS III, spent well more than a hundred million dollars on conceptual studies and several years jogging in place before receiving a renewed go-ahead from the Department of Defense (DoD). The Fiscal Year 2006 (FY06) federal budget allocated $87 million for GPS III. The FY07 budget will be finalized soon in Washington, and current indications are that GPS Block III will receive at least $237 million, according to the GPS Joint Program Office (JPO). Of course, GPS III funds have been zeroed out before.

Current plans call for a GPS JPO decision this summer choosing among proposals submitted for separate space vehicle (SV) and operational control (OCX) segment contracts. Once acquisition strategies are formally approved in Washington, the GPS Block III SV request for proposals (RFP) is expected to be released by mid-February, with the OCX RFP following later in the spring, according to the JPO.

“Minor adjustments are being implemented in the program planning to reflect an incremental development and delivery approach for both acquisitions that will provide increased GPS capability sooner and more frequently over the life of the program,” the JPO told Inside GNSS. Nonetheless, an upgrade in the control segment to accommodate the new generations of satellites is behind schedule, which means that the capability to operationally control those signals will not be available until 2009 at the earliest, according to the DSB task force.

Modernizing Technology

In terms of its fundamental design, the Global Positioning System is nearly 35 years old. More recent spacecraft designs using modern electronics, new rubidium clocks, better satellite management techniques, and navigation message enhancements have improved performance. But the design of the key resource for manufacturers and users, the GPS signals-in-space, is essentially the same as when the first satellite was launched in 1978: a C/A-code on L1 (centered at 1575.42 MHz) and P/Y-code military signals at L1 and L2 (1227.60 MHz).

Over the next five years, however, this situation will change dramatically.

Beginning with SVN53/PRN17, the first modernized Block IIR (IIR-M) satellite built by Lockheed Martin and launched last September 25, GPS has gained a new open civil signal at L2 (centered at 1227.6 MHz). A third civil signal, L5 (centered at 1176.45 MHz) will arrive with the Block IIF satellites now scheduled to begin launching in 2007.

Both IIR-M and IIF satellites will offer new military M-code signals at L1 and L2 with “flex power” capability of transmitting stronger signals as needed. The L5 civil signal will be broadcast both in phase and in quadrature, with the quadrature signal being broadcast without a data message. Air Force Space Command expects to have a full complement of satellites transmitting L2C and M-code signals by 2013; for L5, fully operational capability is expected by 2014.

Generally, the new signals will be characterized by longer code sequences broadcast at a higher data rate and with slightly more power. Beginning with the IIR-M satellites, the Air Force will be able to increase and decrease power levels on P-code and M-code signals to defeat low-level enemy jamming — a capability known as “flex power.”

These new signal features will support improved ranging accuracy, faster acquisition, lower code-noise floor, better isolation between codes, reduced multipath, and better cross-correlation properties. In short, the new signals will be more robust and more available.

Looking farther ahead, another civil signal at L1 is planned to arrive with the GPS III program. Under a GNSS agreement signed with the European Union in June 2004, this will be a binary offset carrier BOC(1,1) signal similar or identical to that of the Galileo open signal. This is expected to simplify the combined use of GPS and Galileo signals. Nominal first launch date for a GPS III spacecraft is currently 2013.

Modernization will also take place in the ground control segment. Six GPS monitoring stations operated by the National Geospatial-Intelligence Agency (formerly the National Imagery and Mapping Agency) have been folded into the existing five Air Force GPS monitoring stations (which include the Master Control Station at Schriever AFB, Colorado). This will eliminate blank spots in coverage and support Air Force plans to monitor the integrity (or health) of civil signals as well as military signals.

New Political Structure

Under a presidential national security policy directive (NSPD) released in December 2004, a National Space-Based Positioning, Navigation, and Timing (PNT) Executive Committee and Coordination Office have taken over from the Interagency GPS Executive Board (IGEB). Mike Shaw, a long-time GPS hand on both sides of the civil/military interface, stepped in toward the end of 2005 as the first director of the PNT coordination office.

Establishment of the PNT committee — now cochaired by deputy secretaries of defense and transportation, Gordon England and Maria Cino, respectively — kicked GPS leadership up a notch from that of the IGEB. Other members include representatives at the equivalent level from the departments of state, commerce, and homeland security, the Joint Chiefs of Staff and the National Aeronautics and Space Administration.

The committee had met once shortly after its formation under President Bush’s NSPD, but a January 26 gathering marked its first meeting with the current leadership. In addition to getting acquainted with one another and the PNT topic in general, the agenda covered such issues as the DSB task force report, modernization and funding of GPS, and the new UN International Committee on GNSS (see article "What in the World is the UN Doing About GNSS?" in this issue).

Without a director and coordination board in place, the executive committee was unable to get on with many of the tasks assigned it by the presidential directive, including writing a five-year plan for U.S. space-based PNT and appointing an advisory board of outside experts. With Shaw on board, the coordination board now has seven staff members detailed from agencies represented on the executive committee.

A charter for the advisory board has been drafted and awaits approval by the committee, as does a draft of an international PNT strategy prepared by the State Department under the direction of Ralph Braibanti, who heads that agency’s space and advanced technology staff.

By