Uncategorized Archives - Page 79 of 82 - Inside GNSS - Global Navigation Satellite Systems Engineering, Policy, and Design

Uncategorized

October 19, 2008

PTTI 2008

The 40th annual Precise Time and Time Interval (PTTI) Systems and Applications Meeting will take place at the Hyatt Regency Hotel at Reston Town Center, just outside of Washington, DC. The conference focuses on precise time and frequency technology and its challenges.

Tutorials take place on December 1. Topics include quality assurance, traceability and setting up a timing system, Loran and eLoran, missile tracking with GPS translators, GNSS, and measurement methods.

Read More >

By Inside GNSS
October 16, 2008

GNSS Forum: This is Not a Test – Future of the Emergency Alert System

School drill during a simulated alert in Texas

Readers of Inside GNSS may remember these words, uttered solemnly over their TV or radio in the 1950s and ’60s:  “This is a test of the Emergency Broadcast System. This is only a test. . . .”  This was followed by a piercing pilot tone, which, for me, is firmly etched in childhood memory along with tornado warnings (Midwestern upbringing) and school evacuation drills.

What was known then as the Emergency Broadcast System lives on today as the Emergency Alert System (EAS), and is designed to act as a national warning system in case of major public emergencies.

Today, as media consumption patterns change, as connected mobile devices such as cell phones and PDAs become near ubiquitous, and as the lessons of recent disasters — both man-made and natural — take root, the future shape of the EAS is now being defined. To wit: on June 26, 2006, President Bush signed an executive order directing the Department of Homeland Security (DHS) to create a comprehensive Public Alert and Warning System for the United States. DHS is to make its recommendations to the White House within 90 days. 

This article provides a brief summary of the history of the EAS and of the context in which the President’s Executive Order was issued, including other regulatory and agency efforts, and examines potential features that could be included in an enhanced Emergency Alert System.

It goes on to suggest how location technologies can provide a rich technological complement that makes the EAS even more robust. (Basic information on the EAS can be found at <http://www.fcc.gov/cgb/consumerfacts/eas.html>.)

“Had There Been an Actual Emergency . . .”
The Emergency Alert System has its beginnings in the Cold War, when President Truman created CONELRAD (Control of Electromagnetic Radiation) in 1951 as a means to rapidly communicate with the general public during times of emergency. 

In 1963, President Kennedy replaced CONELRAD with the Emergency Broadcast System and added state and local broadcast capability. In 1994, the Federal Communications Commission expanded the Emergency Broadcast System to include analog cable systems and, in doing so, renamed the system the Emergency Alert System.

The EAS is designed so that the president may issue a message within 10 minutes from any location. Participating systems must interrupt programming in process to transmit this presidential message. Messages are disseminated in a hierarchy, first to “Primary Entry Point” stations, and then down to EAS participating stations.

As one might guess by the Cold War context in which the EAS was first conceived, it was initially designed as a means to respond to national threats, such as nuclear attack. However, the EAS has seen practical application as a state and local alert mechanism, for example, to disseminate information regarding serious inclement weather.

In light of the immediate physical impact of era-defining disasters such as 9/11 or Hurricane Katrina, or annual cyclical events such as tornadoes or hurricanes, we can safely say that most emergency events are local or regional in footprint. (Certainly Hurricane Katrina and 9/11 were national events in terms of long-term impact, both financial and emotional. Further, Katrina led to a diaspora that was also national in impact.)

At the time of this writing, the EAS comprises analog AM and FM stations, analog broadcast television stations, and analog cable stations. Digital television, digital audio broadcast (DAB), digital cable and satellite radio systems begin EAS participation on December 31, 2006. Direct broadcast satellite (DBS) services will start participation on May 31, 2007.

But That’s Not All
The Executive Order comes on the heels of multiple parallel efforts. In August 2004, the Federal Communications Commission (FCC) issued a notice of proposed rule-making (NPRM) that began a review of the EAS <http://www.fcc.gov/eb/Orders/2004/FCC-04-189A1.html>.

At the time, the review was largely motivated by changes in media consumption patterns. The Commission noted growing use of new media, such as satellite radio and DBS services. For example, the FCC noted that as of June 2005, DBS services reached an estimated 25 percent of TV households, or roughly 28 million households, but the DBS broadcasters did not have any EAS obligations. The FCC moved to rectify this and will broaden the EAS as described above.

In November 2005, in the wake of the Gulf State hurricanes of 2005, the FCC issued an order and follow-on NPRM, which, in addition to incorporating the findings of the first NPRM, also acknowledged shortcomings in how the EAS was applied during Hurricane Katrina. With the NPRM, the FCC also solicited comment on how the EAS could be improved.

As of this writing, much of the discussion revolves around whether the EAS should be expanded to include cell phones and, if so, how to extend EAS coverage while preserving the robustness of TV and radio-based alerts. SMS, a popular cellular messaging technique, has its constraints.

For instance, a single message can carry only 160 characters of text. Moreover, SMS is inherently a lossy, best-effort medium in which messages are sometimes simply lost. Additionally, SMS carrier networks may bog down if mass alerts are sent out, and cell sites are susceptible to power outages, as shown after Katrina and during the power outage of 2003.

The FCC shares purview over the EAS with the Federal Emergency Management Agency (FEMA) and the National Weather Service (NWS). In October 2004, FEMA announced its Digital Emergency Alert System (DEAS) pilot, conducted in collaboration with the Association of Public Television Stations (APTS).

Phase I of the pilot involved a demonstration in the National Capital Region of how datacasting over digital television broadcasts could deliver enhanced emergency alerts. For example, text crawls could be replaced with full audio and video alerts. A demonstration system was shown at the National Association of Broadcasters convention in April 2005.

Phase II expanded the pilot to stations outside of the Capital Region and began the work of integrating DTV-based capabilities with other warning and transport systems, such as satellite. For example, emergency alerts could first be uplinked to satellite, enabling broad national dissemination even if terrestrial infrastructure is impaired, and then received at digital TV stations, which could retransmit them over their spectrum to local TV sets.

In Congress, in September 2005, Senator Jim DeMint (R-SC) introduced the Warning, Alert, and Response Network (WARN; S.1753) Act into the Senate, also known as the National Alert System and Tsunami Preparedness Act, as a means of establishing a national hazard alert system.

In June 2006, Representative John Shimkus (R-IL) introduced a comparable bill into the House of Representatives. S.1753 notes that a multitude of media should be used so as to maximize dissemination and minimize the risk of having a single point of failure. This has been a goal of the FCC’s review as well.

Suffice it to say, the Emergency Alert System, and the overarching issue of comprehensive, integrated public warning, are hot topics these days.

Time for More
The move to broaden the EAS to new media distribution platforms seems timely and appropriate given the change in media consumption patterns. Households may get their information the old-fashioned way — through over-the-air TV to a fixed TV set — or through some combination of media, such as cable, satellite radio, and the Internet. 

In addition to “how” people consume information, another important attribute is “where” people consume information. Today, media viewing is not necessarily a static experience. Device users may be mobile, such as on foot or in a car, or at least “nomadic”, that is, in a fixed location that is neither home nor office. The laptop-toting businesswoman next to you in the airport may view the news in a hotspot café, or at a hotel, or through a cellphone.

Even so-called “legacy” media such as television are adapting to a more mobile use scenario. The Advanced Television Systems Committee (ATSC), an industry committee with purview over the digital TV standard used in the United States, is testing a format adapted for reception by mobile TV-capable devices. Tests have shown reception is possible at speeds typical of a moving train.

New mobile TV services, such as MobiTV (available on Sprint and Cingular), Qualcomm’s MediaFLO (to be launched in 2006 on Verizon), and Modeo (DVB-H broadcasts to be launched in 2007), are targeting mobile handhelds. Informa estimates that some 120 million TV-capable devices will be in service by 2010.

Further, devices equipped with TV tuners are being embedded in automobiles. A car on an evacuation route could receive EAS information while (quite logically) heading away from a home in a storm’s path. 

In sum, television itself is becoming a mobile experience and, perhaps as importantly, a battery-powered experience. The ability to receive TV-like service without a wall plug is a clear benefit, given that electricity may go out during disasters.

During Katrina, for example, not only did power go out, but cellular networks and the public switched telephone network were also knocked out. Portions of the TV and FM infrastructure stayed on the air, leading to a commendation to the National Association of Broadcasters by President Bush.  In the end, the broadcast infrastructure was shown to be robust.

Katrina’s Not Alone
Immediately in the aftermath of Katrina, FCC Chairman Kevin Martin established an independent panel to review the Impact of Hurricane Katrina on Communications Networks. The FCC has issued an NPRM to review the panel’s recommendations.

Although the issue of how emergency alerts can be received is important, a second issue is how public agencies implement EAS. While the NWS did send severe weather warnings over the EAS, state and local officials did not activate the Emergency Alert System to send emergency evacuation information before Katrina’s landfall. Going forward, however, these officials will probably be more aggressive in using the EAS.

A major fundamental, technical issue is that of power. For example, TV and FM broadcasters typically have backup power available for business reasons — loss of service means loss of revenue.

In contrast, cell sites, which are usually in a much denser network than TV stations and repeater sites, often do not have backup power. This points to the quality of service that can be expected from each medium. However, even broadcasters faced power constraints in the wake of Katrina. Moreover, bringing fuel for the backup generators at TV stations proved difficult due to the prolonged flooding.

One recommendation, then, is to apply standards for the duration of backup power supply and to create supply routes in advance of disaster. Although the independent FCC panel analyzing Katrina’s impact noted that fuel supplies did run out in some cases, it only recommended “checklists” for emergency preparedness and did not go as far as to recommend standards for backup fuel. In the wake of 9/11, industry formed the Media Security and Reliability Council, which made similar recommendations. At this point, checklists would not seem to be enough.

Going forward, what disasters might be faced? This article was written in part in San Francisco, home to multiple potential terrorist targets, and which is also adjacent to Silicon Valley, one of America’s most influential clusters of innovation. (Parts of it were also written on an airplane, immediately in the wake of the averted London terrorist plot.) The San Francisco Bay Area is also adjacent to two major tectonic faults.

The Northridge and Loma Prieta earthquakes are reminders that hurricanes are obviously not the only potential natural disaster that America faces. Further, while in the end the impact of SARS was minimal, it gave ample evidence of how a “Patient Zero” could easily spread infectious disease across borders. While SARS emanated from southern China, it made its way to Taiwan, Hong Kong, and even Toronto.

A Complementary Solution
What other problems could the EAS solve? The EAS currently is a platform for one-way dissemination of information. It lacks knowledge of receipt or a feedback loop. Could connected devices acknowledge receipt back to a centralized database? Could devices pre-registered as “high priority”, such as those belonging to law enforcement, receive higher tiers of alerts?

From the perspective of position location and geographic information systems, the value of location-aware EAS receivers is apparent. Cellular devices can be coarsely located through cell site-based location technologies and could thus receive “Reverse 911” alerts. Devices whose positions are known through GPS or other means could be polled and sent specific, geo-tagged instructions. Location-aware handsets could then acknowledge or ignore alerts as appropriate.
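
To make the notion of geo-tagged instructions concrete, here is a minimal sketch (in Python) of how a location-aware receiver might decide whether a given alert applies to it. The alert format shown (a center latitude/longitude plus a radius) and all names are hypothetical illustrations, not part of the actual EAS protocol.

    # Hypothetical sketch of geo-targeted alert filtering on a location-aware
    # receiver; the alert format (center plus radius) is invented for illustration.
    from dataclasses import dataclass
    from math import radians, sin, cos, asin, sqrt

    EARTH_RADIUS_KM = 6371.0

    @dataclass
    class GeoAlert:
        lat_deg: float     # alert center latitude
        lon_deg: float     # alert center longitude
        radius_km: float   # footprint the alert applies to
        message: str

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two latitude/longitude points, in km."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2.0 * EARTH_RADIUS_KM * asin(sqrt(a))

    def alert_applies(alert, device_lat, device_lon):
        """Display the alert only if the device is inside the alert footprint."""
        return haversine_km(alert.lat_deg, alert.lon_deg, device_lat, device_lon) <= alert.radius_km

    # Example: a warning with a 40 km footprint; the device is roughly 11 km away.
    warning = GeoAlert(35.47, -97.52, 40.0, "Tornado warning until 6:00 PM")
    print(alert_applies(warning, 35.55, -97.60))   # True

A real system would, of course, also need the priority tiers and acknowledgment paths suggested above.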

TV itself can be used for position location (as the author’s company does), and it can also be used in areas where conventional satellite-based positioning solutions are not effective. This would address the Presidential Decision Directive of December 2004 on Positioning, Navigation, and Timing, which noted the need for augmentations to the GPS to improve system integrity, availability, and accuracy. GPS’ shortcomings indoors and in urban settings — the most likely targets of terrorist attack — are well-known.

Conversely, the broadcast infrastructure is well-correlated with urban centers, making it an ideal complement to GNSS coverage in open and rural areas. Further, the low frequencies and high power of TV signals make them effective for indoor use, which benefits both position location and communication.

Potential applications include positioning of and communications to first responders; geo-targeted alerts, such as weather warnings or Amber Alerts; tracking of hazardous material; tracking of vehicles or cargo; even patient triage. Rescue agencies have noted that one major hurdle they face in disaster settings is tracking their own assets, such as rental cars.

In addition, as evidenced by the APTS-FEMA trial, TV has the bandwidth to disseminate the audio and video information.

In sum, TV provides a pre-built, robust infrastructure capable of supporting location-rich Emergency Alerts, and complements the three major functions of GPS – positioning, navigation, and timing. Wedding the EAS with PNT seems an intuitive way to kill two birds with one stone.

By
October 5, 2008

Don’t Dump That Data!: Thorough Use of GNSS Measurements

Figure 1: Conceptual design for tracking sequential changes in carrier phase

A constantly recurring refrain within the satellite navigation community is the self-evident observation that, with more satellites, constellations, and frequencies appearing over the next few years, GNSS performance will improve.

Until those expectations become reality, though, another major opportunity is being almost universally overlooked: making far more efficient use of all the GNSS measurement data we have available right now.

Even after the addition of more satellites and frequencies, the need for full extraction of information will continue in many operations. Users will still encounter various sources of signal degradation (e.g., intentional and/or unintentional interference, multipath, ionospheric disturbances), signal blockage (obstruction by foliage, terrain masking, or even elements of the structure carrying the antenna), and attenuation of already weak GNSS signals in sensitive (e.g., indoor) operations.

To those eagerly awaiting the new signal resources, claims of major improvement now — from existing data without further breakthroughs — may provoke immediate skepticism.  Those willing to pursue the issue, however, can realize those benefits even before the new signal resources come on line.

In this short article, I describe how satellite signal data left unexploited by existing user equipment can actually be put to work. Specifically, I propose using sequential changes in carrier phase data, including those from low-elevation GNSS satellites whose signals are usually omitted from receiver processing because of concerns over atmospheric propagation effects.

Velocity from Carrier Phase Changes — Be Thorough!
Before plunging into a full discussion of the topic, here’s a preview. Sequential changes in GNSS carrier phase, over one-second intervals, are

  • typically accurate to within a centimeter
  • immune to cycle count uncertainties — the integers cancel in the subtraction forming the one-second difference
  • amenable to treatment as if they were independent and, therefore,
  • not required to be continuous (we routinely cope with gaps in a data stream)
  • so insensitive to effects of the ionosphere/troposphere that no mask angle is needed for those (again, benefiting from cancellation of the unknowns), and
  • therefore, useful as a precise and dependable source of velocity history.

Velocity histories thus derived feed directly into dead-reckoning, and pseudoranges (protected by mask as usual and edited via receiver autonomous integrity monitoring (RAIM) and other signal processing techniques) prevent unchecked growth in the position error. An inertial measurement unit (IMU) is not required; this can be accomplished with or without one.
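
As a rough illustration of how such a velocity can be formed (a simplified Python sketch, not the formulation derived in the book), the one-second carrier-phase changes can be combined by least squares with line-of-sight unit vectors, assuming satellite positions and velocities are taken from the broadcast ephemeris; the names and the clock-drift handling below are illustrative assumptions.

    # Simplified sketch (not the book's derivation): user velocity from one-second
    # carrier-phase changes via least squares. Satellite positions/velocities are
    # assumed available from the broadcast ephemeris; names are illustrative.
    import numpy as np

    def velocity_from_delta_phase(sat_pos, sat_vel, user_pos, delta_phase_m, dt=1.0):
        """
        sat_pos:       (N, 3) satellite ECEF positions [m]
        sat_vel:       (N, 3) satellite ECEF velocities [m/s]
        user_pos:      (3,)   approximate user ECEF position [m]
        delta_phase_m: (N,)   carrier-phase change over dt, expressed in meters
                              (the unknown integer ambiguities cancel in the difference)
        Returns (user velocity [m/s], receiver clock drift [m/s]).
        """
        los = sat_pos - user_pos                                # line-of-sight vectors
        u = los / np.linalg.norm(los, axis=1, keepdims=True)    # unit vectors
        # Phase change over dt ~ [u . (v_sat - v_user) + clock_drift] * dt;
        # move the known satellite-motion term to the left-hand side.
        y = delta_phase_m / dt - np.sum(u * sat_vel, axis=1)
        H = np.hstack([-u, np.ones((len(y), 1))])               # unknowns: v_user (3), drift (1)
        x, *_ = np.linalg.lstsq(H, y, rcond=None)
        return x[:3], x[3]

With four or more one-second phase changes in view, the system is solvable, yielding the centimeter-per-second-class velocity noted in the list above.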

Assertions made in this article are based on flight-validated experience. Rather than repeatedly referring to test results, I mention them only in passing where needed for clarification.

All derivations, details, steps, answers to accompanying questions/correlation effects/issues/paradoxes and so forth — as well as supporting test results — are documented in my recently published book, GNSS Aided Navigation and Tracking: Inertially Augmented or Autonomous, cited in the Additional Resources section at the end of this article.

This article is intended to provide only a simple description of this approach with no mathematical development.

Figure 1 (above, right) shows a simple system driven only by GNSS data (no IMU or any other sensor). Outputs are geographic velocity and position as usual. Velocity is determined from sequential changes in carrier phase over one-second intervals. A continuous time history of velocity is simply integrated to produce position.

To counteract the growth of position error due to this integration, only pseudoranges (i.e., no carrier phase information) are used for position determination, and, conversely, no pseudorange information is used for calculating velocity. The dynamic model merely consists of a constant acceleration vector.

To blunt the effect of modeling discrepancies from acceleration changes, the data-averaging durations are kept short (i.e., a few seconds). In analogy with spline fitting, then, estimates continuously evolve — formed as combinations of current and recent observations. That latter concept has solid support from a long history of success in myriad tracking applications.
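
A deliberately bare-bones sketch of that structure follows: the carrier-phase-derived velocity dead-reckons the position forward, and an independent pseudorange-only fix pulls it back to bound the drift. The constant blending gain is an assumption standing in for the short-window estimator just described, purely for illustration.

    # Bare-bones sketch of the Figure 1 structure. The constant gain below is a
    # stand-in for the short-window estimator described in the text.
    import numpy as np

    def propagate(position, velocity, dt=1.0):
        """Dead-reckon position forward using the carrier-phase-derived velocity."""
        return position + velocity * dt

    def correct_with_pseudorange_fix(position, pr_fix, gain=0.1):
        """Blend toward an independent pseudorange-only fix to bound drift;
        pseudoranges and carrier phases keep their separate roles, as in the text."""
        return position + gain * (pr_fix - position)

    # One cycle of the loop (placeholder values, not real data):
    pos = np.array([0.0, 0.0, 0.0])
    vel = np.array([1.0, 0.5, 0.0])          # e.g., from velocity_from_delta_phase(...)
    pr_fix = np.array([1.05, 0.48, 0.02])    # from a RAIM-screened pseudorange solution
    pos = correct_with_pseudorange_fix(propagate(pos, vel), pr_fix)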

Unlike virtually all existing operational methods, then, this approach makes full use of the carrier phase, even if permanently ambiguous and intermittent. That latter trait — intermittency — is made acceptable by another unusual feature reflected in the design of a GNSS receiver developed at the Ohio University Avionics Center and described in the article by F. van Graas et al. cited in Additional Resources.

This OU receiver design replaces correlators and tracking loops with fast Fourier transform (FFT) processing, which offers

  • unconditional stability (in contrast to a conditionally stable third-order loop)
  • unconditional access to all inverse FFT cells, allowing a broader effective correlation search than the narrow subset that a tracker would cover
  • linear phase shift (as opposed to the nonlinear phase of a tracking loop, which causes group delay variations).

In addition, the user can easily be empowered to select the durations of both the GNSS carrier difference and code data blocks. That option provides full flexibility and capability for versatile operation.
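
For readers unfamiliar with the technique, the sketch below shows the core operation behind replacing time-domain correlators with FFT processing: circular correlation computed in the frequency domain, whose peak gives the code delay. The synthetic code and sample counts are illustrative assumptions, not the OU receiver's actual parameters.

    # Illustrative only: circular correlation via the FFT, the operation at the
    # heart of FFT-based processing. Code length and noise level are synthetic
    # stand-ins, not the OU receiver's parameters.
    import numpy as np

    def fft_correlate(received, code_replica):
        """Circular cross-correlation of a received block with a local code replica,
        computed in the frequency domain; the peak index gives the code delay."""
        R = np.fft.fft(received)
        C = np.fft.fft(code_replica)
        return np.fft.ifft(R * np.conj(C)).real

    # Synthetic example: a +/-1 code delayed by 100 samples, plus noise.
    rng = np.random.default_rng(0)
    code = np.sign(rng.standard_normal(1023))
    received = np.roll(code, 100) + 0.5 * rng.standard_normal(1023)
    print(int(np.argmax(fft_correlate(received, code))))   # -> 100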

Obstacles to Adoption
The approach just presented is so effective and has such simple dynamics that it prompts a question: Why has this capability remained unused for all these years? The answer to this question has multiple parts.

Selective Availability. A derivation relating sequential changes in carrier phase to precise velocity, accounting for the large excursions of satellites over the same interval, never appeared until shown in my previously cited book. For many years, the presence of the selective availability (SA) time dither limited the motivation to investigate this approach. Despite an obvious advantage (i.e., cancellation of unknown cycle counts), which was noted in my 2001 ION/CIGTF presentation (see Additional Resources) — again using data collected from before SA’s removal  — stand-alone capability wasn’t possible.

Land trials in a van — conducted under SA conditions but employing these principles — were successful, but they depended on receiving corrections from a ground station. However, differential GPS performance was already brilliant without the programming-intensive processing required for converting sequential carrier phase changes into velocities, so the scheme did not find wide acceptance.

Legacy of the Past. Technical obstacles that shaped receiver designers’ thinking in the past may blind them to the practicality of such methods in today’s world. The approach discussed in this article requires added processing that is easily handled by today’s electronics.

Position Fixation. The industry has long prioritized instantaneous position accuracy above other considerations. As a figure of merit, instantaneous position-fixing seems to offer — at least conceptually — an easily understood basis for decisions. This is valued especially in applications involving certification, wherein anything not immediately apparent can be open to question.

For certification, that stance may or may not ever change; in any case, the following brief discussion of alternative decision criteria is in order.

Among the most obvious reasons for broader figure-of-merit criteria: many operations are not rigidly governed by an inescapable need to substantiate the minimization of theoretical risk. Furthermore, whether operations are required to quantify risk or not, “accuracy” is widely understood to have multiple meanings because the concept of “error” has multiple interpretations (e.g., RMS, containment, CEP, SEP, and so on).

A corresponding variety of error metrics is needed to accommodate different conditions and/or performance levels — as well as differences in context (absolute, repeatable, relative). The latter item in particular can, in some applications, impose additional demands. For instance, a need may arise for adherence to a uniform datum reference because, if coordinates are being subtracted to yield relative locations, misunderstanding arising from nonuniformity will undercut the validity of the results.

Finally, the value of reducing instantaneous position error to ever-smaller amounts — centimeter? millimeter? less than one millimeter? — should be judged in light of diminishing returns. For example, even if we could pinpoint the instantaneous location of the antenna phase center on a supersonic aircraft to within a micron, our efforts would be better spent on determining future position.

Freedom from commonly imposed constraints, ironically, can often produce better prospects of success — just by using information that tightly controlled systems would discard. And that immediately raises another issue, the safety of using such information, which we will address next.

Making Previously Rejected Data Safe
In addition to obvious features in Figure 1 (segmentation and sequential change in carrier phase over the course of one second), another procedural step is fundamental to the approach: single-measurement RAIM, with optional subsequent verification by more familiar multi-satellite RAIM methods. Thresholds for each residual can be set to control probabilities of alarm and missed detection, with the same rigor used for deriving RAIM tests — but with one significant improvement. Each separate observation can be weighted according to its individual credibility (e.g., based on signal strength, elevation angle, and so on).

Furthermore, rather than selecting only the space vehicles (SVs) offering the best RAIM geometry, every satellite in view can be acceptance-tested as a candidate for insertion into estimation processing. Optionally, a designer might choose to employ additional well-known procedures, such as across-SV differencing (followed by whitening to undo the resulting correlations), to remove user clock errors.
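
A minimal illustration of such a per-measurement acceptance test follows (it is not the book's derivation): each residual is compared with a threshold scaled by that measurement's own assumed standard deviation, so the credibility weighting is built directly into the decision.

    # Illustrative single-measurement acceptance test (not the book's derivation):
    # the threshold scales with each measurement's own assumed standard deviation.
    def accept_measurement(residual_m, sigma_m, k=5.0):
        """
        residual_m: predicted-minus-measured value for one satellite [m]
        sigma_m:    assumed standard deviation of this measurement [m], e.g.,
                    derived from signal strength and elevation angle
        k:          multiplier setting the alarm / missed-detection trade-off
        """
        return abs(residual_m) <= k * sigma_m

    # A 0.8 m residual is rejected for a measurement believed good to 0.1 m,
    # but accepted for one only expected to be good to 0.3 m.
    print(accept_measurement(0.8, 0.1))   # False
    print(accept_measurement(0.8, 0.3))   # True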

The measurement separation — between pseudoranges and carrier phases — just described is total: a pseudorange rejected from a specific satellite does not affect decisions about whether to accept that same satellite’s carrier phase sequential-change data.

Of course, a mask angle is prescribed to reject pseudoranges from low-elevation satellites in order to edit out data containing long ionospheric and tropospheric delays. When tracking sequential changes in carrier phase, however, these same SVs should not be automatically rejected.

Over a one-second interval, those sequential phase changes are generally accurate to within a centimeter RMS. If multipath or ionospheric disturbances degrade some of them, data editing (via both single-measurement and subsequent multi-satellite RAIM) can exclude them. Some of the data at an instant can be rejected without rejecting all of the data at that instant.  Especially when a shortage of data exists, use of partial information may be vital for more than one reason. Not only does it add measurements that would otherwise be ignored, but it can improve coverage geometry, which may be pivotal for a successful application.
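
The geometry point can be made numerically. In the sketch below, the line-of-sight unit vectors are invented purely for illustration: four satellites clustered at high elevation yield a poor dilution of precision, and accepting a single low-elevation satellite improves it markedly.

    # Invented geometry, for illustration only: adding one low-elevation satellite
    # to a cluster of high-elevation satellites improves the dilution of precision.
    import numpy as np

    def gdop(unit_los):
        """Geometric DOP from line-of-sight unit vectors (rows) plus a clock column."""
        H = np.hstack([unit_los, np.ones((len(unit_los), 1))])
        return float(np.sqrt(np.trace(np.linalg.inv(H.T @ H))))

    high = np.array([[ 0.5,  0.0, 0.866],    # four satellites clustered at high elevation
                     [ 0.0,  0.6, 0.800],
                     [-0.5,  0.0, 0.866],
                     [ 0.0, -0.6, 0.800]])
    low = np.array([[0.9, 0.3, 0.3]])        # one satellite near the horizon

    print(gdop(high))                        # poor geometry
    print(gdop(np.vstack([high, low])))      # markedly better with the extra SV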

Successful flight test results described in GNSS Aided Navigation and Tracking: Inertially Augmented or Autonomous were obtained after SA removal, in stand-alone operation. What was previously a limited range of application — operation with a ground station, already brilliantly successful without the capabilities described herein — now expands to unrestricted usage.

An opportunity for major performance improvement is therefore available for general operation. Hardly anyone in the industry is taking advantage of this at present, but that is expected to change; the cost is too low and the benefit too high for it to remain untapped.

Conclusion
My discussion here was intentionally limited to fundamental considerations and included only a restricted subset of applications. A wider scope of issues (e.g., inertial updating, free-inertial coast, tracking, etc.) can be found addressed at length in my book.

With the welcome addition of more satellites and more frequencies, these considerations will continue to be applicable, for multiple reasons. Here are some of those reasons:

  • Hindrances to signal detection, previously discussed, will remain.
  • Interoperability is most challenging for items with highest accuracy expectations. With carrier phases from different constellations, sequential changes are far easier to mix than the phases themselves.
  • A microelectromechanical system (MEMS) IMU has much less free-inertial coast capability than a navigation-quality INS, but demands to support GNSS-era performance (centimeters per second rather than yesteryear’s nautical miles per hour) will endure.
  • Continued growth, often gradual but inexorable, will occur in demands for additional capabilities.
  • Those demands will be met by increased sophistication in knowledge from continued research.

Additional Resources
[1] Farrell, J. L., “Carrier Phase Processing Without Integers,” pp. 423–428, Proceedings of Institute of Navigation 57th Annual Meeting/CIGTF 20th Biennial Guidance Test Symposium, Albuquerque, New Mexico, June 11-13, 2001
[2] Farrell, J. L., GNSS Aided Navigation and Tracking: Inertially Augmented or Autonomous, American Literary Press, NavtechGPS (distr.), 2007
[3] van Graas, F., Soloviev, A., Uijt de Haag, M., Gunawardena, S., and Braasch, M., “Comparison of Two Approaches for GNSS Receiver Algorithms: Batch Processing and Sequential Processing Considerations,” Proceedings of ION GNSS 2005

By Alan Cameron
August 17, 2008

Guidance, Navigation and Control Challenges for Miniature Autonomous Systems

This workshop on GNC Challenges for Miniature Autonomous Systems is sponsored by the Air Force Research Lab (AFRL) Munitions Directorate and facilitated by the Institute of Navigation. It will take place October 20-22, 2008 at the Holiday Inn Sunspree Resort in Fort Walton Beach, Florida.

The workshop targets the DoD technical and user community, academia, and industry. Submit paper abstracts online by September 2.

Read More >

By Inside GNSS
August 16, 2008

Location Summit 2.0

Sponsored by GIS Development Pvt. Ltd, Location Summit 2.0 is a platform for stakeholders in the location-based services (LBS) community in India and the surrounding region to meet, network, learn, share, and collaborate, helping to facilitate the growth of the industry.

The organizers expect about 300 delegates representing different segments of the LBS community and end users.

It will take place at the Hyderabad Convention Centre in conjunction with Map World Forum (www.mapworldforum.org).

Read More >

By Inside GNSS

AGI 2008 Users’ Conference

"Total Access- Your Technology Pass" is the theme of this year’s Analytical Graphics, Inc. user conference. It features current and future software solutions for space, defense, and intelligence analysis; integration; and collaboration in the form of desktop applications, application engines, and components. It is open to all aerospace, defense, and intelligence professionals.

The event takes place at the Renaissance Schaumburg Hotel and Conference Center outside of Chicago, a 20-minute drive from O’Hare Airport.

Read More >

By Inside GNSS
August 14, 2008

GNSS Hotspots | August 2008


1. COMPASS EXPLAINED
Montreal, Quebec, Canada.
√ An International Committee on GNSS Experts Meeting took place July 15 in Montreal, Canada, with updates on GNSS system developments. A Chinese representative detailed the Compass (Beidou II) signals: 10 services — five free “open” services, and five restricted “authorized” services — centered at eight different carrier frequencies. (See related news article in this issue.)

Read More >

By Alan Cameron
July 1, 2008

ISSSTA 2008 International Symposium on Spread Spectrum Techniques and Applications

The 10th ISSSTA symposium will be held at the Savoia Hotel Regency in Bologna, Italy this August. The theme is "Creating New Dimensions in the Wireless World."

Of particular interest are a session on GNSS on Tuesday, August 26, from 11 a.m. to 1 p.m., and a session on positioning and navigation on Thursday afternoon, August 28.

The main social event will be held at Villa Griffone, the home of radio communications pioneer Guglielmo Marconi, as part of the celebrations for the centennial anniversary of the Marconi Nobel Prize (1909).

Read More >

By Inside GNSS
June 19, 2008

AGNSS Standardization

Location technology is entering ever more deeply into our day-to-day lives. Growing market demand for location-based services (LBS) revolves around a single premise: “Location everywhere, any time.” The requirement for seamless, ubiquitous positioning includes, of course, urban and indoor environments.

Meeting this requirement will provide the cornerstone for a real “boom” in the location market. To realize the capability, location technology is naturally evolving toward the aggregation of several systems used in combination to provide accurate position information to users.

Already today one can see the pre-eminent place of assisted GPS, or AGPS, which is no more than a way of combining telecommunication signals with GPS signals in order to improve real-time positioning capabilities. Many mobile handsets equipped with AGPS are now available, and standardization has played a great role in this early success story.

With the arrival of Galileo and the near-term prospect for implementation of assisted GNSS (AGNSS) receivers, the market is about to take a new technological step in the location solutions offered by mobile communication systems. . .

. . .Conclusion
The central place that standardization could play in the development of the location market was recognized very early in the Galileo program. This was strongly supported by the industrial participants, who combined their competencies and energy to build a new, very efficient location standard around AGNSS.

This industrial cooperation — tangibly achieved within 3GPP and OMA — brought Galileo and the GNSS concept into the heart of the location market. Even more, the achievement by all the private companies involved in the process gave birth to what we could call the first real and tangible AGNSS concept: gathering and standardizing the use of several constellations together to improve location performance, a concept whose advantage EGNOS is already demonstrating.

The work continues to add attractive features to the AGNSS standard that ensure even higher performance in mass-market LBS applications. On the one hand, the companies and organizations involved in this process will have to follow the development of communication technologies; on the other hand, they will have to support the growth of the GNSS concept with the interoperable integration of future satellite navigation systems. Additionally, an even deeper hybridization of Galileo and GNSS with other technologies will have to be standardized in order to tackle the next challenge: providing accurate locations deep indoors.

In parallel, standardization bodies will also have to tackle the challenge of developing test procedures for new AGNSS (or A-GANSS) location technologies, e.g., minimum performance standards, in order to make the feature truly usable.

Nonetheless, the success already achieved in 3GPP and OMA has established an ideal basis and framework for the development of the mass market for location services, starting from the mobile communications domain but also certainly providing an ideal technological basis for other domains such as intelligent transport systems.

To read the rest of Michel Monerrat‘s article, including figures, charts, and graphs, please download the PDF of the entire article above.

By Alan Cameron
June 9, 2008

Integrating Inertial Sensors with GNSS Receivers

GNSS Aided Navigation & Tracking:
Inertially Augmented or Autonomous

By James L. Farrell
Navtech GPS. 2007. Hardcover. 280 pages
ISBN-13: 978-1-56167-979-9

This text offers concise guidance on integrating inertial sensors with global navigation satellite system (GNSS) receivers and other aiding sources. The focus is on low-cost inertial measurement units (IMUs), which require frequent updates.

GNSS Aided Navigation & Tracking:
Inertially Augmented or Autonomous

By James L. Farrell
Navtech GPS. 2007. Hardcover. 280 pages
ISBN-13: 978-1-56167-979-9

This text offers concise guidance on integrating inertial sensors with global navigation satellite system (GNSS) receivers and other aiding sources. The focus is on low-cost inertial measurement units (IMUs), which require frequent updates.

Dr. Farrell has many decades of experience in this subject area and the book is teeming with insights that are hard to find or unavailable elsewhere.

An engineer and university teacher, Farrell has made a number of contributions to inertial navigation technology and integrated navigation systems, including extensive error propagation analyses, synthetic aperture radar motion compensation, and transfer alignment algorithms. He is the author of Integrated Aircraft Navigation (1976) and many articles.

The book is perhaps best suited for engineers with some familiarity with both IMUs and GNSS, although there is an excellent “Review of Fundamentals” chapter that provides a brief summary of prerequisite subject matter.

The introductory chapters also point to the literature for additional background reading.

The main sections detail the steps necessary to yield robust three-dimensional position, velocity, and attitude estimates from low-cost IMU sensors aided by frequent GNSS updates.

The assumption of frequent aiding-source updates, combined with an emphasis on applications that require precise velocity rather than extreme precision in position, results in numerous simplifications in the integration. All aspects of a typical integration are covered, including raw measurement pre-processing, position/velocity/attitude estimation, coordinate systems, and the provision of integrity.

Experimental results are described to illustrate the attainable accuracies (velocities better than 3 cm/s in three dimensions). The closing chapters of the text include a discussion of other applications of the integration formulation, for example, tracking.

Find out more

By
June 6, 2008

GNSS Hotspots | June 2008


1. CANADA AND U.S. FIGHT OVER OREGON – AND GPS IS THERE!
Kingston, Ontario, Canada.
√ The Canadian navy built the Murney Tower when Canada and the U.S. fought over Oregon in 1846. Cruises of this Kingston, Ontario region feature the world’s first wireless GPS-triggered audio tours — in six languages, no less. The UNESCO World Heritage Site features old fortifications guarding the Rideau Canal.

Read More >

By Alan Cameron