Survey and Mapping

April 1, 2006

Geodesy and Satellite Navigation

There has always been a love-hate relationship between geodesy and satellite navigation. Indeed, satellite positioning started life as an extension of terrestrial geodesy. When the first satellite, Sputnik 1, started orbiting the Earth in 1957, geodesists in several countries realised that satellites offered substantial potential as a geodetic positioning and navigation tool.

The basic technologies of terrestrial geodesy of the day, notably triangulation, traversing, and precise leveling, were slow and cumbersome, mainly because of the effect of the curvature of the surface of the Earth, which limited the range of measurements to theodolite observations between points situated on hilltops, observation towers, and triangulation masts.

The advent of EDM (electronic distance measurement) in the 1960s helped terrestrial geodesy, but it, too, was affected by the same limitation, namely the shortness of observable EDM ranges due to the Earth’s curvature.

Earth orbiting satellites did not suffer from this drawback. They could be viewed simultaneously from several points on Earth, and therefore direction and range measurements made, provided that the space vehicles were not obscured by high natural features or tall man-made structures. This led to several new satellite geodesy positioning methodologies.

The first of these was satellite triangulation, used initially to supplement and strengthen terrestrial triangulation networks. Satellite triangulation consisted of geodetic direction measurements derived from high-power photographs of satellite orbits taken against a background of stars with known right ascension and declination.

A few years later, this was followed by range measurements to satellites, made from Earth-bound EDM equipment to corner cube reflectors placed on the early satellites. The methodology used thus far was an extension of geodetic astronomy, with little reference to physical geodesy.

This situation changed significantly when geodesists realized that they could use the Doppler shift on the signal broadcast from a satellite to obtain differential range measurements that, together with the known Keplerian orbit of the satellite, could lead to a relatively fast positioning, or navigation, method. The Keplerian orbital motion of satellites is primarily based on the Earth’s gravity field, a subject of expertise by practitioners of physical geodesy.
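
The Doppler observable described above reduces to range information with simple arithmetic. The following is a minimal sketch, not the actual Transit processing chain; the 400 MHz carrier, the 8 kHz shift, and the cycle count are invented values for illustration:

```python
C = 299_792_458.0  # speed of light, m/s

def range_rate(f_transmit_hz, f_received_hz):
    """Range-rate (m/s) from the observed Doppler shift.

    A positive result means the satellite is receding."""
    doppler = f_received_hz - f_transmit_hz
    return -doppler * C / f_transmit_hz

def differential_range(f_transmit_hz, doppler_count_cycles):
    """Change in range implied by an integrated Doppler (cycle) count,
    the basic differential-range observable: each accumulated cycle
    corresponds to one carrier wavelength of range change."""
    wavelength = C / f_transmit_hz
    return -doppler_count_cycles * wavelength

# A carrier received 8 kHz high means the satellite is approaching at
# roughly 6 km/s; a loss of 1000 cycles means roughly 750 m more range.
rr = range_rate(400e6, 400e6 + 8000.0)
dr = differential_range(400e6, -1000.0)
```

A sequence of such differential ranges, combined with the known Keplerian orbit, is what turned the Doppler shift into a positioning method.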

This technical advance gave birth to Transit-Doppler, the first satellite navigation technology. Transit-Doppler was used in the late 1970s and early 1980s not only for the positioning of naval ships and of submarines surfacing in the polar regions, but also for the strengthening and scaling of national and continental terrestrial triangulation networks.

However, practitioners soon realized that positioning by Transit-Doppler to a reasonable degree of accuracy took several minutes, precluding its use as a full navigation methodology, which requires quasi-instantaneous positioning.

Enter GPS
These were the early days of a new global satellite positioning, navigation, and timing system, first called the NAVSTAR Global Positioning System, a name later shortened to just GPS. The rest is history. The early decision to base GPS on a constellation of 24 medium-Earth orbit satellites was taken on the advice, as you would expect, of geodesists at the U.S. Naval Surface Weapons Center in Dahlgren, Virginia.

The close relationship between the early GPS and geodesy was further demonstrated by the adoption of WGS84, the World Geodetic System 1984, as the basis of the 3-D coordinate system of GPS. As GPS was born during the Cold War, it was declared a US military navigation system, with full access to NATO but only restricted access and down-graded positioning accuracies for civilian users.

This so-called Selective Availability (SA) gave the green light to the civilian geodetic community to come up with new methodologies that could counter the effects of SA. As always, human ingenuity did not disappoint, and two new differential techniques were developed. The first was the differential GPS (DGPS) technique, which improved the relative positioning accuracy of GPS by one order of magnitude, down to a few meters. As a result, DGPS soon became the standard methodology for the offshore positioning of oil platforms, pipelines, etc.
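
The DGPS principle lends itself to a tiny numerical sketch. Assuming a single satellite and one-dimensional ranges (all numbers below are invented), the reference station computes the error in its measured pseudorange against its known position and broadcasts it as a correction, which nearby rovers apply:

```python
def pseudorange_correction(known_range, measured_range):
    """Correction = true range minus measured range at the reference
    station; it absorbs the errors common to nearby receivers."""
    return known_range - measured_range

# Both receivers see nearly the same satellite, clock, and atmospheric
# errors; only a small residual differs between them (illustrative):
common_error = 18.0                       # meters of shared bias
true_base_range = 20_200_000.0
true_rover_range = 20_200_450.0
base_measured = true_base_range + common_error
rover_measured = true_rover_range + common_error + 0.8  # residual noise

corr = pseudorange_correction(true_base_range, base_measured)
rover_corrected = rover_measured + corr   # shared 18 m bias cancels
```

Because the bias common to both receivers cancels, the rover's error drops from tens of meters to the small uncorrelated residual, which is why DGPS reached few-meter accuracy even under SA.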

The next advance in improving the accuracy of satellite positioning was made on the advice of radio-astronomers, who proposed replacing the standard GPS pseudorange measurements, which are based on timing the modulated signal from satellite to receiver.

Instead, they suggested making measurements on the basic carrier frequencies of these signals, just as they did with extra-galactic signals arriving at, say, two widely spaced radio telescopes in so-called very long baseline interferometry (VLBI), leading as a by-product to the Cartesian coordinate differences between the two telescopes. This was the beginning of centimetric positioning by the carrier phase GPS method, which was later developed further by geodesists into kinematic GPS and centimetric navigation.

GPS had now become the universal high precision quasi-instantaneous positioning and navigation tool, creating the basis for hundreds of new applications. Again, geodesists led the way, concentrating on high precision scientific and engineering applications. These included surveying and mapping, positioning in offshore engineering, the monitoring of local crustal dynamics and plate tectonics, the relative vertical movements of tide gauges, and the continuous 3-D movements of critical engineering structures, such as tall buildings, dams, reservoirs, and long suspension bridges.

All of these applications required very high relative positioning accuracies, but not quasi-instantaneously as in the safety-critical navigation and landing of civilian aircraft. This came much later.

Geodesy and Navigation
Initially, GPS was considered as a standard navigation tool for military vehicles on land, sea, and air, but not for safety-critical civilian transportation. This was because, unlike military positioning and navigation, safety-critical civilian transportation not only requires quasi-instantaneous and accurate positioning, but also so-called “high integrity and good coverage.”

Geodesists will immediately realize that “integrity” stands for the geodetic concept of “reliability,” whereas “coverage” refers to the availability of a sufficient number of satellites that can be sighted by a receiver continuously and are not obscured by natural or man-made obstructions, such as high mountains, tall buildings, and the wings of an aircraft.

On its own, GPS cannot meet these requirements to the level required in safety-critical civilian transportation. Military transportation, on the other hand, has relatively modest requirements, which can be met by GPS. Indeed, you do not become a NATO Air Force pilot if you want a safe life. Flying as a passenger on a commercial airliner is something else altogether.

The penetration of satellite navigation, and primarily GPS, into civil aviation involved yet again, as you would expect, geodesists. They had to develop jointly with the civil aviation community the necessary theoretical and practical tools, which could be used to establish and quantify their requirements of accuracy, integrity, and coverage.

This involved the use of existing geodetic tools, such as the covariance matrix, the analysis of least squares residuals, and the well-established geodetic reliability measures. New tools were also introduced, such as the concept of RAIM or receiver autonomous integrity monitoring, based on the analysis of the least squares residuals.
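
The residual-based idea behind RAIM can be sketched in miniature. The example below is a deliberately simplified model: several one-dimensional "pseudorange" measurements of a single unknown (think of the receiver clock bias), solved by least squares. Real RAIM tests the residuals of the full four-unknown position and clock solution; the sigma and threshold values here are illustrative, not certified figures.

```python
def raim_check(measurements, sigma=1.0, threshold=3.0):
    """Return (estimate, ok): the least-squares estimate plus an
    integrity flag that is False when any normalized residual
    exceeds the threshold."""
    estimate = sum(measurements) / len(measurements)  # LS solution, 1 unknown
    worst = max(abs(m - estimate) for m in measurements) / sigma
    return estimate, worst <= threshold

good = [100.1, 99.9, 100.0, 100.2, 99.8]
est, ok = raim_check(good)              # consistent set: ok is True
est2, ok2 = raim_check(good + [112.0])  # gross error present: ok2 is False
```

The essential point, familiar from geodetic reliability theory, is that redundancy is what makes the check possible: with no surplus measurements, the residuals carry no information about faults.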

Persuading Non-Geodesists
These geodetic tools, which were highly beneficial to the civil aviation community, initiated a fruitful, long-term collaboration between the two communities. However, this has not always been a straightforward and smooth relationship, and it involved — especially at the beginning — a deep suspicion of these “academic” geo-scientists. Here are a few notable examples of this love-hate relationship.

As a general rule, the existing civil aviation horizontal coordinates were based on latitudes and longitudes, with no reference to a particular geodetic datum. Heights in civil aviation were and still are based on barometric altimetry, on the assumption that all that matters is “the relative heighting between airplanes,” which is not affected significantly by a change in barometric pressure.

This assumption disregards, of course, the fact that the heights of natural features on the ground, such as mountains, do not change with changing barometric pressure. The first challenge was to convince the international civil aviation community that their horizontal coordinates, that is, latitudes and longitudes, required a proper geodetic datum and that, as GPS was being contemplated as a future navigation tool, it made sense to adopt the same reference datum, namely WGS84. It took a while for the community to accept this.

The adoption of WGS84 led to the resurveying of most airports, runways, and various en route and landing navigation aids in order to bring them into WGS84, in preparation for the introduction of GPS. This led to the discovery of some large discrepancies, at airports and among navaids in many countries, between the existing horizontal coordinate values and their new WGS84 equivalents. Geodesists will be familiar with such occurrences, whenever they start dealing with a new community, whether they are civil or offshore engineers, oceanographers or meteorologists.

The first GPS receivers did not lend themselves to mass market adoption. Geodesists of a certain age will also remember some of the earliest commercial GPS receivers, such as the TI 4100 receivers, made by Texas Instruments. These early receivers operated by measuring sequentially four pseudoranges to four different satellites. Consequently, the receivers were programmed to first check the geometry of the satellites in view and decide on the best four in terms of geometrical configuration.

However, later on, with the emergence of new receivers that could measure all the available pseudoranges quasi-simultaneously, there was no need to carry on with measurements only to the “best four” satellites. One could track all available satellite signals and process these measurements by least squares, rejecting those with relatively large residuals, if any. This standard processing of observations is bread-and-butter stuff to surveyors and geodesists.
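
The all-in-view approach with residual screening can be illustrated on a toy problem. The sketch below is a two-dimensional trilateration (real GPS solves three position components plus the receiver clock from pseudoranges), with invented anchor positions and an invented gross error; the rejection step simply discards the measurement with the worst residual and re-solves.

```python
import math

def solve_2d(anchors, ranges, x, y, iterations=10):
    """Gauss-Newton fit of (x, y) to measured ranges from known anchors."""
    for _ in range(iterations):
        # Accumulate the 2x2 normal equations (J^T J) d = J^T r by hand.
        a11 = a12 = a22 = b1 = b2 = 0.0
        for (ax, ay), rng in zip(anchors, ranges):
            d = math.hypot(x - ax, y - ay)
            ux, uy = (x - ax) / d, (y - ay) / d   # unit vector = Jacobian row
            r = rng - d                           # range residual
            a11 += ux * ux; a12 += ux * uy; a22 += uy * uy
            b1 += ux * r;  b2 += uy * r
        det = a11 * a22 - a12 * a12
        dx = (a22 * b1 - a12 * b2) / det
        dy = (a11 * b2 - a12 * b1) / det
        x, y = x + dx, y + dy
    return x, y

def residuals(anchors, ranges, x, y):
    return [rng - math.hypot(x - ax, y - ay)
            for (ax, ay), rng in zip(anchors, ranges)]

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
true = (3.0, 4.0)
ranges = [math.hypot(true[0] - ax, true[1] - ay) for ax, ay in anchors]
x, y = solve_2d(anchors, ranges, x=5.0, y=5.0)         # converges to (3, 4)

# Inject a fifth, grossly wrong measurement, then screen it out:
anchors.append((5, 20)); ranges.append(10.0)           # truth would be ~16.1
x2, y2 = solve_2d(anchors, ranges, x=5.0, y=5.0)       # pulled off target
res = residuals(anchors, ranges, x2, y2)
bad = max(range(len(res)), key=lambda i: abs(res[i]))  # worst residual
del anchors[bad]; del ranges[bad]
x3, y3 = solve_2d(anchors, ranges, x=5.0, y=5.0)       # back to (3, 4)
```

Using every measurement and letting the residuals expose the outlier beats discarding data in advance, which is exactly the surveyor's argument against the "best four" practice.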

However, this was not the case with a number of navigation experts, who persisted in recommending the use of only the “best four” satellites for quite some time before they finally abandoned the practice.

A New Era of GNSS
Satellite navigation and positioning has changed substantially and significantly over the last 5 to 10 years. With Galileo in its development and in-orbit validation phase, the future developments in GPS IIF and GPS III, renewed interest in GLONASS, and satellite navigation initiatives in Japan, China, India, Australia, and several other countries, GNSS or the Global Navigation Satellite System is moving from being a concept, largely based on GPS alone, to a full global reality. A comprehensive program of GPS modernization currently under way aims to deliver significant improvements to both military and civil users.

The earliest mass-market applications of GPS involved road vehicles and mobile phones. In both cases, the twin aims are navigation (where am I, and how do I go to my destination?) and tracking (where is he, she, or it?). In the case of road vehicle tracking, successful applications include fleet monitoring (taxis or road transport companies), theft recovery of private cars, “black box” incident recorders, and the transport of hazardous or valuable cargoes.

Typically, most of these applications share three common features, namely prior knowledge of the proposed route, the continuous tracking of position and velocity by GPS, and the trigger of an alarm by a significant deviation.
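
The three features above (a known route, continuous position tracking, and an alarm on significant deviation) reduce to a simple geometric test. The sketch below uses planar coordinates and a 100-meter limit, both invented for illustration:

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b (all (x, y))."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy)
                           / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def off_route(fix, route, limit=100.0):
    """True if the fix is farther than `limit` from every route leg."""
    return min(point_segment_distance(fix, a, b)
               for a, b in zip(route, route[1:])) > limit

route = [(0, 0), (1000, 0), (1000, 1000)]  # planned route as a polyline
alarm1 = off_route((500, 20), route)       # 20 m off the leg: no alarm
alarm2 = off_route((500, 400), route)      # 400 m off: raise the alarm
```

A fleet-monitoring server would run a test of this kind against each incoming GPS fix and alert the operator when it returns True.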

Similarly, a number of GPS tracking applications use mobile phone technology (GSM or GPRS), but these are not as developed and widespread as vehicle tracking. Typically, these involve vulnerable people, such as young children, the elderly, key workers in some risky environments (for instance, railways), individuals with a chronic or contagious disease, and even VIPs.

Person tracking with GPS+telematics could also involve judicial cases (ordered by a court of law), of suspected criminals or anti-social elements. Other proposed applications include environmental information, location-based security, and location-sensitive marketing.

On its own, a GPS-enabled phone offers location and communication. This may answer the questions “Where is she or he?” and “Where am I?” but nothing more. However, when position and communication are combined with an appropriate geographic information system (GIS) database and a direction sensor, the combined system could answer two other very important questions, namely “What’s around me?” and “What’s that building over there?”

This could be achieved by a GPS+compass device, providing positional and directional data, which the mobile phone or the PDA transmits to a remote server. The server calculates the user’s position and identifies the building along the measured azimuth, gets the relevant information from the database, and sends it back to the client.
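
The server-side lookup just described can be sketched with flat-earth geometry, adequate over a few hundred meters: project a sight line from the user's position along the compass azimuth and return the nearest database entry close to that line. The building names, coordinates, and the 30-meter tolerance below are all invented.

```python
import math

EARTH_RADIUS = 6_371_000.0  # mean Earth radius, meters

def to_local_xy(lat0, lon0, lat, lon):
    """East/north offsets in meters of (lat, lon) from (lat0, lon0)."""
    east = math.radians(lon - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    north = math.radians(lat - lat0) * EARTH_RADIUS
    return east, north

def building_along_azimuth(user_lat, user_lon, azimuth_deg, buildings,
                           max_offset=30.0):
    """Nearest building within max_offset meters of the sight line.
    Azimuth is in degrees clockwise from north."""
    az = math.radians(azimuth_deg)
    dir_e, dir_n = math.sin(az), math.cos(az)
    best, best_range = None, float("inf")
    for name, lat, lon in buildings:
        e, n = to_local_xy(user_lat, user_lon, lat, lon)
        along = e * dir_e + n * dir_n        # distance along the ray
        cross = abs(e * dir_n - n * dir_e)   # perpendicular miss distance
        if along > 0 and cross <= max_offset and along < best_range:
            best, best_range = name, along
    return best

def offset(lat0, lon0, east, north):
    """Helper to place fabricated buildings at metric offsets."""
    lat = lat0 + math.degrees(north / EARTH_RADIUS)
    lon = lon0 + math.degrees(east / (EARTH_RADIUS * math.cos(math.radians(lat0))))
    return lat, lon

user = (51.500, -0.100)
buildings = [("Museum",) + offset(*user, 0.0, 200.0),    # 200 m due north
             ("Theatre",) + offset(*user, 150.0, 150.0)]
name = building_along_azimuth(*user, 0.0, buildings)     # looking north
```

In a deployed system, the azimuth would come from the compass sensor and the candidate list from a GIS query around the user's position; the geometry is the easy part.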

This is clearly valuable for the public utilities (water, gas, electricity, TV), shopping and leisure (restaurant menus, theater tickets), house hunting (details of the property advertised for sale), and of course, for visitors and tourists (museums, notable buildings, archaeological sites).

Leaving mobile phones aside, satellite navigation can also be used for location-based security. For example, a briefcase or a portable PC can be programmed to unlock only in a specified location and nowhere else. This would minimize the risk of sensitive military or commercial material falling into the wrong hands.

Some working prototype systems already exist. Other location-and-context-based applications under consideration include the marketing and selling of goods, the reception of pay-TV, credit card security, spectator sports, road user charging and many others.

Indeed, the qualification of “critical application” is no longer restricted to safety-critical transportation, but it also applies now to financial-critical, legal-critical, security-critical, and business-critical applications as well. This creates a problem with standard off-the-shelf autonomous GPS receivers, which cannot operate indoors, because of signal attenuation and multipath.

Over the last few years, GPS chip and receiver manufacturers have tried, with some success, to develop high sensitivity GPS (or HS-GPS). The latest HS-GPS receivers, which incorporate up to 200,000 correlators operating in parallel, make it relatively easy to identify true pseudoranges from among the many signal and multipath reflections. Several manufacturers in the United States, Japan, Korea, and Europe, already advertise HS-GPS chips, and many other companies use such chipsets in their receivers.

GNSS Evolution
Like nearly all the technologies that preceded it, satellite navigation and positioning is going through the standard stages of development from birth to maturity. Older surveyors and geodesists may well remember the advent of EDM, using microwaves or lightwaves, in the late 1960s and the 1970s. When the first EDM instruments were introduced, the distances they measured were also measured with tapes, just in case.

Then came the second phase, when surveyors became fully confident about EDM and used it routinely for fast and precise range measurements. It took a few years and several critical mistakes in local mapping and national triangulation, to realize that EDM instruments could go wrong and that they had to be calibrated regularly in order to determine their accuracy and systematic biases.

The development of satellite navigation and positioning is following practically the same stages as EDM did 40 years ago. Only now can we formalize these successive stages of technology development and give them names, using Gartner’s famous “Hype Cycle Curve,” which was introduced about 10 years ago in conjunction with new information technology products.

Using a simplified version, these successive stages of technology development are now formally called the “Technology Trigger,” followed by the “Peak of Inflated Expectations,” leading to the “Trough of Disillusionment,” happily followed by the “Slope of Enlightenment,” and hopefully leading to the “Plateau of Productivity.”

As I write this, the first Galileo satellite, GIOVE-A, has been launched and tested successfully, opening a new era in satellite navigation. Hopefully, this will lead to the development of a large number of new critical applications — and involve close collaboration with geodesy and several other related disciplines — for the benefit of business, government and society.

Here is one last example about the strange relationship between geodesy and GPS. The U.S. delegation to the International Telecommunications Union (ITU) recently proposed to abolish leap seconds, and thus cut the link between Solar Time and Coordinated Universal Time (UTC) and ipso facto GPS Time.

At present, whenever the difference between UTC and Solar Time approaches 0.7 second, a leap second correction is made in order to keep the difference between them under 0.9 second. This is done every few years on the recommendation of the International Earth Rotation and Reference Systems Service, which monitors continuously the difference between Solar Time and UTC.
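
The bookkeeping behind this rule can be made concrete with a toy simulation: accumulate the drift of Solar Time against UTC and insert a leap second whenever the difference approaches the trigger, keeping it under 0.9 second. The drift rate and trigger value below are illustrative round numbers, not the IERS procedure.

```python
def simulate_leap_seconds(years, drift_s_per_year=0.6, trigger=0.7):
    """Return (leap_second_count, final Solar Time - UTC difference)."""
    diff = 0.0   # Solar Time minus UTC, seconds
    leaps = 0
    for _ in range(years):
        diff -= drift_s_per_year     # Earth's rotation lags atomic time
        if abs(diff) >= trigger:
            diff += 1.0              # insert one leap second into UTC
            leaps += 1
        assert abs(diff) < 0.9       # the invariant UTC maintains
    return leaps, diff

leaps, diff = simulate_leap_seconds(10)  # a handful of leaps per decade
```

The irregular spacing of the insertions is the point: because the Earth's rotation rate varies, the corrections cannot be scheduled far in advance, which is what makes them awkward to program into GPS Time handling.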

This leap second correction, which has to be applied every few years to GPS Time, apparently causes software problems because it has to be programmed in manually. However, considering the difficulties that this change would cause to other scientific communities, such as astronomers, and even to users of GPS time itself in some critical applications, the U.S. proposal has been postponed for the time being.

In conclusion, I must declare a conflict of interest. Although all the work I do at present involves GNSS, my academic background is clearly in geodesy. However, a change is in the air now, as safety-critical transportation is no longer the only critical application that has to be catered to. It has now been joined by several other emerging critical applications, notably financial-critical, legal-critical, security-critical and business-critical applications, which will also require nearly the same level of accuracy, integrity and coverage as safety-critical transportation.

This is where geodesy could step in again and create some new statistical tools, which will differentiate between the navigation and positioning systems on offer, and assess their suitability for the specific critical application.

For figures, graphs, and images, please download the PDF of the article, above.

March 1, 2006

Building Monitors

Severe loading conditions such as strong winds and earthquakes acting on modern tall buildings and structures can cause significant loads and vibrations. Recent trends toward slender, flexible, and light-weight buildings have left a large number of buildings susceptible to wind-induced motion. Furthermore, human perception of building motion has become a critical consideration in modern building design.

More complex building shapes and structural systems further accentuate eccentricities between the mass center, the elastic center, and the instantaneous point of application of aerodynamic loads, and consequently will generate significant torsional effects.

Verifying dynamic structural analysis requires the development of direct dynamic measurement tools and techniques in order to determine the natural frequencies, damping characteristics, and mode shapes. Among these tools, accelerometers have played the most important part in analyzing structural response to severe loading conditions. However, they provide only a relative acceleration measurement, and displacement cannot be obtained from acceleration directly by double integration.

In contrast to accelerometers, GPS can directly measure position coordinates, thereby providing an opportunity to monitor, in real time and at full scale, the dynamic characteristics of a structure. GPS used in the real-time kinematic mode (GPS-RTK) offers direct displacement measurements for dynamic monitoring. Earlier studies by the authors and other researchers, referenced in the Additional Resources section at the end of this article, have shown the efficiency and feasibility of structural deformation monitoring by combining accelerometers and GPS-RTK.

However, GPS-RTK has its own limitations. For example, the measurement accuracy can be affected by multipath and depends strongly on satellite geometry. Moreover, the typical GPS-RTK sampling rate of 20 Hz limits its capability to detect the higher-mode signals of some structures. The new 100 Hz GPS-RTK systems need to be further tested in order to ensure the independence of the measurements.

In order to exploit the advantages of both GPS-RTK and accelerometers, two data processing strategies have typically been used, namely to convert GPS measured displacement to acceleration through double differentiation and compare it with the accelerometer measurements (what we refer to as forward transformation), or to convert the accelerometer measurements into displacement through double integration and compare it with GPS measured displacement (the reverse transformation).
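
The forward transformation is the easier of the two, and a minimal numerical sketch shows why: a central-difference second derivative turns a displacement series into acceleration with no unknown constants. The 20 Hz rate matches the typical GPS-RTK figure cited above; the 1 Hz, 1 cm sinusoid is a made-up stand-in for structural sway.

```python
import math

def second_derivative(samples, dt):
    """Central-difference second derivative; loses one sample per end."""
    return [(samples[i - 1] - 2 * samples[i] + samples[i + 1]) / dt**2
            for i in range(1, len(samples) - 1)]

dt = 0.05                          # 20 Hz GPS-RTK sampling
f = 1.0                            # 1 Hz sway frequency (illustrative)
t = [i * dt for i in range(201)]   # 10 s of data
disp = [0.01 * math.sin(2 * math.pi * f * ti) for ti in t]  # 1 cm amplitude
acc = second_derivative(disp, dt)
# Analytically the acceleration amplitude is (2*pi*f)^2 * 0.01,
# about 0.39 m/s^2, which the differenced series closely reproduces.
```

In practice the differentiation also amplifies GPS measurement noise, which is why filtering precedes the comparison with accelerometer output.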

The latter approach is much more challenging because we have to determine two integration constants in order to recover all the components of displacement (static, quasi-static and dynamic). If the structure to be monitored is subject to a quasi-static force, as in the case of a typhoon, this further complicates the analysis.

Although earlier research has proposed a lab-based threshold setting for accelerometers to deal with the quasi-static issue, we believe that avoiding this procedure and developing new ways to recover the false and missing measurements from GPS by acceleration transformation would provide a preferred approach.

This article discusses recent efforts to design such a system based on a new integration approach that employs the correlation signals directly detected from a GPS-RTK system and an accelerometer to transform one form of measurement to the other. The methodology consists of a Fast Fourier Transform (FFT) for correlated signal identification, a filtering technique, delay compensation, and velocity linear trend estimation from both GPS and accelerometer measurements. We also present results derived from its installation on structures in Japan that subsequently experienced the effects of an earthquake and typhoon.
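
The reverse transformation can be sketched in the same spirit. The code below double-integrates a synthetic acceleration record with the trapezoidal rule and removes the best-fit straight line after each integration, a simplified stand-in for the velocity linear trend estimation mentioned above; the 100 Hz rate and 1 Hz, 1 cm motion are invented for illustration.

```python
import math

def integrate(samples, dt):
    """Cumulative trapezoidal integral, starting from zero."""
    out = [0.0]
    for i in range(1, len(samples)):
        out.append(out[-1] + 0.5 * (samples[i - 1] + samples[i]) * dt)
    return out

def detrend(samples, dt):
    """Remove the least-squares straight line: the spurious offset and
    constant-velocity drift introduced by the integration constants."""
    n = len(samples)
    t = [i * dt for i in range(n)]
    tm = sum(t) / n
    sm = sum(samples) / n
    slope = (sum((ti - tm) * (si - sm) for ti, si in zip(t, samples))
             / sum((ti - tm) ** 2 for ti in t))
    return [si - (sm + slope * (ti - tm)) for ti, si in zip(t, samples)]

dt = 0.01                               # 100 Hz accelerometer (illustrative)
w = 2 * math.pi * 1.0                   # 1 Hz motion
acc = [-0.01 * w * w * math.sin(w * i * dt) for i in range(1001)]
vel = detrend(integrate(acc, dt), dt)
disp = detrend(integrate(vel, dt), dt)  # recovers the ~1 cm sinusoid
```

Note what the detrending also removes: any genuine static or quasi-static displacement, which is exactly why the quasi-static loading of a typhoon complicates this approach and why the GPS measurements are needed to restore those components.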

(For the rest of this story, please download the complete article using the PDF link above.)

January 1, 2006

Will Success Spoil GPS?

Like some behemoth rocket ship launched in the 1970s, the Global Positioning System sails on through an expanding universe of users and applications, seemingly imperturbable, successful beyond the expectations of its creators, an enormous momentum carrying it into the third millennium.

To all appearances, GPS is prospering more than ever: a second full signal (L2C) is becoming available to civil and commercial users, a denser ground monitoring system is being built out, improved accuracies are being squeezed out of the algorithms and operational practices at the Master Control Station at Schriever Air Force Base, prices are dropping on products with more features and functions than ever, and hundreds of millions of receivers are in use around the world. A follow-on generation (Block IIF) of satellites with a third civil signal (at the so-called L5 frequency) is being built by Boeing for launch beginning in 2007.

Since its first satellite launch 28 years ago, GPS has blazed a trail for satellite-based positioning, navigation, and timing. Thanks to GPS, global navigation satellite systems have gone from being a technological unknown to becoming a widely recognized utility. GPS, a model and inspiration to its imitators across the oceans.

Or is it?

In fact, for some years now GPS has been a victim of its own success. Performing better than advertised, the system has suffered from budgetary pilfering for other defense programs and risks getting lost in the shifting maze of diffuse dual-use management responsibilities.

“History has shown that the Air Force has had chronic difficulty in adequately funding GPS, even in the absence of the more expensive GPS III satellites,” observes a high-level Defense Science Board (DSB) task force report on GPS issued late last year. “If the Air Force continues to use its GPS investments as a funding source to offset other space/aircraft programs, then GPS service continuity will remain in jeopardy even without the more costly GPS III.” (See article “Bold Advice” in this issue.)

Meanwhile, an Air Force Space Command projection puts the worst-case probability of the GPS constellation falling below its fully operational capability (FOC) of 24 space vehicles sometime between 2007 and 2012 as 20–40 percent. Indeed, the task force argues for a 30-satellite constellation to ensure robust coverage in “challenged environments.”

The timelines for the last three GPS satellite development and launch programs — Block IIR, IIR-M, and III — all slid to the right, as schedule delays are described these days.

Intermittently starved for fuel, with sporadic guidance from the helm, will new resources reach the system before its speed inevitably begins to slow, threatening its being overtaken by other GNSS vehicles?

Okay, that’s the bad news.

The good news is that no one connected to the program wants to let one of the world’s leading U.S.-branded utilities slip into the shadow of the other GNSSes under development. And steps are under way to ensure that doesn’t happen.

New Game Plan

A long-awaited next-generation program, GPS III, spent well more than a hundred million dollars on conceptual studies and several years jogging in place before receiving a renewed go-ahead from the Department of Defense (DoD). The Fiscal Year 2006 (FY06) federal budget allocated $87 million for GPS III. The FY07 budget will be finalized soon in Washington, and current indications are that GPS Block III will receive at least $237 million, according to the GPS Joint Program Office (JPO). Of course, GPS III funds have been zeroed out before.

Current plans call for a GPS JPO decision this summer that chooses among proposals submitted for separate space vehicle (SV) and operational control (OCX) segment contracts. Once acquisition strategies are formally approved in Washington, the GPS Block III SV request for proposals (RFP) is expected to be released by mid-February, with the OCX RFP following later in the spring, according to the JPO.

“Minor adjustments are being implemented in the program planning to reflect an incremental development and delivery approach for both acquisitions that will provide increased GPS capability sooner and more frequently over the life of the program,” the JPO told Inside GNSS. Nonetheless, an upgrade in the control segment to accommodate the new generations of satellites is behind schedule, which means that the capability to operationally control those signals will not be available until 2009 at the earliest, according to the DSB task force.

Modernizing Technology

In terms of its fundamental design, the Global Positioning System is nearly 35 years old. More recent spacecraft designs using modern electronics, new rubidium clocks, better satellite management techniques, and navigation message enhancements have improved performance. But the design of the key resource for manufacturers and users, the GPS signals-in-space, is essentially the same as when the first satellite was launched in 1978: a C/A-code on L1 (centered at 1575.42 MHz) and P/Y-code military signals at L1 and L2 (1227.60 MHz).

Over the next five years, however, this situation will change dramatically.

Beginning with SVN53/PRN17, the first modernized Block IIR (IIR-M) satellite built by Lockheed Martin and launched last September 25, GPS has gained a new open civil signal at L2 (centered at 1227.6 MHz). A third civil signal, L5 (centered at 1176.45 MHz) will arrive with the Block IIF satellites now scheduled to begin launching in 2007.

Both IIR-M and IIF satellites will offer new military M-code signals at L1 and L2 with “flex power” capability of transmitting stronger signals as needed. The L5 civil signal will be broadcast both in phase and in quadrature, with the quadrature signal being broadcast without a data message. Air Force Space Command expects to have a full complement of satellites transmitting L2C and M-code signals by 2013; for L5, fully operational capability is expected by 2014.

Generally, the new signals will be characterized by longer code sequences broadcast at a higher data rate and with slightly more power. Beginning with the IIR-M satellites, the Air Force will be able to increase and decrease power levels on P-code and M-code signals to defeat low-level enemy jamming — a capability known as “flex power.”

These new signal features will support improved ranging accuracy, faster acquisition, lower code-noise floor, better isolation between codes, reduced multipath, and better cross-correlation properties. In short, the new signals will be more robust and more available.

Looking farther ahead, another civil signal at L1 is planned to arrive with the GPS III program. Under a GNSS agreement signed with the European Union in June 2004, this will be a binary offset carrier (BOC 1,1) signal similar or identical to that of the Galileo open signal. This is expected to simplify the combined use of GPS and Galileo signals. Nominal first launch date for a GPS III spacecraft is currently 2013.

Modernization will also take place in the ground control segment. Six GPS monitoring stations operated by the National Geospatial-Intelligence Agency (formerly the National Imagery and Mapping Agency) have been folded into the existing five Air Force GPS monitoring stations (which include the Master Control Station at Schriever AFB, Colorado). This will eliminate blank spots in coverage and support Air Force plans to monitor the integrity (or health) of civil signals as well as military signals.

New Political Structure

Under a presidential national security policy directive (NSPD) released in December 2004, a National Space-Based Positioning, Navigation, and Timing (PNT) Executive Committee and Coordination Office have taken over from the Interagency GPS Executive Board (IGEB). Mike Shaw, a long-time GPS hand on both sides of the civil/military interface, stepped in toward the end of 2005 as the first director of the PNT coordination office.

Establishment of the PNT committee — now cochaired by deputy secretaries of defense and transportation, Gordon England and Maria Cino, respectively — kicked GPS leadership up a notch from that of the IGEB. Other members include representatives at the equivalent level from the departments of state, commerce, and homeland security, the Joint Chiefs of Staff and the National Aeronautics and Space Administration.

The committee had met once shortly after its formation under President Bush’s NSPD, but a January 26 gathering marked its first meeting with the current leadership. In addition to getting acquainted with one another and the PNT topic in general, the agenda covered such issues as the DSB task force report, modernization and funding of GPS, and the new UN International Committee on GNSS (see article "What in the World is the UN Doing About GNSS?" in this issue).

Without a director and coordination board in place, the executive committee was unable to get on with many of the tasks assigned it by the presidential directive, including writing a five-year plan for U.S. space-based PNT and appointing an advisory board of outside experts. With Shaw on board, the coordination board now has seven staff members detailed from agencies represented on the executive committee.

A charter for the advisory board has been drafted and awaits approval by the committee, as does a draft of an international PNT strategy prepared by the State Department under the direction of Ralph Braibanti, who heads that agency’s space and advanced technology staff.
