New competitive demands and regulatory requirements are driving financial firms around the world to use highly precise timestamps on incoming information and transactions, a trend likely to push firms to adopt a common time source like GPS.
“Where we are today and where we will be in about three to four years is going to be markedly different because the demand for very precise timing within the industry is changing rapidly,” said Andrew F. Bach, the chief architect for the financial services team at Juniper Networks, a firm that provides high-performance networks.
Part of the impetus for the shift is the need to handle billions of dollars of transactions daily at lightning speed.
For example, said Bach, the Depository Trust and Clearing Corporation, which settles stock transactions in the United States, clears about $24 quadrillion a year, and there are 20 to 25 similarly sized clearing corporations in other countries. One of the larger U.S. banks settles about $35 trillion on a slow day, and the New York Stock Exchange handles nearly $2 billion in trades in the first two minutes after opening.
“The bottom line is we handle and transact a tremendous amount of dollars very rapidly,” Bach told the December meeting of the National Space-Based Positioning, Navigation, and Timing (PNT) Advisory Board.
Those transactions are based on real-time news, and as the speed of news distribution has increased so has the speed of trading, said Bach, who came to Juniper from the New York Stock Exchange (NYSE Euronext), where he served as senior vice president and global head of network services.
“There’re some really good financial research papers out there,” said Bach, “that indicate that stock prices will readjust and react to news, and completely settle out to their new value in response to that news, in under five minutes from the time the news breaks.”
Many of the transactions are now rooted in heuristic trading based on “hyper-contextual” information, which assesses a story’s believability, judges its potential accuracy, and determines how to weight the information.
“You combine that all together and then you make a trading decision based on it,” he said. “This sort of process is being developed and is evolving within industry as we speak, and the goal is to keep that well under a few seconds from the time the news breaks. This goes back to the comment I made earlier, if you wait five minutes you miss the market and you’re the last one showing up at the party.”
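Bach did not describe an implementation, but the kind of weighting he outlines can be sketched in a few lines. The example below is purely hypothetical: the `NewsItem` fields, the scoring scale, and the `weighted_signal` function are illustrative assumptions, not anything attributed to the industry.

```python
from dataclasses import dataclass

@dataclass
class NewsItem:
    source: str
    believability: float  # hypothetical score in [0, 1] from the source's track record
    accuracy: float       # hypothetical score in [0, 1] from cross-checking other feeds
    sentiment: float      # hypothetical signal in [-1, 1]: negative = sell, positive = buy

def weighted_signal(items: list[NewsItem]) -> float:
    """Combine hypothetical per-item scores into one trading signal.

    Each item's sentiment is weighted by how much it is trusted
    (believability * accuracy); the result is a number in [-1, 1].
    """
    total_weight = sum(i.believability * i.accuracy for i in items)
    if total_weight == 0:
        return 0.0
    return sum(i.sentiment * i.believability * i.accuracy for i in items) / total_weight

# Example: two feeds reporting the same story with different credibility.
signal = weighted_signal([
    NewsItem("wire_service", believability=0.9, accuracy=0.8, sentiment=0.6),
    NewsItem("social_feed", believability=0.4, accuracy=0.5, sentiment=0.9),
])
print(f"combined signal: {signal:+.2f}")
```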
The plant doing this analytical process handles, on average, around one billion messages a second, Bach explained. “So when you are running in your shop something that’s digesting one billion messages a second, that drives some very strange technology requirements — and that’s where we get to the need for precision timing.”
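A back-of-the-envelope calculation shows why the rate Bach cites, one billion messages per second, pushes timestamps toward sub-microsecond resolution. Only the message rate comes from his remarks; the rest is simple arithmetic.

```python
# How fine must a timestamp be to keep messages distinguishable
# at one billion messages per second (Bach's figure)?
rate = 1_000_000_000  # messages per second

for name, resolution_s in [("second", 1.0),
                           ("millisecond", 1e-3),
                           ("microsecond", 1e-6),
                           ("nanosecond", 1e-9)]:
    # Average number of messages sharing one timestamp "tick" at this resolution.
    per_tick = rate * resolution_s
    print(f"{name:>11}: ~{per_tick:,.0f} messages per timestamp tick")

# Roughly a million messages share each millisecond tick, but only about
# one per nanosecond, which is why sub-microsecond stamps start to matter.
```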
Enter GPS
Currently, he said, most companies are using a Building Integrated Timing Supply, or BITS. These systems link to GPS to get the day, date, and time but have their own atomic clocks and can generate timing information independently. Up to now there hasn’t been a need to coordinate timing information between financial organizations, and GPS has not been essential for smooth operations.
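As a rough illustration of that architecture, a GPS-disciplined clock steers a local oscillator while GPS is visible and free-runs on that oscillator, in “holdover,” when it is not. The sketch below is a toy model: the `DisciplinedClock` class and the 5 ppb drift figure are illustrative assumptions, not real BITS behavior.

```python
class DisciplinedClock:
    """Toy model of the holdover idea behind a BITS-style clock: steer the
    local time toward GPS when GPS is available, and free-run on the local
    oscillator's (imperfect) rate when it is not."""

    def __init__(self, drift_ppb: float = 5.0):
        self.time = 0.0                 # local estimate of true time, seconds
        self.drift = drift_ppb * 1e-9   # fractional frequency error of the oscillator
        self.in_holdover = False

    def tick(self, elapsed: float, gps_time: float | None) -> float:
        # Advance the local clock, including oscillator drift.
        self.time += elapsed * (1.0 + self.drift)
        if gps_time is not None:
            # GPS visible: discipline the clock to the GPS-derived time.
            self.time = gps_time
            self.in_holdover = False
        else:
            # GPS lost: keep running on the local oscillator alone.
            self.in_holdover = True
        return self.time

clock = DisciplinedClock()
true_time = 0.0
for step in range(10):
    true_time += 1.0
    gps = true_time if step < 5 else None   # lose GPS halfway through
    local = clock.tick(1.0, gps)
    print(f"t={true_time:4.1f}s holdover={clock.in_holdover} error={local - true_time:+.2e}s")
```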
Precise timing, however, is becoming vital for several reasons — first of all to run the increasingly complicated facilities.
“When you’re processing a billion messages per second,” said Bach, “those timestamps get awfully small. So, there is a need for a very precise, very accurate set of clocks.”
The trading systems themselves also require vast amounts of data, and that data increasingly needs a real-time reference.
“This is where we start shifting, over the next several years, to the criticality of GPS,” explained Bach. In the long term, he said, it will matter when a snippet of information is created.
“It makes a difference to me,” he said, “whether that person tweeted it now or 100 microseconds ago . . . because when the next [message] comes in to replace it, I can start to build a trend out of the tweets, out of the newsfeed, all separated by microseconds. And that’s how I start to build whether I believe that particular social feed. And that’s where I’m taking the business overall. So, that’s going to drive a need for much more precise timestamps, but more importantly, timestamps that are now correlated with other outside third-party entities.”
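Correlating feeds the way Bach describes presumes every message carries a timestamp from a common reference. A minimal sketch of merging per-source feeds into one microsecond-ordered timeline might look like the following; the `FeedMessage` fields and the sample data are hypothetical.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FeedMessage:
    # Microseconds since a shared (e.g., GPS-derived) epoch; ordering across
    # sources only means something if all feeds use the same time reference.
    timestamp_us: int
    source: str = field(compare=False)
    text: str = field(compare=False)

def merge_feeds(*feeds: list[FeedMessage]) -> list[FeedMessage]:
    """Merge already-sorted per-source feeds into one timeline.
    heapq.merge keeps the combined stream in timestamp order."""
    return list(heapq.merge(*feeds))

twitter = [FeedMessage(1_000_050, "twitter", "earnings beat"),
           FeedMessage(1_000_210, "twitter", "guidance raised")]
newswire = [FeedMessage(1_000_120, "newswire", "Q4 earnings above consensus")]

for msg in merge_feeds(twitter, newswire):
    print(msg.timestamp_us, msg.source, msg.text)
```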
At least one news outlet is jumping on the precision timing bandwagon, said Bach. The outlet plans to announce, within the next three months, a high-quality newsfeed that incorporates precision timestamps.
Regulating Time
Regulatory factors are also driving the desire to timestamp transactions and tidbits of news. Companies need to be able to prove that they are providing fair and equal access to all market participants.
“When it took you five or six seconds to execute an order,” said Bach, “it was very easy to demonstrate that and follow it through the system. When your order is one of say 100 million flowing through the system in one second, I have to be able to prove to you, and to regulators, that I didn’t do anything unfortunate to your order; I didn’t arbitrarily delay it — or worse yet deliberately delay your order for financial advantage. . . . The only way I can prove that is with remarkably precise timestamping.”
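One way to picture the audit Bach describes: with precise receipt and execution timestamps on every order, an out-of-sequence execution becomes a detectable inversion. The sketch below is a hypothetical check, not any regulator’s or exchange’s actual methodology.

```python
from dataclasses import dataclass

@dataclass
class OrderRecord:
    order_id: str
    received_ns: int   # timestamp at order receipt, nanoseconds since epoch
    executed_ns: int   # timestamp at execution, nanoseconds since epoch

def audit_ordering(records: list[OrderRecord]) -> list[str]:
    """Hypothetical fairness check: orders should execute in the sequence
    they were received. Returns IDs of orders executed ahead of an
    earlier-received order (equivalently, the earlier order was delayed)."""
    by_receipt = sorted(records, key=lambda r: r.received_ns)
    flagged = []
    max_exec_so_far = 0
    for rec in by_receipt:
        if rec.executed_ns < max_exec_so_far:
            flagged.append(rec.order_id)  # jumped the queue, or a prior order was delayed
        max_exec_so_far = max(max_exec_so_far, rec.executed_ns)
    return flagged

records = [
    OrderRecord("A", received_ns=1_000, executed_ns=5_000),
    OrderRecord("B", received_ns=1_200, executed_ns=3_000),  # executed before A
]
print(audit_ordering(records))  # ['B'] -- the inversion is visible only via timestamps
```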
Regulators are starting to focus on requiring extremely precise timestamps and on timestamping every single transaction, Bach told the advisory board. Europe has adopted MiFID II (an update of the Markets in Financial Instruments Directive), which will require the timestamping of every transaction in the financial industry across all of the European Union.
MiFID II was published in the Official Journal of the European Union on December 6 and will be implemented by 2017. The directive requires that the clocks of trading venues and their customers be synchronized, standardizing the recorded time for post-trade data, transaction reporting and order event auditing, according to a summary posted by Fidessa, a British firm that provides investment infrastructure, technology tools and support services for traders.
MiFID II will be the beginning of a mandate to form a common time epoch, Bach said.
“All the brokerage firms, the banks, the trading institutions will, in fact, have to be synced,” he explained, “and now at that point I really do care that my clock matches my competitor’s clock and the exchange where I am doing business and their clock — because it’s the only way I’m going to be able to convince the regulators that I haven’t traded ahead of the market or done something else that’s of questionable ethics.”
Though MiFID II applies only to Europe, Bach expects U.S. regulators to adopt something similar.
“I haven’t quite seen a formal process from them yet,” he said, “but clearly, if Europe’s going to do it I’ve got to believe that U.S. regulators are either thinking about it or not far behind.”
According to Fidessa’s Christian Voigt, U.S. regulators are already moving to tighten timing requirements. The Financial Industry Regulatory Authority, Inc. (FINRA), the largest independent securities regulator in the United States, recently recommended new timestamping rules, requiring U.S. firms to express time in milliseconds. (A proposal in Europe wants records timestamped to the nanosecond.)
FINRA is also consulting on business clock synchronization, Voigt wrote in a blog post on the website Regulation Matters. The current requirements allow computer clocks a tolerance of one second from the National Institute of Standards and Technology (NIST) atomic clock; FINRA has proposed reducing that tolerance to 50 milliseconds. A comment period on the proposed rule has been extended to February 20, 2015. For comparison, the U.S. Naval Observatory operates an ensemble of stratum 1 servers running the Network Time Protocol, an Internet standard, synchronized to the USNO Master Clocks or to GPS as their stratum 0 reference clocks, with typical continuous accuracy in the range of 1 to 30 milliseconds.
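For a sense of what checking such a tolerance involves, the sketch below queries a public NIST time server over NTP and compares the measured offset with the proposed 50-millisecond limit. It assumes the third-party ntplib Python package is installed; the server choice and the pass/fail handling are illustrative, not a compliance procedure.

```python
# Minimal sketch: measure the local clock's offset from a NIST NTP server
# and compare it with the FINRA-proposed 50 ms tolerance.
# Requires the third-party ntplib package (pip install ntplib).
import ntplib

TOLERANCE_S = 0.050  # proposed 50-millisecond tolerance

client = ntplib.NTPClient()
response = client.request("time.nist.gov", version=3)

# response.offset is the estimated difference between the local clock
# and the server's clock, in seconds.
offset = response.offset
print(f"clock offset vs NIST: {offset * 1000:+.1f} ms")
print("within tolerance" if abs(offset) <= TOLERANCE_S else "OUT OF TOLERANCE")
```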
Meeting the requirements, however, will not be easy. The industry needs to settle on a uniform time epoch and will have to develop better timing technology. Systems capable of nanosecond precision, as proposed in Europe, could be prohibitively expensive, said Fidessa’s Anne Plested, in a Regulation Matters blog post.
Developing a microsecond timing solution will be a challenge, acknowledged Bach.
“We have to figure out how to do that. That’s not an easy thing to do from a technology perspective.”