Tech At Bloomberg

How Bloomberg handles a massive wave of real-time market data in microseconds

March 04, 2022

It’s 9:29 in the morning, and the financial world is waiting. The New York Stock Exchange is about to ring its storied opening bell. At this moment, a flood of market data will hit Bloomberg’s Ticker Plant, the software and hardware that ingests and processes massive streams of financial data and publishes them in real time to Bloomberg’s hundreds of thousands of Terminal subscribers and Enterprise Data customers, enabling them to make important decisions about their investments. This dramatic spike in the volume of incoming data happens in microseconds, presenting an unparalleled challenge for the software engineers who design and maintain the system. But thanks to world-class engineering innovations by the company’s technology teams, the Ticker Plant never skips a beat.

Video (1:11): Bloomberg’s Ticker Plant is built to process millions of pieces of financial data a second. See how Bloomberg’s real-time systems process the flood of market data that is unleashed at the start of every trading day.

Featured

Shawn Edwards – Chief Technology Officer
Paul Callaway – Head of Real-Time Systems
Christine Flounders – Engineering Manager, Real-Time Market Data
Craig Gordineer – Engineering Manager, Ticker Plant
Tony Tang – Head of Engineering, Market Data

“There are very few companies in the world that have to deal with so much data in such a short period of time, while maintaining strict ordering and extreme accuracy,” says Bloomberg CTO Shawn Edwards. “It’s an incredible engineering challenge to get this right, every single day, when people are depending on microsecond responses to the market.”

“It’s like a flood,” explains Paul Callaway, Head of Real-Time Systems Architecture in the CTO Office. “It’s as if someone’s opening a dam and releasing all of this stored financial information that has built up since the market closed the day before.”

Bloomberg’s challenge is to turn this flood of data into meaningful, consumable real-time streams of information for its clients around the globe, so they can make better financial decisions, faster.

Charting the course

Bloomberg’s CTO Office is the research arm of the company, designing and developing prototypes for the next generation of Bloomberg’s infrastructure, hardware, and applications. This group collaborates with other departments across the company, especially Bloomberg’s Engineering team, the organization’s main technology department, which includes more than 6,500 software engineers working across more than a dozen countries around the world. This tight collaboration between research and implementation is instrumental in the ongoing development of Bloomberg’s infrastructure and products. The CTO Office also leads the company’s engagement with external academic and open source communities.

“We focus on strategic areas for the company,” says Edwards. “And when necessary, we switch hats and become product managers for a lot of core infrastructure and new product ideas.”

Back in the 1980s, the Ticker Plant was originally written in Fortran and was initially used to store bond prices. It utilized a time-series database optimized for non-homogeneous data, which is essential when dealing with many different types of financial data sets. In the early 2000s, its code was rewritten in C++, a language well-suited for high-performance, high-throughput applications like those in finance.

In 2010, the company’s decades-old datastore was replaced with a home-grown, data-agnostic, high-performance, persistent datastore built to support massive time-series datasets. This flexible, ultra-fast storage and retrieval solution enables the Ticker Plant to add business value to intraday market data much faster. The storage engine hosts several petabytes of data across hundreds of millions of financial instruments, and the query engine handles around 80 billion queries every day at very high throughput.
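To make the shape of that problem concrete, here is a minimal C++ sketch of an append-and-range-query interface over per-instrument time series. The types and names (Tick, TickStore) are hypothetical illustrations, not Bloomberg’s actual datastore API:

    // Hypothetical sketch of a per-instrument time-series store; illustrative only.
    #include <chrono>
    #include <cstdint>
    #include <map>
    #include <string>
    #include <vector>

    using Timestamp = std::chrono::system_clock::time_point;

    struct Tick {
        Timestamp time;   // exchange or receive timestamp
        double    price;
        uint64_t  size;
    };

    class TickStore {
    public:
        // Ticks for a given instrument arrive and are appended in time order.
        void append(const std::string& instrument, const Tick& tick) {
            series_[instrument].push_back(tick);
        }

        // Range query: all ticks for an instrument in [from, to).
        std::vector<Tick> query(const std::string& instrument,
                                Timestamp from, Timestamp to) const {
            std::vector<Tick> out;
            auto it = series_.find(instrument);
            if (it == series_.end()) return out;
            for (const auto& t : it->second) {
                if (t.time >= from && t.time < to) out.push_back(t);
            }
            return out;
        }

    private:
        std::map<std::string, std::vector<Tick>> series_;
    };

A production intraday store would use far more compact, concurrent data structures and persist to disk; the sketch only shows the append-plus-time-range-query pattern that such a datastore has to serve at enormous scale.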

Turning these disparate feeds of raw market data into meaningful insights is Bloomberg’s central mission. Building solutions to accomplish that mission requires bold leadership from the Ticker Plant Engineering team and the CTO Office, as well as complex coordination across Site Reliability Engineering (SRE) teams, Software Infrastructure teams, Product teams, and more. The result is a suite of solutions that allows data to be gathered, evaluated, enriched, and made available to clients virtually instantaneously, the moment it hits Bloomberg’s feed handlers. Bloomberg had to build and evolve these solutions in-house to perform this function at such a unique scale.

“There are no good open source or commercial Market Data software solutions that our engineers can pull off the shelf,” says Edwards. “Bloomberg has had to handcraft these solutions and customize them over the 40 years that we’ve been dealing with market data.”

Millions per second

At any time of day — long before and after the opening bell has rung — financial professionals rely on Bloomberg’s Ticker Plant. At this very moment, many thousands of financial analysts, investment bankers, asset managers, hedge funds, business magnates, and amateur day traders around the world have at least one eye glued on a phone or monitor displaying a real-time feed of financial data provided by the Ticker Plant. It’s the beating heart of Bloomberg’s data estate, processing over 200 billion pieces of financial information and publishing them in real time, each and every day. This massive and ever-growing pipeline is a leading distributor of real-time market data to the financial world.

Resiliency, efficiency, availability, and visibility are mission-critical values that Bloomberg’s engineers strive to improve all the time.

“Market open is not the only time that bursts in volume happen,” says Jeffrey Olkin, Global Head of Ticker Plant SRE. “When the Fed announces an interest rate change at two o’clock in the afternoon, you see a similar surge in traffic. Politics and current events can also generate drastic moves in the market at various times during the day — and those peaks can sometimes exceed market open. It all comes down to us managing not for average loads, but for peak loads.”

“Every piece of market data is about the size of a tweet, but there are millions of them per second at the peaks of the day,” says Callaway. “We have to take these and flow them through various processing steps. We need to turn them into a standard form internally so we can distribute them. We need to store them. We may need to make some calculations on them. Once we have that, we need to route them out around the world to customers. And this entire process needs to happen in microseconds.”
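As a rough illustration of those steps, the C++ sketch below strings together hypothetical normalize, store, enrich, and route stages. None of these types or function names come from Bloomberg’s codebase; they simply mirror the flow Callaway describes:

    // Illustrative pipeline sketch; hypothetical types and names, not Bloomberg code.
    #include <iostream>
    #include <string>

    struct RawUpdate  { std::string feed; std::string payload; };          // roughly tweet-sized
    struct Normalized { std::string instrument; double price; long size; };

    // Stand-in for real decoding: here we only pull the instrument out of a
    // CSV-like payload; a real feed handler decodes each exchange's wire format.
    Normalized normalize(const RawUpdate& raw) {
        return Normalized{raw.payload.substr(0, raw.payload.find(',')), 0.0, 0};
    }

    void store(const Normalized&) { /* persist to the intraday tick store */ }

    Normalized enrich(Normalized n) { /* compute derived values, e.g. a running VWAP */ return n; }

    void route(const Normalized& n) { std::cout << "publish " << n.instrument << '\n'; }

    // Every incoming update flows through the same stages, end to end in microseconds.
    void onUpdate(const RawUpdate& raw) {
        Normalized n = normalize(raw);
        store(n);
        n = enrich(n);
        route(n);
    }

    int main() { onUpdate({"XNYS", "IBM US,142.25,300"}); }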

Parsing data at the peaks

Before data enters the Ticker Plant, feed-handling software decodes what comes in from the exchanges, normalizes it, and sends it downstream. Information about securities, asset classes, market indices, commodities, derivatives, futures, and any number of other financial instruments or data points is aggregated, normalized, and enriched with various forms of analytics. That data is then presented to customers, in a more consumable form, as a cross-section of those feeds, in real time.

“We have to figure out exactly how much information we should expect from the exchanges,” explains Christine Flounders, Engineering Manager for the EMEA Real-time Market Data & News teams. “We have to build in enough network and hardware capacity to be able to handle that amount of data, providing enough headroom to ensure that we’re not going to have any issues related to the processing of that data when it first comes into the Bloomberg network.”

As noted previously, data is transformed in a variety of ways as it moves through this pipeline. Each stage of processing represents a potential chokepoint, where the flow of data could be slowed or its accuracy compromised. The Real-time Market Data team has a number of monitoring and notification tools in place that sound the alarm when the volume of data approaches designated thresholds.
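A minimal sketch of what threshold-based volume monitoring can look like, with hypothetical names and made-up thresholds rather than the team’s actual tooling:

    // Hypothetical volume monitor: counts updates per one-second window and
    // compares each completed window against warning/alert thresholds.
    #include <chrono>
    #include <cstdint>
    #include <iostream>

    class VolumeMonitor {
    public:
        VolumeMonitor(uint64_t warnPerSec, uint64_t alertPerSec)
            : warn_(warnPerSec), alert_(alertPerSec) {}

        // Called once per incoming update. When a one-second window has elapsed,
        // the previous window's count is checked before a new window begins.
        void onUpdate() {
            using namespace std::chrono;
            const auto now = steady_clock::now();
            if (now - windowStart_ >= seconds(1)) {
                check(count_);
                count_ = 0;
                windowStart_ = now;
            }
            ++count_;
        }

    private:
        void check(uint64_t perSec) const {
            if (perSec >= alert_)     std::cout << "ALERT: " << perSec << " msgs/s\n";
            else if (perSec >= warn_) std::cout << "WARN: "  << perSec << " msgs/s\n";
        }

        uint64_t warn_;
        uint64_t alert_;
        uint64_t count_ = 0;
        std::chrono::steady_clock::time_point windowStart_ = std::chrono::steady_clock::now();
    };

A monitor constructed as, say, VolumeMonitor(500000, 2000000) would warn at half a million messages per second and alert at two million; the real thresholds, windows, and alerting paths are internal details not described in the article.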

Dams and deltas

The “lots of data coming at once” problem is more complex than it sounds, requiring a multi-faceted tech stack and many distinct domains of expertise. 

“We have two strategies to deal with this,” says Tony Tang, Head of Market Data Engineering. “The first is division of labor. We don’t make a set of engineers focus on all the problems at once.” Instead, Tang’s team of nearly 400 is divided into narrow domains, with each team dedicated to maintaining and innovating a slice of the tech stack. “We break up the problem into many small pieces and link them together.”

The second strategy is a division of systems — to “go wide,” as Tang puts it. His team makes an educated bet that if surges in market data occur, they will most likely be distributed evenly across the market. “When we have a jump in data volume, it’s usually not just a single security,” explains Tang. Instead, spikes typically result when hundreds of thousands of securities experience an increase in trading volume — and its related data — at once. As a result, different parts of Bloomberg’s system handle different subsets of this market data. “By chopping up the incoming data this way, the flow is more evenly spread out across the system.”
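As a simple illustration of going wide, incoming updates can be assigned to partitions deterministically by instrument, so a broad, market-wide surge spreads roughly evenly across many partitions instead of landing on one machine. This is a hypothetical sketch, not Bloomberg’s actual routing logic:

    // Hypothetical sharding of market data by instrument.
    #include <cstddef>
    #include <functional>
    #include <iostream>
    #include <string>

    // Deterministically assign each instrument to one of N partitions.
    std::size_t partitionFor(const std::string& instrument, std::size_t numPartitions) {
        return std::hash<std::string>{}(instrument) % numPartitions;
    }

    int main() {
        const std::size_t kPartitions = 16;  // assumed partition count, for illustration
        for (const char* sym : {"IBM US", "VOD LN", "7203 JT"}) {
            std::cout << sym << " -> partition "
                      << partitionFor(sym, kPartitions) << '\n';
        }
    }

Because the mapping depends only on the instrument, a surge that touches hundreds of thousands of securities at once is naturally spread across all partitions, which is the property the “go wide” strategy relies on.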

You can deal with a flood in several ways. You can build one mega-dam, like the Hoover Dam, but that’s expensive and risky, since it becomes a single point of failure. Or you can terraform the geography of the riverbed so that the flooding can be handled by several smaller dams. “Think of the Nile delta, where you have the water spreading out over a much wider area,” says Tang.

Big spikes in market data volume are usually distributed across the market, rather than coming from one particular source, but not always. Sometimes, there’s a flurry of interest around a particular security, and Bloomberg has to have a strategy to handle localized spikes like this as well. This is something Tang’s team is working on.

Ticker Plant’s systems also need to be able to allocate resources on the fly in the event of anomalous market behavior. Take the frenzy of meme stock trading around GameStop in 2021, an example of a spike that wasn’t widely distributed across the market. Right now, Tang’s team is focused on building a set of solutions that would allow Bloomberg to dedicate a portion of the system (i.e., a channel) specifically to handle such flare-ups of concentrated market activity.
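Purely as an illustration of that idea (the actual design Tang’s team is building is not detailed in the article), a router could divert instruments flagged as unusually active to a dedicated channel, while everything else follows the normal wide partitioning:

    // Hypothetical router: hot instruments get a reserved channel, the rest are sharded as usual.
    #include <cstddef>
    #include <functional>
    #include <string>
    #include <unordered_set>

    struct Router {
        std::size_t numPartitions = 16;                 // the normal "wide" partitions
        std::size_t hotChannel    = 16;                 // one extra channel reserved for flare-ups
        std::unordered_set<std::string> hotInstruments; // flagged when activity is concentrated

        std::size_t channelFor(const std::string& instrument) const {
            if (hotInstruments.count(instrument) > 0) return hotChannel;
            return std::hash<std::string>{}(instrument) % numPartitions;
        }
    };

How instruments get flagged as hot, and how capacity is reserved for that channel, is where the real engineering lies; the sketch shows only the routing decision.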

Another ongoing challenge is serving data not only to humans, but also to machines, which have a greater capacity for ingesting data very quickly at scale. When Bloomberg’s core systems were initially devised, they were designed to feed data to human end-users. But increasingly, Bloomberg feeds data to advanced software which helps automate many of the functions of human financial professionals.

“We’ve had to upgrade our capabilities in latency and stability across the gamut to support this new use case,” says Tang. Bloomberg’s feed handlers and Ticker Plant are constantly evolving to meet the challenges of the ever-changing global capital markets.

“This is a really exciting problem, because you are never done,” says Edwards. “What we have now is a fantastically performant system of which we are really proud. But, the market will be different next year, and we know we have to constantly anticipate and evolve with — and even get ahead of — it.”