Recreating the Ear of the Markets - Part I

Ever since Mr. Reuter replaced carrier pigeons with telegraphy, the history of markets has been a race to capture market-sensitive information. But the challenge lies not just in gathering the data, but in distilling it to what really counts.

In the days of open outcry there was something called ‘the ear of the market’. Traders became aware of potential market movements by listening to the noise from other pits. One could almost feel a market surge or decline coming on as voices and energy levels ebbed and flowed across the floor. In fact, it was impossible to miss. With the shift to electronic markets, we have lost that ‘feel’ and today glean our intelligence from rows of numbers across multiple screens and thousands of spreadsheets.

As a result, individual traders became laser-focused on a few securities – a myopic view of markets. After all, there is a limit to what the human mind can absorb. While markets moved faster, financial institutions became buried in a deluge of data. But 2D data is of little value in managing 3D markets, and new data visualization tools are defining the next generation of business intelligence.

The question remains: do we have the right data in the first place?

Once market prices hit the screen or populate a spreadsheet they are already history. In fact, markets move on sentiment fed by stories that can bubble up from almost anywhere. Today’s equivalent of the telegraph is the internet and its social networks. What if we could track the evolution of stories – the noise from other pits – before they are baked into market prices?

Can Social Networks Move the Dow?

At least one hedge fund believes there is gold in ‘gossip’. It has built its trading strategy around following Twitter, based on research that showed a close relationship between social sentiment and movements in the Dow. But the noise that moves markets goes way beyond Twitter. If markets are driven by fear and greed, what is the data that contributes to those feelings, and how can we access it before everyone else has formed a view?

Demand for the data that feeds traditional database models continues to grow at a strong clip. Banks could all do better at managing and integrating traditional data. But social data – the blogs, the wikis, the tweets and the whole chatter of the internet – is growing exponentially. Conversations and opinions are rushing online and becoming more pervasive. A whole new body of sentiment that had little or no outlet before has now appeared.

Stock prices are a function of fact and sentiment. Sentiment is a function of noise over time: a collective consensus of positive or negative perceptions that may percolate slowly or abruptly, taking the market in different directions at any moment.

Data leads to these perceptions. It may be structured or unstructured. It may linger on websites or swarm across multiple networks. It may make the front page of the New York Times or the back page of USA Today. It may be a lengthy article or a sound bite or the digital breadcrumbs we leave behind when we surf the internet. Some of that data may provide clues to compelling events, while other data may be of little or no value.

The trick is to distill the data before the sentiment is formed, and provide essential intelligence about the next movement in markets.

How Can We Measure Feelings?

Even if we could track the data that anticipates sentiment, how can we measure it? There have been efforts to construct news sentiment scores for multiple asset classes such as equities, foreign exchange, fixed income and commodities. Such scores can be constructed at various horizons to meet the different needs and objectives of high and low frequency trading strategies. Based on that quantified sentiment, it may be possible to generate a set of strategies for investing, hedging, and order execution.
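To make ‘quantified sentiment’ concrete, here is a minimal sketch of a lexicon-based scorer in Python. The word lists, the sentiment_score function and the scoring formula are illustrative assumptions, not a description of any published scoring model; production systems use far richer lexicons and language models, and aggregate scores per security over whatever horizon the trading strategy requires.

```python
import re
from collections import Counter

# Hypothetical word lists; a real lexicon would be far larger and weighted.
POSITIVE = {"beats", "growth", "upgrade", "strong", "record"}
NEGATIVE = {"miss", "downgrade", "weak", "loss", "lawsuit"}

def sentiment_score(text: str) -> float:
    """Score a news item in [-1, 1]; positive values suggest bullish tone."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    pos = sum(counts[w] for w in POSITIVE)
    neg = sum(counts[w] for w in NEGATIVE)
    return (pos - neg) / len(words)

# Two hypothetical headlines about the same stock.
print(sentiment_score("Company beats estimates as analysts upgrade on strong growth"))
print(sentiment_score("Earnings miss triggers downgrade amid weak demand"))
```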

Time series analysis enables us to correlate sentiment, earnings and stock prices over time and determine the relationship between them. We can even explore the relationships between these variables across different asset classes to see if any patterns emerge that may provide clues to future market movements.
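As a simple illustration, one way to probe whether sentiment leads prices is a rolling lead-lag correlation between a daily sentiment series and daily returns. The sketch below uses pandas with randomly generated placeholder data; the one-day lag and the 60-day window are assumptions chosen for the example, not a prescribed methodology.

```python
import numpy as np
import pandas as pd

# Placeholder data: random series standing in for a daily sentiment score and
# daily stock returns; in practice these come from a scoring pipeline and a
# price feed.
rng = np.random.default_rng(0)
dates = pd.date_range("2011-01-03", periods=250, freq="B")
sentiment = pd.Series(rng.normal(size=len(dates)), index=dates, name="sentiment")
returns = pd.Series(0.3 * sentiment.shift(1).fillna(0.0)
                    + rng.normal(scale=1.0, size=len(dates)),
                    index=dates, name="returns")

# Rolling 60-day correlation between yesterday's sentiment and today's return:
# a crude test of whether sentiment leads price moves over time.
lead_lag = sentiment.shift(1).rolling(60).corr(returns)
print(lead_lag.dropna().tail())
```

The same pattern extends to earnings surprises or to sentiment series built for other asset classes, simply by swapping in the relevant series.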

Initiatives in this space remain a work in progress. Until recently the technology to capture and analyze such huge amounts of data simply did not exist; today it does, albeit in a fragmented form. Different elements are coming together to the point where we can recreate the ear of the market and anticipate trends before they emerge.

So what is this new technology and how will it work together? For the answer, see Part II of Recreating the Ear of the Markets by Damien Islam Frenoy in his blog tomorrow.