Asset Control’s Phil Lynch issues data management warning
Phil Lynch, CEO of Asset Control, a data management solutions company, says market volatility is forcing fund managers to take note of data management.
Over the past year, the global financial market has suffered several shocks. The US came dangerously close to technical default, its credit rating was downgraded and Europe struggled with country bailouts, eroding confidence in central banks and national treasuries.
For those on the front line, increasing levels of volatility mean more rapid changes in price points leading to more data to access, analyze and act upon. Day traders, high-frequency market-makers and those whose strategy is simply to stay out of trouble need the ability to respond rapidly to fast-changing data.
A steady increase in interrelationships between global events, the rise of multi-asset trading and technology enabling capital to flow more quickly across asset classes, borders and venues have all contributed to expanding data volumes. Other factors have added to the pressure as well: liquidity has become fragmented across venues; the Internet has made information more readily available; and trade sizes have shrunk. Trading has become a 24/7 business, leaving firms with shortened windows for processing, pricing and reconciliation.
Anyone who thinks this is a sell-side-only problem is missing the point. Buy-side firms have been getting closer to the trade for years. Regulators and investors are pushing for greater accountability and transparency across the board, meaning that buy-side firms need to take greater responsibility for investment decisions.
With the huge amount of data now available, fund managers need to make sense of, and effectively use, the information at their disposal. Understanding the impact of volatility on the shape of portfolios, and making practical decisions about the information brokers are delivering, has become a vital concern.
Compounding the problem is the fact that technology investment has typically focused on automating the front office rather than updating middle- and back-office infrastructure, meaning the systems that source and process data and manage risk and operational procedures have struggled to keep up. These systems were designed around clearly demarcated markets and the siloed business structures of the past.
The truth is that automation of the middle and back office is the only way to deal with the increased amount of information and processing within the tighter timeframes produced by volatile markets. Economic instability means there will be no returning to that homogeneous world where a narrow focus on asset class and geography could deliver alpha and any data issues that arose could be handled manually.
The investment focus needs to center on obtaining accurate, available and actionable data, and managing it as a discrete and fundamental activity across the firm, separate from – but interoperable with – the applications which rely on it to perform their functions. The alternative of having data sourced individually by each system only leads to chaos, duplication, inefficiency, obscurity and cost.
Most firms have multiple business units, product lines and investment strategies, all of which require different data sets that will be used in different ways. Compliance and risk management will need different data sets than fund managers, while operations want data on actual holdings, and analysts use information for modeling, stress-testing and ‘what if’ scenarios.
Clearly there’s no single solution for these distinct requirements.
It is time, therefore, for buy-side firms to examine how data works for them, not just from a risk and compliance perspective but also in terms of gaining a competitive edge. If they don’t, investors and regulators will certainly be queuing up to find out why.