Legacy systems: soon to be history? SimCorp’s Mayr explains
John Mayr, Marketing and Partner Development at Danish software and financial expertise provider SimCorp, analyses the impact of technology on the financial industry.
The fifth anniversary of the collapse of Lehman Brothers in September has led to much dissection of how the investment landscape has subsequently changed. The industry has also turned its gaze inward to identify what lessons have been learned and what the impact would be if another major market player collapsed.
One of the shortcomings apparent following the Lehman collapse was some investment firms’ inability to identify their exposure to the failed bank. An overlapping tangle of ageing technology meant it took some managers weeks rather than hours to calculate their losses – wholly unacceptable to them and to their clients.
It is particularly surprising, then, that despite all the improvements, adaptations and wholesale change induced by the financial crisis, many firms would still find themselves in the same position should history repeat itself. A study published by independent research institute SimCorp StrategyLab in September 2013 found that at least one in four asset managers worldwide is running their core systems on outdated technology, so-called “legacy systems”. As the top 2,000 firms collectively manage upwards of $80trn in assets, potentially trillions of dollars are at the mercy of outdated technology.
The term legacy systems, as defined in the report, refers to any one of a number of scenarios where old technology, in some cases installed decades ago, is used to oversee risk and manage client assets. Given the development of technology in the intervening years, this is akin to an individual today still using a Walkman to listen to music.
This situation has arisen in many cases because technology solutions at a given firm have been updated piecemeal, business function by business function, as the needs of the business have changed over time. Some components of such a firm’s resulting fragmented system landscape may have been constructed in-house, others supplied by different vendors. Many will rely on manual processes, overnight batch jobs for updates, and spreadsheets or personal databases.
These systems were usually adequate at performing what they were designed for – the problem is that the environment they were designed for no longer exists. Instead they have become a liability, unable to keep up with the pace of change.
The most urgent problem this raises is insufficiently robust risk management. Proper understanding and mitigation of risk, as demanded by regulators and investors, compounded by the speed with which market conditions and requirements change, is extremely difficult to achieve using manual processes and outdated technology. The use of spreadsheets in particular is prone to error and carries concomitant risks. This should not be underestimated; spreadsheet errors were said to be behind JP Morgan’s loss of $6.2bn in last year’s “London Whale” trading incident.
As the investment environment and market infrastructures have continued to change, risk monitoring has become significantly more complicated than the original designers of legacy systems ever envisaged. To cope, such systems have been modified reactively, in patchwork fashion. This is unsustainable, costly and itself risky. The aforementioned study references a global survey of buy-side recipients conducted earlier this year, which showed that over half of legacy system respondents (56%) have had to increase IT operations budgets overall. By contrast, 60% of respondents with state-of-the-art technology systems plan to maintain or decrease IT operations spend.
This issue will only worsen for investment managers as the investment environment becomes yet more complex. Over the past decade investable asset classes, for example, have hugely expanded, with emerging markets securities, private equity and derivatives, to name but a few, having become largely mainstream. Many systems pre-date this. In addition, the pressure for faster reporting has increased, not just from clients, but also regulators. Emir, Dodd-Frank, Ucits and Mifid all carry significant reporting requirements which investment managers will find increasingly difficult to meet without modern technology in place.
At the root of the problem is that legacy systems, unlike state-of-the-art systems, have no central source of data. While it may sound prosaic, today’s investment firms are inundated with huge volumes of data every day, which feed into different processes and organisational silos. Legacy systems are not equipped to analyse or manage this data in a timely or useful way.
Investment managers need up-to-the-minute, accurate and complete data to make informed investment decisions. Without this in place, they will increasingly find it impossible to maintain a competitive edge in the marketplace. The investment manager who can get an overview of their full exposure in a single mouse click will be significantly better placed than one who waits hours for their team to sift through spreadsheets and overnight batch results to provide the same data. In the changed risk environment, having the right IT infrastructure in place will become not just a business advantage, but a business necessity.