Over the past decade, what started out as industry buzzwords – data, data warehouse, data strategy and data governance – have become part of our daily work and are a key area of focus for many asset managers who look to data to transform how they do business.
If you have ever watched your children play sports, chances are you have seen parents, or perhaps even yourself, move beyond simple parental encouragement and yell instructions to do things that even the more experienced athlete might struggle with. Often the reality of where our kids are and the expectation of where we want them to be are greatly misaligned.
To continue the analogy, it’s a phenomenon similar to the one many organisations experienced when they began their data governance programs. Many program stakeholders were enamored with the potential of what data governance could offer, only to be blindsided by how deeply ingrained data and its use were within applications and throughout the firm. Data governance was perceived as a panacea for fixing operational issues caused by poor data quality, yet the various stakeholders involved had different visions and expectations of what that “panacea” would deliver.
The desire to manage data efficiently, for information to flow and for the business to benefit from better data was understandable. However, in most cases – like the parent who wants their child to play like a “pro” – this vision of data governance was somewhat unrealistic and more complex than first hoped.
Because of the ingrained processes and system interconnectivity in many organisations, decoupling applications from one another and building a separate data warehouse were early goals of data strategy programs – programs driven by data management teams and led by a chief data officer (CDO). These programs, and the increased visibility of anything to do with data, have created an environment in which the CDO role has emerged as one of the most important leadership positions in the organisation.
These early projects were, in effect, first-generation data strategy initiatives. Today a new data trend is emerging, driven by regulatory pressure as well as a need to extend the investment operations capabilities of the past. Historically, most of these data projects achieved no more than “warehousing” a collection of data in support of management decision processes.
However, under the current regulatory frameworks and pressure – and as today’s CDOs are well aware – we need more than “warehousing”, and the industry is evolving in that direction. We need not only to aggregate, govern and steward data, but also to use it as a business enabler.
The need for more sophisticated data aggregation, transformation and validation tools – where data is turned into information – is more prevalent than ever. With this evolution, one can imagine a future where reports are not “pushed” to regulators but where the agreed-upon data or information is “pulled” by regulators instantaneously from firms as required.
A strong data foundation is key, and the technology supporting new data strategies is equally important – and will, without a doubt, be cloud based. Both foundation and strategy are natural next steps in the data lifecycle, where data becomes a business enabler and the application of emerging technologies can generate actionable, analytical insight. There is still a huge amount of work to be done in this space, and many asset managers and servicers will have ‘DataTech’ initiatives across their operations as key drivers over the next few years.
Katie Kiss is product manager at Confluence. The firm has published a white paper outlining developments around datatech. The paper can be downloaded here: https://www.confluence.com/uploads/confluence-datatech-whitepaper-digital.pdf