Financial industry players must step up collaboration on a more efficient system of data exchange and management if they are to connect and exploit massive amounts of information and produce new insights across firms and markets, a new report says.
The Depository Trust & Clearing Corporation (DTCC), a New York-based post-trade services provider, calls for open-source data standards that would give more users easier access to broader data and allow them to put it to use much faster.
In a white paper, Data Strategy & Management in Financial Markets, DTCC notes that current data exchange standards typically assume point-to-point communication with asset class-specific and inflexible formats, as well as bespoke data models.
This has limited the ability of firms to explore the interlinkages of data. Meanwhile, upgrades, including expansion or harmonization of data fields, require lengthy consultation processes, industry consensus, and costly implementation.
Thus, there is a need for more flexible data sharing that enables data producers to send data to many users, or users to retrieve data at their convenience, in a standard format. This flexibility will let users build simpler workflows and lower their technology spend, which in turn will foster advanced analytics and innovation, the report says.
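As a rough illustration of what such a "publish once, consume many" arrangement could look like, the Python sketch below has one producer writing records in a single standard format to a named topic while any number of consumers retrieve them at their own pace. The class and field names are hypothetical and are not drawn from the DTCC paper.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class TradeRecord:
    # A standardized, asset-class-agnostic payload; the fields are illustrative.
    instrument_id: str
    quantity: float
    price: float
    currency: str

@dataclass
class TopicStore:
    # Records published once per topic, readable by any number of consumers.
    topics: Dict[str, List[TradeRecord]] = field(default_factory=dict)
    # Each consumer tracks its own read position per topic, so it can catch up at its convenience.
    offsets: Dict[Tuple[str, str], int] = field(default_factory=dict)

    def publish(self, topic: str, record: TradeRecord) -> None:
        self.topics.setdefault(topic, []).append(record)

    def fetch(self, topic: str, consumer: str) -> List[TradeRecord]:
        records = self.topics.get(topic, [])
        start = self.offsets.get((consumer, topic), 0)
        self.offsets[(consumer, topic)] = len(records)
        return records[start:]

store = TopicStore()
store.publish("equity-trades", TradeRecord("XYZ US", 100, 52.10, "USD"))

# Two independent consumers read the same published data; no bespoke point-to-point feed is needed.
print(store.fetch("equity-trades", "risk-team"))
print(store.fetch("equity-trades", "settlement-team"))
```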
Data silos
“As new technological advancements, including broad adoption of cloud technology, spark an evolution of global markets, the financial services industry has an opportunity to reimagine how data is exchanged and managed across financial markets and firms,” says Kapil Bansal, managing director, head of business architecture, data strategy and analytics at DTCC.
“For many years, companies have collected massive stores of data, but the siloed nature of data and the need for better data quality limited the ability of market participants to extract strategic insights to support more effective decision-making. We’re now at a unique moment where we can make this vision a reality, but long-term success hinges on market participants taking action to ensure their data strategy can meet the demands of a highly digitalized and interconnected marketplace.”
Currently, information in the form of metadata (i.e., descriptive data that captures attributes of the underlying data) is often missing or embedded in the data stores of specific applications, which significantly limits how broadly the data can be used and re-used in new ways.
The emerging best practice is to store data, including metadata, separately in dedicated locations, akin to a virtual library, so it can be accessed by many applications for many purposes, the white paper says.
In addition, “data tagging or cataloguing” can be applied to provide additional context to data items (e.g., privacy attributes). This would let users within an organization innovate without having to hunt for data the organization already possesses. These enhanced data tags can also be used to let external parties discover the properties of proprietary data sets without the need to “see” the actual data.
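The sketch below illustrates, in very simplified form, the kind of metadata catalogue the paper describes: entries recording a data set's owner, schema and tags (such as privacy attributes) so that internal or external users can discover what exists without reading the underlying records. The entry fields and tag names are illustrative assumptions, not taken from the white paper.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DatasetEntry:
    # Metadata only: describes a data set without containing any of its records.
    name: str
    owner: str
    schema: Dict[str, str]                          # column name -> type, illustrative
    tags: List[str] = field(default_factory=list)   # e.g. "contains-pii", "asset-class:equity"

class MetadataCatalogue:
    def __init__(self) -> None:
        self._entries: List[DatasetEntry] = []

    def register(self, entry: DatasetEntry) -> None:
        self._entries.append(entry)

    def search(self, tag: str) -> List[str]:
        # Discovery runs against metadata alone; no access to the actual data is required.
        return [e.name for e in self._entries if tag in e.tags]

catalogue = MetadataCatalogue()
catalogue.register(DatasetEntry(
    name="settlement_fails_2024",
    owner="operations",
    schema={"trade_id": "str", "fail_reason": "str", "value_usd": "float"},
    tags=["contains-pii", "asset-class:equity"],
))
print(catalogue.search("asset-class:equity"))   # -> ['settlement_fails_2024']
```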
Data quality
At present, many of the data sets that financial institutions rely upon are not of the quality needed to support decision-making, let alone automated decision-making, DTCC says.
Data quality is often difficult to ascertain – more so in the case of commercial data, which, to date, have not undergone as much scrutiny as risk and regulatory data.
At worst, incomplete or unvalidated data sets are sometimes used to support business decision-making. At best, an organization can expect lengthy and costly analytics development lifecycles, which often result in standard aggregated reports – not dynamic and predictive analytics tools.
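For illustration only, the sketch below shows the sort of basic quality gate (completeness and simple validity checks) an organization might run before a data set is allowed to feed decision-making; the required fields and rules are hypothetical, not drawn from the report.

```python
from typing import Dict, List

# Illustrative required fields for a trade record; real rules would be far richer.
REQUIRED_FIELDS = ["trade_id", "instrument_id", "notional", "currency"]

def quality_report(records: List[Dict]) -> Dict[str, float]:
    # Count records that fail completeness or a simple validity rule (non-negative notional).
    issues = 0
    for record in records:
        missing = any(record.get(f) in (None, "") for f in REQUIRED_FIELDS)
        invalid = isinstance(record.get("notional"), (int, float)) and record["notional"] < 0
        if missing or invalid:
            issues += 1
    total = len(records) or 1
    return {"records": len(records), "issue_rate": issues / total}

records = [
    {"trade_id": "T1", "instrument_id": "XYZ", "notional": 1_000_000, "currency": "USD"},
    {"trade_id": "T2", "instrument_id": "", "notional": -5, "currency": "USD"},
]
print(quality_report(records))  # -> {'records': 2, 'issue_rate': 0.5}
```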
For financial institutions to become faster moving and more agile, they will need the ability to perform higher-volume and more flexible ad-hoc analyses at lower cost.
Additionally, new products can be developed and resilience strengthened by sharing information across firms – for example, to build more robust risk analyses and models. In the past, this need has often been met by data brokers and vendors, who charge a large premium to turn consumers’ own information into usable analytics.
Increased flexibility
DTCC’s white paper details four scenarios that it expects to drive how data are used in financial markets in the future.
To enable these changes, the white paper suggests that institutions producing and consuming significant amounts of data embed key principles into their data operating models.
Applying these principles will help market participants gain access to data that are trapped or underutilized today and allow for new and faster insights, DTCC says.
“Building the future of data exchange and management will require close consultation and coordination among industry participants and service providers, including standardization in how data is managed and shared,” Bansal stresses.