Beyond the realms of customer experience and regulatory requirements, the use of new data sources, techniques and approaches is now reaching almost all parts of financial institutions, from the human resources department to the creation of entirely new businesses.
This has not happened overnight: financial services companies have gradually become data factories through a journey that started with the dematerialisation of securities and cash and more real-time processing of transactions in marketplaces. A high level of security has always been a major priority, particularly for custodians/trust banks responsible for safeguarding clients’ assets.
Everyone in the industry needs to be prepared for the challenges linked to this explosion of data. The path to effective data management has a number of crucial steps, but the most important, sitting above the rest, is data governance.
Getting the most from existing data
The financial services industry already handles a vast amount of data, but because it is sometimes segregated in silos, extracting its full value is difficult without knowledge of the relationships between datasets. Introducing agility into data management will help maximise the value of existing data resources and provide asset managers and asset owners with aggregated data regardless of how many providers they interact with.
At BNP Paribas Securities Services, we are working on improving the quality of the data by establishing governance and rules around data management and facilitating the integration of data providers. As well as making data management more cost-efficient by minimising duplication, tearing down the walls between silos will open up possibilities for data interactions that allow more added value from data mining.
Traditionally, incoming data has been stored automatically for use at some point in the future, whether specified or not. At the other extreme, some flows of data deliver most of their value when processed instantly and may not even be worth storing for future use. In an environment of overwhelming data flows, the cost of storage will increasingly have to be weighed against the data’s possible future value.
In practice, a mixed approach is often adopted. For example, real-time applications can process the live flows while the data is also stored for less time-sensitive applications. Or, the data is acquired and processed through so-called ‘micro-batches’, which enable frequent refreshes but do not require fully dynamic processing.
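The micro-batch approach described above can be sketched in a few lines. This is a minimal illustration, not any specific vendor's implementation: an incoming feed is grouped into fixed-size batches so that processing is refreshed frequently without handling every event individually.

```python
from itertools import islice

def micro_batches(stream, batch_size=100):
    """Group an incoming data stream into fixed-size micro-batches.

    Each batch is processed as a unit, giving frequent refreshes
    without fully dynamic, event-by-event processing.
    """
    iterator = iter(stream)
    while True:
        batch = list(islice(iterator, batch_size))
        if not batch:
            break
        yield batch

# Example: a simulated feed of 250 transactions, processed in batches of 100
feed = range(250)
sizes = [len(batch) for batch in micro_batches(feed, batch_size=100)]
print(sizes)  # [100, 100, 50]
```

In a production setting the batch boundary would typically be a time window (e.g. every few seconds) rather than a fixed record count, but the trade-off is the same: latency is bounded by the batch interval, while throughput benefits from processing records in groups.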
Next generation data storage: accommodating new unstructured data
In any case, successful companies will extend beyond existing data and traditional data formats. New agile systems must be ready to deal with a much broader set of inputs.
The universal SQL databases of the past were designed only for structured data arriving in a predictable format and destined for a predefined use. The next generation of data management systems will have to accommodate data in diverse formats and feeds, such as videos or voice recordings. This will challenge the design of data storage and access systems, as consumer expectations and mindsets will change too.
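The contrast with a fixed-schema SQL table can be made concrete with a toy schema-less store. The record shapes below (a trade, a voice recording, a video) are hypothetical examples; the point is that heterogeneous records coexist without a predefined schema, and consumers query by attribute rather than by column.

```python
import json

# A minimal sketch of a schema-less document store: unlike a fixed SQL
# table, each record may carry a different shape and format.
store = []

def ingest(record):
    # Records are kept as-is; no predefined schema is imposed at write time.
    store.append(record)

# Structured and unstructured inputs side by side (hypothetical examples):
ingest({"type": "trade", "isin": "FR0000131104", "quantity": 500})
ingest({"type": "voice", "uri": "s3://bucket/call-0042.wav", "duration_s": 183})
ingest({"type": "video", "uri": "s3://bucket/kyc-interview.mp4", "codec": "h264"})

# Consumers filter on attributes, not on a fixed column layout.
voice_records = [r for r in store if r["type"] == "voice"]
print(json.dumps(voice_records))
```

The flexibility comes at a price: the burden of interpreting each record's shape shifts from the database schema to the reading application, which is one reason governance and data-quality rules matter even more in this model.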
All of this opens up a huge range of new ways to use data. Banks may be able to capitalise on the data they possess by sharing it in one form or another. This may mean giving customers access to (and even control over) their own raw data, or delivering new added-value services such as analytical tools.
All of this will be enabled by the development of APIs (application programming interfaces) which allow different servers to communicate with each other.
Talking to each other through APIs
Flexibility and multiplicity will become standard, with ultra-personalised services becoming the norm, delivered in a self-service mode: when and where the customer needs them.
API platforms allow data to be shared without cumbersome file transfer processes and will be used both internally and to enable speedier and more responsive interactions with clients. For their part, clients will access and harness the data and create their own reporting.
These API platforms will become universal data marketplaces ready to fulfil most data requirements, giving users the convenience of broadly standard access mechanisms.
One solution is already on the horizon: when a new data source is identified, it is put at the disposal of consumers in a “data-as-a-service” mode that allows the data to be integrated seamlessly and efficiently into applications.
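The data-as-a-service pattern above can be sketched as a registry that publishes each new source under a standard name and access mechanism. All class, dataset and fund names here are hypothetical illustrations, not an actual platform interface.

```python
# Sketch of a "data-as-a-service" marketplace: new data sources are
# registered once, then consumed through one uniform access mechanism.

class DataMarketplace:
    def __init__(self):
        self._sources = {}

    def register(self, name, fetcher):
        """Publish a new data source under a standard name."""
        self._sources[name] = fetcher

    def get(self, name, **params):
        """Uniform access: consumers need not know the provider's details."""
        return self._sources[name](**params)

marketplace = DataMarketplace()

# A newly identified source is registered once (hypothetical NAV history):
marketplace.register(
    "nav_history",
    lambda fund: [("2017-06-01", 101.2), ("2017-06-02", 101.4)],
)

# A client application then integrates the data with a single call:
rows = marketplace.get("nav_history", fund="FUND-X")
print(len(rows))  # 2
```

In practice the `fetcher` would be an HTTP endpoint behind the API platform rather than an in-process function, but the consumer-facing contract, one name and one access pattern per dataset, is the same.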
I am your virtual assistant: how can I help you?
Other technologies, such as artificial intelligence (AI), Natural Language Processing (NLP), Natural Language Generation (NLG) and process automation, can be used to improve efficiency and experience for clients and employees.
BNP Paribas Securities Services is testing a virtual assistant to allow clients to access help functions, get smart help in response to their queries and access a to-do module.
Automation of repetitive tasks will further free up staff to focus on work requiring expertise and creativity. For example, a sales team supported by automatically generated client reports could focus on value-added tasks such as relationship-building.
With automated processes, audit trails are more transparent and more readily available. All of this will also help meet new regulatory requirements, which demand the processing of huge quantities of data points and high-level qualitative analysis.
Regulation, demanding as it is, is not the only reason financial services companies need to consider carefully how they will store and manage data in the years to come. They will also need to establish and enforce strong ethical policies in their data management practices if they wish to retain the trust of customers, who are likely to become increasingly sensitive to the dangers of data misuse.
Custodians: the digital safe for data
Cybersecurity, data traceability and auditability, as well as how data quality is managed, are now gaining critical weight in policies. In many cases, these constraints impose new architectures and new methods.
In response to the need for these new structures and the vital importance of data quality, BNP Paribas Securities Services has implemented a governance framework dedicated to data quality management and data protection with related roles and responsibilities.
The chief data officer (a position established in each of BNP Paribas’ entities) steers the framework and ensures that data protection and data quality and integrity principles are implemented across governance, processes and tools.
BNP Paribas, which is strongly engaged with data management best practices, has implemented a training path for its employees called “Know Your Data Culture”, with the objective of making sure each employee understands the importance of data quality and integrity.
Data governance comes first
In the Big Data Executive Survey for 2017 by New Vantage Partners LLC, the 50 senior business and technology executives who participated in the survey (three out of four of whom work in financial services) outlined the difficulty of organisational and cultural change around big data.
“More than 85% of respondents report that their firms have started programs to create data-driven cultures, but only 37% report success thus far. Big data technology is not the problem; management understanding, organisational alignment, and general organisational resistance are the culprits.”
Although the digital revolution has the potential to transform financial services, it is vital that companies put in place a governance structure suitable for using and controlling the powerful new tools at their disposal.
There is a huge opportunity for financial services to get ahead of the data management revolution. The challenge will be to find the best path to combining agility and security.