FRTB: BANKS MUST LEARN IT'S ABOUT THE DATA & IT'S GOING TO COST
FRTB should have happened 20 years ago, and would have ameliorated some of the effects of the 21st century's financial disasters; that alone highlights its necessity. Still, better late than never.
On 22 March 2018 the Basel Committee finally acted and wisely delayed the introduction of the Fundamental Review of the Trading Book (FRTB), postponing the go-live date to 01 January 2022.
The original live date of 01 January 2020 was unrealistic given the MiFID II-sized black hole created for Financial Institutions in terms of resources and expertise.
Please visit www.marketdata.guru/2018/08/15/frtb20180815 to access the PDF with graphics.
The data elements of FRTB are critical to its success as a regulation, yet this appears not to be fully understood in terms of its enterprise-wide impact and application. With FRTB and other regulations, the regulators have shifted the data benchmarks.
Before: Regulators would set a benchmark of what standards must be met.
After: Regulators are now defining the criteria of how these benchmarks must be met.
This dramatically raises the bar on data sources’ integrity and quality assurance which means resources, effort, and costs must be devoted to understanding what is required and how to meet these requirements.
This means we need to ask questions concerning exactly what a Bank’s data requirements are going to be. Here we can only scratch the surface.
Question. What does FRTB require in real world data terms?
There are 5 market data aspects that require consideration.
- Market Data in conjunction with trading data
- Reference Data including data classification, a significant challenge
- Data Quality, wrong inputs will materially affect capital costs
- Analytics & Models, will require regulatory approval
- Technology & Infrastructure, FRTB will increase data generation, requiring greater access and storage capacity
In addition, there are 8 critical features driving the data requirements of FRTB.
- Daylight, clear separation between the trading book and banking book
- Shortfall, shifting from Value-at-Risk to Expected Shortfall, i.e. from evaluating risk at a quantile to valuing the prospective loss beyond it
- Horizons, creating ‘liquidity horizons’ for illiquid and complex products
- Sensitivity, placing increased emphasis on data quality
- Model Based, more rigorous approval processes with consistent identification of material risk factors
- Empirical evidence that practices are effective during periods of stress
- Transparency, the creation of level playing fields
- Mandatory standardisation, requiring standardised-approach calculations that are publicly disclosed
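The 'Shortfall' feature above deserves a concrete illustration. FRTB replaces the 99% Value-at-Risk measure with a 97.5% Expected Shortfall. A minimal sketch in Python, using purely illustrative random P&L data and deliberately omitting FRTB's liquidity-horizon scaling:

```python
import random

# Illustrative toy comparison of 99% VaR and 97.5% Expected Shortfall (ES).
# The P&L numbers are randomly generated, not real trading data.
rng = random.Random(42)
pnl = sorted(rng.gauss(0, 1_000_000) for _ in range(1000))  # daily P&L, USD

n = len(pnl)
var_99 = -pnl[int(0.01 * n)]        # loss threshold exceeded on the worst 1% of days
tail = pnl[: int(0.025 * n)]        # the worst 2.5% of days
es_975 = -sum(tail) / len(tail)     # average loss across that tail

print(f"99.0% VaR: {var_99:,.0f}")
print(f"97.5% ES : {es_975:,.0f}")
```

The point of the shift: VaR reads off a single quantile and is blind to how bad the tail beyond it is, whereas ES averages the tail, so the quality of the data feeding that tail directly drives the capital number.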
Question. What are FRTB’s direct data impacts?
- There will be a substantial increase in market data consumption, as well as market data generation needed to comply with FRTB.
- Banks will need to significantly invest in sourcing data, quality assurance and technical infrastructure to manage the data.
This places an emphasis on appropriate due diligence when assessing the market data requirements needed to be compliant.
Question. What are the consequences?
- Capital Requirements. Bad data will lead to higher capital requirements, as sourcing the wrong data will inhibit the ability to run efficient risk-weighted asset (RWA) calculations
- Data Sourcing. Greater premiums placed on (a) Quality, (b) Sufficiency/Quantity, (c) Real & Derived from observable transactions, (d) Consistency, and (e) Frequency
- Data Accessibility. An increase in the number of calculations means more efficient data processes are needed, all requiring technology infrastructure
- Audit Trails. These must be in place, and data bottlenecks need to be eliminated
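To make the data-sourcing criteria above concrete, here is a minimal sketch of the kind of automated quality checks involved. The `quality_flags` helper and its thresholds are hypothetical illustrations, not anything FRTB prescribes:

```python
from datetime import date, timedelta

def quality_flags(series, max_stale_days=5):
    """Illustrative checks on a {date: price} series.

    Flags calendar gaps (frequency) and runs of unchanged prices (quality).
    Thresholds are hypothetical, not FRTB-prescribed.
    """
    dates = sorted(series)
    flags = []
    stale_run = 0
    for prev, curr in zip(dates, dates[1:]):
        # Frequency: flag calendar gaps longer than a week
        if (curr - prev).days > 7:
            flags.append(("gap", prev, curr))
        # Quality: flag long runs of unchanged (stale) prices
        if series[curr] == series[prev]:
            stale_run += 1
            if stale_run >= max_stale_days:
                flags.append(("stale", curr))
        else:
            stale_run = 0
    return flags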
Question. What are the fundamental data requirements?
Based on the questions asked, there are 4 sets of parameters for data. These are:
- Models. Does the bank have FRTB-compliant models? If not, what needs to be done?
- Data Inventory. Identify the datasets currently available internally and externally, assess the data QA and identify the gaps
- Data Sourcing. Define the data requirements and provide a clear assessment of what needs to be sourced from where to be FRTB compliant
- Technology. What are the technology gaps? This goes further than infrastructure; it highlights the necessity of best-practice symbology management
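At their simplest, the Data Inventory and Data Sourcing steps above reduce to a gap analysis between required and available datasets. A toy sketch, with hypothetical dataset identifiers chosen purely for illustration:

```python
# Hypothetical dataset identifiers, purely for illustration
required = {"eq_spot", "fx_spot", "ir_curves", "credit_spreads", "commodity_fwd"}
internal = {"eq_spot", "ir_curves"}   # held in-house today
external = {"eq_spot", "fx_spot"}     # licensed from vendors

available = internal | external
gaps = required - available      # must be newly sourced to be compliant
overlap = internal & external    # duplicated spend, candidates to consolidate

print("gaps:", sorted(gaps))
print("overlap:", sorted(overlap))
```

Trivial as set arithmetic, the hard part in practice is the symbology: the same instrument carries different identifiers across internal systems and vendors, which is why consistent symbology management matters.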
Question. What does this mean?
Banks need to address their data requirements from the strategic level down to the desk level. This also requires Banks to understand that data is not only a cost but also a very important resource. This requires new approaches to data sourcing and management; in other words, the existing frameworks for data management across the business are becoming obsolete.
This is driven not only by more intrusive regulatory data requirements, such as FRTB, but also by the advent of 'The Cloud', 'Big Data', and access to new data sources in the form of traditional 'structured data' and unconventional 'unstructured data'.
Question. What needs to be done?
There has been insufficient communication globally on what FRTB means and what is required to implement it. IOSCO and global regulators need to promote FRTB better, especially outside the US, UK and EU, to markets where there is less knowledge about, and even less understanding of, FRTB.
There is still a lack of preparedness within the financial community, especially when evaluating the data requirements, so there needs to be greater education regarding what is expected to become FRTB compliant.
Greater awareness is needed of the data impacts of FRTB, especially when benchmarking the data required to be FRTB compliant.
The danger is Banks, especially those most at risk, will subscribe to cheaper data to populate their FRTB models as a cost cutting exercise, ignoring the principle of ‘rubbish in, rubbish out’.
Banks must implement a comprehensive, coherent data strategy avoiding over-complication. Emphasis needs to be placed on Data Sourcing and effective data quality assurance with continuous validation.