BCBS 239 is a set of principles to strengthen banks' risk data aggregation capabilities and risk reporting practices, and to facilitate the resolution of banking crises.
The global financial crisis of 2007 revealed that banks' data management models were fragile and could not adequately support the identification and mitigation of risks. Aggregating risk exposure and identifying its concentration at various levels (group, legal entity, and operating unit) were essential requirements, yet many banks were unable to meet them. To fill this gap, in January 2013 the Basel Committee on Banking Supervision (BCBS) issued a regulation aimed at global systemically important banks (G-SIBs). Created by an international task force, BCBS 239 lists a set of principles (to be adopted by January 2016) aimed at strengthening risk data aggregation capabilities and internal and external risk reporting practices.
In April 2020, a new document, Progress in adopting the Principles for effective risk data aggregation and risk reporting, was released. It provides further points of reference as well as a snapshot of the state of adoption of the regulation as of that date.
But it is not only about G-SIBs. The Basel Committee strongly recommended that national supervisory bodies apply the same principles to domestic systemically important banks (D-SIBs) within three years of their designation.
Regulation objectives
The main aim is to improve the banks’ ability to aggregate risk data and thus facilitate the resolution of banking crises.
As the regulation indicates, within the recovery intervention framework a reliable risk data management system would help banks and supervisory bodies foresee issues. It would also make it easier to find alternative solutions to restore the financial soundness and sustainability of banks in serious difficulty; for example, it can improve the prospects of finding a compatible merger partner.
All in all, adjusting to BCBS 239 gives the banks advantages in terms of efficiency, lower probability of losses, improved strategic decision-making, and, finally, higher profitability.
Principles for effective aggregation and reporting of risk data
To sum it up, these principles aim at enhancing the banking procedures related to financial risk management and corresponding decision-making processes. The fundamental principles are:
- Completeness, integrity, and granularity: both internal and external reporting systems must cover all the main risks the bank is exposed to. Moreover, they must do so efficiently and with an appropriate system of controls.
- Governance: Top management and risk committees should be informed at least once a year about the status of alignment to the BCBS 239 principles (completeness, Data Quality, reporting timeliness); significant gaps should be the subject of an action plan for remediation.
- Adaptability and responsiveness in case of financial crises: the reporting system must be flexible enough to respond to ad hoc requests as well. These may arise in crisis situations, from internal developments, or from supervisory bodies.
On what focus points should we base a methodology to achieve regulatory compliance?
We identify three key points:
GOVERNANCE AND IT INFRASTRUCTURE
Strengthening the current Data Quality Framework and its system of controls by:
- extending Data Governance to the activity of risk data processing and the preparation of the related reporting
- developing the IT architecture towards supply chain integration and efficiency in preparing the reporting
RISK DATA AGGREGATION CAPABILITIES
Banks should be able to monitor financial risks in a reliable way via:
- overseeing data accuracy and integrity while minimizing manual intervention
- constant and complete updating of the data
- adaptable and flexible data to meet specific requirements
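The automated oversight of data accuracy and integrity described above can be illustrated with a minimal sketch. The field names, rules, and thresholds below are hypothetical, not taken from any specific bank's control framework; the point is simply that controls run as code, minimizing manual intervention.

```python
# Illustrative sketch: automated data-quality controls applied to risk records.
# All field names and rules here are assumptions for demonstration purposes.

def run_controls(records, rules):
    """Apply each named rule to every record; collect (record index, rule) failures."""
    failures = []
    for i, record in enumerate(records):
        for name, rule in rules.items():
            if not rule(record):
                failures.append((i, name))
    return failures

# Example rules: completeness (Client ID must be present) and
# accuracy (probability of default must lie in [0, 1]).
rules = {
    "client_id_present": lambda r: bool(r.get("client_id")),
    "pd_in_range": lambda r: 0.0 <= r.get("pd", -1) <= 1.0,
}

records = [
    {"client_id": "C001", "pd": 0.02},
    {"client_id": "",     "pd": 0.10},   # fails completeness
    {"client_id": "C003", "pd": 1.70},   # fails accuracy
]

print(run_controls(records, rules))
# → [(1, 'client_id_present'), (2, 'pd_in_range')]
```

In a real platform the failing records would then feed a remediation workflow rather than a simple printed list.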
RISK REPORTING PRACTICES ENHANCEMENT
Ensure the data is available for the right people at the right time via:
- data accountability, guaranteed by Data Governance
- reporting accuracy
- completeness, clarity, and timeliness of the reports
- creating collaboration tools for the various business actors involved in the automated processes
But what to do in practice?
The first recommended activity is to identify the scope of analysis of the data related to the greatest banking risks. For the above-mentioned principles to take root, it is essential to define and apply the criteria that determine the application perimeter of the regulation. The aim is to prioritize the most relevant actions for different types of risk:
- credit risks
- financial risks (liquidity and rate) and operational risks
Another useful criterion for defining the intervention perimeter is the reporting already present within the bank:
- Management reporting
- Regulatory reporting (e.g., FINREP, COREP, etc.)
The second recommended activity concerns risk architecture recognition, i.e., mapping the main application systems relevant in the data life cycle (e.g., the ALM datamart, General Ledger, DWH, loan systems, etc.). This step is fundamental as it prepares the critical data analysis activities, in particular data mapping within the data life cycle. Involving Risk Management helps identify the various local systems and the intermediate preparation and data consolidation systems, and point out the data streams exchanged between them.
Once all the data in the analysis perimeter is identified, we can proceed to establish the list of the most critical data and the detailed logic for identifying and prioritizing it. This skimming of critical data is done by analyzing data relevance. The analysis is usually conducted according to significance criteria as well as the findings of the Business and IT teams involved.
Significant data of highest priority could be, for example, “the economic activity code, Tax ID, Client ID, the probability of default, the approved granted amount, etc.”
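The skimming by data relevance described above can be sketched as a simple weighted scoring exercise. The criteria, weights, and candidate items below are illustrative assumptions, not a prescribed methodology; in practice the criteria would be agreed between the Business and IT teams involved.

```python
# Hypothetical data-relevance skimming: each candidate data item is scored
# against weighted significance criteria. Weights and criteria are assumptions.

CRITERIA_WEIGHTS = {
    "used_in_regulatory_reporting": 3,
    "feeds_risk_models": 2,
    "shared_across_systems": 1,
}

def relevance_score(item):
    """Sum the weights of all significance criteria the item satisfies."""
    return sum(w for c, w in CRITERIA_WEIGHTS.items() if item.get(c))

candidates = [
    {"name": "Tax ID", "used_in_regulatory_reporting": True,
     "feeds_risk_models": False, "shared_across_systems": True},
    {"name": "probability of default", "used_in_regulatory_reporting": True,
     "feeds_risk_models": True, "shared_across_systems": True},
    {"name": "branch phone number", "used_in_regulatory_reporting": False,
     "feeds_risk_models": False, "shared_across_systems": False},
]

prioritized = sorted(candidates, key=relevance_score, reverse=True)
print([c["name"] for c in prioritized])
# → ['probability of default', 'Tax ID', 'branch phone number']
```

The items at the top of the ranking would then form the critical data list on which the subsequent lineage and control activities focus.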
Finally, with this data one can proceed to:
- define a common nomenclature and semantics for each data item, transversal to the bank's structures
- collect all the data necessary to define the data lineage for critical data (applications, controls, streams, etc.)
- sum up the information collected from the data lineage analysis and make it available graphically, allowing a more accurate and useful knowledge of the data life cycle
- summarize all the quality controls applied to an individual data item across different structures, which is useful for creating a unified control dictionary
- verify the consistency of the data with the BCBS 239 principles, which allows prioritizing areas for improvement and identifying the most critical data
- identify the corrective actions for filling the functional gaps identified during the analysis and make the data conform to BCBS 239.
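At its core, the data lineage collected in the steps above is a directed graph: application systems are nodes and data streams are edges. A minimal sketch of tracing a critical data item's upstream sources might look as follows; the system names are illustrative, not a reference architecture.

```python
# Hypothetical data lineage sketch: systems are nodes, data streams are
# directed edges. upstream[target] lists the source systems feeding it.
# System names are illustrative assumptions.

upstream = {
    "Risk Datamart": ["DWH", "General Ledger"],
    "DWH": ["Loans", "Core Banking"],
    "General Ledger": ["Core Banking"],
}

def trace_lineage(system, seen=None):
    """Walk upstream from a system, returning every source it depends on."""
    if seen is None:
        seen = set()
    for src in upstream.get(system, []):
        if src not in seen:
            seen.add(src)
            trace_lineage(src, seen)
    return seen

print(sorted(trace_lineage("Risk Datamart")))
# → ['Core Banking', 'DWH', 'General Ledger', 'Loans']
```

The same graph structure is what makes a graphical, interactive lineage view and impact analysis possible: changing one system immediately reveals every downstream report it affects.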
Why choose Irion?
For years, Irion has been supporting banking clients in achieving regulatory compliance. We do so by speeding up the activities that the data management process requires and by supporting effective collaboration between the worlds of IT and business. Irion EDM facilitates sharing, collaboration, and optimal data management, as well as the automation of operational processes, through clear traceability of the entire data life cycle and the production of BCBS 239 reporting for audit purposes. If your needs relate to regulatory reporting, and not just risk data, then Irion is for you! Our experience is at your disposal. Request a demo.
We can help you with:
- powerful control engines that perform 2.5 million controls per minute, verifying over 60 million records;
- a flexible and collaborative Data Quality Governance system so that different data specialists can interact;
- an effective system to manage poor data quality issues and remediation;
- a module that lets you adopt already tested metrics or define, calculate, and analyze any type of indicator on any type of business process;
- automations that generate technical rules based on metadata in a smart way and in a few seconds;
- cutting-edge technologies, such as artificial intelligence and machine learning, built into the platform; for example, they can suggest the most fitting business controls;
- the automatic verbalization of technical rules to facilitate interaction between business and IT users; constantly updated documentation;
- dashboards for continuous monitoring;
- an ideal Data Lineage and impact analysis tool that provides an interactive graphical representation of the relationships between data and can quickly handle millions of data items;
- and much more…
Want to know more?
We will provide you with illustrative examples of how other organizations have already started their transformation.