Solvency II & Data Quality Management

Samuel Hendry 06.10.2015

At Morgan McKinley we encourage partnership through knowledge. Our Senior Appointments Business Partner Susan Kelly has reached out to a specialist to impart high-level expertise in the area of Solvency II. Through a series of blogs, Alan Lush will take us through the challenges surrounding Solvency II and its ongoing impact on your business. This is Alan's second blog on the topic.

“Data, data everywhere...” (with apologies to Samuel Taylor Coleridge!)

One of the most challenging aspects of implementing Solvency II is undoubtedly the issue of data. The original directive said very little on the subject, but by the time the Delegated Acts (Commission Delegated Regulation (EU) 2015/35) appeared in 2014, things had moved on considerably.

The requirement

EIOPA requires insurers to have very tight controls over the accuracy, completeness and appropriateness of all data that goes into the calculation of Technical Provisions, and hence into the SCR calculation. While this is to some degree open to interpretation by the insurers themselves, evidence is emerging that regulators will expect to see at least all of the following in relation to this data:

  1. A complete trace of the data from the point where it is captured by the undertaking to the point where it ends up in the SCR output – in other words, in the Quantitative Reporting Templates (QRTs), ORSA, RSR and SFCR (see the lineage sketch after this list).
     
  2. Evidence of regularly tested controls, showing how accuracy is maintained at each point where the data is moved from one system to another, or from one file to another, or is processed by calculations.
     
  3. A “directory” of all the data used in the calculation of Technical Provisions – although of late this requirement seems to have evolved into a Data Dictionary, a much more complex and detailed artefact.
     
  4. Evidence of strict management of data contained in End User Computing packages such as Microsoft Excel or Access.
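
To give point 1 some shape, the sketch below models a lineage record for a single data item in Python. It is a minimal illustration under my own naming assumptions (LineageHop, DataTrace and all field names are mine); the regulation prescribes the traceability itself, not any particular format.

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class LineageHop:
        source: str          # system or file the data left, e.g. "PolicyAdmin"
        target: str          # system or file it entered, e.g. "ActuarialModel"
        transformation: str  # calculation or mapping applied ("" for a straight copy)
        control_ref: str     # the tested control evidencing accuracy (point 2)
        moved_at: datetime   # when the movement took place

    @dataclass
    class DataTrace:
        item_name: str           # e.g. "gross_written_premium"
        captured_in: str         # point of capture within the undertaking
        reported_in: list[str]   # final outputs: QRTs, ORSA, RSR, SFCR
        hops: list[LineageHop] = field(default_factory=list)

    # Example: premium data captured in policy administration, converted and
    # reconciled into the actuarial model, then reported in QRT S.05.01.
    trace = DataTrace("gross_written_premium", "PolicyAdmin", ["QRT S.05.01"])
    trace.hops.append(LineageHop("PolicyAdmin", "ActuarialModel",
                                 "currency conversion to EUR",
                                 "CTRL-017 monthly reconciliation",
                                 datetime(2015, 9, 30)))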

These requirements are extremely difficult for insurers to meet, for several reasons:

  • The requirement is essentially a new one
  • Much of the data involved was laid down in a past era, in some cases 20 to 30 years ago, when no such stipulations applied
  • All insurers rely heavily on spreadsheets to process TP data, and these are notoriously prone to errors: unaudited changes to formulae, version confusion and other forms of innocent corruption.


Where do we start?

By now, you should have made significant inroads into the issue of Data Quality, and may already have some or all of the following in place:

  • A Data Quality policy – approved by the Board of Directors and reviewed regularly
  • A Data Quality framework – this usually involves assigned data owners, data stewards and a change management system for amendments. Regular attestation of compliance with the Data Quality policy is also required from the data owners.
  • A data dictionary – every data item that goes towards the calculation of Technical Provisions should be described here, with detailed information such as data name, data type, data description, field length and number of decimal places for numeric data, tolerance, the systems in which the data is used, the owner of the data item and an audit of any changes that have been made to it (a sketch of such an entry follows this list).
  • Data Quality reports that go to the Board, possibly via a sub-committee such as the Board Risk Committee, with actions for remediation where data is found to be inaccurate or inconsistent.
  • An End User Computing policy and control framework that ensures that access to critical spreadsheets is controlled and audited, together with some method of verifying the accuracy of the spreadsheets in use. This can be accomplished by any one of a number of proprietary software tools designed especially to find formula errors that may cause material deviations from the true TP, and consequently SCR, position (a simple home-grown check is sketched below).
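
To illustrate the data dictionary item above, here is a minimal sketch of what a single entry might look like, again in Python. The structure and field names (DataDictionaryEntry and its attributes) are my own assumptions, since neither the Directive nor the Delegated Acts prescribes a format; they simply mirror the attributes listed in the bullet above.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class DataDictionaryEntry:
        data_name: str                 # canonical name of the item
        description: str               # business meaning of the item
        data_type: str                 # e.g. "decimal", "date", "text"
        field_length: int              # maximum length of the field
        decimal_places: Optional[int]  # numeric data only
        tolerance: Optional[float]     # acceptable deviation before remediation
        systems_used_in: list[str] = field(default_factory=list)
        owner: str = ""                # the accountable data owner
        change_log: list[str] = field(default_factory=list)  # audit of amendments

    entry = DataDictionaryEntry(
        data_name="claims_outstanding",
        description="Case reserves for reported but unsettled claims",
        data_type="decimal",
        field_length=15,
        decimal_places=2,
        tolerance=0.01,
        systems_used_in=["ClaimsSystem", "ActuarialModel"],
        owner="Head of Claims",
    )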

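As a simple illustration of the sort of automated verification such EUC tools perform, the sketch below fingerprints every formula in a workbook, so that an unaudited change shows up as a digest mismatch at the next reporting run. It assumes the openpyxl package; the file name and the stored digest are illustrative, and proprietary tools of course go far beyond this.

    import hashlib
    import openpyxl

    def formula_fingerprint(path: str) -> str:
        """Return a SHA-256 digest over every formula in the workbook."""
        wb = openpyxl.load_workbook(path, data_only=False)  # keep formulas, not cached values
        digest = hashlib.sha256()
        for ws in wb.worksheets:
            for row in ws.iter_rows():
                for cell in row:
                    # Formula cells hold their formula text when data_only=False
                    if isinstance(cell.value, str) and cell.value.startswith("="):
                        digest.update(f"{ws.title}!{cell.coordinate}{cell.value}".encode())
        return digest.hexdigest()

    # Compare against the digest recorded in the EUC register when the
    # workbook was last approved (the stored value here is illustrative).
    approved = "9f2c..."
    if formula_fingerprint("tp_calculation.xlsx") != approved:
        print("Unaudited formula change detected - escalate before the TP run")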

Conclusion

There is very little doubt that this is one of the most challenging aspects of the new regulatory regime for insurers. Regulators will be vigilant in their assessment of compliance with the requirements, and those found wanting can expect capital add-ons until the issues are resolved to the supervisor's satisfaction.

Next time we will be looking at an often overlooked aspect of the Solvency II Directive – but one which has the ability to cause considerable difficulty for those who are unaware of it.

About Alan

Alan Lush is a Senior Consultant in the Financial Services industry who has been focusing on Solvency II for the past 5 years. He has assisted in the preparation of a total of six Irish insurers for the advent of the new regulation, including insurers on both the Standard Formula and the Internal Model. Prior to his involvement with Solvency II, Alan worked on implementing the Capital Requirements Directive (Basel II) for a number of Irish financial institutions. Alan has had a long career in consulting, both locally and internationally, having worked for many years with IBM's Business Consulting Services and International Financial Services Solution Centre.

Connect with Alan Lush on LinkedIn

Samuel Hendry
Director | Financial Services Recruitment
shendry@morganmckinley.ie
