- Eric Leman, Director and solutions specialist, Moody’s Analytics
- Tobias Spanka, General manager, Germany and Austria, Bureau van Dijk
- Aurel Schubert, Director, General statistics, European Central Bank
- Jerry Goddard, Director, Wholesale risk, Santander
- Moderator: John Anderson, Journalist
Regulators are waking up to the benefits of big data. Traditional reporting of summary positions and exposures by banks is increasingly being replaced by much larger, more granular datasets. The aim is to give supervisors a clearer picture of firm and sector trends and risks, which could allow them to make better micro- and macroprudential decisions.
For reporting firms, there are short-term implications for the way they manage their data – a new round of cleaning, reconciling and standardising – and longer-term implications for the way they are supervised.
For regulators, the question is whether they can unlock the promise of these rich datasets. Early forays into the space have not gone well, as demonstrated by ongoing struggles to obtain usable data from the post-crisis derivatives reporting regime.
This webinar looks at the challenges facing banks as data requirements expand, considering initiatives such as the European Central Bank’s Analytical Credit Datasets project – also known as AnaCredit – as well as similar big data initiatives in the UK and elsewhere.
Key discussion points include:
- Whether the regulatory rationale stands up to scrutiny
- The changes for reporting firms
- The data management challenges of AnaCredit and similar projects
- The future of bank supervision
- Where else regulatory big data can be applied