Industry experts gathered for a roundtable discussion in New York at the end of May to debate the ways firms acquire and use information about risks. Moderated by Ellen Davis.
What data challenges is your firm currently faced with and, specifically, what are your challenges related to technology architecture, data integration and data quality and management?
Marta Johnson (senior vice-president, global markets – business control, Bank of America):
Data gathering is a real problem. As with many organisations that have been through any number of mergers and acquisitions, we have a very disjointed platform and architecture. We have an effort under way to roll out key risk indicators (KRIs) across the platform, and have been able to come up with a handful that we can actually get data on. We've tended to keep it at a very high level because of the reorganisations that have happened. It has taken a tremendous amount of work to try to keep that database up to date. A lot of the risks we see are where we are dependent on other people to perform functions for us, and those connections are frequently the source of some large losses, because the assumption was that somebody was watching or doing something – and they're not. Even backing up data: you would think that wouldn't be so hard, but everyone defines backup differently. That's where your risk exposure is huge.
Sandeep Vishnu (managing director and head of the enterprise risk management practice for North America, BearingPoint):
I would like to follow up on what Marta said with a real experience my team had five years ago. We were talking to a technology organisation that had just rolled out a new application, and we wanted to know how it was working for them. Our contact said: "We have a database, we can tell you – we've had these trouble tickets." We said: "OK, show me the database." The gentleman proceeded to open his file drawer. That was the database. Hopefully this sort of thing is happening less and less, but it probably still occurs at small and mid-size institutions – the large multinational banks may have it to a lesser degree. What constitutes a database might not be what you think of as a database.
Joseph Sabatini (managing director and head of corporate operational risk, JP Morgan):
I will say that I think JP Morgan Chase, and I say this with some pride, has been one of the pioneers in trying to move forward on the data and the technology in question. Have we been 100% successful? Absolutely not. If I could turn back the clock, would I do some things differently? Very definitely. But today we have in place, and have had in place for a few years, a single tool, architecture and set of protocols for collecting loss data. We have had a single tool, architecture and set of protocols in place for doing self-assessments. There is, dare I say, a data warehouse with information in it and a reporting platform on top of it. We are able to produce countless reports looking at the data in different ways.
I would say that the biggest challenge we have is: how accurate is the data, really? We have a great partnership with the businesses and a great partnership with audit. In the culture of JP Morgan Chase, people believe in good data, a single source of data, transparency of the data, escalation of the data and accountability before any action plans happen. If you were to listen to a Jamie Dimon speech on what he expects from his business managers, versus the aspirational discussions that an op risk function would have had a few years ago about what we need to build, there would be enormous amounts of commonality.
I think that while we have some challenges to work through, we have the big investment behind us and the kind of agreement by the businesses that it is appropriate to have an initiative where everybody is, more or less, and I use that cautiously, on the same system. From a technology standpoint, we have made great progress, albeit there's always a to do list in front of us.
Question from audience: The best data for self-assessments comes from frontline managers, and without that data self-assessments can get stale very quickly. How do you get the partnership of the business units to get the data needed from individuals who are in the business units?
I get frustrated when people tell me they do their self-assessments once a year, or monthly, and they get kind of stale. Every bank has credit analysts, and every bank has ratings for General Motors, Ford and every industry and every exposure it has. Those credit analysts feel an ownership of their particular view on those credit exposures – or, for those shops with equity or fixed-income analysts, those analysts have a view on whether the stock of General Motors or Bank of America or whatever is a good buy. Do they redo those ratings every day? Does somebody say, "I'm going to go into the system today and confirm that a BB rating for General Motors is still accurate"? No. But they do feel ownership, and that they need to be on top of their industry and credit exposures, so that anybody in the firm who goes into the system and says "I'm thinking of taking credit risk with this industry or this particular counterparty" can rely on the fact that there's a professional there who is well paid, well motivated and feels accountable for that rating being accurate.
The same thing needs to be true of self-assessments. This "drop your day job, let's do the self-assessments because it's December 15" approach absolutely does not work and provides no value to anybody throughout the course of the year. There's a lot we can learn from credit risk and market risk. We can borrow those mentalities, cultures and expectations about data quality, accountability and ownership of the risks, and hold op risk to the same standard – I think that would be beneficial.
Question from audience: Do you see any sort of direction or trend in terms of which key indicators people are starting to collect? You all have a lot of data but, in terms of key risk indicators, what is coming out from either the regulator or, just in general, industry best practices?
Key risk indicators have probably taken on a life of their own. We know some institutions that say they are managing 1,500 KRIs, at which point the word "key" has no meaning any more. There is very little prescriptive guidance. Regulators are not saying "manage these elements" because that brings forward liability. There are probably a handful of indicators that companies have gravitated towards, which they are using to manage the business.
The average key risk indicator is a key performance indicator that starts to fall outside tolerance levels. Each business already knows what they are. They have a handful – 10, maybe 20 at the most. We have seen that people have not taken enough time to think about the two types of risk indicator – internal and external. Take data security as an example of an internal indicator – all the policies or procedures around managing customer data are potential key risk indicators. They cut across different departments, which may not have paid the same attention to what that means or the related impacts.
External indicators are those that come from outside, change the business and make it a true risk for a company. For example, in 2002 there was a bill pending in Congress authored by Senator Sarbanes and Representative Oxley. This was a risk indicator for companies because it would fundamentally affect how a business operates. We have seen very little prescription and not enough structure across organisation-wide indicators, but we have seen that managers have a good handle at the business level on the key metrics that people need to follow.
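The definition given above – a KRI as a KPI that starts to fall outside tolerance levels – can be sketched in a few lines of code. Everything here (the metric names, the values and the tolerance bands) is an illustrative assumption invented for this sketch, not anything taken from the discussion:

```python
# Minimal sketch: flag as "key risk indicators" any KPIs whose latest value
# falls outside its agreed tolerance band. Names and thresholds are hypothetical.

def flag_kris(kpi_values, tolerances):
    """Return the names of KPIs breaching their (low, high) tolerance band."""
    flagged = []
    for name, value in kpi_values.items():
        low, high = tolerances[name]
        if not (low <= value <= high):
            flagged.append(name)
    return flagged

# Illustrative business-level KPIs and tolerance bands
kpi_values = {"failed_trades_per_day": 42, "unreconciled_accounts": 3}
tolerances = {"failed_trades_per_day": (0, 25), "unreconciled_accounts": (0, 10)}

print(flag_kris(kpi_values, tolerances))  # only failed_trades_per_day breaches
```

The design point mirrors the panel's comment: the business already tracks the KPIs; the KRI layer is simply the small set of those metrics, with tolerances attached, that gets escalated when breached.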
Rei Tanaka (senior manager, financial services, BearingPoint):
We know that there are two types of key risk indicator – preventive and detective. In my experience, companies usually focus on the detective ones. The preventive side is obviously more challenging than tracking historically accumulated data: getting down to what's causing the risk to happen, and detecting the pieces of data or information that may lead to the risk becoming a loss in the future. That's the issue that business leaders, subject-matter experts and risk managers need to address. We need to get down to "what is the root cause, and what do we really need to manage to avoid future risk resulting in future losses?"
Question from audience: I just want to expand on the KRI question. To me, a big challenge is communicating with the end users of risk – management and business lines. We have a lot of information and we'd like it to be used as effectively as possible. There's been an evolution in how we report this data, what type of data we report and what type of metrics and measures of risk we report. Is there a set of best practices in terms of what type of data reporting should be done to stakeholders in firms?
Let me use one example. When we look at reporting, we generally see that there are four primary stakeholders – two external and two internal. The two external stakeholders are regulators and investors, and the two internal stakeholders are senior management and line management. For senior management, as well as the board of directors, we have found that risk reports need to address four things:
• How is the risk programme tracking – what have we done and where are we in developing an op risk programme?
• What is our loss history? Show me the last quarter, the last year – what are the trends in terms of our losses?
• What are the industry trends, and do we have any type of competitive insight into what others are doing?
• Has the regulatory environment weighed in on specific areas we need to address?
Boards and senior management are generally more worried about the large events that they haven't accounted for, or that the business might not have fully captured. They tend to be more comfortable that the 'expected' loss component is being addressed. What they want is insight into what the unexpected loss component may be, and how ready is the organisation to deal with it. This is just one example.
I would just say that reporting is perhaps one of our biggest challenges. At the user level, the person who is actually doing the self-assessments – collecting and reporting the loss data – has one set of information requirements, and their direct managers are dealing, cell by cell within the organisation, with very granular data. As you go up the chain of command, and certainly when you get to senior management or the board, there's no way you can bundle all that data into a two-inch-thick report and say "here's all the information, cell by cell." Yet we lack the talent, and the information and data standards, to aggregate the data, because everybody is giving it to you a little differently. What summary information can we produce at senior levels that gives real insight into the risk of the organisation? I'd say that is our biggest challenge.
Question from audience: There's been a lot of talk of culture and networks. I'm from a university. We've got an op risk management department that started just a year ago. We are spending a lot of our energy putting in the formalised structures, because we need them for compliance. You all seem to have a lot of experience. Do you have any advice on how one could develop these networks? Does it come top-down or bottom-up? It would really be helpful if you could gear your answer to an institution with 14 different colleges – we're not a single-business institution. We have dental schools, medical schools, law colleges, everything. Is there any advice on how to create that culture and those formalised networks? We want to make our efforts useful. Joe – you talked about what you would put in place if you could start afresh. We have that opportunity, so any advice would really be helpful.
Let me just start, because as chairman of the audit committee at Case Western Reserve University in Cleveland, I have a vested interest in understanding the risk culture that the academic world brings to the basic blocking and tackling of running a university, which is in itself a big business. The first thing I would observe is that from an op risk platform, whether you work at a university or a bank, you are not going to change the culture of the organisation. So say you have a "we're JP Morgan Chase, we're a cowboy, let's take any risk we want to take" organisation – a corporate initiative or some regulation is not going to change that.
You have to start with the realisation "what can I start with and what can I achieve within the culture that exists?", because some cultures are better than others. Every culture has some weaknesses. Again, I think the logic that we've seen within our industry has an immediate application certainly to academia, and I would assume elsewhere. That is, just understand where the big risks are and start with those. Don't try to boil the ocean, don't try to tackle all 14 colleges, don't try to tackle every one of the 100 risks that somebody says you might have.
Start with the bigger areas like the medical and the dental. Engineering is probably one where there's a lot of expense and revenue – also a lot more risk than maybe some of the other colleges. Be risk-based and start with the key areas. Have some early wins and take those early wins to the next generation, to the next school, and say "here's how we're doing it here and here's how we recommend you have it there." There's a great wisdom in terms of having early successes from a pilot initiative that is valued by the end user who then says "this was helpful to us" because then you can learn from that and take it to the next area.
I would agree. One of the things I've done in other organisations is ask the managers – or the key traders, or whoever is generating the revenue – what they are most worried about. They know off the top of their heads. For example, one of the guys who ran a money-market desk said: "I'm really worried about Joe and Steve going on vacation at the same time, because they're the only two who know how to do a certain job for us."
What should have been the indicator of that? We ended up looking at the difference between proprietary applications and customer-facing ones. I don't know how we stayed in business on the proprietary side – although it generated a phenomenal amount of money, all the investment in technology went to the customer-facing side, and the proprietary stuff was just crumbling. Just by having someone say something that started the thinking of "OK, how do I back into that?", we ended up with a very concrete picture of where we had a lot of risk exposure. Find out where the high areas of risk are, and why. Is it the revenue, the expense or the litigation? Find out what it is and then try to back into it.
Just a couple of quick observations. One – I think the point has been made that informal networks cannot substitute for formalised structures, so one should not ignore that there is a need for some formal structures. The second point is all about viral marketing. If you want an informal network to develop, you have to show quick wins so that you generate a buzz.
You want people to say "this is valuable – there's something I can get out of it" and that will lead to that informal interconnection and the organic growth of risk information being exchanged. Until that buzz happens, it's very hard to get a group of people together just because they are all risk managers. They might gather, but they will probably spend their time with their Blackberry rather than actually interact with each other, unless they believe, as Marta said, there's something of value in the process.