GDPR uncertainty could spell trouble for machine learning
Advisers warn of discrepancy in advice to regulators over automated decision-making
Industry participants have voiced concern over an apparent discrepancy between the text of the European Union’s new data privacy laws and subsequent interpretative guidance to regulators on how the new regime should treat a key use case of machine learning. Advisers say the first legal precedents set under the rules could have major implications for banks’ use of machine learning-based algorithms to automate everyday decisions such as credit card approvals.
Article 22 of the General Data Protection Regulation (GDPR), which came into effect on May 25, says a consumer has the right not to be subject to a solely automated decision that “produces legal effects concerning him or her, or similarly significantly affects him or her”.
Exceptions exist, however, where such automation is deemed a necessary part of a contract between two parties, or where explicitly authorised by local law. In this context, industry experts believe the right to a human decision-maker would be an active one that consumers will need to invoke.
However, guidelines on the GDPR issued earlier this year by the Article 29 Working Party – a group of representatives from the data-protection regulators of EU member states, since superseded by a permanent body – interpreted Article 22 as a passive right. On this reading, solely automated decision-making would be prohibited by default, rather than being something consumers must actively invoke.
Experts say such a strict reading of the regulation could severely curtail banks’ rapidly growing use of machine-learning techniques – a subset of artificial intelligence (AI) that derives predictions from large, dense data sets – to speed up decisions such as whether to pre-approve a potential customer for a loan. The discrepancy has left market participants uncertain how to comply with the regulation.
“[The guidance] is not clear. Many organisations are battling with it because the regulation says one thing, and then the guidance comes out and it actually doesn’t clarify, but almost provides more confusion over some of these things,” says one senior London-based cyber risk consultant.
The Article 29 Working Party was formally replaced by the European Data Protection Board at the end of May, when the GDPR came into force. The EDPB’s goal is to ensure supervisory authorities interpret and enforce the regulation consistently across member states.
Asked for comment, a spokesperson for the EDPB says that, given the sheer breadth of firms affected by the GDPR, offering regulatory guidelines customisable for individual people or firms would not have been feasible. Instead, where banks and other financial services firms acting as data controllers – defined as any party that determines the purpose and means of processing personal data – feel the regulation should be interpreted and applied differently, they must log this with the regulator and justify why.
“As data protection is a wide field with various different procedures, depending on how the data is processed and used, we cannot provide tailor-made codes of conduct for organisations,” the spokesperson says. “The ultimate responsibility for GDPR compliance lies with the controllers. If they believe the GDPR needs to be implemented in a certain way to suit their specific needs, which may differ from the proposed method by the EDPB, they will need to log and justify this for possible future audits [or] controls.”
Bigger issue
Andrew Burt, chief privacy officer and legal engineer of data-management platform Immuta, agrees that the difference between the base text of the GDPR and February’s guidance appears substantial. However, he says the bigger issue is the uncertainty around how regulators plan to enforce the GDPR on machine learning in general.
“The Article 29 Working Party interpretation would add a significant compliance burden to machine learning within the EU, pure and simple,” Burt says. “What I’m concerned about is the current, pretty significant ambiguity in how data-protection authorities are actually going to implement these provisions. When it comes to automated decision-making, the line between GDPR compliance and non-compliance is simply not yet clear.”
That ambiguity has left market participants questioning how they should oversee automated and machine-learning techniques within their own firms.
The senior consultant suggests banks could look to implement an “airgap” of human intervention in their models. However, the level of human involvement required before a model no longer counts as solely automated is unknown, he adds. In the absence of further guidance, firms will have to rely on legal advice when implementing such measures, which will then be tested by regulators and a precedent set.
“Is a staff check of one in a million enough human intervention? Probably not, but [a precedent will be set] when you go down this untested road of how this is going to fall out in court,” he says. “AI is a massive thing that is not going to be stopped by [the] GDPR – but it certainly will make organisations think about [whether] they do it in the right way, in an ethical and transparent way, and how they do it to the benefit of people and themselves while not harming other people with the decisions being made.”
Editing by Tom Osborn