GDPR uncertainty could spell trouble for machine learning

Advisers warn of discrepancy between GDPR text and guidance to regulators over automated decision-making

Banks' growing use of machine learning techniques to automate credit checks could fall foul of GDPR

Industry participants have voiced concern over an apparent discrepancy between the text of the European Union’s new data privacy laws and subsequent interpretative guidance to regulators on how the new regime should treat a key use case of machine learning. Advisers say the first legal precedents set under the rules could have major implications for banks’ use of machine learning-based algorithms to automate everyday decisions such as credit card approvals.

Article 22 of the General Data Protection Regulation (GDPR), which came into effect on May 25, says a consumer has the right not to be subject to a solely automated decision that “produces legal effects concerning him or her, or similarly significantly affects him or her”.

Exceptions exist, however, where such automation is deemed a necessary part of a contract between two parties, or where it is explicitly authorised by local law. In this context, industry experts believe the right to a human decision-maker would be an active one that consumers would need to invoke.

However, guidelines on the GDPR issued earlier this year by the Article 29 Working Party – a group of representatives from the data-protection regulators of EU member states, now superseded by a permanent body – interpreted Article 22 as a passive right. On this reading, solely automated decision-making would be prohibited by default, rather than permitted unless a consumer invokes the right to object.

Such a strict reading of the regulation could have a severe impact on banks’ rapidly growing use of machine-learning techniques – a subset of artificial intelligence (AI) in which algorithms learn from large, dense data sets to make predictions – to speed up decisions such as whether to pre-approve a potential customer for a loan, experts say. The discrepancy has left market participants uncertain how to comply with the regulation.
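To make concrete the kind of process Article 22 captures, below is a minimal, purely hypothetical Python sketch of a solely automated pre-approval: a model trained on historical outcomes scores an applicant, and the decision takes effect with no human step. The features, training data and 0.5 threshold are invented for illustration, not any bank’s actual system.

# Hypothetical, solely automated credit pre-approval -- the pattern
# Article 22 of the GDPR addresses. All data, feature names and the
# 0.5 threshold are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy history: [income (k EUR), debt-to-income %, years of credit history]
X_train = np.array([[55, 20, 7], [30, 45, 2], [80, 10, 12],
                    [22, 60, 1], [48, 30, 5], [25, 55, 1]])
y_train = np.array([1, 0, 1, 0, 1, 0])   # 1 = repaid, 0 = defaulted

model = LogisticRegression().fit(X_train, y_train)

def pre_approve(features):
    # Solely automated: the model output is the decision; no human reviews it.
    p_repay = model.predict_proba([features])[0, 1]
    return "approved" if p_repay >= 0.5 else "declined"

print(pre_approve([40, 35, 3]))   # decided entirely by the model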

“[The guidance] is not clear. Many organisations are battling with it because the regulation says one thing, and then the guidance comes out and it actually doesn’t clarify, but almost provides more confusion over some of these things,” says one senior London-based cyber risk consultant.

Is a staff check of one in a million enough human intervention? Probably not
Cyber risk consultant

The Article 29 Working Party was formally replaced by the European Data Protection Board (EDPB) at the end of May, when the GDPR came into force. The EDPB’s goal is to ensure supervisory authorities interpret and enforce the regulation consistently across member states.

Asked for comment, a spokesperson for the EDPB says that, given the sheer breadth of firms affected by the GDPR, offering guidelines tailored to individual people or firms would not have been feasible. Instead, where banks and other financial services firms acting as data controllers – defined as any party that determines the purpose and means of processing personal data – believe the regulation should be interpreted and applied differently, they must log this with the regulator and justify why.

“As data protection is a wide field with various different procedures, depending on how the data is processed and used, we cannot provide tailor-made codes of conduct for organisations,” the spokesperson says. “The ultimate responsibility for GDPR compliance lies with the controllers. If they believe the GDPR needs to be implemented in a certain way to suit their specific needs, which may differ from the proposed method by the EDPB, they will need to log and justify this for possible future audits [or] controls.”

Bigger issue

Andrew Burt, chief privacy officer and legal engineer of data-management platform Immuta, agrees that the difference between the base text of the GDPR and February’s guidance appears substantial. However, he says the bigger issue is the uncertainty around how regulators plan to enforce the GDPR on machine learning in general.

“The Article 29 Working Party interpretation would add a significant compliance burden to machine learning within the EU, pure and simple,” Burt says. “What I’m concerned about is the current, pretty significant ambiguity in how data-protection authorities are actually going to implement these provisions. When it comes to automated decision-making, the line between GDPR compliance and non-compliance is simply not yet clear.”

That uncertainty has left market participants questioning how they should oversee automated and machine-learning techniques within their own firms.

The senior consultant suggests banks could look to implement an “airgap” of human intervention in their models. However, the level of human involvement required before a model is no longer judged solely automated is unknown, he adds. In the absence of further guidance, firms will have to rely on legal advice when implementing such measures; those measures will then be tested by regulators and a precedent set.
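One possible shape for such an airgap – a sketch under stated assumptions, not a compliance recipe – is to route a configurable fraction of model decisions to a human review queue before they take effect. The review rate, queue and names below are illustrative; whether any given rate amounts to meaningful human intervention is exactly the untested question.

# Hypothetical human-review "airgap": a configurable fraction of automated
# decisions is deferred to a human before taking effect. The review_rate
# and queue are illustrative assumptions, not a legal standard.
import random
from dataclasses import dataclass, field

@dataclass
class AirgapRouter:
    review_rate: float                      # fraction of decisions sent to a human
    human_queue: list = field(default_factory=list)

    def decide(self, applicant_id, model_decision):
        if random.random() < self.review_rate:
            # Defer: a human must confirm or override the model's output.
            self.human_queue.append((applicant_id, model_decision))
            return "pending_human_review"
        return model_decision               # solely automated path

router = AirgapRouter(review_rate=1e-6)     # the "one in a million" check
print(router.decide("applicant-42", "approved"))

At a review rate of one in a million, virtually every decision remains solely automated – which is the consultant’s point about how little such a check may count for.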

“Is a staff check of one in a million enough human intervention? Probably not, but [a precedent will be set] when you go down this untested road of how this is going to fall out in court,” he says. “AI is a massive thing that is not going to be stopped by [the] GDPR – but it certainly will make organisations think about [whether] they do it in the right way, in an ethical and transparent way, and how they do it to the benefit of people and themselves while not harming other people with the decisions being made.”

Editing by Tom Osborn
