Senior management to be responsible for processes based on big data and AI, German regulator says

Maria Nikolova

It is the responsibility of supervised firms to guarantee the explainability of BDAI-based decisions, BaFin says.

There will be no black box excuses for the management of financial services companies in Germany should they seek to shift responsibility for a decision involving big data and artificial intelligence (BDAI) to machines rather than human beings, the Federal Financial Supervisory Authority of Germany (BaFin) stated in a report published earlier today.

The report challenges the traditional approach to algorithms as black boxes. BaFin insists that senior management remains responsible for BDAI-based processes and decisions.

BaFin agrees that BDAI will create additional opportunities for automating standard market processes – from chatbots to liquidity management. Examples include automating the lending business at banks or claims management at insurance companies. Further automation is aimed at increasing efficiency and effectiveness – and thus cutting costs.

However, the regulator notes that this does not mean that responsibility for the results of BDAI-supported processes should be shifted to machines. When designing (partially) automated processes, it is therefore important to ensure that they are embedded in an effective, appropriate and proper business organisation.

Ultimately, responsibility for automated processes has to remain with the senior management of the supervised firm, says BaFin.

According to the regulator, supervised firms are the ones that have to guarantee the explainability/traceability of BDAI-based decisions. Supervisory and regulatory authorities will not accept any models presented as an unexplainable black box.

In the case of a black box, the user has no direct way of finding out why or how the algorithm reached its decision and thus produced the result. This is mainly because the input and output values of neural networks are linked in a highly non-linear and complex way.

According to the regulator, a better understanding of models provides an opportunity to improve the analysis process – for example, the users are then in a position to recognise problems such as overfitting and data bias.
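As a minimal illustration of this point (not drawn from the BaFin report; the dataset and model choice are arbitrary), comparing training and test performance is one simple way a user who can inspect a model's behaviour can spot overfitting:

```python
# Illustrative sketch only: a large gap between training and test accuracy
# is a classic symptom of overfitting. Dataset and model are assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An unconstrained tree can memorise the training data.
model = DecisionTreeClassifier(random_state=0)  # no depth limit
model.fit(X_train, y_train)

train_acc = model.score(X_train, y_train)
test_acc = model.score(X_test, y_test)
print(f"train accuracy: {train_acc:.3f}, test accuracy: {test_acc:.3f}")
```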

As far as the intelligibility of AI systems is concerned, a distinction is made between transparency and explainability. Transparency means that the behaviour of the system is completely comprehensible. This requirement can almost never be fulfilled, however, since many models are necessarily complex. Put otherwise, a black box can hardly be made transparent.

In contrast, explainability means being able to list the major factors influencing a concrete individual decision. This is much easier to achieve technically.

There are two main approaches for the creation of “transparent” systems:

  • An attempt can be made to generate a comprehensible approximation of an arbitrary model. Such an approximation should reproduce the behaviour of the original model with as little deviation as possible while remaining representable in a form humans can understand. The TREPAN algorithm is an example of such an approach (see the surrogate sketch after this list).
  • There are also approaches that generate directly comprehensible models from the data itself, such as decision trees or subgroup discovery. Such models are not always optimal for the learning task, but where transparency matters more than the quality of the result, these methods are the more appropriate choice.
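The surrogate idea behind the first of these approaches can be sketched in a few lines. The example below is illustrative only and far simpler than TREPAN itself; the dataset and both models are assumptions. A shallow decision tree is fitted to the predictions of a more complex model, so that the complex model's behaviour can be approximated in a human-readable form:

```python
# Hypothetical sketch of a global surrogate model (the general principle,
# not the TREPAN algorithm itself): a small decision tree mimics the
# predictions of a complex "black box" model.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.metrics import accuracy_score

data = load_breast_cancer()
X, y = data.data, data.target

# The complex model whose behaviour we want to approximate.
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
bb_predictions = black_box.predict(X)

# The surrogate learns to reproduce the black box's output, not the labels.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, bb_predictions)

# "Fidelity" measures how closely the surrogate tracks the original model.
fidelity = accuracy_score(bb_predictions, surrogate.predict(X))
print(f"fidelity to the black box: {fidelity:.3f}")
print(export_text(surrogate, feature_names=list(data.feature_names)))
```

The printed tree can then be read directly, which is the sense in which the approximation is "comprehensible" even though the underlying model is not.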

The generation of explanations is currently a fast-developing field of research in machine learning. One of the best-known approaches is the LIME algorithm, which fits a simpler local explanation model around the individual case to be explained and similar data points. Other approaches are prototype-based and deliver representative individual cases to explain decisions.
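The local-surrogate idea behind LIME can be sketched from scratch as follows (illustrative only, not the LIME library's own implementation; the dataset, black-box model and perturbation scheme are assumptions): perturb the instance to be explained, query the black box on the perturbed points, weight the points by proximity, and fit a simple weighted linear model whose coefficients serve as the local explanation.

```python
# Minimal from-scratch sketch of a LIME-style local explanation
# (illustrative assumptions throughout; not the LIME library itself).
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Ridge

data = load_breast_cancer()
X, y = data.data, data.target
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

rng = np.random.default_rng(0)
instance = X[0]                                    # the case to be explained
noise = rng.normal(scale=X.std(axis=0) * 0.3, size=(500, X.shape[1]))
neighbours = instance + noise                      # similar, perturbed data points

# Query the black box on the neighbourhood and weight points by proximity.
bb_probs = black_box.predict_proba(neighbours)[:, 1]
distances = np.linalg.norm((neighbours - instance) / X.std(axis=0), axis=1)
weights = np.exp(-(distances ** 2) / 2.0)

# A weighted linear model acts as the local explanation model; its largest
# coefficients are the major factors of influence for this single decision.
local_model = Ridge(alpha=1.0).fit(neighbours, bb_probs, sample_weight=weights)

top = np.argsort(np.abs(local_model.coef_))[::-1][:5]
for i in top:
    print(f"{data.feature_names[i]}: {local_model.coef_[i]:+.4f}")
```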

The German regulator stresses that new approaches could provide at least some insight into how the models work and the reasons for decisions, even with very complex models. This is set to prevent models from being seen and treated purely as black boxes.

A consultation on the detailed report, which raises many important questions around data mining, privacy and innovation, is now open. Feedback is expected no later than September 30, 2018.
