For almost two years, the interdisciplinary working group Algorithm Monitoring of the Initiative D21 addressed questions relating to responsibility and control, non-discrimination, transparency and traceability, as well as sustainability. Lawyers, philosophers, economists and IT experts came together to share their knowledge and discuss relevant topics. On 20 November 2019, the Initiative D21, together with KPMG, presented nine guidelines, a condensed version of the three ‘Essays on Digital Ethics’ created by the group.

During an evening event, around 100 participants from politics, administration, business and academia discussed the opportunities, challenges and untapped potentials of monitoring algorithmic systems. “Algorithmic systems” because it’s not just about the algorithm itself, but also about data and the people who decide, create, review, and use it – a complex system that needs to be looked at in its entirety to make the results of the algorithms fairer.

Irina Eckardt presents the working group Algorithm Monitoring

With the guidelines, D21 also aims to make it easier to recognise biases that have entered the technology. There is “no algorithmic system without human perceptions and decisions,” explained Dr. Irina Eckardt of KPMG, head of the working group Algorithm Monitoring, at the opening of the event. Social, cultural-historical and economic backgrounds guide human decisions, and these factors are unconsciously involved in every development phase of algorithmic systems.

Previously hidden subjective evaluations could become more visible in algorithmic structures, which would help to put them to the test. In any case, clear and binding guidelines on how to deal with biases are needed.

Irina Eckardt presents guideline 2 on bias

These and other findings, condensed into the guidelines, were evaluated and discussed in a marketplace format moderated by participants of the working group.

Discussion of guidelines for monitoring algorithmic systems
Discussion of AlgoMon Guidelines

Afterwards the practical use and possible implementation measures were discussed at the panel with Prof. Christiane Wendehorst, Co-Chair of the Data Ethics Commission, Anke Domscheit-Berg, MdB and member of the Enquete-Commission KI, Iris Plöger, member of the EU High-Level Expert Group on Artificial Intelligence, and Lena-Sophie Müller, Managing Director of the Initiative D21.

AlgoMon panel discussion

Prof. Wendehorst noted that the diverse guidelines stimulated the debate and suggested that more concrete regulations should follow. Domscheit-Berg also wanted to move from “meta-recommendations” and sector-specific instructions for algorithmic systems to binding implementation. The difficulty, however, is that agreement on principles does not yet mean agreement on details, something she sees again and again in the Enquete Commission. Legally binding steps should be taken, but in any case there must be European standards and regulations, and under no circumstances should we fall back to national levels, Plöger added.

Domscheit-Berg at the AlgoMon panel discussion
Prof. Wendehorst at the panel discussion

The panel agreed that it is time for politics to take action. The guidelines for monitoring algorithmic systems of the Initiative D21 should be used to create a wider public discussion. In addition to regulation, self-empowerment is also needed: becoming aware not only of the risks, but also of the potentials of AI.

Audience at the AlgoMon discussion

We would like to thank the participants of the working group Algorithm Monitoring and KPMG for their support.

Working group Algorithm Monitoring

Essays on Digital Ethics

Bias in algorithmic systems

English cover of Bias in algorithmic systems

Transparency and explainability of algorithmic systems

AlgoMon Transparency Cover

Responsibility for algorithmic systems


Guidelines for monitoring algorithmic systems


Illustrations for monitoring algorithmic systems

Impressions of the event

About Initiative D21