Presented in April 2021, the draft European regulation on artificial intelligence provides that ‘high-risk’ systems – such as facial recognition devices – will have to undergo audits before they can be marketed in the single market. The objective is to verify that they comply with the rules in force (the principle of non-discrimination, respect for privacy, etc.).
Understanding in order to audit
The National Commission for Computing and Liberties (Cnil) could be the authority in charge of monitoring companies and sanctioning them in the event of breaches. To carry out these future missions, however, it will have to be able to examine how algorithms work. To prepare for this, it worked with Clément Henin, currently a referendary adviser in extraordinary service at the Court of Auditors and a researcher in the Privatics project-team of the National Institute for Research in Digital Sciences and Technologies (Inria). He is the author of a thesis, published in 2021, on the explanation and justification of “algorithmic decision systems”, carried out under the supervision of Daniel Le Métayer and Claude Castelluccia.
It is important to note that Clément Henin’s work does not concern artificial intelligence alone. He also regrets that the European Commission chose this term to name its regulation. “There are other algorithms that can cause problems even if they don’t use AI,” he explains to The Digital Factory. He defines AI as “an algorithm whose behavior is not entirely programmed by a human but which is capable of some form of learning, from data (machine learning) or from experience (reinforcement learning)”. To this definition he contrasts that of an algorithmic decision system: “the use of an algorithm in a decision process” in which the human has a much more important place.
The Cnil tested two tools: IBEX and Algocate
As part of the collaboration with the Cnil, about thirty agents in charge of inspections tested two tools developed by Clément Henin: IBEX (for Interactive Black-box EXplanation) and Algocate (a blend of “algorithm” and “advocate”). The first is a black-box explanation system, that is, one that operates from inputs and outputs alone, without access to the system’s code. It lets the user – layperson or expert – choose the type of explanation they wish to obtain, in the form of a diagram, a decision rule or a counterfactual. To construct an explanation, IBEX begins by selecting a set of inputs; the links between these inputs and the corresponding outputs are then analyzed to generate the final explanation.
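To make the principle concrete, here is a minimal sketch of a black-box counterfactual search, in the spirit of what such a tool does (the credit model, feature names and candidate values are invented for illustration; this is not IBEX’s actual code). The audited system is only ever queried through its prediction function:

```python
import itertools

def counterfactual(predict, applicant, candidate_values):
    """Search for a small change of inputs that flips a black-box decision.

    predict:          opaque function from a feature dict to a decision label
    applicant:        the feature dict that received the contested decision
    candidate_values: alternative values to try for each feature
    """
    original = predict(applicant)
    features = list(candidate_values)
    # Try changing one feature at a time, then pairs, and so on, so that
    # the first counterfactual found changes as little as possible.
    for size in range(1, len(features) + 1):
        for subset in itertools.combinations(features, size):
            for values in itertools.product(*(candidate_values[f] for f in subset)):
                variant = dict(applicant, **dict(zip(subset, values)))
                if predict(variant) != original:
                    return variant
    return None  # no counterfactual found within the candidate grid

# Invented stand-in for the audited credit-scoring system (queried as a black box).
def credit_model(x):
    return "granted" if x["income"] > 30000 and x["amount"] < 10000 else "refused"

refused_applicant = {"income": 25000, "amount": 8000}
print(counterfactual(credit_model, refused_applicant,
                     {"income": [35000], "amount": [5000]}))
# -> {'income': 35000, 'amount': 8000}: with a higher income, credit would be granted
```

The design point is that the auditor never needs the vendor’s source code: the system under audit remains a black box, characterized entirely by the input-output pairs it produces.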
For legal reasons, a fictitious case was chosen: that of a company offering consumer credit. From a data set, the teams built an algorithm to be audited, on which IBEX was tested. The experiment was conducted on an online platform, where Cnil agents had to answer questions about fictitious algorithmic decisions in order to test their understanding of the system.
Confronting decisions with societal norms
For its part, Algocate works according to “an affirmation system”. Unlike the explanations offered by IBEX, whose purpose is to help understand how the algorithm works, justifications attempt to convince the user that the decision is good. Clément Henin explains that to obtain a justification, a decision must first be defended, called “the expected decision”. The user must therefore give the reasons that lead them to believe that the decision produced by the algorithmic system is bad. In this case, they can contest the refusal of consumer credit by arguing that this is their first credit application and that the amount is low. Once formalized, the challenge is analyzed by Algocate in light of the company’s norms, for example the principle of non-discrimination.
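The justification step can likewise be sketched in a few lines (the norms, field names and rule forms below are invented for the fictitious credit case; the article does not detail Algocate’s actual rule base). A challenge is formalized as an expected decision plus reasons, then confronted with explicit norms:

```python
def analyze_challenge(applicant, expected_decision, reasons, norms):
    """Confront a contested decision with the organisation's stated norms.

    Returns the names of the norms (if any) that support the user's
    "expected decision", i.e. the decision they believe should have been made.
    """
    supporting = [name for name, rule in norms.items()
                  if rule(applicant, expected_decision, reasons)]
    if supporting:
        return "Challenge upheld by norms: " + ", ".join(supporting)
    return "Challenge rejected: no norm supports the expected decision."

# Invented norms for the fictitious consumer-credit case.
norms = {
    "first_application_leniency": lambda a, d, r:
        d == "granted" and "first credit application" in r and a["defaults"] == 0,
    "low_amount_leniency": lambda a, d, r:
        d == "granted" and a["amount"] < 5000,
}

applicant = {"amount": 8000, "defaults": 0}
print(analyze_challenge(applicant, "granted",
                        ["first credit application", "low amount"], norms))
# -> Challenge upheld by norms: first_application_leniency
```

Each norm that supports the expected decision gives the user an argument against the system’s output; if none does, the original decision stands as justified.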
Clément Henin tells us that, on the whole, these experiments were fruitful: on his side, by allowing him to test IBEX and Algocate in the field, and on the Cnil’s side, by improving the skills of its agents.