Wednesday, October 16, 2024 - 7:51 pm

The associations refer the matter to the Council of State.

Does the National Family Allowance Fund (CNAF) classify its beneficiaries in violation of the law? Fifteen associations announced on Wednesday, October 16, that they were referring the matter to the Council of State to ask it to put an end to the system the CNAF has established to target its checks.

Since 2010, the social-security body has used algorithms to identify the beneficiaries who should be checked as a priority, because their file carries a greater risk of error. Each file is assigned a score from 0 to 10 based on personal data: the higher the score, the greater the chances of being inspected. But instead of focusing on suspicious behavior, the system relies on recipients' personal characteristics, as Le Monde showed in an investigation published in 2023, based in particular on an analysis of the computer code used by the CNAF. It is, for example, by looking at the age of the children, the level of income or the number of recent moves that the organization judges whether a household is “at risk” or not.
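The CNAF's actual weightings have not been published in reusable form. Purely as an illustration of how attribute-based scoring of this kind works, the sketch below combines personal characteristics into a 0-to-10 score; every criterion, weight and threshold in it is hypothetical and none is taken from the CNAF's code:

```python
# Hypothetical illustration of an attribute-based risk score.
# None of these criteria, weights or thresholds come from the CNAF's code.

def risk_score(household: dict) -> float:
    """Return a score from 0 to 10; higher means checked sooner."""
    score = 0.0
    if household["monthly_income_eur"] < 1200:   # low income (invented threshold)
        score += 3.0
    if household["is_single_parent"]:            # single-parent criterion
        score += 2.5
    if household["recent_moves"] >= 2:           # frequent recent moves
        score += 2.0
    if any(age < 6 for age in household["children_ages"]):  # young children
        score += 1.5
    return min(score, 10.0)

# A low-income single mother with a young child scores high
# without any suspicious behavior being observed.
example = {
    "monthly_income_eur": 1000,
    "is_single_parent": True,
    "recent_moves": 2,
    "children_ages": [4],
}
print(risk_score(example))  # 9.0 -> flagged as a priority for checks
```

The structural point the example makes is the one at issue in the case: the inputs describe who the person is, not what they have done.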

A tool accused of discriminating against the most vulnerable

This way of operating is considered discriminatory by the associations behind the complaint, among them La Quadrature du Net, Amnesty International France and the collective Changer de cap. They first wrote to the CNAF in July, asking it to abandon its risk score. Having received no response, they decided to refer the matter to the Council of State. Their petition, which Le Monde was able to consult, focuses largely on the alleged unfairness of the algorithm.

Certain calculation rules linked to the age of beneficiaries or their family members are presented as directly discriminatory. Others are likely to cause indirect discrimination. That is the case, for example, of the criterion targeting single-parent families, which mechanically results in women being checked disproportionately, since many of them raise their children alone. Because women, as well as people on low or precarious incomes and young people, are overrepresented in the checks, they can be considered to be discriminated against, the associations argue.

The CNAF, for its part, has always denied any discrimination. The system, according to the organization, merely looks for the profiles most likely to contain errors. And while the risk score is used to direct the checks, inspectors then process each case manually: beneficiaries whose file is in order are not penalized, the administration maintains.

Also read: “The opacity of algorithms encourages abuses within public establishments”

Possible data protection violations

An argument refuted by Katia Roux, advocacy officer at Amnesty International France: “The question is whether or not this tool respects human rights in the way it operates. Discrimination must be prohibited. This is all the more important to us because systems of this type are being developed in many other administrations, both in France and elsewhere in Europe.”

The associations also accuse the CNAF of violating the General Data Protection Regulation (GDPR). First, because the amount of data collected would be disproportionate to the desired objective. The system combines information on more than 32 million people to carry out around 90,000 on-site checks each year, according to CNAF figures.
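Taken at face value, those two figures imply that roughly 90,000 / 32,000,000 ≈ 0.28% of the people whose data feed the score, fewer than three in a thousand, are actually checked on site in a given year.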

The risk-scoring system could also be contrary to Article 22 of the GDPR, which in principle prohibits fully automated individual decisions based on the processing of personal data. Notably, a ruling by the Court of Justice of the European Union of December 7, 2023 found that a scoring system used by a German company to approve or reject real-estate loan applications was contrary to the GDPR. For the associations, the same logic could be applied to the CNAF's system.

Lack of transparency on the part of the CNAF

The CNAF has so far sought to play down criticism of its risk score. “We have nothing to be ashamed of, nor anything to apologize for, in the way we use these tools,” wrote the organization's director, Nicolas Grivel, in an internal message sent in December 2023, after Le Monde's investigation.


Grivel also promised that a review of the CNAF's use of algorithms, involving various specialists, would deliver “initial conclusions by spring 2024.” According to our information, the issue was discussed at a meeting of the organization's board of directors at the beginning of the summer. “The idea was above all to justify the use of this system and to discuss consultation projects for the future. There wasn't much that was concrete,” says Joël Raffard, the CGT's representative on the board.

While the CNAF has tried to avoid the debate until now, the referral to the Council of State could force it to be more transparent. In particular, it may have to explain in more detail why this or that criterion was chosen for calculating its risk score. The associations also expect the organization to publish precise statistics on the populations checked on the basis of the algorithm, to establish who is and is not being targeted.

Read our investigation again: “Abuses of the family allowance funds' algorithm”
