
The Rights Defender calls for vigilance on the use of algorithms by public services


This is the story of a young retiree from Montcel (Savoie) who is struggling to finalize her pension file with her regional fund. She leaves a negative comment on the Services Publics + site to complain and receives a reply a month later: "If you encounter difficulties in your procedures and have not been able to contact us, you can request that we call you back." The message is signed "Sylvie" and followed by the words "This response was 83% generated by AI (artificial intelligence) and verified by an agent".

Exchanges of this type between the administration and its users have become common since the emergence of generative AI two years ago. Tax calculation, social assistance allocation, daycare place allocation… Entire sectors of public services now use algorithms or artificial intelligence to fulfill their missions. These technological innovations, designed with the ambition of responding more quickly and better to the needs of citizens, are expected to multiply in the coming years. But they can pose problems if not properly controlled, warns a report by the Rights Defender (Défenseur des droits) published on Wednesday, November 13.

Read also: Profiling and discrimination: an investigation into the abuses of the family allowance funds' algorithm

A risk of disempowerment

The document calls, in particular, for maintaining human control of automated systems and for "keeping a hand" on important decisions. The authors consider it necessary to ensure that systems respect the law from their design onward and throughout their use, through periodic checks. In particular, machine learning algorithms, which adapt over time, introduce a greater risk of error, bias and discrimination.

Several French and European texts establish safeguards in principle for the use of algorithms, in particular the guarantee of human intervention. But practice reveals certain deviations. The report cites Affelnet, the procedure for assigning students to high school at the end of "troisième" (the final year of middle school). The tool uses various criteria, including academic results, to automatically assign points to each student. However, the Rights Defender reports having received the case of a student whose file appears to have been processed in an entirely automated manner: only scores of 0 appeared in the "evaluations" category of his file, "without the exceptional nature of this data leading the commission to carry out verifications".
