UN warns of artificial intelligence risks to civil liberties


The UN High Commissioner for Human Rights, Michelle Bachelet, called for a moratorium on the development of artificial intelligence systems that threaten civil liberties, such as systems that decide whether or not a person gets a job.

In an address to the UN Human Rights Council, where a report on the impact of these new technologies on fundamental freedoms was presented on September 15, Bachelet asked that the development of some of them be halted "until adequate safeguards have been put in place."

“We cannot continue to react belatedly to the effects of AI, nor allow it to be used in an unlimited manner, without borders or supervision, and then face its almost inevitable consequences on human rights,” said the high commissioner.

Automatic profiling

The report studied the use of AI in automated profiling, decision-making, and other machine learning technologies, which, it concludes, can violate the right to privacy as well as rights related to health, education, freedom of movement, and freedom of expression.

According to the document, there have already been cases in which people were "unfairly treated" because of the use of AI, for example by being denied social security benefits or being detained as a result of errors in facial recognition systems (which are highly developed and already widely used in China).

"The data that feed and guide AI systems can be deficient, discriminatory, obsolete or irrelevant," the document concludes, noting that these skewed foundations "can lead to discriminatory decisions, posing an even greater risk to groups that are already marginalized."

AI can decide who gets a job

The former Chilean president recalled that "artificial intelligence now reaches almost every corner of our lives" and can determine "who receives public services, or who has the opportunity to obtain a job."

The report presented today before the Human Rights Council also calls on the companies and states developing this technology to increase transparency in their research.
