Combating algorithmic bias in AI

United Nations publishes guidance to combat racial profiling in AI.

The United Nations Committee on the Elimination of Racial Discrimination has published its guidance to combat racial profiling, emphasising, among other issues, the serious risk of algorithmic bias when artificial intelligence (AI) is used in law enforcement.

The Committee on the Elimination of Racial Discrimination (CERD), made up of 18 individual experts, has issued its general recommendation on preventing and combating racial profiling by law enforcement officials.

General recommendations are legal guidance to assist the 182 States parties in fulfilling their obligations under the Convention on the Elimination of All Forms of Racial Discrimination.

CERD notes that the increased use by law enforcement of big data, AI, facial recognition and other new technologies risks deepening racism, racial discrimination and xenophobia, and consequently the violation of many human rights.

“Big data and AI tools may reproduce and reinforce already existing biases and lead to even more discriminatory practices. We are deeply concerned with the particular risks when algorithmic profiling is used for determining the likelihood of criminal activity,” said CERD member Verene Shepherd who led the drafting of the general recommendation.

“For example, historical arrest data about a neighbourhood may reflect racially biased policing practices; and such data will deepen the risk of over-policing in the same neighbourhood, which in turn may lead to more arrests, creating a dangerous feedback loop,” she explained.

“The increasing use of facial recognition and surveillance technologies to track and control specific demographics raises concerns with respect to many human rights, including the rights to privacy, freedom of peaceful assembly and association; freedom of expression and freedom of movement,” Shepherd added.
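The feedback loop Shepherd describes can be made concrete with a small simulation. The sketch below is a hypothetical illustration, not anything from the CERD text: two neighbourhoods are given identical true incident rates, but the historical arrest record is skewed towards one of them, and patrols are naively sent wherever recorded arrests are highest. The recorded gap then widens year after year.

```python
# Illustrative sketch (a hypothetical construction, not from the CERD
# guidance) of the feedback loop described above. Both neighbourhoods
# have the SAME true incident rate, but the historical record is skewed
# towards A. A naive "predictive" allocation sends most patrols to
# wherever recorded arrests are highest, so the initial bias feeds on
# itself.

import random

random.seed(42)

TRUE_RATE = {"A": 0.10, "B": 0.10}  # identical underlying incident rates
arrests = {"A": 120, "B": 80}       # historically biased arrest counts

for year in range(1, 11):
    # Naive policy: 90 of 100 patrol shifts go to the "hot spot", i.e.
    # the neighbourhood with more recorded arrests so far.
    hot = max(arrests, key=arrests.get)
    patrols = {hood: (90 if hood == hot else 10) for hood in arrests}

    for hood, shifts in patrols.items():
        # Arrests scale with police presence, not with the true rate:
        # each shift yields an arrest with probability TRUE_RATE[hood].
        arrests[hood] += sum(random.random() < TRUE_RATE[hood]
                             for _ in range(shifts))

    print(f"year {year:2d}: recorded arrests A={arrests['A']}, B={arrests['B']}")

# The gap widens every year even though the true rates are equal: the
# data measures where police looked, not where incidents occurred.
```

Because the model is trained on where arrests were recorded rather than where incidents actually happened, no amount of additional data of this kind corrects the bias; it only entrenches it.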

Among its recommendations, the Committee stressed that algorithmic profiling systems should be in full compliance with international human rights law. It underscored the importance of transparency in the design and application of algorithmic profiling systems when they are deployed for law enforcement purposes.

“This includes public disclosure of the use of such systems and explanations of how the systems work, what data sets are being used and what measures preventing human rights harms are in place,” the Committee said in its recommendation.

The Committee believes that private companies that develop, sell or operate algorithmic profiling systems for law enforcement purposes also have a responsibility to involve individuals from various sectors, including legal experts, to assess the risks of human rights violations such systems may pose.

The experts also recommended that States should carefully evaluate the human rights impact prior to employing facial recognition technology.

“In addition to being unlawful, racial profiling may also be ineffective and counterproductive as a law enforcement tool,” the Committee warned. “People subjected to discriminatory law enforcement tend to have less trust in the police and, as a result, be less willing to cooperate with them.”

The UN also recommends that collected data should not be misused. States should regularly collect and monitor disaggregated quantitative and qualitative data on relevant law enforcement practices, such as identity checks, traffic stops or border searches. This data should include information on the prohibited grounds of racial discrimination, including its intersecting forms, as well as the reason for the law enforcement action and the outcome of the encounter.

The anonymised statistics generated by such practices should be made available to the public and discussed with local communities. Such data should be collected in accordance with human rights standards and principles, such as data protection regulations and privacy guarantees.

States should also guard against the automated processing of personal data to evaluate certain personal aspects of a natural person, in particular to analyse or predict that person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.
