The White House proposes an AI Bill of Rights

Image Credits: PBS

This morning, the White House published a Blueprint for an AI Bill of Rights, a document that maps out five main principles guiding “the design, use, and deployment of automated systems to protect the American public in the age of artificial intelligence.”

Informed by “insights from researchers, technologists, advocates, journalists, and policymakers,” the handbook aims to mitigate the harmful consequences of AI applications by strengthening standard practices around safety, privacy, and transparency.

To achieve this, the handbook explains that AI systems must be proven safe and effective through testing and consultation with stakeholders, continuously monitored in production, and designed to protect both communities and individuals from biased decision-making, otherwise known as ‘algorithmic discrimination’.

It also adds that users should have control over how their data is used, be informed of their options in plain language, and be given the choice to opt out of interactions in the event of a system failure.

Because the White House seeks to lead federal agencies ‘by example’, private corporations are not mandated to follow the AI Bill of Rights. The text reads: “This framework is accompanied by From Principles to Practice—a handbook for anyone seeking to incorporate these protections into policy and practice, including detailed steps toward actualizing these principles in the technological design process.”

It continues: “These principles help provide guidance whenever automated systems can meaningfully impact the public’s rights, opportunities, or access to critical needs.”

In the coming months, a number of institutions, such as the Department of Health and Human Services and the Department of Education, are expected to publish further guidance on the use of damaging or dangerous algorithmic technologies in specific settings.
