
EU takes major step toward AI regulation as the AI Act moves into final phase

The AI Act would put restrictions on the riskiest uses of AI.

Published on June 14, 2023

Bing, ChatGPT, and SoundHound Chat AI icons on an Android homescreen
Rita El Khoury / Android Authority
  • The European Parliament passed the first draft of the AI Act.
  • The final version of the AI Act is expected to be passed later this year.
  • The AI Act would be one of the first major laws to regulate AI.

As companies dive headfirst into AI development, calls for regulation of the burgeoning technology have grown. So far, the EU has been leading the way with the AI Act, and now the bloc has taken a major step toward turning the AI Act into law.

On Wednesday, the European Parliament passed the first draft of the AI Act. It’s expected that the regulation will be passed sometime later this year after the European Parliament, European Commission, and Council of the EU finish negotiations.

The AI Act is a proposed law focused on the safe development and application of AI systems like ChatGPT and Bard. It puts restrictions on the riskiest uses of the technology, such as facial recognition software. The law would also require AI developers to be more transparent by disclosing the data used to create their programs.

Currently, the draft contains provisions that would prevent AI tools from generating illegal, racist, or biased content. It is also geared toward ensuring that generative AI respects international human rights laws.

If the AI Act is enacted, it would become one of the first major laws to regulate the technology, putting the EU squarely ahead of other countries like the US and China on this front. However, US lawmakers have recently put forth similar bills. One would require companies to be transparent when people interact with AI content. Another would require government agencies to disclose to the public when an agency is using AI.
