California Governor Gavin Newsom Signs Tough New AI Regulations to Combat Deepfakes and Protect Actors' Likeness
In a groundbreaking move, California Governor Gavin Newsom has signed some of the toughest laws in America to regulate the artificial intelligence sector. These new laws are set to crack down on AI deepfakes that pose a threat to elections and protect Hollywood actors from unauthorized use of their likeness.
Governor Newsom's office stated, "Home to the majority of the world's leading AI companies, California is working to harness these transformative technologies to address pressing challenges while also examining the risks they present."
One of the key laws, AB 2655, requires large online platforms such as Facebook to remove or label AI deepfakes related to elections and to establish channels for reporting such content. Candidates and elected officials can take legal action if platforms fail to comply.
Another law, AB 2355, requires transparency in AI-generated political advertisements. This means that misleading posts, such as AI deepfakes of celebrities endorsing politicians, may no longer go unnoticed. The FCC has also proposed a similar disclosure rule at the national level.
Furthermore, two additional laws focus on California's media industry. AB 2602 requires studios to obtain permission from actors before creating AI-generated replicas of their voice or likeness. Meanwhile, AB 1836 prohibits the creation of digital replicas of deceased performers without consent from their estates.
Governor Newsom is currently reviewing other AI-related bills, including SB 1047, which has sparked controversy. Critics argue that the bill could stifle innovation in the open-source community. Newsom has two weeks to make a decision on this bill.
In conclusion, California's new AI regulations aim to safeguard elections, protect actors' rights, and promote transparency in online content. Staying informed about these developments will help individuals understand how the rules may shape the content they see and the rights they hold over their own likeness.