A bipartisan group of U.S. senators introduced the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act).
According to The Verge, the proposed legislation aims to authenticate and detect AI-generated content, providing a shield for journalists and artists against unauthorized use of their work.
The COPIED Act, if passed, would task the National Institute of Standards and Technology (NIST) with developing standards and guidelines to verify the origin of content and identify synthetic content through methods such as watermarking.
Additionally, the bill mandates security measures to prevent tampering and requires AI tools used in creative or journalistic endeavors to include origin information that cannot be removed. Such content would also be barred from being used to train AI models without explicit permission.
Industry Leaders, Lawmakers Back COPIED Act
Senate Commerce Committee Chair Maria Cantwell (D-Wash.), alongside Senate AI Working Group member Martin Heinrich (D-N.M.) and Commerce Committee member Marsha Blackburn (R-Tenn.), spearheaded the bill. It is part of a broader effort in the Senate to understand and regulate AI technologies.
Senate Majority Leader Chuck Schumer (D-N.Y.) previously emphasized the importance of creating an AI regulatory framework, with specific laws being deliberated in individual committees.
Several organizations representing content creators have expressed strong support for the COPIED Act. The bill has garnered endorsements from SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance, and the Artist Rights Alliance.
Duncan Crabtree-Ireland, SAG-AFTRA’s national executive director and chief negotiator, stressed the urgency of the bill: “The capacity of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members.”
“We need a fully transparent and accountable supply chain for generative Artificial Intelligence and the content it creates in order to protect everyone's basic right to control the use of their face, voice, and persona,” he added.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.