In an open letter, several media organizations are urging lawmakers around the world to consider regulations protecting copyright in the data used to train generative AI models.
The proposed rules call for transparency about the training data that is used and for permission from rights holders before their work is used to train AI.
They are also asking AI companies to engage with media organizations on issues such as identifying AI-generated content and ensuring that AI does not introduce bias or misinformation.
Signatories include Agence France-Presse, the European Pressphoto Agency, the European Publishers' Council, Gannett, Getty Images, the National Press Photographers Association, the National Writers Union, News Media Alliance, The Associated Press, and The Authors Guild.
The groups argue that AI models trained on media content disseminate information without crediting the original creators. This undermines the business models of media companies, which depend on readership, viewership, licensing fees, and advertising revenue.
Beyond copyright infringement, they say, the practice also threatens media outlets' ability to do their job and provide reliable information to the public.
The letter follows Google's demonstration of its AI tool Genesis to major newspapers, including The New York Times and The Washington Post. These and other outlets, however, found that the AI-generated stories contained errors.
News organizations are not alone in their concerns about AI's use of copyrighted material. The Senate has held hearings on the issue, and artists have filed a lawsuit alleging that AI companies infringed their rights. Comedian Sarah Silverman and two authors have also sued OpenAI for copyright infringement.
The signatories say they believe AI could benefit both companies and the public, but they want to take part in discussions about how to deploy it responsibly.
According to one report, some of the signatories are already licensing their content to AI companies for training. The Associated Press, for example, has agreed to let OpenAI use part of its archive and is experimenting with AI for news writing.