YouTube has announced a plan for handling the ways artificial intelligence (AI) is reshaping the music industry. As a video platform at the center of the music ecosystem, YouTube sits between musicians, record labels, and the other folks who hold rights to music.

YouTube sees real creative potential in AI for music — the company believes it can make music even more creative. But here's the kicker: YouTube also wants to make sure that artists' work and creations stay protected.
To start, YouTube is teaming up with artists, songwriters, and producers from the music industry to create the YouTube Music AI Incubator, which will explore the best ways to use AI in music.
Its first partner is Universal Music Group (UMG) and its lineup of talent, including well-known names like Anitta, Björn Ulvaeus, Don Was, Juanes, Louis Bell, Max Richter, Rodney Jerkins, Rosanne Cash, Ryan Tedder, Yo Gotti, and the estate of Frank Sinatra.
Unlike YouTube, though, UMG has been more cautious about AI. This year, it asked music streaming services like Spotify not to let AI companies use its music for training, and it took action against AI-made YouTube videos that used its artists' work. UMG also had a popular AI song that mimicked the styles of Drake and The Weeknd removed from Spotify and Apple Music.
UMG's main concern, shared by others in the creative field, is that artists' work gets used to train AI models, which then generate new content without the creators' permission or payment. That's why UMG teamed up with YouTube: to find a way to make sure creators are compensated.
YouTube mentions its past experiences with similar challenges. It has invested a lot in systems that balance copyright holders' interests with those of the creative community on YouTube.
They point to the Content ID system, which pays rightsholders when their content is used on the platform, and suggest a similar approach might work for AI music — at least for music partners who agree to join.
YouTube also talks about trust and safety. It already has rules against misleading content and plans to apply similar standards to AI-generated content to prevent misuse. Notably, YouTube intends to use AI itself to spot this kind of content.
YouTube says it will share more details about the technology it's using, how creators can earn money from it, and the rules it's putting in place. So in the near future, we should learn exactly how this AI-powered music system will work.
Sources: blog.youtube / support.google.com