📣 In this issue:
Chakras, Wall Reliefs, Pink Floyd, Stephen Hawking, fake Elon, and the Martian Railway Corporation
📰 News-To-Know
1
Entertainment Weekly reports on Damián Gaume, an Argentinian-Australian 3D artist who used artificial intelligence to win the "Any Colour You Like" category of Pink Floyd's "Dark Side of the Moon" animated video competition. The contest marked the album's 50th anniversary, with a winner chosen for each of the album's 10 songs. Despite Gaume's creative use of AI, his win sparked controversy, particularly among those who preferred traditional animation methods.
Context: The competition was judged by a panel that included Pink Floyd's drummer, Nick Mason, and received over 900 entries from around the world. Gaume's AI-generated entry stood out against traditionally animated submissions, stirring debate about the role and acceptance of AI in creative industries.
Significance: The story touches on the evolving intersection of technology and art, specifically how AI tools are being integrated into creative processes traditionally driven by manual human effort. The controversy around Gaume's win reflects broader debates within the arts community about the authenticity and value of AI-generated art.
2
The developer of the digital trading card game Champions of Otherworldly Magic has hired an 'AI artist' to create card art for the game. The artist, described as highly skilled in digital art and absent from social media, is paid $90,000 annually ($15,000 per month for 10 hours of work) and has produced hundreds of pieces of artwork significantly faster than traditional artists could. The artwork supports the game's revenue model of NFTs and traditional sales, with Champions TCG reporting roughly $500K in card sales so far.
Context: Champions TCG emphasizes the efficiency and cost-effectiveness of using generative AI for art production, claiming that no traditional artist team could match the quality delivered at this speed and price. The AI-generated art is described as free of the errors typical of early AI outputs, such as extra fingers or generic designs, and each piece is said to be fine-tuned by hand to ensure quality. The studio's open embrace of AI in game art design reflects a shifting landscape in digital art production and broader debates at the intersection of AI technology and creative industries.
Significance: This development is noteworthy as it represents a concrete example of how AI is transforming the economics and production methods within the gaming and broader entertainment industries.
3
Mashable reports on a series of fake YouTube livestreams, timed to the solar eclipse, that purported to feature Elon Musk and SpaceX in order to promote cryptocurrency scams. The livestreams, which attracted hundreds of thousands of viewers, employ AI-generated video of Elon Musk promoting a fake cryptocurrency investment opportunity. The scammers manipulate channel names to mimic official SpaceX channels and embed QR codes in the videos that direct viewers to fraudulent investment sites. Despite YouTube's efforts to remove these streams, they remain a significant problem, exploiting popular topics like Musk and SpaceX to defraud viewers.
Context: The scammers exploit high-interest events such as the solar eclipse and popular figures like Elon Musk to create a semblance of legitimacy and lure a large audience. The fake streams use sophisticated AI tools to generate realistic video and voiceovers of Musk. The case reflects a growing trend of using AI and deepfake technology in scams, which poses a significant challenge to platforms like YouTube in policing content and protecting users.
Significance: This development is critical as it highlights the potential misuse of AI and digital media to conduct large-scale scams, which can undermine trust in digital content and platforms. The situation calls for more advanced and proactive measures from tech companies to detect and prevent such fraudulent activities.
4
The Guardian discusses a new bill introduced by California Democratic Congressman Adam Schiff aimed at regulating the use of copyrighted content by artificial intelligence (AI) companies. Named the Generative AI Copyright Disclosure Act, the bill would require these companies to disclose any copyrighted works used in their AI training datasets to the Register of Copyrights before launching new systems. The disclosure must be filed at least 30 days prior to the public debut of an AI tool, with failure to comply resulting in a financial penalty. The move is part of a broader effort to balance the potential benefits of AI with the need for ethical standards and protections for intellectual property.
Context: This legislative effort comes in response to ongoing concerns about how AI firms, such as OpenAI, use copyrighted creative works like music, movies, books, and visual art to develop their products. The issue has sparked numerous lawsuits and government investigations, questioning whether these companies are violating copyright laws by leveraging protected works without authorization to train their generative AI models, which produce text, images, and other media in response to user prompts.
Significance: The bill highlights a critical junction in the intersection of technology and intellectual property law. It aims to make the operations of AI companies more transparent by forcing them to publicly acknowledge the copyrighted materials in their training datasets. This could significantly affect how generative AI tools are developed and used, potentially curtailing their capabilities if access to extensive copyrighted content is restricted.