Examining the Impact of AI-Generated Content on Creative Industries and Society
Generative Artificial Intelligence (AI) is a type of AI technology that creates entirely new, original content using algorithms and machine learning (ML) techniques trained on large datasets.
What does it mean for creative industries?
The creative industry is one of the earliest to harness the potential of generative AI. It has also seen how these technologies can harm people and their livelihoods, and potentially threaten the continuance of whole sectors of the economy.
AI has enabled further creativity, the amplification of works, and the enhancement of people’s experiences. Generative AI technologies are today most visible in music composition, literary composition, and image creation, but their expansion into cinematic and more immersive artistic experiences, such as gaming, is expected soon.
In August 2022, FN Meka became the first AI-generated artist to be signed to a major record label, Capitol Records. With over 10 million TikTok followers, the virtual rapper’s character and music, except his voice, are entirely AI-generated; the music and lyrics were generated from input data drawn from video games and social media. However, shortly after the signing, FN Meka was dropped over the project’s racial stereotyping and use of racial slurs.
Alongside questions regarding AI-generated music and ownership rights, this brought to the public’s attention further issues: who is responsible and accountable when AI generates illegal and offensive speech? The human who inputs the prompts, or the human who created the technology?
A few months ago, author A. J. Jacobs’s blog post on ‘rip-off’ counterfeits of his novel ‘The Puzzler’ went viral. He describes finding several AI-generated versions of his work, which were largely incomprehensible and inarticulate, and which he described as coming from “one of the shadiest corners of the publishing industry”.
Jacobs’s experience is far from unique; AI-generated books are widespread on Amazon and other online services, and editors are overwhelmed by the flood of AI-generated submissions. With ChatGPT and similar tools widely and freely available, anyone, anywhere can input simple instructions based on a recognisable published work, generate an article, novel, play, or more, and publish and sell the counterfeit piece online within a few hours.
In addition to questions concerning copyright and counterfeiting, real human authors are now competing with these fake works not only for the visibility of their works online, but also for their ability to get published.
In 2022, Jason Allen won first place in the Digital Art section of the Colorado State Fair’s art contest for his piece ‘Théâtre D’opéra Spatial’, created using Midjourney’s AI image-generating programme. Allen inputted a combination of words and phrases, chose an image from over 900 outputs generated by the programme, and printed the final product on canvas.
Beyond the significant backlash Allen received from fellow artists, and claims that the piece includes recognisable imagery from existing paintings and photographs, Allen has thus far failed to obtain copyright ownership of the piece.
What are the Regulatory Implications?
Many of the legal and regulatory questions raised above are not new to the creative sector. However, it is clear that generative AI not only amplifies existing challenges but has also brought to light new legal questions for which current regulatory systems are unprepared.
When looking at generative AI from a legal perspective, we can consider two distinct sets of challenges: those related to input content and those related to output content.
By nature, generative AI systems depend on access to mass data. The more data inputted into training algorithms, the better the system learns; the better the quality of the data inputted, the better the quality of the output. If the data fed into these systems contains fake information, misinformation, bias, or illicit content, the outputs of these systems will contain the same.
While intellectual property rights vary across jurisdictions, creators and copyright owners largely retain control over how their content is used. Creators can choose to make their work freely available online, or to subject its use to licensing agreements. Many generative AI systems today crawl the web gathering content for their input data with no regard for such IP law. Only legitimate content, whether publicly available or accessed via licensing agreements, should be used as input data for generative AI systems.
More novel regulatory questions arise regarding the outputs of generative AI systems. Generally, a certain degree of human input and originality is necessary to be granted authorship or copyright protection of a work; and as AI systems lack legal personality, they can neither hold such rights nor be held liable.
In the case of an AI-generated piece of work, determining who will gain this copyright will ultimately depend on the type of AI system used, how it was designed by the programmer, and how much input was given by the person giving instructions to the system.
While human authors, journalists, and reporters adhere to moral codes and regulation when creating their works, AI systems do not inherently comply with existing legal frameworks; AI-generated works must therefore be visibly distinguished and labelled.
Policymakers globally are considering how to protect human creators while enabling continued technological development and innovation in other fields.
In 2021, the US Federal Trade Commission (FTC) issued guidelines for companies on generative AI, outlining transparency requirements. The UK’s Policy Paper on AI regulation, published in March 2023, foresees regulatory clarifications regarding generative AI and IP rights. The EU AI Act is expected to be the first global regulation on AI, with policymakers deep in negotiations over how best to bring generative AI within the scope of the legislation.
At a minimum, generative AI systems must be subject to transparency and accountability obligations. IP rights should be upheld, and new exemptions introduced under the guise of promoting innovation should not come at the expense of the rights and compensation of creators and artists. Technologies can continue to develop and support society while protecting human creators by relying on licensed data obtained with consent.
The economic viability of the sector and the livelihoods of creators rely on the continued value placed on human creativity and its protection in our laws. If you want to learn how the EU plans to regulate generative AI and its impact on the creative industry, you can attend our webinar “Generative AI & the Creative Sector: The EU’s AI Act” on April 27, 2023, 14:00 – 15:00 BST. Access Partnership is closely monitoring regulatory developments related to the EU AI Act. For more information, you can contact Lydia Dettling at firstname.lastname@example.org.