Partial Use of AI in Products Requires Disclosure Under New Law
A new AI regulation law mandates clear disclosure of artificial intelligence use in a wide range of products, including video games. According to the National Assembly Research Service's interpretation, video games fall under the AI Framework Act, which takes effect in January next year. This means developers must notify users if AI technology was involved in their product's creation.
Under the law, video game makers who use AI during development are required to implement a risk assessment and management system. AI has long been part of gaming, from adjusting difficulty levels to controlling nonplayer characters. A recent example is Uncover the Smoking Gun, which employs conversational AI for dynamic storytelling.
Disclosure Extends Beyond Gameplay AI
The requirement to disclose AI use isn't limited to gameplay mechanics. It also covers AI-generated assets such as images, sounds, or 3D models. Even if AI contributes partially to the creative content, the product must be identified as AI-based. The parliamentary think tank clarified that titles using AI models for text, images, or audio could be categorized as AI products, and their publishers considered part of the AI industry.
However, the law's application depends on the extent of generative AI use and the degree of human input involved. This allows some flexibility in interpretation based on the product's specific development process.
Balancing Creator Rights and Industry Growth
The increasing use of AI in creative fields like sound, images, and video is blurring traditional boundaries. This raises concerns about protecting creators' rights while supporting innovation. Legislators acknowledge the need for policies that safeguard creators without stifling industry progress.
The Basic Act on the Development of Artificial Intelligence and the Creation of a Foundation for Trust, passed in December and enacted in January, sets out legal guidelines for AI use in creative industries including music, movies, and cartoons. It aims to provide clarity and ensure responsible AI adoption.
For example, the Korea Music Copyright Association now requires songwriters to confirm that AI was not used in composing their songs; its position is that AI-created songs may not qualify for copyright protection.
Implications for Legal Professionals
- Ensure clients in creative sectors understand disclosure obligations under the AI Framework Act.
- Advise on implementing risk assessment and management systems when AI is involved in product development.
- Monitor the evolving interpretation of partial AI use in products to guide compliance strategies.
- Assess copyright risks related to AI-generated creative content, especially in music and media.
The AI Framework Act is set for discussion in a National Assembly plenary session on May 1, which could further clarify enforcement and compliance requirements.
For legal professionals interested in expanding AI-related expertise, exploring targeted AI courses can provide valuable insights into regulatory frameworks and technology applications. Resources such as Complete AI Training's courses by job offer practical knowledge tailored to legal and compliance roles.