South Korea's AI Action Plan Faces Creator Backlash: What Government Officials Need to Weigh Now
January 15, 2026 - South Korean creator and copyright groups have issued a joint statement rejecting the Korea AI Action Plan, warning it could normalize AI training on copyrighted works without prior permission and, effectively, without payment.
Their core objection: a "use first, pay later" direction that weakens control over creative works and threatens the long-term health of Korea's cultural industries.
The flashpoint: Action Plan No. 32
Action Plan No. 32 calls for "activating the ecosystem for the use and distribution of copyrighted works for AI training and evaluation."
It recommends that the Culture Ministry, with other ministries, prepare by Q2 either amendments to the AI Basic Act or a separate AI Special Act to allow training on copyrighted works "without legal uncertainty."
Why creators are pushing back
- They argue the plan stretches fair use to serve private commercial interests and misrepresents global trends as moving toward broader exemptions.
- They warn the policy shifts enforcement burdens onto individual creators who lack resources to police large-scale AI training.
- They criticize "opt-out" protections that depend on machine-readable technical measures, calling them unrealistic for most creators.
- They fear once unpaid or low-cost training is normalized, restoring meaningful compensation will be difficult.
They've called this "a declaration that the government is abandoning the sustainability of Korea's cultural industry" and are urging a fundamental policy review.
Context for government
Promoting AI remains a key priority for the Lee Jae Myung administration. The Presidential Council on National Artificial Intelligence Strategy, launched Sept. 8, is tasked with coordinating the national agenda.
The draft Korea AI Action Plan, unveiled Dec. 15, includes 98 action items, spanning computing infrastructure, AI semiconductors, sectoral adoption, and reforms on copyrighted works for AI training and evaluation. Public feedback closed Jan. 4.
Policy risks to consider
- Trust and legitimacy: Aggressive training exemptions could damage credibility with creators and cultural industries.
- Economic balance: Short-term gains for AI firms may come at the expense of Korea's content exports and creative jobs.
- Legal exposure: Broad exemptions risk litigation and international friction if they outpace comparable jurisdictions.
- Administrative burden: "Opt-out" regimes that rely on technical measures may be unworkable for most rights holders.
What "opt-out" looks like elsewhere
In the EU, the copyright framework allows text-and-data mining for commercial use unless rights holders reserve their rights in a machine-readable way (Article 4, DSM Directive).
That model has been criticized for favoring large platforms and leaving small creators with little practical defense. See Directive (EU) 2019/790 for the full text.
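To make the "machine-readable reservation" idea concrete: one emerging convention is the W3C TDM Reservation Protocol (TDMRep), under which a publisher can signal a reservation via a `tdm-reservation` HTTP header. The sketch below shows how a crawler might honor such a signal before adding a page to a training corpus; it is illustrative only, and assumes the TDMRep header convention rather than any mechanism named in the Korean plan or the EU directive.

```python
# Minimal sketch: honoring a machine-readable rights reservation before
# including a URL in a training corpus. Assumes the TDMRep convention,
# where a "tdm-reservation" header value of "1" reserves rights.

def tdm_reserved(headers: dict) -> bool:
    """Return True if response headers signal a text-and-data-mining reservation."""
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("tdm-reservation") == "1"

# A publisher reserving rights vs. one sending no signal at all.
print(tdm_reserved({"TDM-Reservation": "1"}))       # True -> skip or seek a license
print(tdm_reserved({"Content-Type": "text/html"}))  # False -> no reservation signaled
```

The creators' objection is visible even in this toy version: the burden sits entirely on the rights holder to emit the signal correctly, and silence is read as permission.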
Choices in front of policymakers
- Opt-in (prior permission) by default: Affirm that the right holder decides in advance; support collective licensing for scale and predictability.
- True, simple opt-out: If opt-out is retained, create a national registry and plain-language tags; do not require complex technical barriers to be enforceable.
- Compulsory licensing: Allow use with statutory rates, dataset logging, audit rights, and transparent reporting to rights organizations.
- Research-only carve-out: Limit broad exceptions to nonprofit research; require licenses for commercial training.
- Dataset transparency: Mandate source disclosures, data provenance records, and a duty to honor machine-readable reservations.
- Enforcement tools: Establish notice-and-takedown for datasets, timely dispute resolution, penalties for noncompliance, and funding for creators' tooling.
- Pilots and sunset clauses: Test narrow models, measure impact on creators and AI adoption, and revisit based on evidence.
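The dataset-transparency option above can be sketched as a per-source provenance record that an AI developer would log and report. All field names here are illustrative assumptions, not drawn from the Action Plan or any existing standard.

```python
# Sketch of a per-source provenance record for training data, as a
# transparency mandate might require. Field names are hypothetical.

from dataclasses import dataclass, asdict

@dataclass
class ProvenanceRecord:
    source_url: str            # where the work was obtained
    rights_basis: str          # e.g. "licensed", "public-domain", "statutory-exception"
    reservation_checked: bool  # whether a machine-readable reservation was checked and honored
    retrieved_at: str          # ISO 8601 timestamp of collection

record = ProvenanceRecord(
    source_url="https://example.org/novel.txt",
    rights_basis="licensed",
    reservation_checked=True,
    retrieved_at="2026-01-15T00:00:00+09:00",
)
print(asdict(record))
```

A record this small is enough to support the audit rights and reporting to rights organizations mentioned under the compulsory-licensing option.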
Practical next steps for agencies
- Publish draft legal text for Action Plan No. 32 and solicit targeted feedback from creators, publishers, broadcasters, and AI developers.
- Convene a standing working group with creator groups, collective management organizations, and startups to design workable licensing paths.
- Commission an impact assessment quantifying effects on creator incomes, AI competitiveness, and administrative costs.
- Prototype a national "rights reservation" registry and plain-language guidance; test with small creators before rollout.
- Prepare guidance for public-sector AI projects on permitted data sources, documentation, and procurement standards.
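The registry prototype suggested above could start as little more than a lookup from a work identifier to a plain-language reservation tag. The identifiers, tags, and default behavior below are all hypothetical design choices, shown only to make the idea testable with small creators.

```python
# Sketch of a national "rights reservation" registry lookup.
# Identifiers and tag wording are invented for illustration.

RESERVATION_REGISTRY = {
    "KR-WORK-0001": "reserved: no AI training without a license",
    "KR-WORK-0002": "open: AI training permitted with attribution",
}

def reservation_for(work_id: str) -> str:
    # Design choice: absence of a record defaults to "treat as reserved",
    # so an unregistered work is never read as consent to training.
    return RESERVATION_REGISTRY.get(work_id, "unknown: treat as reserved")

print(reservation_for("KR-WORK-0001"))
print(reservation_for("KR-WORK-9999"))
```

The default-to-reserved behavior is the key policy lever: it inverts the opt-out burden that creators criticize in the EU model.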
What to watch next
Sixteen organizations signed the joint statement, spanning independent producers, digital creators, writers, screenwriters, performers, choreographers, music rights holders, and the broadcasting industry.
An official at the council said a discussion session is planned. Creator groups say they will keep up the pressure until the policy is revised to protect rights and make fair compensation a core principle.