South Korea Enacts First Comprehensive AI Safety Law: What Legal Teams Need to Know
South Korea has enacted the Basic Act on the Development of Artificial Intelligence and the Establishment of a Foundation for Trustworthiness, the first comprehensive statute focused on AI safety, according to Yonhap News. The law is now in effect and places new, concrete obligations on companies building or deploying AI systems in the country.
The headline requirements: stricter duties around deepfakes and AI-driven misinformation, mandatory user disclosures for high-risk use cases, and watermarking for AI-generated content. There is a one-year grace period before penalties apply.
Scope: Who Is In and Why It Matters
The act applies to companies and AI developers operating in South Korea. International companies are explicitly covered if they meet any of these thresholds:
- Global annual revenue of at least 1 trillion won (about $680 million), or
- Domestic sales of at least 10 billion won (about $6.8 million), or
- 1 million or more daily users in South Korea.
OpenAI and Google currently fall under these criteria.
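As a quick scoping aid, the three coverage thresholds above reduce to a simple "any one of" predicate. The sketch below encodes them directly in won and daily users; the function and parameter names are illustrative, not from the statute:

```python
def meets_korea_ai_act_thresholds(
    global_revenue_krw: int,
    domestic_sales_krw: int,
    daily_users_kr: int,
) -> bool:
    """Return True if any one of the act's reported coverage thresholds is met.

    Thresholds:
      - global annual revenue of at least 1 trillion won, or
      - domestic sales of at least 10 billion won, or
      - 1 million or more daily users in South Korea.
    """
    return (
        global_revenue_krw >= 1_000_000_000_000
        or domestic_sales_krw >= 10_000_000_000
        or daily_users_kr >= 1_000_000
    )

# A firm below both revenue bars but with 2 million Korean daily users is still covered.
print(meets_korea_ai_act_thresholds(0, 0, 2_000_000))  # True
```

Note the "or" logic: clearing any single threshold triggers coverage, so a company cannot scope itself out by revenue alone if its Korean user base is large.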
High-Risk AI: Examples and Expectations
"High-risk" refers to AI systems whose outputs can significantly affect people's safety or daily lives. The law cites hiring processes, loan assessments, and medical advice as examples.
For these use cases, companies must inform users that the service is AI-based and implement appropriate safety measures. That implies documented controls, audits, and a clear risk review process.
Transparency: Watermarking and Disclosures
All AI-generated content must carry watermarks to show its origin. As a Science Ministry official put it: "Applying watermarks to AI-generated content is the minimum safeguard to prevent side effects from the abuse of AI technology, such as deepfake content."
Expect downstream obligations too: procurement, marketing, and product teams will need to ensure watermark integrity across formats (text, image, audio, video) and distribution channels.
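Regulators have not yet specified an accepted watermark format, so any implementation today is provisional. One minimal pattern for text output is a visible disclosure label backed by an HMAC provenance tag, so that stripped or altered labels are detectable downstream. This is a sketch under those assumptions; the label format and the key-management placeholder are hypothetical:

```python
import hashlib
import hmac
import json

# Hypothetical signing key; in practice this would come from a managed secret store.
SIGNING_KEY = b"replace-with-a-managed-secret"


def watermark_text(content: str, model_id: str) -> str:
    """Append a visible AI-disclosure label plus a short HMAC provenance tag."""
    payload = json.dumps(
        {"model": model_id, "sha256": hashlib.sha256(content.encode()).hexdigest()}
    )
    tag = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()[:16]
    return f"{content}\n[AI-generated | {model_id} | tag:{tag}]"


def verify_watermark(labeled: str, model_id: str) -> bool:
    """Check that the label is present and the tag matches the content hash."""
    content, _, label = labeled.rpartition("\n")
    if not label.startswith("[AI-generated"):
        return False
    tag = label.rstrip("]").split("tag:")[-1]
    payload = json.dumps(
        {"model": model_id, "sha256": hashlib.sha256(content.encode()).hexdigest()}
    )
    expected = hmac.new(SIGNING_KEY, payload.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(tag, expected)
```

The same idea generalizes to other media via embedded metadata, but whatever scheme is chosen should be revisited once official watermarking guidance is published.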
Local Representative Requirement
International companies meeting the thresholds must designate a local representative in South Korea. This representative will be the point of contact for regulators and may carry procedural responsibilities similar to data protection representatives under other regimes.
Enforcement, Penalties, and Timeline
- Fines: Up to 30 million won (about $20,418) for violations.
- Grace period: One year before penalties are imposed to allow industry adjustment.
- Policy cadence: The science minister must present a policy blueprint every three years to support AI industry development.
Immediate Action Items for Legal and Compliance
- Map systems and vendors: Identify all AI deployments touching hiring, lending, medical, or other sensitive decisions. Flag anything that could influence safety or daily life.
- Classify risk: Document rationale for whether a system is high-risk under the act. Build a repeatable assessment process.
- User disclosures: Draft and implement clear notices for AI-based services, including where and how users see them.
- Watermarking: Establish technical and operational controls to watermark AI outputs across all supported media. Add QA checks and incident response for misuse or removal.
- Local representative: If thresholds are met, appoint and empower a local representative. Define scope, reporting lines, and response SLAs.
- Contract updates: Add clauses on watermarking, mis/disinformation safeguards, and cooperation with investigations for vendors and distribution partners.
- Governance: Stand up or expand an AI risk committee; set audit trails, model cards, and change logs for high-risk systems.
- Training: Brief HR, risk, product, and marketing teams on the new obligations and enforcement timeline.
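The "classify risk" and "governance" items above boil down to producing a repeatable, serializable assessment record per system. A minimal sketch follows; the domain set mirrors only the examples the act cites (hiring, lending, medical advice), and all names are illustrative:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Domains the act cites as examples of high-risk use; extend as guidance arrives.
HIGH_RISK_DOMAINS = {"hiring", "lending", "medical_advice"}


@dataclass
class AIRiskAssessment:
    system_name: str
    domain: str
    rationale: str
    assessed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    @property
    def high_risk(self) -> bool:
        # Conservative default: anything in a cited domain is flagged.
        return self.domain in HIGH_RISK_DOMAINS

    def audit_record(self) -> dict:
        """Entry for the audit trail, suitable for JSON logging."""
        return {
            "system": self.system_name,
            "domain": self.domain,
            "high_risk": self.high_risk,
            "rationale": self.rationale,
            "assessed_at": self.assessed_at,
        }


screening = AIRiskAssessment(
    "resume-ranker", "hiring", "Scores candidates for interview selection."
)
print(screening.high_risk)  # True
```

Keeping the rationale field mandatory forces the documented reasoning the act implies, and the timestamped records double as evidence during the one-year grace period.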
What to Watch
- Interpretive guidance from regulators on "high-risk" boundaries and acceptable watermarking standards.
- How enforcement prioritizes deepfake and misinformation cases during and after the grace period.
- Alignment with other regimes (for example, EU transparency duties) to reduce duplicative controls.
Official Resources
For updates and guidance, monitor the Ministry of Science and ICT's English site: MSIT.
Need to Upskill Your Team?
If you're building an internal training track for counsel or compliance on AI obligations, you can browse role-focused programs here: Complete AI Training - Courses by Job.