KR | South Korea
Regulatory Framework | Status: in force
Effective: 2026-01-22
Impact: high

MSIT releases AI transparency implementation guideline under AI Basic Act (Jan 2026)

AI Transparency Assurance Guideline (implementation guidance for the transparency obligation under the AI Basic Act)

I. Regulatory Summary

MSIT has published the AI Transparency Assurance Guideline, official implementation guidance for the transparency obligation under the AI Basic Act, covering user notification for certain AI interactions and the labeling or marking of generative AI outputs.

II. Full Description

MSIT publicly released the guidance to operationalise the AI Basic Act's transparency obligation shortly before, or alongside, the Act's commencement. Public reporting on Korea's AI framework has emphasised user notification when users interact with certain AI systems, the labeling or marking of generative AI outputs (including deepfake-like content), administrative enforcement, and a reported grace period before certain penalties apply.

III. Scope & Application

The guideline is official ministry guidance explaining how to implement the AI Basic Act's transparency obligation, including notification and labeling approaches for certain AI interactions and generative AI outputs (e.g., markers for deepfake-like content and other AI-generated material), and is intended to reduce operational uncertainty for providers.
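For teams translating the notification point into product controls, the following is a minimal sketch of what a user-notice control could look like, assuming a chat-style service that discloses AI interaction at the start of a session. The notice wording, class name, and session logic are illustrative assumptions, not requirements drawn from the guideline.

# Illustrative sketch only: the notice text and API shape are hypothetical,
# not prescribed by the MSIT guideline or the AI Basic Act.

from dataclasses import dataclass, field

AI_NOTICE = "Notice: you are interacting with an AI system."  # hypothetical wording

@dataclass
class ChatSession:
    """Tracks whether the user has already been shown the AI-interaction notice."""
    notice_shown: bool = False
    transcript: list[str] = field(default_factory=list)

    def respond(self, model_reply: str) -> str:
        """Prepend the AI-interaction notice to the first reply in a session."""
        if not self.notice_shown:
            self.notice_shown = True
            reply = f"{AI_NOTICE}\n\n{model_reply}"
        else:
            reply = model_reply
        self.transcript.append(reply)
        return reply

if __name__ == "__main__":
    session = ChatSession()
    print(session.respond("Hello! How can I help?"))   # first reply carries the notice
    print(session.respond("Here is more detail."))     # notice not repeated

The design choice here is simply to make the disclosure a property of the session rather than of each message, so the notice is guaranteed to appear once at the start of every interaction.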

IV. Policy Impact Assessment

Direct compliance impact for advisors and legal teams supporting AI product deployments in Korea: organisations will need to build user-notice and content-labeling controls aligned with the AI Basic Act's transparency obligation.
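On the content-labeling side, the following is a minimal sketch assuming generated text is delivered in a JSON envelope carrying both a visible disclosure and a machine-readable AI-generated marker. All field names and label text are hypothetical and not taken from the MSIT guideline.

# Illustrative sketch only: field names and label text are assumptions,
# not requirements from the guideline.

import json
from datetime import datetime, timezone

def label_generated_content(content: str, model_id: str) -> str:
    """Wrap generated text in an envelope with a visible label and a
    machine-readable AI-generated marker."""
    envelope = {
        "visible_label": "This content was generated by AI.",   # user-facing disclosure
        "ai_generated": True,                                    # machine-readable marker
        "model_id": model_id,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "content": content,
    }
    return json.dumps(envelope, ensure_ascii=False)

if __name__ == "__main__":
    print(label_generated_content("Sample generated paragraph.", model_id="example-model"))

Pairing a human-readable label with a machine-readable marker keeps one control usable both for end-user disclosure and for downstream systems that need to detect AI-generated content.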

Primary Focus: AI transparency / labeling guidance