GB | Incident Report
Regulatory Action
2026-01-30
UKJT publishes draft statement on liability for AI harms and opens consultation
AI Model: Unspecified AI systems
Tags: AI Liability, Consultation, Private Law, Negligence, Contract Risk Allocation, Evidence & Causation, UKJT
I. Executive Summary
The UK Jurisdiction Taskforce (UKJT) published a draft legal statement addressing how English private law may allocate liability for non-deliberate harms caused by AI systems, and opened a public consultation on it. The draft emphasises that AI has no legal personality under English law and frames liability primarily through existing doctrines (contract, negligence, product liability, vicarious liability, misrepresentation/defamation). The draft statement is positioned as non-binding but intended to guide market practice and future judicial reasoning.
II. Key Facts
- UKJT published a draft legal statement on liability for AI harms under English private law.
- Consultation opened with a stated deadline (13 Feb 2026) and an associated consultation process/public engagement.
- The draft assumes no AI legal personality; liability is attributed to legal persons via existing doctrines.
- It highlights litigation-specific issues: causation/foreseeability challenges and evidentiary opacity ("black box") in AI systems.
III. Regulatory & Ethical Implications
Although not binding, the draft statement is a practice-shaping reference point for litigators and transactional advisors: it signals how courts may analyse duty, causation, contractual allocation, and evidentiary burdens in AI-harm disputes. It also strengthens the case for AI-specific contracting (warranties/indemnities/limitations), governance documentation, and expert-evidence readiness—key risk controls for firms advising clients deploying AI at scale.