CLARENDON
2026 REGULATORY INTELLIGENCE
UK AI Governance Map
FINANCIAL ADVISERS · WEALTH MANAGERS · IFA NETWORKS · BUILDING SOCIETIES
| Domain | Deadline | Who is personally liable | What good looks like | Evidence file required | First 30 days action |
|---|---|---|---|---|---|
| Consumer Duty | Live July 2023 | SMF6 or SMF29 | Every AI touchpoint in the advice journey is mapped and documented. A human in the loop is confirmed at each automated decision point. Good outcomes are evidenced rather than assumed. | Documented advice journey map with human sign-off points. | Map every AI-assisted client interaction. Confirm a human override exists at each decision point. |
| SM&CR AI Accountability | Live | Designated SMF for each AI system | Named SMF is accountable for every AI tool approved and in use. Documented reasonable steps exist for each system and are reviewable by the Financial Conduct Authority on request. | Signed reasonable steps record for every AI system. | Assign a named SMF to each AI tool currently in use. Draft a one page reasonable steps record for each system. |
| Data Protection and Automated Decisions | Live and updated June 2025 | Data Protection Officer or responsible SMF | Lawful basis is documented for every automated decision. Records of processing are maintained. Data processing agreements are signed with every AI vendor. | Article 30 record of processing activities and signed vendor agreements. | Audit all AI systems for automated decision making. Confirm which systems trigger Article 22 obligations. |
| Operational Resilience | Live March 2022 | SMF24 or Chief Risk Officer | All AI systems are scored by client impact. Business continuity plans exist for critical systems with defined recovery time objectives. Semantic failure scenarios are documented. | Board-approved business continuity plans including semantic failure scenarios. | Score all AI systems using a four-dimension client impact model. Build or update business continuity plans. |
| Third Party AI Oversight | Live | Chief Risk Officer or technology SMF | Due diligence is completed on every AI-embedded platform. Certification status such as ISO 42001 or SOC 2 Type II is logged. Vendor AI governance documentation is reviewed. | Vendor due diligence file including certification logs and audit reports. | Request AI governance documentation from every platform provider in use. Log certification status. |
| EU AI Act Indirect Exposure | Phased 2025 to 2026 | Board or risk SMF | All platform providers are identified as EU hosted or EU serving. Compliance status is verified and logged. High-risk classification is reviewed for every tool. | Written EU AI Act compliance statement and vendor register. | Identify every AI platform that operates from an EU member state. Request a written compliance statement from each. |
| Model Risk Management | Live May 2024 | Chief Risk Officer or Head of Risk | AI model inventory is maintained. Validation process is defined and applied before deployment. Bias testing is evidenced. Models are reviewed when material changes occur. | Model inventory and formal validation testing report. | Build a model inventory covering all AI tools that influence client outcomes. Apply validation standards. |
| AI Literacy | Live | SMF responsible for training | Staff competence is assessed against AI capability requirements. Training logs evidence understanding of the firm's AI policy and principles. | Training completion certificates and competency assessment logs. | Audit current staff competence. Roll out AI awareness training to all advisers. |