Tag Archives: Data Architecture

The 2026 UK Financial Services Lakehouse Reference Architecture

An opinionated but practical blueprint for regulated, temporal, multi-domain data platforms, focused on authority, belief, and point-in-time defensibility. This article lays out a reference architecture for UK FS in 2026: not a rigid prescription, but a description of what “good” now looks like in banks, insurers, payments firms, wealth platforms, and capital markets organisations operating under FCA/PRA supervision.

Continue reading

From Build to Run Without Losing Temporal Truth: Operating Model Realities for Regulated Financial Services Data Platforms

This article explores why most regulated data platforms fail operationally rather than technically. It argues that the operating model is the mechanism by which architectural intent survives change, pressure, and organisational churn. Focusing on invariants, authority, correction workflows, and accountability, it shows how platforms must be designed to operate safely under stress, not just in steady state. The piece bridges architecture and real-world execution, showing how temporal truth and regulatory trust can persist long after delivery.

Continue reading

Cost Is a Control: FinOps and Cost Management in Regulated Financial Services Data Platforms

This article positions cost management as a first-class architectural control rather than a post-hoc optimisation exercise. In regulated environments, cost decisions directly constrain temporal truth, optionality, velocity, and compliance. The article explains why FinOps must prioritise predictability, authority, and value alignment over minimisation, and how poorly designed cost pressure undermines regulatory defensibility. By linking cost to long-term value creation and regulatory outcomes, it provides a principled framework for sustaining compliant, scalable data platforms.

Continue reading

Collapsing the Medallion: Layers as Patterns, Not Physical Boundaries

The medallion model was never meant to be a physical storage mandate. It is a pattern language for expressing guarantees about evidence, interpretation, and trust. In mature, regulated platforms, those guarantees increasingly live in contracts, lineage, governance, and tests: not in rigid physical layers. Collapsing the medallion does not weaken regulatory substantiation; it strengthens it by decoupling invariants from layout. This article explains why layers were necessary, why they eventually collapse, and what must never be lost when they do.
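
As a flavour of what “guarantees living in tests” can mean, here is a minimal Python sketch (all names are hypothetical, not drawn from the article) that expresses evidence immutability and lineage completeness as executable contract checks rather than as properties of a physical storage tier:

```python
# Hypothetical sketch: medallion-style guarantees as contract checks,
# not physical layer boundaries.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen=True enforces immutability of landed evidence
class LandedRecord:
    record_id: str
    payload: str
    source_system: str   # lineage: where the evidence came from
    landed_at: str       # lineage: when it was landed

def check_contract(records: list[LandedRecord]) -> list[str]:
    """Return contract violations; an empty list means the guarantees hold."""
    violations, seen_ids = [], set()
    for r in records:
        if r.record_id in seen_ids:
            violations.append(f"duplicate evidence id {r.record_id}")
        seen_ids.add(r.record_id)
        if not r.source_system or not r.landed_at:
            violations.append(f"missing lineage on {r.record_id}")
    return violations

records = [LandedRecord("t-1", '{"amt": 10}', "core-banking", "2026-01-05T09:00Z")]
assert check_contract(records) == []  # the guarantee is a test, not a folder
```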

Continue reading

From Writes to Reads: Applying CQRS Thinking to Regulated Data Platforms

In regulated financial environments, data duplication is often treated as a failure rather than a necessity. Command Query Responsibility Segregation (CQRS) separates the models that handle writes from the models that serve reads. This article reframes duplication through CQRS-style thinking, arguing that separating write models (which execute actions) from read models (which explain outcomes) is essential for both safe operation and regulatory defensibility. By making authority explicit and accepting eventual consistency, institutions can act in real time while reconstructing explainable, auditable belief over time. CQRS is presented not as a framework, but as a mental model for survivable data platforms.
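
To make the separation concrete, a minimal, hypothetical Python sketch (names and shapes are illustrative, not the article’s implementation): the write model is an append-only log of actions taken with authority, and the read model is a projection that can be rebuilt at any time to explain outcomes:

```python
# Hypothetical CQRS sketch: the write model is an append-only command log;
# the read model is a projection rebuilt from it, eventually consistent.
from collections import defaultdict

event_log = []  # write model: authoritative, append-only record of actions

def execute_credit(account: str, amount: int, at: str) -> None:
    """Command side: act now, record the action immutably."""
    event_log.append({"type": "credit", "account": account,
                      "amount": amount, "at": at})

def project_balances(events) -> dict:
    """Query side: derive an explainable view; can be rebuilt at any time."""
    balances = defaultdict(int)
    for e in events:
        if e["type"] == "credit":
            balances[e["account"]] += e["amount"]
    return dict(balances)

execute_credit("acc-42", 100, "2026-01-05T09:00Z")
execute_credit("acc-42", 50, "2026-01-05T09:01Z")
print(project_balances(event_log))  # {'acc-42': 150}: reconstructed, not stored
```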

Continue reading

Edge Systems Are a Feature: Why OLTP, CRM, and Low-Latency Stores Must Exist

Modern data platforms often treat operational systems as legacy constraints to be eliminated. This article argues the opposite. Transactional systems, CRM platforms, and low-latency decision stores exist because some decisions must be made synchronously, locally, and with authority. These “edge systems” are not architectural debt but purpose-built domains of control. A mature data platform does not replace them or centralise authority falsely; it integrates with them honestly, preserving their decisions, context, and evolution over time.

Continue reading

Why Transactions Are Events, Not Slowly Changing Dimensions

This article argues that modelling transactions as slowly changing dimensions is a fundamental category error in financial data platforms. Transactions are immutable events that occur once and do not change; what evolves is the organisation’s interpretation of them through enrichment, classification, and belief updates. Applying SCD2 logic to transactions conflates fact with interpretation, corrupts history, and undermines regulatory defensibility. By separating immutable event records from mutable interpretations, platforms become clearer, more auditable, and capable of reconstructing past decisions without rewriting reality.
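
A small, hypothetical sketch of that separation (illustrative field names, not a prescribed schema): the transaction event is written once, while classification lives in an append-only interpretation log that can be queried as of any date:

```python
# Hypothetical sketch: immutable transaction events, mutable interpretations.
# The event never changes; classification is a versioned belief about it.
transactions = [  # facts: written once, never updated
    {"txn_id": "t-9", "amount": 250_00, "booked_at": "2026-01-04T10:00Z"},
]

interpretations = [  # beliefs: append a new version, never rewrite the fact
    {"txn_id": "t-9", "category": "uncategorised", "believed_from": "2026-01-04"},
    {"txn_id": "t-9", "category": "suspicious",    "believed_from": "2026-01-20"},
]

def classification_as_of(txn_id: str, as_of: str) -> str:
    """Reconstruct what the firm believed about a transaction on a given date."""
    versions = [i for i in interpretations
                if i["txn_id"] == txn_id and i["believed_from"] <= as_of]
    return max(versions, key=lambda i: i["believed_from"])["category"]

print(classification_as_of("t-9", "2026-01-10"))  # 'uncategorised': known then
print(classification_as_of("t-9", "2026-02-01"))  # 'suspicious': known now
```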

Continue reading

Authority, Truth, and Belief in Financial Services Data Platforms

Financial services data architectures often fail by asking the wrong question: “Which system is the system of record?” This article argues that regulated firms operate with multiple systems of authority, while truth exists outside systems altogether. What data platforms actually manage is institutional belief: what the firm believed at a given time, based on available evidence. By separating authority, truth, and belief, firms can build architectures that preserve history, explain disagreement, and withstand regulatory scrutiny through accountable, reconstructable decision-making.

Continue reading

Eventual Consistency in Regulated Financial Services Data Platforms

In regulated financial services, eventual consistency is often treated as a technical weakness to be minimised or hidden. This article argues the opposite: eventual consistency is the only honest and defensible consistency model in a multi-system, regulator-supervised institution. Regulators do not require instantaneous agreement: they require explainability, reconstructability, and reasonableness at the time decisions were made. By treating eventual consistency as an explicit architectural and regulatory contract, firms can bound inconsistency, preserve historical belief, and strengthen audit defensibility rather than undermine it.
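
One way to make the contract explicit, sketched here with hypothetical names and an illustrative bound: declare the maximum tolerated staleness of a read model and monitor it, rather than leaving the lag implicit:

```python
# Hypothetical sketch: eventual consistency as a declared, monitored contract.
from datetime import datetime, timedelta

CONTRACT_MAX_LAG = timedelta(minutes=15)  # agreed bound, not an aspiration

def lag_within_contract(source_watermark: datetime,
                        read_model_watermark: datetime) -> bool:
    """True if the read model's staleness is inside its declared bound."""
    return (source_watermark - read_model_watermark) <= CONTRACT_MAX_LAG

src = datetime(2026, 1, 5, 9, 30)
view = datetime(2026, 1, 5, 9, 20)
print(lag_within_contract(src, view))  # True: 10 minutes of lag, bound is 15
```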

Continue reading

Networks, Relationships & Financial Crime Graphs on the Bronze Layer

Financial crime rarely appears in isolated records; it emerges through networks of entities, relationships, and behaviours over time. This article explains why financial crime graphs must be treated as foundational, temporal structures anchored near the Bronze layer of a regulated data platform. It explores how relationships are inferred, versioned, and governed, why “known then” versus “known now” matters, and how poorly designed graphs undermine regulatory defensibility. Done correctly, crime graphs provide explainable, rebuildable network intelligence that stands up to scrutiny years later.
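
As a minimal illustration of “known then” versus “known now” (hypothetical edge shapes, not the article’s model), each relationship can carry both when it held and when the firm recorded it, so the network can be replayed as it was visible on any date:

```python
# Hypothetical sketch: temporal edges in a financial crime graph.
# Each relationship records when it held AND when the firm learned of it,
# so "known then" and "known now" queries can both be answered.
edges = [
    {"src": "party-A", "dst": "party-B", "kind": "shared_address",
     "valid_from": "2025-06-01", "recorded_at": "2025-06-03"},
    {"src": "party-A", "dst": "party-C", "kind": "payment_flow",
     "valid_from": "2025-06-01", "recorded_at": "2026-01-15"},  # learned late
]

def neighbours_as_known_at(node: str, known_at: str) -> list[str]:
    """The network as the firm could have seen it on a given date."""
    return [e["dst"] for e in edges
            if e["src"] == node and e["recorded_at"] <= known_at]

print(neighbours_as_known_at("party-A", "2025-12-31"))  # ['party-B']: known then
print(neighbours_as_known_at("party-A", "2026-02-01"))  # both edges: known now
```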

Continue reading

Aligning the Data Platform to Enterprise Data & AI Strategy

This article establishes the data platform as the execution engine of Enterprise Data & AI Strategy in Financial Services. It bridges executive strategy and technical delivery by showing how layered architecture (Bronze, Silver, Gold, Platinum), embedded governance, dual promotion lifecycles (North/South and East/West), and domain-aligned operating models turn the strategic pillars (architecture & quality, governance, security & privacy, process & tools, and people & culture) into repeatable, regulator-ready outcomes. The result is a platform that delivers control, velocity, semantic alignment, and safe AI enablement at scale.

Continue reading

Measuring Value in a Modern FS Data Platform: A Framework for Understanding, Quantifying, and Communicating Data Value in FS

Measuring Value in a Modern FS Data Platform reframes how Financial Services organisations should evaluate data platforms. Rather than measuring pipelines, volumes, or dashboards, true value emerges from consumption, velocity, optionality, semantic alignment, and control. By landing raw data, accelerating delivery through reuse, organising around business domains, and unifying meaning in a layered Bronze–Silver–Gold–Platinum architecture, modern platforms enable faster decisions, richer analytics, regulatory confidence, and long-term adaptability. This article provides a practical, consumption-driven framework for CDOs and CIOs to quantify and communicate real data value.

Continue reading

East/West vs North/South Promotion Lifecycles: How Modern Financial Services Data Platforms Support Operational Stability and Analytical Freedom Simultaneously

This article argues that modern Financial Services (FS) data platforms must deliberately support two distinct but complementary promotion lifecycles. The well-known North/South lifecycle provides operational stability, governance, and regulatory safety for customer-facing and auditor-visible systems. In parallel, the East/West lifecycle enables analytical exploration, experimentation, and rapid innovation for data science and analytics teams. By mapping these lifecycles onto layered data architectures (Bronze to Platinum) and introducing clear promotion gates, FS organisations can protect operational integrity while sustaining analytical freedom and innovation.
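
A toy sketch of what distinct promotion gates might look like (the gate names are illustrative assumptions, not a standard): the operational lifecycle simply demands more evidence than the analytical one:

```python
# Hypothetical sketch: distinct promotion gates for the two lifecycles.
# North/South (operational) demands stronger evidence than East/West (analytical).
NORTH_SOUTH_GATES = ["tests_pass", "lineage_complete", "owner_signoff",
                     "regulatory_review"]
EAST_WEST_GATES = ["tests_pass", "lineage_complete"]

def can_promote(evidence: set[str], lifecycle: str) -> bool:
    """A dataset promotes only when every gate for its lifecycle is satisfied."""
    gates = NORTH_SOUTH_GATES if lifecycle == "north_south" else EAST_WEST_GATES
    return all(g in evidence for g in gates)

evidence = {"tests_pass", "lineage_complete"}
print(can_promote(evidence, "east_west"))    # True: analytical bar met
print(can_promote(evidence, "north_south"))  # False: operational bar not met
```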

Continue reading

Gold & Platinum Layer Architecture After Silver

Modern Financial Services data platforms require more than Bronze, Silver, and Gold layers to manage complexity, meaning, and governance. While Silver provides current-state truth and Gold delivers consumption-driven business meaning, neither resolves enterprise-wide semantics. This article introduces the Platinum layer as the conceptual truth layer, reconciling how different domains, systems, and analytical communities understand the same data. Together, Gold and Platinum bridge operational use, analytical insight, and long-lived domain semantics, enabling clarity, velocity, and governed understanding at scale.

Continue reading

Entity Resolution & Matching at Scale on the Bronze Layer

Entity resolution has become one of the hardest unsolved problems in modern UK Financial Services data platforms. This article sets out a Bronze-layer–anchored approach to resolving customers, accounts, and parties at scale using SCD2 as the temporal backbone. It explains how deterministic, fuzzy, and probabilistic matching techniques combine with blocking, clustering, and survivorship to produce persistent, auditable entity identities. By treating entity resolution as platform infrastructure rather than an application feature, firms can build defensible Customer 360 views, support point-in-time reconstruction, and meet growing FCA and PRA expectations.
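
To give a flavour of the moving parts, here is a compact, hypothetical Python sketch combining blocking, fuzzy matching, and union-find clustering (the threshold and fields are illustrative; survivorship and persistent identity assignment are omitted):

```python
# Hypothetical sketch: entity resolution with blocking, fuzzy matching,
# and union-find clustering to group records into candidate entities.
from difflib import SequenceMatcher
from collections import defaultdict

records = [
    {"id": 1, "name": "Jane Smith",  "postcode": "EC1A 1BB"},
    {"id": 2, "name": "Jane Smyth",  "postcode": "EC1A 1BB"},
    {"id": 3, "name": "Robert Kaur", "postcode": "M1 4ET"},
]

# Blocking: only compare records sharing a cheap key (here, postcode).
blocks = defaultdict(list)
for r in records:
    blocks[r["postcode"]].append(r)

parent = {r["id"]: r["id"] for r in records}  # union-find forest
def find(x):
    while parent[x] != x:
        x = parent[x]
    return x

def union(a, b):
    parent[find(b)] = find(a)

# Fuzzy match within blocks; 0.85 is an illustrative threshold, not a rule.
for block in blocks.values():
    for i, a in enumerate(block):
        for b in block[i + 1:]:
            if SequenceMatcher(None, a["name"], b["name"]).ratio() >= 0.85:
                union(a["id"], b["id"])

clusters = defaultdict(list)
for r in records:
    clusters[find(r["id"])].append(r["id"])
print(dict(clusters))  # {1: [1, 2], 3: [3]}: records 1 and 2 resolve together
```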

Continue reading

From Embedded XML/JSON Blobs to Audit-Grade SCD2 Bronze

Financial Services platforms routinely ingest XML and JSON embedded in opaque fields, creating tension between audit fidelity and analytical usability. This article presents a regulator-defensible approach to handling such payloads in the Bronze layer: landing raw data immutably, extracting only high-value attributes, applying attribute-level SCD2, and managing schema drift without data loss. Using hybrid flattening, temporal compaction, and disciplined lineage, banks can transform messy blobs into audit-grade Bronze assets while preserving point-in-time reconstruction and regulatory confidence.
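
A minimal, hypothetical sketch of the landing pattern (the field names and curated set are assumptions): the raw blob is kept immutably alongside attribute-level rows, and drifted attributes are flagged rather than dropped:

```python
# Hypothetical sketch: land the raw blob immutably, extract a curated set of
# attributes, and keep anything unexpected (schema drift) without data loss.
import json

CURATED = {"customer_id", "status"}  # illustrative high-value attributes

def to_bronze_rows(raw_blob: str, landed_at: str) -> list[dict]:
    """One attribute-level row per field; drift is retained, not dropped."""
    payload = json.loads(raw_blob)
    rows = []
    for key, value in payload.items():
        rows.append({
            "attribute": key,
            "value": json.dumps(value),
            "curated": key in CURATED,  # drifted fields flagged, still kept
            "valid_from": landed_at,    # SCD2 open version at attribute grain
            "valid_to": None,
            "raw_blob": raw_blob,       # the immutable evidence travels along
        })
    return rows

blob = '{"customer_id": "c-7", "status": "active", "new_field": 1}'
for row in to_bronze_rows(blob, "2026-01-05T09:00Z"):
    print(row["attribute"], row["curated"])
```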

Continue reading

Advanced SCD2 Optimisation Techniques for Mature Data Platforms

Advanced SCD2 optimisation techniques are essential for mature Financial Services data platforms, where the demands of historical accuracy, regulatory traceability, and scale exceed the limits of basic SCD2 patterns. Attribute-level SCD2 significantly reduces storage and computation by tracking changes per column rather than per row. Hybrid SCD2 pipelines, combining lightweight delta logs with periodic MERGEs into the main Bronze table, minimise write amplification and improve reliability. Hash-based and probabilistic change detection eliminate unnecessary updates and accelerate temporal comparison at scale. Together, these techniques enable high-performance, audit-grade SCD2 in platforms such as Databricks, Snowflake, BigQuery, Iceberg, and Hudi, supporting the long-term data lineage and reconstruction needs of regulated UK Financial Services institutions.
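
As a small illustration of hash-based change detection (column names and hash choice are illustrative), a stable digest over the tracked columns lets a pipeline skip no-op updates before any MERGE is attempted:

```python
# Hypothetical sketch: hash-based change detection for SCD2.
# A stable hash over tracked columns lets the pipeline skip rows that
# have not actually changed, avoiding needless new versions.
import hashlib

TRACKED = ("name", "address", "risk_rating")  # illustrative tracked columns

def row_hash(row: dict) -> str:
    material = "|".join(str(row.get(c, "")) for c in TRACKED)
    return hashlib.sha256(material.encode()).hexdigest()

current = {"name": "Jane", "address": "1 High St", "risk_rating": "low"}
incoming = {"name": "Jane", "address": "1 High St", "risk_rating": "low",
            "untracked_note": "changed"}  # untracked change: no new version

if row_hash(incoming) != row_hash(current):
    print("close current version, open a new one")
else:
    print("no-op: skip the update entirely")  # this branch runs
```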

Continue reading

Using SCD2 in the Bronze Layer with a Non-SCD2 Silver Layer: A Modern Data Architecture Pattern for UK Financial Services

UK Financial Services firms increasingly implement SCD2 history in the Bronze layer while providing simplified, non-SCD2 current-state views in the Silver layer. This pattern preserves full historical auditability for FCA/PRA compliance and regulatory forensics, while delivering cleaner, faster, easier-to-use datasets for analytics, BI, and data science. It separates “truth” from “insight,” improves governance, supports Data Mesh models, reduces duplicated logic, and enables deterministic rebuilds across the lakehouse. In regulated UK Financial Services today, it is the only pattern I have seen that satisfies the full, real-world constraint set with no material trade-offs.
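
A minimal sketch of the derivation (hypothetical fields; in practice this would be a view or job in the platform’s engine): Silver’s current state is just the open SCD2 versions in Bronze, with the temporal columns dropped:

```python
# Hypothetical sketch: deriving a non-SCD2 current-state Silver view
# deterministically from an SCD2 Bronze table (valid_to of None = open version).
bronze = [
    {"customer_id": "c-1", "status": "active",
     "valid_from": "2025-03-01", "valid_to": "2025-11-01"},
    {"customer_id": "c-1", "status": "dormant",
     "valid_from": "2025-11-01", "valid_to": None},
]

def silver_current_state(rows: list[dict]) -> list[dict]:
    """Silver keeps only the open version per key; history stays in Bronze."""
    current = [r for r in rows if r["valid_to"] is None]
    return [{k: v for k, v in r.items() if k not in ("valid_from", "valid_to")}
            for r in current]

print(silver_current_state(bronze))
# [{'customer_id': 'c-1', 'status': 'dormant'}]: simple for analysts,
# fully rebuildable from Bronze if ever lost.
```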

Continue reading