Production-Grade Testing for SCD2 & Temporal Pipelines

The testing discipline that prevents regulatory failure, data corruption, and sleepless nights in Financial Services. Slowly Changing Dimension Type 2 pipelines underpin regulatory reporting, remediation, risk models, and point-in-time evidence across Financial Services — yet most are effectively untested. As data platforms adopt CDC, hybrid SCD2 patterns, and large-scale reprocessing, silent temporal defects become both more likely and harder to detect. This article sets out a production-grade testing discipline for SCD2 and temporal pipelines, focused on determinism, late data, precedence, replay, and PIT reconstruction. The goal is simple: prevent silent corruption and ensure SCD2 outputs remain defensible under regulatory scrutiny.
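As a flavour of the discipline described, here is a minimal sketch of an invariant check for SCD2 output: exactly one open version per key, and no overlapping validity intervals. The row shape `(business_key, valid_from, valid_to)` and all names are illustrative assumptions, not the article's own harness.

```python
from datetime import date

# Illustrative SCD2 rows: (business_key, valid_from, valid_to).
# valid_to is None for the open (current) version; OPEN is a sort-friendly stand-in.
OPEN = date.max

def check_scd2_invariants(rows):
    """Return violation messages for two basic SCD2 invariants:
    exactly one open row per key, and no overlapping validity intervals."""
    violations = []
    by_key = {}
    for key, start, end in rows:
        by_key.setdefault(key, []).append((start, end if end is not None else OPEN))
    for key, intervals in sorted(by_key.items()):
        open_count = sum(1 for _, end in intervals if end == OPEN)
        if open_count != 1:
            violations.append(f"{key}: expected 1 open row, found {open_count}")
        intervals.sort()
        for (s1, e1), (s2, _) in zip(intervals, intervals[1:]):
            if s2 < e1:
                violations.append(f"{key}: versions overlap at {s2}")
    return violations

good = [("C1", date(2024, 1, 1), date(2024, 6, 1)), ("C1", date(2024, 6, 1), None)]
bad = [("C1", date(2024, 1, 1), date(2024, 6, 1)), ("C1", date(2024, 3, 1), None)]
```

Checks like this run cheaply after every pipeline run, which is how silent corruption gets caught before a regulator does.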

Continue reading

Event-Driven CDC to Correct SCD2 Bronze in 2025–2026

Broken history often stays hidden until remediation or skilled-person reviews. Why? Because event-driven Change Data Capture fundamentally changes how history behaves in a data platform. When Financial Services organisations move from batch ingestion to streaming CDC, long-standing SCD2 assumptions quietly break, often without immediate symptoms. Late, duplicated, partial, or out-of-order events can silently corrupt Bronze history and undermine regulatory confidence. This article sets out what “correct” SCD2 means in a streaming world, why most implementations fail, and how to design Bronze pipelines that remain temporally accurate, replayable, and defensible under PRA/FCA scrutiny in 2025–2026.
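The core defence against late and out-of-order events is to rebuild history deterministically from event time rather than arrival time. A minimal sketch, assuming events are `(event_time, value)` tuples for a single key (the function name and shapes are illustrative):

```python
def rebuild_history(events):
    """Deterministically rebuild SCD2 intervals for one key from CDC events
    that may arrive late, duplicated, or out of order."""
    # Sort by event time (not arrival order) and drop exact duplicates.
    ordered = sorted(set(events))
    # Collapse consecutive events that carry no actual change.
    changes = []
    for ts, value in ordered:
        if not changes or changes[-1][1] != value:
            changes.append((ts, value))
    # Derive [valid_from, valid_to) intervals; the last version stays open (None).
    return [
        (ts, changes[i + 1][0] if i + 1 < len(changes) else None, value)
        for i, (ts, value) in enumerate(changes)
    ]

# Arrival order is scrambled and includes a duplicate; the rebuild is unaffected.
arrived = [(3, "C"), (1, "A"), (2, "B"), (1, "A")]
```

Because the output depends only on the event set, any replay of the same events yields byte-identical history, which is what makes the pipeline defensible.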

Continue reading

How to Safely Increase System Volume on Windows

A practical guide to safely increasing system-wide audio volume on Windows using Equalizer APO and the LoudMax VST plugin. The article explains how to install and configure both tools to achieve up to 20 dB of additional loudness without clipping or distortion, using true-peak limiting and clean gain staging suitable for everyday listening, gaming, and media consumption.

Continue reading

Golden-Source Resolution, Multi-Source Precedence, and Regulatory Point-in-Time Reporting on SCD2 Bronze

Why Deterministic Precedence Is the Line Between “Data Platform” and “Regulatory Liability”. Modern UK Financial Services organisations ingest customer, account, and product data from 5–20 different systems of record, each holding overlapping and often conflicting truth. Delivering a reliable “Customer 360” or “Account 360” requires deterministic, audit-defensible precedence rules, survivorship logic, temporal correction workflows, and regulatory point-in-time (PIT) reconstructions: all operating on an SCD2 Bronze layer. This article explains how mature banks resolve multi-source conflicts, maintain lineage, rebalance history when higher-precedence data arrives late, and produce FCA/PRA-ready temporal truth. It describes the real patterns used in Tier-1 institutions, and the architectural techniques required to make them deterministic, scalable, and regulator-defensible.
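To make the idea concrete, here is a minimal attribute-level survivorship sketch: each attribute takes its value from the highest-precedence source that supplies one, with provenance recorded for audit. The source names, precedence table, and record shapes are hypothetical, not taken from any specific institution.

```python
# Hypothetical source precedence: lower rank wins. Ties break on source
# name so the outcome is fully deterministic.
PRECEDENCE = {"core_banking": 1, "crm": 2, "web_forms": 3}

def resolve(records):
    """Attribute-level survivorship: for each attribute, keep the value from
    the highest-precedence source that supplies a non-null value, and record
    which source won (lineage for audit)."""
    golden, provenance = {}, {}
    # Deterministic ordering: precedence rank, then source name.
    for source, attrs in sorted(records, key=lambda r: (PRECEDENCE[r[0]], r[0])):
        for attr, value in attrs.items():
            if value is not None and attr not in golden:
                golden[attr] = value
                provenance[attr] = source
    return golden, provenance

records = [
    ("web_forms", {"email": "new@example.com", "phone": "07000 000002"}),
    ("core_banking", {"email": "kyc@example.com", "phone": None}),
    ("crm", {"email": "old@example.com", "phone": "07000 000001"}),
]
```

Note that `core_banking` wins email but, holding no phone number, yields that attribute to `crm`: precedence operates per attribute, not per record.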

Continue reading

Entity Resolution & Matching at Scale on the Bronze Layer

Entity resolution has become one of the hardest unsolved problems in modern UK Financial Services data platforms. This article sets out a Bronze-layer–anchored approach to resolving customers, accounts, and parties at scale using SCD2 as the temporal backbone. It explains how deterministic, fuzzy, and probabilistic matching techniques combine with blocking, clustering, and survivorship to produce persistent, auditable entity identities. By treating entity resolution as platform infrastructure rather than an application feature, firms can build defensible Customer 360 views, support point-in-time reconstruction, and meet growing FCA and PRA expectations.
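Blocking is the step that makes matching tractable at scale: only records sharing a cheap blocking key are compared, instead of the full cross product. A minimal sketch with a hypothetical key (surname initial plus postcode outcode); real systems use several complementary keys.

```python
from itertools import combinations

def blocking_key(rec):
    """Hypothetical blocking key: surname initial + postcode outcode."""
    return (rec["surname"][0].upper(), rec["postcode"].split()[0].upper())

def candidate_pairs(records):
    """Generate within-block candidate pairs instead of comparing everyone
    to everyone; the expensive fuzzy scoring then runs only on these."""
    blocks = {}
    for rec in records:
        blocks.setdefault(blocking_key(rec), []).append(rec["id"])
    pairs = set()
    for ids in blocks.values():
        pairs.update(combinations(sorted(ids), 2))
    return pairs

people = [
    {"id": "a", "surname": "Smith", "postcode": "EC1A 1BB"},
    {"id": "b", "surname": "Smyth", "postcode": "EC1A 2AA"},
    {"id": "c", "surname": "Jones", "postcode": "SW1A 1AA"},
]
```

Here only Smith/Smyth are compared; Jones never enters a comparison, which is exactly the cost saving blocking exists to deliver.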

Continue reading

From Embedded XML/JSON Blobs to Audit-Grade SCD2 Bronze

Financial Services platforms routinely ingest XML and JSON embedded in opaque fields, creating tension between audit fidelity and analytical usability. This article presents a regulator-defensible approach to handling such payloads in the Bronze layer: landing raw data immutably, extracting only high-value attributes, applying attribute-level SCD2, and managing schema drift without data loss. Using hybrid flattening, temporal compaction, and disciplined lineage, banks can transform messy blobs into audit-grade Bronze assets while preserving point-in-time reconstruction and regulatory confidence.
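The landing pattern can be sketched in a few lines: keep the raw payload byte-for-byte with a content hash, and extract only named high-value attributes into queryable columns. The record shape and dotted-path convention here are illustrative assumptions.

```python
import hashlib
import json

def land_blob(raw, extract_paths):
    """Land an opaque JSON payload: preserve the raw text untouched alongside
    a content hash, and extract only the named high-value attributes.
    Unparseable payloads still land; extraction simply yields nothing,
    so no data is ever lost."""
    record = {
        "raw_payload": raw,  # immutable, exactly as received
        "payload_sha256": hashlib.sha256(raw.encode()).hexdigest(),
        "extracted": {},
    }
    try:
        doc = json.loads(raw)
    except json.JSONDecodeError:
        return record
    for path in extract_paths:
        node = doc
        for part in path.split("."):
            node = node.get(part) if isinstance(node, dict) else None
        record["extracted"][path] = node
    return record

raw = json.dumps({"customer": {"id": "C1", "segment": "retail"}, "misc": [1, 2]})
landed = land_blob(raw, ["customer.id", "customer.segment", "customer.risk_band"])
```

Missing attributes extract as null rather than failing the load, and the hash lets auditors prove the raw payload was never altered after landing.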

Continue reading

From SCD2 Bronze to a Non-SCD Silver Layer in Other Tech (Iceberg, Hudi, BigQuery, Fabric)

Modern data platforms consistently separate historical truth from analytical usability by storing full SCD2 history in a Bronze layer and exposing a simplified, current-state Silver layer. Whether using Apache Iceberg, Apache Hudi, Google BigQuery, or Microsoft Fabric, the same pattern applies: Bronze preserves immutable, auditable change history, while Silver removes temporal complexity to deliver one row per business entity. Each platform implements this differently, via snapshots, incremental queries, QUALIFY, or Delta MERGE, but the architectural principle remains universal and essential for regulated environments.
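Whatever the engine, the Silver derivation reduces to "latest version per business key". A pure-Python analogue of the `ROW_NUMBER()`/`QUALIFY` pattern, with an illustrative row shape:

```python
def current_state(bronze_rows):
    """Pure-Python analogue of QUALIFY ROW_NUMBER() ... = 1: keep the latest
    version per business key from a fully historical Bronze table."""
    latest = {}
    for row in bronze_rows:
        kept = latest.get(row["key"])
        if kept is None or row["valid_from"] > kept["valid_from"]:
            latest[row["key"]] = row
    return sorted(latest.values(), key=lambda r: r["key"])

bronze = [
    {"key": "A", "valid_from": 1, "status": "opened"},
    {"key": "A", "valid_from": 2, "status": "closed"},
    {"key": "B", "valid_from": 1, "status": "opened"},
]
```

Each platform expresses this differently (a snapshot query, an incremental view, a merge), but the semantics are identical: one row per entity, carrying only the current version.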

Continue reading

From SCD2 Bronze to a Non-SCD Silver Layer in Snowflake

This article explains a best-practice Snowflake pattern for transforming an SCD2 Bronze layer into a non-SCD Silver layer that exposes clean, current-state data. By retaining full historical truth in Bronze and using Streams, Tasks, and incremental MERGE logic, organisations can efficiently materialise one-row-per-entity Silver tables optimised for analytics. The approach simplifies governance, reduces cost, and delivers predictable performance for BI, ML, and regulatory reporting, while preserving complete auditability required in highly regulated financial services environments.
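The incremental step can be sketched language-neutrally: a change set standing in for a Snowflake Stream (only keys that changed since the last Task run) is merged into the one-row-per-entity Silver table. The `_deleted` flag and shapes below are illustrative; the real implementation is a SQL `MERGE` driven by Streams and Tasks.

```python
def merge_into_silver(silver, changed_rows):
    """Sketch of incremental MERGE semantics: upsert changed keys into the
    one-row-per-entity Silver table, and remove keys flagged as deleted.
    Untouched keys are never rewritten, which is where the cost saving lies."""
    for row in changed_rows:
        if row.get("_deleted"):
            silver.pop(row["key"], None)
        else:
            silver[row["key"]] = {k: v for k, v in row.items() if k != "key"}
    return silver

silver = {"A": {"status": "opened"}, "C": {"status": "opened"}}
changes = [
    {"key": "A", "status": "closed"},   # update
    {"key": "B", "status": "opened"},   # insert
    {"key": "C", "_deleted": True},     # delete
]
```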

Continue reading

From SCD2 Bronze to a Non-SCD Silver Layer in Databricks

This article explains a best-practice Databricks lakehouse pattern for transforming fully historical SCD2 Bronze data into clean, non-SCD Silver tables. Bronze preserves complete temporal truth for audit, compliance, and investigation, while Silver exposes simplified, current-state views optimised for analytics and data products. Using Delta Lake features such as MERGE, Change Data Feed, OPTIMIZE, and ZORDER, organisations, particularly in regulated Financial Services, can efficiently maintain audit-proof history while delivering fast, intuitive, consumption-ready datasets.

Continue reading

Operationalising SCD2 at Scale: Monitoring, Cost Controls, and Governance for a Healthy Bronze Layer

This article explains how to operationalise Slowly Changing Dimension Type 2 (SCD2) at scale in the Bronze layer of a medallion architecture, with a focus on highly regulated Financial Services environments. It outlines the three pillars needed to keep historical data trustworthy, performant, and compliant: monitoring, cost control, and governance. By tracking growth patterns, preventing meaningless updates, controlling storage and compute costs, and enforcing clear governance, organisations can ensure their Bronze layer remains a reliable audit-grade historical asset rather than an unmanaged data swamp.
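One of the cheapest and highest-value monitors is a no-op ratio check on each incoming batch: a sketch (illustrative shapes and names) that classifies rows as new keys, real changes, or values re-sent unchanged.

```python
def audit_batch(current, incoming):
    """Classify an incoming batch against the current open rows. A high
    'noop' count usually means an upstream job is re-emitting full snapshots,
    silently inflating Bronze history and storage cost for zero information."""
    stats = {"new": 0, "changed": 0, "noop": 0}
    for key, attrs in incoming.items():
        if key not in current:
            stats["new"] += 1
        elif current[key] == attrs:
            stats["noop"] += 1
        else:
            stats["changed"] += 1
    return stats

current = {"A": {"status": "opened"}, "B": {"status": "closed"}}
incoming = {"A": {"status": "opened"},   # unchanged: should not version
            "B": {"status": "opened"},   # genuine change
            "C": {"status": "opened"}}   # new key
```

Alerting when the no-op share exceeds a threshold catches runaway history growth long before it shows up on the storage bill.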

Continue reading

Advanced SCD2 Optimisation Techniques for Mature Data Platforms

Advanced SCD2 optimisation techniques are essential for mature Financial Services data platforms, where historical accuracy, regulatory traceability, and scale demands exceed the limits of basic SCD2 patterns. Attribute-level SCD2 significantly reduces storage and computation by tracking changes per column rather than per row. Hybrid SCD2 pipelines, combining lightweight delta logs with periodic MERGEs into the main Bronze table, minimise write amplification and improve reliability. Hash-based and probabilistic change detection eliminate unnecessary updates and accelerate temporal comparison at scale. Together, these techniques enable high-performance, audit-grade SCD2 in platforms such as Databricks, Snowflake, BigQuery, Iceberg, and Hudi, supporting the long-term data lineage and reconstruction needs of regulated UK Financial Services institutions.
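Hash-based change detection can be sketched compactly: a stable digest over the tracked business attributes only, so key order and audit columns never trigger a new version. The attribute names below are illustrative.

```python
import hashlib
import json

TRACKED = ("name", "segment")  # business attributes that should open a new version

def row_hash(attrs):
    """Stable digest over tracked attributes only: key order and audit
    columns such as load timestamps must not affect the result."""
    canonical = json.dumps({k: attrs.get(k) for k in sorted(TRACKED)})
    return hashlib.sha256(canonical.encode()).hexdigest()

def changed(old_attrs, new_attrs):
    """One hash comparison replaces a per-column comparison on every row."""
    return row_hash(old_attrs) != row_hash(new_attrs)

old   = {"name": "Acme", "segment": "retail",    "load_ts": "2025-01-01"}
same  = {"segment": "retail", "name": "Acme",    "load_ts": "2025-06-01"}
moved = {"name": "Acme", "segment": "corporate", "load_ts": "2025-06-01"}
```

In practice the hash is persisted on the Bronze row, so each incoming record costs one digest and one equality check regardless of column count.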

Continue reading

Using SCD2 in the Bronze Layer with a Non-SCD2 Silver Layer: A Modern Data Architecture Pattern for UK Financial Services

UK Financial Services firms increasingly implement SCD2 history in the Bronze layer while providing simplified, non-SCD2 current-state views in the Silver layer. This pattern preserves full historical auditability for FCA/PRA compliance and regulatory forensics, while delivering cleaner, faster, easier-to-use datasets for analytics, BI, and data science. It separates “truth” from “insight,” improves governance, supports Data Mesh models, reduces duplicated logic, and enables deterministic rebuilds across the lakehouse. In regulated UK Financial Services today, it is the only pattern I have seen that satisfies the full, real-world constraint set with no material trade-offs.

Continue reading

WTF Is SCD? A Practical Guide to Slowly Changing Dimensions

Slowly Changing Dimensions (SCDs) are how data systems manage attributes that evolve without constantly rewriting history. They determine whether you keep only the latest value, preserve full historical versions, or maintain a limited snapshot of changes. The classic SCD types (0–3, plus hybrids) define different behaviours, from never updating values, to overwriting them, to keeping every version with timestamps. The real purpose of SCDs is to make an explicit choice about how truth should behave in your analytics: what should remain fixed, what should update, and what historical context matters. Modern data platforms make tracking changes easy, but they don’t make the design decisions for you. SCDs are ultimately the backbone of reliable, temporal, reality-preserving analytics.
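The contrast between the two most common types is easiest to see side by side. A minimal sketch (illustrative shapes and names): Type 1 overwrites in place and loses the old value; Type 2 closes the open version and appends a new timestamped one.

```python
def apply_type1(table, key, attrs):
    """Type 1: overwrite in place; the previous value is gone for good."""
    table[key] = attrs
    return table

def apply_type2(history, key, attrs, as_of):
    """Type 2: close the open version and append a new timestamped one,
    so every historical value survives."""
    versions = history.setdefault(key, [])
    if versions:
        current = {k: v for k, v in versions[-1].items()
                   if k not in ("valid_from", "valid_to")}
        if current == attrs:
            return history  # nothing changed: no new version
        versions[-1]["valid_to"] = as_of
    versions.append({**attrs, "valid_from": as_of, "valid_to": None})
    return history

t1, t2 = {}, {}
apply_type1(t1, "C1", {"city": "Leeds"})
apply_type1(t1, "C1", {"city": "York"})
apply_type2(t2, "C1", {"city": "Leeds"}, as_of="2024-01-01")
apply_type2(t2, "C1", {"city": "York"}, as_of="2024-06-01")
```

After both updates, the Type 1 table knows only York; the Type 2 history still knows the customer lived in Leeds, and exactly when that stopped being true.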

Continue reading

Conflicting Social Dynamics: Population Collapse Versus Behavioural Sink

Modern societies face two anxieties that appear contradictory: fears of population collapse and fears of behavioural-sink-like social breakdown. This article shows that both can be true simultaneously because they operate on different dimensions: biological decline and functional overcrowding. By integrating demographic and psychosocial dynamics, it explains how civilisation can be both underpopulated and overwhelmed at the same time.

Continue reading

UK Flywheel and the Missing Middle: Cyber Scenes from the National Theatre

A first-hand account of the UK Flywheel event at the National Theatre: part love letter to the UK cyber ecosystem, part demolition of the comforting myths around funding, government “capability”, and NCSC’s role. From the NCSC Annual Review to West Midlands Cyber Hub, this is what the day looked like from the founder trenches rather than the podium.

Continue reading

The Rise of AI–Cyber Policy Convergence: Who’s Leading the Discussion?

AI and cybersecurity are no longer separate conversations. In the UK, they’re becoming one strategic priority, with new leaders, risks, and regulatory battles emerging fast. Until recently, AI and cybersecurity lived in different corners of policy and funding. But that era is over. From deepfake fraud and LLM jailbreaks to AI-assisted vulnerability discovery, the UK now faces a landscape where cyber threats and AI systems are not just overlapping; they are entangled. And the convergence is reshaping national security strategies, tech standards, and regulatory structures. This article explores the organisations, thinkers, and working groups shaping the AI–cyber policy crossover in the UK, and how startups, researchers, and advisors can influence what comes next.

Continue reading

The NCSC Annual Review 2025: Between Capability and Stasis

The article examines the NCSC Annual Review 2025 as both a testament to accomplishment and a warning. It praises the NCSC’s technical competence but questions its identity: regulator, delivery agency, or state-backed market player? It highlights contradictions — DSIT hailing it as “the jewel in the crown” while eroding its remit, diluting CyberFirst into TechFirst, ending its startup work, and overstating the benefits of Cyber Essentials. The piece concludes that the NCSC is overextended and under-defined, needing clarity of purpose more than new initiatives — less performance, more direction.

Continue reading

Women in Cyber Leadership: How Inclusion is Shaping UK Strategy

From boardrooms to government panels, women in cybersecurity are now shaping the UK’s strategic direction, not just participating in it. For years, the conversation about women in cybersecurity focused on “getting a foot in the door.” Today, it’s about who’s in the room when national decisions are made, and increasingly, women are leading those conversations. Inclusion is no longer a side project. In the UK, it’s becoming a strategic imperative, with policy, funding, and procurement now reflecting gender equity, diverse leadership, and lived experience as core components of resilience, innovation, and national capability. This article maps how women in cyber leadership are influencing strategy at every level, from community hubs and boardrooms to national working groups and international policy circles.

Continue reading

Systems in Tension: Britain’s China Crisis Spy Farce and the Architecture of Denial

A forensic if mordant look at how the “Chinese spies in Parliament” case collapsed. I don’t think it was lies so much as a system that’s eating itself. Legal, political, and economic silos each told their own version of the truth until coherence disappeared into the vortex. Between Cummings’ claims, Martin’s rebuttals, the embassy standoff, and Kemi Badenoch’s attack on Starmer, it’s a living portrait of Britain’s institutions locked in tension. Prosperity versus protection; diplomacy versus denial. But that doesn’t mean the system is broken; it might be working exactly as intended. Get the money in at all costs?

Continue reading