Category Archives: article

WTF is Short vs Long: What Is the Difference?

This article explains the difference between “long” and “short” positions in plain English, without the usual financial jargon. It breaks down how buying assets differs from selling what you don’t own, why professionals use both together, and why short positions carry unique risks. By the end, readers can confidently understand what investors mean when they say they’re long one thing and short another.

Continue reading

Series Wrap-Up: Reconstructing Time, Truth, and Trust in UK Financial Services Data Platforms

This series explored how UK Financial Services data platforms can preserve temporal truth, reconstruct institutional belief, and withstand regulatory scrutiny at scale. Beginning with foundational concepts such as SCD2 and event modelling, it developed into a comprehensive architectural pattern centred on an audit-grade Bronze layer, non-SCD Silver consumption, and point-in-time defensibility. Along the way, it addressed operational reality, governance, cost, AI integration, and regulatory expectations. This final article brings the work together, offering a structured map of the series and a coherent lens for understanding how modern, regulated data platforms actually succeed. Taken together, this body of work describes what I refer to as a “land it early, manage it early” data platform architecture for regulated industries.

Continue reading

When It Comes to Cyber, the Midlands Defence Blueprint Is Polite Fiction

The Midlands Defence & Security Blueprint presents itself as decisive and strategic, but in reality it repeats the same structural failures that undermined Midlands Engine. Cyber remains subordinated, underfunded, and ownerless, while coordination is mistaken for delivery. Written from the perspective of a practitioner who has built cyber capability on the ground, this article argues that resilience will not come from another blueprint, but from funded authority, real centres, and delivery.

Continue reading

The 2026 UK Financial Services Lakehouse Reference Architecture

An opinionated but practical blueprint for regulated, temporal, multi-domain data platforms, focused on authority, belief, and point-in-time defensibility. This article lays out a reference architecture for UK financial services in 2026: not a rigid prescription, but a description of what “good” now looks like in banks, insurers, payments firms, wealth platforms, and capital markets organisations operating under FCA/PRA supervision.

Continue reading

When Everyone’s an Expert: What AI Can Learn from the Personal Trainer Industry

As AI adoption accelerates, expertise is increasingly “performed” rather than earned. By comparing AI’s current hype cycle with the long-standing lack of regulation in the personal trainer industry, this piece examines how unregulated expertise markets reward confidence over competence, normalise harm, and erode trust. The issue isn’t regulation for its own sake; it’s accountability before failure becomes infrastructure.

Continue reading

Why Bronze-Level Temporal Fidelity Obsoletes Traditional Data Lineage Tools in Regulated Platforms

This article argues that in regulated financial services, true data lineage cannot be retrofitted through catalogues or metadata overlays. Regulators require temporal lineage: proof of what was known, when it was known, and how it changed. By preserving audit-grade temporal truth at the Bronze layer, lineage becomes an inherent property of the data rather than a post-hoc reconstruction. The article explains why traditional lineage tools often create false confidence and why temporal fidelity is the only regulator-defensible foundation for lineage.

Continue reading

Snapchat’s Settlement Is Not the Story: The End of “We’re Just Platforms” Is

Snap’s quiet settlement of a social media addiction lawsuit is not a legal footnote, but a signal that the long-standing claim of platform neutrality is failing. As courts begin to scrutinise design-driven harm, exploitation does not disappear; it evolves. In a post-AI social environment, the greatest risk is no longer overt addiction, but systems that simulate agency and authorship so convincingly that dependency feels like sovereignty: posing a deeper threat to dignity than compulsion ever did.

Continue reading

From Build to Run Without Losing Temporal Truth: Operating Model Realities for Regulated Financial Services Data Platforms

This article explores why most regulated data platforms fail operationally rather than technically. It argues that the operating model is the mechanism by which architectural intent survives change, pressure, and organisational churn. Focusing on invariants, authority, correction workflows, and accountability, it shows how platforms must be designed to operate safely under stress, not just in steady state. The piece bridges architecture and real-world execution, ensuring temporal truth and regulatory trust persist long after delivery.

Continue reading

Should You Learn Java in 2026? A Practitioner’s View on Languages, Careers, and Context

Java is still widely taught in universities, but far less commonly chosen for new work in practice. This article reframes the question “Should I learn Java?” as a problem of context, career intent, and developer productivity, drawing on real-world demand rather than syllabus inertia.

Continue reading

Cost Is a Control: FinOps and Cost Management in Regulated Financial Services Data Platforms

This article positions cost management as a first-class architectural control rather than a post-hoc optimisation exercise. In regulated environments, cost decisions directly constrain temporal truth, optionality, velocity, and compliance. The article explains why FinOps must prioritise predictability, authority, and value alignment over minimisation, and how poorly designed cost pressure undermines regulatory defensibility. By linking cost to long-term value creation and regulatory outcomes, it provides a principled framework for sustaining compliant, scalable data platforms.

Continue reading

From Threat Model to Regulator Narrative: Security Architecture for Regulated Financial Services Data Platforms

This article reframes security as an architectural property of regulated financial services data platforms, not a bolt-on set of controls. It argues that true security lies in preserving temporal truth, enforcing authority over data, and enabling defensible reconstruction of decisions under scrutiny. By grounding security in threat models, data semantics, SCD2 foundations, and regulator-facing narratives, the article shows how platforms can prevent silent history rewriting, govern AI safely, and treat auditability as a first-class security requirement.

Continue reading

Collapsing the Medallion: Layers as Patterns, Not Physical Boundaries

The medallion model was never meant to be a physical storage mandate. It is a pattern language for expressing guarantees about evidence, interpretation, and trust. In mature, regulated platforms, those guarantees increasingly live in contracts, lineage, governance, and tests: not in rigid physical layers. Collapsing the medallion does not weaken regulatory substantiation; it strengthens it by decoupling invariants from layout. This article explains why layers were necessary, why they eventually collapse, and what must never be lost when they do.

Continue reading

From Writes to Reads: Applying CQRS Thinking to Regulated Data Platforms

In regulated financial environments, data duplication is often treated as a failure rather than a necessity. Command Query Responsibility Segregation (CQRS) is an approach that separates write concerns from read concerns. This article reframes duplication through CQRS-style thinking, arguing that separating write models (which execute actions) from read models (which explain outcomes) is essential for both safe operation and regulatory defensibility. By making authority explicit and accepting eventual consistency, institutions can act in real time while reconstructing explainable, auditable belief over time. CQRS is presented not as a framework, but as a mental model for survivable data platforms.

Continue reading

Edge Systems Are a Feature: Why OLTP, CRM, and Low-Latency Stores Must Exist

Modern data platforms often treat operational systems as legacy constraints to be eliminated. This article argues the opposite. Transactional systems, CRM platforms, and low-latency decision stores exist because some decisions must be made synchronously, locally, and with authority. These “edge systems” are not architectural debt but purpose-built domains of control. A mature data platform does not replace them or centralise authority falsely; it integrates with them honestly, preserving their decisions, context, and evolution over time.

Continue reading

Blobs as First-Class Artefacts in Regulated Data Platforms

In regulated financial services, semi-structured payloads such as XML, JSON, PDFs, and messages are not “raw data” to be discarded after parsing: they are primary evidence. This article argues that blobs must be treated as first-class artefacts: preserved intact, timestamped, queryable, and reinterpretable over time. Relational models are interpretations that evolve; original payloads anchor truth. Platforms that discard or mutate artefacts optimise for neatness today at the cost of defensibility tomorrow.

Continue reading

Serving the Community: The Story of Servol, Charles Jordan MBE and What Was, What Is, and What Could Be

This article traces the history of Servol Community Services from its founding in 1979 by Charles Jordan MBE, exploring its mission, growth, and the pressures facing modern social-care charities. It reflects on recognition, institutional memory, and the quiet sacrifices behind community work, ending with a personal appeal to honour the charity’s origins and reconnect present delivery with its founding spirit.

Continue reading

Why Transactions Are Events, Not Slowly Changing Dimensions

This article argues that modelling transactions as slowly changing dimensions is a fundamental category error in financial data platforms. Transactions are immutable events that occur once and do not change; what evolves is the organisation’s interpretation of them through enrichment, classification, and belief updates. Applying SCD2 logic to transactions conflates fact with interpretation, corrupts history, and undermines regulatory defensibility. By separating immutable event records from mutable interpretations, platforms become clearer, auditable, and capable of reconstructing past decisions without rewriting reality.

Continue reading

Stolen Valour, Borrowed Honour, and Intellectual Property

This article uses the concept of stolen valour as a metaphor to examine recognition, attribution, and integrity in intellectual property, research, and start-ups. It explores the difference between honour that can be shared and credit that must be earned, arguing that while recognition can be gifted, it only retains meaning when grounded in truth. When attribution is misused, generosity curdles into erasure.

Continue reading

Authority, Truth, and Belief in Financial Services Data Platforms

Financial services data architectures often fail by asking the wrong question: “Which system is the system of record?” This article argues that regulated firms operate with multiple systems of authority, while truth exists outside systems altogether. What data platforms actually manage is institutional belief: what the firm believed at a given time, based on available evidence. By separating authority, truth, and belief, firms can build architectures that preserve history, explain disagreement, and withstand regulatory scrutiny through accountable, reconstructable decision-making.

Continue reading

Cyberpsychology Today: Signal, Noise, and What We’re Actually Talking About

As cyberpsychology gains visibility, it is also losing precision. This article maps how the term is currently used, identifies common category errors, and explains why collapsing distinct domains into a single label weakens both theory and practice. It clarifies the boundary between cyberpsychology and human-factors work, and positions Psyber Inc as downstream application rather than field definition.

Continue reading